Energy Technology Data Exchange (ETDEWEB)
Niu, Ben, E-mail: niubenhit@163.com [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China)]; Guo, Yuxiao [Department of Mathematics, Harbin Institute of Technology, Weihai 264209 (China)]; Jiang, Weihua [Department of Mathematics, Harbin Institute of Technology, Harbin 150001 (China)]
2015-09-25
Heterogeneous delays with a positive lower bound (gap) are taken into consideration in the Kuramoto model. On the Ott–Antonsen manifold, the transition from incoherence to coherence is mediated by a Hopf bifurcation. We establish a perturbation technique on the complex domain, by which universal normal forms and the stability and criticality of the Hopf bifurcation are obtained. Theoretically, a hysteresis loop is found near the subcritically bifurcated coherent state. For Gamma-distributed delays with fixed mean and variance, we find that a large gap decreases the Hopf bifurcation value, induces supercritical bifurcations, avoids the hysteresis loop, and significantly increases the number of coexisting coherent states. The effect of the gap is finally interpreted from the viewpoint of the excess kurtosis of the Gamma distribution. - Highlights: • A heterogeneously delay-coupled Kuramoto model with minimal delay is considered. • A perturbation technique on the complex domain is established for bifurcation analysis. • The hysteresis phenomenon is investigated theoretically. • The effect of excess kurtosis of distributed delays is discussed.
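The incoherence-to-coherence transition can be sketched numerically with the simplest mean-field Kuramoto model. The toy simulation below uses identical oscillators and no delays (an assumption of convenience; the paper's heterogeneously delayed setup is far richer), and shows the order parameter r approaching the coherent state:

```python
import numpy as np

# Toy mean-field Kuramoto model: identical oscillators, no delays.
rng = np.random.default_rng(1)
N, K, dt = 100, 1.0, 0.05
theta = rng.uniform(0, 2 * np.pi, N)        # random initial phases
for _ in range(2000):                       # integrate to t = 100
    z = np.mean(np.exp(1j * theta))         # complex order parameter r * e^{i psi}
    r, psi = np.abs(z), np.angle(z)
    theta += dt * K * r * np.sin(psi - theta)   # mean-field coupling
assert np.abs(np.mean(np.exp(1j * theta))) > 0.95   # coherence reached
```

With identical natural frequencies any positive coupling K drives generic initial conditions to synchrony; delays and frequency heterogeneity are what make the bifurcation structure studied in the paper nontrivial.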
A normal form approach to the theory of nonlinear betatronic motion
International Nuclear Information System (INIS)
Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.
1994-01-01
The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: in the normal-coordinates representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity, which is described by the quadratic Hénon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space, such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure for the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described, and the results for a model of the LHC are presented. This application, relevant for the lattice design, highlights the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
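The quadratic Hénon map mentioned above (a linear rotation composed with a sextupolar kick) is easy to iterate directly; the tune and amplitude below are illustrative choices, not values from the SPS or LHC lattices:

```python
import math

def henon_map(x, p, omega):
    """One turn of the quadratic Henon map: a sextupolar kick
    p -> p + x**2 followed by a rotation through the tune angle omega."""
    px = p + x * x
    c, s = math.cos(omega), math.sin(omega)
    return c * x + s * px, -s * x + c * px

omega = 2 * math.pi * 0.205   # illustrative linear tune nu = 0.205
x, p = 0.05, 0.0              # small-amplitude initial condition
for _ in range(10000):
    x, p = henon_map(x, p, omega)
assert abs(x) < 0.5 and abs(p) < 0.5   # orbit stays on a KAM-like curve
```

At small amplitude the orbit lies on an invariant curve; pushing the initial amplitude up, or choosing a tune near a low-order resonance, exhibits the chains of islands and chaotic regions the abstract refers to.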
Normal form theory and spectral sequences
Sanders, Jan A.
2003-01-01
The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.
Nonlinear dynamics exploration through normal forms
Kahn, Peter B
2014-01-01
Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations of the kind encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations
A Recursive Approach to Compute Normal Forms
Hsu, L.; Min, L. J.; Favretto, L.
2001-06-01
Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.
Normal forms of Hopf-zero singularity
International Nuclear Information System (INIS)
Gazor, Majid; Mokhtari, Fahimeh
2015-01-01
The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)
An Algorithm for Higher Order Hopf Normal Forms
Directory of Open Access Journals (Sweden)
A.Y.T. Leung
1995-01-01
Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.
Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.
Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M
2016-10-07
Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess methods both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples, including both serum and plasma. Samples were randomized across 96-well sample plates, then processed and analyzed in assay plates. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as the R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
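The MA idea can be sketched in a few lines. This is a simplified, hypothetical single-reference version (the paper's multi-MA works on all samples of a plate jointly, and MA-loess fits an intensity-dependent curve rather than subtracting a constant):

```python
import numpy as np

def ma_normalize(plate, reference):
    """Simplified MA-style correction against a reference profile:
    M ("minus") is the log-ratio, A ("add") the mean log-intensity.
    Only the constant (median) M offset is removed here; MA-loess
    would instead regress M on A and subtract the fitted curve."""
    m = np.log2(plate) - np.log2(reference)          # M coordinate
    a = 0.5 * (np.log2(plate) + np.log2(reference))  # A coordinate (used by loess)
    return plate / 2.0 ** np.median(m)               # undo the plate-level shift

rng = np.random.default_rng(0)
reference = rng.lognormal(mean=6.0, sigma=1.0, size=384)
plate = 2.0 * reference            # simulate a uniform 2x plate effect
assert np.allclose(ma_normalize(plate, reference), reference)
```

A uniform multiplicative plate effect becomes a constant vertical shift in M, which the median subtraction removes exactly in this toy case.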
Rigidity of minimal submanifolds with flat normal bundle
Indian Academy of Sciences (India)
Rigidity of minimal submanifolds with flat normal bundle. = a ∫_M u^{2(1+q)+2/a} f^2 − 2 ∫_M u^{2q+1} f ⟨∇f, ∇u⟩ − (2q + 1) ∫_M u^{2q} f^2 |∇u|^2, which gives a … that depends on n, ϵ and q. We now try to transform (2.15) so that the right-hand side involves u only in the power two. For that, we use Young's inequality: ab ≤ …
Minimally Invasive Procedures - Direct and Video-Assisted Forms in the Treatment of Heart Diseases
International Nuclear Information System (INIS)
Castro, Josué Viana Neto; Melo, Emanuel Carvalho; Silva, Juliana Fernandes; Rebouças, Leonardo Lemos; Corrêa, Larissa Chagas; Germano, Amanda de Queiroz; Machado, João José Aquino
2014-01-01
Minimally invasive cardiovascular procedures have been progressively adopted in heart surgery. To describe the techniques and immediate results of minimally invasive procedures over 5 years. Prospective and descriptive study in which 102 patients underwent minimally invasive procedures in direct and video-assisted forms. Clinical and surgical variables were evaluated, as well as the in-hospital follow-up of the patients. Fourteen patients were operated on through the direct form and 88 through the video-assisted form. Among the direct-form procedures, 13 patients had aortic valve disease. Among the video-assisted procedures, 43 patients had mitral valve disease, 41 had atrial septal defect, and four had tumors. Regarding mitral valve disease, we replaced 26 valves and reconstructed 17. Aortic clamp, extracorporeal circulation, and procedure times were, respectively, 91.6 ± 21.8, 112.7 ± 27.9, and 247.1 ± 20.3 minutes for the direct form, and 71.6 ± 29, 99.7 ± 32.6, and 226.1 ± 42.7 minutes for the video-assisted form. Intensive care and hospitalization times were 41.1 ± 14.7 hours and 4.6 ± 2 days for the direct form, and 36.8 ± 16.3 hours and 4.3 ± 1.9 days for the video-assisted form. Minimally invasive procedures were used in two forms - direct and video-assisted - with safety in the surgical treatment of valve disease, atrial septal defect, and tumors of the heart. These procedures seem to involve longer operative times; however, hospital recovery was faster, independent of the access or pathology
Normal form for mirror machine Hamiltonians
International Nuclear Information System (INIS)
Dragt, A.J.; Finn, J.M.
1979-01-01
A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of ''stochastic'' behavior
Volume-preserving normal forms of Hopf-zero singularity
International Nuclear Information System (INIS)
Gazor, Majid; Mokhtari, Fahimeh
2013-01-01
A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)
Normal form and synchronization of strict-feedback chaotic systems
International Nuclear Information System (INIS)
Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping
2004-01-01
This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form by an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Rössler chaotic system is taken as a concrete example to illustrate the design procedure without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods
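The "scalar driving signal" idea can be illustrated with a classic drive-response construction on the Lorenz system (a stand-in example; the paper's backstepping design for strict-feedback systems is a different technique). The response copies the (y, z) equations but receives only the drive's scalar x(t), and the synchronization error contracts to zero:

```python
def lorenz_step(x, y, z, dt=0.001, s=10.0, r=28.0, b=8.0/3.0):
    """One Euler step of the Lorenz system."""
    return x + dt*s*(y - x), y + dt*(x*(r - z) - y), z + dt*(x*y - b*z)

x, y, z = 1.0, 1.0, 1.0       # drive system
yr, zr = -5.0, 20.0           # response copy of (y, z), driven by x only
dt, r, b = 0.001, 28.0, 8.0/3.0
for _ in range(50000):        # integrate to t = 50
    # the response uses the transmitted scalar x, not a state of its own
    yr, zr = yr + dt*(x*(r - zr) - yr), zr + dt*(x*yr - b*zr)
    x, y, z = lorenz_step(x, y, z)
assert abs(yr - y) < 1e-6 and abs(zr - z) < 1e-6   # synchronized
```

The error (yr - y, zr - z) obeys a linear system whose symmetric part is negative definite, so the error norm decays monotonically regardless of the chaotic drive signal.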
Directory of Open Access Journals (Sweden)
Liu Xuejiao
2012-11-01
Background The urinary proteome has been widely used for biomarker discovery. A urinary proteome database from normal humans can provide a background for discovery proteomics and candidate proteins/peptides for targeted proteomics. Therefore, it is necessary to define the minimum number of individuals required for sampling to represent the normal urinary proteome. Methods In this study, inter-individual and inter-gender variations of the urinary proteome were taken into consideration to achieve a representative database. An individual analysis was performed on overnight urine samples from 20 normal volunteers (10 males and 10 females) by 1DLC/MS/MS. To obtain a representative result for each sample, a replicate 1DLC/MS/MS analysis was performed. The minimal sample number was estimated by statistical analysis. Results For qualitative analysis, fewer than 5% of new proteins/peptides were identified in the male/female normal group upon adding a new sample once the sample number exceeded nine. In addition, in the combined normal group, the percentage of newly identified proteins/peptides was less than 5% upon adding a new sample when the sample number reached 10. Furthermore, a statistical analysis indicated that urinary proteomes from normal males and females showed different patterns. For quantitative analysis, the variation of protein abundance was assessed by spectral counting and western blotting, and the minimal sample number for quantitative proteomic analysis was then identified. Conclusions For qualitative analysis, when considering the inter-individual and inter-gender variations, the minimum sample number is 10, with a balanced number of males and females, in order to obtain a representative normal human urinary proteome. For quantitative analysis, the minimal sample number is much greater than that for qualitative analysis and depends on the experimental methods used for quantification.
Normal forms in Poisson geometry
Marcut, I.T.
2013-01-01
The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric
Normalizing biomedical terms by minimizing ambiguity and variability
Directory of Open Access Journals (Sweden)
McNaught John
2008-04-01
Background One of the difficulties in mapping biomedical named entities, e.g. genes, proteins, chemicals and diseases, to their concept identifiers stems from the potential variability of the terms. Soft string matching is a possible solution to the problem, but its inherent heavy computational cost discourages its use when the dictionaries are large or when real-time processing is required. A less computationally demanding approach is to normalize the terms by using heuristic rules, which enables us to look up a dictionary in constant time regardless of its size. The development of good heuristic rules, however, requires extensive knowledge of the terminology in question and thus is the bottleneck of the normalization approach. Results We present a novel framework for discovering a list of normalization rules from a dictionary in a fully automated manner. The rules are discovered in such a way that they minimize the ambiguity and variability of the terms in the dictionary. We evaluated our algorithm using two large dictionaries: a human gene/protein name dictionary built from BioThesaurus and a disease name dictionary built from UMLS. Conclusions The experimental results showed that automatically discovered rules can perform comparably to carefully crafted heuristic rules in term mapping tasks, and the computational overhead of rule application is small enough that a very fast implementation is possible. This work will help improve the performance of term-concept mapping tasks in biomedical information extraction, especially when good normalization heuristics for the target terminology are not fully known.
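A toy version of such normalization rules might look like the following (these particular rules and the concept identifier are hypothetical illustrations, not the rules learned by the paper's framework):

```python
import re

def normalize(term):
    """Hypothetical heuristic rules: lowercase, drop parenthesised
    material, strip hyphens and whitespace, so that surface variants
    collapse to a single dictionary key."""
    term = re.sub(r"\(.*?\)", "", term.lower())
    return re.sub(r"[-\s]+", "", term)

# one dictionary entry covers several surface variants of the same term
dictionary = {normalize("NF-kappa B"): "CONCEPT:0001"}
assert dictionary[normalize("NF kappaB")] == "CONCEPT:0001"
assert dictionary[normalize("nf-KappaB")] == "CONCEPT:0001"
```

Because the lookup key is computed in constant time per term, the dictionary size never enters the matching cost, which is the efficiency argument made in the abstract.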
Diagonalization and Jordan Normal Form--Motivation through "Maple" [R]
Glaister, P.
2009-01-01
Following an introduction to the diagonalization of matrices, one of the more difficult topics for students to grasp in linear algebra is the concept of Jordan normal form. In this note, we show how the important notions of diagonalization and Jordan normal form can be introduced and developed through the use of the computer algebra package…
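The same motivation is easy to reproduce with any computer algebra system; here is a sketch using Python's SymPy rather than Maple (a substitution of convenience):

```python
import sympy as sp

# A defective matrix: eigenvalue 2 has algebraic multiplicity 2 but only a
# one-dimensional eigenspace, so it is similar to a Jordan block rather
# than to a diagonal matrix.
A = sp.Matrix([[3, 1],
               [-1, 1]])
P, J = A.jordan_form()        # A = P * J * P**(-1)
assert J == sp.Matrix([[2, 1], [0, 2]])
assert sp.simplify(P * J * P.inv() - A) == sp.zeros(2, 2)
```

Letting students compute P and J for small matrices, then verify the similarity by hand, is exactly the kind of exploration the note advocates.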
Normal equivariant forms of vector fields
International Nuclear Information System (INIS)
Sanchez Bringas, F.
1992-07-01
We prove a Siegel-type linearization theorem and a Poincaré-Dulac-type normal form theorem for germs of holomorphic vector fields at the origin of C^2 that are Γ-equivariant, where Γ is a finite subgroup of GL(2,C). (author). 5 refs
Normal forms for Poisson maps and symplectic groupoids around Poisson transversals.
Frejlich, Pedro; Mărcuț, Ioan
2018-01-01
Poisson transversals are submanifolds in a Poisson manifold which intersect all symplectic leaves transversally and symplectically. In this communication, we prove a normal form theorem for Poisson maps around Poisson transversals. A Poisson map pulls a Poisson transversal back to a Poisson transversal, and our first main result states that simultaneous normal forms exist around such transversals, for which the Poisson map becomes transversally linear, and intertwines the normal form data of the transversals. Our second result concerns symplectic integrations. We prove that a neighborhood of a Poisson transversal is integrable exactly when the Poisson transversal itself is integrable, and in that case we prove a normal form theorem for the symplectic groupoid around its restriction to the Poisson transversal, which puts all structure maps in normal form. We conclude by illustrating our results with examples arising from Lie algebras.
AFP Algorithm and a Canonical Normal Form for Horn Formulas
Majdoddin, Ruhollah
2014-01-01
The AFP Algorithm is a learning algorithm for Horn formulas. We show that it does not improve the complexity of the AFP Algorithm if, after each negative counterexample, more than just one refinement is performed. Moreover, a canonical normal form for Horn formulas is presented, and it is proved that the output formula of the AFP Algorithm is in this normal form.
Utilizing Nested Normal Form to Design Redundancy Free JSON Schemas
Directory of Open Access Journals (Sweden)
Wai Yin Mok
2016-12-01
JSON (JavaScript Object Notation) is a lightweight data-interchange format for the Internet. JSON is built on two structures: (1) a collection of name/value pairs and (2) an ordered list of values (http://www.json.org/). Because of this simple approach, JSON is easy to use and has the potential to be the data-interchange format of choice for the Internet. Similar to XML, JSON schemas allow nested structures to model hierarchical data. As data interchange over the Internet increases exponentially due to cloud computing or otherwise, redundancy-free JSON data are an attractive form of communication because they improve the quality of data communication by eliminating update anomalies. Nested Normal Form, a normal form for hierarchical data, is a precise characterization of redundancy. A nested table, or a hierarchical schema, is in Nested Normal Form if and only if it is free of redundancy caused by multivalued and functional dependencies. Using Nested Normal Form as a guide, this paper introduces a JSON schema design methodology that begins with UML use case diagrams, communication diagrams, and class diagrams that model a system under study. Based on the use cases' execution frequencies and the data passed between involved parties in the communication diagrams, the proposed methodology selects classes from the class diagrams to be the roots of JSON scheme trees and repeatedly adds classes from the class diagram to the scheme trees as long as the schemas satisfy Nested Normal Form. This process continues until all of the classes in the class diagram have been added to some JSON scheme tree.
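The update-anomaly motivation can be made concrete with a small hypothetical example: when a functional dependency such as department → manager is flattened into a list of objects, the dependent fact repeats, while a nested design stores it once:

```python
import json

# Flat design: "manager" repeats once per employee, so updating the
# manager must touch every copy (the update anomaly).
flat = [
    {"dept": "R&D", "manager": "Liu", "employee": "Ana"},
    {"dept": "R&D", "manager": "Liu", "employee": "Ben"},
]

# Nested design: each fact appears exactly once.
nested = {"dept": "R&D", "manager": "Liu", "employees": ["Ana", "Ben"]}

# Same information, but the manager fact is stored only once when nested.
assert sorted(r["employee"] for r in flat) == sorted(nested["employees"])
assert len({r["manager"] for r in flat}) == 1
print(json.dumps(nested, indent=2))
```

Nested Normal Form formalizes when such a nesting is guaranteed to be free of this kind of redundancy for the given dependencies.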
SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS
Directory of Open Access Journals (Sweden)
A. V. Sokolov
2016-01-01
The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of many-valued logic functions. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which describes many of the cryptographic properties of Boolean functions well, is widely used. In this article, we formalize the notion of algebraic normal form for many-valued logic functions. We develop a fast method for synthesizing the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions, on the basis of recurrently synthesized transform matrices. We propose a hypothesis which determines the rules for synthesizing these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of many-valued logic functions and of S-boxes based on the principles of many-valued logic. Thus, the methods for synthesizing the algebraic normal form of 3-functions are applied to the known construction for the recurrent synthesis of S-boxes of length N = 3^k, whereby their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and algorithms of block and stream encryption, all based on the perspective principles of many-valued logic. In addition, the fast method for synthesizing the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
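For the Boolean (two-valued) special case mentioned above, the truth-table-to-ANF transform is the standard binary Möbius (Reed-Muller) transform, sketched here; the paper's contribution is the analogous recurrent construction for 3- and 5-valued functions:

```python
def anf(truth_table):
    """Binary Mobius (Reed-Muller) transform: maps the truth table of a
    Boolean function (length 2**n) to the coefficients of its algebraic
    normal form (Zhegalkin polynomial). Output index i corresponds to
    the monomial whose variables are the set bits of i."""
    c = list(truth_table)
    step = 1
    while step < len(c):        # butterfly over each variable
        for i in range(len(c)):
            if i & step:
                c[i] ^= c[i ^ step]
        step <<= 1
    return c

# f(x1, x0) = x1 AND x0: truth table [0, 0, 0, 1] (index = 2*x1 + x0)
assert anf([0, 0, 0, 1]) == [0, 0, 0, 1]   # ANF is the monomial x0*x1
# f = x1 XOR x0: ANF is x0 + x1 (coefficients at indices 1 and 2)
assert anf([0, 1, 1, 0]) == [0, 1, 1, 0]
```

The transform is an involution over GF(2), so applying `anf` twice returns the original truth table; the same butterfly structure is what the recurrent transform matrices generalize.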
Cho, Min-Jeong; Hallac, Rami R; Ramesh, Jananie; Seaward, James R; Hermann, Nuno V; Darvann, Tron A; Lipira, Angelo; Kane, Alex A
2018-03-01
Restoring craniofacial symmetry is an important objective in the treatment of many craniofacial conditions. Normal form has been measured using anthropometry, cephalometry, and photography, yet all of these modalities have drawbacks. In this study, the authors define normal pediatric craniofacial form and craniofacial asymmetry using stereophotogrammetric images, which capture a densely sampled set of points on the form. After institutional review board approval, normal, healthy children (n = 533) with no known craniofacial abnormalities were recruited at well-child visits to undergo full head stereophotogrammetric imaging. The children's ages ranged from 0 to 18 years. A symmetric three-dimensional template was registered and scaled to each individual scan using 25 manually placed landmarks. The template was deformed to each subject's three-dimensional scan using a thin-plate spline algorithm and closest point matching. Age-based normal facial models were derived. Mean facial asymmetry and statistical characteristics of the population were calculated. The mean head asymmetry across all pediatric subjects was 1.5 ± 0.5 mm (range, 0.46 to 4.78 mm), and the mean facial asymmetry was 1.2 ± 0.6 mm (range, 0.4 to 5.4 mm). There were no significant differences in the mean head or facial asymmetry with age, sex, or race. Understanding the "normal" form and baseline distribution of asymmetry is an important anthropomorphic foundation. The authors present a method to quantify normal craniofacial form and baseline asymmetry in a large pediatric sample. The authors found that the normal pediatric craniofacial form is asymmetric, and does not change in magnitude with age, sex, or race.
Normal form of linear systems depending on parameters
International Nuclear Information System (INIS)
Nguyen Huynh Phan.
1995-12-01
In this paper we resolve completely the problem to find normal forms of linear systems depending on parameters for the feedback action that we have studied for the special case of controllable linear systems. (author). 24 refs
A New Normal Form for Multidimensional Mode Conversion
International Nuclear Information System (INIS)
Tracy, E. R.; Richardson, A. S.; Kaufman, A. N.; Zobin, N.
2007-01-01
Linear conversion occurs when two wave types, with distinct polarization and dispersion characteristics, are locally resonant in a nonuniform plasma [1]. In recent work, we have shown how to incorporate a ray-based (WKB) approach to mode conversion in numerical algorithms [2,3]. The method uses the ray geometry in the conversion region to guide the reduction of the full N×N system of wave equations to a 2×2 coupled pair which can be solved and matched to the incoming and outgoing WKB solutions. The algorithm in [2] assumes the ray geometry is hyperbolic and that, in ray phase space, there is an 'avoided crossing', which is the most common type of conversion. Here, we present a new formulation that can deal with more general types of conversion [4]. This formalism is based upon the fact (first proved in [5]) that it is always possible to put the 2×2 wave equation into a 'normal' form, such that the diagonal elements of the dispersion matrix Poisson-commute with the off-diagonals (at leading order). Therefore, if we use the diagonals (rather than the eigenvalues or the determinant) of the dispersion matrix as ray Hamiltonians, the off-diagonals will be conserved quantities. When cast into normal form, the 2×2 dispersion matrix has a very natural physical interpretation: the diagonals are the uncoupled ray Hamiltonians and the off-diagonals are the coupling. We discuss how to incorporate the normal form into ray tracing algorithms
Normal forms of invariant vector fields under a finite group action
International Nuclear Information System (INIS)
Sanchez Bringas, F.
1992-07-01
Let Γ be a finite subgroup of GL(n,C). This subgroup acts on the space of germs of holomorphic vector fields vanishing at the origin in C^n. We prove a theorem of invariant conjugation to a normal form and linearization for the subspace of invariant elements, and we give a description of these normal forms in dimension n=2. (author)
On the relationship between LTL normal forms and Büchi automata
DEFF Research Database (Denmark)
Li, Jianwen; Pu, Geguang; Zhang, Lijun
2013-01-01
In this paper, we revisit the problem of translating LTL formulas to Büchi automata. We first translate the given LTL formula into a special disjunctive normal form (DNF). The formula will be part of the state, and its DNF specifies the atomic properties that should hold immediately...
Faria, T.; Magalhaes, L. T.
The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).
Reconstruction of normal forms by learning informed observation geometries from data.
Yair, Or; Talmon, Ronen; Coifman, Ronald R; Kevrekidis, Ioannis G
2017-09-19
The discovery of physical laws consistent with empirical observations is at the heart of (applied) science and engineering. These laws typically take the form of nonlinear differential equations depending on parameters; dynamical systems theory provides, through the appropriate normal forms, an "intrinsic" prototypical characterization of the types of dynamical regimes accessible to a given model. Using an implementation of data-informed geometry learning, we directly reconstruct the relevant "normal forms": a quantitative mapping from empirical observations to prototypical realizations of the underlying dynamics. Interestingly, the state variables and the parameters of these realizations are inferred from the empirical observations; without prior knowledge or understanding, they parametrize the dynamics intrinsically without explicit reference to fundamental physical quantities.
Normal Forms for Fuzzy Logics: A Proof-Theoretic Approach
Czech Academy of Sciences Publication Activity Database
Cintula, Petr; Metcalfe, G.
2007-01-01
Roč. 46, č. 5-6 (2007), s. 347-363 ISSN 1432-0665 R&D Projects: GA MŠk(CZ) 1M0545 Institutional research plan: CEZ:AV0Z10300504 Keywords : fuzzy logic * normal form * proof theory * hypersequents Subject RIV: BA - General Mathematics Impact factor: 0.620, year: 2007
A New One-Pass Transformation into Monadic Normal Form
DEFF Research Database (Denmark)
Danvy, Olivier
2003-01-01
We present a translation from the call-by-value λ-calculus to monadic normal forms that includes short-cut boolean evaluation. The translation is higher-order, operates in one pass, duplicates no code, generates no chains of thunks, and is properly tail recursive. It makes a crucial use of symbolic...
Meaning Making Through Minimal Linguistic Forms in Computer-Mediated Communication
Directory of Open Access Journals (Sweden)
Muhammad Shaban Rafi
2014-05-01
Full Text Available The purpose of this study was to investigate the linguistic forms which commonly constitute meanings in the digital environment. The data were sampled from 200 Bachelor of Science (BS) students (who had Urdu as their primary language of communication and English as one of their academic languages or their most prestigious second language) at five universities situated in Lahore, Pakistan. The procedure for analysis was conceived within much related theoretical work on text analysis. The study reveals that cyber-language is organized through patterns of use, which can be broadly classified into minimal linguistic forms constituting a meaning-making resource. In addition, the expression of syntactic mood, and the discourse roles the participants technically assume, tend to contribute to the theory of meaning in the digital environment. It is hoped that the study will make some contribution to the growing literature on multilingual computer-mediated communication (CMC).
Obendorf, Hartmut
2009-01-01
The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.
Minimal cut-set methodology for artificial intelligence applications
International Nuclear Information System (INIS)
Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.
1984-01-01
This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes to implement the Boolean reduction on the finite, but large, tree structure. With this approach, on-line expert diagnostic systems whose response time is critical can determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements.
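The Boolean reduction described above can be sketched in a few lines: a fault tree of AND/OR gates is expanded into disjunctive normal form, and absorption removes non-minimal cut sets. This is a minimal illustration under invented gate and event names, not the computer codes the paper references.

```python
from itertools import product

def cut_sets(gate):
    """Expand a fault tree into disjunctive normal form: a list of
    cut sets, each a frozenset of basic events. Gates are tuples
    ('AND', ...) or ('OR', ...); leaves are event-name strings."""
    if isinstance(gate, str):
        return [frozenset([gate])]
    op, *children = gate
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':
        return [s for sets in child_sets for s in sets]
    # AND: union each combination of the children's cut sets
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimize(sets):
    """Boolean absorption: drop any cut set that strictly contains another."""
    return {s for s in sets if not any(t < s for t in sets)}

# Top event occurs if (A and B) or (A and C) or (A and B and D)
tree = ('OR', ('AND', 'A', 'B'), ('AND', 'A', 'C'), ('AND', 'A', 'B', 'D'))
mcs = minimize(set(cut_sets(tree)))
# {A,B,D} is absorbed by {A,B}, leaving the two minimal cut sets
```

The stored minimal cut sets can then be compared against an observed system state far faster than re-evaluating the full tree.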
Guerrero Garcia Hall, Mats; Wenner, Jörgen; Öberg, Stefan
2017-03-01
The macroscopic appearance of the normal squamocolumnar junction (SCJ) is often described as serrated with short projections of columnar mucosa that extend into the esophagus. As studies of the normal SCJ are sparse, the aim of this study was to test the hypothesis that the normal SCJ is even and that irregularities are manifestations of acid reflux. Fifty asymptomatic subjects and 149 patients with symptoms suggestive of gastroesophageal reflux disease underwent endoscopy and 48-h pH monitoring with a pH electrode positioned immediately above the SCJ. The shape of the SCJ was assessed according to the Z-line appearance classification and correlated with clinical characteristics and the degree of esophageal acid exposure in the most distal esophagus. Even SCJs without irregularities were significantly more common in asymptomatic subjects compared with patients (50% versus 10%). Acid exposure in individuals with an even SCJ was within normal limits. With increasing degree of irregularity of the SCJ, the frequency and duration of reflux episodes, the degree of distal esophageal acid exposure, and the prevalence of abnormal acid exposure increased progressively and significantly. The shape of the normal SCJ is even, and even minimal irregularities are a consequence of acid reflux, likely due to the formation of small areas of metaplastic columnar mucosa.
Hydrogel-forming microneedle arrays: Potential for use in minimally-invasive lithium monitoring.
Eltayib, Eyman; Brady, Aaron J; Caffarel-Salvador, Ester; Gonzalez-Vazquez, Patricia; Zaid Alkilani, Ahlam; McCarthy, Helen O; McElnay, James C; Donnelly, Ryan F
2016-05-01
We describe, for the first time, hydrogel-forming microneedle (MN) arrays for minimally-invasive extraction and quantification of lithium in vitro and in vivo. MN arrays, prepared from aqueous blends of hydrolysed poly(methyl-vinylether-co-maleic anhydride) and crosslinked by poly(ethyleneglycol), imbibed interstitial fluid (ISF) upon skin insertion. Such MN were always removed intact. In vitro, mean detected lithium concentrations showed no significant difference following 30 min MN application to excised neonatal porcine skin for lithium citrate concentrations of 0.9 and 2 mmol/l. However, after 1 h application, the mean lithium concentrations extracted were significantly different, being appropriately concentration-dependent. In vivo, rats were orally dosed with lithium citrate equivalent to 15 mg/kg and 30 mg/kg lithium carbonate, respectively. MN arrays were applied 1 h after dosing and removed 1 h later. The two groups, having received different doses, showed no significant difference between lithium concentrations in serum or MN. However, the higher dosed rats demonstrated a lithium concentration extracted from MN arrays equivalent to a mean increase of 22.5% compared to rats which received the lower dose. Hydrogel-forming MN clearly have potential as a minimally-invasive tool for lithium monitoring in outpatient settings. We will now focus on correlation between serum and MN lithium concentrations. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Fast Bitwise Implementation of the Algebraic Normal Form Transform
Bakoev, Valentin
2017-01-01
The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
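The ANF transform referred to above is commonly computed by a butterfly (fast Möbius) scheme using O(n·2ⁿ) XOR operations on a truth table of length 2ⁿ. A minimal sketch, independent of the paper's bitwise-packed implementation:

```python
def anf_transform(tt):
    """Fast Moebius (butterfly) transform: truth table -> ANF coefficients.
    tt is a list of 0/1 values of length 2**n; returns the coefficient
    of each monomial, indexed by the monomial's variable bitmask."""
    f = list(tt)
    n = len(f)
    step = 1
    while step < n:
        for block in range(0, n, 2 * step):
            for i in range(block, block + step):
                f[i + step] ^= f[i]   # XOR the lower half onto the upper half
        step *= 2
    return f

# f(x1, x0) = x0 AND x1 has truth table [0, 0, 0, 1] (index = 2*x1 + x0)
coeffs = anf_transform([0, 0, 0, 1])
# only the coefficient of the monomial x0*x1 (index 3) is 1
```

The algebraic degree of the function is then the maximum popcount of an index with a nonzero coefficient, which is how ANFs feed into S-box degree computations.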
Treatment of cervical agenesis with minimally invasive therapy: Case report
Directory of Open Access Journals (Sweden)
Azami Denas Azinar
2017-11-01
Full Text Available Cervical agenesis is a very rare congenital disorder in which the cervix is not formed. Because the cervix is occluded, menstrual blood cannot be drained. We report the case of a 19-year-old single woman with endometrioma, hematometra and cervical agenesis who underwent combined laparoscopic and transvaginal surgery, with laparoscopic cystectomy, creation of a neocervix, and placement of a no. 24F catheter in the new cervix. She can now menstruate normally. Minimally invasive therapy is recommended in such congenital anomaly cases to preserve reproductive function. Keywords: Cervical agenesis, minimally invasive therapy
Automatic identification and normalization of dosage forms in drug monographs
2012-01-01
Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered in a wide variety of websites of different quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying drug's dosage form information in addition to drug name recognition. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement compared with a baseline lookup approach, achieving overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
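The macro-averaged scores reported above weight every class equally: per-class precision, recall and F-measure are computed first and then averaged. A small sketch with invented counts for two hypothetical dosage-form classes:

```python
def macro_metrics(per_class_counts):
    """Macro-averaged precision/recall/F1 from per-class (tp, fp, fn)
    counts. Each class contributes equally, regardless of frequency."""
    ps, rs, fs = [], [], []
    for tp, fp, fn in per_class_counts.values():
        p = tp / (tp + fp) if tp + fp else 0.0
        r = tp / (tp + fn) if tp + fn else 0.0
        f = 2 * p * r / (p + r) if p + r else 0.0
        ps.append(p); rs.append(r); fs.append(f)
    n = len(per_class_counts)
    return sum(ps) / n, sum(rs) / n, sum(fs) / n

# Hypothetical counts for two RxNorm dosage-form classes
counts = {'oral tablet': (8, 2, 0), 'topical cream': (9, 1, 1)}
p, r, f = macro_metrics(counts)
```

With these invented counts the macro averages come out to precision 0.85 and recall 0.95, in the same spirit as the 80%/98%/85% figures the abstract reports for the real system.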
Y-formalism and b ghost in the non-minimal pure spinor formalism of superstrings
International Nuclear Information System (INIS)
Oda, Ichiro; Tonin, Mario
2007-01-01
We present the Y-formalism for the non-minimal pure spinor quantization of superstrings. In the framework of this formalism we compute, at the quantum level, the explicit form of the compound operators involved in the construction of the b ghost, their normal-ordering contributions and the relevant relations among them. We use these results to construct the quantum-mechanical b ghost in the non-minimal pure spinor formalism. Moreover we show that this non-minimal b ghost is cohomologically equivalent to the non-covariant b ghost
On the construction of the Kolmogorov normal form for the Trojan asteroids
Gabern, F; Locatelli, U
2004-01-01
In this paper we focus on the stability of the Trojan asteroids for the planar Restricted Three-Body Problem (RTBP), by extending the usual techniques for the neighbourhood of an elliptic point to derive results in a larger vicinity. Our approach is based on the numerical determination of the frequencies of the asteroid and the effective computation of the Kolmogorov normal form for the corresponding torus. This procedure has been applied to the first 34 Trojan asteroids of the IAU Asteroid Catalog, and it has worked successfully for 23 of them. The construction of this normal form allows for computer-assisted proofs of stability. To show it, we have implemented a proof of existence of families of invariant tori close to a given asteroid, for a high order expansion of the Hamiltonian. This proof has been successfully applied to three Trojan asteroids.
Generating All Permutations by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.; Spoto, F.; Scollo, Giuseppe; Nijholt, Antinus
2003-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq 1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
Generating all permutations by context-free grammars in Chomsky normal form
Asveld, P.R.J.
2006-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq1$, with
Generating All Permutations by Context-Free Grammars in Chomsky Normal Form
Asveld, P.R.J.
2004-01-01
Let $L_n$ be the finite language of all $n!$ strings that are permutations of $n$ different symbols ($n\geq 1$). We consider context-free grammars $G_n$ in Chomsky normal form that generate $L_n$. In particular we study a few families $\{G_n\}_{n\geq1}$, satisfying $L(G_n)=L_n$ for $n\geq 1$, with
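For small $n$, a grammar in Chomsky normal form generating $L_n$ can be checked by brute-force enumeration. The sketch below uses hypothetical grammars $G_2$ and $G_3$ (not necessarily those of the families studied in these papers):

```python
def language(grammar, start, max_len):
    """Enumerate the terminal strings of length <= max_len derivable
    from `start` in a Chomsky-normal-form grammar. Alternatives are
    either a 1-character terminal string or a pair of nonterminals."""
    def gen(sym, budget):
        out = set()
        if budget < 1:
            return out
        for alt in grammar[sym]:
            if isinstance(alt, str):            # A -> a
                out.add(alt)
            else:                               # A -> B C
                b, c = alt
                for left in gen(b, budget - 1):
                    for right in gen(c, budget - len(left)):
                        out.add(left + right)
        return out
    return gen(start, max_len)

# G_2 in CNF with L(G_2) = {ab, ba}, the 2! permutations of {a, b}
G2 = {'S': [('A', 'B'), ('B', 'A')], 'A': ['a'], 'B': ['b']}

# G_3 in CNF: one rule per permutation, via auxiliary pair nonterminals
G3 = {'S': [('A', 'X'), ('A', 'Y'), ('B', 'U'), ('B', 'V'),
            ('C', 'W'), ('C', 'Z')],
      'X': [('B', 'C')], 'Y': [('C', 'B')], 'U': [('A', 'C')],
      'V': [('C', 'A')], 'W': [('A', 'B')], 'Z': [('B', 'A')],
      'A': ['a'], 'B': ['b'], 'C': ['c']}

perms2 = language(G2, 'S', 2)   # {'ab', 'ba'}
perms3 = language(G3, 'S', 3)   # all 3! = 6 permutations of {a, b, c}
```

The interesting question in the papers is how small such grammars can be as $n$ grows; the naive one-rule-per-permutation construction above needs $\Theta(n \cdot n!)$ rules.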
On some hypersurfaces with time like normal bundle in pseudo Riemannian space forms
International Nuclear Information System (INIS)
Kashani, S.M.B.
1995-12-01
In this work we classify immersed hypersurfaces with constant sectional curvature in pseudo-Riemannian space forms when the normal bundle is timelike and the mean curvature is constant. (author). 9 refs
Directory of Open Access Journals (Sweden)
Y. Yuliana
2011-07-01
Full Text Available The aim of an orthodontic treatment is to achieve aesthetics, health of the teeth and surrounding tissues, a functional occlusal relationship, and stability. The success of an orthodontic treatment is influenced by many factors, such as diagnosis and treatment plan. In order to make a diagnosis and a treatment plan, the medical record, clinical examination, radiographic examination, extraoral and intraoral photographs, as well as study model analysis are needed. The purpose of this study was to evaluate the differences in dental arch form between the level four polynomial and the pentamorphic arch form, and to determine which one is best suited to a normal occlusion sample. This analytic comparative study was conducted at the Faculty of Dentistry, Universitas Padjadjaran, on 13 models by comparing the dental arch form using the level four polynomial method based on mathematical calculations and the pentamorphic arch pattern, with mandibular normal occlusion as a control. The results obtained were tested using Student's t-test. The results indicate significant differences of both the level four polynomial method and the pentamorphic arch form from the mandibular normal occlusion dental arch form. The level four polynomial fits better than the pentamorphic arch form.
Application of normal form methods to the analysis of resonances in particle accelerators
International Nuclear Information System (INIS)
Davies, W.G.
1992-01-01
The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs
Normal form of particle motion under the influence of an ac dipole
Directory of Open Access Journals (Sweden)
R. Tomás
2002-05-01
Full Text Available ac dipoles in accelerators are used to excite coherent betatron oscillations at a drive frequency close to the tune. These beam oscillations may last arbitrarily long and, in principle, there is no significant emittance growth if the ac dipole is adiabatically turned on and off. Therefore the ac dipole seems to be an adequate tool for nonlinear diagnostics provided the particle motion is well described in the presence of the ac dipole and nonlinearities. Normal forms and Lie algebra are powerful tools to study the nonlinear content of an accelerator lattice. In this article a way to obtain the normal form of the Hamiltonian of an accelerator with an ac dipole is described. The particle motion to first order in the nonlinearities is derived using Lie algebra techniques. The dependence of the Hamiltonian terms on the longitudinal coordinate is studied showing that they vary differently depending on the ac dipole parameters. The relation is given between the lines of the Fourier spectrum of the turn-by-turn motion and the Hamiltonian terms.
TRASYS form factor matrix normalization
Tsuyuki, Glenn T.
1992-01-01
A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries, and in fact, it is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then, a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
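The idea of forcing each nodal form factor sum to unity can be sketched by proportional rescaling of each row; the actual TRASYS redistribution may weight the residual differently, so this is an illustration of the principle rather than that code, with an invented three-node enclosure:

```python
def normalize_form_factors(F, tol=0.05):
    """Rescale each row of a form factor matrix so it sums to exactly 1.
    Rows whose raw sum deviates from unity by more than `tol` are
    flagged, mirroring the 0.05 acceptance band quoted in the text.
    Simple proportional redistribution; TRASYS's own scheme may differ."""
    out, flagged = [], []
    for i, row in enumerate(F):
        s = sum(row)
        if abs(s - 1.0) > tol:
            flagged.append(i)
        out.append([f / s for f in row])
    return out, flagged

F = [[0.00, 0.62, 0.41],    # raw row sum 1.03, within 0.05 of unity
     [0.55, 0.00, 0.47],    # raw row sum 1.02
     [0.30, 0.68, 0.00]]    # raw row sum 0.98
Fn, bad = normalize_form_factors(F)
# every row of Fn now sums to 1, and no row was flagged
```

Closing the row sums this way removes the artificial "leak" to space that otherwise biases radiation conductors and the resulting temperatures.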
Closed-form confidence intervals for functions of the normal mean and standard deviation.
Donner, Allan; Zou, G Y
2012-08-01
Confidence interval methods for a normal mean and standard deviation are well known and simple to apply. However, the same cannot be said for important functions of these parameters. These functions include the normal distribution percentiles, the Bland-Altman limits of agreement, the coefficient of variation and Cohen's effect size. We present a simple approach to this problem by using variance estimates recovered from confidence limits computed for the mean and standard deviation separately. All resulting confidence intervals have closed forms. Simulation results demonstrate that this approach performs very well for limits of agreement, coefficients of variation and their differences.
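The recovered-variance construction can be illustrated for the normal percentile μ + zσ. The sketch below uses large-sample limits from the standard normal for both parameters (the paper's exact intervals would use t and chi-square quantiles), so the data and the resulting numbers are only illustrative:

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def mover_ci_percentile(x, z, alpha=0.05):
    """Approximate (1-alpha) CI for the percentile mu + z*sigma via the
    MOVER idea: recover variance estimates from separate limits for mu
    and sigma, then recombine them in closed form.
    Large-sample sketch only; exact limits would use t and chi-square."""
    n, xbar, s = len(x), mean(x), stdev(x)
    zc = NormalDist().inv_cdf(1 - alpha / 2)
    l_mu, u_mu = xbar - zc * s / sqrt(n), xbar + zc * s / sqrt(n)
    l_sd, u_sd = s - zc * s / sqrt(2 * n), s + zc * s / sqrt(2 * n)
    est = xbar + z * s
    lo = est - sqrt((xbar - l_mu) ** 2 + (z * s - z * l_sd) ** 2)
    hi = est + sqrt((u_mu - xbar) ** 2 + (z * u_sd - z * s) ** 2)
    return lo, hi

data = [9.8, 10.1, 10.4, 9.9, 10.2, 10.0, 10.3, 9.7, 10.1, 10.0]
lo, hi = mover_ci_percentile(data, z=1.645)  # ~95th percentile
```

The same recombination applies to the other functions the abstract lists (limits of agreement, coefficient of variation, effect size), with the appropriate component intervals substituted.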
DeVille, R. E. Lee; Harkin, Anthony; Holzer, Matt; Josić, Krešimir; Kaper, Tasso J.
2008-06-01
For singular perturbation problems, the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. E. 49 (1994) 4502-4511] has been shown to be an effective general approach for deriving reduced or amplitude equations that govern the long time dynamics of the system. It has been applied to a variety of problems traditionally analyzed using disparate methods, including the method of multiple scales, boundary layer theory, the WKBJ method, the Poincaré-Lindstedt method, the method of averaging, and others. In this article, we show how the RG method may be used to generate normal forms for large classes of ordinary differential equations. First, we apply the RG method to systems with autonomous perturbations, and we show that the reduced or amplitude equations generated by the RG method are equivalent to the classical Poincaré-Birkhoff normal forms for these systems up to and including terms of O(ɛ²), where ɛ is the perturbation parameter. This analysis establishes our approach and generalizes to higher order. Second, we apply the RG method to systems with nonautonomous perturbations, and we show that the reduced or amplitude equations so generated constitute time-asymptotic normal forms, which are based on KBM averages. Moreover, for both classes of problems, we show that the main coordinate changes are equivalent, up to translations between the spaces in which they are defined. In this manner, our results show that the RG method offers a new approach for deriving normal forms for nonautonomous systems, and it offers advantages since one can typically more readily identify resonant terms from naive perturbation expansions than from the nonautonomous vector fields themselves. Finally, we establish how well the solution to the RG equations approximates the solution of the original equations on time scales of O(1/ɛ).
Koskela, Anne; Vehkalahti, Kaisa
2017-01-01
This article shows the importance of paying attention to the role of professional devices, such as standardised forms, as producers of normality and deviance in the history of education. Our case study focused on the standardised forms used by teachers during child guidance clinic referrals and transfers to special education in northern Finland,…
A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications
Kuehn, Christian
2013-06-01
Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. Therefore it is highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and to the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three time-scale systems and points out that excitable cells and molecular units have information for subthreshold prediction. In the predator-prey model explosive population growth near a codimension-two bifurcation is investigated and we show that early-warnings from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability which illustrates the effect of multiplicative noise.
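As an illustration of the variance scaling near a fast-subsystem fold bifurcation, linearizing the normal form dx = (μ + x²)dt + σ dW about its stable branch gives an Ornstein-Uhlenbeck process whose stationary variance diverges as μ → 0⁻, the classic early-warning sign. This is a textbook sketch, not the paper's codimension-two computations:

```python
from math import sqrt

def stationary_variance(mu, sigma=0.1):
    """Linearize the fold normal form dx = (mu + x**2) dt + sigma dW
    about its stable equilibrium x* = -sqrt(-mu) (requires mu < 0).
    The linearization is Ornstein-Uhlenbeck with decay rate
    2*sqrt(-mu), so the stationary variance is sigma**2/(4*sqrt(-mu))."""
    assert mu < 0, "the stable branch only exists for mu < 0"
    return sigma ** 2 / (4 * sqrt(-mu))

# variance grows like (-mu)**(-1/2) as the transition is approached
vars_ = [stationary_variance(mu) for mu in (-1.0, -0.1, -0.01)]
```

The increase of this variance along a slow drift of μ toward zero is what detrending-based early-warning indicators try to detect in time series.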
Bernstein Algorithm for Vertical Normalization to 3NF Using Synthesis
Directory of Open Access Journals (Sweden)
Matija Varga
2013-07-01
Full Text Available This paper demonstrates the use of Bernstein algorithm for vertical normalization to 3NF using synthesis. The aim of the paper is to provide an algorithm for database normalization and present a set of steps which minimize redundancy in order to increase the database management efficiency, and specify tests and algorithms for testing and proving the reversibility (i.e., proving that the normalization did not cause loss of information). Using Bernstein algorithm steps, the paper gives examples of vertical normalization to 3NF through synthesis and proposes a test and an algorithm to demonstrate decomposition reversibility. This paper also sets out to explain that the reasons for generating normal forms are to facilitate data search, eliminate data redundancy as well as delete, insert and update anomalies, and explain how anomalies develop using examples.
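The core synthesis step can be sketched as grouping functional dependencies by left-hand side and emitting one 3NF relation scheme per group. A complete Bernstein implementation first computes a minimal (canonical) cover and also merges equivalent keys and preserves a global key, which this toy example omits; the relation and attribute names are invented:

```python
def synthesize_3nf(fds):
    """Minimal sketch of the synthesis step of Bernstein's algorithm:
    group functional dependencies (lhs, rhs) by their left-hand side
    and emit one relation scheme per group, keyed by that lhs."""
    groups = {}
    for lhs, rhs in fds:
        groups.setdefault(frozenset(lhs), set()).update(rhs)
    # each scheme contains its key plus all attributes it determines
    return {key: key | attrs for key, attrs in groups.items()}

# id -> name, city ; (id, course) -> grade
fds = [({'id'}, {'name'}), ({'id'}, {'city'}),
       ({'id', 'course'}, {'grade'})]
schemes = synthesize_3nf(fds)
# two schemes: {id, name, city} and {id, course, grade}
```

Because every scheme embeds a dependency of the cover, the decomposition is dependency-preserving, which is the property the paper's reversibility tests check alongside losslessness.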
Energy Technology Data Exchange (ETDEWEB)
Ellison, James A.; Heinemann, Klaus [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Vogt, Mathias [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Gooden, Matthew [North Carolina State Univ., Raleigh, NC (United States). Dept. of Physics
2013-03-15
We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in
International Nuclear Information System (INIS)
Ellison, James A.; Heinemann, Klaus; Gooden, Matthew
2013-03-01
We present a mathematical analysis of planar motion of energetic electrons moving through a planar dipole undulator, excited by a fixed planar polarized plane wave Maxwell field in the X-Ray FEL regime. Our starting point is the 6D Lorentz system, which allows planar motions, and we examine this dynamical system as the wave length λ of the traveling wave varies. By scalings and transformations the 6D system is reduced, without approximation, to a 2D system in a form for a rigorous asymptotic analysis using the Method of Averaging (MoA), a long time perturbation theory. The two dependent variables are a scaled energy deviation and a generalization of the so-called ponderomotive phase. As λ varies the system passes through resonant and nonresonant (NR) zones and we develop NR and near-to-resonant (NtoR) MoA normal form approximations. The NtoR normal forms contain a parameter which measures the distance from a resonance. For a special initial condition, for the planar motion and on resonance, the NtoR normal form reduces to the well known FEL pendulum system. We then state and prove NR and NtoR first-order averaging theorems which give explicit error bounds for the normal form approximations. We prove the theorems in great detail, giving the interested reader a tutorial on mathematically rigorous perturbation theory in a context where the proofs are easily understood. The proofs are novel in that they do not use a near identity transformation and they use a system of differential inequalities. The NR case is an example of quasiperiodic averaging where the small divisor problem enters in the simplest possible way. To our knowledge the planar problem has not been analyzed with the generality we aspire to here nor has the standard FEL pendulum system been derived with associated error bounds as we do here. We briefly discuss the low gain theory in light of our NtoR normal form. Our mathematical treatment of the noncollective FEL beam dynamics problem in the
Hexavalent Chromium Minimization Strategy
2011-05-01
DoD Logistics Hexavalent Chromium Minimization Initiative: non-chrome primer. Office of the Secretary of Defense, Hexavalent Chromium Minimization Strategy report.
Rautaharju, Pentti M; Mason, Jay W; Akiyama, Toshio
2014-07-01
Existing formulas for rate-corrected QT (QTc) commonly fail to properly adjust the upper normal limits, which are more critical than the mean QTc for evaluation of prolonged QT. Age- and sex-related differences in QTc are also often overlooked. Our goal was to establish criteria for prolonged QTc using formulas that minimize QTc bias at the upper normal limits. Strict criteria were used in selecting a study group of 57,595 persons aged 5 to 89 years (54% women) and to exclude electrocardiograms (ECG) with possible disease-associated changes. Two QT rate adjustment formulas were identified which both minimized rate-dependency in the 98th percentile limits: QTcMod, based on an electrophysiological model (QTcMod = QT × (120 + HR)/180), and QTcLogLin, a power function of the RR interval with exponents 0.37 for men and 0.38 for women. QTc shortened in men during adolescence and QTcMod became 13 ms shorter than in women at age 20-29 years. The sex difference was maintained through adulthood although decreasing with age. The criteria established for prolonged QTc were: Age < 40 years, men 430 ms, women 440 ms; Age 40 to 69, men 440 ms, women 450 ms; Age ≥ 70 years, men 455 ms, and women 460 ms. Sex difference in QTc originates from shortened QT in adolescent males. Upper normal limits for QTc vary substantially by age and sex, and it is essential to use age- and sex-specific criteria for evaluation of QT prolongation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
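The two corrections quoted above are easy to state in code. The RR-exponent form assumes RR is measured in seconds (the usual convention, but an assumption here, as is the conversion RR = 60/HR):

```python
def qtc_mod(qt_ms, hr_bpm):
    """Model-based correction quoted in the abstract:
    QTcMod = QT * (120 + HR) / 180, with QT in ms and HR in bpm."""
    return qt_ms * (120 + hr_bpm) / 180

def qtc_log_lin(qt_ms, hr_bpm, sex='M'):
    """Power-law correction QTc = QT / RR**k, with RR assumed in
    seconds (RR = 60/HR) and exponent 0.37 for men, 0.38 for women."""
    rr = 60.0 / hr_bpm
    k = 0.37 if sex == 'M' else 0.38
    return qt_ms / rr ** k

# At HR = 60 bpm (RR = 1 s) both corrections leave QT unchanged
q1 = qtc_mod(400, 60)       # 400 * 180/180 = 400 ms
q2 = qtc_log_lin(400, 60)   # 400 / 1**0.37 = 400 ms
```

A corrected value would then be compared against the age- and sex-specific thresholds listed in the abstract (e.g., 430 ms for men under 40) rather than a single fixed cutoff.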
Generating All Circular Shifts by Context-Free Grammars in Greibach Normal Form
Asveld, Peter R.J.
2007-01-01
For each alphabet Σn = {a1,a2,…,an}, linearly ordered by a1 < a2 < ⋯ < an, let Cn be the language of circular or cyclic shifts over Σn, i.e., Cn = {a1a2 ⋯ an-1an, a2a3 ⋯ ana1,…,ana1 ⋯ an-2an-1}. We study a few families of context-free grammars Gn (n ≥1) in Greibach normal form such that Gn generates
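The language C_n itself is tiny and can be produced directly. This sketch builds it without a grammar, purely to make the definition concrete; the grammar-size questions the paper studies are not touched here:

```python
def circular_shifts(symbols):
    """C_n: the n cyclic shifts of the word a1 a2 ... an."""
    w = ''.join(symbols)
    return {w[i:] + w[:i] for i in range(len(w))}

C3 = circular_shifts('abc')   # {'abc', 'bca', 'cab'}
```

Since |C_n| = n while |L_n| = n!, grammars for C_n can be far smaller than those for the permutation languages of the preceding papers, which is what makes the Greibach-normal-form construction interesting.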
Holmes, Philip J.
1981-06-01
We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.
International Nuclear Information System (INIS)
Meyer, G.; Hahn, K.; Piepsz, A.; Kolinska, J.; Lepej, J.; Sixt, R.
1998-01-01
Use of technetium-99m labelled mercaptoacetyltriglycine (99mTc-MAG3) simplifies and improves the quantification of renal clearance in children by virtue of its permanent availability, good imaging properties and low radiation exposure. Due to the lack of reference values for 99mTc-MAG3 clearance in children, the Paediatric Task Group of the EANM initiated a multicentre study to evaluate 99mTc-MAG3 clearance values in children with minimal renal disease. One hundred and twenty-five children aged between 12 months and 17 years, classified as renally healthy using defined diagnostic criteria, were included in the study. 99mTc-MAG3 clearance was calculated using an algorithm on the basis of a single blood sample taken at any time between 30 and 40 min after tracer injection. In addition, the absolute 99mTc-MAG3 clearance values were normalized to body surface area. For further evaluation the children were classified into several groups according to age. There was a continuous increase in non-corrected 99mTc-MAG3 clearance values from the age of 1 year up to the age of 17 years (mean value at 8 years: 208±66 ml/min). Normal clearance values for adults were achieved by the age of 8 years. Analysis of the relationship between non-corrected clearance and age yielded a correlation coefficient of r=0.7. When these absolute clearance values were normalized to body surface area, we found nearly constant clearance values for all age groups, with a mean clearance value of 315±114 ml/min/1.73 m². The correlation coefficient for the relationship between normalized clearance and age was r=0.28. In conclusion, the clearance of 99mTc-MAG3 increases continuously throughout childhood into adolescence due to the maturation and growth of the kidney. After normalization of the absolute clearance to body surface area, no correlation between clearance and age could be proven. (orig.)
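The body-surface-area normalization used above can be sketched as follows. The abstract does not state which BSA formula was used; the Du Bois estimate below is one common choice and is an assumption of this illustration:

```python
def bsa_dubois(weight_kg, height_cm):
    """Du Bois body-surface-area estimate (m^2); one common choice,
    assumed here since the abstract does not name a formula."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def normalize_clearance(cl_ml_min, bsa_m2):
    """Scale an absolute clearance to the standard 1.73 m^2 surface area."""
    return cl_ml_min * 1.73 / bsa_m2

print(round(bsa_dubois(70, 170), 2))  # ~1.81 m^2 for a 70 kg, 170 cm adult
```

Normalizing to 1.73 m² is what makes the paediatric clearance values comparable across age groups, as the abstract reports.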
Theory and praxis of map analysis in CHEF part 1: Linear normal form
Energy Technology Data Exchange (ETDEWEB)
Michelotti, Leo; /Fermilab
2008-10-01
This memo begins a series which, put together, could comprise the 'CHEF Documentation Project' if there were such a thing. The first--and perhaps only--three will telegraphically describe theory, algorithms, implementation and usage of the normal form map analysis procedures encoded in CHEF's collection of libraries. [1] This one will begin the sequence by explaining the linear manipulations that connect the Jacobian matrix of a symplectic mapping to its normal form. It is a 'Reader's Digest' version of material I wrote in Intermediate Classical Dynamics (ICD) [2] and randomly scattered across technical memos, seminar viewgraphs, and lecture notes for the past quarter century. Much of its content is old, well known, and in some places borders on the trivial. Nevertheless, completeness requires its inclusion. The primary objective is the 'fundamental theorem' on normalization written on page 8. I plan to describe the nonlinear procedures in a subsequent memo and devote a third to laying out algorithms and lines of code, connecting them with equations written in the first two. Originally this was to be done in one short paper, but I jettisoned that approach after its first section exceeded a dozen pages. The organization of this document is as follows. A brief description of notation is followed by a section containing a general treatment of the linear problem. After the 'fundamental theorem' is proved, two further subsections discuss the generation of equilibrium distributions and the issue of 'phase'. The final major section reviews parameterizations--that is, lattice functions--in two and four dimensions with a passing glance at the six-dimensional version. Appearances to the contrary, for the most part I have tried to restrict consideration to matters needed to understand the code in CHEF's libraries.
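The linear normalization the memo describes can be illustrated in the 2×2 case: for a stable symplectic one-turn matrix, cos μ = Tr(M)/2 gives the tune, and the Courant-Snyder parameters build a change of coordinates under which the matrix is a pure rotation. The numbers below are invented for illustration:

```python
import numpy as np

# One-turn matrix of a stable 2x2 symplectic map (illustrative numbers).
M = np.array([[0.8, 1.2],
              [-0.3, 0.8]])
assert abs(np.linalg.det(M) - 1.0) < 1e-12   # symplectic in 2D <=> det = 1

# Tune (phase advance) from the trace: cos(mu) = Tr(M)/2.
mu = np.arccos(np.trace(M) / 2.0)
s = np.sin(mu)

# Courant-Snyder parameters from
# M = [[cos mu + a sin mu, b sin mu], [-g sin mu, cos mu - a sin mu]].
beta = M[0, 1] / s
alpha = (M[0, 0] - M[1, 1]) / (2.0 * s)
gamma = -M[1, 0] / s
assert abs(gamma - (1 + alpha**2) / beta) < 1e-12  # consistency check

# Normalizing map A sends normal coordinates to physical ones;
# A^-1 M A is then a pure rotation by mu.
A = np.array([[np.sqrt(beta), 0.0],
              [-alpha / np.sqrt(beta), 1.0 / np.sqrt(beta)]])
R = np.linalg.inv(A) @ M @ A
R_expected = np.array([[np.cos(mu), np.sin(mu)],
                       [-np.sin(mu), np.cos(mu)]])
assert np.allclose(R, R_expected)
print(np.round(R, 6))
```

This is exactly the statement, from the linear theory, that in normal coordinates the transfer matrix is a pure rotation.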
Theory and praxis of map analysis in CHEF part 2: Nonlinear normal form
International Nuclear Information System (INIS)
Michelotti, Leo
2009-01-01
This is the second of three memos describing how normal form map analysis is implemented in CHEF. The first (1) explained the manipulations required to assure that initial, linear transformations preserved Poincare invariants, thereby confirming correct normalization of action-angle coordinates. In this one, the transformation will be extended to nonlinear terms. The third, describing how the algorithms were implemented within the software of CHEF's libraries, most likely will never be written. The first section, Section 2, quickly lays out preliminary concepts and relationships. In Section 3, we shall review the perturbation theory - an iterative sequence of transformations that converts a nonlinear mapping into its normal form - and examine the equation which moves calculations from one step to the next. Following that is a section titled 'Interpretation', which identifies connections between the normalized mappings and idealized, integrable, fictitious Hamiltonian models. A final section contains closing comments, some of which may - but probably will not - preview work to be done later. My reasons for writing this memo and its predecessor have already been expressed. (1) To them can be added this: 'black box code' encourages users to proceed with little or no understanding of what it does or how it operates. So far, CHEF has avoided this trap admirably by failing to attract potential users. However, we reached a watershed last year: even I now have difficulty following the software through its maze of operations. Extensions to CHEF's physics functionalities, software upgrades, and even simple maintenance are becoming more difficult than they should. I hope these memos will mark parts of the maze for easier navigation in the future. Despite appearances to the contrary, I tried to include no (or very little) more than the minimum needed to understand what CHEF's nonlinear analysis modules do. As with the first memo, material has been lifted - and modified - from
Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.
2003-01-01
The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for
Directory of Open Access Journals (Sweden)
Min Hyuk Lim
2016-06-01
Background: The oral minimal model is a simple, useful tool for the assessment of β-cell function and insulin sensitivity across the spectrum of glucose tolerance, including normal glucose tolerance (NGT), prediabetes, and type 2 diabetes mellitus (T2DM) in humans. Methods: Plasma glucose, insulin, and C-peptide levels were measured during a 180-minute, 75-g oral glucose tolerance test in 24 Korean subjects with NGT (n=10) and T2DM (n=14). The parameters in the computational model were estimated, and the indexes for insulin sensitivity and β-cell function were compared between the NGT and T2DM groups. Results: The insulin sensitivity index was lower in the T2DM group than the NGT group. The basal index of β-cell responsivity, basal hepatic insulin extraction ratio, and post-glucose challenge hepatic insulin extraction ratio were not different between the NGT and T2DM groups. The dynamic, static, and total β-cell responsivity indexes were significantly lower in the T2DM group than the NGT group. The dynamic, static, and total disposition indexes were also significantly lower in the T2DM group than the NGT group. Conclusion: The oral minimal model can be reproducibly applied to evaluate β-cell function and insulin sensitivity in Koreans.
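The minimal model referred to above is Bergman-type. As a rough illustration of the kind of dynamics it captures, the sketch below integrates glucose and remote insulin action with Euler steps; all parameter values, the meal-appearance pulse Ra(t) and the insulin excursion I(t) are invented for illustration and are not the study's fitted values:

```python
# Euler integration of a Bergman-type glucose minimal model.
# All parameters, Ra(t) and I(t) are ILLUSTRATIVE assumptions only.

def simulate(minutes=180.0, dt=0.1):
    p1, p2, p3 = 0.03, 0.02, 1.0e-5  # glucose effectiveness / insulin-action rates (assumed)
    Gb, Ib, V = 90.0, 10.0, 1.7      # basal glucose (mg/dl), basal insulin (uU/ml), volume (dl/kg)
    G, X, t = Gb, 0.0, 0.0
    history = []
    while t < minutes:
        Ra = 4.0 if t < 30.0 else 0.0         # crude meal-appearance pulse (mg/kg/min)
        I = Ib + (40.0 if t < 60.0 else 0.0)  # crude post-load insulin excursion
        dG = -(p1 + X) * G + p1 * Gb + Ra / V  # glucose dynamics
        dX = -p2 * X + p3 * (I - Ib)           # remote insulin action
        G, X, t = G + dt * dG, X + dt * dX, t + dt
        history.append(G)
    return history

traj = simulate()
print(round(max(traj), 1))  # peak glucose rises above the 90 mg/dl basal
```

The glucose excursion rises with the "meal" and relaxes back as insulin action builds, which is the qualitative behavior the model's sensitivity and responsivity indexes summarize.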
A4 see-saw models and form dominance
International Nuclear Information System (INIS)
Chen, M-C; King, Stephen F.
2009-01-01
We introduce the idea of Form Dominance in the (type I) see-saw mechanism, according to which a particular right-handed neutrino mass eigenstate is associated with a particular physical neutrino mass eigenstate, leading to a form diagonalizable effective neutrino mass matrix. Form Dominance, which allows an arbitrary neutrino mass spectrum, may be regarded as a generalization of Constrained Sequential Dominance, which only allows strongly hierarchical neutrino masses. We consider alternative implementations of the see-saw mechanism in minimal A4 see-saw models and show that such models satisfy Form Dominance, leading to neutrino mass sum rules which predict closely spaced neutrino masses with a normal or inverted neutrino mass ordering. To avoid the partial cancellations inherent in such models we propose Natural Form Dominance, in which a different flavon is associated with each physical neutrino mass eigenstate.
The relative volume growth of minimal submanifolds
DEFF Research Database (Denmark)
Markvorsen, Steen; Palmer, V.
2002-01-01
The volume growth of certain well-defined subsets of minimal submanifolds in Riemannian spaces is compared with the volume growth of balls and spheres in space forms of constant curvature.
International Nuclear Information System (INIS)
Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.
1990-01-01
Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards for cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and electrocardiography at rest and exercise. Group II comprised 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with normal limits developed by Maddahi et al. Low-likelihood and angiographically normal patients may differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with normal limits of uptake and washout of thallium-201 derived from the less heterogeneous group of low-likelihood subjects, which is the preferred normal population for defining normality. (author). 37 refs.; 3 figs; 1 tab
Westinghouse Hanford Company waste minimization actions
International Nuclear Information System (INIS)
Greenhalgh, W.O.
1988-09-01
Companies that generate hazardous waste materials are now required by national regulations to establish a waste minimization program. Accordingly, in FY88 the Westinghouse Hanford Company formed a waste minimization team organization. The purpose of the team is to assist the company in its efforts to minimize the generation of waste, train personnel on waste minimization techniques, document successful waste minimization efforts, track dollar savings realized, and publicize and administer an employee incentive program. A number of significant actions have been successful, resulting in savings of materials and dollars. The team itself has been successful in establishing some worthwhile minimization projects. This document briefly describes the waste minimization actions that have been successful to date. 2 refs., 26 figs., 3 tabs
International Nuclear Information System (INIS)
Przyjalkowski, V V
2008-01-01
We construct an abstract theory of Gromov-Witten invariants of genus 0 for quantum minimal Fano varieties (a minimal class of varieties which is natural from the quantum cohomological viewpoint). Namely, we consider the minimal Gromov-Witten ring: a commutative algebra whose generators and relations are of the form used in the Gromov-Witten theory of Fano varieties (of unspecified dimension). The Gromov-Witten theory of any quantum minimal variety is a homomorphism from this ring to C. We prove an abstract reconstruction theorem which says that this ring is isomorphic to the free commutative ring generated by 'prime two-pointed invariants'. We also find solutions of the differential equation of type DN for a Fano variety of dimension N in terms of the generating series of one-pointed Gromov-Witten invariants
Bioactive form of resveratrol in glioblastoma cells and its safety for normal brain cells
Directory of Open Access Journals (Sweden)
Xiao-Hong Shu
2013-05-01
Background: Resveratrol, a plant polyphenol found in grapes and many other natural foods, possesses a wide range of biological activities including cancer prevention. It has been recognized that resveratrol is intracellularly biotransformed to different metabolites, but no direct evidence has been available to ascertain its bioactive form because of the difficulty of keeping resveratrol unmetabolized in vivo or in vitro. It would therefore be worthwhile to elucidate the potential therapeutic implications of resveratrol metabolism using reliable resveratrol-sensitive cancer cells. Objective: To identify the real biological form of trans-resveratrol and to evaluate the safety of the effective anticancer dose of resveratrol for normal brain cells. Methods: Samples were prepared from the conditioned media and cell lysates of human glioblastoma U251 cells and purified by solid phase extraction (SPE). The samples were subjected to high performance liquid chromatography (HPLC) and liquid chromatography/tandem mass spectrometry (LC/MS) analysis. According to the metabolite(s), trans-resveratrol was biotransformed in vitro by the method described elsewhere, and the resulting solution was used to treat U251 cells. Meanwhile, the responses of U251 cells and primarily cultured rat normal brain cells (glial cells and neurons) to 100 μM trans-resveratrol were evaluated by multiple experimental methods. Results: The results revealed that resveratrol monosulfate was the major metabolite in U251 cells. A mixture containing about one-half resveratrol monosulfate was prepared in vitro, and this trans-resveratrol/resveratrol monosulfate mixture showed little inhibitory effect on U251 cells. It was also found that rat primary brain cells (PBCs) not only resist 100 μM but also tolerate as high as 200 μM resveratrol treatment. Conclusions: Our study thus demonstrated that trans-resveratrol was the bioactive form in glioblastoma cells and, therefore, the biotransforming
Normal form analysis of linear beam dynamics in a coupled storage ring
International Nuclear Information System (INIS)
Wolski, Andrzej; Woodley, Mark D.
2004-01-01
The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed
Optimization of accelerator parameters using normal form methods on high-order transfer maps
Energy Technology Data Exchange (ETDEWEB)
Snopok, Pavel [Michigan State Univ., East Lansing, MI (United States)
2007-05-01
Methods of analysis of the dynamics of ensembles of charged particles in collider rings are developed. The following problems are posed and solved using normal form transformations and other methods of perturbative nonlinear dynamics: (1) Optimization of the Tevatron dynamics: (a) Skew quadrupole correction of the dynamics of particles in the Tevatron in the presence of the systematic skew quadrupole errors in dipoles; (b) Calculation of the nonlinear tune shift with amplitude based on the results of measurements and the linear lattice information; (2) Optimization of the Muon Collider storage ring: (a) Computation and optimization of the dynamic aperture of the Muon Collider 50 x 50 GeV storage ring using higher order correctors; (b) 750 x 750 GeV Muon Collider storage ring lattice design matching the Tevatron footprint. The normal form coordinates have a very important advantage over the particle optical coordinates: if the transformation can be carried out successfully (general restrictions for that are not much stronger than the typical restrictions imposed on the behavior of the particles in the accelerator), then the motion in the new coordinates has a very clean representation allowing one to extract more information about the dynamics of particles, and they are very convenient for the purposes of visualization. All the problem formulations include the derivation of the objective functions, which are later used in the optimization process using various optimization algorithms. Algorithms used to solve the problems are specific to collider rings, and applicable to similar problems arising on other machines of the same type. The details of the long-term behavior of the systems are studied to ensure their stability for the desired number of turns. The algorithm of the normal form transformation is of great value for such problems as it gives much extra information about the disturbing factors. In addition to the fact that the dynamics of particles is represented
Directory of Open Access Journals (Sweden)
Adam Karbowski
2017-09-01
The purpose of this study is to explore the link between imagine-self perspective-taking and rational self-interested behavior in experimental normal-form games. Drawing on the concept of sympathy developed by Adam Smith and further literature on perspective-taking in games, we hypothesize that introduction of imagine-self perspective-taking by decision-makers promotes rational self-interested behavior in a simple experimental normal-form game. In our study, we examined behavior of 404 undergraduate students in the two-person game, in which the participant can suffer a monetary loss only if she plays her Nash equilibrium strategy and the opponent plays her dominated strategy. Results suggest that the threat of suffering monetary losses effectively discourages the participants from choosing Nash equilibrium strategy. In general, players may take into account that opponents choose dominated strategies due to specific not self-interested motivations or errors. However, adopting imagine-self perspective by the participants leads to more Nash equilibrium choices, perhaps by alleviating participants’ attributions of susceptibility to errors or non-self-interested motivation to the opponents.
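The notions of best response and pure-strategy Nash equilibrium used above can be sketched on a small bimatrix game. The payoff matrix below is invented for illustration (it is not the game used in the study):

```python
# Pure-strategy Nash check for a two-player normal-form game.
# payoffs[i][j] = (row player's payoff, column player's payoff); the
# matrix is an ILLUSTRATIVE prisoners'-dilemma-like game, not the study's.

payoffs = [[(3, 3), (-1, 4)],
           [(4, -1), (0, 0)]]

def pure_nash(payoffs):
    rows, cols = len(payoffs), len(payoffs[0])
    eq = []
    for i in range(rows):
        for j in range(cols):
            u_r, u_c = payoffs[i][j]
            # (i, j) is an equilibrium iff neither player can gain by deviating.
            best_r = all(payoffs[k][j][0] <= u_r for k in range(rows))
            best_c = all(payoffs[i][k][1] <= u_c for k in range(cols))
            if best_r and best_c:
                eq.append((i, j))
    return eq

print(pure_nash(payoffs))  # [(1, 1)]: mutual defection is the unique pure equilibrium
```

In the study's game the interesting twist is that the equilibrium strategy exposes the player to a loss whenever the opponent irrationally plays a dominated strategy, which a check like this one cannot capture on its own.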
International Nuclear Information System (INIS)
Avendaño-Camacho, M; Vallejo, J A; Vorobjev, Yu
2013-01-01
We study the determination of the second-order normal form for perturbed Hamiltonians relative to the periodic flow of the unperturbed Hamiltonian H0. The formalism presented here is global, and can be easily implemented in any computer algebra system. We illustrate it by means of two examples: the Hénon–Heiles and the elastic pendulum Hamiltonians. (paper)
Carpenter, Donald A.
2008-01-01
Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
Plateau inflation from random non-minimal coupling
International Nuclear Information System (INIS)
Broy, Benedict J.; Roest, Diederik
2016-06-01
A generic non-minimal coupling can push any higher-order terms of the scalar potential sufficiently far out in field space to yield observationally viable plateau inflation. We provide analytic and numerical evidence that this generically happens for a non-minimal coupling strength ξ of the order N_e^2. In this regime, the non-minimally coupled field is sub-Planckian during inflation and is thus protected from most higher-order terms. For larger values of ξ, the inflationary predictions converge towards the sweet spot of Planck. The latter includes ξ ≅ 10^4 obtained from CMB normalization arguments, thus providing a natural explanation for the inflationary observables measured.
Scheduling stochastic two-machine flow shop problems to minimize expected makespan
Directory of Open Access Journals (Sweden)
Mehdi Heydari
2013-07-01
During the past few years, despite tremendous contributions on the deterministic flow shop problem, only a limited number of works have been dedicated to stochastic cases. This paper examines stochastic scheduling problems in a two-machine flow shop environment for expected makespan minimization, where processing times of jobs are normally distributed. Since jobs have stochastic processing times, to minimize the expected makespan, the expected sum of the second machine's free times is minimized. In other words, by minimizing the waiting times of the second machine, it is possible to reach the minimum of the objective function. A mathematical method is proposed which utilizes the properties of normal distributions. Furthermore, this method can be used as a heuristic for other distributions, as long as the means and variances are available. The performance of the proposed method is explored using some numerical examples.
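The objective being minimized can be sketched with a Monte Carlo estimate: the classic two-machine flow shop recursion gives the makespan of one realization, and averaging over normally distributed processing times estimates the expectation for a fixed job sequence. The means, variances and the truncation at zero below are illustrative assumptions, not the paper's method:

```python
import random

def makespan(p1, p2):
    """Classic two-machine flow shop recursion for one realization."""
    c1 = c2 = 0.0
    for a, b in zip(p1, p2):
        c1 += a                  # job finishes on machine 1
        c2 = max(c2, c1) + b     # machine 2 waits if the job is not ready
    return c2

def expected_makespan(jobs, n=5000, seed=1):
    """Monte Carlo estimate; processing times ~ Normal, truncated at 0 (assumed)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        p1 = [max(0.0, rng.gauss(m, s)) for (m, s), _ in jobs]
        p2 = [max(0.0, rng.gauss(m, s)) for _, (m, s) in jobs]
        total += makespan(p1, p2)
    return total / n

# jobs: ((mean, sd) on machine 1, (mean, sd) on machine 2); illustrative data.
jobs = [((2.0, 0.5), (4.0, 0.5)), ((3.0, 0.5), (3.0, 0.5)), ((4.0, 0.5), (2.0, 0.5))]
print(round(expected_makespan(jobs), 1))
```

With the deterministic means alone this sequence has makespan 11; the stochastic expectation is slightly larger because the max operations in the recursion penalize variability.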
THE METHOD OF CONSTRUCTING A BOOLEAN FORMULA OF A POLYGON IN THE DISJUNCTIVE NORMAL FORM
Directory of Open Access Journals (Sweden)
A. A. Butov
2014-01-01
The paper focuses on finalizing the method of finding a polygon's Boolean formula in disjunctive normal form, described in the previous article [1]. The improved method eliminates the drawback associated with the existence of a class of problems for which the solution is only approximate. The proposed method always allows an exact solution to be found. The method can be used, in particular, in systems for computer-aided design of integrated-circuit topology.
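The idea of a polygon as a Boolean formula in DNF can be illustrated as follows: each convex piece is a conjunction (AND) of half-plane literals, and the polygon is the disjunction (OR) of its pieces. The decomposition and coordinates below are my illustration, not the paper's construction:

```python
# A polygon decomposed into convex parts becomes a DNF formula:
# each part is a conjunction of half-plane literals, the polygon their disjunction.

def half_plane(a, b, c):
    """Literal: a*x + b*y + c >= 0."""
    return lambda x, y: a * x + b * y + c >= 0

def conj(*lits):
    return lambda x, y: all(l(x, y) for l in lits)

def disj(*terms):
    return lambda x, y: any(t(x, y) for t in terms)

# An L-shaped polygon = union of two axis-aligned rectangles (illustrative).
rect1 = conj(half_plane(1, 0, 0), half_plane(-1, 0, 2),   # 0 <= x <= 2
             half_plane(0, 1, 0), half_plane(0, -1, 1))   # 0 <= y <= 1
rect2 = conj(half_plane(1, 0, 0), half_plane(-1, 0, 1),   # 0 <= x <= 1
             half_plane(0, 1, 0), half_plane(0, -1, 2))   # 0 <= y <= 2
l_shape = disj(rect1, rect2)

print(l_shape(0.5, 1.5))  # True: inside the vertical arm
print(l_shape(1.5, 1.5))  # False: in the notch
```

Evaluating the formula at a point is then a point-in-polygon test, which is the kind of predicate a CAD system for IC topology needs to evaluate quickly.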
High molecular gas fractions in normal massive star-forming galaxies in the young Universe.
Tacconi, L J; Genzel, R; Neri, R; Cox, P; Cooper, M C; Shapiro, K; Bolatto, A; Bouché, N; Bournaud, F; Burkert, A; Combes, F; Comerford, J; Davis, M; Schreiber, N M Förster; Garcia-Burillo, S; Gracia-Carpio, J; Lutz, D; Naab, T; Omont, A; Shapley, A; Sternberg, A; Weiner, B
2010-02-11
Stars form from cold molecular interstellar gas. As this is relatively rare in the local Universe, galaxies like the Milky Way form only a few new stars per year. Typical massive galaxies in the distant Universe formed stars an order of magnitude more rapidly. Unless star formation was significantly more efficient, this difference suggests that young galaxies were much more molecular-gas rich. Molecular gas observations in the distant Universe have so far largely been restricted to very luminous, rare objects, including mergers and quasars, and accordingly we do not yet have a clear idea about the gas content of more normal (albeit massive) galaxies. Here we report the results of a survey of molecular gas in samples of typical massive-star-forming galaxies at mean redshifts of about 1.2 and 2.3, when the Universe was respectively 40% and 24% of its current age. Our measurements reveal that distant star forming galaxies were indeed gas rich, and that the star formation efficiency is not strongly dependent on cosmic epoch. The average fraction of cold gas relative to total galaxy baryonic mass at z = 2.3 and z = 1.2 is respectively about 44% and 34%, three to ten times higher than in today's massive spiral galaxies. The slow decrease between z approximately 2 and z approximately 1 probably requires a mechanism of semi-continuous replenishment of fresh gas to the young galaxies.
Characterisation of minimal-span plane Couette turbulence with pressure gradients
Sekimoto, Atsushi; Atkinson, Callum; Soria, Julio
2018-04-01
The turbulence statistics and dynamics in spanwise-minimal plane Couette flow with pressure gradients, the so-called Couette-Poiseuille (C-P) flow, are investigated using direct numerical simulation. The large-scale motion is limited by the spanwise box dimension, as in the minimal-span channel turbulence of Flores & Jiménez (Phys. Fluids, vol. 22, 2010, 071704). The effect of the top wall, where normal pressure-driven Poiseuille flow is realised, is distinguished from the events on the bottom wall, where the pressure gradient results in mild or almost-zero wall-shear stress. A proper scaling of turbulence statistics in minimal-span C-P flows is presented. The 'shear-less' wall-bounded turbulence, where the Corrsin shear parameter is very weak compared to normal wall-bounded turbulence, exhibits local separation, which is also observed as spanwise streaks of reversed flow in full-size plane C-P turbulence. The local separation is a multi-scale event, which grows up to the order of the channel height even in the minimal-span geometry.
Black-Litterman model on non-normal stock return (Case study four banks at LQ-45 stock index)
Mahrivandi, Rizki; Noviyanti, Lienda; Setyanto, Gatot Riwi
2017-03-01
The formation of an optimal portfolio is a method that can help investors minimize risk and optimize profitability. One model for the optimal portfolio is the Black-Litterman (BL) model. The BL model can incorporate historical data and the views of investors to form a new prediction about portfolio returns as a basis for preparing asset weighting models. The BL model has two fundamental problems: the assumption of normality, and the estimation of the parameters of the Bayesian market prior when returns do not follow a normal distribution. This study provides an alternative solution in which the BL model's stock returns and investor views are modelled with a non-normal distribution.
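Under the usual normality assumption (the very assumption this study relaxes), the BL posterior mean combines the equilibrium returns with the investors' views; a minimal numerical sketch, with all numbers invented for illustration:

```python
import numpy as np

# Black-Litterman posterior mean under the standard normality assumption.
# Sigma, weights, views and scalars below are ILLUSTRATIVE only.

Sigma = np.array([[0.04, 0.01],
                  [0.01, 0.09]])   # prior covariance of two assets
w_mkt = np.array([0.6, 0.4])       # market-cap weights
delta, tau = 2.5, 0.05             # risk aversion, prior-uncertainty scaling

Pi = delta * Sigma @ w_mkt         # implied equilibrium excess returns

P = np.array([[1.0, -1.0]])        # one view: asset 1 outperforms asset 2 ...
Q = np.array([0.02])               # ... by 2%
Omega = np.array([[0.001]])        # view uncertainty

# Posterior mean: [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 Pi + P' Omega^-1 Q]
A = np.linalg.inv(tau * Sigma)
posterior = np.linalg.solve(A + P.T @ np.linalg.inv(Omega) @ P,
                            A @ Pi + P.T @ np.linalg.inv(Omega) @ Q)
print(posterior.shape)  # (2,)
```

The posterior spread between the two assets lands strictly between the prior-implied spread and the stated view, weighted by their respective precisions.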
Research Concerning Minimizing Vibrations when Machining on a Normal Lathe
Directory of Open Access Journals (Sweden)
Lenuța Cîndea
2015-09-01
In the cutting process the appearance of vibration is inevitable, and in situations where the amplitude exceeds the limits of dimensional and shape precision of the generated surfaces, the vibratory phenomenon is detrimental. Vibration is an increasingly developed field of study, so the future will bring a better understanding of vibrations and their use even in other sectors. The paper develops the experimental measurement of vibrations in normal lathe machining. It describes the kinematic scheme of the machine tool, the cutting tool and the cutting conditions, and presents the experimental facility for measuring the vibrations occurring in turning. The experimental results cover measurement of the amplitude occurring during interior turning with a tool without an incorporated damper. The tests were performed continuously for different speeds, feeds and depths of cut.
Statistically Efficient Construction of α-Risk-Minimizing Portfolio
Directory of Open Access Journals (Sweden)
Hiroyuki Taniai
2012-01-01
Full Text Available We propose a semiparametrically efficient estimator for α-risk-minimizing portfolio weights. Based on the work of Bassett et al. (2004), an α-risk-minimizing portfolio optimization is formulated as a linear quantile regression problem. The quantile regression method uses a pseudolikelihood based on an asymmetric Laplace reference density, and asymptotic properties such as consistency and asymptotic normality are obtained. We apply the results of Hallin et al. (2008) to the problem of constructing α-risk-minimizing portfolios using residual signs and ranks and a general reference density. Monte Carlo simulations assess the performance of the proposed method. Empirical applications are also investigated.
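The quantile-regression formulation rests on the pinball (check) loss. A minimal pure-Python sketch, with made-up return data and a grid search that is purely illustrative (not the estimator of the paper), showing that minimizing the sample pinball loss over a constant recovers an empirical α-quantile:

```python
def pinball_loss(u, alpha):
    """Check function rho_alpha(u) = u * (alpha - 1{u < 0})."""
    return u * (alpha - (1.0 if u < 0 else 0.0))

def sample_quantile_via_pinball(data, alpha):
    """Return the data point minimizing total pinball loss: a sample alpha-quantile."""
    return min(data, key=lambda q: sum(pinball_loss(x - q, alpha) for x in data))

# Hypothetical daily portfolio returns.
returns = [-0.03, -0.01, 0.00, 0.01, 0.02, 0.04, 0.05, 0.07, 0.08, 0.10]
var_10 = sample_quantile_via_pinball(returns, 0.10)   # roughly the 10% quantile
median = sample_quantile_via_pinball(returns, 0.50)
```

In the portfolio setting the constant is replaced by a linear combination of asset returns, turning the same loss into a linear quantile regression over the weights.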
Correction of Bowtie-Filter Normalization and Crescent Artifacts for a Clinical CBCT System.
Zhang, Hong; Kong, Vic; Huang, Ke; Jin, Jian-Yue
2017-02-01
To present our experiences in understanding and minimizing bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a clinical cone beam computed tomography system. Bowtie-filter position and profile variations during gantry rotation were studied. Two previously proposed strategies (A and B) were applied to the clinical cone beam computed tomography system to correct bowtie-filter crescent artifacts. Physical calibration and analytical approaches were used to minimize the norm phantom misalignment and to correct for bowtie-filter normalization artifacts. A combined procedure to reduce bowtie-filter crescent artifacts and bowtie-filter normalization artifacts was proposed, tested on a norm phantom, a CatPhan, and a patient, and evaluated using the standard deviation of Hounsfield units along a sampling line. The bowtie-filter exhibited not only a translational shift but also an amplitude variation in its projection profile during gantry rotation. Strategy B was slightly better than strategy A in minimizing bowtie-filter crescent artifacts, possibly because it corrected the amplitude variation, suggesting that the amplitude variation plays a role in bowtie-filter crescent artifacts. The physical calibration largely reduced the misalignment-induced bowtie-filter normalization artifacts, and the analytical approach reduced them further. The combined procedure minimized both artifact types, with Hounsfield unit standard deviations of 63.2, 45.0, 35.0, and 18.8 HU for correction of none, bowtie-filter crescent artifacts, bowtie-filter normalization artifacts, and both combined, respectively. The combined procedure also demonstrated reduction of bowtie-filter crescent artifacts and bowtie-filter normalization artifacts in a CatPhan and a patient. We have developed a step
International Nuclear Information System (INIS)
Cho, Yong Zun; Kim, In Tae; Park, Hwan Seo; Ahn, Byeung Gil; Eun, Hee Chul; Son, Seock Mo; Ah, Su Na
2011-12-01
The purpose of this project is to develop new high level waste (HLW) forms and fabrication processes to dispose of active metal fission products that are removed from electrorefiner salts in the pyroprocessing based fuel cycle. The current technology for disposing of active metal fission products in pyroprocessing involves non-selectively discarding fission product loaded salt in a glass-bonded sodalite ceramic waste form. Selective removal of fission products from the molten salt would greatly minimize the amount of HLW generated, and methods were developed to achieve selective separation of fission products during a previous I-NERI research project (I-NERI 2006-002-K). This I-NERI project proceeds from the previous project with the development of suitable waste forms to immobilize the separated fission products. The Korea Atomic Energy Research Institute (KAERI) has focused primarily on developing these waste forms using surrogate waste materials, while the Idaho National Laboratory (INL) has demonstrated fabrication of these waste forms using radioactive electrorefiner salts in hot cell facilities available at INL. Testing and characterization of these radioactive materials was also performed to determine the physical, chemical, and durability properties of the waste forms.
Directory of Open Access Journals (Sweden)
Daniel Ventura
2010-01-01
Full Text Available The lambda-calculus with de Bruijn indices assembles each alpha-class of lambda-terms in a unique term, using indices instead of variable names. Intersection types provide finitary type polymorphism and can characterise normalisable lambda-terms through the property that a term is normalisable if and only if it is typeable. To be closer to computations and to simplify the formalisation of the atomic operations involved in beta-contractions, several calculi of explicit substitution were developed mostly with de Bruijn indices. Versions of explicit substitutions calculi without types and with simple type systems are well investigated in contrast to versions with more elaborate type systems such as intersection types. In previous work, we introduced a de Bruijn version of the lambda-calculus with an intersection type system and proved that it preserves subject reduction, a basic property of type systems. In this paper a version with de Bruijn indices of an intersection type system originally introduced to characterise principal typings for beta-normal forms is presented. We present the characterisation in this new system and the corresponding versions for the type inference and the reconstruction of normal forms from principal typings algorithms. We briefly discuss the failure of the subject reduction property and some possible solutions for it.
Classical strings and minimal surfaces
International Nuclear Information System (INIS)
Urbantke, H.
1986-01-01
Real Lorentzian forms of some complex or complexified Euclidean minimal surfaces are obtained as an application of H.A. Schwarz' solution to the initial value problem or a search for surfaces admitting a group of Poincare transformations. (Author)
Energy Technology Data Exchange (ETDEWEB)
Lee, E.T.
1983-01-01
Algorithms for the construction of the Chomsky and Greibach normal forms for a fuzzy context-free grammar using the algebraic approach are presented and illustrated by examples. The results obtained in this paper may have useful applications in fuzzy languages, pattern recognition, information storage and retrieval, artificial intelligence, database and pictorial information systems. 16 references.
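As an illustration of one core step of the Chomsky-normal-form construction, the sketch below binarizes long productions of a crisp (non-fuzzy) toy grammar; the fuzzy version treated in the paper would additionally carry a membership grade with each production. The grammar and fresh-symbol naming are assumptions for the example:

```python
def binarize(grammar):
    """Split productions with more than two right-hand-side symbols into
    chains of binary rules, a core step of Chomsky normal form conversion.
    grammar: dict mapping a nonterminal to a list of tuples of symbols."""
    out = {}
    counter = [0]

    def add(head, body):
        out.setdefault(head, []).append(tuple(body))

    def fresh():
        counter[0] += 1
        return f"_X{counter[0]}"   # fresh auxiliary nonterminal

    for head, bodies in grammar.items():
        for body in bodies:
            body, h = list(body), head
            while len(body) > 2:
                nt = fresh()
                add(h, [body[0], nt])   # peel off the leftmost symbol
                body, h = body[1:], nt
            add(h, body)
    return out

# Toy grammar: S -> A B C D becomes S -> A _X1, _X1 -> B _X2, _X2 -> C D.
g = {"S": [("A", "B", "C", "D")]}
cnf_like = binarize(g)
```

A full CNF conversion would also replace terminals inside long rules and eliminate unit and empty productions; those steps are omitted here for brevity.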
International Nuclear Information System (INIS)
Yoon, Hyun Gi; Choi, Jung Woon; Yoon, Ju Hyeon; Chi, Dae Young
2012-01-01
In many research reactors, a hot water layer system (HWLS) is used to minimize the pool-top radiation level. During normal operation the reactor pool is divided into the hot water layer at the upper part of the pool and the colder water below it. Mixing between these layers is minimized because the hot water layer is formed above the cold water; the hot water layer therefore suppresses flotation of cold water and reduces the pool-top radiation level. Pool water evaporates from the surface into the building hall because of the high temperature of the hot water layer, so the pool level falls continuously and make-up water is necessary to maintain the normal pool level. There are two ways to supply demineralized water to the pool: continuous and intermittent. In this system design, the continuous water make-up method is adopted to minimize the disturbance of the reactor pool flow. Also, the demineralized water make-up is connected to the suction line of the hot water layer system to raise the temperature of the make-up water. In conclusion, make-up demineralized water at high temperature is continuously supplied to the hot water layer in the pool.
Strong Bayesian evidence for the normal neutrino hierarchy
Energy Technology Data Exchange (ETDEWEB)
Simpson, Fergus; Jimenez, Raul; Verde, Licia [ICCUB, University of Barcelona (UB-IEEC), Marti i Franques 1, Barcelona, 08028 (Spain); Pena-Garay, Carlos, E-mail: fergus2@gmail.com, E-mail: raul.jimenez@icc.ub.edu, E-mail: penagaray@gmail.com, E-mail: liciaverde@icc.ub.edu [I2SysBio, CSIC-UVEG, P.O. 22085, Valencia, 46071 (Spain)
2017-06-01
The configuration of the three neutrino masses can take two forms, known as the normal and inverted hierarchies. We compute the Bayesian evidence associated with these two hierarchies. Previous studies found a mild preference for the normal hierarchy, and this was driven by the asymmetric manner in which cosmological data has confined the available parameter space. Here we identify the presence of a second asymmetry, which is imposed by data from neutrino oscillations. By combining constraints on the squared-mass splittings [1] with the limit on the sum of neutrino masses of Σ m {sub ν} < 0.13 eV [2], and using a minimally informative prior on the masses, we infer odds of 42:1 in favour of the normal hierarchy, which is classified as 'strong' in the Jeffreys' scale. We explore how these odds may evolve in light of higher precision cosmological data, and discuss the implications of this finding with regards to the nature of neutrinos. Finally the individual masses are inferred to be m {sub 1}=3.80{sup +26.2}{sub -3.73}meV; m {sub 2}=8.8{sup +18}{sub -1.2}meV; m {sub 3}=50.4{sup +5.8}{sub -1.2}meV (95% credible intervals).
Exact scaling solutions in normal and Brans-Dicke models of dark energy
International Nuclear Information System (INIS)
Arias, Olga; Gonzalez, Tame; Leyva, Yoelsy; Quiros, Israel
2003-01-01
A linear relationship between the Hubble expansion parameter and the time derivative of the scalar field is explored in order to derive exact cosmological, attractor-like solutions, both in Einstein's theory and in Brans-Dicke gravity with two fluids: a background fluid of ordinary matter and a self-interacting scalar-field fluid accounting for the dark energy in the universe. A priori assumptions about the functional form of the self-interaction potential or about the scale factor behaviour are not necessary. These are obtained as outputs of the assumed relationship between the Hubble parameter and the time derivative of the scalar field. A parametric class of scaling quintessence models given by a self-interaction potential of a peculiar form, a combination of exponentials with dependence on the barotropic index of the background fluid, arises. Both normal quintessence described by a self-interacting scalar field minimally coupled to gravity and Brans-Dicke quintessence given by a non-minimally coupled scalar field are then analysed and the relevance of these models for the description of the cosmic evolution is discussed in some detail. The stability of these solutions is also briefly commented on
Normalization Of Thermal-Radiation Form-Factor Matrix
Tsuyuki, Glenn T.
1994-01-01
Report describes algorithm that adjusts form-factor matrix in TRASYS computer program, which calculates intraspacecraft radiative interchange among various surfaces and environmental heat loading from sources such as sun.
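The TRASYS algorithm itself is not detailed in this record; as an illustration of the underlying idea, a radiative form-factor (view-factor) matrix for a closed enclosure must satisfy closure (each row sums to 1), and a simple correction is to rescale each row. This sketch assumes plain row normalization, not the actual TRASYS adjustment:

```python
def normalize_form_factors(F):
    """Rescale each row of a view-factor matrix so its row sum equals 1 (closure).
    F[i][j] ~ fraction of radiation leaving surface i that arrives at surface j."""
    out = []
    for row in F:
        s = sum(row)
        if s <= 0:
            raise ValueError("row sum must be positive")
        out.append([x / s for x in row])
    return out

# Slightly inconsistent computed factors for a 3-surface enclosure.
F_raw = [[0.00, 0.52, 0.46],
         [0.49, 0.00, 0.53],
         [0.51, 0.47, 0.00]]
F = normalize_form_factors(F_raw)
```

A more careful adjustment would also enforce the reciprocity relation A_i F_ij = A_j F_ji between surface pairs, which plain row scaling does not guarantee.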
International Nuclear Information System (INIS)
Barrett, S.F.; Tarone, R.E.; Moshell, A.N.; Ganges, M.B.; Robbins, J.H.
1981-01-01
In xeroderma pigmentosum, an inherited disorder of defective DNA repair, post-uv colony-forming ability of fibroblasts from patients in complementation groups A through F correlates with the patients' neurological status. The first xeroderma pigmentosum patient assigned to the recently discovered group G had the neurological abnormalities of XP. Researchers have determined the post-uv colony-forming ability of cultured fibroblasts from this patient and from 5 more control donors. Log-phase fibroblasts were irradiated with 254 nm uv light from a germicidal lamp, trypsinized, and replated at known densities. After 2 to 4 weeks' incubation the cells were fixed, stained and scored for colony formation. The strains' post-uv colony-forming ability curves were obtained by plotting the log of the percent remaining post-uv colony-forming ability as a function of the uv dose. The post-uv colony-forming ability of 2 of the 5 new normal strains was in the previously defined control donor zone, but that of the other 3 extended down to the level of the most resistant xeroderma pigmentosum strain. The post-uv colony-forming ability curve of the group G fibroblasts was not significantly different from the curves of the group D fibroblast strains from patients with clinical histories similar to that of the group G patient
Bicervical normal uterus with normal vagina | Okeke | Annals of ...
African Journals Online (AJOL)
To the best of our knowledge, only few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior‑posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events ...
Minimal solution of linear formed fuzzy matrix equations
Directory of Open Access Journals (Sweden)
Maryam Mosleh
2012-10-01
Full Text Available In this paper, according to the structured element method, the $m\times n$ inconsistent fuzzy matrix equation $A\tilde{X}=\tilde{B}$, which is linear and formed by a fuzzy structured element, is investigated. The necessary and sufficient condition for the existence of a fuzzy solution is also discussed. Some examples are presented to illustrate the proposed method.
On the isoperimetric rigidity of extrinsic minimal balls
DEFF Research Database (Denmark)
Markvorsen, Steen; Palmer, V.
2003-01-01
We consider an m-dimensional minimal submanifold P and a metric R-sphere in the Euclidean space R-n. If the sphere has its center p on P, then it will cut out a well defined connected component of P which contains this center point. We call this connected component an extrinsic minimal R-ball of P. The quotient of the volume of the extrinsic ball and the volume of its boundary is not larger than the corresponding quotient obtained in the space form standard situation, where the minimal submanifold is the totally geodesic linear subspace R-m. Here we show that if the minimal submanifold has dimension larger than 3, if P is not too curved along the boundary of an extrinsic minimal R-ball, and if the inequality alluded to above is an equality for the extrinsic minimal ball, then the minimal submanifold is totally geodesic.
Isoperimetric inequalities for minimal graphs
International Nuclear Information System (INIS)
Pacelli Bessa, G.; Montenegro, J.F.
2007-09-01
Based on Markvorsen and Palmer's work on mean exit time and isoperimetric inequalities, we establish slightly better isoperimetric inequalities and mean exit time estimates for minimal graphs in N x R. We also prove isoperimetric inequalities for submanifolds of Hadamard spaces with tamed second fundamental form. (author)
Improved water chemistry controls for minimizing degradation of materials
International Nuclear Information System (INIS)
Sawochka, S.G.
1986-01-01
The Electric Power Research Institute and the Steam Generator Owners Group have sponsored several efforts to develop secondary water chemistry guidelines to minimize pressurized water reactor (PWR) steam generator tubing degradation. To develop these guidelines, chemical species known to accelerate corrosion of Alloy 600 were identified, and values for normal and abnormal chemistry situations were established. For example, sodium hydroxide was known to accelerate Alloy 600 intergranular attack stress corrosion cracking; thus, guidelines were developed for blowdown sodium concentrations in recirculating steam generator systems. Similarly, formation of acidic solutions, particularly as a result of chloride ingress at seawater sites, was known to accelerate denting; thus, chloride guidelines were established. A blowdown cation conductivity limit was established to minimize concentrations of other anionic species. Guidelines also were developed for condensate and feedwater chemistry to minimize general corrosion of system materials, thereby minimizing sludge and deposit buildup in the steam generators
P-nflation: generating cosmic Inflation with p-forms
Energy Technology Data Exchange (ETDEWEB)
Germani, Cristiano [LUTH, Observatoire de Paris, CNRS UMR 8102, Universite Paris Diderot, 5 Place Jules Janssen, 92195 Meudon Cedex (France); Kehagias, Alex, E-mail: cristiano.germani@obspm.fr, E-mail: kehagias@central.ntua.gr [Department of Physics, National Technical University of Athens, GR-15773, Zografou, Athens (Greece)
2009-03-15
We show that an inflationary background might be realized by using any p-form non-minimally coupled to gravity. Standard scalar field inflation corresponds to the 0-form case and vector inflation to the 1-form. Moreover, we show that the 2- and 3-form fields are dual to new vector and scalar inflationary theories in which the kinetic terms are non-minimally coupled to gravity.
A negentropy minimization approach to adaptive equalization for digital communication systems.
Choi, Sooyong; Lee, Te-Won
2004-07-01
In this paper, we introduce and investigate a new adaptive equalization method based on minimizing the approximate negentropy of the estimation error for a finite-length equalizer. We consider an approximate negentropy, using nonpolynomial expansions of the estimation error, as a new performance criterion to improve on a linear equalizer based on minimizing the minimum mean squared error (MMSE). Negentropy includes higher-order statistical information, and its minimization provides improved convergence, performance, and accuracy compared with traditional methods such as MMSE in terms of bit error rate (BER). The proposed negentropy minimization (NEGMIN) equalizer has two kinds of solutions, the MMSE solution and another one, depending on the ratio of the normalization parameters. The NEGMIN equalizer has the best BER performance when the ratio of the normalization parameters is properly adjusted to maximize the output power (variance) of the NEGMIN equalizer. Simulation experiments show that the BER performance of the NEGMIN equalizer with the non-MMSE solution has similar characteristics to the adaptive minimum bit error rate (AMBER) equalizer. The main advantage of the proposed equalizer is that it needs significantly fewer training symbols than the AMBER equalizer. Furthermore, the proposed equalizer is more robust to nonlinear distortions than the MMSE equalizer.
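The MMSE baseline that the paper improves upon can be approximated adaptively by the classical LMS algorithm. A minimal sketch on a toy two-tap ISI channel with BPSK training symbols (channel taps, step size, and lengths are illustrative assumptions, not the paper's setup):

```python
import random

def lms_equalizer(received, symbols, n_taps=5, mu=0.02):
    """Adapt FIR equalizer taps by LMS to minimize mean squared error."""
    w = [0.0] * n_taps
    sq_err = []
    for n in range(n_taps - 1, len(received)):
        x = [received[n - k] for k in range(n_taps)]    # tapped delay line
        y = sum(wi * xi for wi, xi in zip(w, x))        # equalizer output
        e = symbols[n] - y                              # training error
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        sq_err.append(e * e)
    return w, sq_err

random.seed(0)
s = [random.choice([-1.0, 1.0]) for _ in range(2000)]   # BPSK training symbols
h = [1.0, 0.3]                                          # toy ISI channel
r = [s[n] + h[1] * (s[n - 1] if n > 0 else 0.0) for n in range(len(s))]
w, sq_err = lms_equalizer(r, s)
```

After convergence the taps approximate the truncated channel inverse (1, -0.3, 0.09, ...); the NEGMIN criterion of the paper replaces the squared-error cost with an approximate-negentropy one while keeping this adaptive structure.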
Denotational Aspects of Untyped Normalization by Evaluation
DEFF Research Database (Denmark)
Filinski, Andrzej; Rohde, Henning Korsholm
2005-01-01
of soundness (the output term, if any, is in normal form and ß-equivalent to the input term); identification (ß-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness
Clonal status and clinicopathological observation of cervical minimal deviation adenocarcinoma
Directory of Open Access Journals (Sweden)
Lan Miao
2010-04-01
Full Text Available Abstract Background Minimal deviation adenocarcinoma (MDA) of the uterine cervix is defined as an extremely well differentiated variant of cervical adenocarcinoma, with well-formed glands that resemble benign glands but show distinct nuclear anaplasia or evidence of stromal invasion. Thus, MDA is difficult to differentiate from other cervical hyperplastic lesions. Monoclonality is a major characteristic of most tumors, whereas normal tissue and reactive hyperplasia are polyclonal. Methods The clinicopathological features and clonality of MDA were investigated using laser microdissection and a clonality assay based on the polymorphism of the androgen receptor (AR) and X-chromosomal inactivation mosaicism in female somatic tissues. Results The results demonstrated that the glands were positive for CEA, Ki-67, and p53 and negative for estrogen receptor (ER), progesterone receptor (PR), and high-risk human papilloma virus (HPV) DNA. The index of proliferation for Ki-67 was more than 50%. However, the stromal cells were positive for ER, PR, vimentin, and SM-actin. The clonal assay showed that MDA was monoclonal. Thus, our findings indicate that MDA is a true neoplasm but is not associated with high-risk HPV. Conclusions Diagnosis of MDA depends mainly on its clinical manifestations, the pathological feature that MDA glands are located deeper than the lower level of normal endocervical glands, and immunostaining.
The minimally tuned minimal supersymmetric standard model
International Nuclear Information System (INIS)
Essig, Rouven; Fortin, Jean-Francois
2008-01-01
The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV
Minimalism as a civilization paradigm at the beginning of the 21st century
Directory of Open Access Journals (Sweden)
Vasilski Dragana
2008-01-01
Full Text Available The term minimalism is currently used descriptively to refer to a style marked by a certain asceticism in art, architecture and design. It began with American Minimal Art of the 1960s in the fields of painting and sculpture and has filtered into other sectors of society. It is now found in fashion, music, literature and interior decoration, as well as architecture. Minimalism has come to denote the use of simple geometric forms, pure and simple lines, the modular principle, and surfaces with a smooth industrial appearance that negate any character of handmade individuality. As far as architecture is concerned, minimalism is characterized by the emphasis on essential elements - like light and the way it falls on the volumes and masses that make up buildings and shape space and structure. Linear structures and essential geometric forms define identity, but despite the apparent simplicity of these works the effect they make is extremely complex.
The minimal non-minimal standard model
International Nuclear Information System (INIS)
Bij, J.J. van der
2006-01-01
In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed
International Nuclear Information System (INIS)
Bateman, Grant A.; Levi, Christopher R.; Wang, Yang; Lovett, Elizabeth C.; Schofield, Peter
2005-01-01
Variable results are obtained from the treatment of normal pressure hydrocephalus (NPH) by shunt insertion. There is a high correlation between NPH and the pathology of Alzheimer's disease (AD) on brain biopsy. There is an overlap between AD and vascular dementia (VaD), suggesting that a correlation exists between NPH and other forms of dementia. This study seeks to (1) understand the physiological factors behind, and (2) define the ability of, the aqueduct stroke volume to exclude dementia co-morbidity. Twenty-four patients from a dementia clinic were classified as having either early AD or VaD on the basis of clinical features, Hachinski score and neuropsychological testing. They were compared with 16 subjects with classical clinical findings of NPH and 12 aged-matched non-cognitively impaired subjects. MRI flow quantification was used to measure aqueduct stroke volume and arterial pulse volume. An arterio-cerebral compliance ratio was calculated from the two volumes in each patient. The aqueduct stroke volume was elevated in all three forms of dementia, with no significant difference noted between the groups. The arterial pulse volume was elevated by 24% in VaD and reduced by 35% in NPH, compared to normal (P=0.05 and P=0.002, respectively), and was normal in AD. There was a spectrum of relative compliance with normal compliance in VaD and reduced compliance in AD and NPH. The aqueduct stroke volume depends on the arterial pulse volume and the relative compliance between the arterial tree and brain. The aqueduct stroke volume cannot exclude significant co-morbidity in NPH. (orig.)
Menn, Lise; And Others
This study examined the role of empathy in the choice of syntactic form and the degree of independence of pragmatic and syntactic abilities in a range of aphasic patients. Study 1 involved 9 English-speaking and 9 Japanese-speaking aphasic subjects with 10 English-speaking and 4 Japanese normal controls. Study 2 involved 14 English- and 6…
Dănăilă, E.; Benea, L.
2017-06-01
The tribocorrosion behaviour of Ti-10Zr alloy and of a porous TiO2-ZrO2 thin film electrochemically formed on Ti-10Zr alloy was evaluated in Fusayama-Mayer artificial saliva solution. Tribocorrosion experiments were performed using a unidirectional pin-on-disc experimental set-up, mechanically and electrochemically instrumented, under various solicitation conditions. The effect of the applied normal force on the tribocorrosion performance of the tested materials was determined. Open circuit potential (OCP) measurements performed before, during and after the sliding tests were used to determine the tribocorrosion degradation. The applied normal force was found to greatly affect the potential during the tribocorrosion experiments: an increase in the normal force induced a decrease in potential, accelerating the depassivation of the materials studied. The results show a decrease in the friction coefficient with gradually increasing normal load. The porous TiO2-ZrO2 thin film electrochemically formed on the Ti-10Zr alloy led to an improvement of tribocorrosion resistance compared with the non-anodized Ti-10Zr alloy intended for biomedical applications.
Minimalism and the Pragmatic Frame
Directory of Open Access Journals (Sweden)
Ana Falcato
2016-02-01
Full Text Available In the debate between literalism and contextualism in semantics, Kent Bach’s project is often taken to stand on the latter side of the divide. In this paper I argue this is a misleading assumption and justify it by contrasting Bach’s assessment of the theoretical eliminability of minimal propositions arguably expressed by well-formed sentences with standard minimalist views, and by further contrasting his account of the division of interpretative processes ascribable to the semantics and pragmatics of a language with a parallel analysis carried out by the most radical opponent to semantic minimalism, i.e., by occasionalism. If my analysis proves right, the sum of its conclusions amounts to a refusal of Bach’s main dichotomies.
CREDIT RISK MINIMIZATION WAYS AND PRICING OF BANKING SERVICES
Directory of Open Access Journals (Sweden)
V. E. Gladkova
2011-01-01
Full Text Available Accurate accounting of banks' own expenses for rendering banking services, and the forming of reasonable prices for them, make it possible for commercial banks to react adequately to market situation changes. Credit risk minimization comprises: credit rationing (in Russia, according to RF Central Bank norms); credit diversification; credit structuring; and the forming of reserves to cover the respective bank risks (also in accordance with RF CB documents). Bank credit hedging (insurance) through credit derivatives is also effective. The most advanced risk minimization systems in international financial markets are Basel II and IRBA. Pricing models based on an individual assessment of each borrower's risk class (the Risk-Based Pricing approach) are widely used.
Computer program for the determination of minimal cardiac transit times
International Nuclear Information System (INIS)
Bosiljanoff, P.; Herzog, H.; Schmid, A.; Sommer, D.; Vyska, K.; Feinendegen, L.E.
1982-10-01
An Anger-type gamma camera is used to register the first pass of a radioactive tracer of blood flow through the heart. The acquired data are processed by a suitable computer program yielding time-activity curves for sequential heart segments, which are selected by the region-of-interest technique. The program prints the minimal cardiac transit times, in terms of total transit times as well as segmental transit times, for the right atrium, right ventricle, lung, left atrium and left ventricle. The measured values are normalized to a heart rate of 80/min and are compared with normal mean values. The deviation from the normal mean values is characterized by a coefficient F. Moreover, these findings are qualitatively rated. (orig./MG)
International Nuclear Information System (INIS)
Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.
2009-01-01
The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)
Directory of Open Access Journals (Sweden)
Knol Dirk L
2006-08-01
Full Text Available Changes in scores on health status questionnaires are difficult to interpret. Several methods to determine minimally important changes (MICs) have been proposed, which can broadly be divided into distribution-based and anchor-based methods. Comparisons of these methods have led to insight into essential differences between the approaches. Some authors have tried to arrive at a uniform measure for the MIC, such as 0.5 standard deviation or the value of one standard error of measurement (SEM). Others have emphasized the diversity of MIC values, depending on the type of anchor, the definition of minimal importance on the anchor, and characteristics of the disease under study. A closer look makes clear that some distribution-based methods have merely focused on minimally detectable changes. For assessing minimally important changes, anchor-based methods are preferred, as they include a definition of what is minimally important. Acknowledging the distinction between minimally detectable and minimally important changes is useful, not only to avoid confusion among MIC methods, but also to gain information on two important benchmarks on the scale of a health status measurement instrument. Appreciating the distinction, it becomes possible to judge whether the minimally detectable change of a measurement instrument is sufficiently small to detect minimally important changes.
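The two distribution-based benchmarks mentioned in the abstract (0.5 SD and one SEM) are straightforward to compute. A minimal sketch follows; the function name and the example scale values are hypothetical, and the SEM formula is the standard classical-test-theory expression SEM = SD * sqrt(1 - reliability):

```python
import math

def distribution_based_benchmarks(sd: float, reliability: float) -> dict:
    """Two common distribution-based change benchmarks for a health-status
    scale: half a standard deviation, and the standard error of measurement
    from classical test theory, SEM = SD * sqrt(1 - reliability)."""
    sem = sd * math.sqrt(1.0 - reliability)
    return {"half_sd": 0.5 * sd, "sem": sem}

# Hypothetical scale: score SD of 10 points, test-retest reliability 0.91.
b = distribution_based_benchmarks(sd=10.0, reliability=0.91)
print(b["half_sd"], round(b["sem"], 3))  # 5.0 and approximately 3.0
```

As the abstract stresses, such values describe what is minimally *detectable*, not what is minimally *important*; an anchor-based judgment is still needed for the latter.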
Minimally coupled N-particle scattering integral equations
International Nuclear Information System (INIS)
Kowalski, K.L.
1977-01-01
A concise formalism is developed which permits the efficient representation and generalization of several known techniques for deriving connected-kernel N-particle scattering integral equations. The methods of Kouri, Levin, and Tobocman and of Bencze and Redish, which lead to minimally coupled integral equations, are of special interest. The introduction of channel coupling arrays is characterized in a general manner and the common base of this technique and that of the so-called channel coupling scheme is clarified. It is found that in the Bencze-Redish formalism a particular coupling array has a crucial function, but one different from that of the arrays employed by Kouri, Levin, and Tobocman. The apparent dependence of the proof of the minimality of the Bencze-Redish integral equations upon the form of the inhomogeneous term in these equations is eliminated. This is achieved by an investigation of the full (nonminimal) Bencze-Redish kernel. It is shown that the second power of this operator is connected, a result which is needed for the full applicability of the Bencze-Redish formalism. This is used to establish the relationship between the existence of solutions to the homogeneous form of the minimal equations and eigenvalues of the full Bencze-Redish kernel.
Normalized modes at selected points without normalization
Kausel, Eduardo
2018-04-01
As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus has been overlooked up until now, and it has interesting theoretical implications.
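The explicit normalization described in the abstract (divide each mode by the square root of its modal mass) can be sketched numerically. The 2-DOF K and M matrices below are hypothetical illustration values, and the generalized problem is reduced to a standard symmetric one via a Cholesky factorization rather than by any method from the paper itself:

```python
import numpy as np

# Hypothetical 2-DOF system: K, M real, symmetric, positive definite.
K = np.array([[6.0, -2.0], [-2.0, 4.0]])   # stiffness matrix
M = np.array([[2.0, 0.0], [0.0, 1.0]])     # mass matrix

# Reduce K phi = lambda M phi to a standard symmetric eigenproblem
# using the Cholesky factorization M = L L^T.
L = np.linalg.cholesky(M)
Linv = np.linalg.inv(L)
lam, Y = np.linalg.eigh(Linv @ K @ Linv.T)
Phi = Linv.T @ Y                    # modes of the pair (K, M)
Phi = Phi * np.array([3.0, -0.5])   # deliberately "unscale" them

# Mass-normalize: divide each mode by the square root of its modal mass,
# so that mu = phi^T M phi = 1 for every mode.
for j in range(Phi.shape[1]):
    mu = Phi[:, j] @ M @ Phi[:, j]
    Phi[:, j] /= np.sqrt(mu)

print(np.round(Phi.T @ M @ Phi, 10))  # identity matrix: all modal masses are 1
```

The point of the paper is that these unit-modal-mass components can in fact be recovered directly from residues of (K - λM)⁻¹, without ever forming the full modes as done above.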
Algebra and Arithmetic of Modular Forms
DEFF Research Database (Denmark)
Rustom, Nadim
In [Rus14b] and [Rus14a], we study graded rings of modular forms over congruence subgroups, with coefficients in subrings A of C, and determine bounds of the weights of modular forms constituting a minimal set of generators, as well as on the degree of the generators of the ideal of relations...... between them. We give an algorithm that computes the structures of these rings, and formulate conjectures on the minimal generating weight for modular forms with coefficients in Z. We discuss questions of finiteness of systems of Hecke eigenvalues modulo pm, for a prime p and an integer m ≥ 2, in analogy...
Dierkes, Ulrich; Sauvigny, Friedrich; Jakob, Ruben; Kuster, Albrecht
2010-01-01
Minimal Surfaces is the first volume of a three volume treatise on minimal surfaces (Grundlehren Nr. 339-341). Each volume can be read and studied independently of the others. The central theme is boundary value problems for minimal surfaces. The treatise is a substantially revised and extended version of the monograph Minimal Surfaces I, II (Grundlehren Nr. 295 & 296). The first volume begins with an exposition of basic ideas of the theory of surfaces in three-dimensional Euclidean space, followed by an introduction of minimal surfaces as stationary points of area, or equivalently
Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A; Kagemann, Larry; Schuman, Joel S
2017-02-01
To assess the effect of the previously reported optical coherence tomography (OCT) signal normalization method on reducing the discrepancies in image appearance among spectral-domain OCT (SD-OCT) devices. Healthy eyes and eyes with various retinal pathologies were scanned at the macular region using similar volumetric scan patterns with at least two of three SD-OCT devices at the same visit (Cirrus HD-OCT, Zeiss, Dublin, CA; RTVue, Optovue, Fremont, CA; and Spectralis, Heidelberg Engineering, Heidelberg, Germany). All the images were processed with the signal normalization. A set of images formed a questionnaire with 24 pairs of cross-sectional images from each eye with any combination of the three SD-OCT devices, either both pre- or both post-signal normalization. Observers were asked to evaluate the similarity of the two displayed images based on the image appearance. The effects on reducing the differences in image appearance before and after processing were analyzed. Twenty-nine researchers familiar with OCT images participated in the survey. Image similarity was significantly improved after signal normalization for all three combinations (P ≤ 0.009), with the Cirrus and RTVue combination becoming the most similar pair, followed by Cirrus and Spectralis, and RTVue and Spectralis. The signal normalization successfully minimized the disparities in image appearance among multiple SD-OCT devices, allowing clinical interpretation and comparison of OCT images regardless of device differences. The signal normalization would enable direct comparison of OCT images without concern about device differences and broaden OCT usage by enabling long-term follow-up and data sharing.
Theories of minimalism in architecture: When prologue becomes palimpsest
Directory of Open Access Journals (Sweden)
Stevanović Vladimir
2014-01-01
Full Text Available This paper examines the modus and conditions of constituting and establishing the architectural discourse on minimalism. One of the key topics in this discourse is the historical line of development and the analysis of theoretical influences, which comprises connections of recent minimalism with theorizations of various minimal forms and concepts, architectural and artistic, from the past. The paper particularly discusses those theoretical relations which, in a unitary way, link minimalism in architecture with its artistic nominal counterpart, minimal art. These are relations founded on interpretative models of self-referentiality, phenomenological experience and contextualism, which are, superficially observed, common to both the artistic and the architectural minimalist discourses. In this constellation, certain relations on the historical line of minimalism in architecture appear questionable, while others are overlooked. Specifically, postmodern fundamentalism is the architectural direction: (1) in which these three interpretations also existed; (2) from which architectural theorists retroactively appropriated many architects, proclaiming them minimalists; (3) which established the same relations with modern and postmodern theoretical and socio-historical contexts that minimalism would later establish. In spite of this, the theoretical field of postmodern fundamentalism is surprisingly neglected in the discourse of minimalism in architecture. Instead of being understood as a kind of prologue to minimalism in architecture, postmodern fundamentalism becomes an erased palimpsest over which a different history of minimalism is rewritten, a history in which minimal art occupies the central place.
Marrow transfusions into normal recipients
International Nuclear Information System (INIS)
Brecher, G.
1983-01-01
During the past several years we have explored the transfusion of bone marrow into normal nonirradiated mice. While transfused marrow proliferates readily in irradiated animals, only minimal proliferation takes place in nonirradiated recipients. It has generally been assumed that this was due to the lack of available proliferative sites in recipients with normal marrow. Last year we were able to report that the transfusion of 200 million bone marrow cells (about 2/3 of the total complement of marrow cells of a normal mouse) resulted in 20% to 25% of the recipient's marrow being replaced by donor marrow. Thus we can now study the behavior of animals that carry both transfused (donor) and endogenous (recipient) marrow cells, although none of the tissues of either donor or recipient have been irradiated. With these animals we hope to investigate the nature of the peculiar phenomenon of serial exhaustion of marrow, also referred to as the limited self-replicability of stem cells.
Microbiological and radiobiological studies on the hygienic quality of minimally processed food
Energy Technology Data Exchange (ETDEWEB)
Abu El-Nour, S. A. M. [National Center for Radiation Research and Technology, Atomic Energy Authority, Cairo (Egypt)
2007-07-01
In the past, there have been three traditional forms of food trading: fresh, canned and frozen foods. In recent years, a fourth form, called "minimally processed food", has been developed to respond to an emerging consumer demand for convenient, high-quality and preservative-free products with the appearance of fresh characteristics, while being less severely processed (Saracino et al., 1991). Minimally processed food can be used as ready-to-eat, ready-to-use, or ready-to-cook products. They are stored and marketed under refrigeration conditions (Dignan, 1994). Minimally processed food products were developed in the 1980s and are now produced in many advanced and some developing countries. In Egypt, great amounts of minimally processed vegetables are now produced and commercially sold in certain supermarkets. They include fresh-cut lettuce, packaged mixed vegetable salad, shredded carrots, sliced carrots, shredded cabbage (white and red), fresh-cut green beans, mixed peas with diced carrots, mafa spanish, okra, watermelon, pumpkin, garlic, artichoke, celery, parsley, etc. However, there is an increasing interest in offering some other minimally processed vegetables and some types of fresh-cut fruits that can be used as ready-to-eat or ready-to-use products. The preparation steps of minimally processed fruit and vegetable products, which may include peeling, slicing, shredding, etc., save labor and time for the purchasers, while the removal of waste material during processing reduces transport costs. In addition, the production of such products will make year-round availability of almost all vegetables and fruits possible in fresh form around the world (Baldwin et al., 1995). However, these preparation steps increase the native enzymatic activity and the possibility of microbial contamination. Therefore, these products have a short shelf-life, which is considered one of the foremost challenging problems in the commercialization of minimally processed foods particularly
Normalization of the psychometric hepatic encephalopathy score for ...
African Journals Online (AJOL)
Aim: To construct normal values for the tests of the psychometric hepatic encephalopathy score (PHES) and evaluate the prevalence of minimal hepatic encephalopathy (MHE) among Turkish patients with liver cirrhosis. Materials and Methods: One hundred and eighty-five healthy subjects and sixty patients with liver ...
Minimal Left-Right Symmetric Dark Matter.
Heeck, Julian; Patra, Sudhanwa
2015-09-18
We show that left-right symmetric models can easily accommodate stable TeV-scale dark matter particles without the need for an ad hoc stabilizing symmetry. The stability of a newly introduced multiplet either arises accidentally as in the minimal dark matter framework or comes courtesy of the remaining unbroken Z_{2} subgroup of B-L. Only one new parameter is introduced: the mass of the new multiplet. As minimal examples, we study left-right fermion triplets and quintuplets and show that they can form viable two-component dark matter. This approach is, in particular, valid for SU(2)×SU(2)×U(1) models that explain the recent diboson excess at ATLAS in terms of a new charged gauge boson of mass 2 TeV.
Perturbed Yukawa textures in the minimal seesaw model
Energy Technology Data Exchange (ETDEWEB)
Rink, Thomas; Schmitz, Kai [Max Planck Institute for Nuclear Physics (MPIK),69117 Heidelberg (Germany)
2017-03-29
We revisit the minimal seesaw model, i.e., the type-I seesaw mechanism involving only two right-handed neutrinos. This model represents an important minimal benchmark scenario for future experimental updates on neutrino oscillations. It features four real parameters that cannot be fixed by the current data: two CP-violating phases, δ and σ, as well as one complex parameter, z, that is experimentally inaccessible at low energies. The parameter z controls the structure of the neutrino Yukawa matrix at high energies, which is why it may be regarded as a label or index for all UV completions of the minimal seesaw model. The fact that z encompasses only two real degrees of freedom allows us to systematically scan the minimal seesaw model over all of its possible UV completions. In doing so, we address the following question: suppose δ and σ should be measured at particular values in the future — to what extent is one then still able to realize approximate textures in the neutrino Yukawa matrix? Our analysis, thus, generalizes previous studies of the minimal seesaw model based on the assumption of exact texture zeros. In particular, our study allows us to assess the theoretical uncertainty inherent to the common texture ansatz. One of our main results is that a normal light-neutrino mass hierarchy is, in fact, still consistent with a two-zero Yukawa texture, provided that the two texture zeros receive corrections at the level of O(10 %). While our numerical results pertain to the minimal seesaw model only, our general procedure appears to be applicable to other neutrino mass models as well.
Constituent period in theoretization of minimalism in architecture
Directory of Open Access Journals (Sweden)
Stevanović Vladimir
2012-01-01
Full Text Available The paper analyzes the architectural discourse formed around the term minimalism between 1976 and 1999, a period that I consider constitutive for the theorization of the term. The presentation is guided by two hypotheses: (I) minimalism in architecture does not have a continuous stream of origin and development, and is not a style, direction, movement, school, genre or trend in the sense in which these are defined in disciplines such as art history, aesthetics and art theory; (II) the fact that architects rarely declare themselves minimalists suggests that minimalism in architecture is actually a product, or construct, of an architectural discourse that emerged from the need to consolidate an existing, obvious and widespread formal idiom in architecture, partly during and after postmodernism. It is indicative that the writing of the history of minimalism in architecture, in its most intensive period, the nineties, took place mainly in three cities: London, Barcelona and Milan. In this sense, we can examine how each of these centers emphasized its own role, through the ambition of minimalism in architecture to appear as an authentic local creation.
Tensioned Fabric Structures with Surface in the Form of Chen-Gackstatter
Directory of Open Access Journals (Sweden)
Yee Hooi Min
2016-01-01
Full Text Available Form-finding has to be carried out for a tensioned fabric structure in order to determine the initial equilibrium shape under prescribed support conditions and prestress pattern. Tensioned fabric structures are normally designed to be in the form of an equally tensioned surface. The tensioned fabric structure is highly suited for realizing surfaces of complex or new forms. However, research on new forms for tensioned fabric structures has not attracted much attention, so a further minimal surface that could be adopted as the form of a tensioned fabric structure is a valuable source of inspiration. The aim of this study is to propose an initial equilibrium shape of tensioned fabric structures in the form of the Chen-Gackstatter surface. Computational form-finding using a nonlinear analysis method is used to determine the Chen-Gackstatter form of uniformly stressed surfaces. A tensioned fabric structure must curve equally in opposite directions to give the resulting surface three-dimensional stability. In an anticlastic doubly curved surface, the sum of all positive and negative curvatures is zero. This study provides an alternative choice for structural designers considering the Chen-Gackstatter form for tensioned fabric structures. The results on factors affecting the initial equilibrium shape can serve as a reference for the proper selection of surface parameters for achieving a structurally viable surface.
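The statement above, that an equally tensioned surface must curve equally in opposite directions so that the curvatures cancel, is the classical zero-mean-curvature condition defining a minimal surface. In terms of the principal curvatures $\kappa_1$, $\kappa_2$ (notation assumed, not from the abstract):

```latex
H \;=\; \frac{\kappa_1 + \kappa_2}{2} \;=\; 0
\qquad\Longleftrightarrow\qquad
\kappa_1 \;=\; -\kappa_2
```

This is also the equilibrium condition of a uniformly prestressed membrane with no pressure difference across it, which is why soap-film-like minimal surfaces such as Chen-Gackstatter are natural candidate forms for equally tensioned fabric structures.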
Minimally invasive surgical treatment of valvular heart disease.
Goldstone, Andrew B; Joseph Woo, Y
2014-01-01
Cardiac surgery is in the midst of a practice revolution. Traditionally, surgery for valvular heart disease consisted of valve replacement via conventional sternotomy using cardiopulmonary bypass. However, over the past 20 years, the increasing popularity of less-invasive procedures, accompanied by advancements in imaging, surgical instrumentation, and robotic technology, has motivated and enabled surgeons to develop and perform complex cardiac surgical procedures through small incisions, often eliminating the need for sternotomy or cardiopulmonary bypass. In addition to the benefits of improved cosmesis, minimally invasive mitral valve surgery was pioneered with the intent of reducing morbidity, postoperative pain, blood loss, hospital length of stay, and time to return to normal activity. This article reviews the current state-of-the-art of minimally invasive approaches to the surgical treatment of valvular heart disease. Copyright © 2014 Elsevier Inc. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Bobodzhanov, A A; Safonov, V F [National Research University " Moscow Power Engineering Institute" , Moscow (Russian Federation)
2013-07-31
The paper deals with extending the Lomov regularization method to classes of singularly perturbed Fredholm-type integro-differential systems which have not so far been studied, and in which the limiting operator is discretely noninvertible. Such systems are commonly known as problems with unstable spectrum. Separating out the essential singularities in the solutions to these problems presents great difficulties, the principal one being to give an adequate description of the singularities induced by 'instability points' of the spectrum. A methodology for separating singularities by using normal forms is developed, applied to systems of the above type, and substantiated for them. Bibliography: 10 titles.
Deformed statistics Kullback–Leibler divergence minimization within a scaled Bregman framework
International Nuclear Information System (INIS)
Venkatesan, R.C.; Plastino, A.
2011-01-01
The generalized Kullback–Leibler divergence (K–Ld) in Tsallis statistics [constrained by the additive duality of generalized statistics (dual generalized K–Ld)] is here reconciled with the theory of Bregman divergences for expectations defined by normal averages, within a measure-theoretic framework. Specifically, it is demonstrated that the dual generalized K–Ld is a scaled Bregman divergence. The Pythagorean theorem is derived from the minimum discrimination information principle using the dual generalized K–Ld as the measure of uncertainty, with constraints defined by normal averages. The minimization of the dual generalized K–Ld, with normal averages constraints, is shown to exhibit distinctly unique features. -- Highlights: ► Dual generalized Kullback–Leibler divergence (K–Ld) proven to be scaled Bregman divergence in continuous measure-theoretic framework. ► Minimum dual generalized K–Ld condition established with normal averages constraints. ► Pythagorean theorem derived.
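For context, the (unscaled) Bregman divergence generated by a convex function $F$ is the standard object against which the abstract's result is stated; the ordinary Kullback–Leibler divergence is the special case $F(p)=\sum_i p_i\ln p_i$ (notation assumed, not taken from the paper):

```latex
D_F(p \,\|\, q) \;=\; F(p) \,-\, F(q) \,-\, \big\langle \nabla F(q),\, p - q \big\rangle ,
\qquad
F(p) = \sum_i p_i \ln p_i
\;\;\Longrightarrow\;\;
D_F(p \,\|\, q) = \sum_i p_i \ln\frac{p_i}{q_i}
```

(for normalized distributions, $\sum_i p_i=\sum_i q_i=1$). The paper's claim is that the dual generalized Tsallis K–Ld fits this template after an appropriate scaling.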
Glucokinase MODY and implications for treatment goals of common forms of diabetes.
Ajjan, Ramzi A; Owen, Katharine R
2014-12-01
Treatment goals in diabetes concentrate on reducing the risk of vascular complications, largely through setting targets for glycated haemoglobin (HbA1c). These targets are based on epidemiological studies of complication development, but so far have not adequately addressed the adverse effects associated with lowering HbA1c towards the normal range. Glucokinase (GCK) mutations cause a monogenic form of hyperglycaemia (GCK-MODY) characterised by fasting hyperglycaemia with low postprandial glucose excursions and a marginally elevated HbA1c. Minimal levels of vascular complications (comparable with nondiabetic individuals) are observed in GCK-MODY, leading to the hypothesis that GCK-MODY may represent a useful paradigm for assessing treatment goals in all forms of diabetes. In this review, we discuss the evidence behind this concept, suggest ways of translating this hypothesis into clinical practice and address some of the caveats of such an approach.
Strike type variation among Tarahumara Indians in minimal sandals versus conventional running shoes
Directory of Open Access Journals (Sweden)
Daniel E. Lieberman
2014-06-01
Conclusion: These data reinforce earlier studies showing variation in foot strike patterns among minimally shod runners, but also support the hypothesis that foot stiffness and important aspects of running form, including foot strike, differ between runners who grow up using minimal versus modern, conventional footwear.
Chern-Simons forms in gravitation theories
International Nuclear Information System (INIS)
Zanelli, Jorge
2012-01-01
The Chern-Simons (CS) form evolved from an obstruction in mathematics into an important object in theoretical physics. In fact, the presence of CS terms in physics is more common than one may think: they seem to play an important role in high-Tc superconductivity and in recently discovered topological insulators. In classical physics, the minimal coupling in electromagnetism and the action for a mechanical system in Hamiltonian form are examples of CS functionals. CS forms are also the natural generalization of the minimal coupling between the electromagnetic field and a point charge when the source is not pointlike but an extended fundamental object, a membrane. They are found in relation with anomalies in quantum field theories, and as Lagrangians for gauge fields, including gravity and supergravity. A cursory review of the role of CS forms in gravitation theories is presented at an introductory level. (topical review)
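The basic non-abelian example behind the review is the Chern-Simons three-form of a gauge connection $A$, whose exterior derivative gives the topological density $\mathrm{Tr}(F\wedge F)$:

```latex
\omega_3(A) \;=\; \mathrm{Tr}\!\left( A \wedge dA \;+\; \tfrac{2}{3}\, A \wedge A \wedge A \right),
\qquad
d\,\omega_3 \;=\; \mathrm{Tr}\,(F \wedge F),
\qquad
F \;=\; dA + A \wedge A .
```

In the abelian point-particle case this collapses to the one-dimensional CS form $q\int A_\mu \, dx^\mu$, which is precisely the minimal coupling mentioned in the abstract.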
Minimalism in Art, Medical Science and Neurosurgery.
Okten, Ali Ihsan
2018-01-01
The word "minimalism" is derived from the French word "minimum". Whereas the lexical meaning of minimum is "the least or the smallest quantity necessary for something", in mathematics it can be described as "the lowest step to which a variable number can descend; the least; the minimal". Minimalism, which advocates an extreme simplicity of artistic form, is a current in modern art and music whose origins go back to the 1960s and which features simplicity and objectivity. Although art, science and philosophy are different disciplines, they support each other from time to time; sometimes they intertwine and sometimes they copy each other. A school or teaching arising in one of them can be taken up by the others, so that they proceed on their ways empowering each other. This is also true of minimalism in art and minimally invasive surgical approaches in science. Concepts such as doing with less, avoiding unnecessary materials and reducing the number of elements in order to increase the effect of the expression, which are the main elements of minimalism in art, found their equivalents in medicine and neurosurgery. Those equivalents have been to protect the physical integrity of the patient with less iatrogenic injury, minimal damage and the same therapeutic effect in the most effective way, and to enable the patient to regain his health in the shortest span of time. As an anticipation, we can consider that the minimal approaches started in the 1960s by Richard Wollheim and Barbara Rose in art, and by Lars Leksell, Gazi Yaşargil and other neurosurgeons in neurosurgery, are the present-day equivalents of the minimalist approaches perhaps unconsciously started by Kazimir Malevich in art and Victor Darwin L'Espinasse in neurosurgery in the early 1900s. We can also consider that they have developed by interacting with each other, not by chance.
Energy Technology Data Exchange (ETDEWEB)
Martinez Carrillo, Irma
2008-01-15
Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms have stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered, and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems.
Manufacturing technology for practical Josephson voltage normals
International Nuclear Information System (INIS)
Kohlmann, Johannes; Kieler, Oliver
2016-01-01
In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage normals. First we summarize some foundations of Josephson voltage normals and sketch the concept and the setup of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage normals.
Inference with minimal Gibbs free energy in information field theory
International Nuclear Information System (INIS)
Ensslin, Torsten A.; Weig, Cornelius
2010-01-01
Non-linear and non-Gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the Gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from Poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a Gaussian signal with unknown spectrum, and (iii) inference of a Poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how Gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-Gaussian posterior.
Consensus guidelines on plasma cell myeloma minimal residual disease analysis and reporting.
Arroz, Maria; Came, Neil; Lin, Pei; Chen, Weina; Yuan, Constance; Lagoo, Anand; Monreal, Mariela; de Tute, Ruth; Vergilio, Jo-Anne; Rawstron, Andy C; Paiva, Bruno
2016-01-01
Major heterogeneity between laboratories in flow cytometry (FC) minimal residual disease (MRD) testing in multiple myeloma (MM) must be overcome. Cytometry societies such as the International Clinical Cytometry Society and the European Society for Clinical Cell Analysis recognize a strong need to establish minimally acceptable requirements and recommendations to perform such complex testing. A group of 11 flow cytometrists currently performing FC testing in MM using different instrumentation, panel designs (≥ 6-color) and analysis software compared the procedures between their respective laboratories and reviewed the literature to propose a consensus guideline on flow-MRD analysis and reporting in MM. Consensus guidelines support i) the use of a minimum of five initial gating parameters (CD38, CD138, CD45, forward, and sideward light scatter) within the same aliquot for accurate identification of the total plasma cell compartment; ii) the analysis of potentially aberrant phenotypic markers, reporting the antigen expression pattern on neoplastic plasma cells as being reduced, normal or increased, when compared to a normal reference plasma cell immunophenotype (obtained using the same instrument and parameters); and iii) reporting the percentage of total bone marrow plasma cells plus the percentages of both normal and neoplastic plasma cells within the total bone marrow plasma cell compartment, and over total bone marrow cells. Consensus guidelines recommend that current and future MRD analyses should target a lower limit of detection of 0.001%, and ideally a limit of quantification of 0.001%, which requires at least 3 × 10(6) and 5 × 10(6) bone marrow cells to be measured, respectively. © 2015 International Clinical Cytometry Society.
Radiocardiography of minimal transit times: a useful diagnostic procedure
International Nuclear Information System (INIS)
Schicha, H.; Vyska, K.; Becker, V.; Feinendegen, L.E. (Duesseldorf Univ., F.R. Germany)
1975-01-01
Contrary to mean transit times, minimal transit times are the differences between arrival times of an indicator. Arrival times in various cardiac compartments can be easily measured with radioisotopes and fast gamma cameras permitting data processing. This paper summarizes data selected from more than 1500 measurements made so far on normal individuals and patients with valvular heart disease, myocardial insufficiency, digitalis effect, atrial fibrillation, hypothyroidism, hyperthyroidism, effort-syndrome and coronary artery disease. (author)
Visual attention and flexible normalization pools
Schwartz, Odelia; Coen-Cagli, Ruben
2013-01-01
Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
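The flexible-pool computation described in this abstract can be sketched numerically. The function below is a generic divisive-normalization toy, not the authors' published model: the names `drives`, `attn_gain`, and the pool mask `in_pool` are assumptions for illustration, with attention applied as a simple multiplicative gain before normalization, as the abstract suggests.

```python
import numpy as np

def divisive_normalization(drives, attn_gain, in_pool, sigma=1.0):
    """Divisively normalize unit responses by a (possibly flexible) pool.

    drives    : stimulus drives of the units (1-D array; index 0 = center)
    attn_gain : multiplicative attentional gain per unit (1.0 = unattended)
    in_pool   : boolean mask; True if a unit belongs to the normalization pool
    """
    excited = drives * attn_gain               # attention accentuates activations
    pool = np.sum(excited[in_pool] ** 2)       # energy of the normalization pool
    return excited ** 2 / (sigma ** 2 + pool)  # divisive normalization
```

Excluding the surround from the pool (units deemed statistically independent of the center) raises the center response, which is the qualitative behavior the flexible model exploits.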
Minimal pneumothorax with dynamic changes in ST segment similar to myocardial infarction.
Yeom, Seok-Ran; Park, Sung-Wook; Kim, Young-Dae; Ahn, Byung-Jae; Ahn, Jin-Hee; Wang, Il-Jae
2017-08-01
Pneumothorax can cause a variety of electrocardiographic changes. ST segment elevation, which is mainly observed in myocardial infarction, can also be induced by pneumothorax. The mechanism is presumed to be a decrease in cardiac output, due to increased intra-thoracic pressure. We encountered a patient with ST segment elevation with minimal pneumothorax. Coronary angiography with ergonovine provocation test and echocardiogram had normal findings. The ST segment elevation was normalized by decreasing the amount of pneumothorax. We reviewed the literature and present possible mechanisms for this condition. Copyright © 2017 Elsevier Inc. All rights reserved.
Normalization in Lie algebras via mould calculus and applications
Paul, Thierry; Sauzin, David
2017-11-01
We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.
Strong normalization by type-directed partial evaluation and run-time code generation
DEFF Research Database (Denmark)
Balat, Vincent; Danvy, Olivier
1998-01-01
We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....
Strong Normalization by Type-Directed Partial Evaluation and Run-Time Code Generation
DEFF Research Database (Denmark)
Balat, Vincent; Danvy, Olivier
1997-01-01
We investigate the synergy between type-directed partial evaluation and run-time code generation for the Caml dialect of ML. Type-directed partial evaluation maps simply typed, closed Caml values to a representation of their long βη-normal form. Caml uses a virtual machine and has the capability...... to load byte code at run time. Representing the long βη-normal forms as byte code gives us the ability to strongly normalize higher-order values (i.e., weak head normal forms in ML), to compile the resulting strong normal forms into byte code, and to load this byte code all in one go, at run time. We...... conclude this note with a preview of our current work on scaling up strong normalization by run-time code generation to the Caml module language....
A convergent overlapping domain decomposition method for total variation minimization
Fornasier, Massimo; Langer, Andreas; Schönlieb, Carola-Bibiane
2010-01-01
In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation
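For orientation, the kind of functional being decomposed over subdomains can be written down directly: a data-discrepancy term plus a total variation term. The sketch below uses a simple anisotropic discrete TV in numpy; the function names and the specific objective form are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def total_variation(u):
    """Anisotropic discrete total variation: sum of absolute finite differences."""
    return np.abs(np.diff(u, axis=0)).sum() + np.abs(np.diff(u, axis=1)).sum()

def tv_objective(u, g, lam):
    """Discrepancy-plus-TV functional: J(u) = 0.5 * ||u - g||^2 + lam * TV(u)."""
    return 0.5 * np.sum((u - g) ** 2) + lam * total_variation(u)
```

Domain decomposition methods minimize such a J(u) by alternating (sequentially or in parallel) over overlapping subdomains of the image.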
A unified approach to the minimal unitary realizations of noncompact groups and supergroups
International Nuclear Information System (INIS)
Guenaydin, Murat; Pavlyk, Oleksandr
2006-01-01
We study the minimal unitary representations of non-compact groups and supergroups obtained by quantization of their geometric realizations as quasi-conformal groups and supergroups. The quasi-conformal groups G leave generalized light-cones defined by a quartic norm invariant and have maximal rank subgroups of the form H x SL(2, R) such that G/H x SL(2, R) are para-quaternionic symmetric spaces. We give a unified formulation of the minimal unitary representations of simple non-compact groups of type A2, G2, D4, F4, E6, E7, E8 and Sp(2n, R). The minimal unitary representations of Sp(2n, R) are simply the singleton representations and correspond to a degenerate limit of the unified construction. The minimal unitary representations of the other noncompact groups SU(m, n), SO(m, n), SO*(2n) and SL(m, R) are also given explicitly. We extend our formalism to define and construct the corresponding minimal representations of non-compact supergroups G whose even subgroups are of the form H x SL(2, R). If H is noncompact then the supergroup G does not admit any unitary representations, in general. The unified construction with H simple or Abelian leads to the minimal representations of G(3), F(4) and OSp(n|2, R) (in the degenerate limit). The minimal unitary representations of OSp(n|2, R) with even subgroups SO(n) x SL(2, R) are the singleton representations. We also give the minimal realization of the one-parameter family of Lie superalgebras D(2, 1; σ).
Toda theories, W-algebras, and minimal models
International Nuclear Information System (INIS)
Mansfield, P.; Spence, B.
1991-01-01
We discuss the classical W-algebra symmetries of Toda field theories in terms of the pseudo-differential Lax operator associated with the Toda Lax pair. We then show how the W-algebra transformations can be understood as the non-abelian gauge transformations which preserve the form of the Lax pair. This provides a new understanding of the W-algebras, and we discuss their closure and co-cycle structure using this approach. The quantum Lax operator is investigated, and we show that this operator, which generates the quantum W-algebra currents, is conserved in the conformally extended Toda theories. The W-algebra minimal model primary fields are shown to arise naturally in these theories, leading to the conjecture that the conformally extended Toda theories provide a lagrangian formulation of the W-algebra minimal models. (orig.)
Minimal Self-Models and the Free Energy Principle
Directory of Open Access Journals (Sweden)
Jakub Limanowski
2013-09-01
The term "minimal phenomenal selfhood" describes the basic, pre-reflective experience of being a self (Blanke & Metzinger, 2009). Theoretical accounts of the minimal self have long recognized the importance and the ambivalence of the body as both part of the physical world, and the enabling condition for being in this world (Gallagher, 2005; Grafton, 2009). A recent account of minimal phenomenal selfhood (MPS; Metzinger, 2004a) centers on the consideration that minimal selfhood emerges as the result of basic self-modeling mechanisms, thereby being founded on pre-reflective bodily processes. The free energy principle (FEP; Friston, 2010) is a novel unified theory of cortical function that builds upon the imperative that self-organizing systems entail hierarchical generative models of the causes of their sensory input, which are optimized by minimizing free energy as an approximation of the log-likelihood of the model. The implementation of the FEP via predictive coding mechanisms and in particular the active inference principle emphasizes the role of embodiment for predictive self-modeling, which has been appreciated in recent publications. In this review, we provide an overview of these conceptions and illustrate thereby the potential power of the FEP in explaining the mechanisms underlying minimal selfhood and its key constituents, multisensory integration, interoception, agency, perspective, and the experience of mineness. We conclude that the conceptualization of MPS can be well mapped onto a hierarchical generative model furnished by the free energy principle and may constitute the basis for higher-level, cognitive forms of self-referral, as well as the understanding of other minds.
Identifying Minimal Changes in Nonerosive Reflux Disease: Is the Pay Worth the Labor?
Gabbard, Scott L; Fass, Ronnie; Maradey-Romero, Carla; Gingold Belfer, Rachel; Dickman, Ram
2016-01-01
Gastroesophageal reflux disease has a variable presentation on upper endoscopy. Gastroesophageal reflux disease can be divided into 3 endoscopic categories: Barrett's esophagus, erosive esophagitis, and normal mucosa/nonerosive reflux disease (NERD). Each of these phenotypes behave in a distinct manner, in regards to symptom response to treatment, and risk of development of complications such as esophageal adenocarcinoma. Recently, it has been proposed to further differentiate NERD into 2 categories: those with and those without "minimal changes." These minimal changes include endoscopic abnormalities, such as villous mucosal surface, mucosal islands, microerosions, and increased vascularity at the squamocolumnar junction. Although some studies have shown that patients with minimal changes may have higher rates of esophageal acid exposure compared with those without minimal changes, it is currently unclear if these patients behave differently than those currently categorized as having NERD. The clinical utility of identifying these lesions should be weighed against the cost of the requisite equipment and the additional time required for diagnosis, compared with conventional white light endoscopy.
Operating envelope to minimize probability of fractures in Zircaloy-2 pressure tubes
International Nuclear Information System (INIS)
Azer, N.; Wong, H.
1994-01-01
The failure mode of primary concern with CANDU pressure tubes is fast fracture of a through-wall axial crack, resulting from delayed hydride crack growth. The application of operating envelopes is demonstrated to minimize the probability of fracture in Zircaloy-2 pressure tubes based on Zr-2.5%Nb pressure tube experience. The technical basis for the development of the operating envelopes is also summarized. The operating envelope represents an area on the pressure versus temperature diagram within which the reactor may be operated without undue concern for pressure tube fracture. The envelopes presented address both normal operating conditions and the condition where a pressure tube leak has been detected. The examples in this paper are prepared to illustrate the methodology, and are not intended to be directly applicable to the operation of any specific reactor. The application of operating envelopes to minimize the probability of fracture in 80 mm diameter Zircaloy-2 pressure tubes has been discussed. Both normal operating and leaking pressure tube conditions have been considered. 3 refs., 4 figs
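Operationally, an envelope of this kind reduces to a point-in-polygon test on the pressure-temperature plane. The sketch below is a hypothetical illustration only: the envelope vertices are invented, and real envelopes come from the fracture analysis the paper summarizes.

```python
def inside_envelope(T, P, envelope):
    """Ray-casting test: is the operating point (T, P) inside the closed
    pressure-temperature envelope given as a list of (T, P) vertices?"""
    inside = False
    n = len(envelope)
    for i in range(n):
        t1, p1 = envelope[i]
        t2, p2 = envelope[(i + 1) % n]
        # does the horizontal ray from (T, P) cross this edge?
        if (p1 > P) != (p2 > P):
            t_cross = t1 + (P - p1) * (t2 - t1) / (p2 - p1)
            if t_cross > T:
                inside = not inside
    return inside

# Hypothetical rectangular envelope: T in degrees C, P in MPa (invented values).
envelope = [(150.0, 4.0), (300.0, 4.0), (300.0, 11.0), (150.0, 11.0)]
```

An operating procedure would evaluate `inside_envelope(T, P, envelope)` before any pressure or temperature excursion, with a separate, more restrictive envelope when a leak has been detected.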
Image registration using stationary velocity fields parameterized by norm-minimizing Wendland kernel
DEFF Research Database (Denmark)
Pai, Akshay Sadananda Uppinakudru; Sommer, Stefan Horst; Sørensen, Lauge
by the regularization term. In a variational formulation, this term is traditionally expressed as a squared norm which is a scalar inner product of the interpolating kernels parameterizing the velocity fields. The minimization of this term using the standard spline interpolation kernels (linear or cubic) is only...... approximative because of the lack of a compatible norm. In this paper, we propose to replace such interpolants with a norm-minimizing interpolant - the Wendland kernel, which has the same computational simplicity as B-Splines. An application on the Alzheimer's Disease Neuroimaging Initiative showed...... that Wendland SVF based measures separate (Alzheimer's disease v/s normal controls) better than both B-Spline SVFs (p
Minimally invasive single-site surgery for the digestive system: A technological review
Directory of Open Access Journals (Sweden)
Dhumane Parag
2011-01-01
Minimally Invasive Single Site (MISS) surgery is a better terminology to explain the novel concept of scarless surgery, which is increasingly making its way into clinical practice. But there are some difficulties. We review the existing technologies for MISS surgery with regard to single-port devices, endoscope and camera, instruments, retractors and also the future perspectives for the evolution of MISS surgery. While we need to move ahead cautiously and wait for the development of appropriate technology, we believe that the "Ultimate form of Minimally Invasive Surgery" will be a hybrid form of MISS surgery and Natural Orifice Transluminal Endoscopic Surgery, complemented by technological innovations from the fields of robotics and computer-assisted surgery.
Quadrilateral mesh fitting that preserves sharp features based on multi-normals for Laplacian energy
Directory of Open Access Journals (Sweden)
Yusuke Imai
2014-04-01
Because the cost of performance testing using actual products is expensive, manufacturers use lower-cost computer-aided design simulations for this function. In this paper, we propose using hexahedral meshes, which are more accurate than tetrahedral meshes, for finite element analysis. We propose automatic hexahedral mesh generation with sharp features to precisely represent the corresponding features of a target shape. Our hexahedral mesh is generated using a voxel-based algorithm. In our previous works, we fit the surface of the voxels to the target surface using Laplacian energy minimization. We used normal vectors in the fitting to preserve sharp features. However, this method could not represent concave sharp features precisely. In this proposal, we improve our previous Laplacian energy minimization by adding a term that depends on multi-normal vectors instead of using normal vectors. Furthermore, we accentuate a convex/concave surface subset to represent concave sharp features.
International Nuclear Information System (INIS)
Guenaydin, Murat; Pavlyk, Oleksandr
2005-01-01
We study the symmetries of generalized spacetimes and corresponding phase spaces defined by Jordan algebras of degree three. The generic Jordan family of formally real Jordan algebras of degree three describe extensions of the Minkowskian spacetimes by an extra 'dilatonic' coordinate, whose rotation, Lorentz and conformal groups are SO(d-1), SO(d-1,1) x SO(1,1) and SO(d,2) x SO(2,1), respectively. The generalized spacetimes described by simple Jordan algebras of degree three correspond to extensions of Minkowskian spacetimes in the critical dimensions (d = 3,4,6,10) by a dilatonic and extra commuting spinorial coordinates, respectively. Their rotation, Lorentz and conformal groups are those that occur in the first three rows of the Magic Square. The Freudenthal triple systems defined over these Jordan algebras describe conformally covariant phase spaces. Following hep-th/0008063, we give a unified geometric realization of the quasiconformal groups that act on their conformal phase spaces extended by an extra 'cocycle' coordinate. For the generic Jordan family the quasiconformal groups are SO(d+2,4), whose minimal unitary realizations are given. The minimal unitary representations of the quasiconformal groups F4(4), E6(2), E7(-5) and E8(-24) of the simple Jordan family were given in our earlier work.
Soares, Marcelo B.; Efstratiadis, Argiris
1997-01-01
This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.
Broad Ligament Haematoma Following Normal Vaginal Delivery.
Ibrar, Faiza; Awan, Azra Saeed; Fatima, Touseef; Tabassum, Hina
2017-01-01
A 37-year-old patient presented in emergency with a history of normal vaginal delivery followed by the development of abdominal distention, vomiting, and constipation for the last 3 days. She was para 4 and had had a normal vaginal delivery, attended by a traditional birth attendant at a peripheral hospital, 3 days earlier. Imaging study revealed a heterogeneous complex mass, ascites, pleural effusion, and air-fluid levels with dilated gut loops. Based upon pelvic examination by a senior gynaecologist in combination with ultrasound, a clinical diagnosis of broad ligament haematoma was made. However, vomiting and abdominal distention raised suspicion of intestinal obstruction. Due to worsening abdominal distention, exploratory laparotomy was carried out. It revealed colonic pseudo-obstruction, and caecostomy was done. Timely intervention by a multidisciplinary approach saved the patient's life with minimal morbidity.
Kinetics of Tc-99m hexakis t-butyl isonitrile in normal and ischemic canine myocardium
International Nuclear Information System (INIS)
Williams, S.J.; Dragotakos, D.L.
1989-01-01
Hexakis-99mTc-tertiary butyl isonitrile (99mTc-TBI) was studied as a cardiac perfusion imaging agent in nine dogs with partial occlusion of the LAD. Thirty min after applying the stenosis, 99mTc-TBI was injected into the right atrium (RA) in five dogs and the left atrium (LA) in four dogs. Normal and ischemic zone regional myocardial 99mTc-TBI activities were monitored continuously for 4 h. Dogs with LA injections had minimal and equivalent 4 h fractional clearance from the normal and ischemic zones. Dogs with RA injections had minimal, but significantly lower, 4 h fractional 99mTc clearances in the ischemic zone (0.08±0.08) compared to the normal zone (0.16±0.07), making 99mTc-TBI a promising cardiac perfusion imaging agent. (orig.)
Investigations on quantum mechanics with minimal length
International Nuclear Information System (INIS)
Chargui, Yassine
2009-01-01
We consider a modified quantum mechanics where the coordinates and momenta are assumed to satisfy a non-standard commutation relation of the form [X_i, P_j] = iħ(δ_ij(1 + βP²) + β′P_iP_j). Such an algebra results in a generalized uncertainty relation which leads to the existence of a minimal observable length. Moreover, it incorporates UV/IR mixing and a noncommutative position space. We analyse the possible representations in terms of differential operators. The latter are used to study the low-energy effects of the minimal length by considering different quantum systems: the harmonic oscillator, the Klein-Gordon oscillator, the spinless Salpeter Coulomb problem, and the Dirac equation with a linear confining potential. We also discuss whether such effects are observable in precision measurements on a relativistic electron trapped in a strong magnetic field.
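How such an algebra produces a minimal observable length can be seen in the one-dimensional case with β′ = 0 (a standard textbook-style consequence of the deformed commutator, not a derivation specific to this thesis):

```latex
[X, P] = i\hbar\,(1 + \beta P^{2})
\;\Longrightarrow\;
\Delta X\,\Delta P \;\ge\; \frac{\hbar}{2}\left(1 + \beta(\Delta P)^{2}\right)
\qquad (\langle P \rangle = 0),
```

so that

```latex
\Delta X \;\ge\; \frac{\hbar}{2}\left(\frac{1}{\Delta P} + \beta\,\Delta P\right),
\qquad
\Delta X_{\min} = \hbar\sqrt{\beta}
\quad\text{at}\quad \Delta P = \frac{1}{\sqrt{\beta}}.
```

No choice of state can localize the particle below ΔX_min = ħ√β, which is the minimal observable length referred to in the abstract.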
Smooth quantile normalization.
Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada
2018-04-01
Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
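For orientation, classic quantile normalization, the method that qsmooth generalizes, can be written in a few lines of numpy. This sketch is not the qsmooth implementation linked above (which, roughly, computes reference quantiles within biological groups and smooths toward the global reference); the function name is an assumption for illustration.

```python
import numpy as np

def quantile_normalize(X):
    """Classic quantile normalization: force every column (sample) of X
    to share the same distribution, namely the mean of the sorted columns.
    X is a features-by-samples array with no ties within a column."""
    order = np.argsort(X, axis=0)                # per-sample sort order
    ranks = np.argsort(order, axis=0)            # rank of each value in its column
    reference = np.sort(X, axis=0).mean(axis=1)  # mean quantile across samples
    return reference[ranks]                      # map each rank to the reference
```

After the transform every sample has an identical distribution, which is exactly the assumption qsmooth relaxes when biological groups genuinely differ.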
Casara, Dario; Rubello, Domenico; Cauzzo, Cristina; Pelizzo, Maria Rosa
2002-01-01
The surgical approach to primary hyperparathyroidism (HPT) is changing. In patients with a high probability of being affected by a solitary parathyroid adenoma (PA), a unilateral neck exploration (UNE) or a minimally invasive radio-guided surgery (MIRS) using the intraoperative gamma probe (IGP) technique have recently been proposed. We investigated the role of IGP in a group of 84 patients with primary HPT who were homogeneously evaluated before surgery by a single-day imaging protocol including 99mTcO4/MIBI subtraction scan and neck ultrasound (US) and then operated on by the same surgical team. Quick parathyroid hormone (QPTH) was intraoperatively measured in all cases to confirm successful parathyroidectomy. In 70 patients with scan/US evidence of a single enlarged parathyroid gland (EPG) and with a normal thyroid gland, MIRS was planned. In the other 14 patients, the IGP technique was utilized during a standard bilateral neck exploration (BNE) because of the presence of concomitant nodular goiter (11 cases) or multiglandular disease (MGD) (3 cases). The IGP technique consisted of the following: (1) in the operating room, a low 99mTc-MIBI dose (37 MBq) was injected intravenously during anesthesia induction; (2) subsequently, the patient's neck was scanned with the probe by the surgeon to localize the cutaneous projection of the EPG; (3) in patients who underwent MIRS, the EPG was detected intraoperatively with the probe and removed through a small, 2 to 2.5 cm skin incision; (4) radioactivity was measured on the EPG both in vivo and ex vivo, and on the thyroid, the background and the parathyroid bed after EPG removal. In patients with concomitant nodular goiter, the radioactivity was also measured on the thyroid nodules. Surgical and pathologic findings were consistent with a single PA in 78 patients, parathyroid carcinoma in 2, and MGD in 4. MIRS was successfully performed in 67 of the 70 patients (97.7%) in whom this approach was planned. It must be pointed out that
10 CFR 71.71 - Normal conditions of transport.
2010-01-01
§ 71.71 Normal conditions of transport. (a) Evaluation. Evaluation of each package design under normal conditions of transport must include a determination of the effect on...
Directory of Open Access Journals (Sweden)
Naci Üngür
2015-06-01
Purpose: The aim of this study was to retrospectively assess the contribution of minimal preparation CT to the diagnosis of colorectal cancer in patients who were referred to the department of gastroenterology with a colorectal cancer prediagnosis and subsequently had a colonoscopically visible mass with histopathological proof. Materials and methods: 100 consecutive cases referred from the department of gastroenterology between September 2008 and December 2012 with a confirmed colonoscopic mass diagnosis were included in our study (age range: 18-90; 41 females and 59 males). Radiological findings were statistically compared with pathological findings as a gold standard. Results: In these patients with colonoscopically visible masses, minimal preparation CT revealed asymmetric wall thickening (n: 89), extracolonic mass (n: 3), symmetric wall thickening (n: 2) and normal wall thickness (n: 6). 79 cases had enlarged lymph nodes in pericolonic mesenteric fat tissue while the remaining had none (n: 21). 54 cases had stranding in pericolonic mesenteric fat tissue while the remaining individuals showed normal fat density. The masses were located in the rectum (n: 54), sigmoid colon (n: 17), descending colon (n: 10), transverse colon (n: 2), ascending colon (n: 14), and cecum (n: 3). Conclusion: For the investigation of colorectal and extracolonic masses we recommend minimal preparation CT, which is highly sensitive and more acceptable to patients.
Directory of Open Access Journals (Sweden)
Shlomo Trachtenberg
Spiroplasma melliferum is a wall-less bacterium with dynamic helical geometry. This organism is geometrically well defined and internally well ordered, and has an exceedingly small genome. Individual cells are chemotactic, polar, and swim actively. Their dynamic helicity can be traced at the molecular level to a highly ordered linear motor (composed essentially of the proteins fib and MreB) that is positioned on a defined helical line along the internal face of the cell's membrane. Using an array of complementary, informationally overlapping approaches, we have taken advantage of this uniquely simple, near-minimal life-form and its helical geometry to analyze the copy numbers of Spiroplasma's essential parts, as well as to elucidate how these components are spatially organized to subserve the whole living cell. Scanning transmission electron microscopy (STEM) was used to measure the mass-per-length and mass-per-area of whole cells, membrane fractions, intact cytoskeletons and cytoskeletal components. These local data were fit into whole-cell geometric parameters determined by a variety of light microscopy modalities. Hydrodynamic data obtained by analytical ultracentrifugation allowed computation of the hydration state of whole living cells, for which the relative amounts of protein, lipid, carbohydrate, DNA, and RNA were also estimated analytically. Finally, ribosome and RNA content, genome size and gene expression were also estimated (using stereology, spectroscopy and 2D-gel analysis, respectively). Taken together, the results provide a general framework for a minimal inventory and arrangement of the major cellular components needed to support life.
Schoenberg, Mike R; Rum, Ruba S
2017-11-01
Rapid, clear and efficient communication of neuropsychological results is essential to benefit patient care. Errors in communication are a leading cause of medical errors; nevertheless, there remains a lack of consistency in how neuropsychological scores are communicated. A major limitation in the communication of neuropsychological results is the inconsistent use of qualitative descriptors for standardized test scores and the use of vague terminology. A PubMed search from 1 Jan 2007 to 1 Aug 2016 was conducted to identify guidelines or consensus statements for the description and reporting of qualitative terms used to communicate neuropsychological test scores. The review found the use of confusing and overlapping terms to describe various ranges of percentile standardized test scores. In response, we propose a simplified set of qualitative descriptors for normalized test scores (Q-Simple) as a means to reduce errors in communicating test results. The Q-Simple qualitative terms are: 'very superior', 'superior', 'high average', 'average', 'low average', 'borderline' and 'abnormal/impaired'. A case example illustrates the proposed Q-Simple qualitative classification system to communicate neuropsychological results for neurosurgical planning. The Q-Simple qualitative descriptor system is aimed at improving and standardizing communication of standardized neuropsychological test scores. Research is needed to further evaluate neuropsychological communication errors. Conveying the clinical implications of neuropsychological results in a manner that minimizes risk for communication errors is a quintessential component of evidence-based practice. Copyright © 2017 Elsevier B.V. All rights reserved.
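A banded descriptor system like the one this abstract proposes can be sketched as a simple score-to-label mapping. The cut-points below are assumptions drawn from conventional Wechsler-style standard-score ranges (mean 100, SD 15), not values taken from the paper; only the seven Q-Simple labels come from the abstract.

```python
def q_simple(standard_score):
    """Map a standardized test score (mean 100, SD 15) to a Q-Simple
    qualitative descriptor. Cut-points are illustrative Wechsler-style
    conventions, NOT the paper's own definitions."""
    bands = [
        (130, "very superior"),
        (120, "superior"),
        (110, "high average"),
        (90, "average"),
        (80, "low average"),
        (70, "borderline"),
    ]
    for cutoff, label in bands:
        if standard_score >= cutoff:
            return label
    return "abnormal/impaired"
```

Under these assumed cut-points, a memory index of 85, for example, would be reported as 'low average'.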
Ares, Gastón; Giménez, Ana; Gámbaro, Adriana
2008-01-01
The aim of the present work was to study the influence of context, particularly the stage of the decision-making process (purchase vs consumption stage), on sensory shelf life of minimally processed lettuce. Leaves of butterhead lettuce were placed in common polypropylene bags and stored at 5, 10 and 15 degrees C. Periodically, a panel of six assessors evaluated the appearance of the samples, and a panel of 40 consumers evaluated their appearance and answered "yes" or "no" to the questions: "Imagine you are in a supermarket, you want to buy a minimally processed lettuce, and you find a package of lettuce with leaves like this, would you normally buy it?" and "Imagine you have this leaf of lettuce stored in your refrigerator, would you normally consume it?". Survival analysis was used to calculate the shelf lives of minimally processed lettuce, considering both decision-making stages. Shelf lives estimated considering rejection to purchase were significantly lower than those estimated considering rejection to consume. Therefore, in order to be conservative and assure the products' quality, shelf life should be estimated considering consumers' rejection to purchase instead of rejection to consume, as traditionally has been done. On the other hand, results from logistic regressions of consumers' rejection percentage as a function of the evaluated appearance attributes suggested that consumers considered them differently while deciding whether to purchase or to consume minimally processed lettuce.
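The shelf-life estimation described above can be illustrated in a simplified form: given the fraction of consumers rejecting the product at each storage time, interpolate the time at which rejection first crosses 50%. This is a stand-in sketch for the censored survival models actually fitted in such studies; the function name and the data below are hypothetical.

```python
def shelf_life_50(times, reject_fraction):
    """Storage time at which consumer rejection first crosses 50%,
    by linear interpolation between observed storage times. A simplified
    stand-in for the survival-analysis estimates used in the study."""
    pairs = list(zip(times, reject_fraction))
    for (t0, p0), (t1, p1) in zip(pairs, pairs[1:]):
        if p0 <= 0.5 <= p1:
            return t0 + (0.5 - p0) * (t1 - t0) / (p1 - p0)
    return None  # rejection never reached 50% within the observed times

# Hypothetical rejection curves: rejection-to-purchase rises faster
# than rejection-to-consume, so its estimated shelf life is shorter.
days = [0, 2, 4, 6]
purchase = [0.0, 0.2, 0.6, 0.9]
consume = [0.0, 0.1, 0.3, 0.7]
```

With these hypothetical curves, `shelf_life_50(days, purchase)` is smaller than `shelf_life_50(days, consume)`, mirroring the paper's conservative recommendation to estimate shelf life from rejection to purchase.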
The stability of the femoral component of a minimal invasive total hip replacement system.
Willems, M.M.M.; Kooloos, J.G.M.; Gibbons, P.; Minderhoud, N.; Weernink, T.; Verdonschot, N.J.J.
2006-01-01
In this study, the initial stability of the femoral component of a minimal invasive total hip replacement was biomechanically evaluated during simulated normal walking and chair rising. A 20 mm diameter canal was created in the femoral necks of five fresh frozen human cadaver bones and the femoral
Minimally invasive orthognathic surgery.
Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J
2009-02-01
Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.
Regularity of Minimal Surfaces
Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht
2010-01-01
"Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained. One of the main features of free boundary value problems for minimal surfaces is t
Takiyama, Ken
2015-01-01
Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this t...
A comparison of vowel normalization procedures for language variation research
Adank, Patti; Smits, Roel; van Hout, Roeland
2004-11-01
An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made of 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels ("vowel-extrinsic" information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself ("vowel-intrinsic" information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., "formant-extrinsic" F2-F1).
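One widely used vowel-extrinsic, formant-intrinsic procedure of the kind this class of studies compares is Lobanov-style z-score normalization, sketched below; choosing Lobanov as the concrete example is our assumption, not a claim about which specific procedure performed best in this evaluation.

```python
from statistics import mean, pstdev

def lobanov_normalize(tokens):
    """Vowel-extrinsic normalization: z-score each formant separately
    across all of one talker's vowel tokens (Lobanov-style).
    tokens: list of (F1, F2, F3) tuples, in Hz, for a single talker."""
    columns = list(zip(*tokens))                       # one column per formant
    stats = [(mean(col), pstdev(col)) for col in columns]
    return [
        tuple((f - m) / s for f, (m, s) in zip(tok, stats))
        for tok in tokens
    ]
```

Because every token of a talker contributes to the per-formant mean and standard deviation, the procedure uses vowel-extrinsic information while still operating on individual formants.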
Minimal Poems Written in 1979 Minimal Poems Written in 1979
Directory of Open Access Journals (Sweden)
Sandra Sirangelo Maggio
2008-04-01
Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I saw a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity in this new literary trend known as Minimalism.
The minimal energetic requirement of sustained awareness after brain injury
DEFF Research Database (Denmark)
Stender, Johan; Mortensen, Kristian Nygaard; Thibaut, Aurore
2016-01-01
of glucose has been proposed as an indicator of consciousness [2 and 3]. Likewise, FDG-PET may contribute to the clinical diagnosis of disorders of consciousness (DOCs) [4 and 5]. However, current methods are non-quantitative and have important drawbacks deriving from visually guided assessment of relative changes in brain metabolism [4]. We here used FDG-PET to measure resting-state brain glucose metabolism in 131 DOC patients to identify objective quantitative metabolic indicators and predictors of awareness. Quantitation of images was performed by normalizing to extracerebral tissue. We show that 42% of normal cortical activity represents the minimal energetic requirement for the presence of conscious awareness. Overall, the cerebral metabolic rate accounted for the current level, or imminent return, of awareness in 94% of the patient population, suggesting a global energetic threshold effect...
Weak convergence and uniform normalization in infinitary rewriting
DEFF Research Database (Denmark)
Simonsen, Jakob Grue
2010-01-01
the starkly surprising result that for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly...
Normalized Excited Squeezed Vacuum State and Its Applications
International Nuclear Information System (INIS)
Meng Xiangguo; Wang Jisuo; Liang Baolong
2007-01-01
By using the intermediate coordinate-momentum representation in quantum optics and a generating function for the normalization of the excited squeezed vacuum state (ESVS), the normalized ESVS is obtained. We find that the normalization constants obtained via the two new methods agree, and take a new form which differs from the result obtained by Zhang and Fan [Phys. Lett. A 165 (1992) 14]. By virtue of the normalization constant of the ESVS and the intermediate coordinate-momentum representation, the tomogram of the normalized ESVS and some useful formulae are derived.
Experience with the EPA manual for waste minimization opportunity assessments
International Nuclear Information System (INIS)
Bridges, J.S.
1990-01-01
The EPA Waste Minimization Opportunity Assessment Manual (EPA/625/788/003) was published to assist those responsible for managing waste minimization activities at the waste-generating facility and at corporate levels. The Manual sets forth a procedure that incorporates technical and managerial principles and motivates people to develop and implement pollution prevention concepts and ideas. Environmental management has increasingly become a cooperative endeavor whereby, whether in government, industry, or other forms of enterprise, the effectiveness with which people work together toward the attainment of a clean environment is largely determined by the ability of those who hold managerial positions. This paper offers a description of the EPA Waste Minimization Opportunity Assessment Manual procedure, which supports the waste minimization assessment as a systematic planned procedure with the objective of identifying ways to reduce or eliminate waste generation. The Manual is a management tool that blends science and management principles. The practice of managing waste minimization/pollution prevention makes use of the underlying organized science and engineering knowledge and applies it in the light of realities to gain a desired, practical result. The early stages of EPA's Pollution Prevention Research Program centered on the development of the Manual and its use at a number of facilities within the private and public sectors. This paper identifies a number of case studies and waste minimization opportunity assessment reports that demonstrate the value of using the Manual's approach. Several industry-specific waste minimization assessment manuals have resulted from the Manual's generic approach to waste minimization. There were some modifications to the Manual's generic approach when the waste stream was other than industrial hazardous waste.
International Nuclear Information System (INIS)
Strel'tsov, V.N.
1992-01-01
The physical sense of three forms of relativity is discussed. The first - the instant form - reflects in fact the traditional approach based on the concept of instant distance. The normal form corresponds to the radar formulation, which is based on light or retarded distances. The front form in the special case is characterized by 'observable' variables, and the known method of the k-coefficient is its obvious expression. 16 refs
Random Generators and Normal Numbers
Bailey, David H.; Crandall, Richard E.
2002-01-01
Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
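Constants of the class discussed can be computed digit-by-digit with exact rational arithmetic. The sketch below evaluates a partial sum of a Stoneham-type series, α(b,c) = Σ_{i≥1} 1/(c^i · b^(c^i)) (one member of the family Σ_i 1/(b^{m_i} c^{n_i}), assuming this particular choice of exponent sequences), and extracts its leading base-b digits.

```python
from fractions import Fraction

def stoneham_digits(b, c, n_terms, n_digits):
    """Leading base-b digits of the partial sum of the Stoneham-type
    series sum_{i>=1} 1/(c**i * b**(c**i)), computed exactly with
    rational arithmetic so no floating-point error creeps in."""
    x = sum(Fraction(1, c**i * b**(c**i)) for i in range(1, n_terms + 1))
    digits = []
    for _ in range(n_digits):
        x *= b
        d = int(x)          # integer part is the next base-b digit
        digits.append(d)
        x -= d
    return digits
```

For b=2, c=3 the sum begins 1/24 + 1/(9·2^27) + ..., so the first binary digits match the binary expansion of 1/24; later terms are provably b-normal-generating per the paper's result, though normality itself is of course not visible from finitely many digits.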
2010-07-01
... OSHA 300 Log. Instead, enter “privacy case” in the space normally used for the employee's name. This...) Basic requirement. You must use OSHA 300, 300-A, and 301 forms, or equivalent forms, for recordable injuries and illnesses. The OSHA 300 form is called the Log of Work-Related Injuries and Illnesses, the 300...
Leck, Kira
2006-10-01
Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.
Masturbation, sexuality, and adaptation: normalization in adolescence.
Shapiro, Theodore
2008-03-01
During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.
Minimalism in architecture: Abstract conceptualization of architecture
Directory of Open Access Journals (Sweden)
Vasilski Dragana
2015-01-01
Full Text Available Minimalism in architecture contains the idea of the minimum as a leading creative tenet, to be considered and interpreted through the phenomena of empathy and abstraction. In Western culture, the root of this idea is found in the empathy of Wilhelm Worringer and the abstraction of Kasimir Malevich. In his dissertation 'Abstraction and Empathy', Worringer presented his thesis on the psychology of style, through which he explained two opposing basic forms: abstraction and empathy. His conclusion on empathy as a psychological basis of observed expression is significant due to its verbal congruence with contemporary minimalist expression. His intuition was further reinforced by the figure of Malevich. Abstraction, as an expression of inner unfettered inspiration, played a crucial role in the development of modern art and architecture in the twentieth century. Abstraction, which is one of the basic methods of learning in psychology (separating relevant from irrelevant features; Carl Jung), is used to discover ideas. Minimalism in architecture emphasizes the level of abstraction to which the individual functions are reduced. Different types of abstraction are present, in the form as well as the function of the basic elements: walls and windows. The case study is the example of Sou Fujimoto, who is unequivocal in his commitment to the autonomy of abstract conceptualization of architecture.
DEFF Research Database (Denmark)
Antola, M.; Di Chiara, S.; Sannino, F.
2011-01-01
We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills which breaks to N=1 via the electroweak interactions. This is a well defined, economical..., between unparticle physics and Minimal Walking Technicolor. We consider also other N=1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass.
Chern-Simons forms and four-dimensional N=1 superspace geometry
International Nuclear Information System (INIS)
Girardi, G.; Grimm, R.
1986-12-01
The complete superspace geometry for Yang-Mills, chiral U(1) and Lorentz Chern-Simons forms is constructed. The analysis is completely off-shell and covers the cases of minimal, new minimal and 16-16 supergravity. Supersymmetry is guaranteed by construction. Invariant superfield actions are proposed
Normalization of satellite imagery
Kim, Hongsuk H.; Elman, Gregory C.
1990-01-01
Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance and tonal signature of multi-band color imagery can be directly interpreted for quantitative information of the target.
Wang, Dongyang; Ba, Dechun; Hao, Ming; Duan, Qihui; Liu, Kun; Mei, Qi
2018-05-01
Pneumatic NC (normally closed) valves are widely used in high-density microfluidic systems. To improve actuation reliability, the actuation pressure needs to be reduced. In this work, we utilize 3D FEM (finite element method) modelling to gain numerical insight into the valve actuation process. Specifically, the progressive debonding process at the elastomer interface is simulated with the CZM (cohesive zone model) method. To minimize the actuation pressure, a V-shape design has been investigated and compared with a normal straight design. The geometrical effects of valve shape have been elaborated in terms of valve actuation pressure. Based on our simulated results, we formulate the main concerns for microvalve design and fabrication, which are significant for minimizing actuation pressures and ensuring reliable operation.
Normal range of facial asymmetry in spherical coordinates: a CBCT study
Energy Technology Data Exchange (ETDEWEB)
Yoon, Suk Ja [Dept. of Oral and Maxillofacial Radiology, School of Dentistry, Dental Science Research Institute, Chonnam National University, Gwangju (Korea, Republic of); Wang, Rui Feng [Research Laboratory Specialist Intermediate, Department of Biologic and Material Sciences, School of Dentistry, University of Michigan, Ann Arbor, MI (United States); Na, Hee Ja [Dept. of Dental Hygiene, Honam University, Gwangju (Korea, Republic of); Palomo, Juan Martin [Dept. of Orthodontics, School of Dental Medicine, Case Western Reserve University, Cleveland (United States)
2013-03-15
This study aimed to measure the bilateral differences of facial lines in spherical coordinates from faces within a normal range of asymmetry utilizing cone-beam computed tomography (CBCT). CBCT scans from 22 females with normal symmetric-looking faces (mean age 24 years and 8 months) were selected for the study. The average menton deviation was 1.01±0.66 mm. The spherical coordinates, length, and midsagittal and coronal inclination angles of the ramal and mandibular lines were calculated from CBCT. The bilateral differences in the facial lines were determined. All of the study subjects had minimal bilateral differences of facial lines. The normal range of facial asymmetry of the ramal and mandibular lines was obtained in spherical coordinates. The normal range of facial asymmetry in the spherical coordinate system in this study should be useful as a reference for diagnosing facial asymmetry.
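Describing a facial line in spherical coordinates amounts to a Cartesian-to-spherical conversion of its endpoint vector. The abstract does not specify the axis conventions behind the midsagittal and coronal inclination angles, so the sketch below uses a generic convention (polar angle measured from +z, azimuth from +x) purely for illustration.

```python
import math

def to_spherical(x, y, z):
    """Convert Cartesian coordinates (mm) to spherical (r, theta, phi):
    r     - radial distance (the line's length if the origin is one endpoint),
    theta - polar angle from the +z axis, in radians,
    phi   - azimuth from the +x axis in the xy-plane, in radians.
    Axis convention here is an illustrative assumption, not the paper's."""
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r > 0 else 0.0
    phi = math.atan2(y, x)
    return r, theta, phi
```

A left/right asymmetry measure can then be formed by differencing (r, theta, phi) for the mirrored ramal or mandibular lines of the two sides.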
DEFF Research Database (Denmark)
2010-01-01
Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.
Anomalous normal mode oscillations in semiconductor microcavities
Energy Technology Data Exchange (ETDEWEB)
Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)
1997-04-01
Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under impulsive excitation by a short laser pulse, the optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four-wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.
Right thoracic curvature in the normal spine
Directory of Open Access Journals (Sweden)
Masuda Keigo
2011-01-01
Full Text Available Abstract Background Trunk asymmetry and vertebral rotation, at times observed in the normal spine, resemble the characteristics of adolescent idiopathic scoliosis (AIS). Right thoracic curvature has also been reported in the normal spine. If it is determined that the features of right thoracic curvature in the normal spine are the same as those observed in AIS, these findings might provide a basis for elucidating the etiology of this condition. For this reason, we investigated right thoracic curvature in the normal spine. Methods For normal spinal measurements, 1,200 patients who underwent posteroanterior chest radiographs were evaluated. These consisted of 400 children (ages 4-9), 400 adolescents (ages 10-19) and 400 adults (ages 20-29), with each group comprised of both genders. The exclusion criteria were obvious chest and spinal diseases. As side curvature is minimal in normal spines and the range at which curvature is measured is difficult to ascertain, first the typical curvature range in scoliosis patients was determined and then the Cobb angle in normal spines was measured using the same range as the scoliosis curve, from T5 to T12. Right thoracic curvature was given a positive value. The curve pattern in each group was classified into three categories: neutral (from -1 degree to 1 degree), right (> +1 degree), and left (< -1 degree). Results In the child group, 120 subjects showed a left curve, 125 were neutral, and 155 showed a right curve. In the adolescent group, 70 showed a left curve, 114 were neutral, and 216 showed a right curve. In the adult group, 46 showed a left curve, 102 were neutral, and 252 showed a right curve. The curvature pattern shifts to the right side in the adolescent group. Conclusions Based on standing chest radiographic measurements, a right thoracic curvature was observed in normal spines after adolescence.
CT in normal pressure hydrocephalus
International Nuclear Information System (INIS)
Fujita, Katsuzo; Nogaki, Hidekazu; Noda, Masaya; Kusunoki, Tadaki; Tamaki, Norihiko
1981-01-01
CT scans were obtained on 33 patients (ages 31 to 73 years) with the diagnosis of normal pressure hydrocephalus. In each case, the diagnosis was made on the basis of the symptoms and the CT and cisternographic findings. The underlying diseases of normal pressure hydrocephalus were ruptured aneurysms (21 cases), arteriovenous malformations (2 cases), head trauma (1 case), cerebrovascular accidents (1 case) and idiopathic (8 cases). Sixteen of the 33 patients showed marked improvement; five, moderate or minimal improvement; and twelve, no change. The results were compared with the CT findings and the clinical response to shunting. CT findings were classified into five types, based on the degree of periventricular hypodensity (P.V.H.), the extent of brain damage by underlying diseases, and the degree of cortical atrophy. In the 17 cases of type (I), CT showed the presence of P.V.H. with or without minimal frontal lobe damage and no cortical atrophy; good surgical improvement was achieved in all cases of type (I) by shunting. In the 4 cases of type (II), CT showed the presence of P.V.H. and severe brain damage without cortical atrophy; fair clinical improvement was achieved in 2 cases (50%) by shunting. In the one case of type (III), CT showed the absence of P.V.H. without brain damage or cortical atrophy; no clinical improvement was obtained by shunting in this type. In the 9 cases of type (IV), with mild cortical atrophy, fair clinical improvement was achieved in two cases (22%) and no improvement in 7 cases. In the 2 cases of type (V), with moderate or marked cortical atrophy, no clinical improvement was obtained by shunting. In conclusion, it appeared from the present study that there was a good correlation between the result of shunting and the type of CT, and the clinical response to a shunting operation might be predicted by classification of the CT findings. (author)
International Nuclear Information System (INIS)
Durand, S.; Nikolova, M.
2006-01-01
Many estimation problems amount to minimizing a piecewise C^m objective function, with m ≥ 2, composed of a quadratic data-fidelity term and a general regularization term. It is widely accepted that the minimizers obtained using non-convex and possibly non-smooth regularization terms are frequently good estimates. However, few facts are known about the ways to control properties of these minimizers. This work is dedicated to the stability of the minimizers of such objective functions with respect to variations of the data. It consists of two parts: first we consider all local minimizers, whereas in the second part we derive results on global minimizers. In this part we focus on data points such that every local minimizer is isolated and results from a C^(m-1) local minimizer function defined on some neighborhood. We demonstrate that all data points for which this fails form a set whose closure is negligible.
Normal radiographic findings. 4th updated edition
International Nuclear Information System (INIS)
Moeller, T.B.
2003-01-01
This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-medium (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, which structures to examine in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)
Waste forms for plutonium disposition
International Nuclear Information System (INIS)
Johnson, S.G.; O'Holleran, T.P.; Frank, S.M.; Meyer, M.K.; Hanson, M.; Staples, B.A.; Knecht, D.A.; Kong, P.C.
1997-01-01
The field of plutonium disposition is varied and of much importance, since the Department of Energy has decided on the hybrid option for disposing of the weapons materials. This consists of either placing the Pu into mixed oxide fuel for reactors or placing the material into a stable waste form such as glass. The waste form used for Pu disposition should exhibit certain qualities: (1) provide for a suitable deterrent to guard against proliferation; (2) be of minimal volume, i.e., maximize the loading; and (3) be reasonably durable under repository-like conditions. This paper will discuss several Pu waste forms that display promising characteristics
[A psychosocial view of a number of Jewish mourning rituals during normal and pathological grief].
Maoz, Benyamin; Lauden, Ari; Ben-Zion, Itzhak
2004-04-01
This article describes the three stages of normal and pathological mourning, emphasizing the constellation embodied in Judaism for this process. These stages are: shock; acute mourning; and working through and reconciliation. We present the important question: "How is pathological mourning to be defined?" It is certainly not only a matter of extending beyond the accepted time limits of the mourning process, but also a question of the intensity of mourning in one's daily life, the degree of preoccupation with it, and the degree of priority that the mourning process has in an individual's life. A number of forms of pathological mourning during the three stages are described, with special attention to Jewish mourning rituals, especially the "rending of the garments" (Kriyah), the Kaddish, the Shiva, and the termination of mourning after a fixed period of time. One possible interpretation of these rituals is that they prevent and neutralize manifestations of aggression and violence. This is analogous to the function of biological (genetic) rituals which, according to the theory of Konrad Lorenz, also minimize dangerous aggression within a species in nature. The religious ritual converts an aggressive behavior into a minimal and symbolic action, often re-directed, so that an originally dangerous behavior becomes a ritual with an important communicative function.
Risk-optimized proton therapy to minimize radiogenic second cancers
Rechner, Laura A.; Eley, John G.; Howell, Rebecca M.; Zhang, Rui; Mirkovic, Dragan; Newhauser, Wayne D.
2015-01-01
Proton therapy confers substantially lower predicted risk of second cancer compared with photon therapy. However, no previous studies have used an algorithmic approach to optimize beam angle or fluence-modulation for proton therapy to minimize those risks. The objectives of this study were to demonstrate the feasibility of risk-optimized proton therapy and to determine the combination of beam angles and fluence weights that minimize the risk of second cancer in the bladder and rectum for a prostate cancer patient. We used 6 risk models to predict excess relative risk of second cancer. Treatment planning utilized a combination of a commercial treatment planning system and an in-house risk-optimization algorithm. When normal-tissue dose constraints were incorporated in treatment planning, the risk model that incorporated the effects of fractionation, initiation, inactivation, and repopulation selected a combination of anterior and lateral beams, which lowered the relative risk by 21% for the bladder and 30% for the rectum compared to the lateral-opposed beam arrangement. Different results were obtained with the other risk models. PMID:25919133
A convergent overlapping domain decomposition method for total variation minimization
Fornasier, Massimo
2010-06-22
In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation constraint. To our knowledge, this is the first successful attempt of addressing such a strategy for the nonlinear, nonadditive, and nonsmooth problem of total variation minimization. We provide several numerical experiments, showing the successful application of the algorithm for the restoration of 1D signals and 2D images in interpolation/inpainting problems, respectively, and in a compressed sensing problem, for recovering piecewise constant medical-type images from partial Fourier ensembles. © 2010 Springer-Verlag.
New component-based normalization method to correct PET system models
International Nuclear Information System (INIS)
Kinouchi, Shoko; Miyoshi, Yuji; Suga, Mikio; Yamaya, Taiga; Yoshida, Eiji; Nishikido, Fumihiko; Tashima, Hideaki
2011-01-01
Normalization correction is necessary to obtain high-quality reconstructed images in positron emission tomography (PET). There are two basic types of normalization methods: the direct method and component-based methods. The former suffers from the problem that a huge count number is required in the blank scan data. The latter methods have therefore been proposed to obtain normalization coefficients with high statistical accuracy from a small count number in the blank scan data. In iterative image reconstruction methods, on the other hand, the quality of the reconstructed images depends on the accuracy of the system modeling. Therefore, the normalization weighting approach, in which normalization coefficients are applied directly to the system matrix instead of to a sinogram, has been proposed. In this paper, we propose a new component-based normalization method to correct errors in the system model. In the proposed method, two components are defined and calculated iteratively in such a way as to minimize the errors of system modeling. To compare the proposed method and the direct method, we applied both to our small OpenPET prototype system. We achieved acceptable statistical accuracy of normalization coefficients while reducing the count number of the blank scan data to one-fortieth of that required by the direct method. (author)
A non-minimally coupled quintom dark energy model on the warped DGP brane
International Nuclear Information System (INIS)
Nozari, K; Azizi, T; Setare, M R; Behrouz, N
2009-01-01
We construct a quintom dark energy model with two non-minimally coupled scalar fields, one quintessence and the other phantom field, confined to the warped Dvali-Gabadadze-Porrati (DGP) brane. We show that this model accounts for crossing of the phantom divide line in appropriate subspaces of the model parameter space. This crossing occurs for both normal and self-accelerating branches of this DGP-inspired setup.
Normal and Abnormal Behavior in Early Childhood
Spinner, Miriam R.
1981-01-01
Evaluation of normal and abnormal behavior in the period from birth to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.
Manoussakis, G.; Delikaraoglou, D.
2011-01-01
In this paper we form relations for the determination of the elements of the Eötvös matrix of the Earth's normal gravity field. In addition, a relation between the Gauss curvature of the normal equipotential surface and the Gauss curvature of the actual equipotential surface, both passing through the point P, is presented. For this purpose we use a global Cartesian system (X, Y, Z) and use the variables X and Y to form a local parameterization of a normal equipotential surface to describe its ...
Directory of Open Access Journals (Sweden)
Savvoula Savvidou
2014-04-01
A case report of a young male with remarkable jaundice due to acute anabolic androgen-induced cholestasis is presented. Interestingly, γ-glutamyl transpeptidase remained normal throughout the patient's diagnostic workup. Histopathology was indicative of pure, "bland" intrahepatic cholestasis with minimal inflammation but significant fibrosis. The patient was successfully treated with ursodeoxycholic acid and glucocorticosteroids. The significance of normal γ-glutamyl transpeptidase along with the histopathological findings and the possible pathophysiological mechanisms are finally discussed. [J Interdiscipl Histopathol 2014; 2(2): 98-103]
UPGMA and the normalized equidistant minimum evolution problem
Moulton, Vincent; Spillner, Andreas; Wu, Taoyang
2017-01-01
UPGMA (Unweighted Pair Group Method with Arithmetic Mean) is a widely used clustering method. Here we show that UPGMA is a greedy heuristic for the normalized equidistant minimum evolution (NEME) problem, that is, finding a rooted tree that minimizes the minimum evolution score relative to the dissimilarity matrix among all rooted trees with the same leaf-set in which all leaves have the same distance to the root. We prove that the NEME problem is NP-hard. In addition, we present some heurist...
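The greedy heuristic step that UPGMA performs, repeatedly merging the closest pair of clusters and averaging distances with size weights, can be sketched as follows; this is a generic illustration of UPGMA itself (data layout and names are assumptions), not the authors' NEME construction:

```python
# Generic UPGMA sketch: greedily merge the two closest clusters; the distance
# from the merged cluster to any other is the size-weighted arithmetic mean.
def upgma(dist, labels):
    clusters = {i: (labels[i], 1) for i in range(len(labels))}
    d = {(i, j): dist[i][j]
         for i in range(len(labels)) for j in range(i + 1, len(labels))}
    nxt = len(labels)  # id for the next merged cluster
    while len(clusters) > 1:
        i, j = min(d, key=d.get)               # closest pair
        (ti, ni), (tj, nj) = clusters.pop(i), clusters.pop(j)
        for k in list(clusters):               # update distances to survivors
            dik = d.pop((min(i, k), max(i, k)))
            djk = d.pop((min(j, k), max(j, k)))
            d[(min(nxt, k), max(nxt, k))] = (ni * dik + nj * djk) / (ni + nj)
        del d[(i, j)]
        clusters[nxt] = ((ti, tj), ni + nj)    # nested-tuple tree
        nxt += 1
    [(tree, _)] = clusters.values()
    return tree

# A and B are closest (distance 2), so they merge first.
assert upgma([[0, 2, 6], [2, 0, 6], [6, 6, 0]], ["A", "B", "C"]) == ("C", ("A", "B"))
```

Because every leaf sits at the same averaged height below its ancestors, the output is an equidistant (ultrametric) tree, which is exactly the class of trees the NEME problem optimizes over.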
Cell cycle control by a minimal Cdk network.
Directory of Open Access Journals (Sweden)
Claude Gérard
2015-02-01
Full Text Available In present-day eukaryotes, the cell division cycle is controlled by a complex network of interacting proteins, including members of the cyclin and cyclin-dependent protein kinase (Cdk families, and the Anaphase Promoting Complex (APC. Successful progression through the cell cycle depends on precise, temporally ordered regulation of the functions of these proteins. In light of this complexity, it is surprising that in fission yeast, a minimal Cdk network consisting of a single cyclin-Cdk fusion protein can control DNA synthesis and mitosis in a manner that is indistinguishable from wild type. To improve our understanding of the cell cycle regulatory network, we built and analysed a mathematical model of the molecular interactions controlling the G1/S and G2/M transitions in these minimal cells. The model accounts for all observed properties of yeast strains operating with the fusion protein. Importantly, coupling the model's predictions with experimental analysis of alternative minimal cells, we uncover an explanation for the unexpected fact that elimination of inhibitory phosphorylation of Cdk is benign in these strains while it strongly affects normal cells. Furthermore, in the strain without inhibitory phosphorylation of the fusion protein, the distribution of cell size at division is unusually broad, an observation that is accounted for by stochastic simulations of the model. Our approach provides novel insights into the organization and quantitative regulation of wild type cell cycle progression. In particular, it leads us to propose a new mechanistic model for the phenomenon of mitotic catastrophe, relying on a combination of unregulated, multi-cyclin-dependent Cdk activities.
Legal incentives for minimizing waste
International Nuclear Information System (INIS)
Clearwater, S.W.; Scanlon, J.M.
1991-01-01
Waste minimization, or pollution prevention, has become an integral component of federal and state environmental regulation. Minimizing waste offers many economic and public relations benefits. In addition, waste minimization efforts can also dramatically reduce potential criminal liability. This paper addresses the legal incentives for minimizing waste under current and proposed environmental laws and regulations.
Energy Technology Data Exchange (ETDEWEB)
Doerr, W. [Medizinische Univ. Wien (Austria). Universitaetsklinik fuer Strahlentherapie; Medizinische Univ. Wien (Austria). Universitaetsklinik fuer Radioonkologie; Medizinische Univ. Wien (Austria). Christian Doppler Labor fuer Medizinische Strahlenforschung fuer die Radioonkologie; Herskind, C. [Universitaetsmedizin Mannheim, Heidelberg Univ., Mannheim (Germany). Labor fuer Zellulaere und Molekulare Radioonkologie
2012-11-15
Radiotherapy always involves exposure of normal tissue, resulting in an accepted risk of complications. The tolerated side-effect rate is therefore a compromise between optimizing the tumor dose and minimizing side effects. The report covers the target cell hypothesis and its consequences, new aspects of the pathogenesis of normal tissue reactions, and strategies for the targeted reduction of normal tissue effects. The complexity of the radiobiological processes, the specificity of the action mechanisms, and the mutual interactions of chemical and radiological processes require further coordinated radiobiological research in the future.
MOCUS, Minimal Cut Sets and Minimal Path Sets from Fault Tree Analysis
International Nuclear Information System (INIS)
Fussell, J.B.; Henry, E.B.; Marshall, N.H.
1976-01-01
1 - Description of problem or function: From a description of the Boolean failure logic of a system, called a fault tree, and control parameters specifying the minimal cut set length to be obtained, MOCUS determines the system failure modes, or minimal cut sets, and the system success modes, or minimal path sets. 2 - Method of solution: MOCUS uses direct resolution of the fault tree into the cut and path sets. The algorithm starts with the main failure of interest, the top event, and proceeds to basic independent component failures, called primary events, to resolve the fault tree and obtain the minimal sets. A key point of the algorithm is that an AND gate alone always increases the size of cut sets and the number of path sets, while an OR gate alone always increases the number of cut sets and the size of path sets. Other types of logic gates must be described in terms of AND and OR gates. 3 - Restrictions on the complexity of the problem: Output from MOCUS can include minimal cut and path sets for up to 20 gates.
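The top-down resolution described in point 2 can be sketched as follows; this is a generic illustration of the MOCUS-style algorithm (the gate encoding and function names are assumptions for illustration, not the original code):

```python
# MOCUS-style top-down resolution of a fault tree into minimal cut sets.
# Gates are encoded as ("AND", children) or ("OR", children); leaves are
# primary-event names (plain strings).
def minimal_cut_sets(tree, top):
    rows = [[top]]                      # each row is one candidate cut set
    changed = True
    while changed:
        changed = False
        new_rows = []
        for row in rows:
            gate = next((e for e in row if e in tree), None)
            if gate is None:            # only primary events left
                new_rows.append(row)
                continue
            changed = True
            kind, children = tree[gate]
            rest = [e for e in row if e != gate]
            if kind == "AND":           # AND enlarges the cut set
                new_rows.append(rest + children)
            else:                       # OR multiplies the number of rows
                for child in children:
                    new_rows.append(rest + [child])
        rows = new_rows
    # keep only minimal sets: discard any row that is a superset of another
    sets_ = [frozenset(r) for r in rows]
    minimal = [s for s in sets_ if not any(t < s for t in sets_)]
    return {tuple(sorted(s)) for s in minimal}

tree = {"TOP": ("OR", ["G1", "A"]), "G1": ("AND", ["B", "C"])}
assert minimal_cut_sets(tree, "TOP") == {("A",), ("B", "C")}
```

Minimal path sets are obtained the same way after swapping every AND gate for an OR gate and vice versa (the dual tree), which mirrors the gate/set duality stated above.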
A Denotational Account of Untyped Normalization by Evaluation
DEFF Research Database (Denmark)
Filinski, Andrzej; Rohde, Henning Korsholm
2004-01-01
Abstract. We show that the standard normalization-by-evaluation construction for the simply-typed λβη-calculus has a natural counterpart for the untyped λβ-calculus, with the central type-indexed logical relation replaced by a "recursively defined" invariant relation, in the style of Pitts. In fact, the construction can be seen as generalizing a computational adequacy argument for an untyped, call-by-name language to normalization instead of evaluation. In the untyped setting, not all terms have normal forms, so the normalization function is necessarily partial. We establish its correctness in the senses...
Furnes, Bjarte; Norman, Elisabeth
2015-08-01
Metacognition refers to 'cognition about cognition' and includes metacognitive knowledge, strategies and experiences (Efklides, 2008; Flavell, 1979). Research on reading has shown that better readers demonstrate more metacognitive knowledge than poor readers (Baker & Beall, 2009), and that reading ability improves through strategy instruction (Gersten, Fuchs, Williams, & Baker, 2001). The current study is the first to specifically compare the three forms of metacognition in dyslexic (N = 22) versus normally developing readers (N = 22). Participants read two factual texts, with learning outcome measured by a memory task. Metacognitive knowledge and skills were assessed by self-report. Metacognitive experiences were measured by predictions of performance and judgments of learning. Individuals with dyslexia showed insight into their reading problems, but less general knowledge of how to approach text reading. They more often reported lack of available reading strategies, but groups did not differ in the use of deep and surface strategies. Learning outcome and mean ratings of predictions of performance and judgments of learning were lower in dyslexic readers, but not the accuracy with which metacognitive experiences predicted learning. Overall, the results indicate that dyslexic reading and spelling problems are not generally associated with lower levels of metacognitive knowledge, metacognitive strategies or sensitivity to metacognitive experiences in reading situations. 2015 The Authors. Dyslexia Published by John Wiley & Sons Ltd.
International Nuclear Information System (INIS)
Roy Choudhury, S.
2007-01-01
The Ostrovsky equation is an important canonical model for the unidirectional propagation of weakly nonlinear long surface and internal waves in a rotating, inviscid and incompressible fluid. Limited functional analytic results exist for the occurrence of one family of solitary-wave solutions of this equation, as well as their approach to the well-known solitons of the famous Korteweg-de Vries equation in the limit as the rotation becomes vanishingly small. Since solitary-wave solutions often play a central role in the long-time evolution of an initial disturbance, we consider such solutions here (via the normal form approach) within the framework of reversible systems theory. Besides confirming the existence of the known family of solitary waves and its reduction to the KdV limit, we find a second family of multihumped (or N-pulse) solutions, as well as a continuum of delocalized solitary waves (or homoclinics to small-amplitude periodic orbits). On isolated curves in the relevant parameter region, the delocalized waves reduce to genuine embedded solitons. The second and third families of solutions occur in regions of parameter space distinct from the known solitary-wave solutions and are thus entirely new. Directions for future work are also mentioned
Is non-minimal inflation eternal?
International Nuclear Information System (INIS)
Feng, Chao-Jun; Li, Xin-Zhou
2010-01-01
The possibility that non-minimal coupling inflation could be eternal is investigated. We calculate the quantum fluctuation of the inflaton in a Hubble time and find that it has the same value as in the minimal case in the slow-roll limit. Armed with this result, we have studied some concrete non-minimal inflationary models, including chaotic inflation and natural inflation, in which the inflaton is non-minimally coupled to gravity. We find that non-minimal coupling inflation could be eternal in some parameter spaces.
Noguchi, Hiroshi; Takehara, Kimie; Ohashi, Yumiko; Suzuki, Ryo; Yamauchi, Toshimasa; Kadowaki, Takashi; Sanada, Hiromi
2016-01-01
Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as new variables, the shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external forces associated with callus formation in patients with diabetic neuropathy. Methods. The external forces at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, were measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i) were obtained. The optimal cut-off point was determined. Results. Callus formation regions of the 1st and 2nd MTH showed higher SPR-i than noncallus formation regions. The cut-off value for the 1st MTH was 0.60 and for the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be established as indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values for the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i. PMID:28050567
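As a small numerical illustration of the SPR defined above (the helper names and input values are hypothetical; the cut-offs are the SPR-i thresholds reported for the 1st and 2nd MTH):

```python
# Illustration of the shear stress-normal stress (pressure) ratio (SPR).
# The function names and sample values are invented for this sketch; only
# the cut-off values (0.60 and 0.50) come from the abstract above.
SPR_I_CUTOFFS = {"1st MTH": 0.60, "2nd MTH": 0.50}

def spr(shear, pressure):
    """SPR = shear stress divided by normal stress (pressure)."""
    return shear / pressure

def exceeds_cutoff(region, shear_integral, pressure_integral):
    """True if the time-integral SPR (SPR-i) reaches the region's cut-off."""
    return spr(shear_integral, pressure_integral) >= SPR_I_CUTOFFS[region]

assert exceeds_cutoff("1st MTH", 6.5, 10.0)       # SPR-i = 0.65 >= 0.60
assert not exceeds_cutoff("2nd MTH", 4.0, 10.0)   # SPR-i = 0.40 < 0.50
```

The point of the ratio is that the same shear load is more damaging when the accompanying pressure is low, which a pressure-only measurement would miss.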
Directory of Open Access Journals (Sweden)
Ayumi Amemiya
2016-01-01
Full Text Available Aim. Callus is a risk factor leading to severe diabetic foot ulcer; thus, prevention of callus formation is important. However, the normal stress (pressure) and shear stress associated with callus have not been clarified. Additionally, as new variables, the shear stress-normal stress (pressure) ratio (SPR) was examined. The purpose was to clarify the external forces associated with callus formation in patients with diabetic neuropathy. Methods. The external forces at the 1st, 2nd, and 5th metatarsal heads (MTH), as callus predilection regions, were measured. The SPR was calculated by dividing shear stress by normal stress (pressure); concretely, peak values (SPR-p) and time integral values (SPR-i) were obtained. The optimal cut-off point was determined. Results. Callus formation regions of the 1st and 2nd MTH showed higher SPR-i than noncallus formation regions. The cut-off value for the 1st MTH was 0.60 and for the 2nd MTH was 0.50. For the 5th MTH, variables pertaining to the external forces could not be established as indicators of callus formation because of low accuracy. Conclusions. The callus formation cut-off values for the 1st and 2nd MTH were clarified. In the future, it will be necessary to confirm the effect of using appropriate footwear and gait training on lowering SPR-i.
Minimal families of curves on surfaces
Lubbes, Niels
2014-11-01
A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.
Complications of minimally invasive cosmetic procedures: Prevention and management
Directory of Open Access Journals (Sweden)
Lauren L Levy
2012-01-01
Full Text Available Over the past decade, facial rejuvenation procedures to circumvent traditional surgery have become increasingly popular. Office-based, minimally invasive procedures can promote a youthful appearance with minimal downtime and low risk of complications. Injectable botulinum toxin (BoNT, soft-tissue fillers, and chemical peels are among the most popular non-invasive rejuvenation procedures, and each has unique applications for improving facial aesthetics. Despite the simplicity and reliability of office-based procedures, complications can occur even with an astute and experienced injector. The goal of any procedure is to perform it properly and safely; thus, early recognition of complications when they do occur is paramount in dictating prevention of long-term sequelae. The most common complications from BoNT and soft-tissue filler injection are bruising, erythema and pain. With chemical peels, it is not uncommon to have erythema, irritation and burning. Fortunately, these side effects are normally transient and have simple remedies. More serious complications include muscle paralysis from BoNT, granuloma formation from soft-tissue filler placement and scarring from chemical peels. Thankfully, these complications are rare and can be avoided with excellent procedure technique, knowledge of facial anatomy, proper patient selection, and appropriate pre- and post-skin care. This article reviews complications of office-based, minimally invasive procedures, with emphasis on prevention and management. Practitioners providing these treatments should be well versed in this subject matter in order to deliver the highest quality care.
Gyrokinetic simulations of neoclassical transport using a minimal collision operator
International Nuclear Information System (INIS)
Dif-Pradalier, G.; Grandgirard, V.; Sarazin, Y.; Garbet, X.; Ghendrih, Ph.; Angelino, P.
2008-01-01
Conventional neoclassical predictions are successfully recovered within a gyrokinetic framework using a minimal Fokker-Planck collision operator. This operator is shown to accurately describe some essential features of neoclassical theory, namely the neoclassical transport, the poloidal rotation and the linear damping of axisymmetric flows while interestingly preserving a high numerical efficiency. Its form makes it especially adapted to Eulerian or Semi-Lagrangian schemes.
Quantum N-body problem with a minimal length
International Nuclear Information System (INIS)
Buisseret, Fabien
2010-01-01
The quantum N-body problem is studied in the context of nonrelativistic quantum mechanics with a one-dimensional deformed Heisenberg algebra of the form [x,p] = i(1+βp²), leading to the existence of a minimal observable length √β. For a generic pairwise interaction potential, analytical formulas are obtained that allow estimation of the ground-state energy of the N-body system by finding the ground-state energy of a corresponding two-body problem. It is first shown that in the harmonic oscillator case, the β-dependent term grows faster with increasing N than the β-independent term. Then, it is argued that such a behavior should also be observed with generic potentials and for D-dimensional systems. Consequently, quantum N-body bound states might be interesting places to look at nontrivial manifestations of a minimal length, since the more particles that are present, the more the system deviates from standard quantum-mechanical predictions.
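For context, the minimal observable length follows from the generalized uncertainty relation implied by the deformed commutator; the short derivation below (a standard argument in minimal-length models, with ħ = 1, not reproduced from the paper) makes this explicit:

```latex
\Delta x\,\Delta p \;\ge\; \tfrac{1}{2}\bigl|\langle[x,p]\rangle\bigr|
  \;=\; \tfrac{1}{2}\bigl(1+\beta\langle p^{2}\rangle\bigr)
  \;\ge\; \tfrac{1}{2}\bigl(1+\beta(\Delta p)^{2}\bigr)
\qquad\Longrightarrow\qquad
\Delta x \;\ge\; \frac{1}{2\,\Delta p}+\frac{\beta\,\Delta p}{2}.
```

Minimizing the right-hand side over Δp (the optimum is at Δp = 1/√β) gives Δx_min = √β, the minimal length quoted above.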
Confectionery-based dose forms.
Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J
2015-01-01
Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.
Implementation of Waste Minimization at a complex R&D site
International Nuclear Information System (INIS)
Lang, R.E.; Thuot, J.R.; Devgun, J.S.
1995-01-01
Under the 1994 Waste Minimization/Pollution Prevention Crosscut Plan, the Department of Energy (DOE) has set a goal of 50% reduction in waste at its facilities by the end of 1999. Each DOE site is required to set site-specific goals to reduce generation of all types of waste, including hazardous, radioactive, and mixed. To meet these goals, Argonne National Laboratory (ANL), Argonne, IL, has developed and implemented a comprehensive Pollution Prevention/Waste Minimization (PP/WMin) Program. The facilities and activities at the site vary from research into basic sciences and the nuclear fuel cycle to high energy physics and decontamination and decommissioning projects. As a multidisciplinary R&D facility and a multiactivity site, ANL generates waste streams that are varied in physical form as well as in chemical constituents. This in turn presents a significant challenge to putting a cohesive site-wide PP/WMin Program into action. In this paper, we will describe ANL's key activities and waste streams, the regulatory drivers for waste minimization, and the DOE goals in this area, and we will discuss ANL's strategy for waste minimization and its implementation across the site.
Super-acceleration from massless, minimally coupled φ⁴
Onemli, V K
2002-01-01
We derive a simple form for the propagator of a massless, minimally coupled scalar in a locally de Sitter geometry of arbitrary spacetime dimension. We then employ it to compute the fully renormalized stress tensor at one- and two-loop orders for a massless, minimally coupled φ⁴ theory which is released in Bunch-Davies vacuum at t=0 in co-moving coordinates. In this system, the uncertainty principle elevates the scalar above the minimum of its potential, resulting in a phase of super-acceleration. With the non-derivative self-interaction the scalar's breaking of de Sitter invariance becomes observable. It is also worth noting that the weak-energy condition is violated on cosmological scales. An interesting subsidiary result is that cancelling overlapping divergences in the stress tensor requires a conformal counterterm which has no effect on purely scalar diagrams.
Minimal and non-minimal standard models: Universality of radiative corrections
International Nuclear Information System (INIS)
Passarino, G.
1991-01-01
The possibility of describing electroweak processes by means of models with a non-minimal Higgs sector is analyzed. The renormalization procedure which leads to a set of fitting equations for the bare parameters of the Lagrangian is first reviewed for the minimal standard model. A solution of the fitting equations is obtained, which correctly includes large higher-order corrections. Predictions for physical observables, notably the W boson mass and the Z⁰ partial widths, are discussed in detail. Finally the extension to non-minimal models is described under the assumption that new physics will appear only inside the vector boson self-energies and the concept of universality of radiative corrections is introduced, showing that to a large extent they are insensitive to the details of the enlarged Higgs sector. Consequences for the bounds on the top quark mass are also discussed. (orig.)
Proximity effect in normal-superconductor hybrids for quasiparticle traps
Energy Technology Data Exchange (ETDEWEB)
Hosseinkhani, Amin [Peter Grünberg Institute (PGI-2), Forschungszentrum Jülich, D-52425 Jülich (Germany); JARA-Institute for Quantum Information, RWTH Aachen University, D-52056 Aachen (Germany)
2016-07-01
Coherent transport of charges in the form of Cooper pairs is the main feature of Josephson junctions which plays a central role in superconducting qubits. However, the presence of quasiparticles in superconducting devices may lead to incoherent charge transfer and limit the coherence time of superconducting qubits. A way around this so-called "quasiparticle poisoning" might be using a normal-metal island to trap quasiparticles; this has motivated us to revisit the proximity effect in normal-superconductor hybrids. Using the semiclassical Usadel equations, we study the density of states (DoS) both within and away from the trap. We find that in the superconducting layer the DoS quickly approaches the BCS form; this indicates that normal-metal traps should be effective at localizing quasiparticles.
Patterns of DNA methylation in the normal colon vary by anatomical location, gender, and age
Kaz, Andrew M; Wong, Chao-Jen; Dzieciatkowski, Slavomir; Luo, Yanxin; Schoen, Robert E; Grady, William M
2014-01-01
Alterations in DNA methylation have been proposed to create a field cancerization state in the colon, where molecular alterations that predispose cells to transformation occur in histologically normal tissue. However, our understanding of the role of DNA methylation in field cancerization is limited by an incomplete characterization of the methylation state of the normal colon. In order to determine the colon’s normal methylation state, we extracted DNA from normal colon biopsies from the rectum, sigmoid, transverse, and ascending colon and assessed the methylation status of the DNA by pyrosequencing candidate loci as well as with HumanMethylation450 arrays. We found that methylation levels of repetitive elements LINE-1 and SAT-α showed minimal variability throughout the colon in contrast to other loci. Promoter methylation of EVL was highest in the rectum and progressively lower in the proximal segments, whereas ESR1 methylation was higher in older individuals. Genome-wide methylation analysis of normal DNA revealed 8388, 82, and 93 differentially methylated loci that distinguished right from left colon, males from females, and older vs. younger individuals, respectively. Although variability in methylation between biopsies and among different colon segments was minimal for repetitive elements, analyses of specific cancer-related genes as well as a genome-wide methylation analysis demonstrated differential methylation based on colon location, individual age, and gender. These studies advance our knowledge regarding the variation of DNA methylation in the normal colon, a prerequisite for future studies aimed at understanding methylation differences indicative of a colon field effect. PMID:24413027
Flocking with minimal cooperativity: the panic model.
Pilkiewicz, Kevin R; Eaves, Joel D
2014-01-01
We present a two-dimensional lattice model of self-propelled spins that can change direction only upon collision with another spin. We show that even with ballistic motion and minimal cooperativity, these spins display robust flocking behavior at nearly all densities, forming long bands of stripes. The structural transition in this system is not a thermodynamic phase transition, but it can still be characterized by an order parameter, and we demonstrate that if this parameter is studied as a dynamical variable rather than a steady-state observable, we can extract a detailed picture of how the flocking mechanism varies with density.
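The update rule described above, ballistic motion with reorientation only on collision, can be sketched as follows; the collision rule, update order, and neighborhood here are assumptions for illustration, not the authors' exact model:

```python
import random

# Toy sketch of the lattice "panic model" idea: spins move ballistically on a
# periodic L x L lattice and pick a new random direction only when the site
# they try to enter is occupied. Simplification: occupancy is checked against
# the positions at the start of the step, all spins updating simultaneously.
DIRS = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def step(positions, directions, L):
    occupied = set(positions)
    new_pos, new_dir = [], []
    for (x, y), (dx, dy) in zip(positions, directions):
        target = ((x + dx) % L, (y + dy) % L)
        if target in occupied:
            # collision: stay put and reorient at random
            new_pos.append((x, y))
            new_dir.append(random.choice(DIRS))
        else:
            new_pos.append(target)
            new_dir.append((dx, dy))
    return new_pos, new_dir

# A lone spin never reorients; a blocked spin stays put and picks a new heading.
p, d = step([(0, 0)], [(1, 0)], 5)
assert p == [(1, 0)] and d == [(1, 0)]
p, d = step([(0, 0), (1, 0)], [(1, 0), (1, 0)], 5)
assert p[0] == (0, 0) and p[1] == (2, 0)
```

Density is the only control knob in such a sketch: a free spin keeps its heading indefinitely, so the rate of reorientation, and hence any ordering, is set entirely by how often collisions occur.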
Directory of Open Access Journals (Sweden)
Khalili H
2003-04-01
Full Text Available Background: To evaluate the effectiveness of present educational programs in terms of students achieving problem solving, decision making and critical thinking skills, reliable, valid and standardized instruments are needed. Purposes: To investigate the reliability, validity and norms of the CCTST Form B. The California Critical Thinking Skills Test contains 34 multiple-choice questions, each with one correct answer, in the five Critical Thinking (CT) cognitive skills domains. Methods: The translated CCTST Form B was given to 405 BSN nursing students of nursing faculties located in Tehran (Tehran and Shahid Beheshti Universities), who were selected through random sampling. In order to determine the face and content validity, the test was translated and edited by Persian and English language professors and researchers; it was also confirmed by the judgments of a panel of medical education experts and psychology professors. CCTST reliability was determined from internal consistency using KR-20. The construct validity of the test was investigated with factor analysis, internal consistency and group differences. Results: The test reliability coefficient was 0.62. Factor analysis indicated that the CCTST is formed from 5 factors (elements), namely: Analysis, Evaluation, Inference, and Inductive and Deductive Reasoning. The internal consistency method showed that all subscales have high, positive correlations with the total test score. The group difference method between nursing and philosophy students (n=50) indicated a meaningful difference between nursing and philosophy students' scores (t=-4.95, p=0.0001). The score percentile norms also show that the 50th percentile corresponds to a raw score of 11, and the 95th and 5th percentiles correspond to raw scores of 17 and 6, respectively. Conclusions: The results revealed that the test is sufficiently reliable as a research tool, and all subscales measure a single construct (Critical Thinking) and are able to distinguish the
Bakker, Nelleke
2014-01-01
This paper discusses the reception in the Netherlands of Minimal Brain Damage/Dysfunction (MBD) and related labels for normally gifted children with learning disabilities and behavioural problems by child scientists of all sorts from the 1950s up to the late 1980s, when MBD was replaced with
Minimal residual cone-beam reconstruction with attenuation correction in SPECT
International Nuclear Information System (INIS)
La, Valerie; Grangeat, Pierre
1998-01-01
This paper presents an iterative method based on the minimal residual algorithm for tomographic attenuation compensated reconstruction from attenuated cone-beam projections given the attenuation distribution. Unlike conjugate-gradient based reconstruction techniques, the proposed minimal residual based algorithm solves directly a quasisymmetric linear system, which is a preconditioned system. Thus it avoids the use of normal equations, which improves the convergence rate. Two main contributions are introduced. First, a regularization method is derived for quasisymmetric problems, based on a Tikhonov-Phillips regularization applied to the factorization of the symmetric part of the system matrix. This regularization is made spatially adaptive to avoid smoothing the region of interest. Second, our existing reconstruction algorithm for attenuation correction in parallel-beam geometry is extended to cone-beam geometry. A circular orbit is considered. Two preconditioning operators are proposed: the first one is Grangeat's inversion formula and the second one is Feldkamp's inversion formula. Experimental results obtained on simulated data are presented and the shadow zone effect on attenuated data is illustrated. (author)
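The Tikhonov-Phillips regularization mentioned above trades data misfit against solution norm. A generic dense-matrix sketch (for small problems the regularized normal equations are adequate; the paper works at the operator level precisely to avoid forming normal equations, and adds spatial adaptivity):

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam * ||x||^2 via the regularized
    normal equations (A^T A + lam I) x = A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
```

Larger `lam` smooths the reconstruction more; the spatially adaptive variant in the paper lowers the penalty inside the region of interest.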
Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde
Energy Technology Data Exchange (ETDEWEB)
Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)
2003-07-01
This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast-enhanced (KM) studies. Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and what to look for in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which makes them an important didactic element. (orig.)
Advancing Normal Birth: Organizations, Goals, and Research
Hotelling, Barbara A.; Humenick, Sharron S.
2005-01-01
In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...
Minimal Marking: A Success Story
McNeilly, Anne
2014-01-01
The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…
Minimal families of curves on surfaces
Lubbes, Niels
2014-01-01
A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal
Minimal residual method stronger than polynomial preconditioning
Energy Technology Data Exchange (ETDEWEB)
Faber, V.; Joubert, W.; Knill, E. [Los Alamos National Lab., NM (United States)], and others
1994-12-31
Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.
Waste minimization assessment procedure
International Nuclear Information System (INIS)
Kellythorne, L.L.
1993-01-01
Perry Nuclear Power Plant began developing a waste minimization plan early in 1991. In March of 1991 the plan was documented, following a format similar to that described in the EPA Waste Minimization Opportunity Assessment Manual. Initial implementation involved obtaining management's commitment to support a waste minimization effort. The primary assessment goal was to identify all hazardous waste streams and to evaluate those streams for minimization opportunities. As implementation of the plan proceeded, non-hazardous waste streams routinely generated in large volumes were also evaluated for minimization opportunities. The next step included collection of process and facility data which would be useful in helping the facility accomplish its assessment goals. This paper describes the resources that were used, and which were most valuable, in identifying both the hazardous and non-hazardous waste streams that existed on site. For each material identified as a waste stream, additional information regarding the material's use, manufacturer, EPA hazardous waste number and DOT hazard class was also gathered. Waste streams were then evaluated for potential source reduction, recycling, re-use, re-sale, or burning for heat recovery, with disposal as the last viable alternative.
ILUCG algorithm which minimizes in the Euclidean norm
International Nuclear Information System (INIS)
Petravic, M.; Kuo-Petravic, G.
1978-07-01
An algorithm is presented which solves sparse systems of linear equations of the form Ax = Y, where A is non-symmetric, by the Incomplete LU Decomposition-Conjugate Gradient (ILUCG) method. The algorithm minimizes the error in the Euclidean norm ||x_i - x||_2, where x_i is the solution vector after the i-th iteration and x is the exact solution vector. The results of a test on one real problem indicate that the algorithm is likely to be competitive with the best existing algorithms of its type.
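The ILU-preconditioning idea in the abstract above survives in modern sparse solvers. A sketch with SciPy, using an incomplete LU factorization as the preconditioner for a Krylov iteration (GMRES here, since SciPy's CG assumes symmetry; this is a modern analogue, not the paper's algorithm, and the test matrix is illustrative):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, gmres, LinearOperator

# Small nonsymmetric, diagonally dominant sparse system A x = y
n = 50
A = sp.diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n), format='csc')
y = np.ones(n)

ilu = spilu(A)                            # incomplete LU factorization of A
M = LinearOperator((n, n), matvec=ilu.solve)  # preconditioner M ~ A^{-1}

x, info = gmres(A, y, M=M)                # info == 0 signals convergence
```

The preconditioner clusters the spectrum of MA near 1, which is what accelerates the Krylov iteration.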
Normalization of multidirectional red and NIR reflectances with the SAVI
Huete, A. R.; Hua, G.; Qi, J.; Chehbouni, A.; Van Leeuwen, W. J. D.
1992-01-01
Directional reflectance measurements were made over a semi-desert gramma grassland at various times of the growing season. View angle measurements from +40 to -40 degrees were made at various solar zenith angles and soil moisture conditions. The sensitivity of the Normalized Difference Vegetation Index (NDVI) and the Soil Adjusted Vegetation Index (SAVI) to bidirectional measurements was assessed for purposes of improving remote temporal monitoring of vegetation dynamics. The SAVI view angle response was found to be symmetric about nadir while the NDVI response was strongly anisotropic. This enabled the view angle behavior of the SAVI to be normalized with a cosine function. In contrast to the NDVI, the SAVI was able to minimize soil moisture and shadow influences for all measurement conditions.
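The two indices compared in the abstract above have simple closed forms (L = 0.5 is the customary soil-adjustment factor for the SAVI):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index; L is the soil-brightness correction factor."""
    return (1.0 + L) * (nir - red) / (nir + red + L)
```

The additive L term in the denominator is what damps the soil-background (and, per the study, view-angle) sensitivity relative to the NDVI.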
Directory of Open Access Journals (Sweden)
Mujeeb ur Rehman Fazili
2012-09-01
Full Text Available The present study was planned to evaluate a minimally invasive tube cystotomy technique in calves suffering from obstructive urolithiasis with intact urinary bladder and urethra. Fifteen male non-castrated calves aged 1-10 months (mean 4.05 months), presented for treatment within one to three days (mean 2.2 days) of complete urinary tract obstruction due to urethral calculi with intact bladder and urethra, were included in this study. Under light sedation and local infiltration anaesthesia, all the animals were subjected, through the left paralumbar fossa, to a minimally invasive surgical tube cystotomy in which a catheter was placed in the bladder lumen through a metallic cannula and fixed to the skin with a stay suture (Fazili's technique). All the animals were discharged the same day. Time taken for the procedure varied from 8 to 17 minutes (mean 11.0 minutes). Normal urination resumed in twelve (80.0%) calves, after a mean of 10.50 days. In two of the remaining calves, urine flow through the catheter stopped prematurely and they were then subjected to standard surgical tube cystotomy. One more calf did not urinate normally for 30 postoperative days and was lost to follow-up thereafter. Recurrence of the obstruction was not detected in the ten and nine animals observed up to six and 12 months respectively. In conclusion, the outcome of this minimally invasive technique is similar to that of standard tube cystotomy. Additionally, the procedure is cost-effective, quick, simple and field-applicable. It also minimizes exposure of the abdominal cavity of metabolically compromised animals. However, the technique needs to be tried in a larger number of such calves, with a better-quality catheter of larger diameter, before its extensive use can be recommended.
Tallman, John F.; Johnson, William G.; Brady, Roscoe O.
1972-01-01
The catabolism of Tay-Sachs ganglioside, N-acetylgalactosaminyl- (N-acetylneuraminosyl) -galactosylglucosylceramide, has been studied in lysosomal preparations from normal human brain and brain obtained at biopsy from Tay-Sachs patients. Utilizing Tay-Sachs ganglioside labeled with 14C in the N-acetylgalactosaminyl portion or 3H in the N-acetylneuraminosyl portion, the catabolism of Tay-Sachs ganglioside may be initiated by either the removal of the molecule of N-acetylgalactosamine or N-acetylneuraminic acid. The activity of the N-acetylgalactosamine-cleaving enzyme (hexosaminidase) is drastically diminished in such preparations from Tay-Sachs brain whereas the activity of the N-acetylneuraminic acid-cleaving enzyme (neuraminidase) is at a normal level. Total hexosaminidase activity as measured with an artificial fluorogenic substrate is increased in tissues obtained from patients with the B variant form of Tay-Sachs disease and it is virtually absent in the O-variant patients. The addition of purified neuraminidase and various purified hexosaminidases exerted only a minimal synergistic effect on the hydrolysis of Tay-Sachs ganglioside in the lysosomal preparations from the control or patient with the O variant of Tay-Sachs disease. PMID:4639018
Minimal but non-minimal inflation and electroweak symmetry breaking
Energy Technology Data Exchange (ETDEWEB)
Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)
2016-10-07
We consider the most minimal scale-invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet that plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10^{-3}, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.
Ruled Laguerre minimal surfaces
Skopenkov, Mikhail
2011-10-30
A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K - 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to the graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.
Radiation-induced normal tissue damage: implications for radiotherapy
International Nuclear Information System (INIS)
Prasanna, Pataje G.
2014-01-01
Radiotherapy is an important treatment modality for many malignancies, either alone or as part of combined modality treatment. However, despite technological advances in physical treatment delivery, patients suffer adverse effects from radiation therapy due to normal tissue damage. These side effects may be acute, occurring during or within weeks after therapy, or intermediate to late, occurring months to years after therapy. Minimizing normal tissue damage from radiotherapy will allow enhancement of tumor killing and improve tumor control and patients' quality of life. Understanding the mechanisms through which radiation toxicity develops in normal tissue will facilitate the development of next-generation radiation effect modulators. Translation of these agents to the clinic will also require an understanding of the impact of these protectors and mitigators on tumor radiation response. In addition, normal tissues vary in radiobiologically important ways, including organ sensitivity to radiation, cellular turnover rate, and differences in mechanisms of injury manifestation and damage response. Therefore, successful development of radiation modulators may require multiple approaches to address organ/site-specific needs. These may include treatments that modify cellular damage and death processes, inflammation, alteration of normal flora, wound healing, tissue regeneration and others, specifically to counter cancer site-specific adverse effects. Further, an understanding of the mechanisms of normal tissue damage will allow development of predictive biomarkers; however, harmonization of such assays is critical. This is a necessary step towards patient-specific treatment customization. Examples of important adverse effects of radiotherapy, either alone or in conjunction with chemotherapy, and important limitations in the current approaches of using radioprotectors for improving therapeutic outcome will be highlighted. (author)
Global Analysis of Minimal Surfaces
Dierkes, Ulrich; Tromba, Anthony J
2010-01-01
Many properties of minimal surfaces are of a global nature, and this is already true for the results treated in the first two volumes of the treatise. Part I of the present book can be viewed as an extension of these results. For instance, the first two chapters deal with existence, regularity and uniqueness theorems for minimal surfaces with partially free boundaries. Here one of the main features is the possibility of 'edge-crawling' along free parts of the boundary. The third chapter deals with a priori estimates for minimal surfaces in higher dimensions and for minimizers of singular integ
Minimal Surfaces for Hitchin Representations
DEFF Research Database (Denmark)
Li, Qiongling; Dai, Song
2018-01-01
In this paper, we investigate the properties of immersed minimal surfaces inside the symmetric space associated to a subloci of the Hitchin component: the $q_n$ and $q_{n-1}$ cases. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...
Free energy minimization to predict RNA secondary structures and computational RNA design.
Churkin, Alexander; Weinbrand, Lina; Barash, Danny
2015-01-01
Determining the RNA secondary structure from sequence data by computational predictions is a long-standing problem. Its solution has been approached in two distinctive ways. If a multiple sequence alignment of a collection of homologous sequences is available, the comparative method uses phylogeny to determine conserved base pairs that are more likely to form as a result of billions of years of evolution than by chance. In the case of single sequences, recursive algorithms that compute free energy structures by using empirically derived energy parameters have been developed. This latter approach of RNA folding prediction by energy minimization is widely used to predict RNA secondary structure from sequence. For a significant number of RNA molecules, the secondary structure of the RNA molecule is indicative of its function and its computational prediction by minimizing its free energy is important for its functional analysis. A general method for free energy minimization to predict RNA secondary structures is dynamic programming, although other optimization methods have been developed as well along with empirically derived energy parameters. In this chapter, we introduce and illustrate by examples the approach of free energy minimization to predict RNA secondary structures.
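Full thermodynamic folding uses empirically derived nearest-neighbour energy parameters, but the dynamic-programming skeleton described above is already visible in the classic Nussinov base-pair-maximization recursion, shown here as a simplified stand-in for free-energy minimization:

```python
def nussinov(seq, min_loop=3):
    """Maximum number of nested base pairs in an RNA sequence, with at least
    `min_loop` unpaired bases in every hairpin loop (Nussinov-style DP)."""
    pairs = {('A', 'U'), ('U', 'A'), ('G', 'C'), ('C', 'G'), ('G', 'U'), ('U', 'G')}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # subsequence length - 1
        for i in range(n - span):
            j = i + span
            best = dp[i][j - 1]                  # j left unpaired
            for k in range(i, j - min_loop):     # j paired with some k
                if (seq[k], seq[j]) in pairs:
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0
```

Energy minimization replaces the "+1 per pair" score with stacking and loop energies, but keeps the same O(n³) recursion structure.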
'Normal' markets, market imperfections and energy efficiency
International Nuclear Information System (INIS)
Sanstad, A.H.; Howarth, R.B.
1994-01-01
The conventional distinction between 'economic' and 'engineering' approaches to energy analysis obscures key methodological issues concerning the measurement of the costs and benefits of policies to promote the adoption of energy-efficient technologies. The engineering approach is in fact based upon firm economic foundations: the principle of lifecycle cost minimization that arises directly from the theory of rational investment. Thus, evidence that so-called 'market barriers' impede the adoption of cost-effective energy-efficient technologies implies the existence of market failures as defined in the context of microeconomic theory. The widely held view that the engineering approach lacks economic justification is based on the fallacy that markets are 'normally' efficient. (author)
Directory of Open Access Journals (Sweden)
Fredy Ángel Miguel Amaya Robayo
2010-08-01
Full Text Available It is well known that any context-free grammar can be transformed into Chomsky normal form in such a way that the languages generated by the two grammars are equivalent. A grammar in Chomsky normal form (CNF) has some advantages: its derivation trees are binary, its rules have a simpler form, and so on. It is therefore always desirable to work with a grammar in CNF in applications that require one. An algorithm exists that transforms any context-free grammar into CNF; however, the number of rules generated by the transformation depends on the number of rules in the initial grammar as well as on other characteristics. In this work we analyze, from an experimental and statistical point of view, the relationship between the number of initial rules and the number of rules that result from transforming a context-free grammar into CNF. This makes it possible to plan the amount of computational resources needed when dealing with grammars of some complexity.
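The rule-count growth studied in the abstract above is easy to see for the binarization step of the CNF transformation alone. A sketch (binarization only; ε-rule, unit-rule and terminal handling also contribute rules in the full transformation):

```python
def binarized_rule_count(rules):
    """rules: list of (lhs, rhs) with rhs a tuple of symbols.
    Binarization rewrites A -> B C D as A -> B X, X -> C D, so a right-hand
    side of length k > 2 yields k - 1 binary rules; shorter rules are kept."""
    return sum(len(rhs) - 1 if len(rhs) > 2 else 1 for _, rhs in rules)
```

So for this step the output size is linear in the total length of the right-hand sides, one ingredient of the empirical relationship the paper measures.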
Minimal Webs in Riemannian Manifolds
DEFF Research Database (Denmark)
Markvorsen, Steen
2008-01-01
For a given combinatorial graph $G$ a {\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1-$dimensional manifold with an associated metric $g$. In this paper we are concerned with {\it minimal isometric immersions} of geometrized graphs $(G, g)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\em{minimal webs}}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...
Waste minimization handbook, Volume 1
Energy Technology Data Exchange (ETDEWEB)
Boing, L.E.; Coffey, M.J.
1995-12-01
This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.
Waste minimization handbook, Volume 1
International Nuclear Information System (INIS)
Boing, L.E.; Coffey, M.J.
1995-12-01
This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996
The morphological classification of normal and abnormal red blood cell using Self Organizing Map
Rahmat, R. F.; Wulandari, F. S.; Faza, S.; Muchtar, M. A.; Siregar, I.
2018-02-01
Blood is an essential component of living creatures, contained in the vascular space. Diseases can be identified through a blood test, for example from the form of the red blood cells. The normal or abnormal morphology of a patient's red blood cells is very helpful to doctors in detecting disease. Advances in digital image processing technology make it possible to identify a patient's normal and abnormal blood cells automatically. This research used the self-organizing map method to classify the normal and abnormal forms of red blood cells in digital images. The self-organizing map neural network classified the normal and abnormal red blood cell forms in the input images with a testing accuracy of 93.78%.
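A self-organizing map of the kind used in the abstract above can be sketched in a few lines of NumPy (a toy 2-D map on synthetic data, not the paper's cell-image pipeline; the grid size and decay schedule are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(4, 4), iters=2000, lr0=0.5, sigma0=1.5):
    """Minimal 2-D self-organizing map; returns weights of shape (gx, gy, dim)."""
    gx, gy = grid
    dim = data.shape[1]
    w = rng.random((gx, gy, dim))
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing='ij'), axis=-1)
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # best-matching unit: grid node whose weight is closest to the sample
        d = np.linalg.norm(w - x, axis=-1)
        bmu = np.unravel_index(np.argmin(d), d.shape)
        lr = lr0 * np.exp(-t / iters)            # decaying learning rate
        sigma = sigma0 * np.exp(-t / iters)      # shrinking neighbourhood
        dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
        w += lr * h * (x - w)                    # pull neighbourhood toward x
    return w

def bmu_of(w, x):
    """Grid coordinates of the best-matching unit for sample x."""
    d = np.linalg.norm(w - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)
```

After training, classification amounts to labeling each grid node by the majority class of the samples it wins.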
Evaluation of the fit of preformed nickel titanium arch wires on normal occlusion dental arches
Directory of Open Access Journals (Sweden)
Rakhn G. Al-Barakati
2016-01-01
Conclusions: Using an archwire form with the best fit to the dental arch should produce minimal changes in the dental arch form when NiTi wires are used and require less customization when stainless-steel wires are used.
Combined Waste Form Cost Trade Study
International Nuclear Information System (INIS)
Gombert, Dirk; Piet, Steve; Trickel, Timothy; Carter, Joe; Vienna, John; Ebert, Bill; Matthern, Gretchen
2008-01-01
A new generation of aqueous nuclear fuel reprocessing, now in development under the auspices of the DOE Office of Nuclear Energy (NE), separates fuel into several fractions, thereby partitioning the wastes into groups of common chemistry. This technology advance enables development of waste management strategies that were not conceivable with simple PUREX reprocessing. Conventional wisdom suggests minimizing high level waste (HLW) volume is desirable, but logical extrapolation of this concept suggests that at some point the cost of reducing volume further will reach a point of diminishing return and may cease to be cost-effective. This report summarizes an evaluation considering three groupings of wastes in terms of cost-benefit for the reprocessing system. Internationally, the typical waste form for HLW from the PUREX process is borosilicate glass containing waste elements as oxides. Unfortunately several fission products (primarily Mo and the noble metals Ru, Rh, Pd) have limited solubility in glass, yielding relatively low waste loading, producing more glass, and greater disposal costs. Advanced separations allow matching the waste form to waste stream chemistry, allowing the disposal system to achieve more optimum waste loading with improved performance. Metals can be segregated from oxides and each can be stabilized in forms to minimize the HLW volume for repository disposal. Thus, a more efficient waste management system making the most effective use of advanced waste forms and disposal design for each waste is enabled by advanced separations and how the waste streams are combined. This trade-study was designed to juxtapose a combined waste form baseline waste treatment scheme with two options and to evaluate the cost-benefit using available data from the conceptual design studies supported by DOE-NE
Scalar perturbations in p-nflation: the 3-form case
Energy Technology Data Exchange (ETDEWEB)
Germani, Cristiano [LUTH, Observatoire de Paris, CNRS UMR 8102, Université Paris Diderot, 5 Place Jules Janssen, 92195 Meudon Cedex (France); Kehagias, Alex, E-mail: cristiano.germani@obspm.fr, E-mail: kehagias@central.ntua.gr [Department of Physics, National Technical University of Athens, Hroon Polytechniou 9, 15780 Zogrtafou, Athens (Greece)
2009-11-01
We calculate the primordial spectrum of scalar perturbations of 3-form inflation and find that the curvature perturbations decay at late times. As a result, although a non-minimally coupled massive 3-form field may drive inflation at early times, it should be assisted by other fields in order to reproduce the observed temperature fluctuations of the CMB sky.
Angle Concept: A High School and Tertiary Longitudinal Perspective to Minimize Obstacles
Barabash, Marita
2017-01-01
The concept of angle emerges in numerous forms as the learning of mathematics and its applications advances through the high school and tertiary curriculum. Many difficulties and misconceptions in the usage of this multifaceted concept might be avoided or at least minimized should the lecturers in different areas of pure and applied mathematics be…
Schema Design and Normalization Algorithm for XML Databases Model
Directory of Open Access Journals (Sweden)
Samir Abou El-Seoud
2009-06-01
Full Text Available In this paper we study the problem of schema design and normalization in the XML database model. We show that, like relational databases, XML documents may contain redundant information, and this redundancy may cause update anomalies. Furthermore, such problems are caused by certain functional dependencies among paths in the document. Based on our previous work, in which we presented functional dependencies and normal forms for XML Schema, we present a decomposition algorithm for converting any XML Schema into a normalized one that satisfies X-BCNF.
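The X-BCNF decomposition described above parallels the classical relational BCNF algorithm: repeatedly split a schema on a functional dependency whose left-hand side is not a superkey. A relational sketch (not the paper's XML algorithm; FD-projection subtleties are glossed over):

```python
def closure(attrs, fds):
    """Attribute closure of `attrs` under functional dependencies (lhs, rhs)."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def bcnf_decompose(schema, fds):
    """schema: frozenset of attributes; fds: list of (frozenset, frozenset)."""
    out, stack = [], [frozenset(schema)]
    while stack:
        r = stack.pop()
        for lhs, rhs in fds:
            l, rt = lhs & r, (rhs & r) - lhs
            # lhs -> rhs violates BCNF in r when lhs is not a superkey of r
            if l and rt and not closure(l, fds) >= r:
                stack.append(frozenset(l | rt))   # (lhs, dependent attrs)
                stack.append(frozenset(r - rt))   # the rest, keeping lhs
                break
        else:
            out.append(r)                          # r already in BCNF
    return out
```

For R(A, B, C) with A → B this yields the familiar split into (A, B) and (A, C).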
Energy Technology Data Exchange (ETDEWEB)
Kohlmann, Johannes; Kieler, Oliver [Physikalisch-Technische Bundesanstalt (PTB), Braunschweig (Germany). Arbeitsgruppe 2.43 ' ' Josephson-Schaltungen' '
2016-09-15
In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson series circuits for voltage standards. First we summarize some foundations of Josephson voltage standards and sketch the concept and layout of the circuits, before we describe the fabrication technology for modern practical Josephson voltage standards.
A SDP based design of relay precoding for the power minimization of MIMO AF-relay networks
Rao, Anlei
2015-09-11
Relay precoding for multiple-input multiple-output (MIMO) relay networks has been approached either by optimizing efficiency performance under given power-consumption constraints or by minimizing power consumption under quality-of-service (QoS) requirements. For the latter type of design, previous work minimized an approximation of the power consumption. In this paper, the exact power consumption of all relays is derived in a quadratic form by diagonalizing the minimum-square-error (MSE) matrix, and the relay precoding matrix is designed by optimizing this quadratic form with the help of semidefinite programming (SDP) relaxation. Our simulation results show that such a design can achieve a gain of around 3 dB over the previous design, which optimized the approximated power consumption. © 2015 IEEE.
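As a toy illustration of optimizing a quadratic form: minimizing x^T Q x over unit-norm x is solved exactly by the smallest eigenpair of Q (the Rayleigh-quotient bound). The actual design above adds QoS constraints and needs the SDP relaxation; this sketch, with an illustrative matrix, only shows the unconstrained-direction core:

```python
import numpy as np

Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # symmetric positive-definite quadratic form

evals, evecs = np.linalg.eigh(Q)    # eigenvalues in ascending order
x_opt = evecs[:, 0]                 # minimizer of x^T Q x subject to ||x|| = 1
min_val = x_opt @ Q @ x_opt         # equals the smallest eigenvalue of Q
```

With additional constraints this eigenvalue shortcut no longer applies, which is why the paper lifts the problem to a semidefinite program.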
Sandstone-filled normal faults: A case study from central California
Palladino, Giuseppe; Alsop, G. Ian; Grippa, Antonio; Zvirtes, Gustavo; Phillip, Ruy Paulo; Hurst, Andrew
2018-05-01
Despite the potential of sandstone-filled normal faults to significantly influence fluid transmissivity within reservoirs and the shallow crust, they have to date been largely overlooked. Fluidized sand, forcefully intruded along normal fault zones, markedly enhances the transmissivity of faults and, in general, the connectivity between otherwise unconnected reservoirs. Here, we provide a detailed outcrop description and interpretation of sandstone-filled normal faults from different stratigraphic units in central California. Such faults commonly show limited fault throw, cm- to dm-wide apertures, poorly developed fault zones and full or partial sand infill. Based on these features and inferences regarding their origin, we propose a general classification that defines two main types of sandstone-filled normal faults. Type 1 faults form as a consequence of the hydraulic failure of the host strata above a poorly consolidated sandstone following a significant, rapid increase of pore-fluid overpressure. Type 2 sandstone-filled normal faults form as a result of regional tectonic deformation. These structures may play a significant role in the connectivity of siliciclastic reservoirs, and may therefore be crucial not just for the investigation of basin evolution but also in hydrocarbon exploration.
Effects of variable transformations on errors in FORM results
International Nuclear Information System (INIS)
Qin Quan; Lin Daojin; Mei Gang; Chen Hao
2006-01-01
On the basis of studies of the second partial derivatives of the variable transformation functions for nine different non-normal variables, the paper comprehensively discusses the effects of the transformation on FORM results. It shows that the signs and magnitudes of the errors in FORM results depend on the distributions of the basic variables, on whether the basic variables represent resistances or actions, and on the design point locations in the standard normal space. The transformations of exponential or Gamma resistance variables can generate +24% errors in the FORM failure probability, and the transformation of Frechet action variables can generate -31% errors.
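The kind of comparison discussed above can be reproduced with a minimal first-order reliability method (FORM) sketch. All distributions and parameters below are invented for illustration: an exponential resistance R and a normal action S, with failure when g = R − S < 0. The non-normal R is mapped to standard normal space through its CDF, and the resulting FORM estimate Φ(−β) is compared against the exact failure probability.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize
from scipy.stats import expon, norm

# Invented example: resistance R ~ Exponential(mean 10), action S ~ N(2, 0.5^2)
theta, mu_s, sd_s = 10.0, 2.0, 0.5

# Exact failure probability Pf = P(R < S) = E_S[F_R(S)] by numerical integration
pf_exact, _ = quad(lambda s: expon.cdf(s, scale=theta) * norm.pdf(s, loc=mu_s, scale=sd_s),
                   mu_s - 8 * sd_s, mu_s + 8 * sd_s)

def g(u):
    # Limit state in standard normal space: R mapped through its CDF, S affinely
    r = -theta * np.log(norm.sf(u[0]))   # R = F_R^{-1}(Phi(u1))
    s = mu_s + sd_s * u[1]
    return r - s

# FORM: the design point is the closest point to the origin on g(u) = 0
res = minimize(lambda u: u @ u, x0=np.array([-1.0, 0.5]),
               constraints={"type": "eq", "fun": g}, method="SLSQP")
beta = np.sqrt(res.x @ res.x)
pf_form = norm.cdf(-beta)                # first-order estimate of Pf
```

The gap between `pf_form` and `pf_exact` is the transformation-induced FORM error the paper analyzes; its sign and size change with the chosen distributions and design point, as the abstract states.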
The frontal ventriculocerebral ratio on normal computed tomography
Energy Technology Data Exchange (ETDEWEB)
Choi, Jeong Yeon; Kim, Yeon Jung; Hah, Hae Koo [College of Medicine, Han Yang University, Seoul (Korea, Republic of)
1980-12-15
From the Hanyang University hospital's computer memory bank containing the CT scans of over 3000 patients, 400 normal patients between the ages of 1 and 69 years were selected at random. These scans had been performed as a screening test in patients with minimal, vague neurologic manifestations such as headache, dizziness, convulsion and depression, or after traffic accidents. The ventriculocerebral ratio, between the width of the brain and the dimension representing the distance between the outer borders of the lateral ventricles, was determined at two sites, using image 4 or 5 demonstrating the level of the foramen of Monro.
Relation between Protein Intrinsic Normal Mode Weights and Pre-Existing Conformer Populations.
Ozgur, Beytullah; Ozdemir, E Sila; Gursoy, Attila; Keskin, Ozlem
2017-04-20
Intrinsic fluctuations of a protein enable it to sample a large repertoire of conformers, including the open and closed forms. These distinct forms of the protein, called conformational substates, pre-exist together in equilibrium as an ensemble independent of its ligands. The role of the ligand may simply be to shift the equilibrium toward the form most appropriate for binding. Normal mode analysis has proved useful in identifying the directions of conformational changes between substates. In this study, we demonstrate that the ratios of normalized weights of a few normal modes driving the protein between its substates can give insights into the ratios of kinetic conversion rates of the substates, although a direct relation between the eigenvalues and the kinetic conversion rates or populations of each substate could not be observed. The correlation between the normalized mode weight ratios and the kinetic rate ratios is around 83% on a set of 11 non-enzyme proteins and around 59% on a set of 17 enzymes. The results suggest that mode motions carry intrinsic relations with the thermodynamics and kinetics of the proteins.
Selective attention in normal and impaired hearing.
Shinn-Cunningham, Barbara G; Best, Virginia
2008-12-01
A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.
Endoscopic and minimally-invasive ear surgery: A path to better outcomes
Directory of Open Access Journals (Sweden)
Natasha Pollak
2017-09-01
Full Text Available The development of endoscopic ear surgery techniques promises to change the way we approach ear surgery. In this review paper, we explore the current evidence, seek to determine the advantages of endoscopic ear surgery, and see if these advantages are both measurable and meaningful. The wide field of view of the endoscope allows the surgeon to better visualize the various recesses of the middle ear cleft. Endoscopes make it possible to address the target pathology via a transcanal approach, while minimizing dissection of normal tissue done purely for exposure, leading to the evolution of minimally-invasive ear surgery and reducing morbidity. When used in chronic ear surgery, endoscopy appears to have the potential to significantly reduce cholesteatoma recidivism rates. Using endoscopes as an adjunct can increase the surgeon's confidence in total cholesteatoma removal. By doing so, endoscopes reduce the need to reopen the mastoid during second-look surgery, help preserve the canal wall, or even change post-cholesteatoma follow-up protocols by channeling more patients away from a planned second look.
Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs
Edneral, Victor
2018-02-01
This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.
On The Extensive Form Of N-Person Cooperative Games | Udeh ...
African Journals Online (AJOL)
On The Extensive Form Of N-Person Cooperative Games. ... games. Keywords: Extensive form game, Normal form game, characteristic function, Coalition, Imputation, Player, Payoff, Strategy and Core
Strength of Gamma Rhythm Depends on Normalization
Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.
2013-01-01
Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
Ruled Laguerre minimal surfaces
Skopenkov, Mikhail; Pottmann, Helmut; Grohs, Philipp
2011-01-01
A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫ (H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces r(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ
Self-Esteem of Gifted, Normal, and Mild Mentally Handicapped Children.
Chiu, Lian-Hwang
1990-01-01
Administered Coopersmith Self-Esteem Inventory (SEI) Form B to elementary school students (N=450) identified as gifted, normal, and mild mentally handicapped (MiMH). Results indicated that both the gifted and normal children had significantly higher self-esteem than did the MiMH children, but there were no differences between gifted and normal…
Application of Power Geometry and Normal Form Methods to the Study of Nonlinear ODEs
Directory of Open Access Journals (Sweden)
Edneral Victor
2018-01-01
Full Text Available This paper describes power transformations of degenerate autonomous polynomial systems of ordinary differential equations which reduce such systems to a non-degenerate form. An example of constructing exact first integrals of motion of a planar degenerate system in closed form is given.
Method for construction of normalized cDNA libraries
Soares, Marcelo B.; Efstratiadis, Argiris
1998-01-01
This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.
Normal central retinal function and structure preserved in retinitis pigmentosa.
Jacobson, Samuel G; Roman, Alejandro J; Aleman, Tomas S; Sumaroka, Alexander; Herrera, Waldo; Windsor, Elizabeth A M; Atkinson, Lori A; Schwartz, Sharon B; Steinberg, Janet D; Cideciyan, Artur V
2010-02-01
To determine whether normal function and structure, as recently found in forms of Usher syndrome, also occur in a population of patients with nonsyndromic retinitis pigmentosa (RP). Patients with simplex, multiplex, or autosomal recessive RP (n = 238; ages 9-82 years) were studied with static chromatic perimetry. A subset was evaluated with optical coherence tomography (OCT). Co-localized visual sensitivity and photoreceptor nuclear layer thickness were measured across the central retina to establish the relationship of function and structure. Comparisons were made to patients with Usher syndrome (n = 83, ages 10-69 years). Cross-sectional psychophysical data identified patients with RP who had normal rod- and cone-mediated function in the central retina. There were two other patterns with greater dysfunction, and longitudinal data confirmed that progression can occur from normal rod and cone function to cone-only central islands. The retinal extent of normal laminar architecture by OCT corresponded to the extent of normal visual function in patients with RP. Central retinal preservation of normal function and structure did not show a relationship with age or retained peripheral function. Usher syndrome results were like those in nonsyndromic RP. Regional disease variation is a well-known finding in RP. Unexpected was the observation that patients with presumed recessive RP can have regions with functionally and structurally normal retina. Such patients will require special consideration in future clinical trials of either focal or systemic treatment. Whether there is a common molecular mechanism shared by forms of RP with normal regions of retina warrants further study.
Y-12 Plant waste minimization strategy
International Nuclear Information System (INIS)
Kane, M.A.
1987-01-01
The 1984 Amendments to the Resource Conservation and Recovery Act (RCRA) mandate that waste minimization be a major element of hazardous waste management. In response to this mandate and the increasing costs for waste treatment, storage, and disposal, the Oak Ridge Y-12 Plant developed a waste minimization program to encompass all types of wastes. Thus, waste minimization has become an integral part of the overall waste management program. Unlike traditional approaches, waste minimization focuses on controlling waste at the beginning of production instead of the end. This approach includes: (1) substituting nonhazardous process materials for hazardous ones, (2) recycling or reusing waste effluents, (3) segregating nonhazardous waste from hazardous and radioactive waste, and (4) modifying processes to generate less waste or less toxic waste. An effective waste minimization program must provide the appropriate incentives for generators to reduce their waste and provide the necessary support mechanisms to identify opportunities for waste minimization. This presentation focuses on the Y-12 Plant's strategy to implement a comprehensive waste minimization program. This approach consists of four major program elements: (1) promotional campaign, (2) process evaluation for waste minimization opportunities, (3) waste generation tracking system, and (4) information exchange network. The presentation also examines some of the accomplishments of the program and issues which need to be resolved
International Nuclear Information System (INIS)
Hosomichi, Kazuo
2008-01-01
We study FZZT-branes and open string amplitudes in (p, q) minimal string theory. We focus on the simplest boundary changing operators in two-matrix models, and identify the corresponding operators in worldsheet theory through the comparison of amplitudes. Along the way, we find a novel linear relation among FZZT boundary states in minimal string theory. We also show that the boundary ground ring is realized on physical open string operators in a very simple manner, and discuss its use for perturbative computation of higher open string amplitudes.
DEFF Research Database (Denmark)
Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco
2011-01-01
We investigate models in which the inflaton emerges as a composite field of a four dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity...
Minimally invasive approaches in pancreatic pseudocyst: a Case report
Directory of Open Access Journals (Sweden)
Rohollah Y
2009-09-01
Full Text Available Background: Given the importance of the postoperative period, admission duration, postoperative pain, and an acceptable rate of complications, minimally invasive endoscopic approaches to pancreatic pseudocyst management have become more popular, but the best choice of procedure and patient selection are not yet completely established. During the past decade endoscopic procedures have become the first choice in most authors' therapeutic plans; however, open surgery remains the gold standard in pancreatic pseudocyst treatment. Methods: We present a patient with a pancreatic pseudocyst unresponsive to conservative management who underwent endoscopic intervention before the 6th week, and we review the current literature to outline a management scheme. Results: A 16 year old male patient presented with two episodes of acute pancreatitis with abdominal pain, nausea and vomiting. Hyperamylasemia, pancreatic ascites and a pseudocyst were found in our preliminary investigation. Despite optimal conservative management, including NPO (nil per os) and total parenteral nutrition, after four weeks clinical and para-clinical findings deteriorated. Therefore, ERCP and trans-papillary cannulation with placement of a 7Fr stent was
Directory of Open Access Journals (Sweden)
João Carlos Magi
2017-04-01
Full Text Available Minimally invasive procedures aim to resolve the disease with minimal trauma to the body, resulting in a rapid return to activities and in reductions of infection, complications, costs and pain. Minimally incised laparotomy, sometimes referred to as minilaparotomy, is an example of such minimally invasive procedures. The aim of this study is to demonstrate the feasibility and utility of laparotomy with minimal incision based on the literature and exemplifying with a case. The case in question describes reconstruction of the intestinal transit with the use of this incision. Male, young, HIV-positive patient in a late postoperative of ileotiflectomy, terminal ileostomy and closing of the ascending colon by an acute perforating abdomen, due to ileocolonic tuberculosis. The barium enema showed a proximal stump of the right colon near the ileostomy. The access to the cavity was made through the orifice resulting from the release of the stoma, with a lateral-lateral ileo-colonic anastomosis with a 25 mm circular stapler and manual closure of the ileal stump. These surgeries require their own tactics, such as rigor in the lysis of adhesions, tissue traction, and hemostasis, in addition to requiring surgeon dexterity – but without the need for investments in technology; moreover, the learning curve is reported as being lower than that for videolaparoscopy. Laparotomy with minimal incision should be considered as a valid and viable option in the treatment of surgical conditions.
Thin film photovoltaic devices with a minimally conductive buffer layer
Barnes, Teresa M.; Burst, James
2016-11-15
A thin film photovoltaic device (100) with a tunable, minimally conductive buffer (128) layer is provided. The photovoltaic device (100) may include a back contact (150), a transparent front contact stack (120), and an absorber (140) positioned between the front contact stack (120) and the back contact (150). The front contact stack (120) may include a low resistivity transparent conductive oxide (TCO) layer (124) and a buffer layer (128) that is proximate to the absorber layer (140). The photovoltaic device (100) may also include a window layer (130) between the buffer layer (128) and the absorber (140). In some cases, the buffer layer (128) is minimally conductive, with its resistivity being tunable, and the buffer layer (128) may be formed as an alloy from a host oxide and a high-permittivity oxide. The high-permittivity oxide may further be chosen to have a bandgap greater than the host oxide.
Form 6 - gas balancing agreement
International Nuclear Information System (INIS)
Anon.
1990-01-01
In 1988, a special Committee of the Rocky Mountain Mineral Law Foundation undertook a project to draft a model form gas balancing agreement. This project was initiated at the request of a number of Foundation members who felt that a model form gas balancing agreement would facilitate the negotiation of operating agreements, since gas balancing issues had become sticking points in the process. The Committee was composed of attorneys representing a wide cross-section of the oil and gas industry, including both major and independent oil companies, production companies with interstate pipeline affiliates, and private practitioners. The Committee attempted to address the more controversial issues in gas balancing with optional provisions in the Form. To facilitate the negotiation process, the number of optional provisions was minimized. This Form may be used as an Appendix to the new A.A.P.L. Form 610-1989 Model Form Operating Agreement. This book includes the provisions of this Form, which are: Ownership of gas production; Balancing of production accounts; Cash balancing upon depletion; Deliverability tests; Nominations; Statements; Payment of taxes; Operating expenses; Overproducing allowable; Payment of leasehold burdens; Operator's liability; Successors and assigns; Audits; Arbitration; and Operator's fees.
Disjoint sum forms in reliability theory
Directory of Open Access Journals (Sweden)
B. Anrig
2014-01-01
Full Text Available The structure function f of a binary monotone system is assumed to be known and given in disjunctive normal form, i.e. as the logical union of products of the indicator variables of the states of its subsystems. Based on this representation of f, an improved Abraham algorithm is proposed for generating the disjoint sum form of f. This form is the basis for subsequent numerical reliability calculations. The approach is generalized to multivalued systems. Examples are discussed.
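A minimal sketch of the classical Abraham expansion (the paper proposes an improved variant, not reproduced here) for a structure function given as a union of min-path products; the example system and probabilities are invented:

```python
from math import prod

def abraham_disjoint(paths):
    """Expand a union of products (min-path sets) into disjoint products.

    Each returned term maps a component to 1 (working) or 0 (failed);
    the terms are mutually exclusive and cover the same event as the DNF.
    """
    terms = []
    for i, path in enumerate(paths):
        pending = [{v: 1 for v in path}]
        for q in paths[:i]:                      # make disjoint from earlier products
            expanded = []
            for t in pending:
                if any(t.get(v) == 0 for v in q):
                    expanded.append(t)           # already disjoint from q
                    continue
                free = [v for v in q if v not in t]
                if not free:                     # t implies q: event already counted
                    continue
                # t & not-q = sum_k (first k free vars working, (k+1)-th failed)
                for k, v in enumerate(free):
                    new = dict(t)
                    for w in free[:k]:
                        new[w] = 1
                    new[v] = 0
                    expanded.append(new)
            pending = expanded
        terms.extend(pending)
    return terms

def reliability(paths, p):
    # the disjoint terms' probabilities simply add up
    return sum(prod(p[v] if val else 1.0 - p[v] for v, val in t.items())
               for t in abraham_disjoint(paths))

# component 1 in series with components 2 and 3 in parallel
paths = [{1, 2}, {1, 3}]
p = {1: 0.9, 2: 0.9, 3: 0.9}
r = reliability(paths, p)   # 0.9 * (1 - 0.1 * 0.1) = 0.891
```

The disjointness is what makes the final sum a plain addition of term probabilities, which is the basis for the numerical reliability calculations mentioned in the abstract.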
Safety of disclosing amyloid status in cognitively normal older adults.
Burns, Jeffrey M; Johnson, David K; Liebmann, Edward P; Bothwell, Rebecca J; Morris, Jill K; Vidoni, Eric D
2017-09-01
Disclosing amyloid status to cognitively normal individuals remains controversial, given our limited understanding of the test's clinical significance and its unknown psychological risk. We assessed the effect of amyloid status disclosure on anxiety and depression before disclosure, at disclosure, and 6 weeks and 6 months postdisclosure, and test-related distress after disclosure. Clinicians disclosed amyloid status to 97 cognitively normal older adults (27 had elevated cerebral amyloid). There was no difference in depressive symptoms across groups over time. There was a significant group by time interaction in anxiety, although post hoc analyses revealed no group differences at any time point, suggesting a minimal, nonsustained increase in anxiety symptoms immediately postdisclosure in the elevated group. Slight but measurable increases in test-related distress were present after disclosure and were related to greater baseline levels of anxiety and depression. Disclosing amyloid imaging results to cognitively normal adults in the clinical research setting with pre- and postdisclosure counseling has a low risk of psychological harm. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Minimizing embarrassment: boys' experiences of pubertal changes.
Flaming, D; Morse, J M
1991-01-01
As very little is known about boys' subjective and emotional experiences when going through puberty, the qualitative research method of grounded theory was used in this study to address the question: "What is the experience of the physical maturational changes in male adolescents?" A Basic Social Psychological Process emerged, Minimizing Embarrassment, with four stages: waiting for the change, noticing the change, dealing with the change, and feeling comfortable with the change. Boys developed expectations from listening to others, by looking at older males, and by wondering and imagining what the changes would eventually be like for them. After developing these expectations, they compared their physical changes to others and to their own expectations. If the boys felt they were different from their peers, they worried about this difference. They used strategies such as avoiding, pretending, and joking to avoid embarrassment or to deal with embarrassing situations. If the boys felt they were "normal," they accepted the fact that they were maturing properly.
International Nuclear Information System (INIS)
Fan, Yize; Wu, Puxun; Yu, Hongwei
2015-01-01
Cosmological perturbations of the non-minimally coupled scalar field dark energy in both the metric and Palatini formalisms are studied in this paper. We find that on the large scales with the energy density of dark energy becoming more and more important in the low redshift region, the gravitational potential becomes smaller and smaller, and the effect of non-minimal coupling becomes more and more apparent. In the metric formalism the value of the gravitational potential in the non-minimally coupled case with a positive coupling constant is less than that in the minimally coupled case, while it is larger if the coupling constant is negative. This is different from that in the Palatini formalism where the value of gravitational potential is always smaller. Based upon the quasi-static approximation on the sub-horizon scales, the linear growth of matter is also analyzed. We obtain that the effective Newton's constants in the metric and Palatini formalisms have different forms. A negative coupling constant enhances the gravitational interaction, while a positive one weakens it. Although the metric and Palatini formalisms give different linear growth rates, the difference is very small and the current observation cannot distinguish them effectively
Monzalvo, Karla; Fluss, Joel; Billard, Catherine; Dehaene, Stanislas; Dehaene-Lambertz, Ghislaine
2012-05-15
In dyslexia, anomalous activations have been described in both left temporo-parietal language cortices and in left ventral visual occipito-temporal cortex. However, the reproducibility, task-dependency, and presence of these brain anomalies in childhood rather than adulthood remain debated. We probed the large-scale organization of ventral visual and spoken language areas in dyslexic children using minimal target-detection tasks that were performed equally well by all groups. In 23 normal and 23 dyslexic 10-year-old children from two different socio-economic status (SES) backgrounds, we compared fMRI activity to visually presented houses, faces, and written strings, and to spoken sentences in the native or in a foreign language. Our results confirm a disorganization of both ventral visual and spoken language areas in dyslexic children. Visually, dyslexic children showed a normal lateral-to-medial mosaic of preferences, as well as normal responses to houses and checkerboards, but a reduced activation to words in the visual word form area (VWFA) and to faces in the right fusiform face area (FFA). Auditorily, dyslexic children exhibited reduced responses to speech in posterior temporal cortex, left insula and supplementary motor area, as well as reduced responses to maternal language in subparts of the planum temporale, left basal language area and VWFA. By correlating these two findings, we identify spoken-language predictors of VWFA activation to written words, which differ for dyslexic and normal readers. Similarities in fMRI deficits in both SES groups emphasize the existence of a core set of brain activation anomalies in dyslexia, regardless of culture, language and SES, without however resolving whether these anomalies are a cause or a consequence of impaired reading. Copyright © 2012 Elsevier Inc. All rights reserved.
Brightness-normalized Partial Least Squares Regression for hyperspectral data
International Nuclear Information System (INIS)
Feilhauer, Hannes; Asner, Gregory P.; Martin, Roberta E.; Schmidtlein, Sebastian
2010-01-01
Developed in the field of chemometrics, Partial Least Squares Regression (PLSR) has become an established technique in vegetation remote sensing. PLSR was primarily designed for laboratory analysis of prepared material samples. Under field conditions in vegetation remote sensing, the performance of the technique may be negatively affected by differences in brightness due to amount and orientation of plant tissues in canopies or the observing conditions. To minimize these effects, we introduced brightness normalization to the PLSR approach and tested whether this modification improves the performance under changing canopy and observing conditions. This test was carried out using high-fidelity spectral data (400-2510 nm) to model observed leaf chemistry. The spectral data was combined with a canopy radiative transfer model to simulate effects of varying canopy structure and viewing geometry. Brightness normalization enhanced the performance of PLSR by dampening the effects of canopy shade, thus providing a significant improvement in predictions of leaf chemistry (up to 3.6% additional explained variance in validation) compared to conventional PLSR. Little improvement was made on effects due to variable leaf area index, while minor improvement (mostly not significant) was observed for effects of variable viewing geometry. In general, brightness normalization increased the stability of model fits and regression coefficients for all canopy scenarios. Brightness-normalized PLSR is thus a promising approach for application on airborne and space-based imaging spectrometer data.
Expression of the pluripotency transcription factor OCT4 in the normal and aberrant mammary gland
Directory of Open Access Journals (Sweden)
Foteini Hassiotou
2013-04-01
Full Text Available Breast cancers with lactating features, some of which are associated with pregnancy and lactation, are often poorly differentiated, lack estrogen receptor, progesterone receptor and HER2 expression and have high mortality. Very little is known about the molecular mechanisms that drive uncontrolled cell proliferation in these tumors and confer lactating features. We have recently reported expression of OCT4 and associated embryonic stem cell (ESC self-renewal genes in the normal lactating breast and breastmilk stem cells (hBSCs. This prompted us to examine OCT4 expression in breast cancers with lactating features and compare it with that observed during normal lactation, using rare specimens of human lactating breast. In accordance with previous literature, the normal resting breast (from non-pregnant, non-lactating women showed minimal OCT4 nuclear expression (0.9%. However, this increased in the normal lactating breast (11.4%, with further increase in lactating adenomas, lactating carcinomas and pregnancy-associated breast cancer (30.7-48.3%. OCT4 was expressed in the epithelium and at lower levels in the stroma, and was co-localized with NANOG. Comparison of normal non-tumorigenic hBSCs with OCT4-overexpressing tumorigenic breast cell lines (OTBCs demonstrated upregulation of OCT4, SOX2 and NANOG in both systems, but OTBCs expressed OCT4 at significantly higher levels than SOX2 and NANOG. Similar to hBSCs, OTBCs displayed multi-lineage differentiation potential, including the ability to differentiate into functional lactocytes synthesizing milk proteins both in vitro and in vivo. Based on these findings, we propose a hypothesis of normal and malignant transformation in the breast, which centers on OCT4 and its associated gene network. Although minimal expression of these embryonic genes can be seen in the breast in its resting state throughout life, a controlled program of upregulation of this gene network may be a potential regulator of the
Meta-heuristic CRPS minimization for the calibration of short-range probabilistic forecasts
Mohammadi, Seyedeh Atefeh; Rahmani, Morteza; Azadi, Majid
2016-08-01
This paper deals with probabilistic short-range temperature forecasts over synoptic meteorological stations across Iran using non-homogeneous Gaussian regression (NGR). NGR creates a Gaussian forecast probability density function (PDF) from the ensemble output. The mean of the normal predictive PDF is a bias-corrected weighted average of the ensemble members, and its variance is a linear function of the raw ensemble variance. The coefficients for the mean and variance are estimated by minimizing the continuous ranked probability score (CRPS), a scoring rule for distributional forecasts, over a training period. Gneiting et al. (Mon Weather Rev 133:1098-1118, 2005) used the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method to minimize the CRPS. Since BFGS is a conventional optimization method with its own limitations, we suggest using particle swarm optimization (PSO), a robust meta-heuristic method, to minimize the CRPS. The ensemble prediction system used in this study consists of nine different configurations of the Weather Research and Forecasting model for 48-h forecasts of temperature during autumn and winter 2011 and 2012. The probabilistic forecasts were evaluated using several common verification scores, including the Brier score, the attribute diagram and the rank histogram. Results show that both BFGS and PSO find the optimal solution and yield the same evaluation scores, but PSO does so with a feasible random first guess and much lower computational complexity.
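For a Gaussian predictive distribution N(mu, sigma^2), the CRPS minimized in NGR has a well-known closed form (Gneiting et al. 2005). A minimal Python sketch of that training objective is given below; the coefficient names a, b, c, d and the function names are chosen here for illustration, not taken from the paper.

```python
import math

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a Gaussian N(mu, sigma^2) forecast against observation y."""
    z = (y - mu) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # standard normal density at z
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # standard normal CDF at z
    return sigma * (z * (2.0 * cdf - 1.0) + 2.0 * pdf - 1.0 / math.sqrt(math.pi))

def mean_crps(params, ensemble_means, ensemble_vars, obs):
    """NGR training objective: average CRPS with mu = a + b*xbar, sigma^2 = c + d*s2."""
    a, b, c, d = params
    total = 0.0
    for xbar, s2, y in zip(ensemble_means, ensemble_vars, obs):
        mu = a + b * xbar
        sigma = math.sqrt(max(c + d * s2, 1e-9))  # guard: keep the variance positive
        total += crps_gaussian(mu, sigma, y)
    return total / len(obs)
```

Either BFGS or PSO can then be applied to `mean_crps` over a training set; the objective itself is the same for both optimizers.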
Investigation of normal organ development with fetal MRI
International Nuclear Information System (INIS)
Prayer, Daniela; Brugger, Peter C.
2007-01-01
The understanding of the presentation of normal organ development on fetal MRI forms the basis for recognition of pathological states. During the second and third trimesters, maturational processes include changes in size, shape and signal intensities of organs. Visualization of these developmental processes requires tailored MR protocols. Further prerequisites for recognition of normal maturational states are unequivocal intrauterine orientation with respect to left and right body halves, fetal proportions, and knowledge about the MR presentation of extrafetal/intrauterine organs. Emphasis is laid on the demonstration of normal MR appearance of organs that are frequently involved in malformation syndromes. In addition, examples of time-dependent contrast enhancement of intrauterine structures are given. (orig.)
Investigation of normal organ development with fetal MRI
Energy Technology Data Exchange (ETDEWEB)
Prayer, Daniela [Medical University of Vienna, Department of Radiology, Vienna (Austria); Brugger, Peter C. [Medical University of Vienna, Center of Anatomy and Cell Biology, Integrative Morphology Group, Vienna (Austria)
2007-10-15
The understanding of the presentation of normal organ development on fetal MRI forms the basis for recognition of pathological states. During the second and third trimesters, maturational processes include changes in size, shape and signal intensities of organs. Visualization of these developmental processes requires tailored MR protocols. Further prerequisites for recognition of normal maturational states are unequivocal intrauterine orientation with respect to left and right body halves, fetal proportions, and knowledge about the MR presentation of extrafetal/intrauterine organs. Emphasis is laid on the demonstration of normal MR appearance of organs that are frequently involved in malformation syndromes. In addition, examples of time-dependent contrast enhancement of intrauterine structures are given. (orig.)
Minimal Flavour Violation and Beyond
Isidori, Gino
2012-01-01
We review the formulation of the Minimal Flavour Violation (MFV) hypothesis in the quark sector, as well as some "variations on a theme" based on smaller flavour symmetry groups and/or less minimal breaking terms. We also review how these hypotheses can be tested in B decays and by means of other flavour-physics observables. The phenomenological consequences of MFV are discussed both in general terms, employing a general effective theory approach, and in the specific context of the Minimal Supersymmetric extension of the SM.
International Nuclear Information System (INIS)
Dagan, E.B.; Selby, K.B.
1993-08-01
The Hanford Site is located in the State of Washington and is subject to state and federal environmental regulations that hamper waste minimization efforts. This paper addresses the negative effect of these regulations on waste minimization and on mixed waste issues related to the Hanford Site, and considers the prospect of the regulations becoming more lenient. In addition to field operations, the Hanford Site is home to the Pacific Northwest Laboratory, which has many ongoing waste minimization activities of particular interest to laboratories.
The anti-tumor efficacy of nanoparticulate form of ICD-85 versus free form
Directory of Open Access Journals (Sweden)
Zare Mirakabadi, A.
2015-04-01
Full Text Available Biodegradable polymeric nanoparticles (NPs) have been intensively studied as a possible way to enhance anti-tumor efficacy while reducing side effects. ICD-85, derived from the venom of two separate species of venomous animals, has been shown to exhibit anti-cancer activity. In this report, polymer-based sodium alginate nanoparticles of ICD-85 were used to enhance its therapeutic effects and reduce its side effects. The inhibitory effect was evaluated by MTT assay, the necrotic effect was assessed using LDH assay, and the induction of apoptosis was analyzed with a caspase-8 colorimetric assay kit. Cytotoxicity assays in HeLa cells demonstrated enhanced efficacy of ICD-85-loaded NPs compared to free ICD-85. The IC50 values obtained in HeLa cells after 48 h for free ICD-85 and ICD-85-loaded NPs were 26±2.9 μg ml-1 and 18±2.5 μg ml-1, respectively. While free ICD-85 exhibits mild cytotoxicity towards normal MRC-5 cells (IC50>60 μg ml-1), ICD-85-loaded NPs showed higher anti-proliferative activity on HeLa cells in vitro without any significant cytotoxic effect on normal MRC-5 cells. The apoptosis-induction mechanism of both forms of ICD-85 in HeLa cells was found to be activation of caspase-8, approximately 2-fold greater for ICD-85-loaded NPs than for free ICD-85. Our work reveals that although ICD-85 in free form is relatively selective in inhibiting the growth of cancer cells via apoptosis compared to normal cells, the nanoparticulate form further increases its selectivity towards cancer cells.
Beattie, A J; Oliver, I
1994-12-01
Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.
Minimizing waste in environmental restoration
International Nuclear Information System (INIS)
Thuot, J.R.; Moos, L.
1996-01-01
Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized; however, there are significant areas where waste and cost can be reduced by careful planning and execution. Waste reduction can occur in three ways: beneficial reuse or recycling, segregation of waste types, and reducing generation of secondary waste
Quantization of the minimal and non-minimal vector field in curved space
Toms, David J.
2015-01-01
The local momentum space method is used to study the quantized massive vector field (the Proca field) with the possible addition of non-minimal terms. Heat kernel coefficients are calculated and used to evaluate the divergent part of the one-loop effective action. It is shown that the naive expression for the effective action that one would write down based on the minimal coupling case needs modification. We adopt a Faddeev-Jackiw method of quantization and consider the case of an ultrastatic...
Directory of Open Access Journals (Sweden)
Olga Sergeevna Muftahova
2015-11-01
Full Text Available The article describes a financing form for the modernization of fixed assets based on import revolving leverage leasing. The research applies methods of mathematical modeling. As a result, a mathematical model of the generalized method for calculating leasing payments, proposed by the authors, is presented. The method covers several forms of leasing used both in domestic and in foreign practice. Its novelty is that, on the basis of the proposed forms of leasing, the leasing payment is calculated to include insurance, financial and currency components, so as to minimize losses from downtime caused by the limited use of the basic production assets of the enterprise in the organization of the production process. The method presented by the authors is intended to minimize the risk of downtime of production equipment, which enables the enterprise to maintain high qualitative and quantitative performance indicators and the stability and continuity of the production process.
Caronia, Francesco Paolo; Arrigo, Ettore; Failla, Andrea Valentino; Sgalambro, Francesco; Giannone, Giorgio; Lo Monte, Attilio Ignazio; Cajozzo, Massimo; Santini, Mario; Fiorelli, Alfonso
2018-04-01
A 67-year-old man was referred to our attention for management of an esophageal adenocarcinoma localized at the esophagogastric junction and obstructing one-third of the esophageal lumen. Due to the extension of the disease (T3N1M0, Stage IIIA), the patient underwent neo-adjuvant chemo-radiation therapy and was then scheduled for a minimally invasive surgical procedure including laparoscopic gastroplasty, uniportal thoracoscopic esophageal dissection and intrathoracic end-to-end esophago-gastric anastomosis. No intraoperative or post-operative complications were seen, and the patient was discharged on post-operative day 9. Pathological study confirmed the diagnosis of adenocarcinoma (T2N1M0, Stage IIB) and he underwent adjuvant chemotherapy. At the time of writing, the patient is alive and well without signs of recurrence or metastasis. Compared to the standard open procedure, our minimally invasive approach should help reduce post-operative pain and favour an early return to normal activity. However, future experience with a control group is required before our strategy can be widely adopted.
International Nuclear Information System (INIS)
Schomburg, A.; Bender, H.; Reichel, C.; Sommer, T.; Ruhlmann, J.; Kozak, B.; Biersack, H.J.
1996-01-01
While the evident advantages of absolute metabolic rate determinations cannot be equalled by static image analysis of fluorine-18 fluorodeoxyglucose positron emission tomography (FDG PET) studies, various algorithms for the normalization of static FDG uptake values have been proposed. This study was performed to compare different normalization procedures in terms of their dependency on individual patient characteristics. Standardized FDG uptake values (SUVs) were calculated for liver and lung tissue in 126 patients studied with whole-body FDG PET. Uptake values were normalized for total body weight, lean body mass and body surface area. Ranges, means, medians, standard deviations and variation coefficients of these SUV parameters were calculated, and their dependence on total body weight, lean body mass, body surface area, patient height and blood sugar levels was assessed by means of regression analysis. Standardized FDG uptake values normalized for body surface area were clearly superior to SUV parameters normalized for total body weight or lean body mass. Variation and correlation coefficients of body surface area-normalized uptake values were minimal when compared with SUV parameters derived from the other normalization procedures. Normalization for total body weight resulted in uptake values still dependent on body weight and blood sugar levels, while normalization for lean body mass did not eliminate the positive correlation with lean body mass and patient height. It is concluded that normalization of FDG uptake values for body surface area is less dependent on individual patient characteristics than are FDG uptake values normalized for other parameters, and therefore appears to be preferable for FDG PET studies in oncology. (orig.)
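The three normalization schemes compared above differ only in the denominator of the SUV. The sketch below uses the DuBois body surface area and James lean body mass formulas, which are common choices; the abstract does not state which variants the study actually used, so these are assumptions.

```python
def suv_bw(conc_kbq_per_ml, dose_mbq, weight_kg):
    # SUV (body weight): tissue concentration divided by injected dose per gram of body
    # (approximating 1 mL of tissue as 1 g).
    dose_kbq = dose_mbq * 1000.0
    weight_g = weight_kg * 1000.0
    return conc_kbq_per_ml / (dose_kbq / weight_g)

def bsa_dubois(weight_kg, height_cm):
    # DuBois & DuBois body surface area in m^2 (assumed formula; others exist).
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def lbm_james(weight_kg, height_cm, male=True):
    # James formula for lean body mass in kg (assumed formula; sex-specific coefficients).
    if male:
        return 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2
    return 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2
```

Substituting `lbm_james(...)` or `bsa_dubois(...)` (scaled to the same units) for the weight in the denominator yields SUV-LBM and SUV-BSA, the variants compared in the study.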
Lie algebra of conformal Killing–Yano forms
International Nuclear Information System (INIS)
Ertem, Ümit
2016-01-01
We provide a generalization of the Lie algebra of conformal Killing vector fields to conformal Killing–Yano forms. A new Lie bracket for conformal Killing–Yano forms that corresponds to slightly modified Schouten–Nijenhuis bracket of differential forms is proposed. We show that conformal Killing–Yano forms satisfy a graded Lie algebra in constant curvature manifolds. It is also proven that normal conformal Killing–Yano forms in Einstein manifolds also satisfy a graded Lie algebra. The constructed graded Lie algebras reduce to the graded Lie algebra of Killing–Yano forms and the Lie algebras of conformal Killing and Killing vector fields in special cases. (paper)
Self-duality in Maxwell-Chern-Simons theories with non minimal coupling with matter field
Chandelier, F; Masson, T; Wallet, J C
2000-01-01
We consider a general class of non-local MCS models whose usual minimal coupling to a conserved current is supplemented with a (non-minimal) magnetic Pauli-type coupling. We find that the considered models exhibit a self-duality whenever the magnetic coupling constant reaches a special value: the partition function is invariant under a set of transformations among the parameter space (the duality transformations) while the original action and its dual counterpart have the same form. The duality transformations have a structure similar to the one underlying self-duality of the (2+1)-dimensional Z_n Abelian Higgs model with Chern-Simons and bare mass term.
Gravity-assisted exact unification in minimal supersymmetric SU(5) and its gaugino mass spectrum
International Nuclear Information System (INIS)
Tobe, Kazuhiro; Wells, James D.
2004-01-01
Minimal supersymmetric SU(5) with exact unification is naively inconsistent with proton decay constraints. However, it can be made viable by a gravity-induced non-renormalizable operator connecting the adjoint Higgs boson and adjoint vector boson representations. We compute the allowed coupling space for this theory and find natural compatibility with proton decay constraints even for relatively light superpartner masses. The modifications away from the naive SU(5) theory have an impact on the gaugino mass spectrum, which we calculate. A combination of precision linear collider and large hadron collider measurements of superpartner masses would enable interesting tests of the high-scale form of minimal supersymmetric SU(5)
International Nuclear Information System (INIS)
Blanca, Ernest
1974-10-01
Alpha-numeric boolean expressions, written in the form of sums of products and/or products of sums with many brackets, may be minimized in two steps: syntactic recognition analysis using an operator precedence grammar, followed by syntactic reduction analysis. These two phases of execution and the different programs of the corresponding machine algorithm are described. Examples of the minimization of alpha-numeric boolean expressions written with brackets, usage notes for the program CHOPIN, and theoretical considerations related to languages, grammars, operator precedence grammars, sequential systems, boolean sets, boolean representations and treatments of boolean expressions, and boolean matrices and their use in grammar theory are discussed. (author) [fr
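The abstract does not give the reduction algorithm itself. As a hedged, illustrative aside: whatever reduction a minimizer performs, the result must be logically equivalent to the input, and for expressions over a few variables this can be checked by exhaustive truth-table comparison. The sketch below is a verification helper, not the CHOPIN algorithm.

```python
import itertools
import re

def equivalent(e1, e2):
    """Check two boolean expressions (Python syntax: and/or/not, single-letter
    variables) for equivalence by enumerating the full truth table."""
    # Single-letter identifiers only, so 'and', 'or', 'not' are never captured.
    names = sorted(set(re.findall(r"\b[a-z]\b", e1 + " " + e2)))
    for values in itertools.product([False, True], repeat=len(names)):
        env = dict(zip(names, values))
        # eval is acceptable here: the expressions are our own test literals.
        if bool(eval(e1, {}, env)) != bool(eval(e2, {}, env)):
            return False
    return True
```

For example, a minimizer reducing `a and (b or not b)` to `a` can be validated with `equivalent("a and (b or not b)", "a")`.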
Electromagnetic form factors of a massive neutrino
International Nuclear Information System (INIS)
Dvornikov, M.S.; Studenikin, A.I.
2004-01-01
Electromagnetic form factors of a massive neutrino are studied in a minimally extended standard model in an arbitrary R_ξ gauge, taking into account the dependence on the masses of all interacting particles. The contributions of all Feynman diagrams to the electric, magnetic, and anapole form factors, in which the dependence on the masses of all particles as well as on the gauge parameters is accounted for exactly, are obtained for the first time in explicit form. The asymptotic behavior of the magnetic form factor for large negative squares of the momentum of the external photon is analyzed, and the expression for the anapole moment of a massive neutrino is derived. The results are generalized to the case of mixing between various neutrino flavors. Explicit expressions are obtained for the electric, magnetic, electric dipole and anapole transition form factors, as well as for the transition electric dipole moment
Minimal access surgery of pediatric inguinal hernias: a review.
Saranga Bharathi, Ramanathan; Arora, Manu; Baskaran, Vasudevan
2008-08-01
Inguinal hernia is a common problem among children, and herniotomy has been its standard of care. Laparoscopy, which gained a toehold initially in the management of pediatric inguinal hernia (PIH), has managed to steer world opinion against routine contralateral groin exploration by precise detection of contralateral patencies. Besides detection, its ability to repair simultaneously all forms of inguinal hernias (indirect, direct, combined, recurrent, and incarcerated) together with contralateral patencies has cemented its role as a viable alternative to conventional repair. Numerous minimally invasive techniques for addressing PIH have mushroomed in the past two decades. These techniques vary considerably in their approaches to the internal ring (intraperitoneal, extraperitoneal), use of ports (three, two, one), endoscopic instruments (two, one, or none), sutures (absorbable, nonabsorbable), and techniques of knotting (intracorporeal, extracorporeal). In addition to the surgeons' experience and the merits/limitations of individual techniques, it is the nature of the defect that should govern the choice of technique. The emerging techniques show a trend toward increasing use of extracorporeal knotting and diminishing use of working ports and endoscopic instruments. These favor wider adoption of minimal access surgery in addressing PIH by surgeons, irrespective of their laparoscopic skills and experience. Growing experience, wider adoption, decreasing complications, and increasing advantages favor emergence of minimal access surgery as the gold standard for the treatment of PIH in the future. This article comprehensively reviews the laparoscopic techniques of addressing PIH.
Sludge minimization technologies - an overview
Energy Technology Data Exchange (ETDEWEB)
Oedegaard, Hallvard
2003-07-01
The management of wastewater sludge from wastewater treatment plants represents one of the major challenges in wastewater treatment today. In many cases the cost of sludge treatment exceeds the cost of treating the liquid stream, so the focus on and interest in sludge minimization is steadily increasing. The paper gives an overview of sludge minimization (sludge mass reduction) options. It is demonstrated that sludge minimization may result from reduced production of sludge and/or from disintegration processes that may take place both in the wastewater treatment stage and in the sludge stage. Various sludge disintegration technologies for sludge minimization are discussed, including mechanical methods (focusing on the stirred ball-mill, high-pressure homogenizer and ultrasonic disintegrator), chemical methods (focusing on the use of ozone), physical methods (focusing on thermal and thermal/chemical hydrolysis) and biological methods (focusing on enzymatic processes). (author)
Kanda, Hiroyuki; Morimoto, Takeshi; Fujikado, Takashi; Tano, Yasuo; Fukuda, Yutaka; Sawai, Hajime
2004-02-01
Assessment of a novel method of retinal stimulation, known as suprachoroidal-transretinal stimulation (STS), which was designed to minimize insult to the retina by implantation of stimulating electrodes for artificial vision. In 17 normal hooded rats and 12 Royal College of Surgeons (RCS) rats, a small area of the retina was focally stimulated with electric currents through an anode placed on the fenestrated sclera and a cathode inserted into the vitreous chamber. Evoked potentials (EPs) in response to STS were recorded from the surface of the superior colliculus (SC) with a silver-ball electrode, and their physiological properties and localization were studied. In both normal and RCS rats, STS elicited triphasic EPs that were vastly diminished by changing polarity of stimulating electrodes and abolished by transecting the optic nerve. The threshold intensity (C) of the EP response to STS was approximately 7.2 +/- 2.8 nC in normal and 12.9 +/- 7.7 nC in RCS rats. The responses to minimal STS were localized in an area on the SC surface measuring 0.12 +/- 0.07 mm(2) in normal rats and 0.24 +/- 0.12 mm(2) in RCS rats. The responsive area corresponded retinotopically to the retinal region immediately beneath the anodic stimulating electrode. STS is less invasive in the retina than stimulation through epiretinal or subretinal implants. STS can generate focal excitation in retinal ganglion cells in normal animals and in those with degenerated photoreceptors, which suggests that this method of retinal stimulation is suitable for artificial vision.
Spectrogram Image Analysis of Error Signals for Minimizing Impulse Noise
Directory of Open Access Journals (Sweden)
Jeakwan Kim
2016-01-01
Full Text Available This paper presents a theoretical and experimental study on the spectrogram image analysis of error signals for minimizing impulse input noises in active noise suppression. Impulse inputs with specific wave patterns are applied as primary noises to a one-dimensional duct of length 1800 mm. The convergence speed of the adaptive feedforward algorithm, based on the least mean square approach, is controlled by a normalized step size incorporated into the algorithm. Variations of the step size govern the stability as well as the convergence speed; for this reason, the normalized step size is introduced as a new method for the control of impulse noise. Spectrogram images, which indicate the degree of attenuation of the impulse input noises, are used to assess the attenuation achieved with the new method. The algorithm is extensively investigated in both simulation and real-time control experiments. It is demonstrated that the suggested algorithm remains stable and performs well against impulse noises. The results of this study can be used in practical active noise control systems.
An augmented reality platform for planning of minimally invasive cardiac surgeries
Chen, Elvis C. S.; Sarkar, Kripasindhu; Baxter, John S. H.; Moore, John; Wedlake, Chris; Peters, Terry M.
2012-02-01
One of the fundamental components in all Image Guided Surgery (IGS) applications is a method for presenting information to the surgeon in a simple, effective manner. This paper describes the first steps in our new Augmented Reality (AR) information delivery program. The system makes use of new "off the shelf" AR glasses that are both light-weight and unobtrusive, with adequate resolution for many IGS applications. Our first application is perioperative planning of minimally invasive robot-assisted cardiac surgery. In this procedure, a combination of tracking technologies and intraoperative ultrasound is used to map the migration of cardiac targets prior to selection of port locations for trocars that enter the chest. The AR glasses will then be used to present this heart migration data to the surgeon, overlaid onto the patient's chest. The current paper describes the calibration process for the AR glasses, their integration into our IGS framework for minimally invasive robotic cardiac surgery, and preliminary validation of the system. Validation results indicate a mean 3D triangulation error of 2.9 +/- 3.3 mm, a 2D projection error of 2.1 +/- 2.1 pixels, and a Normalized Stereo Calibration Error of 3.3.
Multiple normalized solutions for a planar gauged nonlinear Schrödinger equation
Luo, Xiao
2018-06-01
We study the existence, multiplicity, quantitative properties and asymptotic behavior of normalized solutions for a gauged nonlinear Schrödinger equation arising from Chern-Simons theory, \Delta u + \omega u + |x|^2 u + \lambda \left( \frac{h^2(|x|)}{|x|^2} + \int_{|x|}^{+\infty} \frac{h(s)}{s}\, u^2(s)\, ds \right) u = |u|^{p-2} u, \quad x \in \mathbb{R}^2, where \omega \in \mathbb{R}, \lambda > 0, p > 4 and h(s) = \frac{1}{2} \int_0^s r u^2(r)\, dr. Combining the constraint minimization method and the minimax principle, we prove that the problem possesses at least two normalized solutions: one is a ground state and the other is an excited state. Furthermore, the asymptotic behavior and quantitative properties of the ground state are analyzed.
Minimal and careful processing
Nielsen, Thorkild
2004-01-01
In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...
Top down and bottom up selection drives variations in frequency and form of a visual signal
Yeh, Chien-Wei; Blamires, Sean J.; Liao, Chen-Pan; Tso, I.-Min
2015-01-01
The frequency and form of visual signals can be shaped by selection from predators, prey or both. When a signal simultaneously attracts predators and prey, selection may favour a strategy that minimizes risks while attracting prey. Accordingly, varying the frequency and form of the silken decorations added to their web may be a way that Argiope spiders minimize predation while attracting prey. Nonetheless, the role of extraneous factors renders the influences of top down and bottom up selecti...
Normal gravity field in relativistic geodesy
Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao
2018-02-01
Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring the 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard, to which gravimetric measurements of the Earth's gravitational field are referred, is a normal gravity field represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field, which is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case, but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
International Nuclear Information System (INIS)
Balakin, A.B.; Zayats, A.E.
2007-01-01
We discuss new exact spherically symmetric static solutions to non-minimally extended Einstein-Yang-Mills equations. The obtained solution to the Yang-Mills subsystem is interpreted as a non-minimal Wu-Yang monopole solution. We focus on the analysis of two classes of the exact solutions to the gravitational field equations. Solutions of the first class belong to the Reissner-Nordstroem type, i.e., they are characterized by horizons and by the singularity at the point of origin. The solutions of the second class are regular ones. The horizons and singularities of a new type, the non-minimal ones, are indicated
Late effects of normal tissues (LENT) scoring system: the SOMA scale
International Nuclear Information System (INIS)
Mornex, F.; Pavy, J.J.; Denekamp, J.
1997-01-01
Radiation tolerance of normal tissues remains the limiting factor for delivering a tumoricidal dose. The late toxicity of normal tissues is the most critical element of an irradiation: somatic, functional and structural alterations occur during the actual treatment itself, but late effects manifest months to years after acute effects heal, and may progress with time. The optimal therapeutic ratio ultimately requires not only complete tumor clearance, but also minimal residual injury to the surrounding vital normal tissues. The disparity between the intensity of acute and late effects, and the inability to predict the eventual manifestation of late normal tissue injury, has made radiation oncologists recognize the importance of careful patient follow-up. There is so far no uniform toxicity scoring system for comparing clinical studies; a 'common toxicity language' is lacking. This justifies the need to establish a precise evaluation system for the analysis of the late effects of radiation on normal tissues. The SOMA/LENT scoring system results from an international collaboration: the European Organization for Research and Treatment of Cancer (EORTC) and the Radiation Therapy Oncology Group (RTOG) have created subcommittees with the aim of addressing the question of standardized toxic effects criteria. This effort appeared necessary in order to standardize and improve data recording, and then to describe and evaluate toxicity uniformly at regular time intervals. The currently proposed scale is not yet validated, and should be used cautiously. (authors)
From maximal to minimal supersymmetry in string loop amplitudes
Energy Technology Data Exchange (ETDEWEB)
Berg, Marcus; Buchberger, Igor [Department of Physics, Karlstad University,651 88 Karlstad (Sweden); Schlotterer, Oliver [Max-Planck-Institut für Gravitationsphysik, Albert-Einstein-Institut,14476 Potsdam (Germany)
2017-04-28
We calculate one-loop string amplitudes of open and closed strings with N=1,2,4 supersymmetry in four and six dimensions, by compactification on Calabi-Yau and K3 orbifolds. In particular, we develop a method to combine contributions from all spin structures for an arbitrary number of legs at minimal supersymmetry. Each amplitude is cast into a compact form by reorganizing the kinematic building blocks and casting the worldsheet integrals in a basis. Infrared regularization plays an important role in exhibiting the expected factorization limits. We comment on implications for the one-loop string effective action.
Wilson loops in minimal surfaces
International Nuclear Information System (INIS)
Drukker, Nadav; Gross, David J.; Ooguri, Hirosi
1999-01-01
The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS_5 × S^5. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple to the scalar fields in the supermultiplet. They show how this is realized for the minimal surface.
Wilson loops and minimal surfaces
International Nuclear Information System (INIS)
Drukker, Nadav; Gross, David J.; Ooguri, Hirosi
1999-01-01
The AdS-CFT correspondence suggests that the Wilson loop of the large N gauge theory with N=4 supersymmetry in four dimensions is described by a minimal surface in AdS_5 × S^5. We examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which we call BPS loops, whose expectation values are free from ultraviolet divergence. We formulate the loop equation for such loops. To the extent that we have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. We also discuss the zigzag symmetry of the loop operator. In the N=4 gauge theory, we expect the zigzag symmetry to hold when the loop does not couple to the scalar fields in the supermultiplet. We will show how this is realized for the minimal surface. (c) 1999 The American Physical Society
Heart failure: when form fails to follow function.
Katz, Arnold M; Rolett, Ellis L
2016-02-01
Cardiac performance is normally determined by architectural, cellular, and molecular structures that determine the heart's form, and by physiological and biochemical mechanisms that regulate the function of these structures. Impaired adaptation of form to function in failing hearts contributes to two syndromes initially called systolic heart failure (SHF) and diastolic heart failure (DHF). In SHF, characterized by high end-diastolic volume (EDV), the left ventricle (LV) cannot eject a normal stroke volume (SV); in DHF, with normal or low EDV, the LV cannot accept a normal venous return. These syndromes are now generally defined in terms of ejection fraction (EF): SHF became 'heart failure with reduced ejection fraction' (HFrEF) while DHF became 'heart failure with normal or preserved ejection fraction' (HFnEF or HFpEF). However, EF is a chimeric index because it is the ratio between SV, which measures function, and EDV, which measures form. In SHF the LV dilates when sarcomere addition in series increases cardiac myocyte length, whereas sarcomere addition in parallel can cause concentric hypertrophy in DHF by increasing myocyte thickness. Although dilatation in SHF allows the LV to accept a greater venous return, it increases the energy cost of ejection and initiates a vicious cycle that contributes to progressive dilatation. In contrast, concentric hypertrophy in DHF facilitates ejection but impairs filling and can cause heart muscle to deteriorate. Differences in the molecular signals that initiate dilatation and concentric hypertrophy can explain why many drugs that improve prognosis in SHF have little if any benefit in DHF. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
On matrix superpotential and three-component normal modes
Energy Technology Data Exchange (ETDEWEB)
Rodrigues, R. de Lima [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Lima, A.F. de [Universidade Federal de Campina Grande (UFCG), PB (Brazil). Dept. de Fisica; Mello, E.R. Bezerra de; Bezerra, V.B. [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil). Dept. de Fisica]. E-mails: rafael@df.ufcg.edu.br; aerlima@df.ufcg.edu.br; emello@fisica.ufpb.br; valdir@fisica.ufpb.br
2007-07-01
We consider the supersymmetric quantum mechanics (SUSY QM) with three-component normal modes for the Bogomol'nyi-Prasad-Sommerfield (BPS) states. An explicit form of the SUSY QM matrix superpotential is presented and the corresponding three-component bosonic zero-mode eigenfunction is investigated. (author)
CT-based needle marking of superficial intracranial lesions for minimal invasive neurosurgery
International Nuclear Information System (INIS)
Marquardt, G.; Wolff, R.; Schick, U.; Lorenz, R.
2000-01-01
A CT-based method of marking superficial intracranial lesions with a needle is presented. This form of neuronavigation can be applied in every neurosurgical centre. Owing to its rapid application, it is also suitable for emergency cases. The neurosurgical approach can be centred precisely over the lesion, providing for a minimally invasive operation. The method has proved its efficacy in numerous cases of haematomas and cystic lesions. (author)
Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis
Střelec, Luboš
2011-09-01
The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak-form tests is that share prices follow a random walk, i.e. that returns are realizations of an IID sequence of random variables. Consequently, to verify the weak form of the efficient market hypothesis, we can use, among others, distribution tests such as tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]; in other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome shortcomings of classical normality tests in the field of financial data, which are typical with occurrence of remote data points and additional types of deviations from
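The moment-based fragility described in this abstract can be illustrated with a short sketch (illustrative only; this is not the paper's proposed robust procedure): the classical Jarque-Bera statistic is built from sample skewness and kurtosis, so a single planted outlier inflates it by orders of magnitude.

```python
import numpy as np

def jarque_bera(x):
    """Classical Jarque-Bera statistic: JB = n/6 * (S^2 + (K-3)^2 / 4),
    where S is sample skewness and K is (non-excess) sample kurtosis."""
    x = np.asarray(x, dtype=float)
    n = x.size
    d = x - x.mean()
    s2 = np.mean(d**2)
    S = np.mean(d**3) / s2**1.5
    K = np.mean(d**4) / s2**2
    return n / 6.0 * (S**2 + (K - 3.0)**2 / 4.0)

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=500)   # IID "returns" under the null
jb_clean = jarque_bera(returns)

contaminated = returns.copy()
contaminated[0] = 0.25                      # one extreme return (a remote data point)
jb_outlier = jarque_bera(contaminated)

# The single outlier dominates the sample moments, so the statistic
# explodes: an instance of the zero breakdown value mentioned above.
print(jb_clean, jb_outlier)
```

Under the null the statistic is approximately chi-squared with 2 degrees of freedom, so the clean value is small while the contaminated one is enormous.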
Topological gravity with minimal matter
International Nuclear Information System (INIS)
Li Keke
1991-01-01
Topological minimal matter, obtained by twisting the minimal N = 2 superconformal field theory, is coupled to two-dimensional topological gravity. The free field formulation of the coupled system allows explicit representations of the BRST charge, physical operators and their correlation functions. The contact terms of the physical operators may be evaluated by extending the argument used in a recent solution of topological gravity without matter. The consistency of the contact terms in correlation functions implies recursion relations which coincide with the Virasoro constraints derived from the multi-matrix models. Topological gravity with minimal matter thus provides the field theoretic description for the multi-matrix models of two-dimensional quantum gravity. (orig.)
Minimizing waste in environmental restoration
International Nuclear Information System (INIS)
Moos, L.; Thuot, J.R.
1996-01-01
Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones, with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction at ANL restoration programs
On minimizers of causal variational principles
International Nuclear Information System (INIS)
Schiefeneder, Daniela
2011-01-01
Causal variational principles are a class of nonlinear minimization problems which arise in a formulation of relativistic quantum theory referred to as the fermionic projector approach. This thesis is devoted to a numerical and analytic study of the minimizers of a general class of causal variational principles. We begin with a numerical investigation of variational principles for the fermionic projector in discrete space-time. It is shown that for sufficiently many space-time points, the minimizing fermionic projector induces non-trivial causal relations on the space-time points. We then generalize the setting by introducing a class of causal variational principles for measures on a compact manifold. In our main result we prove under general assumptions that the support of a minimizing measure is either completely timelike, or it is singular in the sense that its interior is empty. In the examples of the circle, the sphere and certain flag manifolds, the general results are supplemented by a more detailed analysis of the minimizers. (orig.)
Boysen, Angela K; Heal, Katherine R; Carlson, Laura T; Ingalls, Anitra E
2018-01-16
The goal of metabolomics is to measure the entire range of small organic molecules in biological samples. In liquid chromatography-mass spectrometry-based metabolomics, formidable analytical challenges remain in removing the nonbiological factors that affect chromatographic peak areas. These factors include sample matrix-induced ion suppression, chromatographic quality, and analytical drift. The combination of these factors is referred to as obscuring variation. Some metabolomics samples can exhibit intense obscuring variation due to matrix-induced ion suppression, rendering large amounts of data unreliable and difficult to interpret. Existing normalization techniques have limited applicability to these sample types. Here we present a data normalization method to minimize the effects of obscuring variation. We normalize peak areas using a batch-specific normalization process, which matches measured metabolites with isotope-labeled internal standards that behave similarly during the analysis. This method, called best-matched internal standard (B-MIS) normalization, can be applied to targeted or untargeted metabolomics data sets and yields relative concentrations. We evaluate and demonstrate the utility of B-MIS normalization using marine environmental samples and laboratory grown cultures of phytoplankton. In untargeted analyses, B-MIS normalization allowed for inclusion of mass features in downstream analyses that would have been considered unreliable without normalization due to obscuring variation. B-MIS normalization for targeted or untargeted metabolomics is freely available at https://github.com/IngallsLabUW/B-MIS-normalization.
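The matching idea can be sketched in a few lines (a toy illustration, not the authors' released code; the synthetic data, variable names, and the exact relative-standard-deviation criterion are assumptions): for each feature, normalize by each internal standard in turn and keep the standard that minimizes variability across pooled replicate injections.

```python
import numpy as np

rng = np.random.default_rng(1)
drift = rng.lognormal(0.0, 0.3, size=8)                  # shared analytical drift over 8 pooled runs
int_stds = {"IS_matched": drift * 1e5,                   # internal standard that tracks the drift
            "IS_unrelated": rng.lognormal(11, 0.3, 8)}   # internal standard that does not
feature = drift * 2e4                                    # feature area affected by the same drift

def rsd(x):
    """Relative standard deviation of an array of peak areas."""
    return np.std(x) / np.mean(x)

def best_matched_is(feature_areas, standards):
    """For one feature, score each internal standard by the RSD of the
    normalized areas across pooled replicates; return the best match."""
    scores = {name: rsd(feature_areas / areas * np.mean(areas))
              for name, areas in standards.items()}
    return min(scores, key=scores.get), scores

choice, scores = best_matched_is(feature, int_stds)
print(choice)  # the drift-matched standard wins
```

Normalizing by the standard that co-varies with the drift removes the obscuring variation, so its RSD score is (near) zero, while the unrelated standard leaves the drift in place.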
DEFF Research Database (Denmark)
Tetens, Inge
claims in relation to silicon and protection against aluminium accumulation in the brain, cardiovascular health, forming a protective coat on the mucous membrane of the stomach, neutralisation of gastric acid, contribution to normal formation of collagen and connective tissue, maintenance of normal bone...
DEFF Research Database (Denmark)
Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov
2012-01-01
Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...
Lyplal1 is dispensable for normal fat deposition in mice
Directory of Open Access Journals (Sweden)
Rachel A. Watson
2017-12-01
Genome-wide association studies (GWAS) have detected association between variants in or near the Lysophospholipase-like 1 (LYPLAL1) locus and metabolic traits, including central obesity, fatty liver and waist-to-hip ratio. LYPLAL1 is also known to be upregulated in the adipose tissue of obese patients. However, the physiological role of LYPLAL1 is not understood. To investigate the function of Lyplal1 in vivo we investigated the phenotype of the Lyplal1^tm1a(KOMP)Wtsi homozygous mouse. Body composition was unaltered in Lyplal1 knockout mice as assessed by dual-energy X-ray absorptiometry (DEXA) scanning, both on normal chow and on a high-fat diet. Adipose tissue distribution between visceral and subcutaneous fat depots was unaltered, with no change in adipocyte cell size. The response to both insulin and glucose dosing was normal in Lyplal1^tm1a(KOMP)Wtsi homozygous mice, with normal fasting blood glucose concentrations. RNAseq analysis of liver, muscle and adipose tissue confirmed that Lyplal1 expression was ablated with minimal additional changes in gene expression. These results suggest that Lyplal1 is dispensable for normal mouse metabolic physiology and that, despite having been maintained through evolution, Lyplal1 is not an essential gene, suggesting possible functional redundancy. Further studies will be required to clarify its physiological role.
[Minimally invasive approach for cervical spondylotic radiculopathy].
Ding, Liang; Sun, Taicun; Huang, Yonghui
2010-01-01
To summarize the recent minimally invasive approaches for cervical spondylotic radiculopathy (CSR), the recent literature at home and abroad concerning minimally invasive approaches for CSR was reviewed and summarized. There are two techniques of minimally invasive approach for CSR at present: percutaneous puncture techniques and endoscopic techniques. The degenerated intervertebral disc can be resected or dissolved (nucleolysis) by percutaneous puncture techniques if CSR is caused by mild or moderate intervertebral disc herniation. Cervical microendoscopic discectomy and foraminotomy is an effective minimally invasive approach which can provide a clear view. The endoscopic techniques are suitable to treat CSR caused by foraminal osteophytes, lateral disc herniations, local ligamentum flavum thickening and spondylotic foraminal stenosis. The minimally invasive procedures have the advantages of simple handling, minimal invasiveness, and a low incidence of complications. But the scope of indications is relatively narrow at present.
Molten salt treatment to minimize and optimize waste
International Nuclear Information System (INIS)
Gat, U.; Crosley, S.M.; Gay, R.L.
1993-01-01
A combination molten salt oxidizer (MSO) and molten salt reactor (MSR) is described for treatment of waste. The MSO is proposed for contained oxidation of organic hazardous waste and for reduction of the mass and volume of dilute waste by evaporation of the water. The MSO residue is to be treated to optimize the waste in terms of its composition, chemical form, mixture, concentration, encapsulation, shape, size, and configuration. Accumulation and storage are minimized, and shipments are sized for low risk. Actinides, fissile material, and long-lived isotopes are separated and completely burned or transmuted in an MSR. The MSR requires no fuel element fabrication and accepts the materials as salts in arbitrarily small quantities, enhancing safety, security, and overall acceptability
Hornschemeier, A. E.; Heckman, T. M.; Ptak, A. F.; Tremonti, C. A.; Colbert, E. J. M.
2005-01-01
We have cross-correlated X-ray catalogs derived from archival Chandra X-Ray Observatory ACIS observations with a Sloan Digital Sky Survey Data Release 2 (DR2) galaxy catalog to form a sample of 42 serendipitously X-ray-detected galaxies over the redshift interval 0.03
Minimal Invasive Circumferential Management of Thoracolumbar Spine Fractures
Directory of Open Access Journals (Sweden)
S. Pesenti
2015-01-01
Introduction. While thoracolumbar fractures are common lesions, no strong consensus on their management is available at the moment. Objectives. The aim of this study was to evaluate the results of a minimally invasive strategy using percutaneous instrumentation and an anterior approach in the management of unstable thoracolumbar fractures. Methods. 39 patients were included in this retrospective study. Radiologic evaluation was based on vertebral and regional kyphosis, vertebral body height restoration, and fusion rate. Clinical evaluation was based on the Visual Analog Scale (VAS). All evaluations were done preoperatively and at 1-year follow-up. Results. Both vertebral and regional kyphosis were significantly improved on postoperative evaluation (13° and 7° versus −1° and −9°, P<0.05, resp.), as was vertebral body height (0.92 versus 1.16, P<0.05). At 1-year follow-up, mean loss of correction was 1°. A solid fusion was visible in all cases, and mean VAS was significantly reduced from 8/10 preoperatively to 1/10 at the last follow-up. Conclusion. Management of thoracolumbar fractures using percutaneous osteosynthesis and a minimally invasive anterior approach (telescopic vertebral body prosthesis) is a valuable strategy, offering satisfactory results that are stable over time.
Guidelines for mixed waste minimization
International Nuclear Information System (INIS)
Owens, C.
1992-02-01
Currently, there is no commercial mixed waste disposal available in the United States. Storage and treatment options for commercial mixed waste are limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization
Identity Work at a Normal University in Shanghai
Cockain, Alex
2016-01-01
Based upon ethnographic research, this article explores undergraduate students' experiences at a normal university in Shanghai focusing on the types of identities and forms of sociality emerging therein. Although students' symptoms of disappointment seem to indicate the power of university experiences to extinguish purposeful action, this article…
Minimal string theories and integrable hierarchies
Iyer, Ramakrishnan
Well-defined, non-perturbative formulations of the physics of string theories in specific minimal or superminimal model backgrounds can be obtained by solving matrix models in the double scaling limit. They provide us with the first examples of completely solvable string theories. Despite being relatively simple compared to higher dimensional critical string theories, they furnish non-perturbative descriptions of interesting physical phenomena such as geometrical transitions between D-branes and fluxes, tachyon condensation and holography. The physics of these theories in the minimal model backgrounds is succinctly encoded in a non-linear differential equation known as the string equation, along with an associated hierarchy of integrable partial differential equations (PDEs). The bosonic string in (2,2m-1) conformal minimal model backgrounds and the type 0A string in (2,4m) superconformal minimal model backgrounds have the Korteweg-de Vries system, while type 0B in (2,4m) backgrounds has the Zakharov-Shabat system. The integrable PDE hierarchy governs flows between backgrounds with different m. In this thesis, we explore this interesting connection between minimal string theories and integrable hierarchies further. We uncover the remarkable role that an infinite hierarchy of non-linear differential equations plays in organizing and connecting certain minimal string theories non-perturbatively. We are able to embed the type 0A and 0B (A,A) minimal string theories into this single framework. The string theories arise as special limits of a rich system of equations underpinned by an integrable system known as the dispersive water wave hierarchy. We find that there are several other string-like limits of the system, and conjecture that some of them are type IIA and IIB (A,D) minimal string backgrounds. We explain how these and several other string-like special points arise and are connected. In some cases, the framework endows the theories with a non
Waste minimization at Chalk River Laboratories
Energy Technology Data Exchange (ETDEWEB)
Kranz, P.; Wong, P.C.F. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)
2011-07-01
Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. The overall refuse volume generated at
Heavy meson form factors from QCD
International Nuclear Information System (INIS)
Falk, A.F.; Georgi, H.; Grinstein, B.
1990-01-01
We calculate the leading QCD radiative corrections to the relations which follow from the decoupling of the heavy quark spin as the quark mass goes to infinity and from the symmetry between systems with different heavy quarks. One of the effects we calculate gives the leading q^2-dependence of the form factor of a heavy quark, which in turn dominates the q^2-dependence of the form factors of bound states of the heavy quark with light quarks. This, combined with the normalization of the form factor provided by symmetry, gives us a first-principles calculation of the heavy meson (or baryon) form factors in the limit of very large heavy quark mass. (orig.)
International Nuclear Information System (INIS)
Kiritsis, E.B.
1987-01-01
N = 2 superconformal-invariant theories are studied and their general structure is analyzed. The geometry of N = 2 complex superspace is developed as a tool to study the correlation functions of the theories above. The Ward identities of the global N = 2 superconformal symmetry are solved, to restrict the form of correlation functions. Advantage is taken of the existence of the degenerate operators to derive the ''fusion'' rules for the unitary minimal systems with c < 1. In particular, the closure of the operator algebra for such systems is shown. The c = 1/3 minimal system is analyzed and its two-, three-, and four-point functions as well as its operator algebra are calculated explicitly
Wilson expansion in the minimal subtraction scheme
International Nuclear Information System (INIS)
Smirnov, V.A.
1989-01-01
The small-distance expansion of the product of composite fields is constructed for an arbitrary renormalization procedure of the minimal subtraction scheme type. Coefficient functions of the expansion are expressed explicitly through the Green functions of composite fields. The expansion has an explicitly finite form: the ultraviolet (UV) divergences of the coefficient functions and composite fields are removed by the initial renormalization procedure, while the infrared (IR) divergences in massless diagrams with a nonvanishing contribution to the coefficient functions are removed by the R̃-operation, which is the IR part of the R-operation. The latter is the generalization of dimensional renormalization to the case when both UV and IR divergences are present. To derive the expansion, a ''pre-subtracting operator'' is introduced and formulas of the counter-term technique are exploited
Optimal blood glucose level control using dynamic programming based on minimal Bergman model
Rettian Anggita Sari, Maria; Hartono
2018-03-01
The purpose of this article is to simulate the glucose dynamics and insulin kinetics of a diabetic patient. The model used in this research is the non-linear minimal Bergman model. Optimal control theory is then applied to formulate the problem in order to determine the optimal dose of insulin in the treatment of diabetes mellitus, such that the glucose level stays in the normal range over a specific time interval. The optimization problem is solved using dynamic programming. The result shows that dynamic programming is quite reliable in representing the interaction between glucose and insulin levels in a diabetes mellitus patient.
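The glucose-insulin interaction in the Bergman minimal model can be sketched with a forward-Euler simulation (a sketch only: the parameter values are typical literature magnitudes, not taken from this article, and the article's dynamic-programming controller is replaced here by a constant infusion).

```python
# Bergman minimal model:
#   dG/dt = -(p1 + X) G + p1 Gb        (plasma glucose)
#   dX/dt = -p2 X + p3 (I - Ib)        (remote insulin action)
#   dI/dt = -n (I - Ib) + u(t)         (plasma insulin with infusion input u)
# Illustrative parameters (assumed, not from the paper); units are per minute.
p1, p2, p3 = 0.028, 0.025, 1.3e-5
n_ins, Gb, Ib = 0.09, 81.0, 7.0   # insulin decay rate, basal glucose/insulin

def simulate(u, G0=250.0, X0=0.0, I0=7.0, dt=1.0, T=300):
    """Forward-Euler integration; u is the infusion rate as a function of t."""
    G, X, I = G0, X0, I0
    traj = []
    for k in range(int(T / dt)):
        t = k * dt
        dG = -(p1 + X) * G + p1 * Gb
        dX = -p2 * X + p3 * (I - Ib)
        dI = -n_ins * (I - Ib) + u(t)
        G, X, I = G + dt * dG, X + dt * dX, I + dt * dI
        traj.append(G)
    return traj

# A constant insulin infusion drives the initial hyperglycaemia (250 mg/dl)
# back toward the basal level over the 300-minute horizon.
glucose = simulate(u=lambda t: 0.5)
print(glucose[0], glucose[-1])
```

In the optimal-control setting, dynamic programming would choose u at each step to trade off deviation from the normal glucose range against the insulin dose; the fixed infusion above only illustrates the plant dynamics being controlled.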
Generation of Strategies for Environmental Deception in Two-Player Normal-Form Games
2015-06-18
found in the literature is presented by Kohlberg and Mertens [23]. A stable equilibrium by their definition is an equilibrium in an extensive-form...the equilibrium in this state provides them with an increased payoff. While interesting, Kohlberg and Mertens' definition of equilibrium...stability used by Kohlberg and Mertens. Arsham's work focuses on determining the amount by which a mixed-strategy Nash equilibrium's payoff values can
Non-minimal inflation revisited
International Nuclear Information System (INIS)
Nozari, Kourosh; Shafizadeh, Somayeh
2010-01-01
We reconsider an inflationary model in which the inflaton field is non-minimally coupled to gravity. We study the parameter space of the model up to the second (and in some cases third) order of the slow-roll parameters. We calculate inflation parameters in both the Jordan and Einstein frames, and the results are compared between these two frames and also with observations. Using the recent observational data from combined WMAP5+SDSS+SNIa datasets, we study the constraints imposed on our model parameters, especially the non-minimal coupling ξ.
Minimal quantization and confinement
International Nuclear Information System (INIS)
Ilieva, N.P.; Kalinowskij, Yu.L.; Nguyen Suan Han; Pervushin, V.N.
1987-01-01
A ''minimal'' version of the Hamiltonian quantization based on the explicit solution of the Gauss equation and on the gauge-invariance principle is considered. By the example of the one-particle Green function we show that the requirement of gauge invariance leads to relativistic covariance of the theory and to a more proper definition of the Faddeev-Popov integral that does not depend on the gauge choice. The ''minimal'' quantization is applied to consider the gauge-ambiguity problem and a new topological mechanism of confinement
Market clearing of joint energy and reserves auctions using augmented payment minimization
International Nuclear Information System (INIS)
Amjady, N.; Aghaei, J.; Shayanfar, H.A.
2009-01-01
This paper presents the market clearing of joint energy and reserves auctions and its mathematical formulation, focusing on a possible implementation of Payment Cost Minimization (PCM). It also discusses another key point in the debate: whether the market clearing algorithm should minimize offer costs or payment costs. An aggregated simultaneous market clearing approach is proposed for the provision of ancillary services as well as energy, formulated as a Mixed Integer Nonlinear Program (MINLP). In this formulation, the objective function (payment cost or offer cost) is optimized while meeting AC power flow constraints, system reserve requirements and lost opportunity cost (LOC) considerations. The model is applied to the IEEE 24-bus Reliability Test System (IEEE 24-bus RTS), and simulation studies are carried out to examine the effectiveness of each objective function. (author)
Normal parenchymal enhancement patterns in women undergoing MR screening of the breast
International Nuclear Information System (INIS)
Jansen, Sanaz A.; Lin, Vicky C.; Giger, Maryellen L.; Li, Hui; Karczmar, Gregory S.; Newstead, Gillian M.
2011-01-01
To characterize the kinetic and morphological presentation of normal breast tissue on DCE-MRI in a large cohort of asymptomatic women, and to relate these characteristics to breast tissue density. 335 consecutive breast MR examinations in 229 asymptomatic women undergoing high-risk screening evaluations based on recommendations from the American Cancer Society, including strong family history and genetic predisposition, were selected for IRB-approved review (average age 49.2 ± 10.5 years). Breast tissue density was assessed on precontrast T2-weighted images. Parenchymal enhancement pattern (PEP) was qualitatively classified as minimal, homogeneous, heterogeneous or nodular. Quantitative analysis of parenchymal enhancement kinetics (PEK) was performed, including calculation of initial and peak enhancement percentages (E1, Epeak), the time to peak enhancement (Tpeak) and the signal enhancement ratio (SER). 41.8% of examinations were classified as minimal, 13.7% homogeneous, 23.9% heterogeneous and 21.2% nodular PEP. Women with heterogeneously or extremely dense breasts exhibited a higher proportion of nodular PEP (44.2% (27/61)) and significantly higher E1 and Epeak (p < 0.003) compared with those with less dense breasts. Qualitative and quantitative parenchymal enhancement characteristics vary by breast tissue density. In future work, the association between image-derived MR features of the normal breast and breast cancer risk should be explored. (orig.)
Corticocortical feedback increases the spatial extent of normalization.
Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T
2014-01-01
Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
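The divisive normalization account of area summation can be sketched with a ratio-of-Gaussians model in which a narrow driving field is divided by a broader normalization pool; shrinking the pool's spatial extent then weakens surround suppression, qualitatively matching the feedback-inactivation result. All widths and gains below are illustrative, not fitted values from the study.

```python
import numpy as np

def response(radius, w_center=1.0, w_surround=4.0, k_e=1.0, k_s=0.6, sigma=0.1):
    """Ratio-of-Gaussians: driving input over (sigma + normalization pool)."""
    x = np.linspace(-10.0, 10.0, 2001)
    dx = x[1] - x[0]
    disc = np.abs(x) <= radius          # 1-D "stimulus" of the given radius
    drive = k_e * np.sum(np.exp(-(x / w_center) ** 2)[disc]) * dx
    pool = k_s * np.sum(np.exp(-(x / w_surround) ** 2)[disc]) * dx
    return drive / (sigma + pool)

radii = np.linspace(0.1, 10.0, 50)
intact = np.array([response(r) for r in radii])                  # feedback intact
inactivated = np.array([response(r, w_surround=2.0) for r in radii])  # narrower pool

def suppression_index(curve):
    """Fractional response drop from the peak to the largest stimulus."""
    return (curve.max() - curve[-1]) / curve.max()
```

With the narrower normalization pool the area-summation curve still peaks and declines, but the suppression index is smaller, i.e. surround suppression is reduced.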
International Nuclear Information System (INIS)
Shao, Kan; Gift, Jeffrey S.; Setzer, R. Woodrow
2013-01-01
Continuous responses (e.g. body weight) are widely used in risk assessment for determining the benchmark dose (BMD) which is used to derive a U.S. EPA reference dose. One critical question that is not often addressed in dose–response assessments is whether to model the continuous data as normally or log-normally distributed. Additionally, if lognormality is assumed, and only summarized response data (i.e., mean ± standard deviation) are available as is usual in the peer-reviewed literature, the BMD can only be approximated. In this study, using the “hybrid” method and relative deviation approach, we first evaluate six representative continuous dose–response datasets reporting individual animal responses to investigate the impact on BMD/BMDL estimates of (1) the distribution assumption and (2) the use of summarized versus individual animal data when a log-normal distribution is assumed. We also conduct simulation studies evaluating model fits to various known distributions to investigate whether the distribution assumption has influence on BMD/BMDL estimates. Our results indicate that BMDs estimated using the hybrid method are more sensitive to the distribution assumption than counterpart BMDs estimated using the relative deviation approach. The choice of distribution assumption has limited impact on the BMD/BMDL estimates when the within dose-group variance is small, while the lognormality assumption is a better choice for relative deviation method when data are more skewed because of its appropriateness in describing the relationship between mean and standard deviation. Additionally, the results suggest that the use of summarized data versus individual response data to characterize log-normal distributions has minimal impact on BMD estimates. - Highlights: • We investigate to what extent the distribution assumption can affect BMD estimates. • Both real data analysis and simulation study are conducted. • BMDs estimated using hybrid method are more
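The "hybrid" method mentioned above can be sketched for a normally distributed endpoint: an abnormal response is defined by a cutoff relative to the control distribution, and the BMD is the dose at which the extra risk of abnormality reaches the benchmark response (BMR). The linear dose-response model and all parameter values below are illustrative assumptions, not data from the study.

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def extra_risk(d, b0=100.0, b1=-2.0, sd=5.0):
    """Extra risk of falling more than 2 SD below the control mean."""
    cutoff = b0 - 2.0 * sd                    # abnormality cutoff
    p = Phi((cutoff - (b0 + b1 * d)) / sd)    # P(abnormal) at dose d
    p0 = Phi(-2.0)                            # background abnormality rate
    return (p - p0) / (1.0 - p0)

def bmd(bmr=0.10, lo=0.0, hi=50.0, tol=1e-8):
    """Bisection for the dose where extra risk equals the BMR
    (extra_risk is monotone in d for this decreasing linear model)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if extra_risk(mid) < bmr else (lo, mid)
    return 0.5 * (lo + hi)
```

Swapping the normal CDF for a log-normal one in `extra_risk` is exactly the distribution-assumption choice the abstract examines.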
Null-polygonal minimal surfaces in AdS4 from perturbed W minimal models
International Nuclear Information System (INIS)
Hatsuda, Yasuyuki; Ito, Katsushi; Satoh, Yuji
2012-11-01
We study the null-polygonal minimal surfaces in AdS4, which correspond to the gluon scattering amplitudes/Wilson loops in N=4 super Yang-Mills theory at strong coupling. The area of the minimal surfaces with n cusps is characterized by the thermodynamic Bethe ansatz (TBA) integral equations or the Y-system of the homogeneous sine-Gordon model, which is regarded as the SU(n-4)_4/U(1)^(n-5) generalized parafermion theory perturbed by the weight-zero adjoint operators. Based on the relation to the TBA systems of the perturbed W minimal models, we solve the TBA equations by using conformal perturbation theory, and obtain the analytic expansion of the remainder function around the UV/regular-polygonal limit for n = 6 and 7. We compare the rescaled remainder function for n = 6 with the two-loop one, observing that they are close to each other, similarly to the AdS3 case.
Piskun, Caroline M; Stein, Timothy J
2016-06-01
Canine osteosarcoma (OS) is an aggressive malignancy associated with poor outcomes. Therapeutic improvements are likely to develop from an improved understanding of signalling pathways contributing to OS development and progression. The Wnt signalling pathway is of interest for its role in osteoblast differentiation, its dysregulation in numerous cancer types, and the relative frequency of cytoplasmic accumulation of β-catenin in canine OS. This study aimed to determine the biological impact of inhibiting canonical Wnt signalling in canine OS, by utilizing either β-catenin siRNA or a dominant-negative T-cell factor (TCF) construct. There were no consistent, significant changes in cell line behaviour with either method compared to parental cell lines. Interestingly, β-catenin transcriptional activity was three-fold higher in normal canine primary osteoblasts compared to canine OS cell lines. These results suggest canonical Wnt signalling is minimally active in canine OS and its targeted inhibition is not a relevant therapeutic strategy. © 2013 John Wiley & Sons Ltd.
Directory of Open Access Journals (Sweden)
Natália Alves Barbosa
2015-08-01
Full Text Available Storing processed food products can cause alterations in their chemical composition. Thus, the objective of this study was to evaluate carotenoid retention in the kernels of minimally processed normal and vitamin A precursor (proVA) biofortified green corn ears that were packaged in polystyrene trays covered with commercial film or in multilayered polynylon packaging material and then stored. Throughout the storage period, the carotenoids were extracted from the corn kernels using organic solvents and quantified using HPLC. A complete factorial design including three factors (cultivar, packaging and storage period) was applied for the analysis. The green kernels of maize cultivars BRS1030 and BRS4104 exhibited similar carotenoid profiles, with zeaxanthin being the main carotenoid. Higher concentrations of the carotenoids lutein, β-cryptoxanthin, and β-carotene, of the total carotenoids and of the total vitamin A precursor carotenoids were detected in the green kernels of the biofortified BRS4104 maize. The packaging method did not affect carotenoid retention in the kernels of minimally processed green corn ears during the storage period.
Cooperative Content Distribution over Wireless Networks for Energy and Delay Minimization
Atat, Rachad
2012-06-01
Content distribution with mobile-to-mobile cooperation is studied. Data is sent to mobile terminals on a long-range link, and the terminals then exchange the content using an appropriate short-range wireless technology. Unicasting and multicasting are investigated on both the long-range and short-range links. Energy minimization is formulated as an optimization problem for each scenario, and the optimal solutions are determined in closed form. Moreover, the schemes are applied in public safety vehicular networks, where a Long Term Evolution (LTE) network is used for the long-range link, while IEEE 802.11p is considered for inter-vehicle collaboration on the short-range links. Finally, relay-based multicasting is applied in high-speed trains for energy and delay minimization. Results show that cooperative schemes outperform non-cooperative ones and other previous related work in terms of energy and delay savings. Furthermore, practical implementation aspects of the proposed methods are discussed.
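The energy advantage of cooperation can be sketched with a back-of-the-envelope model: either every terminal downloads the content over the long-range link, or one long-range multicast is followed by short-range exchanges among the terminals. The energy-per-bit figures are illustrative assumptions, not the closed-form solutions of the paper.

```python
# Energy per bit: long-range (e.g. cellular) vs short-range (e.g. 802.11p).
# Both values are invented for illustration.
def energy_no_coop(n, bits, e_long=5e-7):
    """Every one of n terminals fetches the content on the long-range link."""
    return n * bits * e_long

def energy_coop(n, bits, e_long=5e-7, e_short=5e-8):
    """One long-range multicast plus (n - 1) short-range forwards."""
    return bits * e_long + (n - 1) * bits * e_short
```

Whenever the short-range link is cheaper per bit than the long-range one, the cooperative scheme wins for n > 1, which is the qualitative result the abstract reports.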
Peck, Blake; Lillibridge, Jennifer
2005-03-01
This article reports findings from a larger qualitative study conducted to gain insight into the experience of fathers living with their chronically-ill children in rural Victoria, Australia. Data were collected via unstructured interviews with four fathers. The findings presented in this article explore the phenomena of normalization for fathers within the chronic illness experience. Fathers described normalizing the experience of living with their chronically-ill child as involving a combination of various coping strategies and behaviours including: (1) accepting the child's condition, (2) changing expectations, (3) focusing energies on a day-to-day basis, (4) minimizing knowledge-seeking behaviours, and (5) engaging in external distraction activities. Findings highlight the complex and unique normalization strategies these men utilized and contribute to knowledge and understanding of the complex nature of raising a chronically-ill child in rural Australia and provide a sound basis upon which to guide an ongoing and holistic assessment of fathers with chronically-ill children.
A survey on classical minimal surface theory
Meeks, William H
2012-01-01
Meeks and Pérez present a survey of recent spectacular successes in classical minimal surface theory. The classification of minimal planar domains in three-dimensional Euclidean space provides the focus of the account. The proof of the classification depends on the work of many currently active leading mathematicians, thus making contact with many of the most important results in the field. Through the telling of the story of the classification of minimal planar domains, the general mathematician may catch a glimpse of the intrinsic beauty of this theory and the authors' perspective of what is happening at this historical moment in a very classical subject. This book includes an updated tour through some of the recent advances in the theory, such as Colding-Minicozzi theory, minimal laminations, the ordering theorem for the space of ends, conformal structure of minimal surfaces, minimal annular ends with infinite total curvature, the embedded Calabi-Yau problem, local pictures on the scale of curvature and t...
Minimalism and Speakers’ Intuitions
Directory of Open Access Journals (Sweden)
Matías Gariazzo
2011-08-01
Full Text Available Minimalism proposes a semantics that does not account for speakers’ intuitions about the truth conditions of a range of sentences or utterances. Thus, a challenge for this view is to offer an explanation of how its assignment of semantic contents to these sentences is grounded in their use. Such an account was mainly offered by Soames, but also suggested by Cappelen and Lepore. The article criticizes this explanation by presenting four kinds of counterexamples to it, and arrives at the conclusion that minimalism has not successfully answered the above-mentioned challenge.
The re-emergence of the minimal running shoe.
Davis, Irene S
2014-10-01
The running shoe has gone through significant changes since its inception. The purpose of this paper is to review these changes, the majority of which have occurred over the past 50 years. Running footwear began as very minimal, then evolved to become highly cushioned and supportive. However, over the past 5 years, there has been a reversal of this trend, with runners seeking more minimal shoes that allow their feet more natural motion. This abrupt shift toward footwear without cushioning and support has led to reports of injuries associated with minimal footwear. In response to this, the running footwear industry shifted again toward the development of lightweight, partial minimal shoes that offer some support and cushioning. In this paper, studies comparing the mechanics between running in minimal, partial minimal, and traditional shoes are reviewed. The implications for injuries in all 3 conditions are examined. The use of minimal footwear in other populations besides runners is discussed. Finally, areas for future research into minimal footwear are suggested.
Normal modes and continuous spectra
International Nuclear Information System (INIS)
Balmforth, N.J.; Morrison, P.J.
1994-12-01
The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems.
Improving the scaling normalization for high-density oligonucleotide GeneChip expression microarrays
Directory of Open Access Journals (Sweden)
Lu Chao
2004-07-01
Full Text Available Abstract Background Normalization is an important step for microarray data analysis to minimize biological and technical variations. Choosing a suitable approach can be critical. The default method in GeneChip expression microarray uses a constant factor, the scaling factor (SF, for every gene on an array. The SF is obtained from a trimmed average signal of the array after excluding the 2% of the probe sets with the highest and the lowest values. Results Among the 76 U34A GeneChip experiments, the total signals on each array showed 25.8% variation in terms of the coefficient of variation, although all microarrays were hybridized with the same amount of biotin-labeled cRNA. The 2% of the probe sets with the highest signals that were normally excluded from the SF calculation accounted for 34% to 54% of the total signals (40.7% ± 4.4%, mean ± sd). In comparison with normalization factors obtained from the median signal or from the mean of the log-transformed signal, the SF showed the greatest variation. The normalization factors obtained from log-transformed signals showed the least variation. Conclusions Eliminating 40% of the signal data during SF calculation failed to show any benefit. Normalization factors obtained with log-transformed signals performed the best. Thus, we suggest using the mean of the log-transformed data for normalization, rather than the arithmetic mean of signals, in GeneChip gene expression microarrays.
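The two normalization factors compared above can be sketched in a few lines: the GeneChip-style scaling factor from a 2%-trimmed mean, and a factor based on the mean of log-transformed signals. The target intensity and the simulated log-normal signals are illustrative assumptions, not the U34A data.

```python
import numpy as np

rng = np.random.default_rng(0)
target = 500.0  # illustrative target intensity

def scaling_factor(signals, trim=0.02):
    """Trimmed mean: drop the top and bottom 2% of probe-set signals."""
    s = np.sort(signals)
    k = int(len(s) * trim)
    return target / s[k:len(s) - k].mean()

def log_factor(signals):
    """Factor derived from the mean of log2-transformed signals
    (equivalently, from the geometric mean)."""
    return target / 2.0 ** np.log2(signals).mean()

# A skewed (log-normal) signal distribution, as is typical of arrays:
arr = rng.lognormal(mean=6.0, sigma=1.5, size=10_000)
sf, lf = scaling_factor(arr), log_factor(arr)
```

For skewed data the trimmed arithmetic mean still exceeds the geometric mean, so the two factors differ substantially; the abstract's point is that the log-based factor is the more stable one across arrays.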
Simulation and Verification of Form Filling with Self-Compacting Concrete
DEFF Research Database (Denmark)
Thrane, Lars Nyholm
2005-01-01
This paper presents a form filling experiment and the corresponding 3D simulation. One side of the form is made of a transparent acrylic plate and to improve the visual observations of the flow behaviour, the first and second half of the form is cast with normal grey and red-pigmented SCC, respec...
10 CFR 20.1406 - Minimization of contamination.
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Minimization of contamination. 20.1406 Section 20.1406... License Termination § 20.1406 Minimization of contamination. (a) Applicants for licenses, other than early... procedures for operation will minimize, to the extent practicable, contamination of the facility and the...
Optimization of Control Self Assessment Application to Minimize Fraud
Directory of Open Access Journals (Sweden)
Wendy Endrianto
2016-05-01
Full Text Available This article discussed a method that a company can use to minimize fraud by applying Control Self Assessment (CSA). The study was conducted through a literature review of the topics discussed, presented descriptively and systematically, proceeding step by step from the initial problem to its solution. It can be concluded that CSA is a form of auditing practice that emphasizes anticipatory (preventive) action over detection (detective) measures, in line with the concept of modern internal audit, which is more precise in application. It is one of the most efficient and effective alternatives for reducing fraud.
A minimal architecture for joint action
DEFF Research Database (Denmark)
Vesper, Cordula; Butterfill, Stephen; Knoblich, Günther
2010-01-01
What kinds of processes and representations make joint action possible? In this paper we suggest a minimal architecture for joint action that focuses on representations, action monitoring and action prediction processes, as well as ways of simplifying coordination. The architecture spells out...... minimal requirements for an individual agent to engage in a joint action. We discuss existing evidence in support of the architecture as well as open questions that remain to be empirically addressed. In addition, we suggest possible interfaces between the minimal architecture and other approaches...... to joint action. The minimal architecture has implications for theorizing about the emergence of joint action, for human-machine interaction, and for understanding how coordination can be facilitated by exploiting relations between multiple agents’ actions and between actions and the environment....
DEFF Research Database (Denmark)
Foadi, Roshan; Frandsen, Mads Toudal; A. Ryttov, T.
2007-01-01
Different theoretical and phenomenological aspects of the Minimal and Nonminimal Walking Technicolor theories have recently been studied. The goal here is to make the models ready for collider phenomenology. We do this by constructing the low energy effective theory containing scalars......, pseudoscalars, vector mesons and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, opportunely modified to take into account the walking behavior of the underlying gauge theory, we find...... interesting relations for the spin-one spectrum. We derive the electroweak parameters using the newly constructed effective theory and compare the results with the underlying gauge theory. Our analysis is sufficiently general such that the resulting model can be used to represent a generic walking technicolor...
Minimal Enteral Nutrition to Improve Adaptation After Intestinal Resection in Piglets and Infants
DEFF Research Database (Denmark)
Aunsholt, Lise; Qvist, Niels; Sangild, Per Torp
2018-01-01
BACKGROUND: Minimal enteral nutrition (MEN) may induce a diet-dependent stimulation of gut adaptation following intestinal resection. Bovine colostrum is rich in growth factors, and we hypothesized that MEN with colostrum would stimulate intestinal adaptation, compared with formula, and would...... be well tolerated in patients with short bowel syndrome. METHODS: In experiment 1, 3-day-old piglets with 50% distal small intestinal resection were fed parenteral nutrition (PN, n = 10) or PN plus MEN given as either colostrum (PN-COL, n = 5) or formula (PN-FORM, n = 9) for 7 days. Intestinal nutrient......, enteral colostrum supplementation was well tolerated, and no infants developed clinical signs of cow's milk allergy. CONCLUSION: Minimal enteral nutrition feeding with bovine colostrum and formula induced similar intestinal adaptation after resection in piglets. Colostrum was well tolerated by newly...
Power Minimization techniques for Networked Data Centers
International Nuclear Information System (INIS)
Low, Steven; Tang, Kevin
2011-01-01
Our objective is to develop a mathematical model to optimize energy consumption at multiple levels in networked data centers, and to develop abstract algorithms that not only optimize individual servers but also coordinate the energy consumption of clusters of servers within a data center and across geographically distributed data centers, to minimize the overall energy cost and brown-energy consumption of an enterprise. In this project, we have formulated a variety of optimization models, some stochastic, others deterministic, and have obtained a variety of qualitative results on the structural properties, robustness, and scalability of the optimal policies. We have also systematically derived from these models decentralized algorithms to optimize energy efficiency, and analyzed their optimality and stability properties. Finally, we have conducted preliminary numerical simulations to illustrate the behavior of these algorithms. We draw the following conclusions. First, there is a substantial opportunity to minimize both the amount and the cost of electricity consumption in a network of data centers, by exploiting the fact that traffic load, electricity cost, and availability of renewable generation fluctuate over time and across geographical locations. Judiciously matching these stochastic processes can optimize the tradeoff between brown energy consumption, electricity cost, and response time. Second, given the stochastic nature of these three processes, real-time dynamic feedback should form the core of any optimization strategy. The key is to develop decentralized algorithms that can be implemented at different parts of the network as simple, local algorithms that coordinate through asynchronous message passing.
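A stripped-down version of the geographical cost-minimization idea: with linear electricity prices and capacity limits, and ignoring response time and renewable availability, routing load to data centers in price order is optimal. The prices and capacities below are invented for illustration.

```python
# Toy geographical load balancing: ($/unit of load, capacity) per data center.
centers = {"dc_east": (0.12, 400.0),
           "dc_west": (0.08, 300.0),
           "dc_eu":   (0.10, 500.0)}

def allocate(load):
    """Greedy fill in price order; optimal for this linear, capacitated toy."""
    alloc, cost = {}, 0.0
    for name, (price, cap) in sorted(centers.items(), key=lambda kv: kv[1][0]):
        q = min(cap, load)
        alloc[name], load, cost = q, load - q, cost + q * price
    if load > 1e-9:
        raise ValueError("total capacity exceeded")
    return alloc, cost

alloc, cost = allocate(900.0)
```

In the project's setting, prices and loads fluctuate over time, so this static allocation would be recomputed continually, ideally by decentralized algorithms rather than a central solver.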
Minimal Marking: A Success Story
Directory of Open Access Journals (Sweden)
Anne McNeilly
2014-11-01
Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983), which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.
Miniature, minimally invasive, tunable endoscope for investigation of the middle ear.
Pawlowski, Michal E; Shrestha, Sebina; Park, Jesung; Applegate, Brian E; Oghalai, John S; Tkaczyk, Tomasz S
2015-06-01
We demonstrate a miniature, tunable, minimally invasive endoscope for diagnosis of the auditory system. The probe is designed to sharply image anatomical details of the middle ear without the need for physically adjusting the position of the distal end of the endoscope. This is achieved through the addition of an electrowetted, tunable, electronically controlled lens to the optical train. Morphological imaging is enabled by scanning light emanating from an optical coherence tomography system. System performance was demonstrated by imaging part of the ossicular chain and the wall of the middle ear cavity of a normal mouse. During the experiment, we electronically moved the plane of best focus from the incudo-stapedial joint to the stapedial artery. Repositioning the object plane allowed us to image anatomical details of the middle ear beyond the depth of field of a static optical system. We also demonstrated, for the first time to the best of our knowledge, that an optical system with an electrowetted tunable lens may be successfully employed to measure sound-induced vibrations within the auditory system, by measuring the vibratory amplitude of the tympanic membrane in a normal mouse in response to pure tone stimuli.
Theories of minimalism in architecture: Post scriptum
Directory of Open Access Journals (Sweden)
Stevanović Vladimir
2012-01-01
Full Text Available Owing to a period of intensive development in the last decade of the XX century, the architectural phenomenon called Minimalism in Architecture came to be remembered as the Style of the Nineties, characterized, morphologically speaking, by simplicity and formal reduction. Simultaneously with its development in practice, several dominant interpretative models were able to establish themselves on the theoretical level. The new millennium and time distance bring new problems; this paper therefore discusses specific theorizations of Minimalism in Architecture that can bear the designation post scriptum, because their development began after the constitutional period of the architectural minimalist discourse. In XXI century theories, the problem of defining minimalism remains an important topic, which theorists approach by resolving along the axis: Modernism - Minimal Art - Postmodernism - Minimalism in Architecture. With regard to this, the analyzed texts can be categorized in two groups: (1) texts of an affirmative nature and historical-associative approach, in which minimalism is identified, in an idealizing manner, with anything that is simple and reduced, relying mostly on existing hypotheses; (2) critically oriented texts, in which the authors reconsider the adequacy of the very term 'minimalism' in the context of architecture and take a metacritical attitude towards previous texts.
Directory of Open Access Journals (Sweden)
Katherine Ruth Gordon
2016-09-01
Full Text Available Research on word learning has focused on children’s ability to identify a target object when given the word form after a minimal number of exposures to novel word-object pairings. However, relatively little research has focused on children’s ability to retrieve the word form when given the target object. The exceptions involve asking children to recall and produce forms, and children typically perform near floor on these measures. In the current study, 3- to 5-year-old children were administered a novel test of word form that allowed for recognition memory and manual responses. Specifically, when asked to label a previously trained object, children were given three forms to choose from: the target, a minimally different form, and a maximally different form. Children demonstrated memory for word forms at three post-training delays: 10 minutes (short-term), 2 to 3 days (long-term), and 6 months to 1 year (very long-term). However, children performed worse at the very long-term delay than at the other time points, and the length of the very long-term delay was negatively related to performance. When in error, children were no more likely to select the minimally different form than the maximally different form at all time points. Overall, these results suggest that children remember word forms that are linked to objects over extended post-training intervals, but that their memory for the forms gradually decreases over time without further exposures. Furthermore, memory traces for word forms do not become less phonologically specific over time; rather, children either identify the correct form or they perform at chance.
Sharkey, Amanda J. C.
2007-09-01
Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.
Density- and wavefunction-normalized Cartesian spherical harmonics for l ≤ 20.
Michael, J Robert; Volkov, Anatoliy
2015-03-01
The widely used pseudoatom formalism [Stewart (1976). Acta Cryst. A32, 565-574; Hansen & Coppens (1978). Acta Cryst. A34, 909-921] in experimental X-ray charge-density studies makes use of real spherical harmonics when describing the angular component of aspherical deformations of the atomic electron density in molecules and crystals. The analytical form of the density-normalized Cartesian spherical harmonic functions for l ≤ 7 and the corresponding normalization coefficients were reported previously by Paturle & Coppens [Acta Cryst. (1988), A44, 6-7]. It was shown that the analytical form for the normalization coefficients is available primarily for l ≤ 4 [Hansen & Coppens, 1978; Paturle & Coppens, 1988; Coppens (1992). International Tables for Crystallography, Vol. B, Reciprocal space, 1st ed., edited by U. Shmueli, ch. 1.2. Dordrecht: Kluwer Academic Publishers; Coppens (1997). X-ray Charge Densities and Chemical Bonding. New York: Oxford University Press]. Only in very special cases is it possible to derive an analytical representation of the normalization coefficients for l > 4; in general, for l > 4 the density normalization coefficients were calculated numerically to within seven significant figures. In this study we review the literature on the density-normalized spherical harmonics, clarify the existing notations, use the Paturle-Coppens (Paturle & Coppens, 1988) method in the Wolfram Mathematica software to derive the Cartesian spherical harmonics for l ≤ 20 and determine the density normalization coefficients to 35 significant figures, and computer-generate a Fortran90 code. The article primarily targets researchers who work in the field of experimental X-ray electron density, but may be of some use to all who are interested in Cartesian spherical harmonics.
Reliability assessment based on small samples of normal distribution
International Nuclear Information System (INIS)
Ma Zhibo; Zhu Jianshi; Xu Naixin
2003-01-01
When the pertinent parameter involved in a reliability definition complies with a normal distribution, the conjugate prior of its distribution parameters (μ, h) is a normal-gamma distribution. With the help of the maximum entropy and moments-equivalence principles, the subjective information about the parameter and the sampling data of its independent variables are transformed into a Bayesian prior of (μ, h). The desired estimates are obtained from either the prior or the posterior, which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations
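The conjugate update sketched in this abstract can be illustrated with the standard normal-gamma parameterization (an assumption on my part: μ | h ~ N(μ₀, 1/(κ₀h)), h ~ Gamma(α₀, β₀); the function name and hyperparameter values are hypothetical, not taken from the paper):

```python
import numpy as np

def normal_gamma_update(mu0, kappa0, alpha0, beta0, x):
    """Posterior (mu_n, kappa_n, alpha_n, beta_n) of a normal-gamma prior
    on (mu, h) after observing samples x from N(mu, 1/h)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    xbar = x.mean()
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    alpha_n = alpha0 + n / 2.0
    beta_n = (beta0
              + 0.5 * np.sum((x - xbar) ** 2)
              + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n))
    return mu_n, kappa_n, alpha_n, beta_n

# With many observations the posterior mean of mu approaches the sample mean,
# as the abstract's "combining the prior and sampling data" describes.
rng = np.random.default_rng(0)
data = rng.normal(loc=5.0, scale=2.0, size=500)
mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(0.0, 1.0, 1.0, 1.0, data)
```

Point estimates of the reliability parameter would then be read off the posterior, e.g. E[μ] = mu_n and E[h] = alpha_n/beta_n.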
Corticocortical feedback increases the spatial extent of normalization
Nassi, Jonathan J.; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T.
2014-01-01
Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a “normalization pool.” Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing. PMID:24910596
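The divisive-normalization computation described above can be sketched in a toy form (a minimal illustration of the standard ratio model, not the authors' fitted model; the pool sizes and parameter values are made up for demonstration):

```python
import numpy as np

def normalized_response(drive, pool, sigma=1.0, n=2.0):
    """Divisive normalization: a neuron's output is its driving input
    divided by a semi-saturation constant plus the summed activity of
    a spatial 'normalization pool'."""
    return drive ** n / (sigma ** n + np.sum(pool ** n))

# A spatially wider normalization pool (as with intact feedback) suppresses
# the response to the same center drive more than a narrow pool does,
# mimicking stronger surround suppression.
center = 2.0
small_pool = np.array([0.5, 0.5])            # narrow spatial extent
large_pool = np.array([0.5, 0.5, 0.5, 0.5])  # wider spatial extent
r_small = normalized_response(center, small_pool)
r_large = normalized_response(center, large_pool)
```

Shrinking the pool (the abstract's account of feedback inactivation) raises the response, i.e. weakens surround suppression, while leaving the driving input untouched.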
Technology applications for radioactive waste minimization
International Nuclear Information System (INIS)
Devgun, J.S.
1994-01-01
The nuclear power industry has achieved one of the most successful examples of waste minimization. The annual volume of low-level radioactive waste shipped for disposal per reactor has decreased to approximately one-fifth of the volume of about a decade ago. In addition, the curie content of the total waste shipped for disposal has decreased. This paper discusses the regulatory drivers and economic factors for waste minimization and describes the application of technologies for achieving waste minimization for low-level radioactive waste, with examples from the nuclear power industry
Transience and capacity of minimal submanifolds
DEFF Research Database (Denmark)
Markvorsen, Steen; Palmer, V.
2003-01-01
We prove explicit lower bounds for the capacity of annular domains of minimal submanifolds P^m in ambient Riemannian spaces N^n with sectional curvatures bounded from above. We characterize the situations in which the lower bounds for the capacity are actually attained. Furthermore, we apply these bounds to prove that Brownian motion defined on a complete minimal submanifold is transient when the ambient space is a negatively curved Hadamard-Cartan manifold. The proof stems directly from the capacity bounds and also covers the case of minimal submanifolds of dimension m > 2 in Euclidean spaces.
Analysis of stationary fluence by minimization of a functional in tension and velocity
International Nuclear Information System (INIS)
Loula, A.F.D.; Guerreiro, J.N.C.; Toledo, E.M.
1989-11-01
New mixed finite element formulations for plane elasticity problems are presented, with no limitation on the choice of conforming finite element spaces. By adding the least-squares residual form of the governing equations to the classical Galerkin formulation, the original saddle point problem is transformed into a minimization problem. Stability analysis, error estimates and numerical results are presented, confirming the error estimates and the good performance of this new formulation. (author) [pt
2-regularity and 2-normality conditions for systems with impulsive controls
Directory of Open Access Journals (Sweden)
Pavlova Natal'ya
2007-01-01
Full Text Available In this paper a controlled system with impulsive controls is investigated in the neighborhood of an abnormal point. The set of pairs (u, μ) is considered as the class of admissible controls, where u is a measurable essentially bounded function and μ is a finite-dimensional Borel measure such that, for any Borel set B, μ(B) belongs to a given convex closed pointed cone. In this article the concepts of 2-regularity and 2-normality for an abstract mapping Ф, acting from a given Banach space into a finite-dimensional space, are introduced. The concepts of 2-regularity and 2-normality play a great role in the derivation of the first- and second-order necessary conditions for the optimal control problem, consisting of the minimization of a certain functional over the set of admissible processes. These concepts are also important for obtaining sufficient conditions for the local controllability of nonlinear systems. A convenient criterion for 2-regularity along a prescribed direction and necessary conditions for 2-normality of systems that are linear in control are also introduced in this article.
Chemical compatibility of DWPF canistered waste forms
International Nuclear Information System (INIS)
Harbour, J.R.
1993-01-01
The Waste Acceptance Preliminary Specifications (WAPS) require that the contents of the canistered waste form are compatible with one another and the stainless steel canister. The canistered waste form is a closed system comprised of a stainless steel vessel containing waste glass, air, and condensate. This system will experience a radiation field and an elevated temperature due to radionuclide decay. This report discusses possible chemical reactions, radiation interactions, and corrosive reactions within this system both under normal storage conditions and after exposure to temperatures up to the normal glass transition temperature, which for DWPF waste glass will be between 440 and 460 degrees C. Specific conclusions regarding reactions and corrosion are provided. This document is based on the assumption that the period of interim storage prior to packaging at the federal repository may be as long as 50 years
Directory of Open Access Journals (Sweden)
Mizoue S
2014-02-01
Full Text Available Shiro Mizoue,1 Tadashi Nakano,2 Nobuo Fuse,3 Aiko Iwase,4 Shun Matsumoto,5 Keiji Yoshikawa6 On behalf of the IOP CHANGE Study Group7 1Department of Ophthalmology, Ehime University Graduate School of Medicine, Ehime, Japan; 2Department of Ophthalmology, Jikei University School of Medicine, Tokyo, Japan; 3Department of Integrative Genomics, Tohoku Medical Megabank Organization, Miyagi, Japan; 4Tajimi Iwase Eye Clinic, Gifu, Japan; 5Department of Ophthalmology, Tokyo Teishin Hospital, 6Yoshikawa Eye Clinic, Tokyo, Japan; 7IOP CHecked and Assessed in Normal tension Glaucoma by Exceptional Glaucomatologists Study Group, Japan Background: This study aimed to evaluate the effect of travoprost with the sofZia® preservative system in lowering the intraocular pressure (IOP) of Japanese normal tension glaucoma (NTG) patients. Methods: In this prospective, multicenter, open-label study, Japanese NTG patients with baseline IOPs <20 mmHg were enrolled after three consecutive measurements taken at screening and baseline visits. Travoprost with sofZia® was instilled once daily. The IOP-lowering effect, conjunctival hyperemia, superficial punctate keratopathy, and adverse events were examined at weeks 4, 8, and 12 after drug instillation. Results: One hundred and three of the 107 enrolled patients (baseline IOP = 15.2±2.0 mmHg [mean ± standard deviation]) completed the study. The mean IOP value as well as the percent reduction was significantly reduced at each visit after travoprost with sofZia® initiation (P<0.0001). The conjunctival hyperemia score was 1 or less throughout the study, though it increased significantly over time. No significant change was observed in superficial punctate keratopathy. The cumulative incidence of side effects such as eyelash changes, eyelid pigmentation, and deepening of the upper lid was 47.6%, 27.2%, and 16.5%, respectively. Conclusion: Travoprost preserved with sofZia® effectively lowered the IOP of Japanese NTG patients. It was
Safety control and minimization of radioactive wastes
International Nuclear Information System (INIS)
Wang Jinming; Rong Feng; Li Jinyan; Wang Xin
2010-01-01
Compared with the developed countries, the safety control and minimization of radwastes in China are under-developed. Research on measures for the safety control and minimization of radwastes is very important for the safe management of radwastes and for reducing treatment and disposal costs and environmental radiation hazards. This paper systematically discusses the safety control and minimization of the radwastes produced in the nuclear fuel cycle, nuclear technology applications, and the decommissioning of nuclear facilities, and provides some measures and methods for the safety control and minimization of radwastes. (authors)
Fui, Stéphanie Li Sun; Bonnichon, Philippe; Bonni, Nicolas; Delbot, Thierry; André, Jean Pascal; Pion-Graff, Joëlle; Berrod, Jean-Louis; Fontaine, Marine; Brunaud, Catherine; Cocagne, Nicolas
2016-10-01
With the current aging of the world's population, diagnosis of primary hyperparathyroidism is being reported in increasingly older patients, with the associated functional symptomatology exacerbating the vicissitudes of age. This retrospective study was designed to establish functional improvements in older patients following parathyroid adenomectomy under local anesthesia as outpatient surgery. Data were collected from 53 patients aged 80 years or older who underwent a minimally invasive parathyroid adenomectomy. All patients underwent preoperative ultrasound and scintigraphy, and the effectiveness of the procedure was monitored according to intra- and postoperative assays of parathyroid hormone (PTH) at 5 min, 2 h and 4 h. Mean preoperative serum calcium level was 2.8 mmol/L (112 mg/L) and mean PTH was 180 pg/mL. Thirty-eight patients were operated under local anesthesia using minimally invasive surgery and 18 patients were operated under general anesthesia. In 26 cases, the procedure was planned on an outpatient basis but could only be carried out in 21 patients. Fifty-one patients had normal serum calcium and PTH levels during the immediate postoperative period. Two patients were reoperated under general anesthesia, since immediate postoperative PTH did not return to normal. Four patients died of causes unrelated to hyperparathyroidism. Five patients were lost to follow-up six months to two years postsurgery. Of the 44 patients (83%) with long-term monitoring of PTH, none had recurrence of biological hyperparathyroidism. Excluding the three asymptomatic patients, 38 of the 41 symptomatic patients (93%) with long-term follow-up considered themselves "improved" or "strongly improved" after the intervention, notably with respect to fatigue, muscle and bone pain. Two patients (4.9%) reported no difference and one patient (2.4%) said her condition had worsened and regretted having undergone surgery. In patients 80 years or older, minimally invasive surgery as an
Massive Corrections to Entanglement in Minimal E8 Toda Field Theory
Directory of Open Access Journals (Sweden)
Olalla A. Castro-Alvaredo
2017-02-01
Full Text Available In this letter we study the exponentially decaying corrections to saturation of the second Rényi entropy of one interval of length L in minimal E8 Toda field theory. It has been known for some time that the entanglement entropy of a massive quantum field theory in 1+1 dimensions saturates to a constant value for m_1 L >> 1, where m_1 is the mass of the lightest particle in the spectrum. Subsequently, results by Cardy, Castro-Alvaredo and Doyon have shown that there are exponentially decaying corrections to this behaviour which are characterised by Bessel functions with arguments proportional to m_1 L. For the von Neumann entropy the leading correction to saturation takes the precise universal form -K_0(2 m_1 L)/8, whereas for the Rényi entropies leading corrections proportional to K_0(m_1 L) are expected. Recent numerical work by Pálmai for the second Rényi entropy of minimal E8 Toda has found next-to-leading order corrections decaying as exp(-2 m_1 L) rather than the expected exp(-m_1 L). In this paper we investigate the origin of this result and show that it is incorrect. An exact form factor computation of correlators of branch point twist fields reveals that the leading corrections are proportional to K_0(m_1 L), as expected.
Normalizing tweets with edit scripts and recurrent neural embeddings
Chrupala, Grzegorz; Toutanova, Kristina; Wu, Hua
2014-01-01
Tweets often contain a large proportion of abbreviations, alternative spellings, novel words and other non-canonical language. These features are problematic for standard language analysis tools and it can be desirable to convert them to canonical form. We propose a novel text normalization model
Learning regularization parameters for general-form Tikhonov
International Nuclear Information System (INIS)
Chung, Julianne; Español, Malena I
2017-01-01
Computing regularization parameters for general-form Tikhonov regularization can be an expensive and difficult task, especially if multiple parameters or many solutions need to be computed in real time. In this work, we assume training data is available and describe an efficient learning approach for computing regularization parameters that can be used for a large set of problems. We consider an empirical Bayes risk minimization framework for finding regularization parameters that minimize average errors for the training data. We first extend methods from Chung et al (2011 SIAM J. Sci. Comput. 33 3132–52) to the general-form Tikhonov problem. Then we develop a learning approach for multi-parameter Tikhonov problems, for the case where all involved matrices are simultaneously diagonalizable. For problems where this is not the case, we describe an approach to compute near-optimal regularization parameters by using operator approximations for the original problem. Finally, we propose a new class of regularizing filters, where solutions correspond to multi-parameter Tikhonov solutions, that requires less data than previously proposed optimal error filters, avoids the generalized SVD, and allows flexibility and novelty in the choice of regularization matrices. Numerical results for 1D and 2D examples using different norms on the errors show the effectiveness of our methods. (paper)
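The general-form Tikhonov problem underlying this abstract can be sketched directly from its normal equations (a minimal illustration only, assuming the standard formulation min_x ||Ax − b||² + λ²||Lx||²; the function name and test matrices are hypothetical, and this is not the authors' learning method):

```python
import numpy as np

def tikhonov_general(A, b, L, lam):
    """Solve min_x ||A x - b||^2 + lam^2 ||L x||^2 via the normal
    equations (A^T A + lam^2 L^T L) x = A^T b."""
    return np.linalg.solve(A.T @ A + lam ** 2 * (L.T @ L), A.T @ b)

# With L = I and lam = 0 this reduces to ordinary least squares;
# a large lam shrinks the solution, which is the trade-off the learned
# regularization parameter must balance.
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 5))
x_true = np.arange(1.0, 6.0)
b = A @ x_true
x_ls = tikhonov_general(A, b, np.eye(5), 0.0)
x_reg = tikhonov_general(A, b, np.eye(5), 10.0)
```

The learning approach of the paper chooses lam (or several lams) to minimize the average reconstruction error of such solutions over training pairs (b, x_true).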
Minimal string theory is logarithmic
International Nuclear Information System (INIS)
Ishimoto, Yukitaka; Yamaguchi, Shun-ichi
2005-01-01
We study the simplest examples of minimal string theory, whose worldsheet description is the unitary (p,q) minimal model coupled to two-dimensional gravity (Liouville field theory). In the Liouville sector, we show that four-point correlation functions of 'tachyons' exhibit logarithmic singularities, and that the theory turns out to be logarithmic. The relation with Zamolodchikov's logarithmic degenerate fields is also discussed. Our result holds for generic values of (p,q)
Neutrino CP violation and sign of baryon asymmetry in the minimal seesaw model
Shimizu, Yusuke; Takagi, Kenta; Tanimoto, Morimitsu
2018-03-01
We discuss the correlation between the CP violating Dirac phase of the lepton mixing matrix and the cosmological baryon asymmetry based on the leptogenesis in the minimal seesaw model with two right-handed Majorana neutrinos and the trimaximal mixing for neutrino flavors. The sign of the CP violating Dirac phase at low energy is fixed by the observed cosmological baryon asymmetry since there is only one phase parameter in the model. According to the recent T2K and NOνA data of the CP violation, the Dirac neutrino mass matrix of our model is fixed only for the normal hierarchy of neutrino masses.
Meyer, Swanhild U.; Kaiser, Sebastian; Wagner, Carola; Thirion, Christian; Pfaffl, Michael W.
2012-01-01
Background Adequate normalization minimizes the effects of systematic technical variations and is a prerequisite for getting meaningful biological changes. However, there is inconsistency about miRNA normalization performances and recommendations. Thus, we investigated the impact of seven different normalization methods (reference gene index, global geometric mean, quantile, invariant selection, loess, loessM, and generalized procrustes analysis) on intra- and inter-platform performance of two distinct and commonly used miRNA profiling platforms. Methodology/Principal Findings We included data from miRNA profiling analyses derived from a hybridization-based platform (Agilent Technologies) and an RT-qPCR platform (Applied Biosystems). Furthermore, we validated a subset of miRNAs by individual RT-qPCR assays. Our analyses incorporated data from the effect of differentiation and tumor necrosis factor alpha treatment on primary human skeletal muscle cells and a murine skeletal muscle cell line. Distinct normalization methods differed in their impact on (i) standard deviations, (ii) the area under the receiver operating characteristic (ROC) curve, (iii) the similarity of differential expression. Loess, loessM, and quantile analysis were most effective in minimizing standard deviations on the Agilent and TLDA platform. Moreover, loess, loessM, invariant selection and generalized procrustes analysis increased the area under the ROC curve, a measure for the statistical performance of a test. The Jaccard index revealed that inter-platform concordance of differential expression tended to be increased by loess, loessM, quantile, and GPA normalization of AGL and TLDA data as well as RGI normalization of TLDA data. Conclusions/Significance We recommend the application of loess, or loessM, and GPA normalization for miRNA Agilent arrays and qPCR cards as these normalization approaches showed to (i) effectively reduce standard deviations, (ii) increase sensitivity and accuracy of
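One of the seven normalization methods compared above, quantile normalization, can be sketched as follows (a minimal tie-free illustration of the generic technique, not the authors' implementation; the matrix orientation, features in rows and samples in columns, is an assumption):

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalization of an (n_features, n_samples) matrix:
    each column's sorted values are replaced by the mean sorted profile,
    so every sample ends up with an identical value distribution."""
    # rank of each entry within its column (0 = smallest)
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    # mean across samples of the column-wise sorted values
    mean_sorted = np.sort(X, axis=0).mean(axis=1)
    return mean_sorted[ranks]

# Two miRNA 'samples' with different scales; after normalization both
# columns share the same distribution of values.
X = np.array([[5.0, 4.0],
              [2.0, 1.0],
              [3.0, 6.0]])
Xn = quantile_normalize(X)
```

Forcing identical distributions is exactly what makes quantile normalization effective at reducing technical (platform- or batch-level) variation, at the cost of erasing genuine global shifts.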
DEFF Research Database (Denmark)
Lauridsen, Mette Enok Munk; Thiele, Maja; Kimer, N
2013-01-01
Abstract Existing tests for minimal/covert hepatic encephalopathy (m/cHE) are time- and expertise-consuming and primarily usable for research purposes. An easy-to-use, fast, and reliable diagnostic and grading tool is needed. We here report on the background, experience, and ongoing research......-10) percentile) as a parameter of reaction time variability. The index is a measure of alertness stability and is used to assess attention and cognition deficits. The CRTindex identifies half of the patients in a Danish cohort with chronic liver disease as having m/cHE; a normal value safely precludes HE; it has...
Cloning and characterization of the complementary DNA for the B chain of normal human serum C1q.
Reid, K B; Bentley, D R; Wood, K J
1984-09-06
Normal human C1q is a serum glycoprotein of 460 kDa containing 18 polypeptide chains (6A, 6B, 6C) each 226 amino acids long and each containing an N-terminal collagen-like domain and a C-terminal globular domain. Two unusual forms of C1q have been described: a genetically defective form, which has a molecular mass of approximately 160 kDa and is found in the sera of homozygotes for the defect who show a marked susceptibility to immune complex related disease; a fibroblast form, shown to be synthesized and secreted, in vitro, with a molecular mass of about 800 kDa and with chains approximately 16 kDa greater than those of normal C1q. A higher than normal molecular mass form of C1q has also been described in human colostrum and a form of C1q has been claimed to represent one of the types of Fc receptor on guinea-pig macrophages. To initiate studies, at the genomic level, on these various forms of C1q, and to investigate the possible relation between the C1q genes and the procollagen genes, the complementary DNA corresponding to the B chain of normal C1q has been cloned and characterized.
Directory of Open Access Journals (Sweden)
Andreas Garcia-Bardon
Full Text Available Real-time reverse transcription polymerase chain reaction (RT-qPCR) is the gold standard for expression analysis. Designed to improve reproducibility and sensitivity, commercial kits are commonly used for the critical step of cDNA synthesis. The present study was designed to determine the impact of these kits. mRNA from mouse brains was pooled to create serial dilutions ranging from 0.0625 μg to 2 μg, which were transcribed into cDNA using four different commercial reverse-transcription kits. Next, we transcribed mRNA from the brain tissue of mice after acute brain injury and of naïve mice into cDNA for qPCR. Depending on the tested genes, some kits failed to show linear results in the dilution series and revealed strong variations in cDNA yield. Absolute expression data in naïve and trauma settings varied substantially between these kits. Normalization with a housekeeping gene failed to reduce kit-dependent variations, whereas normalization eliminated differences when naïve samples from the same region were used. The study shows strong evidence that the choice of commercial cDNA synthesis kit has a major impact on PCR results and, consequently, on comparability between studies. Additionally, it provides a solution to overcome this limitation by normalization with data from naïve samples. This simple step helps to compare mRNA expression data between different studies and groups.
DEFF Research Database (Denmark)
Dahl, Marianne; Kamper, Jens
2006-01-01
AIM: To describe physical outcome and school performance in a cohort of very-low-birthweight infants treated with an early nasal continuous positive airway pressure (NCPAP)/minimal handling regimen with permissive hypercapnia, in comparison to siblings of normal birthweight. MATERIAL AND METHODS: ... None of these differences reached statistical significance. However, the performance ratings correlated significantly with socio-economic conditions. CONCLUSION: In this study of infants treated with a regimen of early NCPAP/minimal handling, we found a relatively low incidence of handicaps and impairments. Nearly 90% attended ordinary schools, with near-average performances in mathematics and reading/spelling, which were not statistically different to their siblings. The overall results indicate that these infants fare at least as well as survivors after conventional treatment.
Sequential unconstrained minimization algorithms for constrained optimization
International Nuclear Information System (INIS)
Byrne, Charles
2008-01-01
The problem of minimizing a function f(x): R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊂ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C = cl(D), the closure of D. We assume that such minimizers exist, and denote one by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) − G_{k-1}(x^{k-1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton–Raphson method. The proof techniques used for SUMMA can be extended to obtain related results
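The log-barrier method, one of the particular cases named above, fits the SUMMA template G_k(x) = f(x) + g_k(x) with g_k proportional to a barrier for D. A toy one-dimensional sketch (grid search stands in for the inner unconstrained minimization; the objective and barrier are illustrative choices, not from the paper):

```python
import numpy as np

def f(x):
    return (x - 2.0) ** 2                 # objective; unconstrained minimum at x = 2

def barrier(x):
    return -np.log(x) - np.log(1.0 - x)   # log barrier for D = (0, 1)

# Minimize G_k = f + barrier/k on a fine grid; as k grows, the iterates x^k
# approach the constrained minimizer x-hat = 1 on C = cl(D) = [0, 1].
xs = np.linspace(1e-6, 1.0 - 1e-6, 200001)
iterates = []
for k in (1, 10, 100, 1000):
    x_k = xs[np.argmin(f(xs) + barrier(xs) / k)]
    iterates.append(x_k)
```

Each x^k stays strictly inside D, while f(x^k) decreases toward f(x̂), matching the convergence statement in the abstract.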
Directory of Open Access Journals (Sweden)
Saidi Badreddine
2016-01-01
Full Text Available The single point incremental forming process is well known to be perfectly suited for prototyping and small series. One of its fields of applicability is the medical area, for the forming of titanium prostheses or titanium medical implants. However, this process is not yet widely industrialized, mainly due to its geometrical inaccuracy and its inhomogeneous thickness distribution. Moreover, considerable forces can occur; they must be controlled in order to preserve the tooling. In this paper, a numerical approach is proposed to minimize the maximum force reached during the incremental forming of titanium sheets and to maximize the minimal thickness. A response surface methodology is used to find the optimal values of two input parameters of the process, the punch diameter and the vertical step size of the tool path.
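A response surface of the kind used above is typically a quadratic model fitted to sampled process runs; here is a minimal sketch (the two inputs stand in for punch diameter and vertical step size, and the synthetic data and function names are hypothetical, not the paper's simulations):

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit the two-factor response surface
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    by least squares."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2,
                         x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Recover a known quadratic response (e.g. maximum forming force as a
# function of the two coded process parameters) from noiseless samples.
rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
true = np.array([1.0, -2.0, 0.5, 3.0, 1.5, -0.7])
y = (true[0] + true[1] * X[:, 0] + true[2] * X[:, 1]
     + true[3] * X[:, 0] ** 2 + true[4] * X[:, 1] ** 2
     + true[5] * X[:, 0] * X[:, 1])
coef = fit_quadratic_surface(X, y)
```

Once fitted, the surface is optimized (analytically or by search) to pick the parameter pair that minimizes force while maximizing minimal thickness.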
Stability analysis of rough surfaces in adhesive normal contact
Rey, Valentine; Bleyer, Jeremy
2018-03-01
This paper deals with adhesive frictionless normal contact between one elastic flat solid and one stiff solid with rough surface. After computation of the equilibrium solution of the energy minimization principle and respecting the contact constraints, we aim at studying the stability of this equilibrium solution. This study of stability implies solving an eigenvalue problem with inequality constraints. To achieve this goal, we propose a proximal algorithm which enables qualifying the solution as stable or unstable and that gives the instability modes. This method has a low computational cost since no linear system inversion is required and is also suitable for parallel implementation. Illustrations are given for the Hertzian contact and for rough contact.
Effect of Bionanocomposite Coatings on the Quality of Minimally Processed Mango
Directory of Open Access Journals (Sweden)
Ata Aditya Wardana
2017-04-01
Full Text Available Abstract Minimally processed mango is a perishable product owing to high respiration and transpiration rates and microbial decay. Edible coating is one alternative method to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating made from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, and 2% by weight of tapioca) were used to coat the minimally processed mango. The results showed that the bionanocomposite edible coatings were able to maintain the quality of minimally processed mango during the storage period. The bionanocomposite of tapioca + NP-ZnO (2% by weight of tapioca) was the most effective in reducing weight loss, firmness loss, browning index, total acidity, total soluble solids, respiration, and microbial counts. Thus, bionanocomposite edible coatings may provide an alternative method for maintaining the storage quality of minimally processed mango.
Hazardous waste minimization tracking system
International Nuclear Information System (INIS)
Railan, R.
1994-01-01
Under RCRA sections 3002(b) and 3005(h), hazardous waste generators and owners/operators of treatment, storage, and disposal facilities (TSDFs) are required to certify that they have a program in place to reduce the volume or quantity and toxicity of hazardous waste to the degree determined to be economically practicable. In many cases there are environmental as well as economic benefits for agencies that pursue pollution prevention options. Several state governments have already enacted waste minimization legislation (e.g., the Massachusetts Toxic Use Reduction Act of 1989, and the Oregon Toxic Use Reduction Act and Hazardous Waste Reduction Act of July 2, 1989). About twenty-six other states have established legislation that mandates some type of waste minimization program and/or facility planning. The need to address the HAZMIN (Hazardous Waste Minimization) Program at government agencies and private industries has prompted us to identify the importance of managing the HAZMIN Program and of tracking various aspects of the program, as well as the progress made in this area. "WASTE" is a tracking system which can be used and modified to maintain the information related to a Hazardous Waste Minimization Program in a manageable fashion. The program maintains, modifies, and retrieves information related to hazardous waste minimization and recycling, and provides automated report-generating capabilities. It has a built-in menu, which can be printed either in part or in full, and includes instructions on preparing the Annual Waste Report and the Annual Recycling Report. The program is very user friendly. It is available on 3.5-inch or 5.25-inch floppy disks; a computer with 640K memory is required.
The Quest for Minimal Quotients for Probabilistic Automata
DEFF Research Database (Denmark)
Eisentraut, Christian; Hermanns, Holger; Schuster, Johann
2013-01-01
One of the prevailing ideas in applied concurrency theory and verification is the concept of automata minimization with respect to strong or weak bisimilarity. The minimal automata can be seen as canonical representations of the behaviour modulo the bisimilarity considered. Together with congruence results wrt. process algebraic operators, this can be exploited to alleviate the notorious state space explosion problem. In this paper, we aim at identifying minimal automata and canonical representations for concurrent probabilistic models. We present minimality and canonicity results for probabilistic automata wrt. strong and weak bisimilarity, together with polynomial time minimization algorithms.
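As a concrete illustration of minimization modulo bisimilarity, the sketch below computes strong-bisimilarity classes of a finite labelled transition system by naive partition refinement; each resulting class is one state of the minimal quotient. This is the generic textbook construction, not the paper's probabilistic-automata algorithm, and the example system is invented.

```python
# Naive partition refinement for strong bisimilarity on a finite labelled
# transition system. Illustrative sketch only; the paper treats the harder
# probabilistic (and weak) case.
def bisimulation_classes(states, trans):
    """trans: dict mapping (state, label) -> frozenset of successor states."""
    labels = {l for (_, l) in trans}
    # Start with one block containing all states, then split until stable.
    partition = [frozenset(states)]
    changed = True
    while changed:
        changed = False

        def signature(s):
            # For each label, the set of current blocks reachable in one step.
            return frozenset(
                (l, frozenset(i for i, b in enumerate(partition)
                              if b & trans.get((s, l), frozenset())))
                for l in labels)

        new_partition = []
        for block in partition:
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(frozenset(g) for g in groups.values())
        partition = new_partition
    return set(partition)
```

Bisimilar states end up in the same class, so the quotient automaton has one state per class; real minimizers use the faster Paige-Tarjan refinement, but the fixed point computed is the same.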
Supersymmetric hybrid inflation with non-minimal Kahler potential
International Nuclear Information System (INIS)
Bastero-Gil, M.; King, S.F.; Shafi, Q.
2007-01-01
Minimal supersymmetric hybrid inflation based on a minimal Kahler potential predicts a spectral index n_s ≳ 0.98. On the other hand, WMAP three-year data prefer a central value n_s ≈ 0.95. We propose a class of supersymmetric hybrid inflation models based on the same minimal superpotential but with a non-minimal Kahler potential. Including radiative corrections using the one-loop effective potential, we show that the prediction for the spectral index is sensitive to the small non-minimal corrections, and can lead to a significantly red-tilted spectrum, in agreement with WMAP
Self-normalizing multiple-echo technique for measuring the in vivo apparent diffusion coefficient
International Nuclear Information System (INIS)
Perman, W.H.; Gado, M.; Sandstrom, J.C.
1989-01-01
This paper presents work to develop a new technique for quantitating the in vivo apparent diffusion/perfusion coefficient (ADC) by obtaining multiple data points from only two images, with the capability to normalize the data from consecutive images and thus minimize the effect of interimage variation. Two multiple-echo (six- to eight-echo) cardiac-gated images are obtained, one without and one with additional diffusion/perfusion encoding gradients placed about the 180° RF pulses of all but the first echo. Since the first echoes of both images have identical pulse sequence parameters, variations in signal intensity between the first echoes represent image-to-image variation. The signal intensities of the subsequent echoes with additional diffusion/perfusion encoding gradients are then normalized by using the ratio of the first-echo signal intensities
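The first-echo normalization described here can be sketched in a few lines. The following assumes a mono-exponential signal model with invented b-values and signal levels, not the authors' data: the matched first echoes give the image-to-image scale factor, and the ADC follows from the slope of the log signal ratio against the diffusion weighting b.

```python
import math

# Illustrative sketch of self-normalized ADC estimation from two multi-echo
# series (one without, one with extra diffusion-encoding gradients).
def educe_adc(signal_ref, signal_dw, b_values):
    """Index 0 is the matched first echo; b_values[0] is assumed 0."""
    # The first echoes share identical pulse-sequence parameters, so their
    # ratio captures pure image-to-image variation.
    scale = signal_dw[0] / signal_ref[0]
    # Under S_dw = scale * S_ref * exp(-b * ADC), the ADC is the slope of
    # ln(scale * S_ref / S_dw) against b (ordinary least squares).
    xs = b_values[1:]
    ys = [math.log(scale * r / d)
          for r, d in zip(signal_ref[1:], signal_dw[1:])]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Because both series share the same T2 decay across echoes, the ratio cancels it, leaving only the diffusion attenuation.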
Null-polygonal minimal surfaces in AdS{sub 4} from perturbed W minimal models
Energy Technology Data Exchange (ETDEWEB)
Hatsuda, Yasuyuki [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Ito, Katsushi [Tokyo Institute of Technology (Japan). Dept. of Physics; Satoh, Yuji [Tsukuba Univ., Sakura, Ibaraki (Japan). Inst. of Physics
2012-11-15
We study the null-polygonal minimal surfaces in AdS{sub 4}, which correspond to the gluon scattering amplitudes/Wilson loops in N=4 super Yang-Mills theory at strong coupling. The area of the minimal surfaces with n cusps is characterized by the thermodynamic Bethe ansatz (TBA) integral equations or the Y-system of the homogeneous sine-Gordon model, which is regarded as the SU(n-4){sub 4}/U(1){sup n-5} generalized parafermion theory perturbed by the weight-zero adjoint operators. Based on the relation to the TBA systems of the perturbed W minimal models, we solve the TBA equations by using the conformal perturbation theory, and obtain the analytic expansion of the remainder function around the UV/regular-polygonal limit for n = 6 and 7. We compare the rescaled remainder function for n=6 with the two-loop one, to observe that they are close to each other similarly to the AdS{sub 3} case.
Assessment of LANL waste minimization plan
International Nuclear Information System (INIS)
Davis, K.D.; McNair, D.A.; Jennrich, E.A.; Lund, D.M.
1991-04-01
The objective of this report is to evaluate the Los Alamos National Laboratory (LANL) Waste Minimization Plan to determine if it meets applicable internal (DOE) and regulatory requirements. The intent of the effort is to assess the higher level elements of the documentation to determine if they have been addressed, rather than the detailed mechanics of the program's implementation. The requirement for a Waste Minimization Plan is based on several DOE Orders as well as environmental laws and regulations. Table 2-1 provides a list of the major documents or regulations that require waste minimization efforts. The table also summarizes the applicable requirements
Blackfolds, plane waves and minimal surfaces
Armas, Jay; Blau, Matthias
2015-07-01
Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.
Blackfolds, plane waves and minimal surfaces
Energy Technology Data Exchange (ETDEWEB)
Armas, Jay [Physique Théorique et Mathématique, Université Libre de Bruxelles and International Solvay Institutes, ULB-Campus Plaine CP231, B-1050 Brussels (Belgium); Albert Einstein Center for Fundamental Physics, University of Bern,Sidlerstrasse 5, 3012 Bern (Switzerland); Blau, Matthias [Albert Einstein Center for Fundamental Physics, University of Bern,Sidlerstrasse 5, 3012 Bern (Switzerland)
2015-07-29
Minimal surfaces in Euclidean space provide examples of possible non-compact horizon geometries and topologies in asymptotically flat space-time. On the other hand, the existence of limiting surfaces in the space-time provides a simple mechanism for making these configurations compact. Limiting surfaces appear naturally in a given space-time by making minimal surfaces rotate but they are also inherent to plane wave or de Sitter space-times in which case minimal surfaces can be static and compact. We use the blackfold approach in order to scan for possible black hole horizon geometries and topologies in asymptotically flat, plane wave and de Sitter space-times. In the process we uncover several new configurations, such as black helicoids and catenoids, some of which have an asymptotically flat counterpart. In particular, we find that the ultraspinning regime of singly-spinning Myers-Perry black holes, described in terms of the simplest minimal surface (the plane), can be obtained as a limit of a black helicoid, suggesting that these two families of black holes are connected. We also show that minimal surfaces embedded in spheres rather than Euclidean space can be used to construct static compact horizons in asymptotically de Sitter space-times.
Minimal modification to tribimaximal mixing
International Nuclear Information System (INIS)
He Xiaogang; Zee, A.
2011-01-01
We explore some ways of minimally modifying the neutrino mixing matrix from tribimaximal, characterized by introducing at most one mixing angle and a CP-violating phase, thus extending our earlier work. One minimal modification, motivated to some extent by group-theoretic considerations, is a simple case with the elements V_α2 of the second column in the mixing matrix equal to 1/√3. Modifications keeping one of the columns or one of the rows unchanged from tribimaximal mixing all belong to the class of minimal modification. Some of the cases have interesting experimentally testable consequences. In particular, the T2K and MINOS collaborations have recently reported indications of a nonzero θ_13. For the cases we consider, the new data sharply constrain the CP-violating phase angle δ, with δ close to 0 (in some cases) and π disfavored.
Directory of Open Access Journals (Sweden)
Neil Curtis
Full Text Available The vertebrate skull evolved to protect the brain and sense organs, but with the appearance of jaws and associated forces there was a remarkable structural diversification. This suggests that the evolution of skull form may be linked to these forces, but an important area of debate is whether bone in the skull is minimised with respect to these forces, or whether skulls are mechanically "over-designed" and constrained by phylogeny and development. Mechanical analysis of diapsid reptile skulls could shed light on this longstanding debate. Compared to those of mammals, the skulls of many extant and extinct diapsids comprise an open framework of fenestrae (window-like openings) separated by bony struts (e.g., lizards, tuatara, dinosaurs and crocodiles), a cranial form thought to be strongly linked to feeding forces. We investigated this link by utilising the powerful engineering approach of multibody dynamics analysis to predict the physiological forces acting on the skull of the diapsid reptile Sphenodon. We then ran a series of structural finite element analyses to assess the correlation between bone strain and skull form. With comprehensive loading we found that the distribution of peak von Mises strains was particularly uniform throughout the skull, although specific regions were dominated by tensile strains while others were dominated by compressive strains. Our analyses suggest that the frame-like skulls of diapsid reptiles are probably optimally formed (mechanically ideal: sufficient strength with the minimal amount of bone) with respect to functional forces; they are efficient in terms of having minimal bone volume, minimal weight, and also minimal energy demands in maintenance.
Curtis, Neil; Jones, Marc E. H.; Shi, Junfen; O'Higgins, Paul; Evans, Susan E.; Fagan, Michael J.
2011-01-01
The vertebrate skull evolved to protect the brain and sense organs, but with the appearance of jaws and associated forces there was a remarkable structural diversification. This suggests that the evolution of skull form may be linked to these forces, but an important area of debate is whether bone in the skull is minimised with respect to these forces, or whether skulls are mechanically “over-designed” and constrained by phylogeny and development. Mechanical analysis of diapsid reptile skulls could shed light on this longstanding debate. Compared to those of mammals, the skulls of many extant and extinct diapsids comprise an open framework of fenestrae (window-like openings) separated by bony struts (e.g., lizards, tuatara, dinosaurs and crocodiles), a cranial form thought to be strongly linked to feeding forces. We investigated this link by utilising the powerful engineering approach of multibody dynamics analysis to predict the physiological forces acting on the skull of the diapsid reptile Sphenodon. We then ran a series of structural finite element analyses to assess the correlation between bone strain and skull form. With comprehensive loading we found that the distribution of peak von Mises strains was particularly uniform throughout the skull, although specific regions were dominated by tensile strains while others were dominated by compressive strains. Our analyses suggest that the frame-like skulls of diapsid reptiles are probably optimally formed (mechanically ideal: sufficient strength with the minimal amount of bone) with respect to functional forces; they are efficient in terms of having minimal bone volume, minimal weight, and also minimal energy demands in maintenance. PMID:22216358
International Nuclear Information System (INIS)
Freeman, H.
1990-01-01
This book presents an overview of waste minimization. Covers applications of technology to waste reduction, techniques for implementing programs, incorporation of programs into R and D, strategies for private industry and the public sector, and case studies of programs already in effect
Minimal constrained supergravity
Energy Technology Data Exchange (ETDEWEB)
Cribiori, N. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Dall' Agata, G., E-mail: dallagat@pd.infn.it [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Farakos, F. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Porrati, M. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)
2017-01-10
We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.
Minimal constrained supergravity
International Nuclear Information System (INIS)
Cribiori, N.; Dall'Agata, G.; Farakos, F.; Porrati, M.
2017-01-01
We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.
Yunus, A. I. A.; Muhammad, W. M. N. W.; Saaid, M. N. F.
2018-04-01
A standard form of contract is normally used in the Malaysian construction industry to establish a legal relation between contracting parties. Generally, most Malaysian federal government construction projects use PWD203A, a standard form of contract to be used where Bills of Quantities form part of the contract, issued by the Public Works Department (PWD/JKR). On the other hand, in Sarawak, the largest state in Malaysia, the state government has issued its own standard form of contract, namely the JKR Sarawak Form of Contract 2006. Even though both forms have been used widely in the construction industry, there is still a lack of understanding of both forms. The aim of this paper is to identify significant provisions in both forms of contract. Document analysis was adopted in conducting an in-depth review of both forms. It was found that both forms of contract have differences and similarities in several provisions, specifically in matters of definitions and general provisions; execution of the works; payments, completion and final account; and delay, dispute resolution and determination.
DEFF Research Database (Denmark)
Ravnbak, Mette H; Philipsen, Peter A; Wulf, Hans Christian
2010-01-01
To investigate the relation between pre-exposure skin pigmentation and the minimal melanogenesis dose (MMD)/minimal erythema dose (MED) ratio after a single narrowband ultraviolet B (nUVB) and solar simulator (Solar) exposure.
A Numerical Theory for Impedance Education in Three-Dimensional Normal Incidence Tubes
Watson, Willie R.; Jones, Michael G.
2016-01-01
A method for educing the locally-reacting acoustic impedance of a test sample mounted in a 3-D normal incidence impedance tube is presented and validated. The unique feature of the method is that the excitation frequency (or duct geometry) may be such that high-order duct modes may exist. The method educes the impedance, iteratively, by minimizing an objective function consisting of the difference between the measured and numerically computed acoustic pressure at preselected measurement points in the duct. The method is validated on planar and high-order mode sources with data synthesized from exact mode theory. These data are then subjected to random jitter to simulate the effects of measurement uncertainties on the educed impedance spectrum. The primary conclusions of the study are: 1) without random jitter, the educed impedance is in excellent agreement with the known impedance of the samples, and 2) random jitter comparable to that found in a typical experiment has minimal impact on the accuracy of the educed impedance.
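The paper's eduction loop couples a 3-D finite-element solver to an optimizer; as a hedged toy stand-in, the sketch below educes a normalized termination impedance for a 1-D plane-wave tube model using the same minimize-the-pressure-mismatch idea (the plane-wave model, the coarse-to-fine grid search, and all values are my simplifications, not the paper's method).

```python
import cmath

def modeled_pressure(zeta, k, x):
    # Plane-wave field in a tube terminated at x = 0 by normalized impedance
    # zeta: incident wave plus wave reflected with R = (zeta - 1)/(zeta + 1).
    refl = (zeta - 1) / (zeta + 1)
    return cmath.exp(-1j * k * x) + refl * cmath.exp(1j * k * x)

def educe_impedance(p_meas, k, xs):
    """Educe zeta by minimizing the summed squared pressure mismatch
    at the measurement points xs, via coarse-to-fine grid search."""
    def cost(z):
        try:
            return sum(abs(modeled_pressure(z, k, x) - p) ** 2
                       for x, p in zip(xs, p_meas))
        except ZeroDivisionError:      # z == -1 is a pole of the model
            return float('inf')

    center, span = 1 + 0j, 4.0
    for _ in range(20):                # successive refinement of the search box
        step = span / 10
        candidates = [center + a * step + 1j * b * step
                      for a in range(-5, 6) for b in range(-5, 6)]
        center = min(candidates, key=cost)
        span /= 5
    return center
```

With noiseless synthetic "measurements" the mismatch vanishes exactly at the true impedance, so the refinement converges to it; adding jitter to `p_meas` mimics the paper's uncertainty study.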
Bassetti, Bene; Sokolović-Perović, Mirjana; Mairano, Paolo; Cerni, Tania
2018-06-01
Research shows that the orthographic forms ("spellings") of second language (L2) words affect speech production in L2 speakers. This study investigated whether English orthographic forms lead L2 speakers to produce English homophonic word pairs as phonological minimal pairs. Targets were 33 orthographic minimal pairs, that is to say homophonic words that would be pronounced as phonological minimal pairs if orthography affects pronunciation. Word pairs contained the same target sound spelled with one letter or two, such as the /n/ in finish and Finnish (both /'fɪnɪʃ/ in Standard British English). To test for effects of length and type of L2 exposure, we compared Italian instructed learners of English, Italian-English late bilinguals with lengthy naturalistic exposure, and English natives. A reading-aloud task revealed that Italian speakers of English L2 produce two English homophonic words as a minimal pair distinguished by different consonant or vowel length, for instance producing the target /'fɪnɪʃ/ with a short [n] or a long [nː] to reflect the number of consonant letters in the spelling of the words finish and Finnish. Similar effects were found on the pronunciation of vowels, for instance in the orthographic pair scene-seen (both /siːn/). Naturalistic exposure did not reduce orthographic effects, as effects were found both in learners and in late bilinguals living in an English-speaking environment. It appears that the orthographic form of L2 words can result in the establishment of a phonological contrast that does not exist in the target language. Results have implications for models of L2 phonological development.
Qualifying and quantifying minimal hepatic encephalopathy
DEFF Research Database (Denmark)
Morgan, Marsha Y; Amodio, Piero; Cook, Nicola A
2016-01-01
Minimal hepatic encephalopathy is the term applied to the neuropsychiatric status of patients with cirrhosis who are unimpaired on clinical examination but show alterations in neuropsychological tests exploring psychomotor speed/executive function and/or in neurophysiological variables. ... analytical techniques may provide better diagnostic information, while the advent of portable wireless headsets may facilitate more widespread use. A large number of other diagnostic tools have been validated for the diagnosis of minimal hepatic encephalopathy, including Critical Flicker Frequency, the Inhibitory Control Test, the Stroop test, the Scan package and the Continuous Reaction Time; each has its pros and cons, strengths and weaknesses, protagonists and detractors. Recent AASLD/EASL Practice Guidelines suggest that the diagnosis of minimal hepatic encephalopathy should be based on the PHES test ...
The minimal GUT with inflaton and dark matter unification
Chen, Heng-Yu; Gogoladze, Ilia; Hu, Shan; Li, Tianjun; Wu, Lina
2018-01-01
Giving up the solutions to the fine-tuning problems, we propose a non-supersymmetric flipped SU(5) × U(1)_X model based on the minimal particle content principle, which can be constructed from four-dimensional SO(10) models, five-dimensional orbifold SO(10) models, and local F-theory SO(10) models. To achieve gauge coupling unification, we introduce one pair of vector-like fermions, which form a complete SU(5) × U(1)_X representation. The proton lifetime is around 5 × 10^{35} years, neutrino masses and mixing can be explained via the seesaw mechanism, baryon asymmetry can be generated via leptogenesis, and the vacuum stability problem can be solved as well. In particular, we propose that the inflaton and dark matter particles can be unified into a real scalar field with Z_2 symmetry, which is not an axion and does not have a non-minimal coupling to gravity. This kind of scenario can be applied to generic scalar dark matter models. Also, we find that the vector-like particle corrections to the B_s^0 masses might be about 6.6%, while their corrections to the K^0 and B_d^0 masses are negligible.
The minimal GUT with inflaton and dark matter unification
Energy Technology Data Exchange (ETDEWEB)
Chen, Heng-Yu; Gogoladze, Ilia [University of Delaware, Department of Physics and Astronomy, Bartol Research Institute, Newark, DE (United States); Hu, Shan [Hubei University, Department of Physics, Faculty of Physics and Electronic Sciences, Wuhan (China); Li, Tianjun [Chinese Academy of Sciences, Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Beijing (China); University of Chinese Academy of Sciences, School of Physical Sciences, Beijing (China); Wu, Lina [Chinese Academy of Sciences, Key Laboratory of Theoretical Physics, Institute of Theoretical Physics, Beijing (China); University of Electronic Science and Technology of China, School of Physical Electronics, Chengdu (China)
2018-01-15
Giving up the solutions to the fine-tuning problems, we propose a non-supersymmetric flipped SU(5) x U(1){sub X} model based on the minimal particle content principle, which can be constructed from four-dimensional SO(10) models, five-dimensional orbifold SO(10) models, and local F-theory SO(10) models. To achieve gauge coupling unification, we introduce one pair of vector-like fermions, which form a complete SU(5) x U(1){sub X} representation. The proton lifetime is around 5 x 10{sup 35} years, neutrino masses and mixing can be explained via the seesaw mechanism, baryon asymmetry can be generated via leptogenesis, and the vacuum stability problem can be solved as well. In particular, we propose that the inflaton and dark matter particles can be unified into a real scalar field with Z{sub 2} symmetry, which is not an axion and does not have a non-minimal coupling to gravity. This kind of scenario can be applied to generic scalar dark matter models. Also, we find that the vector-like particle corrections to the B{sub s}{sup 0} masses might be about 6.6%, while their corrections to the K{sup 0} and B{sub d}{sup 0} masses are negligible. (orig.)
Ceramic waste forms for fuel-containing masses at Chernobyl
International Nuclear Information System (INIS)
Oversby, V.M.
1994-05-01
The fuel materials originally in the core of the Chernobyl Unit 4 reactor are now present within the Ukrytie in three major forms: (1) very fine particles of fuel dispersed as dust (about 10 tonnes), (2) fragments of the destroyed core, and (3) lavas containing fuel, cladding, and other materials. All of these materials will need to be immobilized into waste forms suitable for final disposal. We propose a ceramic waste form system that could accommodate all three waste types with a single set of processing equipment. The waste form would include the mineral zirconolite for immobilization of actinide materials (including uranium), perovskite, nepheline, spinel, and other phases as dictated by the chemistry of the lava masses. Waste loadings as high as 50% U can be achieved if pyrochlore, a close relative of zirconolite, is used as the U host. The ceramic immobilization could be achieved with low additions of inert chemicals to minimize the final disposal volume while ensuring a durable product. The sequence of processing would be to collect and immobilize the fuel dust first. This material will require minimal preprocessing and will provide experience in the handling of the fuel materials. Core fragments would be processed next, using a cryogenic crushing stage to reduce the size prior to adding ceramic additives. The lavas would be processed last, which is compatible with the likely sequence of availability of materials and with the complexity of the operations. The lavas will require more adjustment of chemical additive composition than the other streams to ensure that the desired phases are produced in the waste form
Microstructural and Mechanical Property Characterization of Shear Formed Aerospace Aluminum Alloys
Troeger, Lillianne P.; Domack, Marcia S.; Wagner, John A.
2000-01-01
Advanced manufacturing processes such as near-net-shape forming can reduce production costs and increase the reliability of launch vehicle and airframe structural components through the reduction of material scrap and part count and the minimization of joints. The current research is an investigation of the processing-microstructure-property relationships for shear formed cylinders of the Al-Cu-Li-Mg-Ag alloy 2195 for space applications and the Al-Cu-Mg-Ag alloy C415 for airframe applications. Cylinders which had undergone various amounts of shear-forming strain were studied to correlate the grain structure, texture, and mechanical properties developed during and after shear forming.
Mowery, Danielle L; South, Brett R; Christensen, Lee; Leng, Jianwei; Peltonen, Laura-Maria; Salanterä, Sanna; Suominen, Hanna; Martinez, David; Velupillai, Sumithra; Elhadad, Noémie; Savova, Guergana; Pradhan, Sameer; Chapman, Wendy W
2016-07-01
The ShARe/CLEF eHealth challenge lab aims to stimulate the development of natural language processing and information retrieval technologies that help patients understand their clinical reports. In clinical text, acronyms and abbreviations, also referred to as short forms, can be difficult for patients to understand. For one of three shared tasks in 2013 (Task 2), we generated a reference standard of clinical short forms normalized to the Unified Medical Language System. This reference standard can be used to improve patient understanding by linking annotated short forms to web sources with lay descriptions or by substituting short forms with a more simplified lay term. In this study, we 1) evaluate the accuracy of participating systems in normalizing short forms compared with a majority sense baseline approach, 2) examine the performance of participants' systems for short forms with variable majority sense distributions, and 3) report the accuracy of participating systems in normalizing concepts shared between the test set and the Consumer Health Vocabulary, a vocabulary of lay medical terms. The best systems submitted by the five participating teams performed with accuracies ranging from 43 to 72%. A majority sense baseline approach achieved the second best performance. The performance of participating systems for normalizing short forms with two or more senses ranged from 52 to 78% accuracy for low ambiguity (majority sense greater than 80%), from 23 to 57% accuracy for moderate ambiguity (majority sense between 50 and 80%), and from 2 to 45% accuracy for high ambiguity (majority sense less than 50%). With respect to the ShARe test set, 69% of short form annotations contained concept unique identifiers in common with the Consumer Health Vocabulary. For these 2594 possible annotations, the performance of participating systems ranged from 50 to 75% accuracy. Short form normalization continues
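The majority sense baseline mentioned above is easy to make concrete. Below is a minimal sketch (the short forms and concept identifiers are invented placeholders, not ShARe data): each short form is mapped to its most frequent sense in training, and accuracy is the fraction of test mentions the mapping gets right.

```python
from collections import Counter

# Majority-sense baseline for short-form normalization: always predict the
# sense (concept identifier) a short form takes most often in training.
def majority_sense_model(training_pairs):
    counts = {}
    for short_form, cui in training_pairs:
        counts.setdefault(short_form, Counter())[cui] += 1
    return {sf: c.most_common(1)[0][0] for sf, c in counts.items()}

def accuracy(model, test_pairs):
    # Unseen short forms count as misses (model.get returns None).
    hits = sum(1 for sf, cui in test_pairs if model.get(sf) == cui)
    return hits / len(test_pairs)
```

Because the baseline is right exactly as often as the majority sense occurs, its accuracy degrades predictably as ambiguity rises, which is the pattern reported above for the participating systems as well.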
Time-dependent London approach: Dissipation due to out-of-core normal excitations by moving vortices
Kogan, V. G.
2018-03-01
The dissipative currents due to normal excitations are included in the London description. The resulting time-dependent London equations are solved for a moving vortex and a moving vortex lattice. It is shown that the field distribution of a moving vortex loses its cylindrical symmetry: it experiences a contraction that is stronger in the direction of the motion than in the direction normal to the velocity v. The London contribution of normal currents to dissipation is small relative to the Bardeen-Stephen core dissipation at small velocities, but it approaches the latter at high velocities, where this contribution is no longer proportional to v^2. To minimize the London contribution to dissipation, the vortex lattice is oriented so as to have one of the unit cell vectors along the velocity. This effect is seen in experiments and is predicted within the time-dependent Ginzburg-Landau theory.
Matthew Arnold and Minimal Competency Testing.
Tuman, Myron C.
1979-01-01
Presents arguments by Robert Lowe and Matthew Arnold on the 19th century British "Payment by Results" Plan, whereby schools received funds for students who passed minimal competency tests. Emphasizes that the Victorian experience produced acrimonious teachers with low morale and encourages contemporary minimal testing advocates not to…
Minimal constrained supergravity
Directory of Open Access Journals (Sweden)
N. Cribiori
2017-01-01
Full Text Available We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.
Wang, Ji-Bo; Wang, Ming-Zheng; Ji, Ping
2012-05-01
In this article, we consider a single machine scheduling problem with a time-dependent learning effect and deteriorating jobs. By the effects of time-dependent learning and deterioration, we mean that the job processing time is defined by a function of its starting time and the total normal processing time of the jobs preceding it in the sequence. The objective is to determine an optimal schedule that minimizes the total completion time. This problem remains open for the case of -1 < a < 0, where a denotes the learning index; we show that an optimal schedule of the problem is V-shaped with respect to job normal processing times. Three heuristic algorithms utilising the V-shaped property are proposed, and computational experiments show that the last heuristic algorithm performs effectively and efficiently in obtaining near-optimal solutions.
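The V-shaped property mentioned above means the normal processing times are non-increasing up to the smallest job and non-decreasing after it. A minimal sketch of a check for this structural property (illustrative only, not one of the article's heuristics):

```python
def is_v_shaped(proc_times):
    """Return True if the sequence of normal processing times is
    V-shaped: non-increasing up to the minimum, then non-decreasing."""
    k = proc_times.index(min(proc_times))
    front = proc_times[:k + 1]   # descending leg, including the minimum
    back = proc_times[k:]        # ascending leg, including the minimum
    return all(a >= b for a, b in zip(front, front[1:])) and \
           all(a <= b for a, b in zip(back, back[1:]))

print(is_v_shaped([9, 5, 2, 4, 8]))   # True
print(is_v_shaped([9, 2, 5, 4, 8]))   # False
```

A heuristic can exploit this property by only searching over schedules of this shape rather than all permutations.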
Corporate tax minimization and stock price reactions
Blaufus, Kay; Möhlmann, Axel; Schwäbe, Alexander
2016-01-01
Tax minimization strategies may lead to significant tax savings, which could, in turn, increase firm value. However, such strategies are also associated with significant costs, such as expected penalties and planning, agency, and reputation costs. The overall impact of firms' tax minimization strategies on firm value is, therefore, unclear. To investigate whether corporate tax minimization increases firm value, we analyze the stock price reaction to news concerning corporate tax avoidance or ...
Treatability study of absorbent polymer waste form for mixed waste treatment
International Nuclear Information System (INIS)
Herrmann, S. D.; Lehto, M. A.; Stewart, N. A.; Croft, A. D.; Kern, P. W.
2000-01-01
A treatability study was performed to develop and characterize an absorbent polymer waste form for application to low-level waste (LLW) and mixed low-level waste (MLLW) aqueous wastes at Argonne National Laboratory-West (ANL-W). In this study, absorbent polymers proved effective at immobilizing aqueous liquid wastes so as to meet Land Disposal Restrictions for subsurface waste disposal. Treatment of aqueous waste with absorbent polymers provides an alternative to liquid waste solidification via high-shear mixing with clays and cements. Significant advantages of absorbent polymers over clays and cements include ease of operations and waste volume minimization. Absorbent polymers do not require high-shear mixing as clays and cements do. Granulated absorbent polymer is poured into aqueous solutions and forms a gel that passes the paint filter test as a non-liquid. Pouring rather than mixing a solidification agent not only eliminates the need for a mixing station, but also lessens personnel exposure and the potential for spread of contamination from treatment of radioactive wastes. Waste minimization is achieved because absorbent polymer use requires, and results in, significantly less mass addition and volume increase than clays and cements. Operational ease and waste minimization translate into overall cost savings for LLW and MLLW treatment
Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.
Sznitman, Sharon R; Taubman, Danielle S
2016-09-01
Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.
Widjaja, Effendi; Tan, Boon Hong; Garland, Marc
2006-03-01
Two-dimensional (2D) correlation spectroscopy has been extensively applied to analyze various vibrational spectroscopic data, especially infrared and Raman. However, when it is applied to real-world experimental data, which often contain various imperfections (such as noise interference, baseline fluctuations, and band-shifting) and highly overlapping bands, many artifacts and misleading features emerge in the synchronous and asynchronous maps, leading to difficulties with interpretation. Therefore, an approach that counters many artifacts and thereby simplifies interpretation of 2D correlation analysis is certainly useful. In the present contribution, band-target entropy minimization (BTEM) is employed as a spectral pretreatment to handle many of the artifact problems before the application of 2D correlation analysis. BTEM is employed to elucidate the pure component spectra of mixtures and their corresponding concentration profiles. Two alternate forms of analysis result. In the first, the normally v × v problem is converted to an equivalent nv × nv problem, where n represents the number of species present. In the second, the pure component spectra are transformed into simple distributions, and an equivalent and less computationally intensive nv′ × nv′ problem results (v′ < v). The approach is illustrated with an evaporation study where in situ Fourier transform infrared (FT-IR) spectroscopy is used as the analytical tool.
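The objective underlying BTEM is a Shannon-type entropy of the (suitably normalized) spectral estimate, which is low for simple, sharp spectra and high for broad, noisy ones. A minimal sketch of such an entropy measure (the full BTEM rotation search over abstract vectors is not reproduced here):

```python
import math

def spectral_entropy(spectrum):
    """Shannon-type entropy of a non-negative spectrum, the kind of
    simplicity objective minimized in entropy-minimization methods.
    The spectrum is normalized to a discrete distribution first."""
    total = sum(spectrum)
    p = [x / total for x in spectrum if x > 0]
    return -sum(pi * math.log(pi) for pi in p)

# a single sharp band has lower entropy than a broad, flat one
sharp = [0, 0, 10, 0, 0]
broad = [2, 2, 2, 2, 2]
print(spectral_entropy(sharp) < spectral_entropy(broad))  # True
```

Minimizing this kind of objective drives the reconstruction toward the simplest spectral pattern consistent with the data, which is why the recovered pure component spectra are largely free of the artifacts that plague raw 2D correlation maps.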
Basic characterization of normal multifocal electroretinogram
International Nuclear Information System (INIS)
Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E
2008-01-01
A review of the scientific literature was conducted on the novel multifocal electroretinogram technique, the cell mechanisms involved, and some of the factors that modify its results, together with its form of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important in order to create a small-scale comparative database for evaluating pathological eye tracings. All this will greatly help in early, less invasive electrodiagnosis of localized retinal lesions. (Author)
Annual Waste Minimization Summary Report
International Nuclear Information System (INIS)
Haworth, D.M.
2011-01-01
This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2010. The NNSA/NSO Pollution Prevention Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment.
Minimal free resolutions over complete intersections
Eisenbud, David
2016-01-01
This book introduces a theory of higher matrix factorizations for regular sequences and uses it to describe the minimal free resolutions of high syzygy modules over complete intersections. Such resolutions have attracted attention ever since the elegant construction of the minimal free resolution of the residue field by Tate in 1957. The theory extends the theory of matrix factorizations of a non-zero divisor, initiated by Eisenbud in 1980, which yields a description of the eventual structure of minimal free resolutions over a hypersurface ring. Matrix factorizations have had many other uses in a wide range of mathematical fields, from singularity theory to mathematical physics.
Minimal-change nephropathy and malignant thymoma.
Varsano, S; Bruderman, I; Bernheim, J L; Rathaus, M; Griffel, B
1980-05-01
A 56-year-old man had fever, precordial pain, and a mediastinal mass. The mass disappeared two months later and the patient remained asymptomatic for 2 1/2 years. At that time a full-blown nephrotic syndrome developed, with minimal-change glomerulopathy. The chest x-ray film showed the reappearance of a giant mediastinal mass. On biopsy of the mass, malignant thymoma was diagnosed. Association between minimal-change disease and Hodgkin's disease is well known, while the association with malignant thymoma has not been previously reported. The relationship between malignant thymoma and minimal-change disease is discussed, and a possible pathogenic mechanism involving cell-mediated immunity is proposed.
Minimally invasive spine surgery: Hurdles to be crossed
Directory of Open Access Journals (Sweden)
Mahesh Bijjawara
2014-01-01
MISS as a concept is noble, and all surgeons need to address and minimize surgical morbidity for better results. However, we need to be cautious and not fall prey to accepting that minimally invasive spine surgery can be done only when certain metal access systems are used. Minimally invasive spine surgery (MISS) has come a long way since the description of endoscopic discectomy in 1997 and minimally invasive TLIF (mTLIF) in 2003. Today there is credible evidence (though not level-I) that MISS has comparable results to open spine surgery, with the advantages of early postoperative recovery and decreased blood loss and infection rates. However, apart from decreasing muscle trauma and muscle dissection during multilevel open spinal instrumentation, the existing MISS technologies have contributed little to the other morbidity parameters, such as operative time, blood loss, access to decompression and atraumatic neural tissue handling. Since all these parameters contribute to the overall surgical morbidity to a greater degree than posterior muscle trauma, we as surgeons need to introspect before we accept the concept of minimally invasive spine surgery being reduced to surgeries performed with a few tubular retractors. A spine surgeon needs to constantly improve his skills and techniques so that he can minimize blood loss, minimize traumatic neural tissue handling and minimize operative time without compromising on the surgical goals. These measures actually contribute far more to decreasing morbidity than approach-related muscle damage alone. Minimally invasive spine surgery, though it has come a long way, needs to provide technical solutions that minimize all the morbidity parameters involved in spine surgery before it can replace most open spine surgeries, as has happened with laparoscopic surgery or arthroscopic surgery.
Minimal knotted polygons in cubic lattices
International Nuclear Information System (INIS)
Van Rensburg, E J Janse; Rechnitzer, A
2011-01-01
In this paper we examine numerically the properties of minimal length knotted lattice polygons in the simple cubic, face-centered cubic, and body-centered cubic lattices by sieving minimal length polygons from a data stream of a Monte Carlo algorithm, implemented as described in Aragão de Carvalho and Caracciolo (1983 Phys. Rev. B 27 1635), Aragão de Carvalho et al (1983 Nucl. Phys. B 215 209) and Berg and Foerster (1981 Phys. Lett. B 106 323). The entropy, mean writhe, and mean curvature of minimal length polygons are computed (in some cases exactly). While the minimal length and mean curvature are found to be lattice dependent, the mean writhe is found to be only weakly dependent on the lattice type. Comparison of our results to numerical results for the writhe obtained elsewhere (see Janse van Rensburg et al 1999 Contributed to Ideal Knots (Series on Knots and Everything vol 19) ed Stasiak, Katritch and Kauffman (Singapore: World Scientific), Portillo et al 2011 J. Phys. A: Math. Theor. 44 275004) shows that the mean writhe is also insensitive to the length of a knotted polygon. Thus, while these results for the mean writhe and mean absolute writhe at minimal length are not universal, they demonstrate that these values are quite close to those of long polygons regardless of the underlying lattice and length.
LLNL Waste Minimization Program Plan
International Nuclear Information System (INIS)
1990-01-01
This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The Waste Minimization Policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985. A series of informal revisions were made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs
Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K
2005-01-01
Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the
Improving the performance of minimizers and winnowing schemes.
Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl
2017-07-15
The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of their worst behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git
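The basic scheme is easy to state: slide a window of w consecutive k-mers along the sequence and keep the position of the k-mer that is minimal under some ordering. A minimal sketch, parameterizable by the ordering (this is the generic scheme, not the paper's universal-hitting-set construction):

```python
import random

def minimizers(seq, k, w, order):
    """Positions of window minimizers: for each window of w consecutive
    k-mers, select the position of the k-mer minimal under `order`
    (leftmost on ties, as min() naturally gives)."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    chosen = set()
    for i in range(len(kmers) - w + 1):
        window = kmers[i:i + w]
        j = min(range(w), key=lambda t: order(window[t]))
        chosen.add(i + j)
    return chosen

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(2000))
k, w = 7, 10

# lexicographic ordering vs. a randomized ordering (random rank per k-mer)
lex = minimizers(seq, k, w, order=lambda km: km)
rank = {km: random.random() for km in {seq[i:i + k] for i in range(len(seq) - k + 1)}}
rnd = minimizers(seq, k, w, order=lambda km: rank[km])

# density = selected positions / total k-mers; a randomized ordering
# typically yields a lower density than the lexicographic one
print(len(lex), len(rnd))
```

Since every selected position covers at most w windows and every window must contain a selected position, the density is bounded below by 1/w regardless of the ordering; the choice of ordering governs how far above that bound a scheme lands.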
Danish Sixties Avant-Garde and American Minimal Art
Directory of Open Access Journals (Sweden)
Max Ipsen
2007-05-01
Denmark is peripheral in the history of minimalism in the arts. From an international perspective, Danish artists made almost no contributions to minimalism, according to art historians. But the fact is that Danish artists made minimalist works of art, and they did it very early. Art historians tend to describe minimal art as an entirely American phenomenon: America is the centre, Europe the periphery that lagged behind the centre, imitating American art. I will try to query this view with examples from Danish minimalism. I will discuss minimalist tendencies in Danish art and literature in the 1960s, and I will examine whether one can claim that Danish artists were influenced by American minimal art.
Normalization constraint for variational bounds on fluid permeability
International Nuclear Information System (INIS)
Berryman, J.G.; Milton, G.W.
1985-01-01
A careful reexamination of the formulation of Prager's original variational principle for viscous flow through porous media has uncovered a subtle error in the normalization constraint on the trial functions. Although a certain surface integral of the true pressure field over the internal surface area always vanishes for isotropic materials, the corresponding surface integral for a given trial pressure field does not necessarily vanish but has nevertheless been previously neglected in the normalization. When this error is corrected, the form of the variational estimate is actually simpler than before and furthermore the resulting bounds have been shown to improve when the constant trial functions are used in either the two-point or three-point bounds
Willoughby, R A; Ecker, G L; McKee, S L; Riddolls, L J
1991-01-01
Mucociliary clearance rates from the trachea were determined in normal, sedated, diseased and exercised horses from scintigraphs obtained after an injection of technetium-99m sulphide colloid into the tracheal lumen. The group mean tracheal clearance rate of eight clinically normal horses during 42 trials was 2.06 +/- 0.38 cm/min. Significant between-horse differences were found (p less than 0.05). When six and seven of these horses were given xylazine and detomidine hydrochloride, respectively, mean group tracheal clearance rates dropped significantly (p less than 0.05). The decreases from each normal horse's mean tracheal clearance rate ranged from 18 to 54%. There did not appear to be a difference between the tracheal clearance rates (TCRs) of the normal horses and those with chronic respiratory disease. Postexercise evaluations were not significantly different from the pre-exercise TCRs in three clinically normal horses and three horses with chronic obstructive pulmonary disease (p greater than 0.05). This minimally invasive scintigraphic technique for determining TCRs has proved to be useful and reliable. PMID:1790485
Anorectal function and outcomes after transanal minimally invasive surgery for rectal tumors
Directory of Open Access Journals (Sweden)
Feza Y Karakayali
2015-01-01
Background: Transanal endoscopic microsurgery is a minimally invasive technique that allows full-thickness resection and suture closure of the defect for large rectal adenomas, selected low-risk rectal cancers, or small cancers in patients who have a high risk for major surgery. Our aim in this prospective study was to report our initial clinical experience with TAMIS and to evaluate its effects on postoperative anorectal function. Materials and Methods: In 10 patients treated with TAMIS for benign and malignant rectal tumors, preoperative and postoperative anorectal function was evaluated with anorectal manometry and the Cleveland Clinic Incontinence Score. Results: The mean distance of the tumors from the anal verge was 5.6 cm, and mean tumor diameter was 2.6 cm. All resection margins were tumor free. There was no difference between preoperative and 3-week postoperative anorectal manometry findings; only the mean minimum rectal sensory volume was lower at 3 weeks after surgery. The Cleveland Clinic Incontinence Score was normal in all patients except one, which resolved by 6 weeks after surgery. The mean postoperative follow-up was 28 weeks without any recurrences. Conclusion: Transanal minimally invasive surgery is a safe and effective procedure for treatment of rectal tumors and can be performed without impairing anorectal function.
Stender, Johan; Kupers, Ron; Rodell, Anders; Thibaut, Aurore; Chatelle, Camille; Bruno, Marie-Aurélie; Gejl, Michael; Bernard, Claire; Hustinx, Roland; Laureys, Steven; Gjedde, Albert
2015-01-01
The differentiation of the vegetative state/unresponsive wakefulness syndrome (VS/UWS) from the minimally conscious state (MCS) is an important clinical issue. The cerebral metabolic rate of glucose (CMRglc) declines when consciousness is lost, and may reveal the residual cognitive function of these patients. However, no quantitative comparisons of cerebral glucose metabolism in VS/UWS and MCS have yet been reported. We calculated the regional and whole-brain CMRglc of 41 patients in the states of VS/UWS (n=14), MCS (n=21) or emergence from MCS (EMCS, n=6), and of healthy volunteers (n=29). Global cortical CMRglc in VS/UWS and MCS averaged 42% and 55% of normal, respectively. Differences between VS/UWS and MCS were most pronounced in the frontoparietal cortex, at 42% and 60% of normal. In brainstem and thalamus, metabolism declined equally in the two conditions. In EMCS, metabolic rates were indistinguishable from those of MCS. Ordinal logistic regression predicted that patients are likely to emerge into MCS at CMRglc above 45% of normal. Receiver-operating characteristics showed that patients in MCS and VS/UWS can be differentiated with 82% accuracy, based on cortical metabolism. Together these results reveal a significant correlation between whole-brain energy metabolism and level of consciousness, suggesting that quantitative values of CMRglc reveal consciousness in severely brain-injured patients.
Mycoplasmas and their host: emerging and re-emerging minimal pathogens.
Citti, Christine; Blanchard, Alain
2013-04-01
Commonly known as mycoplasmas, bacteria of the class Mollicutes include the smallest and simplest life forms capable of self-replication outside of a host. Yet this minimalism hides major human and animal pathogens whose prevalence and occurrence have long been underestimated. Owing to advances in sequencing methods, large data sets have become available for a number of mycoplasma species and strains, providing new diagnostic approaches, typing strategies, and means for comprehensive studies. A broader picture is thus emerging in which mycoplasmas are successful pathogens having evolved a number of mechanisms and strategies for surviving hostile environments and adapting to new niches or hosts.
Roll forming of eco-friendly stud
Keum, Y. T.; Lee, S. Y.; Lee, T. H.; Sim, J. K.
2013-12-01
In order to manufacture an eco-friendly stud, the sheared pattern is designed by the Taguchi method and expanded by side rolls. Seven geometrical shapes of the sheared pattern are considered in the structural and thermal analyses to select the best-performing one in terms of the durability and fire resistance of the dry wall. For optimizing the size of the chosen sheared pattern, the L9 orthogonal array and the smaller-the-better characteristic of the Taguchi method are used. As the roll gap causes forming defects when the upper-and-lower roll type is adopted for expanding the sheared pattern, the side roll type is introduced. The stress and strain distributions obtained by FEM simulation of the roll-forming processes are utilized in the design of the expanding process. The expanding process by side rolls shortens the length of the expanding line and minimizes the cost of dies. Furthermore, the stud manufactured by expanding the sheared pattern of the web is eco-friendly because of the scrapless roll-forming process. In addition, compared to the conventionally roll-formed stud, the material cost is reduced by about 13.6% and the weight by about 15.5%.
Largo, Eneko; Gladue, Douglas P; Huarte, Nerea; Borca, Manuel V; Nieva, José L
2014-01-01
Viroporins are small integral membrane proteins functional in viral assembly and egress by promoting permeabilization. Blocking of viroporin function therefore constitutes a target for antiviral development. Classical swine fever virus (CSFV) protein p7 has recently been regarded as a class II viroporin. Here, we sought to establish the determinants of the CSFV p7 permeabilizing activity in a minimal model system. Assessment of an overlapping peptide library mapped the porating domain to the C-terminal hydrophobic stretch (residues 39-67). Pore-opening dependence on pH, or sensitivity to channel blockers, observed for the full protein required the inclusion of a preceding polar sequence (residues 33-38). Effects of lipid composition and structural data further support that the resulting peptide (residues 33-67) may comprise a bona fide surrogate to assay p7 activity in model membranes. Our observations imply that CSFV p7 relies on genus-specific structures and mechanisms to perform its viroporin function.
Waste minimization and pollution prevention awareness plan
International Nuclear Information System (INIS)
1991-01-01
The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with DOE's policy and guidelines concerning the need for pollution prevention. The Plan is composed of an LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs
Inflationary models with non-minimally derivative coupling
International Nuclear Information System (INIS)
Yang, Nan; Fei, Qin; Gong, Yungui; Gao, Qing
2016-01-01
We derive the general formulae for the scalar and tensor spectral tilts to the second order for the inflationary models with non-minimally derivative coupling without taking the high friction limit. The non-minimally kinetic coupling to Einstein tensor brings the energy scale in the inflationary models down to be sub-Planckian. In the high friction limit, the Lyth bound is modified with an extra suppression factor, so that the field excursion of the inflaton is sub-Planckian. The inflationary models with non-minimally derivative coupling are more consistent with observations in the high friction limit. In particular, with the help of the non-minimally derivative coupling, the quartic power law potential is consistent with the observational constraint at 95% CL. (paper)
Finding A Minimally Informative Dirichlet Prior Using Least Squares
International Nuclear Information System (INIS)
Kelly, Dana
2011-01-01
In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
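The conjugacy stated above is what makes responsiveness to sparse data a matter of total prior concentration: updating Dirichlet(α) with multinomial counts n gives Dirichlet(α + n). A minimal sketch of that update, with toy alpha-factor numbers (the least-squares construction of the minimally informative prior itself is not reproduced here):

```python
def dirichlet_update(alpha, counts):
    """Conjugate update: Dirichlet(alpha) prior + multinomial counts
    yields Dirichlet(alpha + counts)."""
    return [a + n for a, n in zip(alpha, counts)]

def mean(alpha):
    """Prior/posterior mean of a Dirichlet: alpha_i / sum(alpha)."""
    s = sum(alpha)
    return [a / s for a in alpha]

target = [0.8, 0.15, 0.05]          # illustrative alpha-factor point estimates
diffuse = [4 * t for t in target]   # total concentration 4: weakly informative
strong = [100 * t for t in target]  # total concentration 100: dominates the data

counts = [1, 0, 2]                  # sparse common-cause failure data
print(mean(dirichlet_update(diffuse, counts)))
print(mean(dirichlet_update(strong, counts)))
```

With the same prior means, the diffuse prior lets three observations pull the posterior substantially toward the data, while the strongly informative prior barely moves; the least-squares approach in the paper is a way of choosing such a low-concentration prior in a principled manner.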
Indentation stiffness does not discriminate between normal and degraded articular cartilage.
Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle
2007-08-01
Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making, and an appreciation of articular cartilage biomechanics, prompted us to hypothesise that it is difficult to define a reference stiffness that characterises normal articular cartilage. This hypothesis was tested by carrying out biomechanical indentation of articular cartilage samples characterised as visually normal or as degraded with respect to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% reduction in the stiffness of individual samples after proteoglycan depletion, they also showed that only 17% of degraded samples lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.
Minimally inconsistent reasoning in Semantic Web.
Zhang, Xiaowang
2017-01-01
Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. To address this, various paraconsistent approaches, valued for their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed to reason with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Several desirable properties are studied, which show that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed as a framework for multi-valued DL, allowing for different underlying paraconsistent semantics that differ only in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as that of (classical) description logic reasoning.
Nerve Cells Decide to Orient inside an Injectable Hydrogel with Minimal Structural Guidance.
Rose, Jonas C; Cámara-Torres, María; Rahimi, Khosrow; Köhler, Jens; Möller, Martin; De Laporte, Laura
2017-06-14
Injectable biomaterials provide the advantage of a minimally invasive application but mostly lack the structural complexity required to regenerate aligned tissues. Here, we report a new class of tissue-regenerative materials that can be injected and then form an anisotropic matrix with controlled dimensions using rod-shaped, magnetoceptive microgel objects. The microgels are doped with small quantities of superparamagnetic iron oxide nanoparticles (0.0046 vol %), allowing alignment by external magnetic fields in the millitesla range. The microgels are dispersed in a biocompatible gel precursor and, after injection and orientation, are fixed inside the matrix hydrogel. Despite the low volume concentration of the microgels (below 3%), at which the geometrical constraint for orientation is still minimal, the generated macroscopic unidirectional orientation is strongly sensed by the cells, resulting in parallel nerve extension. This finding opens a new, minimally invasive route for therapy after spinal cord injury.
Waste Minimization and Pollution Prevention Awareness Plan
International Nuclear Information System (INIS)
1992-01-01
The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) and other legal requirements that are discussed in Section C, below. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as suggested by DOE Order 5400.1. The intent of this plan is to respond to and comply with the Department's policy and guidelines concerning the need for pollution prevention. The Plan is composed of an LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Directorate-, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Directorates, Programs and Departments. Several Directorates have been reorganized, necessitating changes in the Directorate plans that were published in 1991.
Non-minimally coupled tachyon and inflation
International Nuclear Information System (INIS)
Piao Yunsong; Huang Qingguo; Zhang Xinmin; Zhang Yuanzhong
2003-01-01
In this Letter, we consider a model of a tachyon with a non-minimal coupling to gravity and study its cosmological effects. Regarding inflation, we show that only for a specific coupling of the tachyon to gravity does this model satisfy observations and solve various problems that exist in the single- and multi-tachyon inflation models. However, noting that in string theory the coupling coefficient of the tachyon to gravity is of order g_s, which in general is very small, we can hardly expect that the non-minimal coupling of the tachyon to gravity could provide a reasonable tachyon inflation scenario. Our work may be a meaningful attempt at exploring the cosmological effects of a tachyon non-minimally coupled to gravity.
One-dimensional Gromov minimal filling problem
International Nuclear Information System (INIS)
Ivanov, Alexandr O; Tuzhilin, Alexey A
2012-01-01
The paper is devoted to a new branch in the theory of one-dimensional variational problems with branching extremals, the investigation of one-dimensional minimal fillings introduced by the authors. On the one hand, this problem is a one-dimensional version of a generalization of Gromov's minimal fillings problem to the case of stratified manifolds. On the other hand, this problem is interesting in itself and also can be considered as a generalization of another classical problem, the Steiner problem on the construction of a shortest network connecting a given set of terminals. Besides the statement of the problem, we discuss several properties of the minimal fillings and state several conjectures. Bibliography: 38 titles.
High energy density processing of a free form nickel-alumina nanocomposite
Viswanathan, V; Agarwal, A; Ocelik, V; De Hosson, J T M; Sobczak, N; Seal, S
The development of free-form bulk nickel-reinforced alumina matrix nanocomposites using Air Plasma Spray (APS) and laser processing is presented. The process consumes less time and requires minimal further machining, and is therefore cost effective. The relative differences in using APS over laser
Minimization of rad waste production in NPP Dukovany
International Nuclear Information System (INIS)
Kulovany, J.
2001-01-01
A whole range of measures has been taken at the power plant in connection with the minimization of radioactive waste, and these measures are expected to achieve the set goals. During the introduction of the minimization measures, procedures that prevent possible endangerment of the operation take precedence. Economically undemanding procedures that bring about minimization in an effective way are implemented next. In accordance with the EMS principles, it can be expected that minimizing measures will also be implemented in areas where their greatest contribution will be to the environment.
Iterative closest normal point for 3D face recognition.
Mohammadzade, Hoda; Hatzinakos, Dimitrios
2013-02-01
The common approach for 3D face recognition is to register a probe face to each of the gallery faces and then calculate the sum of the distances between their points. This approach is computationally expensive and sensitive to facial expression variation. In this paper, we introduce the iterative closest normal point method for finding the corresponding points between a generic reference face and every input face. The proposed correspondence finding method samples a set of points for each face, denoted as the closest normal points. These points are effectively aligned across all faces, enabling effective application of discriminant analysis methods for 3D face recognition. As a result, the expression variation problem is addressed by minimizing the within-class variability of the face samples while maximizing the between-class variability. As an important conclusion, we show that the surface normal vectors of the face at the sampled points contain more discriminatory information than the coordinates of the points. We have performed comprehensive experiments on the Face Recognition Grand Challenge database, which is presently the largest available 3D face database. We have achieved verification rates of 99.6 and 99.2 percent at a false acceptance rate of 0.1 percent for the all versus all and ROC III experiments, respectively, which, to the best of our knowledge, correspond to error rates seven and four times lower, respectively, than the best existing methods on this database.
Sensitivity to ultraviolet radiation in a dominantly inherited form of xeroderma pigmentosum
International Nuclear Information System (INIS)
Imray, F.P.; Relf, W.; Ramsay, R.G.; Kidson, C.; Hockey, A.
1986-01-01
An Australian family is described in which a mild form of xeroderma pigmentosum (XP) is inherited as an autosomal dominant trait. Studies of lymphoblastoid cells and fibroblasts from affected persons demonstrated sensitivity to ultraviolet (UV) light, as judged by diminished clonogenicity and higher frequencies of UV-induced chromosome aberrations compared to normal controls. After UV irradiation of dominant XP cells, replicative DNA synthesis was depressed to a greater extent than normal, and the level of UV-induced DNA repair synthesis was lower than that in normal cells. The level of sister chromatid exchanges and the numbers of 6-thioguanine-resistant mutants induced by UV irradiation were equal to those found in normal controls. Although two subjects in the family had skin cancers, this dominant form of XP is not apparently associated with a high risk of, or large numbers of, skin cancers in affected persons. (author)
Directory of Open Access Journals (Sweden)
M.M. das Neves
2000-03-01
Full Text Available Chronic pancreatitis (CP) may evolve with low insulin levels and develop the clinical picture of diabetes mellitus. Low serum levels of insulin and C-peptide after stimulus have also been described in asymptomatic alcoholics, even with normal glycemic curves. It is known that chronic alcoholism is the main etiological factor of CP and hepatic diseases, and that the insulin produced by the pancreas is metabolized mainly by the liver. High levels of peripheral insulin are described in hepatic cirrhosis, due to decreased hepatic metabolization alone or associated with increased peripheral resistance. AIM: To evaluate serum insulin and glucose levels after stimulus with intravenous glucose in alcoholics with minimal hepatic lesions. METHODS: In 8 alcoholic patients with minimal hepatic lesions, characterized by hepatic biopsy, and 26 non-alcoholic healthy controls, serum glucose and insulin (RIA) levels were studied at baseline and at 1, 3, 5, and 10 minutes after stimulus with intravenous glucose (0.5 g/kg). RESULTS: The insulin means at 1 and 3 minutes and the total integrated response after stimulus were lower (p < 0.05) in the alcoholic group than in controls, even with normal glucose curves. CONCLUSION: Alcoholics with minimal hepatic lesions showed low serum insulin levels after glucose stimulus, similar to former
Systems biology perspectives on minimal and simpler cells.
Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel
2014-09-01
The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
The Effects of Minimal Length, Maximal Momentum, and Minimal Momentum in Entropic Force
Directory of Open Access Journals (Sweden)
Zhong-Wen Feng
2016-01-01
Full Text Available The modified entropic force law is studied by using a new kind of generalized uncertainty principle which contains a minimal length, a minimal momentum, and a maximal momentum. Firstly, the quantum corrections to the thermodynamics of a black hole are investigated. Then, according to Verlinde's theory, the generalized uncertainty principle (GUP) corrected entropic force is obtained. The result shows that the GUP-corrected entropic force is related not only to the properties of the black holes but also to the Planck length and the dimensionless constants α_0 and β_0. Moreover, based on the GUP-corrected entropic force, we also derive the modified Einstein field equation (EFE) and the modified Friedmann equation.
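For orientation, one representative commutator of this kind, drawn from the broader GUP literature (an illustrative assumption on our part; the exact deformation used in the paper may differ), is:

```latex
% A GUP admitting a minimal length and a maximal momentum:
[x, p] = i\hbar\left(1 - \alpha p + 2\alpha^{2} p^{2}\right),
\qquad \Delta x_{\min} \sim \hbar\alpha, \qquad p_{\max} \sim \frac{1}{\alpha},
```

while an extended-uncertainty term proportional to the position, as in [x, p] = ih(1 + beta x^2), additionally induces a nonzero minimal momentum uncertainty of order h*sqrt(beta).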
Tan, Charles; Sidhu, Stan; Sywak, Mark; Delbridge, Leigh
2009-05-01
Both surgical excision and radioiodine ablation are effective modalities in the management of hyperfunctioning thyroid nodules. Minimally invasive thyroid surgery (MITS) using the lateral mini-incision approach has previously been demonstrated to be a safe and effective technique for thyroid lobectomy. As such MITS may offer advantages as a surgical approach to hyperfunctioning thyroid nodules without the need for a long cervical incision or extensive dissection associated with formal open hemithyroidectomy. The aim of the present study was to assess the safety and efficacy of MITS for the treatment of hyperfunctioning thyroid nodules. This is a retrospective case study. Data were obtained from the University of Sydney Endocrine Surgical Unit Database from 2002 to 2007. There were 86 cases of hyperfunctioning thyroid nodules surgically removed during the study period, of which 10 (12%) were managed using the MITS approach. The ipsilateral recurrent laryngeal nerve was identified and preserved in all cases with no incidence of temporary or permanent nerve palsy. The external branch of the superior laryngeal nerve was visualized and preserved in eight cases (80%). There were no cases of postoperative bleeding. There was one clinically significant follicular thyroid carcinoma in the series (10%). In nine of 10 cases (90%) normalization of thyroid function followed surgery. MITS is a safe and effective procedure, achieving the benefits of a minimally invasive procedure with minimal morbidity. As such it now presents an attractive alternative to radioiodine ablation for the management of small hyperfunctioning thyroid nodules.
DEFF Research Database (Denmark)
David, Alexandre; Håkansson, John; G. Larsen, Kim
In this paper we present an algorithm to compute DBM subtractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundancy. The subtraction is one of the few operations that result in a non-convex zone, and thus requires splitting. It is of prime importance to reduce...
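To see why subtraction forces splitting at all, here is a minimal one-clock analogy (our own simplification, not the paper's algorithm, which operates on n-clock difference bound matrices):

```python
# Hedged 1-D analogy for DBM subtraction: a single-clock zone is an interval
# [lo, hi]; subtracting zone B from zone A yields at most two disjoint pieces.
# These pieces are the 1-D counterpart of the "splits" that the full DBM
# subtraction algorithm produces, and seeks to minimize, in n dimensions.

def interval_subtract(a, b):
    """Return A \\ B as a list of disjoint, non-empty intervals."""
    (alo, ahi), (blo, bhi) = a, b
    out = []
    if blo > alo:                      # part of A below B survives
        out.append((alo, min(ahi, blo)))
    if bhi < ahi:                      # part of A above B survives
        out.append((max(alo, bhi), ahi))
    return [(lo, hi) for lo, hi in out if lo < hi]   # drop empty pieces

pieces = interval_subtract((0.0, 5.0), (2.0, 3.0))   # two disjoint pieces
```

In n dimensions the analogous construction is applied per constraint of the subtrahend DBM, and the ordering of those constraints determines how many disjoint result DBMs appear, which is exactly the quantity the paper's algorithm minimizes.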
Abelian groups with a minimal generating set | Ruzicka ...
African Journals Online (AJOL)
We study the existence of minimal generating sets in Abelian groups. We prove that Abelian groups with minimal generating sets are not closed under quotients, nor under subgroups, nor under infinite products. We give necessary and sufficient conditions for existence of a minimal generating set providing that the Abelian ...
International Nuclear Information System (INIS)
Band, V.; Zajchowski, D.; Kulesa, V.; Sager, R.
1990-01-01
Human papilloma virus (HPV) types 16 and 18 are most commonly associated with cervical carcinoma in patients and induce immortalization of human keratinocytes in culture. HPV has not been associated with breast cancer. This report describes the immortalization of normal human mammary epithelial cells (76N) by plasmid pHPV18 or pHPV16, each containing the linearized viral genome. Transfectants were grown continuously for more than 60 passages, whereas 76N cells senesce after 18-20 passages. The transfectants also differ from 76N cells in cloning in a completely defined medium called D2 and in growing in a minimally supplemented defined medium (D3) containing epidermal growth factor. All transfectants tested contain integrated HPV DNA, express HPV RNA, and produce HPV E7 protein. HPV transfectants do not form tumors in a nude mouse assay. It is concluded that products of the HPV genome induce immortalization of human breast epithelial cells and reduce their growth factor requirements. This result raises the possibility that HPV might be involved in breast cancer. Furthermore, other tissue-specific primary epithelial cells that are presently difficult to grow and investigate may also be immortalized by HPV.
ASSESSING RADIATION PRESSURE AS A FEEDBACK MECHANISM IN STAR-FORMING GALAXIES
International Nuclear Information System (INIS)
Andrews, Brett H.; Thompson, Todd A.
2011-01-01
Radiation pressure from the absorption and scattering of starlight by dust grains may be an important feedback mechanism in regulating star-forming galaxies. We compile data from the literature on star clusters, star-forming subregions, normal star-forming galaxies, and starbursts to assess the importance of radiation pressure on dust as a feedback mechanism, by comparing the luminosity and flux of these systems to their dust Eddington limit. This exercise motivates a novel interpretation of the Schmidt law, the L_IR-L'_CO correlation, and the L_IR-L'_HCN correlation. In particular, the linear L_IR-L'_HCN correlation is a natural prediction of radiation pressure regulated star formation. Overall, we find that the Eddington limit sets a hard upper bound to the luminosity of any star-forming region. Importantly, however, many normal star-forming galaxies have luminosities significantly below the Eddington limit. We explore several explanations for this discrepancy, especially the role of 'intermittency' in normal spirals: the tendency for only a small number of subregions within a galaxy to be actively forming stars at any moment because of the time dependence of the feedback process and the luminosity evolution of the stellar population. If radiation pressure regulates star formation in dense gas, then the gas depletion timescale is 6 Myr, in good agreement with observations of the densest starbursts. Finally, we highlight the importance of observational uncertainties, namely, the dust-to-gas ratio and the CO-to-H_2 and HCN-to-H_2 conversion factors, that must be understood before a definitive assessment of radiation pressure as a feedback mechanism in star-forming galaxies.
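For scale, the dust Eddington limit invoked above can be anchored by the standard point-source form (quoted here for orientation only; the paper compares fluxes using geometry-dependent, flux-based versions of the limit):

```latex
% Eddington luminosity for radiation pressure on dust with opacity \kappa_{d}:
L_{\mathrm{Edd}} = \frac{4\pi G M c}{\kappa_{d}}
```

Because the dust opacity per gram of gas far exceeds the Thomson opacity, this limit lies orders of magnitude below the classical electron-scattering Eddington luminosity for the same enclosed mass.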
Effect of food service form on eating rate: meal served in a separated form might lower eating rate.
Suh, Hyung Joo; Jung, Eun Young
2016-01-01
In this study, we investigated the association between food form (mixed vs separated) and eating rate. The experiment used a within-subjects design (n=29, young healthy women with normal weight). Test meals (white rice and side dishes) with the same content and volume were served at lunch in a mixed or separated form. The form in which the food was served had significant effects on consumption volume and eating rate; subjects ate significantly more (p<0.05) when a test meal was served as a mixed form (285 g, 575 kcal) compared to a separated form (244 g, 492 kcal). Moreover, subjects also ate significantly faster (p<0.05) when the test meal was served as a mixed form (22.4 g/min) as compared to a separated form (16.2 g/min). Despite consuming more when the test meal was served as a mixed form than when served as a separated form, the subjects did not feel significantly fuller. In conclusion, we confirmed that meals served in a separated form might lower the eating rate and, moreover, slower eating might be associated with less energy intake, without compromising satiety.
Alternative Forms of the Rey Auditory Verbal Learning Test: A Review
Directory of Open Access Journals (Sweden)
Keith A. Hawkins
2004-01-01
Full Text Available Practice effects in memory testing complicate the interpretation of score changes over repeated testings, particularly in clinical applications. Consequently, several alternative forms of the Auditory Verbal Learning Test (AVLT) have been developed. Studies of these typically indicate that the forms examined are equivalent. However, the implication that the forms in the literature are interchangeable must be tempered by several caveats. Few studies of equivalence have been undertaken; most are restricted to the comparison of single pairs of forms, and the pairings vary across studies. These limitations are exacerbated by the minimal overlap across studies in the variables reported, or in the analyses of equivalence undertaken. The data generated by these studies are nonetheless valuable, as significant practice effects result from serial use of the same form. The available data on alternative AVLT forms are summarized, and recommendations regarding form development and the determination of form equivalence are offered.
Restructuring of microparticles in nuclear ceramic materials. Part III. Form distribution
International Nuclear Information System (INIS)
Lameiras, F.S.
1991-01-01
According to the present model, the modification of the microparticle form, tending to an equiaxial one, is a way to decrease the interface energy of a set of microparticles. If the microparticles are dispersed, they tend toward a spherical form. If they form aggregates (grains), the interface energy is stored in the grain boundaries, triple lines and quadruple points. A mean topological structure combining two kinds of nearly equiaxed polyhedra is proposed for aggregates of microparticles in order to minimize the surface of the grain boundaries, the length of the triple lines and the number of the quadruple points. As the restructuring evolves, the average grain form tends toward that of this polyhedral structure. (author)
Transfer closed and transfer open multimaps in minimal spaces
International Nuclear Information System (INIS)
Alimohammady, M.; Roohi, M.; Delavar, M.R.
2009-01-01
This paper is devoted to introduce the concepts of transfer closed and transfer open multimaps in minimal spaces. Also, some characterizations of them are considered. Further, the notion of minimal local intersection property will be introduced and characterized. Moreover, some maximal element theorems via minimal transfer closed multimaps and minimal local intersection property are given.
New method for computing ideal MHD normal modes in axisymmetric toroidal geometry
International Nuclear Information System (INIS)
Wysocki, F.; Grimm, R.C.
1984-11-01
Analytic elimination of the two magnetic surface components of the displacement vector permits the normal mode ideal MHD equations to be reduced to a scalar form. A Galerkin procedure, similar to that used in the PEST codes, is implemented to determine the normal modes computationally. The method retains the efficient stability capabilities of the PEST 2 energy principle code, while allowing computation of the normal mode frequencies and eigenfunctions, if desired. The procedure is illustrated by comparison with earlier versions of PEST and by application to tilting modes in spheromaks, and to stable discrete Alfven waves in tokamak geometry.
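The Galerkin procedure can be illustrated on a toy scalar eigenproblem (our own example, not the PEST equations): discretizing -u'' = omega^2 u on [0,1] with linear finite elements reduces the normal-mode computation to a generalized matrix eigenproblem, just as the scalar MHD form above does.

```python
# Hedged illustration of a Galerkin normal-mode computation on the toy problem
# -u'' = omega^2 u, u(0) = u(1) = 0, using linear finite elements.
# Exact frequencies are omega_k = k*pi; the discrete ones should approach them.
import numpy as np

n = 80                     # number of elements
h = 1.0 / n
m = n - 1                  # interior nodes (Dirichlet conditions eliminate ends)
K = np.zeros((m, m))       # stiffness matrix: integrals of phi_i' * phi_j'
M = np.zeros((m, m))       # mass matrix: integrals of phi_i * phi_j
for i in range(m):
    K[i, i] = 2.0 / h
    M[i, i] = 4.0 * h / 6.0
    if i + 1 < m:
        K[i, i + 1] = K[i + 1, i] = -1.0 / h
        M[i, i + 1] = M[i + 1, i] = h / 6.0

# Generalized eigenproblem K v = omega^2 M v, solved via M^{-1} K.
eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
omega = np.sqrt(np.sort(eigvals.real))   # discrete normal-mode frequencies
```

The eigenvectors play the role of the normal-mode eigenfunctions that the abstract says become available alongside the frequencies.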
Improved polyphase ceramic form for high-level defense nuclear waste
International Nuclear Information System (INIS)
Harker, A.B.; Morgan, P.E.D.; Clarke, D.R.; Flintoff, J.J.; Shaw, T.M.
1983-01-01
An improved ceramic nuclear waste form and fabrication process have been developed using simulated Savannah River Plant defense high-level waste compositions. The waste form provides flexibility with respect to processing conditions while exhibiting greater resistance to groundwater leaching than other currently proposed forms. The ceramic, consolidated by hot isostatic pressing at 1040 °C and 10,000 psi, is composed of six major phases: nepheline, zirconolite, a murataite-type cubic phase, magnetite-type spinel, a magnetoplumbite solid solution, and perovskite. The waste form provides multiple crystal lattice sites for the waste elements, minimizes amorphous intergranular material, and can accommodate waste loadings in excess of 60 wt %. The fabrication of the ceramic can be accomplished with existing manufacturing technology and eliminates the effects of radionuclide volatilization and off-gas-induced corrosion experienced with the molten processes for vitreous form production.
Meissner effect in diffusive normal metal/d-wave superconductor junctions
Yokoyama, Takehito; Tanaka, Yukio; Golubov, Alexandre Avraamovitch; Inoue, Jun-ichiro; Asano, Yasuhiro
2005-01-01
The Meissner effect in diffusive normal metal/insulator/d-wave superconductor junctions is studied theoretically in the framework of the Usadel equation under the generalized boundary condition. The effect of midgap Andreev resonant states (MARS) formed at the interface of d-wave superconductor is
Correlated random sampling for multivariate normal and log-normal distributions
International Nuclear Information System (INIS)
Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.
2012-01-01
A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
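The general idea can be sketched as follows (our own implementation of the standard Cholesky approach, not necessarily the authors' method): draw correlated normals via a Cholesky factor of the target correlation matrix, then exponentiate the components that should be log-normal.

```python
# Hedged sketch: correlated sampling for mixed normal / log-normal variables.
# Correlate independent standard normals with the Cholesky factor of the target
# correlation matrix, then exponentiate the log-normal components.
import numpy as np

rng = np.random.default_rng(0)
target_corr = np.array([[1.0, 0.7],
                        [0.7, 1.0]])
L = np.linalg.cholesky(target_corr)      # lower-triangular factor, L @ L.T = corr
z = rng.standard_normal((200_000, 2))    # independent standard normals
x = z @ L.T                              # correlated normals, corr approx 0.7
y = np.column_stack([x[:, 0], np.exp(x[:, 1])])  # second component log-normal
```

Note that exponentiation distorts the correlation coefficient of the resulting pair, so hitting a requested correlation between log-normal variables requires pre-adjusting the normal-space correlation; achieving that "to any requested accuracy", as the abstract puts it, is the nontrivial part of the method.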
LENUS (Irish Health Repository)
Boyle, E
2008-11-01
Laparoscopic surgery for inflammatory bowel disease (IBD) is technically demanding but can offer improved short-term outcomes. The introduction of minimally invasive surgery (MIS) as the default operative approach for IBD, however, may have inherent learning curve-associated disadvantages. We hypothesise that the establishment of MIS as the standard operative approach does not increase patient morbidity as assessed in the initial period of its introduction into a specialised unit, and that it confers earlier postoperative gastrointestinal recovery and reduced hospitalisation compared with conventional open resection.
Minimal covariant observables identifying all pure states
Energy Technology Data Exchange (ETDEWEB)
Carmeli, Claudio, E-mail: claudio.carmeli@gmail.com [D.I.M.E., Università di Genova, Via Cadorna 2, I-17100 Savona (Italy); I.N.F.N., Sezione di Genova, Via Dodecaneso 33, I-16146 Genova (Italy); Heinosaari, Teiko, E-mail: teiko.heinosaari@utu.fi [Turku Centre for Quantum Physics, Department of Physics and Astronomy, University of Turku (Finland); Toigo, Alessandro, E-mail: alessandro.toigo@polimi.it [Dipartimento di Matematica, Politecnico di Milano, Piazza Leonardo da Vinci 32, I-20133 Milano (Italy); I.N.F.N., Sezione di Milano, Via Celoria 16, I-20133 Milano (Italy)
2013-09-02
It has been recently shown by Heinosaari, Mazzarella and Wolf (2013) [1] that an observable that identifies all pure states of a d-dimensional quantum system has minimally 4d−4 outcomes or slightly less (the exact number depending on d). However, no simple construction of this type of minimal observable is known. We investigate covariant observables that identify all pure states and have the minimal number of outcomes. It is shown that the existence of this kind of observable depends on the dimension of the Hilbert space.
Graphical approach for multiple valued logic minimization
Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.
1999-03-01
Multiple valued logic (MVL) is sought for designing highly complex, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system depends on optimization of cost, which directly affects the optical setup. We propose an MVL minimization technique based on graphical visualization, such as a Karnaugh map. The proposed method is used to solve signed-digit binary and trinary logic minimization problems. The usefulness of the technique is demonstrated for the optical implementation of MVL circuits.
Harm minimization among teenage drinkers
DEFF Research Database (Denmark)
Jørgensen, Morten Hulvej; Curtis, Tine; Christensen, Pia Haudrup
2007-01-01
AIM: To examine strategies of harm minimization employed by teenage drinkers. DESIGN, SETTING AND PARTICIPANTS: Two periods of ethnographic fieldwork were conducted in a rural Danish community of approximately 2000 inhabitants. The fieldwork included 50 days of participant observation among 13....... In regulating the social context of drinking they relied on their personal experiences more than on formalized knowledge about alcohol and harm, which they had learned from prevention campaigns and educational programmes. CONCLUSIONS: In this study we found that teenagers may help each other to minimize alcohol...
Minimally invasive hysterectomy in coatis (Nasua nasua)
Directory of Open Access Journals (Sweden)
Bruno W. Minto
Full Text Available ABSTRACT: Some wildlife species, such as coatis, adapt readily to adverse conditions such as fragmented urban forests, which are increasingly common worldwide. The growing number of these mesopredators causes drastic changes in the communities of smaller predators, interferes with the reproductive success of trees, and creates a bridge between domestic and wild areas, favoring the transmission of zoonoses and increasing the occurrence of attacks on animals and people. This report describes minimally invasive hysterectomy in two individuals of the species Nasua nasua, accomplished with the hook technique commonly used in the castration of dogs and cats. The small incision and rapid healing of the incised tissues are fundamental in wildlife management, since postoperative care is limited by the behavior of these animals. The technique proved effective and can greatly reduce the morbidity of this procedure in coatis.
Minimal entropy approximation for cellular automata
International Nuclear Information System (INIS)
Fukś, Henryk
2014-01-01
We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)
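The density response curve cited for elementary CA rule 26 can be probed empirically. The sketch below is not the local structure or minimal entropy machinery itself, only a direct simulation of rule 26 on a ring with an illustrative helper for measuring density after many updates:

```python
import numpy as np

def step_rule(state, rule=26):
    """One synchronous update of an elementary CA on a ring (Wolfram numbering)."""
    l = np.roll(state, 1)
    r = np.roll(state, -1)
    idx = (l << 2) | (state << 1) | r        # neighborhood code 0..7
    table = (rule >> np.arange(8)) & 1       # rule bits, LSB = neighborhood 000
    return table[idx]

def density_response(p0, size=10000, steps=100, seed=1):
    """Empirical density of ones after `steps` iterations from initial density p0."""
    rng = np.random.default_rng(seed)
    state = (rng.random(size) < p0).astype(np.int64)
    for _ in range(steps):
        state = step_rule(state)
    return state.mean()
```

Sweeping `p0` over [0, 1] yields the empirical density response curve against which approximations such as minimal entropy maps can be compared.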
A minimal path searching approach for active shape model (ASM)-based segmentation of the lung
Guo, Shengwen; Fei, Baowei
2009-02-01
We are developing a minimal path searching method for active shape model (ASM)-based segmentation for detection of lung boundaries on digital radiographs. With the conventional ASM method, the position and shape parameters of the model points are iteratively refined and the target points are updated by the least Mahalanobis distance criterion. We propose an improved searching strategy that extends the searching points in a fan-shape region instead of along the normal direction. A minimal path (MP) deformable model is applied to drive the searching procedure. A statistical shape prior model is incorporated into the segmentation. In order to keep the smoothness of the shape, a smooth constraint is employed to the deformable model. To quantitatively assess the ASM-MP segmentation, we compare the automatic segmentation with manual segmentation for 72 lung digitized radiographs. The distance error between the ASM-MP and manual segmentation is 1.75 ± 0.33 pixels, while the error is 1.99 ± 0.45 pixels for the ASM. Our results demonstrate that our ASM-MP method can accurately segment the lung on digital radiographs.
A Minimal Path Searching Approach for Active Shape Model (ASM)-based Segmentation of the Lung.
Guo, Shengwen; Fei, Baowei
2009-03-27
We are developing a minimal path searching method for active shape model (ASM)-based segmentation for detection of lung boundaries on digital radiographs. With the conventional ASM method, the position and shape parameters of the model points are iteratively refined and the target points are updated by the least Mahalanobis distance criterion. We propose an improved searching strategy that extends the searching points in a fan-shape region instead of along the normal direction. A minimal path (MP) deformable model is applied to drive the searching procedure. A statistical shape prior model is incorporated into the segmentation. In order to keep the smoothness of the shape, a smooth constraint is employed to the deformable model. To quantitatively assess the ASM-MP segmentation, we compare the automatic segmentation with manual segmentation for 72 lung digitized radiographs. The distance error between the ASM-MP and manual segmentation is 1.75 ± 0.33 pixels, while the error is 1.99 ± 0.45 pixels for the ASM. Our results demonstrate that our ASM-MP method can accurately segment the lung on digital radiographs.
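The abstract does not specify how the minimal path is computed; as a generic illustration, a minimal-cost path on a 2-D cost grid can be found with Dijkstra's algorithm. The grid, 4-connectivity, and function name below are assumptions for exposition, not the authors' implementation:

```python
import heapq

def minimal_path(cost, start, goal):
    """Dijkstra minimal-cost path on a 2-D cost grid (4-connected).

    A generic stand-in for the minimal-path search that drives a
    deformable model; returns (path, total_cost) including both endpoints.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue                          # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:                      # walk predecessors back to start
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]
```

In an ASM-style setting the cost image would typically be derived from edge strength, so the minimal path follows the boundary of interest.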
PIRM wastes: permanent isolation in rock-forming minerals
International Nuclear Information System (INIS)
Smyth, J.R.; Vidale, R.J.; Charles, R.W.
1977-01-01
The most practical system for permanent isolation of radioactive wastes in granitic and pelitic environments may be one which specifically tailors the waste form to the environment. This is true because if recrystallization of the waste form takes place within the half-lives of the hazardous radionuclides, it is likely to be the rate-controlling step for release of these nuclides to the ground-water system. The object of the proposed waste-form research at Los Alamos Scientific Laboratory (LASL) is to define a phase assemblage which will minimize chemical reaction with natural fluids in a granitic or pelitic environment. All natural granites contain trace amounts of all fission product elements (except Tc) and many contain minor amounts of these elements as major components of certain accessory phases. Observation of the geochemistry of fission-product elements has led to the identification of the natural minerals as target phases for research. A proposal is made to experimentally determine the amounts of fission product elements which can stably be incorporated into the phases listed below and to determine the leachability of the assemblage thus produced using fluids typical of the proposed environments at the Nevada Test Site. This approach to waste isolation satisfies the following requirements: (1) It minimizes chemical reaction with the environment (i.e., recrystallization), which is likely to be the rate-controlling step for release of radionuclides to groundwater; (2) Waste loading (hence temperature) can be easily varied by dilution with material mined from the disposal site; (3) No physical container is required; (4) No maintenance is required (permanent); (5) The environment acts as a containment buffer. It is proposed that such wastes be termed PIRM wastes, for Permanent Isolation in Rock-forming Minerals.
Solid forms for Savannah River Plant radioactive wastes
International Nuclear Information System (INIS)
Wallace, R.M.; Hale, W.H.; Bradley, R.F.; Hull, H.L.; Kelley, J.A.; Stone, J.A.; Thompson, G.H.
1976-01-01
Methods are being developed to immobilize Savannah River Plant wastes in solid forms such as cement, asphalt, or glass. 137Cs and 90Sr are the major biological hazards and heat producers in the alkaline wastes produced at SRP. In the conceptual process being studied, 137Cs removed from alkaline supernates, together with insoluble sludges that contain 90Sr, will be incorporated into solid forms of high integrity and low volume suitable for storage in a retrievable surface storage facility for about 100 years, and for eventual shipment to an off-site repository. Mineralization of 137Cs, or its fixation on zeolite prior to incorporation into solid forms, is also being studied. Economic analyses to reduce costs and fault-tree analyses to minimize risks are being conducted. Methods are being studied for removal of sludge from (and final decontamination of) waste tanks.
Minimal models of multidimensional computations.
Directory of Open Access Journals (Sweden)
Jeffrey D Fitzgerald
2011-03-01
Full Text Available The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
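The second-order logistic response described in the abstract above can be written compactly. This sketch shows only the functional form P(spike = 1 | s) = sigmoid(a + b·s + sᵀCs); the parameters a, b, C are assumed to have been fitted elsewhere to the moment constraints, which is the part of the method not reproduced here:

```python
import numpy as np

def second_order_logistic(s, a, b, C):
    """Second-order maximum-noise-entropy model for a binary output.

    P(spike = 1 | stimulus s) = sigmoid(a + b . s + s^T C s).
    Works for a single stimulus vector or a batch (leading axes).
    """
    q = a + s @ b + np.einsum('...i,ij,...j->...', s, C, s)
    return 1.0 / (1.0 + np.exp(-q))
```

With C = 0 this reduces to the first-order (linear-logistic) model, so comparing the two directly tests whether second-order structure is needed, as the study does for retina and thalamus.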
Minimal Invasive Urologic Surgery and Postoperative Ileus
Directory of Open Access Journals (Sweden)
Fouad Aoun
2015-07-01
Full Text Available Postoperative ileus (POI) is the most common cause of prolonged length of hospital stay (LOS) and the associated healthcare costs. The advent of minimally invasive techniques was a major breakthrough in the urologic landscape, with great potential to progress in the future. In the field of gastrointestinal surgery, several studies have reported lower incidence rates of POI following minimally invasive surgery compared to conventional open procedures. In contrast, little is known about the effect of the minimally invasive approach on the recovery of bowel motility after urologic surgery. We performed an overview of the potential benefit of the minimally invasive approach on POI for urologic procedures. The mechanisms and risk factors responsible for the onset of POI are discussed, with emphasis on the advantages of the minimally invasive approach. In the urologic field, POI is the main complication following radical cystectomy, but it is rarely of clinical significance for other minimally invasive interventions. Laparoscopic or robot-assisted laparoscopic techniques, when studied individually, may on their own reduce the duration of POI or prevent its onset in a subset of procedures. Reports on the influence of age and urinary diversion type on postoperative ileus are contradictory in the literature. There is some evidence suggesting that BMI, blood loss, urinary extravasation, occurrence of a major complication, bowel resection, operative time and transperitoneal approach are independent risk factors for POI. Treatment of POI remains elusive. One of the most important and effective management strategies for patients undergoing radical cystectomy has been the development and use of enhanced recovery programs. An optimal strategy to shorten the duration of POI should incorporate the minimally invasive approach, when appropriate, into multimodal fast-track programs designed to reduce POI and shorten LOS.
International Nuclear Information System (INIS)
Mahan, G.D.
1992-01-01
The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the groundwork for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors.
Minimally conscious state or cortically mediated state?
Naccache, Lionel
2018-04-01
Durable impairments of consciousness are currently classified in three main neurological categories: comatose state, vegetative state (also recently coined unresponsive wakefulness syndrome) and minimally conscious state. While the introduction of minimally conscious state, in 2002, was a major progress to help clinicians recognize complex non-reflexive behaviours in the absence of functional communication, it raises several problems. The most important issue related to minimally conscious state lies in its criteria: while the behavioural definition of minimally conscious state lacks any direct evidence of the patient's conscious content or conscious state, it includes the adjective 'conscious'. I discuss this major problem in this review and propose a novel interpretation of minimally conscious state: its criteria do not inform us about the potential residual consciousness of patients, but they do inform us with certainty about the presence of a cortically mediated state. Based on this constructive criticism review, I suggest three proposals aiming at improving the way we describe the subjective and cognitive state of non-communicating patients. In particular, I present a tentative new classification of impairments of consciousness that combines behavioural evidence with functional brain imaging data, in order to probe directly and univocally residual conscious processes.
Application of the moving frame method to deformed Willmore surfaces in space forms
Paragoda, Thanuja
2018-06-01
The main goal of this paper is to use the theory of exterior differential forms to derive variations of the deformed Willmore energy in space forms and to study its minimizers. We derive both first and second order variations of the deformed Willmore energy in space forms explicitly using the moving frame method. We prove that the second order variation depends on the intrinsic Laplace-Beltrami operator, the sectional curvature and some special operators, along with the mean and Gauss curvatures of the surface embedded in the space form, while the first order variation depends on the extrinsic Laplace-Beltrami operator.
Liu, Yongxun; Koga, Kazuhiro; Khumpuang, Sommawan; Nagao, Masayoshi; Matsukawa, Takashi; Hara, Shiro
2017-06-01
Solid source diffusion of phosphorus (P) and boron (B) into half-inch (12.5 mm) minimal silicon (Si) wafers by spin-on dopants (SOD) has been systematically investigated, and physical-vapor-deposited (PVD) titanium nitride (TiN) metal gate minimal silicon-on-insulator (SOI) complementary metal-oxide-semiconductor (CMOS) field-effect transistors (FETs) have been successfully fabricated using the developed SOD thermal diffusion technique. It was experimentally confirmed that a low temperature oxidation (LTO) process, which suppresses the formation of a boron silicide layer, is an effective way to remove boron glass in a diluted hydrofluoric acid (DHF) solution. It was also found that the top Si layer thickness of SOI wafers is reduced in the SOD thermal diffusion process because it is consumed by thermal oxidation owing to the oxygen atoms included in SOD films, which should be carefully considered in ultrathin SOI device fabrication. Moreover, normal operation of the fabricated minimal PVD-TiN metal gate SOI-CMOS inverters, static random access memory (SRAM) cells and ring oscillators has been demonstrated. These circuit-level results indicate that no significant particles or interface traps were introduced onto the minimal wafers during device fabrication, and that the developed solid source diffusion by SOD is useful for the fabrication of functional logic gate minimal SOI-CMOS integrated circuits.
Minimally inconsistent reasoning in Semantic Web.
Directory of Open Access Journals (Sweden)
Xiaowang Zhang
Full Text Available Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this reason, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed for reasoning with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, in which the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, which shows that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A sound and complete tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. In fact, the tableaux algorithm is designed as a framework for multi-valued DL that allows for different underlying paraconsistent semantics, differing only in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as that of (classical) description logic reasoning.
Kinetics of small lymphocytes in normal and nude mice after splenectomy
DEFF Research Database (Denmark)
Hougen, H P; Hansen, F; Jensen, E K
1977-01-01
Autoradiography and various quantitations on lymphoid tissues have been used to evaluate the kinetics of small lymphocytes in normal (+/nu or +/+) and congenitally athymic nude (nu/nu) NMRI mice 1 month after splenectomy or sham-splenectomy. The results indicate that splenectomy causes depressed...... thymic activity and diminished numbers of T lymphocytes in peripheral lymphoid tissues. The total number of cells in these tissues as well as the blast cell activity, were within normal limits. Bone marrow lymphocyte numbers and kinetics as well as blood lymphocyte levels in splenectomized and sham......-splenectomized normal animals were comparable. Blood lymphocyte numbers were at normal levels in splenectomized nude mice, in spite of reduced numbers of bone marrow and thoracic duct lymphocytes. It is suggested that increased number of newly-formed lymphocytes, found in lymph nodes and blood of splenectomized mice...
Mutual-friction induced instability of normal-fluid vortex tubes in superfluid helium-4
Kivotides, Demosthenes
2018-06-01
It is shown that, as a result of its interactions with superfluid vorticity, a normal-fluid vortex tube in helium-4 becomes unstable and disintegrates. The superfluid vorticity acquires only a small polarization (a few percent of the normal-fluid tube strength), whilst expanding in a front-like manner in the intervortex space of the normal fluid, forming a dense, unstructured tangle in the process. The accompanying energy spectra scalings offer a structural explanation of analogous scalings in fully developed finite-temperature superfluid turbulence. A macroscopic mutual-friction model incorporating these findings is proposed.
Marjanovic, Irena; Karan-Djurasevic, Teodora; Ugrin, Milena; Virijevic, Marijana; Vidovic, Ana; Tomin, Dragica; Suvajdzic Vukovic, Nada; Pavlovic, Sonja; Tosic, Natasa
2017-05-01
Acute myeloid leukemia with normal karyotype (AML-NK) represents the largest group of AML patients classified with an intermediate prognosis. A constant need exists to introduce new molecular markers for more precise risk stratification and for minimal residual disease (MRD) monitoring. Quantitative assessment of Wilms tumor 1 (WT1) gene transcripts was performed using real-time polymerase chain reaction. The bone marrow samples were collected at the diagnosis from 104 AML-NK patients and from 34 of these patients during follow-up or disease relapse. We found that overexpression of the WT1 gene (WT1 high status), present in 25.5% of patients, was an independent unfavorable factor for achieving complete remission. WT1 high status was also associated with resistance to therapy and shorter disease-free survival and overall survival. Assessment of the log reduction value of WT1 expression, measured in paired diagnosis/complete remission samples, revealed that patients with a log reduction of < 2 had a tendency toward shorter disease-free survival and overall survival and a greater incidence of disease relapse. Combining WT1 gene expression status with NPM1 and FLT3-ITD mutational status, we found that the tumor behavior of intermediate patients (FLT3-ITD−/NPM1− double-negative) with WT1 high status is almost the same as the tumor behavior of the adverse risk group. WT1 expression status represents a good molecular marker of prognosis, response to treatment, and MRD monitoring. Above all, the usage of the WT1 expression level as an additional marker for more precise risk stratification of AML-NK patients could lead to more adapted, personalized treatment protocols. Copyright © 2017 Elsevier Inc. All rights reserved.
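The log reduction criterion used above for MRD monitoring is a simple base-10 ratio of transcript levels between paired samples. A minimal sketch, assuming the WT1 transcript levels have already been normalized upstream (the function name and threshold-flag interface are illustrative):

```python
import math

def wt1_log_reduction(level_dx, level_cr):
    """log10 reduction of normalized WT1 transcript level between
    diagnosis (level_dx) and complete remission (level_cr).

    A reduction < 2 logs corresponds to the higher-risk group
    described in the abstract.
    """
    return math.log10(level_dx / level_cr)

def is_high_risk(level_dx, level_cr, threshold=2.0):
    """Flag patients whose WT1 reduction falls below the 2-log cutoff."""
    return wt1_log_reduction(level_dx, level_cr) < threshold
```

For example, a drop from 1000 to 10 normalized copies is a 2-log reduction, exactly at the cutoff reported in the study.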