WorldWideScience

Sample records for grocse ii consists

  1. Results from GROCSE, A Real-Time Search for the Optical Counterparts of Gamma-Ray Bursts

    Science.gov (United States)

    Akerlof, Carl; Lee, Brian; Barthelmy, Scott; Cline, Thomas; Gehrels, Neil; Ables, Elden; Bionta, Richard; Ott, Linda; Park, Hye-Sook; Fishman, Gerald; Kouveliotou, Chryssa; Meegan, Charles; Ferguson, Donald

    1994-12-01

Since January 12, 1994, an experiment called GROCSE (Gamma-Ray Optical Counterpart Search Experiment) has been monitoring the night sky for the optical counterparts of gamma-ray bursts. The basic detector consists of an 8.9 cm aperture electronic camera attached to a rapidly slewing, computer-controlled mount. This device is activated by the real-time telemetry data stream from the BATSE instrument onboard the Compton Gamma-Ray Observatory. The BATSE signals are filtered and broadcast via the BACODINE network to Lawrence Livermore National Laboratory over the Internet using UNIX socket communication. The typical response time to obtain the first image is approximately 15 seconds after the initial burst detection. The field of view of the camera is restricted to 0.18 steradians to match the online angular position errors associated with the BACODINE GRB coordinate estimates. Under dark skies, the limiting detection magnitude is 8.5. By October 1994, the GROCSE camera had been triggered by seven BATSE bursts. Data from these events are being analyzed to provide either a detection or an upper limit for GRB optical luminosity. Results will be presented for the ratio of optical to gamma-ray intensity. A second-generation camera system, currently under development, is expected to push the limiting magnitude to approximately m_v = 13. The status of this effort will be briefly reported.

  2. Influence of composite resin consistency and placement technique on proximal contact tightness of Class II restorations.

    NARCIS (Netherlands)

    Loomans, B.A.C.; Opdam, N.J.M.; Roeters, F.J.M.; Bronkhorst, E.M.; Plasschaert, A.J.M.

    2006-01-01

    PURPOSE: To investigate the influence of composite resin consistency and placement technique on proximal contact tightness of Class II composite resin restorations. MATERIALS AND METHODS: A manikin model (KaVo Dental) was used with an artificial first molar in which a standardized MO preparation was

  3. Self-consistent calculations of optical properties of type I and type II quantum heterostructures

    Science.gov (United States)

    Shuvayev, Vladimir A.

In this thesis, self-consistent computational methods are applied to the study of the optical properties of semiconductor nanostructures with one- and two-dimensional quantum confinement. First, the self-consistent Schrödinger-Poisson system of equations is applied to a cylindrical core-shell structure with type II band alignment, without direct Coulomb interaction between carriers. The electron and hole states and the confining potential are obtained from a numerical solution of this system. The photoluminescence kinetics is analyzed theoretically, with the nanostructure size dispersion taken into account. The results are applied to radiative recombination in a system of ZnTe/ZnSe stacked quantum dots. Good agreement with both continuous-wave and time-resolved experimental observations is found. It is shown that the size distribution produces photoluminescence decay with essentially non-exponential behavior, even at the tail of the decay where the carrier lifetime is nearly constant owing to the slowly changing overlap of the electron and hole wavefunctions. A model situation applicable to colloidal core-shell nanowires is also investigated and discussed. With respect to excitons in type I quantum wells, a new computationally efficient and flexible approach to calculating exciton characteristics, based on a self-consistent variational treatment of the electron-hole Coulomb interaction, is developed. In this approach, a system of self-consistent equations describing the motion of an electron-hole pair is derived. The motion in the growth direction of the quantum well is separated from the in-plane motion, but each occurs in modified potentials found self-consistently.
This approach is applied to a shallow quantum well with the delta-potential profile, for which analytical expressions for the exciton binding energy and the ground state eigenfunctions are obtained, and to the quantum well with the square potential profile with several
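Self-consistent Schrödinger-Poisson calculations like those described above share a common fixed-point structure: solve the eigenproblem in the current potential, rebuild the charge density from the occupied states, re-solve the Poisson equation, and mix the old and new potentials until nothing changes. A minimal, hypothetical 1-D toy sketch of that loop (arbitrary units, Hartree-only coupling; the function name and every parameter are illustrative, not the author's code):

```python
import numpy as np

def schrodinger_poisson_1d(n_grid=200, length=20.0, n_electrons=0.1,
                           mix=0.3, tol=1e-8, max_iter=200):
    """Toy 1-D self-consistent Schrodinger-Poisson loop (arbitrary units).

    Illustrates only the fixed-point structure; real core-shell or
    heterostructure simulators use realistic Hamiltonians, band offsets
    and 2-D/3-D geometry.
    """
    x = np.linspace(0.0, length, n_grid)
    h = x[1] - x[0]
    # Second-order finite-difference Laplacian (implicit Dirichlet walls).
    lap = (np.diag(np.ones(n_grid - 1), -1)
           - 2.0 * np.eye(n_grid)
           + np.diag(np.ones(n_grid - 1), 1)) / h**2
    v_ext = 0.05 * (x - length / 2.0) ** 2   # fixed confining potential
    v_h = np.zeros(n_grid)                   # Hartree term, updated below

    for _ in range(max_iter):
        # 1) Solve the Schrodinger eigenproblem in the current potential.
        ham = -0.5 * lap + np.diag(v_ext + v_h)
        energies, states = np.linalg.eigh(ham)
        # 2) Rebuild the density from the lowest (occupied) state.
        rho = n_electrons * states[:, 0] ** 2 / h
        # 3) Poisson solve: v'' = -rho, then 4) mix for stability.
        v_new = np.linalg.solve(lap, -rho)
        if np.max(np.abs(v_new - v_h)) < tol:
            break
        v_h = (1.0 - mix) * v_h + mix * v_new

    return energies[0], rho

energy, density = schrodinger_poisson_1d()
print(f"converged ground-state energy: {energy:.6f}")
```

The linear mixing step is what makes the iteration converge; feeding the new Hartree potential back undamped (`mix=1`) is the classic way such loops oscillate.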

  4. Workshop on Consistency Problems in UML-based Software Development II

    OpenAIRE

    Kuzniarz, Ludwik; Huzar, Zbigniew; Reggio, Gianna; Sourrouille, Jean Louis; Staron, Miroslaw

    2003-01-01

Workshop materials of the Second Workshop on Consistency Problems in UML-based Software Development. The workshop is part of the Sixth International Conference on the Unified Modeling Language, «UML» 2003. Workshop website: http://www.ipd.bth.se/consistencyUML/uml2003Workshop.asp

  5. A General Pressure Gradient Formulation for Ocean Models - Part II: Energy, Momentum, and Bottom Torque Consistency

    Science.gov (United States)

    Song, Y.; Wright, D.

    1998-01-01

A formulation of the pressure gradient force for use in models with topography-following coordinates is proposed and diagnostically analyzed by Song. We investigate numerical consistency with respect to global energy conservation, depth-integrated momentum changes, and the representation of the bottom pressure torque.

  6. Indirect (source-free) integration method. II. Self-force consistent radial fall

    Science.gov (United States)

    Ritter, Patxi; Aoudia, Sofiane; Spallicci, Alessandro D. A. M.; Cordier, Stéphane

    2016-12-01

    We apply our method of indirect integration, described in Part I, at fourth order, to the radial fall affected by the self-force (SF). The Mode-Sum regularization is performed in the Regge-Wheeler gauge using the equivalence with the harmonic gauge for this orbit. We consider also the motion subjected to a self-consistent and iterative correction determined by the SF through osculating stretches of geodesics. The convergence of the results confirms the validity of the integration method. This work complements and justifies the analysis and the results appeared in [Int. J. Geom. Meth. Mod. Phys. 11 (2014) 1450090].

  7. Indirect (source-free) integration method. II. Self-force consistent radial fall

    OpenAIRE

Ritter, P.; Aoudia, S.; Spallicci, A.; Cordier, S.

    2015-01-01

    We apply our method of indirect integration, described in Part I, at fourth order, to the radial fall affected by the self-force. The Mode-Sum regularisation is performed in the Regge-Wheeler gauge using the equivalence with the harmonic gauge for this orbit. We consider also the motion subjected to a self-consistent and iterative correction determined by the self-force through osculating stretches of geodesics. The convergence of the results confirms the validity of the integration method. T...

  8. Indirect (source-free) integration method. II. Self-force consistent radial fall

    CERN Document Server

Ritter, P.; Aoudia, S.; Spallicci, A.; Cordier, S.

    2015-01-01

    We apply our method of indirect integration, described in Part I, at fourth order, to the radial fall affected by the self-force. The Mode-Sum regularisation is performed in the Regge-Wheeler gauge using the equivalence with the harmonic gauge for this orbit. We consider also the motion subjected to a self-consistent and iterative correction determined by the self-force through osculating stretches of geodesics. The convergence of the results confirms the validity of the integration method. This work complements and justifies the analysis and the results appeared in Int. J. Geom. Meth. Mod. Phys., 11, 1450090 (2014).

  9. On the one-loop corrections to inflation II: The consistency relation

    Energy Technology Data Exchange (ETDEWEB)

    Sloth, Martin S. [Department of Physics and Astronomy, University of Aarhus, DK-8000 Aarhus C (Denmark)]. E-mail: sloth@phys.au.dk

    2007-07-16

In this paper we extend our previous treatment of the one-loop corrections to inflation. Previously we calculated the one-loop corrections to the background and the two-point correlation function of inflaton fluctuations in a specific model of chaotic inflation. We showed that the loop corrections depend on the total number of e-foldings and estimated that the effect could be as large as a few percent in a λφ⁴ model of chaotic inflation. In the present paper we generalize the calculations to general inflationary potentials. We find that the effect can be as large as 70% in the simplest model of chaotic inflation with a quadratic m²φ² inflationary potential. We discuss the physical interpretation of the effect in terms of the tensor-to-scalar consistency relation. Finally, we discuss the relation to the work of Weinberg on quantum contributions to cosmological correlators.
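For reference, the tensor-to-scalar consistency relation invoked in this abstract is, in standard single-field slow-roll inflation, the tree-level statement that the tensor-to-scalar ratio is fixed by the tensor spectral tilt:

```latex
r \equiv \frac{\mathcal{P}_T}{\mathcal{P}_\zeta} = -8\, n_T, \qquad n_T \simeq -2\epsilon,
```

where ε is the first slow-roll parameter; loop corrections of the size quoted above would manifest as violations of this tree-level relation.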

  10. On the One Loop Corrections to Inflation II: The Consistency Relation

    CERN Document Server

    Sloth, M S

    2006-01-01

In this paper we extend our previous treatment of the one-loop corrections to inflation. Previously we calculated the one-loop corrections to the background and the two-point correlation function of inflaton fluctuations in a specific model of chaotic inflation. We showed that the loop corrections depend on the total number of e-foldings and estimated that the effect could be as large as a few percent in a lambda-phi-four model of chaotic inflation. In the present paper we generalize the calculations to general inflationary potentials. We find that the effect can be as large as 35% in the simplest model of chaotic inflation with a quadratic inflationary potential. We discuss the physical interpretation of the effect in terms of the tensor-to-scalar consistency relation. Finally, we discuss the relation to the work of Weinberg on quantum contributions to cosmological correlators.

  11. Correlation consistent basis sets for actinides. II. The atoms Ac and Np-Lr

    Science.gov (United States)

    Feng, Rulin; Peterson, Kirk A.

    2017-08-01

New correlation consistent basis sets optimized using the all-electron third-order Douglas-Kroll-Hess (DKH3) scalar relativistic Hamiltonian are reported for the actinide elements Ac and Np through Lr. These complete the series of sets reported previously for Th-U [K. A. Peterson, J. Chem. Phys. 142, 074105 (2015); M. Vasiliu et al., J. Phys. Chem. A 119, 11422 (2015)]. The new sets range in size from double- to quadruple-zeta and encompass both those optimized for valence (6s6p5f7s6d) and outer-core electron correlation (valence + 5s5p5d). The final sets have been contracted for both the DKH3 and eXact 2-component (X2C) Hamiltonians, yielding cc-pVnZ-DK3/cc-pVnZ-X2C sets for valence correlation and cc-pwCVnZ-DK3/cc-pwCVnZ-X2C sets for outer-core correlation (n = D, T, Q in each case). In order to test the effectiveness of the new basis sets, both atomic and molecular benchmark calculations have been carried out. In the first case, the first three atomic ionization potentials (IPs) of all the actinide elements Ac-Lr were calculated using the Feller-Peterson-Dixon (FPD) composite approach, primarily with the multireference configuration interaction (MRCI) method. Excellent convergence towards the respective complete basis set (CBS) limits is achieved with the new sets, leading to good agreement with experiment, where experimental data exist, after accurately accounting for spin-orbit effects using the 4-component Dirac-Hartree-Fock method. For a molecular test, the IP and atomization energy (AE) of PuO2 were calculated, also using the FPD method, but with a coupled cluster approach and spin-orbit coupling accounted for using the 4-component MRCI. The present calculations yield an IP0 for PuO2 of 159.8 kcal/mol, which is in excellent agreement with the experimental electron-transfer bracketing value of 162 ± 3 kcal/mol. Likewise, the calculated 0 K AE of 305.6 kcal/mol is in very good agreement with the currently accepted experimental value of 303.1 ± 5 kcal/mol.
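The convergence toward the complete basis set (CBS) limit mentioned above is usually quantified with simple extrapolation formulas applied over the double/triple/quadruple-zeta (n = 2, 3, 4) series. A small sketch of two widely used schemes, purely illustrative since this abstract does not state which scheme the authors applied:

```python
import math

def cbs_exponential(e_d, e_t, e_q):
    """Three-point extrapolation E(n) = E_CBS + B*exp(-C*n), a form commonly
    used for Hartree-Fock/total energies with correlation-consistent sets
    (n = 2, 3, 4 for double-, triple-, quadruple-zeta)."""
    # Successive differences form a geometric series with ratio r = exp(-C).
    r = (e_q - e_t) / (e_t - e_d)
    return e_q + (e_q - e_t) * r / (1.0 - r)

def cbs_cubic(e_t, e_q):
    """Two-point 1/n^3 extrapolation for correlation energies:
    E(n) = E_CBS + B / n^3, with n = 3 (T) and n = 4 (Q)."""
    return (64.0 * e_q - 27.0 * e_t) / (64.0 - 27.0)

# Synthetic check: energies generated from E_CBS = -1.0 recover it exactly.
e = [-1.0 + 0.5 * math.exp(-n) for n in (2, 3, 4)]
print(cbs_exponential(*e))   # ≈ -1.0
```

Both formulas follow directly from fitting the assumed functional form through the computed points, which is why a smooth double-to-quadruple-zeta series is the key diagnostic of basis-set quality.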

  12. The H II Region/PDR Connection: Self-Consistent Calculations of Physical Conditions in Star-Forming Regions

    CERN Document Server

    Abel, N P; Shaw, G; Van Hoof, P A M

    2005-01-01

    We have performed a series of calculations designed to reproduce infrared diagnostics used to determine physical conditions in star forming regions. We self-consistently calculate the thermal and chemical structure of an H II region and photodissociation region (PDR) that are in pressure equilibrium. This differs from previous work, which used separate calculations for each gas phase. Our calculations span a wide range of stellar temperatures, gas densities, and ionization parameters. We describe improvements made to the spectral synthesis code Cloudy that made these calculations possible. These include the addition of a molecular network with ~1000 reactions involving 68 molecular species and improved treatment of the grain physics. Data from the Spitzer First Look Survey, along with other archives, are used to derive important physical characteristics of the H II region and PDR. These include stellar temperatures, electron densities, ionization parameters, UV radiation flux, and PDR density. Finally, we cal...

  13. Self-consistent QM/MM methodologies for structural refinement of photosystem II and other macromolecules of biological interest

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Enrique R [Los Alamos National Laboratory; Sproviero, Eduardo M [YALE UNIV; Newcomer, Michael [YALE UNIV; Gascon, Jose A [YALE UNIV; Batista, Victor S [YALE UNIV

    2008-01-01

The combination of quantum mechanics and molecular mechanics (QM/MM) is one of the most promising approaches to study the structure, function, and properties of proteins and nucleic acids. However, there are some instances in which the limitations of either the MM method (lack of a proper electronic description) or the QM method (limited to a small number of atoms) prevent a proper description of the system. To address this issue, we review here our approach to fine-tuning the structure of biological systems using post-QM/MM refinements. These protocols are based on spectroscopic data and/or partitioning of the system to extend the QM description to a larger region of a protein. We illustrate these methodologies through applications to several biomolecules, which were pre-optimized at the QM/MM level and then further refined using post-QM/MM refinement methodologies: mod(QM/MM), which refines the atomic charges of the residues included in the MM region to account for polarization effects; mod(QM/MM)-opt, which partitions the MM region into smaller parts and optimizes each part in an iterative, self-consistent way; and the Polarized-Extended X-Ray Absorption Fine Structure (P-EXAFS) fitting procedure, which fine-tunes the atomic coordinates to reproduce experimental polarized EXAFS spectra. The first two techniques were applied to the guanine quadruplex, while the P-EXAFS refinement was applied to the oxygen-evolving complex of photosystem II.

  14. Self-consistent evolution of gas and cosmic rays in Cygnus A and similar FR II classic double radio sources

    CERN Document Server

    Mathews, William G

    2010-01-01

    In Cygnus A and other classical FR II double radio sources, powerful opposing jets from the cores of halo-centered galaxies drive out into the surrounding cluster gas, forming hotspots of shocked and compressed cluster gas at the jet extremities. The moving hotspots are sandwiched between two shocks. An inner-facing shock receives momentum and cosmic rays from the jet and creates additional cosmic rays that form a radio lobe elongated along the jet axis. An outer-facing bow shock moves directly into the undisturbed group or cluster gas, creating a cocoon of shocked gas enclosing the radio lobe. We describe computations that follow the self-consistent dynamical evolution of the shocked cluster gas and the relativistic synchrotron-emitting gas inside the lobes. Relativistic and non-relativistic components exchange momentum by interacting with small magnetic fields having dynamically negligible energy densities. The evolution of Cygnus A is governed almost entirely by cosmic ray energy flowing from the hotspots....

  15. Predictive model for Pb(II) adsorption on soil minerals (oxides and low-crystalline aluminum silicate) consistent with spectroscopic evidence

    Science.gov (United States)

    Usiyama, Tomoki; Fukushi, Keisuke

    2016-10-01

    Mobility of Pb(II) in surface condition is governed by adsorption processes on soil minerals such as iron oxides and low-crystalline aluminum silicates. The adsorption effectiveness and the surface complex structures of Pb(II) vary sensitively with solution conditions such as pH, ionic strength, Pb(II) loading, and electrolyte anion type. This study was undertaken to construct a quantitative model for Pb(II) on soil minerals. It can predict the adsorption effectiveness and surface complex structures under any solution conditions using the extended triple layer model (ETLM). The Pb(II) adsorption data for goethite, hydrous ferric oxide (HFO), quartz, and low-crystalline aluminum silicate (LCAS) were analyzed with ETLM to retrieve the surface complexation reactions and these equilibrium constants. The adsorption data on goethite, HFO and quartz were referred from reports of earlier studies. Those data for LCAS were measured under a wide range of pH, ionic strength and Pb(II) loadings in NaNO3 and NaCl solutions. All adsorption data can be reasonably regressed using ETLM with the assumptions of inner sphere bidentate complexation and inner sphere monodentate ternary complexation with electrolyte anions, which are consistent with previously reported spectroscopic evidence. Predictions of surface speciation under widely various solution conditions using ETLM revealed that the inner sphere bidentate complex is the predominant species at neutral to high pH conditions. The inner sphere monodentate ternary complex becomes important at low pH, high surface Pb(II) coverage, and high electrolyte concentrations, of which the behavior is consistent with the spectroscopic observation. Comparisons of the obtained adsorption constants on goethite, HFO and quartz exhibited good linear relations between the reciprocals of dielectric constants of solids and adsorption constants. Those linear relations support predictions of the adsorption constants of all oxides based on Born

  16. Self-consistent Bogoliubov-de Gennes theory of the vortex lattice state in a two-dimensional strongly type-II superconductor at high magnetic fields

    Science.gov (United States)

    Zhuravlev, Vladimir; Duan, Wenye; Maniv, Tsofar

    2017-01-01

A self-consistent Bogoliubov-de Gennes theory of the vortex lattice state in a 2D strong type-II superconductor at high magnetic fields reveals a novel quantum mixed state around the semiclassical Hc2, characterized by a well-defined Landau-Bloch band structure in the quasiparticle spectrum and a suppressed order-parameter amplitude, which sharply crosses over into the well-known semiclassical (Helfand-Werthamer) results upon decreasing magnetic field. Application to the 2D superconducting state observed recently on the surface of the topological insulator Sb2Te3 accounts well for the experimental data, revealing a strong type-II superconductor with unusually low carrier density and very small cyclotron mass, which can be realized only in the strong-coupling superconductor limit.

  17. A self-consistent numerical method for simulation of quantum transport in high electron mobility transistor; part II: The full quantum transport

    Directory of Open Access Journals (Sweden)

    R. Khoie

    1996-01-01

In Part I of this paper we reported a self-consistent Boltzmann-Schrödinger-Poisson simulator for HEMTs in which only electrons in the first subband were assumed to be quantized, with their motion restricted to two dimensions. In that model, the electrons in the second and higher subbands were treated as a bulk system behaving as a three-dimensional electron gas. In Part II of this paper, we extend our simulator to a self-consistent full-quantum model in which the electrons in the second subband are also treated as a quantized two-dimensional gas. In this model, we consider the electrons in the lowest two subbands to be in the quantum well, forming the two-dimensional electron gas, and the electrons in the third and higher subbands to behave as bulk electrons with no restrictions on their motion. We have further incorporated an additional self-consistency by calculating the field-dependent, energy-dependent scattering rates due to ionized impurities and polar optical phonons. The two higher moments of the Boltzmann transport equation are numerically solved for the two lowest subbands and the bulk system: six transport equations, four for the two subbands and two for the bulk system. The Schrödinger and Poisson equations are also solved self-consistently. The wavefunctions obtained are used to calculate the ionized impurity and polar optical phonon scattering rates. The rates of transfer of electrons and their energies to and from each subband are calculated from these intersubband and intrasubband scattering rates.

  18. Synthesis and Structure of a Mn(II)-triazolyl Coordination Polymer Consisting of Dinuclear Units

    Institute of Scientific and Technical Information of China (English)

    WU Tao; XU Hong-Yan; KONG Fan-Zhen; YU Zhang-Yu; WANG Rui-Hu

    2012-01-01

The reaction of bis(1,2,4-triazol-4-yl) (btr) and Mn(ClO4)2·6H2O gave rise to a Mn(II) complex comprised of unprecedented dinuclear Mn(II) units, {[Mn2(btr)5(H2O)5](ClO4)4(H2O)2}n (1). Single-crystal X-ray diffraction analysis revealed that 1 crystallizes in the monoclinic space group C2/c. btr adopts two types of bridging modes. One serves as a μ-N1:N2 bridge through one triazolyl ring of btr, forming a dinuclear Mn(II) unit, and the other adopts an exo-bidentate mode using two nitrogen atoms, one from each triazolyl ring, and links the dinuclear units into a 2D cationic layer. ClO4- acts as a counter anion and does not take part in coordination. Interestingly, the 2D layers are packed in an ABCABC mode. ClO4- anions and uncoordinated water molecules are located between adjacent layers, and extensive hydrogen bonds further stabilize the whole framework.

  19. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
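The "simplest form" mentioned above, a syntactic check of parameter types, can be sketched in a few lines. This is an illustrative Python sketch, not tied to the paper's own formalism; names such as `interfaces_consistent`, `Spec`, and `GoodImpl` are hypothetical:

```python
import inspect

def interfaces_consistent(impl, spec):
    """Syntactic interface-consistency check: every method declared by
    `spec` must exist on `impl` with an identical signature (parameter
    names and type annotations). Semantic checks, e.g. of behavioral
    contracts, would go beyond this sketch."""
    for name, spec_fn in inspect.getmembers(spec, inspect.isfunction):
        impl_fn = getattr(impl, name, None)
        if impl_fn is None:
            return False  # component does not offer the required operation
        if inspect.signature(impl_fn) != inspect.signature(spec_fn):
            return False  # operation exists but its interface differs
    return True

class Spec:
    def send(self, data: bytes, timeout: float) -> bool: ...

class GoodImpl:
    def send(self, data: bytes, timeout: float) -> bool:
        return True

class BadImpl:
    def send(self, payload) -> bool:  # wrong parameters: inconsistent
        return True

print(interfaces_consistent(GoodImpl, Spec))  # True
print(interfaces_consistent(BadImpl, Spec))   # False
```

Comparing whole `inspect.signature` objects is deliberately strict: it flags any divergence in parameter names, order, or annotations, which is exactly the class of error a syntactic interface check is meant to catch before components are composed.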

  20. Solid consistency

    Science.gov (United States)

    Bordin, Lorenzo; Creminelli, Paolo; Mirbabayi, Mehrdad; Noreña, Jorge

    2017-03-01

We argue that isotropic scalar fluctuations in solid inflation are adiabatic in the super-horizon limit. During the solid phase this adiabatic mode has peculiar features: constant energy-density slices and comoving slices do not coincide, and their curvatures, parameterized respectively by ζ and ℛ, both evolve in time. The existence of this adiabatic mode implies that Maldacena's squeezed-limit consistency relation holds after angular averaging over the long mode. The correlation functions of a long-wavelength spherical scalar mode with several short scalar or tensor modes are fixed by the scaling behavior of the correlators of the short modes, independently of the solid inflation action or the dynamics of reheating.

  1. Visible-light-induced water oxidation by a hybrid photocatalyst consisting of bismuth vanadate and copper(II) meso-tetra(4-carboxyphenyl)porphyrin.

    Science.gov (United States)

    Nakashima, Shu; Negishi, Ryo; Tada, Hiroaki

    2016-03-04

    Copper(II) meso-tetra(4-carboxyphenyl)porphyrin surface-modified monoclinic scheelite bismuth vanadate (CuTCPP/BiVO4) has been synthesized via a two-step route involving chemisorption of TCPP on BiVO4 and successive Cu(II) ion incorporation into the TCPP, and the surface modification drastically enhances the water oxidation to oxygen (O2) under visible-light irradiation (λ > 430 nm).

  2. Efficacy and Safety of Combination Therapy Consisting of Angiotensin II Type 1 Receptor Blocker, Calcium Channel Blocker and Hydrochlorothiazide in Patients With Hypertension

    Science.gov (United States)

    Shiga, Yuhei; Miura, Shin-ichiro; Motozato, Kota; Yoshimine, Yuka; Norimatsu, Kenji; Arimura, Tadaaki; Koyoshi, Rie; Morii, Joji; Kuwano, Takashi; Inoue, Ken; Shirotani, Tetsuro; Fujisawa, Kazuaki; Matsunaga, Eiyu; Saku, Keijiro

    2017-01-01

Background: Many patients continue to have high blood pressure (BP) even after treatment with a high-dose (H) angiotensin II type 1 receptor blocker (ARB)/calcium channel blocker (CCB) combination or a middle-dose (M) ARB/CCB/hydrochlorothiazide (HCTZ) combination. Methods: Thirty-two hypertensive patients who were being treated with H-ARB/CCB or M-ARB/CCB/HCTZ were enrolled in this study. We applied a changeover with a switch to H-ARB (telmisartan 80 mg/day)/CCB (amlodipine 5 mg/day or nifedipine CR 40 mg/day)/HCTZ (12.5 mg/day). Results: Systolic BP (SBP) and diastolic BP (DBP) were significantly decreased in all patients and in both the H-ARB/CCB and M-ARB/CCB/HCTZ groups after 3 months. The percentage of patients who reached the target BP after 3 months (72%) was significantly higher than that at 0 months (19%). There were no serious adverse effects in any of the patients. Conclusions: Combination therapy with H-ARB/CCB/HCTZ was associated with a significant reduction of BP. PMID:28090225

  3. A Magnetohydrodynamic Model of The M87 Jet. II. Self-consistent Quad-shock Jet Model for Optical Relativistic Motions and Particle Acceleration

    CERN Document Server

    Nakamura, Masanori

    2014-01-01

We describe a new paradigm for understanding both relativistic motions and particle acceleration in the M87 jet: a magnetically dominated relativistic flow that naturally produces four relativistic magnetohydrodynamic (MHD) shocks (forward/reverse fast and slow modes). We apply this model to a set of optical super- and subluminal motions discovered by Biretta and coworkers with the Hubble Space Telescope during 1994-1998. The model concept consists of the ejection of a single relativistic Poynting jet, which possesses a coherent helical (poloidal + toroidal) magnetic component, at the remarkably flaring point HST-1. We are able to reproduce quantitatively the proper motions of components seen in the optical observations of HST-1 with the same model we used previously to describe similar features in radio VLBI observations in 2005-2006. This indicates that the quad relativistic MHD shock model can be applied generally to recurring pairs of super/subluminal knots ejected from the upstream edge o...

  4. Immunization with the hybrid protein vaccine, consisting of Leishmania major cysteine proteinases Type I (CPB) and Type II (CPA), partially protects against leishmaniasis.

    Science.gov (United States)

    Zadeh-Vakili, Azita; Taheri, Tahere; Taslimi, Yasaman; Doustdari, Fatemeh; Salmanian, Ali-Hatef; Rafati, Sima

    2004-05-07

Cysteine proteinases (CPs) are enzymes belonging to the papain superfamily, found in a number of organisms from prokaryotes to mammals. In the parasitic protozoan Leishmania, extensive studies have shown that CPs are involved in parasite survival, replication and the onset of disease, and they have therefore been considered attractive drug and/or vaccine targets for the control of leishmaniasis. We have previously shown that the cysteine proteinases Type I (CPB) and Type II (CPA) of Leishmania major (L. major), delivered as recombinant proteins or in plasmid DNA, induce partial protection against infection with the parasite in BALB/c mice. We had shown that the level of protection was greater if a cocktail of cpa- and cpb-containing DNA constructs was used. Therefore, to reduce the costs associated with the production of these vaccine candidates, a construct was developed whereby the cpa and cpb genes were fused together to give rise to a single hybrid protein. The genes were fused in tandem, with the C-terminal extension (CTE) encoding region of CPB located at the 3' end of the fused genes, and ultimately expressed in the bacterial expression construct pET-23a. The expression of the CPA/B hybrid protein (60 kDa) was verified using rabbit anti-CPA and anti-CPB antibodies by SDS-PAGE and immunoblotting. The protective potential of the CPA/B hybrid protein against infection with Leishmania was then assessed in BALB/c mice. The animals were vaccinated with CPA/B, challenged with live L. major promastigotes, and the degree of protection was examined by measuring footpad lesion sizes. It was found that there was a delay in the expansion of lesion size compared to control groups. Furthermore, an immunological analysis of antibody isotypes, before and after infection, showed high levels of IgG2a compared to IgG1 (more than five-fold) in the group vaccinated with the CPA/B hybrid protein.
In addition, a predominant Th1 immune response characterized by in vitro IFN

  5. Description of nuclear systems with a self-consistent configuration-mixing approach. II. Application to structure and reactions in even-even s d -shell nuclei

    Science.gov (United States)

    Robin, C.; Pillet, N.; Dupuis, M.; Le Bloas, J.; Peña Arteaga, D.; Berger, J.-F.

    2017-04-01

Background: The variational multiparticle-multihole configuration mixing approach to nuclei was proposed about a decade ago. While the first applications followed rapidly, the implementation of the full formalism of this method has only recently been completed and applied in C. Robin, N. Pillet, D. Peña Arteaga, and J.-F. Berger [Phys. Rev. C 93, 024302 (2016)] to 12C as a test case. Purpose: The main objective of the present paper is to carry on the study that was initiated in that reference, in order to put the variational multiparticle-multihole configuration mixing method to more stringent tests. To that aim we perform a systematic study of even-even sd-shell nuclei. Method: The wave function of these nuclei is taken as a configuration mixing built on orbitals of the sd-shell, and both the mixing coefficients of the nuclear state and the single-particle wave functions are determined consistently from the same variational principle. As in previous works, the calculations are done using the D1S Gogny force. Results: Various ground-state properties are analyzed. In particular, the correlation content and composition of the wave function as well as the single-particle orbitals and energies are examined. Binding energies and charge radii are also calculated and compared to experiment. The description of the first excited state is also examined, and the corresponding transition densities are used as input for the calculation of reaction processes such as inelastic electron and proton scattering. Special attention is paid to the effect of optimizing the single-particle states consistently with the correlations of the system. Conclusions: The variational multiparticle-multihole configuration mixing approach is systematically applied to the description of even-even sd-shell nuclei. Globally, the results are satisfying and encouraging. In particular, charge radii and excitation energies are nicely reproduced. However

  6. Description of nuclear systems with a self-consistent configuration-mixing approach. II: Application to structure and reactions in even-even sd-shell nuclei

    CERN Document Server

    Robin, C; Dupuis, M; Bloas, J Le; Arteaga, D Peña; Berger, J -F

    2016-01-01

    The variational multiparticle-multihole configuration mixing approach (MPMH) to nuclei was proposed about a decade ago. While the first applications followed rapidly, the implementation of the full formalism of this method has only recently been completed and applied in [C. Robin, N. Pillet, D. Peña Arteaga and J.-F. Berger, Phys. Rev. C 93, 024302 (2016)] to $^{12}$C as a test case. The main objective of the present paper is to carry on the study that was initiated in that reference, in order to put the MPMH method to more stringent tests. To that aim we perform a systematic study of even-even sd-shell nuclei. The wave function of these nuclei is taken as a configuration mixing built on orbitals of the sd-shell, and both the mixing coefficients of the nuclear state and the single-particle wave functions are determined consistently from the same variational principle. The calculations are done using the D1S Gogny force. Various ground-state properties are analyzed. In particular, the correlation conten...

  7. Nearest-neighbor sp3s* tight-binding parameters based on the hybrid quasi-particle self-consistent GW method verified by modeling of type-II superlattices

    Science.gov (United States)

    Sawamura, Akitaka; Otsuka, Jun; Kato, Takashi; Kotani, Takao

    2017-06-01

    We report the determination of parameters for the nearest-neighbor sp3s* tight-binding (TB) model for GaP, GaAs, GaSb, InP, InAs, and InSb at 0, 77, and 300 K based on the hybrid quasi-particle self-consistent GW (QSGW) calculation, and their application to a type-II (InAs)/(GaSb) superlattice. The effects of finite temperature have been incorporated empirically by adjusting the parameter for blending the exchange-correlation terms of the pure QSGW method and the local density approximation, in addition to the use of experimental lattice parameters. As expected, the TB band gap shrinks with temperature, and asymptotically with the superlattice period when the period is large. In addition, the bell-shaped curve of the band gap for small superlattice periods, as well as the slight and remarkable anisotropies in the electron and hole effective masses, respectively, both predicted by the hybrid QSGW method, are reproduced.

  8. Chip Multithreaded Consistency Model

    Institute of Scientific and Technical Information of China (English)

    Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang

    2008-01-01

    Multithreading is the developing trend of high-performance processors. A memory consistency model is essential to the correctness, performance, and complexity of a multithreaded processor. This paper proposes a chip multithreaded consistency model adapted to multithreaded processors. The restrictions imposed on memory event ordering by chip multithreaded consistency are presented and formalized. Using the idea of the critical cycle introduced by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the correctness criterion of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving higher performance than the sequential consistency model while ensuring software compatibility, in that the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. An implementation strategy for the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed: the processor supports the model through an exception scheme based on the sequential memory access queue of each thread.
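
Sequential consistency, the correctness criterion invoked in this abstract, requires every execution result to correspond to some single total interleaving of the threads' operations that preserves each thread's program order. A minimal illustrative sketch (ours, not from the paper) enumerating the sequentially consistent outcomes of the classic store-buffering litmus test:

```python
# Store-buffering litmus test: under sequential consistency the outcome
# r0 == r1 == 0 is forbidden, because any total order must place one of
# the stores before both loads.
T0 = [("st", "x", 1), ("ld", "y", "r0")]
T1 = [("st", "y", 1), ("ld", "x", "r1")]

def interleavings(a, b):
    """Yield every merge of a and b that preserves each thread's program order."""
    if not a:
        yield list(b); return
    if not b:
        yield list(a); return
    for rest in interleavings(a[1:], b):
        yield [a[0]] + rest
    for rest in interleavings(a, b[1:]):
        yield [b[0]] + rest

def execute(schedule):
    """Run one interleaving against a single shared memory, return (r0, r1)."""
    mem, regs = {"x": 0, "y": 0}, {}
    for op, loc, arg in schedule:
        if op == "st":
            mem[loc] = arg
        else:
            regs[arg] = mem[loc]
    return (regs["r0"], regs["r1"])

sc_outcomes = {execute(s) for s in interleavings(T0, T1)}
print(sorted(sc_outcomes))  # [(0, 1), (1, 0), (1, 1)] -- never (0, 0)
```

A relaxed hardware model that lets stores sit in a per-thread store buffer can additionally produce (0, 0), which is exactly the kind of ordering a consistency model must either forbid or formally permit.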

  9. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  10. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models with human interaction restricted to a minimum. These abstract models are based on the UML language; however, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. Such verification is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Our method can therefore be used to check the practicability (feasibility) of software architecture models.

  11. No consistent bimetric gravity?

    CERN Document Server

    Deser, S; Waldron, A

    2013-01-01

    We discuss the prospects for a consistent, nonlinear, partially massless (PM), gauge symmetry of bimetric gravity (BMG). Just as for single metric massive gravity, ultimate consistency of both BMG and the putative PM BMG theory relies crucially on this gauge symmetry. We argue, however, that it does not exist.

  12. Prizes for consistency

    Energy Technology Data Exchange (ETDEWEB)

    Hiscock, S.

    1986-07-01

    The importance of consistency in coal quality has become of increasing significance recently, with the current trend towards using coal from a range of sources. A significant development has been the swing in responsibilities for coal quality. The increasing demand for consistency in quality has led to a re-examination of where in the trade and transport chain the quality should be assessed and where further upgrading of inspection and preparation facilities are required. Changes are in progress throughout the whole coal transport chain which will improve consistency of delivered coal quality. These include installation of beneficiation plant at coal mines, export terminals, and on the premises of end users. It is suggested that one of the keys to success for the coal industry will be the ability to provide coal of a consistent quality.

  13. Consistent sets contradict

    CERN Document Server

    Kent, A

    1996-01-01

    In the consistent histories formulation of quantum theory, the probabilistic predictions and retrodictions made from observed data depend on the choice of a consistent set. We show that this freedom allows the formalism to retrodict several contradictory propositions which correspond to orthogonal commuting projections and which all have probability one. We also show that the formalism makes contradictory probability one predictions when applied to generalised time-symmetric quantum mechanics.

  14. Network Consistent Data Association.

    Science.gov (United States)

    Chakraborty, Anirban; Das, Abir; Roy-Chowdhury, Amit K

    2016-09-01

    Existing data association techniques mostly focus on matching pairs of data-point sets and then repeating this process along space-time to achieve long-term correspondences. However, in many problems such as person re-identification, a set of data-points may be observed at multiple spatio-temporal locations and/or by multiple agents in a network, and simply combining the local pairwise association results between sets of data-points often leads to inconsistencies over the global space-time horizons. In this paper, we propose a novel Network Consistent Data Association (NCDA) framework, formulated as an optimization problem, that not only maintains consistency in association results across the network but also improves the pairwise data association accuracies. The proposed NCDA can be solved as a binary integer program leading to a globally optimal solution and is capable of handling the challenging data-association scenario where the number of data-points varies across different sets of instances in the network. We also present an online implementation of the NCDA method that can dynamically associate new observations with already observed data-points in an iterative fashion, while maintaining network consistency. We have tested both the batch and the online NCDA in two application areas, person re-identification and spatio-temporal cell tracking, and observed consistent and highly accurate data association results in all cases.

  15. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, the chapter presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders on the other.

  16. A Magnetic Consistency Relation

    CERN Document Server

    Jain, Rajeev Kumar

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the Cosmic Microwave Background anisotropies and Large Scale Structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  17. Consistency in Distributed Systems

    OpenAIRE

    Kemme, Bettina; Ramalingam, Ganesan; Schiper, André; Shapiro, Marc; Vaswani, Kapil

    2013-01-01

    International audience; In distributed systems, there exists a fundamental trade-off between data consistency, availability, and the ability to tolerate failures. This trade-off has significant implications on the design of the entire distributed computing infrastructure such as storage systems, compilers and runtimes, application development frameworks and programming languages. Unfortunately, it also has significant, and poorly understood, implications for the designers and developers of en...

  18. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  19. Consistent wind Facilitates Vection

    Directory of Open Access Journals (Sweden)

    Masaki Ogawa

    2011-10-01

    We examined whether a consistent haptic cue suggesting forward self-motion facilitated vection. We used a bladeless fan (Dyson AM01) providing a wind of constant strength and direction (wind speed 6.37 m/s) to the subjects' faces, with the visual stimuli visible through the fan. We used an optic flow of expansion or contraction created by positioning 16,000 dots at random inside a simulated cube (length 20 m) and moving the observer's viewpoint to simulate forward or backward self-motion at 16 m/s. We tested three conditions for fan operation: normal operation, normal operation with the fan reversed (i.e., no wind), and no operation (no wind and no sound). Vection was facilitated by the wind (shorter latency, longer duration, and larger magnitude values) with the expansion stimuli. The fan noise did not facilitate vection. The wind neither facilitated nor inhibited vection with the contraction stimuli, perhaps because a headwind is not consistent with backward self-motion. We speculate that consistency between modalities is a key factor in facilitating vection.

  20. Infanticide and moral consistency.

    Science.gov (United States)

    McMahan, Jeff

    2013-05-01

    The aim of this essay is to show that there are no easy options for those who are disturbed by the suggestion that infanticide may on occasion be morally permissible. The belief that infanticide is always wrong is doubtfully compatible with a range of widely shared moral beliefs that underlie various commonly accepted practices. Any set of beliefs about the morality of abortion, infanticide and the killing of animals that is internally consistent and even minimally credible will therefore unavoidably contain some beliefs that are counterintuitive.

  1. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with large-scale data management systems is ensuring consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software used by the ATLAS Distributed Data Management system has been extended to automatically handle lost or unregistered files (also known as dark data). The system automatically detects these inconsistencies and takes actions, such as recovery or deletion of unneeded files, in a central manner. In this talk, we present the system, explain its internals, and give some results.
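
The catalog-versus-storage cross-check described above amounts, at its core, to a set difference in each direction. A toy sketch of the idea (file names and data layout are hypothetical; this is not Rucio's actual API):

```python
# Hypothetical sketch: cross-check a file catalog against a storage-element
# dump to find "dark data" (present on disk but unregistered) and lost files
# (registered but missing on disk).
catalog = {"file_a", "file_b", "file_c"}   # what the catalog claims exists
storage = {"file_b", "file_c", "file_d"}   # what a storage dump actually lists

dark_data = storage - catalog    # unregistered files: candidates for deletion
lost_files = catalog - storage   # missing files: candidates for recovery

print(sorted(dark_data))   # ['file_d']
print(sorted(lost_files))  # ['file_a']
```

In a production system the two listings are large periodic dumps, so the comparison is done on sorted dumps or in a database rather than in memory, but the direction of each difference determines the action taken, exactly as here.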

  2. When is holography consistent?

    Energy Technology Data Exchange (ETDEWEB)

    McInnes, Brett, E-mail: matmcinn@nus.edu.sg [National University of Singapore (Singapore); Ong, Yen Chin, E-mail: yenchin.ong@nordita.org [Nordita, KTH Royal Institute of Technology and Stockholm University, Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden)

    2015-09-15

    Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognized; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, is satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold and, second, in the presence of angular momentum. Focusing on the application of holography to the quark–gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur. This suggests that the consistency condition is a “law of physics” expressing a particular aspect of holography.

  3. Consistent quantum measurements

    Science.gov (United States)

    Griffiths, Robert B.

    2015-11-01

    In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.

  4. When Is Holography Consistent?

    CERN Document Server

    McInnes, Brett

    2015-01-01

    Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognised; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, are satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold, and, second, in the presence of angular momentum. Focusing on the application of holography to the quark-gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur...

  5. Evaluating the consistency of the 1982-1999 NDVI trends in the Iberian Peninsula across four time-series derived from the AVHRR sensor: LTDR, GIMMS, FASIR, and PAL-II.

    Science.gov (United States)

    Alcaraz-Segura, Domingo; Liras, Elisa; Tabik, Siham; Paruelo, José; Cabello, Javier

    2010-01-01

    Successive efforts have processed the Advanced Very High Resolution Radiometer (AVHRR) sensor archive to produce Normalized Difference Vegetation Index (NDVI) datasets (i.e., PAL, FASIR, GIMMS, and LTDR) under different corrections and processing schemes. Since NDVI datasets are used to evaluate carbon gains, differences among them may affect nations' carbon budgets in meeting international targets (such as the Kyoto Protocol). This study addresses the consistency across AVHRR NDVI datasets in the Iberian Peninsula (Spain and Portugal) by evaluating whether their 1982-1999 NDVI trends show similar spatial patterns. Significant trends were calculated with the seasonal Mann-Kendall trend test and their spatial consistency with partial Mantel tests. Over 23% of the Peninsula (N, E, and central mountain ranges) showed positive and significant NDVI trends across the four datasets, and an additional 18% across three datasets. In 20% of Iberia (SW quadrant), the four datasets exhibited an absence of significant trends, and an additional 22% across three datasets. Significant NDVI decreases were scarce (croplands in the Guadalquivir and Segura basins, La Mancha plains, and Valencia). Spatial consistency of significant trends across at least three datasets was observed in 83% of the Peninsula, but it decreased to 47% when comparing across all four datasets. FASIR, PAL, and LTDR were the most spatially similar datasets, while GIMMS was the most different. The differing ability of each AVHRR dataset to detect significant NDVI trends (e.g., LTDR detected greater significant trends, both positive and negative, and in 32% more pixels than GIMMS) has great implications for evaluating carbon budgets. The lack of spatial consistency across NDVI datasets derived from the same AVHRR sensor archive makes it advisable to evaluate carbon gain trends using several satellite datasets and, where possible, independent or additional data sources for comparison.
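
The seasonal Mann-Kendall test used in this study computes the classic Mann-Kendall S statistic separately for each season (e.g. each calendar month) and sums the results, so that the seasonal cycle itself does not masquerade as a trend. A minimal sketch of that statistic (illustrative only; the study's full test also involves variance estimation and significance levels, which are omitted here):

```python
# Minimal seasonal Mann-Kendall S statistic (no variance/p-value step).
def mk_s(series):
    """Classic Mann-Kendall S: count of increasing minus decreasing pairs."""
    return sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(len(series))
        for j in range(i + 1, len(series))
    )

def seasonal_mk_s(values, n_seasons=12):
    """Sum S over each season's subseries (e.g. all Januaries, all Februaries...)."""
    return sum(mk_s(values[s::n_seasons]) for s in range(n_seasons))

# Toy NDVI-like monthly series: 3 years, a summer bump, and a small
# year-over-year increase of 0.02.
ndvi = [0.30 + 0.02 * (t // 12) + 0.10 * (t % 12 in (5, 6, 7)) for t in range(36)]
print(seasonal_mk_s(ndvi))  # 36: every season's subseries increases year over year
```

Because each month is compared only with the same month in other years, the large summer bump contributes nothing to S; only the year-over-year increase does.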

  6. Evaluating the Consistency of the 1982–1999 NDVI Trends in the Iberian Peninsula across Four Time-series Derived from the AVHRR Sensor: LTDR, GIMMS, FASIR, and PAL-II

    Directory of Open Access Journals (Sweden)

    José Paruelo

    2010-02-01

    Successive efforts have processed the Advanced Very High Resolution Radiometer (AVHRR) sensor archive to produce Normalized Difference Vegetation Index (NDVI) datasets (i.e., PAL, FASIR, GIMMS, and LTDR) under different corrections and processing schemes. Since NDVI datasets are used to evaluate carbon gains, differences among them may affect nations’ carbon budgets in meeting international targets (such as the Kyoto Protocol). This study addresses the consistency across AVHRR NDVI datasets in the Iberian Peninsula (Spain and Portugal) by evaluating whether their 1982–1999 NDVI trends show similar spatial patterns. Significant trends were calculated with the seasonal Mann-Kendall trend test and their spatial consistency with partial Mantel tests. Over 23% of the Peninsula (N, E, and central mountain ranges) showed positive and significant NDVI trends across the four datasets, and an additional 18% across three datasets. In 20% of Iberia (SW quadrant), the four datasets exhibited an absence of significant trends, and an additional 22% across three datasets. Significant NDVI decreases were scarce (croplands in the Guadalquivir and Segura basins, La Mancha plains, and Valencia). Spatial consistency of significant trends across at least three datasets was observed in 83% of the Peninsula, but it decreased to 47% when comparing across all four datasets. FASIR, PAL, and LTDR were the most spatially similar datasets, while GIMMS was the most different. The differing ability of each AVHRR dataset to detect significant NDVI trends (e.g., LTDR detected greater significant trends, both positive and negative, and in 32% more pixels than GIMMS) has great implications for evaluating carbon budgets. The lack of spatial consistency across NDVI datasets derived from the same AVHRR sensor archive makes it advisable to evaluate carbon gain trends using several satellite datasets and, where possible, independent or additional data sources for comparison.

  8. Time Machines and the Principle of Self-Consistency as a Consequence of the Principle of Stationary Action (ii):. the Cauchy Problem for a Self-Interacting Relativistic Particle

    Science.gov (United States)

    Carlini, A.; Novikov, I. D.

    We consider the action principle to derive the classical, relativistic motion of a self-interacting particle in a 4D Lorentzian spacetime containing a wormhole and which allows the existence of closed time-like curves. In particular, we study the case of a pointlike particle subject to a “hard-sphere” self-interaction potential and which can traverse the wormhole an arbitrary number of times, and show that the only possible trajectories for which the classical action is stationary are those which are globally self-consistent. Generically, the multiplicity of these trajectories (defined as the number of self-consistent solutions to the equations of motion beginning with given Cauchy data) is finite, and it becomes infinite if certain constraints on the same initial data are satisfied. This confirms the previous conclusions (for a nonrelativistic model) by Echeverria, Klinkhammer and Thorne that the Cauchy initial value problem in the presence of a wormhole “time machine” is classically “ill-posed” (far too many solutions). Our results further extend the recent claim by Novikov et al. that the “principle of self-consistency” is a natural consequence of the “principle of minimal action.”

  9. On the Initial State and Consistency Relations

    CERN Document Server

    Berezhiani, Lasha

    2014-01-01

    We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.

  10. On the initial state and consistency relations

    Energy Technology Data Exchange (ETDEWEB)

    Berezhiani, Lasha; Khoury, Justin, E-mail: lashaber@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)

    2014-09-01

    We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.

  11. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  12. cobalt (ii), nickel (ii)

    African Journals Online (AJOL)

    DR. AMINU

    ABSTRACT. The manganese (II), cobalt (II), nickel (II) and copper (II) complexes of N, N' – ... temperature and coordinated water were determined ... indicating fairly stable complex compounds (Table 1). The complex compounds are insoluble [Table 2] in water and common organic solvents, but are readily soluble in ...

  13. Consistency of trace norm minimization

    CERN Document Server

    Bach, Francis

    2007-01-01

    Regularization by the sum of singular values, also referred to as the trace norm, is a popular technique for estimating low rank rectangular matrices. In this paper, we extend some of the consistency results of the Lasso to provide necessary and sufficient conditions for rank consistency of trace norm minimization with the square loss. We also provide an adaptive version that is rank consistent even when the necessary condition for the non adaptive version is not fulfilled.
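
The trace norm discussed in this abstract is the sum of a matrix's singular values, and its proximal operator, the basic step inside most trace-norm-regularized solvers, soft-thresholds those singular values. A small sketch of that operator (ours, not the paper's estimator or consistency analysis):

```python
import numpy as np

def trace_norm(m):
    """Trace (nuclear) norm: the sum of the singular values."""
    return np.linalg.svd(m, compute_uv=False).sum()

def prox_trace_norm(m, lam):
    """Proximal operator of lam * trace norm: soft-threshold the singular
    values, which zeroes out small ones and thereby lowers the rank."""
    u, s, vt = np.linalg.svd(m, full_matrices=False)
    return u @ np.diag(np.maximum(s - lam, 0.0)) @ vt

rng = np.random.default_rng(0)
# A rank-1 matrix plus small noise: thresholding should recover low rank.
m = np.outer(rng.normal(size=5), rng.normal(size=4)) + 0.01 * rng.normal(size=(5, 4))
low_rank = prox_trace_norm(m, lam=0.5)
print(np.linalg.matrix_rank(low_rank))  # low rank after soft-thresholding
```

This mirrors how the Lasso's soft-thresholding of coefficients carries over to matrices, which is exactly the analogy the paper exploits when extending Lasso consistency results to rank consistency.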

  14. High SNR Consistent Compressive Sensing

    OpenAIRE

    Kallummil, Sreejith; Kalyani, Sheetal

    2017-01-01

    High signal to noise ratio (SNR) consistency of model selection criteria in linear regression models has attracted a lot of attention recently. However, most of the existing literature on high SNR consistency deals with model order selection. Further, the limited literature available on the high SNR consistency of subset selection procedures (SSPs) is applicable to linear regression with full rank measurement matrices only. Hence, the performance of SSPs used in underdetermined linear models ...

  15. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistent argumentation can improve students' thinking skills and is important in science. The study aims to assess the consistency of students' argumentation about fluids. The population of this study comprises college students from PGRI Madiun, UIN Sunan Kalijaga Yogyakarta, and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a multiple-choice test and reasoned interviews. The fluid problems were modified from [9] and [1]. The results show average argumentation consistency of 4.85% for correct consistency, 29.93% for incorrect consistency, and 65.23% for inconsistency, respectively. These data point to a lack of understanding of the fluid material, which fully consistent argumentation would ideally expand into understanding of the concept. The results serve as a reference for improvements in future studies aimed at obtaining a positive change in the consistency of argumentation.

  16. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  17. Consistency of Random Survival Forests.

    Science.gov (United States)

    Ishwaran, Hemant; Kogalur, Udaya B

    2010-07-01

    We prove uniform consistency of Random Survival Forests (RSF), a newly introduced forest ensemble learner for analysis of right-censored survival data. Consistency is proven under general splitting rules, bootstrapping, and random selection of variables-that is, under true implementation of the methodology. Under this setting we show that the forest ensemble survival function converges uniformly to the true population survival function. To prove this result we make one key assumption regarding the feature space: we assume that all variables are factors. Doing so ensures that the feature space has finite cardinality and enables us to exploit counting process theory and the uniform consistency of the Kaplan-Meier survival function.
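The Kaplan-Meier product-limit estimator that the proof leans on can be sketched in a few lines of plain Python (an illustrative implementation, not the authors' code):

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier product-limit estimate of the survival function.
    durations: observed times; events: 1 = event, 0 = right-censored.
    Returns (time, S(time)) pairs at the observed event times."""
    s = 1.0
    curve = []
    for t in sorted({d for d, e in zip(durations, events) if e == 1}):
        at_risk = sum(1 for d in durations if d >= t)
        deaths = sum(1 for d, e in zip(durations, events) if d == t and e == 1)
        s *= 1.0 - deaths / at_risk  # survival drops only at event times
        curve.append((t, s))
    return curve

# Small worked example: one subject censored at t=3.
# S(2) = 3/4, S(3) = 3/4 * 2/3 = 1/2, S(5) = 0.
print(kaplan_meier([2, 3, 3, 5], [1, 1, 0, 1]))
```

The censored subject leaves the risk set after t=3 without triggering a drop in S, which is exactly how censoring enters the product-limit formula.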

  18. Process Fairness and Dynamic Consistency

    NARCIS (Netherlands)

    S.T. Trautmann (Stefan); P.P. Wakker (Peter)

    2010-01-01

    textabstractAbstract: When process fairness deviates from outcome fairness, dynamic inconsistencies can arise as in nonexpected utility. Resolute choice (Machina) can restore dynamic consistency under nonexpected utility without using Strotz's precommitment. It can similarly justify dynamically

  19. Gravitation, Causality, and Quantum Consistency

    CERN Document Server

    Hertzberg, Mark P

    2016-01-01

    We examine the role of consistency with causality and quantum mechanics in determining the properties of gravitation. We begin by constructing two different classes of interacting theories of massless spin 2 particles -- gravitons. One involves coupling the graviton with the lowest number of derivatives to matter, the other involves coupling the graviton with higher derivatives to matter, making use of the linearized Riemann tensor. The first class requires an infinite tower of terms for consistency, which is known to lead uniquely to general relativity. The second class only requires a finite number of terms for consistency, which appears as a new class of theories of massless spin 2. We recap the causal consistency of general relativity and show how this fails in the second class for the special case of coupling to photons, exploiting related calculations in the literature. In an upcoming publication [1] this result is generalized to a much broader set of theories. Then, as a causal modification of general ...

  20. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati

  1. Consistent Histories in Quantum Cosmology

    CERN Document Server

    Craig, David A; 10.1007/s10701-010-9422-6

    2010-01-01

    We illustrate the crucial role played by decoherence (consistency of quantum histories) in extracting consistent quantum probabilities for alternative histories in quantum cosmology. Specifically, within a Wheeler-DeWitt quantization of a flat Friedmann-Robertson-Walker cosmological model sourced with a free massless scalar field, we calculate the probability that the universe is singular in the sense that it assumes zero volume. Classical solutions of this model are a disjoint set of expanding and contracting singular branches. A naive assessment of the behavior of quantum states which are superpositions of expanding and contracting universes may suggest that a "quantum bounce" is possible, i.e. that the wave function of the universe may remain peaked on a non-singular classical solution throughout its history. However, a more careful consistent histories analysis shows that for arbitrary states in the physical Hilbert space the probability of this Wheeler-DeWitt quantum universe encountering the big bang/crun...

  2. The Importance of being consistent

    CERN Document Server

    Wasserman, Adam; Jiang, Kaili; Kim, Min-Cheol; Sim, Eunji; Burke, Kieron

    2016-01-01

    We review the role of self-consistency in density functional theory. We apply a recent analysis to both Kohn-Sham and orbital-free DFT, as well as to Partition-DFT, which generalizes all aspects of standard DFT. In each case, the analysis distinguishes between errors in approximate functionals versus errors in the self-consistent density. This yields insights into the origins of many errors in DFT calculations, especially those often attributed to self-interaction or delocalization error. In many classes of problems, errors can be substantially reduced by using `better' densities. We review the history of these approaches, many of their applications, and give simple pedagogical examples.

  3. Consistent supersymmetric decoupling in cosmology

    NARCIS (Netherlands)

    Sousa Sánchez, Kepa

    2012-01-01

    The present work discusses several problems related to the stability of ground states with broken supersymmetry in supergravity, and to the existence and stability of cosmic strings in various supersymmetric models. In particular we study the necessary conditions to truncate consistently a sector o

  4. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will accordingly grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
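A toy version of such a set-based consistency check (our illustration, not the paper's formalization): treat each rule's source and destination as address sets, and flag pairs whose sets overlap while their actions differ.

```python
from ipaddress import ip_network

# A rule: (source network, destination network, action)
rules = [
    ("10.0.0.0/24",   "0.0.0.0/0",      "allow"),
    ("10.0.0.0/16",   "192.168.1.0/24", "deny"),
    ("172.16.0.0/12", "0.0.0.0/0",      "deny"),
]

def conflicts(rules):
    """Return index pairs (i, j) of rules whose source AND destination
    ranges intersect but whose actions disagree -- a semantic conflict."""
    out = []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            s1, d1, a1 = rules[i]
            s2, d2, a2 = rules[j]
            if (a1 != a2
                    and ip_network(s1).overlaps(ip_network(s2))
                    and ip_network(d1).overlaps(ip_network(d2))):
                out.append((i, j))
    return out

# Rules 0 and 1 both match traffic from 10.0.0.x to 192.168.1.x,
# yet one allows and the other denies -> flagged.
print(conflicts(rules))
```

A real validator would also account for rule ordering (shadowing vs. correlation), but the pairwise set-intersection test above is the core of the semantic definition.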

  5. Self-consistent triaxial models

    CERN Document Server

    Sanders, Jason L

    2015-01-01

    We present self-consistent triaxial stellar systems that have analytic distribution functions (DFs) expressed in terms of the actions. These provide triaxial density profiles with cores or cusps at the centre. They are the first self-consistent triaxial models with analytic DFs suitable for modelling giant ellipticals and dark haloes. Specifically, we study triaxial models that reproduce the Hernquist profile from Williams & Evans (2015), as well as flattened isochrones of the form proposed by Binney (2014). We explore the kinematics and orbital structure of these models in some detail. The models typically become more radially anisotropic on moving outwards, have velocity ellipsoids aligned in Cartesian coordinates in the centre and aligned in spherical polar coordinates in the outer parts. In projection, the ellipticity of the isophotes and the position angle of the major axis of our models generally changes with radius. So, a natural application is to elliptical galaxies that exhibit isophote twisting....

  6. On Modal Refinement and Consistency

    DEFF Research Database (Denmark)

    Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej

    2007-01-01

    Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...... notions of implementation, is shown to be computationally hard (co-NP hard). Second, we consider four forms of consistency (existence of implementations) for modal specifications. We characterize each operationally, giving algorithms for deciding, and for synthesizing implementations, together...

  7. Tri-Sasakian consistent reduction

    CERN Document Server

    Cassani, Davide

    2011-01-01

    We establish a universal consistent Kaluza-Klein truncation of M-theory based on seven-dimensional tri-Sasakian structure. The four-dimensional truncated theory is an N=4 gauged supergravity with three vector multiplets and a non-abelian gauge group, containing the compact factor SO(3). Consistency follows from the fact that our truncation takes exactly the same form as a left-invariant reduction on a specific coset manifold, and we show that the same holds for the various universal consistent truncations recently put forward in the literature. We describe how the global symmetry group SL(2,R) x SO(6,3) is embedded in the symmetry group E7(7) of maximally supersymmetric reductions, and make the connection with the approach of Exceptional Generalized Geometry. Vacuum AdS4 solutions spontaneously break the amount of supersymmetry from N=4 to N=3,1 or 0, and the spectrum contains massive modes. We find a subtruncation to minimal N=3 gauged supergravity as well as an N=1 subtruncation to the SO(3)-invariant secto...

  8. On the consistency of MPS

    CERN Document Server

    Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L

    2013-01-01

    The consistency of Moving Particle Semi-implicit (MPS) method in reproducing the gradient, divergence and Laplacian differential operators is discussed in the present paper. Its relation to the Smoothed Particle Hydrodynamics (SPH) method is rigorously established. The application of the MPS method to solve the Navier-Stokes equations using a fractional step approach is treated, unveiling inconsistency problems when solving the Poisson equation for the pressure. A new corrected MPS method incorporating boundary terms is proposed. Applications to one dimensional boundary value Dirichlet and mixed Neumann-Dirichlet problems and to two-dimensional free-surface flows are presented.

  9. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    with a 5-point Likert scale and a corresponding scoring system. Process consistency is measured by using a first-person drawing tool with the respondent in the centre. Respondents sketch the sequence of steps and people they contact when configuring a product. The methodology is tested in one company...... for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation...

  10. Consistency of detrended fluctuation analysis

    Science.gov (United States)

    Løvsletten, O.

    2017-07-01

    The scaling function F(s) in detrended fluctuation analysis (DFA) scales as F(s) ~ s^H for stochastic processes with Hurst exponent H. This scaling law is proven for stationary stochastic processes with 0 < H < 1 and for nonstationary processes with 1 < H < 2, and it is shown that the fluctuation function for a signal with a short-range (rather than power-law) autocorrelation function (ACF) scales as ~ s^{1/2}. It is also demonstrated that the fluctuation function in DFA is equal in expectation to (i) a weighted sum of the ACF and (ii) a weighted sum of the second-order structure function. These results enable us to compute the exact finite-size bias for signals that are scaling and to employ DFA in a meaningful sense for signals that do not exhibit power-law statistics. The usefulness is illustrated by examples where it is demonstrated that a previously suggested modified DFA will increase the bias for signals with Hurst exponents 1 < H < 2.
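A compact order-1 DFA can be written in a few lines of NumPy (a generic textbook implementation, not the paper's code); for white noise the fitted exponent should sit near H = 1/2:

```python
import numpy as np

def dfa_fluctuation(x, scales):
    """Order-1 DFA: integrate the signal, split the profile into windows
    of length s, remove a linear fit per window, return RMS fluctuation F(s)."""
    y = np.cumsum(x - np.mean(x))          # the "profile"
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        res = []
        for seg in segs:
            a, b = np.polyfit(t, seg, 1)   # linear detrend per window
            res.append(np.mean((seg - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    return np.array(F)

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)              # white noise, true H = 1/2
scales = np.array([16, 32, 64, 128, 256])
F = dfa_fluctuation(x, scales)
H_est = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(H_est, 2))                     # close to 0.5 for white noise
```

The log-log slope is the usual DFA estimate of H; the abstract's point is that the expectation of F(s)^2 can be expressed exactly through the ACF, so the finite-size bias of this slope can be computed rather than only observed.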

  11. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
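The mutual-exclusion constructs mentioned above can be illustrated with a short threading sketch (illustrative only; the abstract's setting is distributed process groups, not local threads):

```python
import threading

counter = 0
lock = threading.Lock()            # the mutual-exclusion construct

def increment(n):
    global counter
    for _ in range(n):
        with lock:                 # only one thread updates at a time
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000: the lock prevents lost updates
```

Transactions and linearizability generalize this per-variable exclusion to sets of related operations on persistent or replicated data, which is where the virtual synchrony tools discussed in the abstract come in.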

  12. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  13. The Consistent Vehicle Routing Problem

    Energy Technology Data Exchange (ETDEWEB)

    Groer, Christopher S [ORNL; Golden, Bruce [University of Maryland; Edward, Wasil [American University

    2009-01-01

    In the small package shipping industry (as in other industries), companies try to differentiate themselves by providing high levels of customer service. This can be accomplished in several ways, including online tracking of packages, ensuring on-time delivery, and offering residential pickups. Some companies want their drivers to develop relationships with customers on a route and have the same drivers visit the same customers at roughly the same time on each day that the customers need service. These service requirements, together with traditional constraints on vehicle capacity and route length, define a variant of the classical capacitated vehicle routing problem, which we call the consistent VRP (ConVRP). In this paper, we formulate the problem as a mixed-integer program and develop an algorithm to solve the ConVRP that is based on the record-to-record travel algorithm. We compare the performance of our algorithm to the optimal mixed-integer program solutions for a set of small problems and then apply our algorithm to five simulated data sets with 1,000 customers and a real-world data set with more than 3,700 customers. We provide a technique for generating ConVRP benchmark problems from vehicle routing problem instances given in the literature and provide our solutions to these instances. The solutions produced by our algorithm on all problems do a very good job of meeting customer service objectives with routes that have a low total travel time.

  14. Quininium tetrachloridozinc(II)

    Directory of Open Access Journals (Sweden)

    Li-Zhuang Chen

    2009-10-01

    Full Text Available The asymmetric unit of the title compound {systematic name: 2-[hydroxy(6-methoxyquinolin-1-ium-4-yl)methyl]-8-vinylquinuclidin-1-ium tetrachloridozinc(II)}, (C20H26N2O2)[ZnCl4], consists of a doubly protonated quininium cation and a tetrachloridozinc(II) anion. The ZnII ion is in a slightly distorted tetrahedral coordination environment. The crystal structure is stabilized by intermolecular N—H...Cl and O—H...Cl hydrogen bonds.

  15. Market-consistent valuation of pension liabilities

    NARCIS (Netherlands)

    Pelsser, A.; Vlaar, P.

    2008-01-01

    The European Union is currently preparing a new set of rules for the supervision of insurance companies, and is considering implementing these rules for pension funds as well. This framework is known as Solvency II. Similar to the Basle II framework for banking supervision, Solvency II is based on m

  16. Consistent Sets and Contrary Inferences Reply to Griffiths and Hartle

    CERN Document Server

    Kent, A

    1998-01-01

    It was pointed out recently [A. Kent, Phys. Rev. Lett. 78 (1997) 2874] that the consistent histories approach allows contrary inferences to be made from the same data, corresponding to commuting orthogonal projections in different consistent sets. To many, this seems undesirable in a theory of physical inferences. It also raises a specific problem for the consistent histories formalism, since that formalism is set up so as to eliminate contradictory inferences, yet there seems to be no sensible physical distinction between contradictory and contrary inferences. It seems particularly hard to defend this asymmetry, since (i) there is a well-defined quantum histories formalism which admits both contradictory and contrary inferences, and (ii) there is also a well-defined formalism, based on ordered consistent sets of histories, which excludes both. In a recent comment, Griffiths and Hartle, while accepting the validity of the examples given in the above paper, restate their own preference for the consistent hist...

  17. One-, Two-, and Three-Dimensional Heterospin Complexes Consisting of 4-(N-tert-Butyloxylamino)pyridine (4NOpy), Dicyanamide Ion (DCA), and 3d Metal Ions: Crystal Structures and Magnetic Properties of [M(II)(4NOpy)x(DCA)y(CH3CN)z]n (M = Mn, Co, Ni, Cu, Zn).

    Science.gov (United States)

    Ogawa, Hiraku; Mori, Koya; Murashima, Kensuke; Karasawa, Satoru; Koga, Noboru

    2016-01-19

    Solutions of 3d metal ion salts, M(NO3)2, 4-(N-tert-butyloxylamino)pyridine (4NOpy), and dicyanamide (DCA) in CH3CN were mixed to afford single crystals of the polymeric complexes [M(II)(4NOpy)x(DCA)y(CH3CN)z]n (M(II) = Mn (1), Co (2), Ni (3), Cu (4a and 4b), Zn (5)). X-ray crystallography revealed that the crystal structures are a three-dimensional (3-D) network for 1, 2-D networks for 2, 3, 4a, and 5, and a 1-D chain for 4b. Crystals of 2, 3, 4a, and 5 contained CH3CN molecules as crystal solvents, which were readily desorbed in the ambient atmosphere. After desorption of the CH3CN molecules, the crystal structures of 2 and 3 were confirmed to be slightly shrunk without destruction of the crystal lattice. Crystals of 2, 3, 4a, and 5 after desorption of crystal solvents were used for investigations of the magnetic properties. Complex 1 showed antiferromagnetic interactions to form a ferrimagnetic chain and exhibited the magnetic behavior of a 2-D (or 3-D) spin-canted antiferromagnet with TN = 12 K. Complex 2 containing anisotropic Co(II) ions also showed the behavior of a 1-D (or 2-D) spin-canted antiferromagnet with TN = 6 K. In 3, 4a, and 4b, the aminoxyl of 4NOpy ferromagnetically interacted with the metal ion with coupling constants of JM-NO/kB = 45, 45, and 43 K, respectively. In 5, the magnetic couplings between the aminoxyls in 4NOpy through the diamagnetic Zn(II) ion were weakly antiferromagnetic (JNO-NO = -1.2 K). DCA might be a weak antiferromagnetic connector for the metal chains.

  18. Consistent Design of Dependable Control Systems

    DEFF Research Database (Denmark)

    Blanke, M.

    1996-01-01

    Design of fault handling in control systems is discussed, and a method for consistent design is presented.

  19. MDCC: Multi-Data Center Consistency

    CERN Document Server

    Kraska, Tim; Franklin, Michael J; Madden, Samuel

    2012-01-01

    Replicating data across multiple data centers not only allows moving the data closer to the user and, thus, reduces latency for applications, but also increases the availability in the event of a data center failure. Therefore, it is not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive. Inter-data center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication is therefore considered to be unfeasible with strong consistency and current solutions either settle for asynchronous replication which implies the risk of losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol, that does not require a master or partitioning, and is strongly consistent at a cost similar to eventually consiste...

  20. A dual-consistency cache coherence protocol

    OpenAIRE

    Ros, Alberto; Jimborean, Alexandra

    2015-01-01

    Weak memory consistency models can maximize system performance by enabling hardware and compiler optimizations, but increase programming complexity since they do not match programmers’ intuition. The design of an efficient system with an intuitive memory model is an open challenge. This paper proposes SPEL, a dual-consistency cache coherence protocol which simultaneously guarantees the strongest memory consistency model provided by the hardware and yields improvements in both performance and ...

  1. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady-states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
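As a concrete illustration of the scalar variant described above (our sketch, not the poster's algorithm), hull consistency for a single constraint x + y = c narrows each variable's interval by projecting the constraint onto it:

```python
def hull_consistency_sum(x, y, c):
    """Narrow intervals x=[xl,xu] and y=[yl,yu] under the constraint
    x + y = c, one variable at a time (the scalar approach)."""
    xl, xu = x
    yl, yu = y
    # Project onto x: x = c - y  =>  x in [c - yu, c - yl]
    xl, xu = max(xl, c - yu), min(xu, c - yl)
    # Then onto y, using the already-narrowed x
    yl, yu = max(yl, c - xu), min(yu, c - xl)
    if xl > xu or yl > yu:
        raise ValueError("constraint is inconsistent with the box")
    return (xl, xu), (yl, yu)

# x in [0,10], y in [1,3], x + y = 4  =>  x narrows to [1,3]
print(hull_consistency_sum((0.0, 10.0), (1.0, 3.0), 4.0))
```

The generalization suggested in the poster treats several equations and variables simultaneously, i.e. the projection step above becomes a small interval linear system rather than a scalar update.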

  2. Consistent estimators in random censorship semiparametric models

    Institute of Scientific and Technical Information of China (English)

    王启华

    1996-01-01

    For the fixed design regression model, when the Y_i are randomly censored on the right, estimators of the unknown parameter and regression function g from the censored observations are defined in two cases, where the censoring distribution is known and unknown, respectively. Moreover, the sufficient conditions under which these estimators are strongly consistent and pth (p>2) mean consistent are also established.

  3. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  4. Consistent truncations with massive modes and holography

    CERN Document Server

    Cassani, Davide; Faedo, Anton F

    2011-01-01

    We review the basic features of some recently found consistent Kaluza-Klein truncations including massive modes. We emphasize the general ideas underlying the reduction procedure, then we focus on type IIB supergravity on 5-dimensional manifolds admitting a Sasaki-Einstein structure, which leads to half-maximal gauged supergravity in five dimensions. Finally, we comment on the holographic picture of consistency.

  5. CONSISTENT AGGREGATION IN FOOD DEMAND SYSTEMS

    OpenAIRE

    Levedahl, J. William; Reed, Albert J.; Clark, J. Stephen

    2002-01-01

    Two aggregation schemes for food demand systems are tested for consistency with the Generalized Composite Commodity Theorem (GCCT). One scheme is based on the standard CES classification of food expenditures. The second scheme is based on the Food Guide Pyramid. Evidence is found that both schemes are consistent with the GCCT.

  6. A Framework of Memory Consistency Models

    Institute of Scientific and Technical Information of China (English)

    胡伟武; 施巍松; et al.

    1998-01-01

    Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes the memory consistency model at the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a given consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and correct implementations of consistency models. Regarding an implementation of a consistency model as a set of memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.

  7. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arb...

  8. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    Geest, van der Thea; Loorbach, Nicole

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to agre

  9. Putting Consistent Theories Together in Institutions

    Institute of Scientific and Technical Information of China (English)

    应明生

    1995-01-01

    The problem of putting consistent theories together in institutions is discussed. A general necessary condition for the consistency of the resulting theory is derived, and some sufficient conditions are given for diagrams of theories whose shapes are tree bundles or directed graphs. Moreover, some transformations from complicated cases to simple ones are established.

  10. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult....... This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its...... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  11. Self-Consistent Asset Pricing Models

    CERN Document Server

    Malevergne, Y

    2006-01-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alpha's and beta's of the factor model are unobservable. Self-consistency leads to renormalized beta's with zero effective alpha's, which are observable with standard OLS regressions. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value $\\alpha_i$ at the origin between an asset $i$'s return and the proxy's return. Self-consistency also introduces ``orthogonality'' and ``normality'' conditions linking the beta's, alpha's (as well as the residuals) and the weights of the proxy por...
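The self-consistency constraints can be verified numerically: when the market return is exactly the weighted sum of the asset returns, the weight-averaged OLS betas equal one and the weight-averaged alphas vanish identically (a sketch with synthetic data, not the paper's setup):

```python
import numpy as np

rng = np.random.default_rng(42)
T, N = 500, 5
R = rng.standard_normal((T, N)) * 0.02        # asset returns (T periods, N assets)
w = np.array([0.3, 0.25, 0.2, 0.15, 0.1])     # portfolio weights, sum to 1
m = R @ w                                     # market return built FROM the assets

var_m = np.var(m)                             # ddof=0, matching bias=True below
betas = np.array([np.cov(R[:, i], m, bias=True)[0, 1] / var_m for i in range(N)])
alphas = R.mean(axis=0) - betas * m.mean()

# Self-consistency: sum_i w_i beta_i = 1 and sum_i w_i alpha_i = 0,
# exactly (up to floating-point round-off), by bilinearity of covariance.
print(w @ betas, w @ alphas)
```

This is the "orthogonality/normality" flavor of constraint the abstract refers to: because the explanatory factor is built from the very returns it explains, the cross-sectional averages of the regression coefficients are pinned down identically, not just statistically.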

  12. Quasiparticle self-consistent GW theory.

    Science.gov (United States)

    van Schilfgaarde, M; Kotani, Takao; Faleev, S

    2006-06-09

    In past decades the scientific community has been looking for a reliable first-principles method to predict the electronic structure of solids with high accuracy. Here we present an approach which we call the quasiparticle self-consistent approximation. It is based on a kind of self-consistent perturbation theory, where the self-consistency is constructed to minimize the perturbation. We apply it to selections from different classes of materials, including alkali metals, semiconductors, wide band gap insulators, transition metals, transition metal oxides, magnetic insulators, and rare earth compounds. Apart from some mild exceptions, the properties are very well described, particularly in weakly correlated cases. Self-consistency dramatically improves agreement with experiment, and is sometimes essential. Discrepancies with experiment are systematic, and can be explained in terms of approximations made.

  13. Consistency Relations for Large Field Inflation

    CERN Document Server

    Chiba, Takeshi

    2014-01-01

    Consistency relations are given for chaotic inflation with a monomial potential, natural inflation, and hilltop inflation, involving the scalar spectral index $n_s$, the tensor-to-scalar ratio $r$ and the running of the spectral index $\alpha$. The measurement of $\alpha$ at the $O(10^{-3})$ level and improved measurements of $n_s$ could discriminate the monomial model from natural/hilltop inflation models. A consistency region for general large field models is also presented.
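
    For orientation, the standard slow-roll expressions for a monomial potential can be combined into such a consistency relation. These are textbook slow-roll results reconstructed by the editor, not formulas quoted from the paper.

```latex
% Slow-roll predictions for V(\phi) \propto \phi^n, N e-folds before the end of inflation:
n_s - 1 = -\frac{n+2}{2N}, \qquad r = \frac{4n}{N}, \qquad
\alpha \equiv \frac{\mathrm{d} n_s}{\mathrm{d}\ln k} = -\frac{n+2}{2N^2}
% Eliminating N gives relations among the observables alone:
r = \frac{8n(1-n_s)}{n+2}, \qquad \alpha = -\frac{2(1-n_s)^2}{n+2}
```

    Since $\alpha$ is of order $N^{-2} \sim 10^{-3}$, a measurement of $\alpha$ at that level is exactly what is needed to test relations of this type.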

  14. Entropy-based consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław Jerzy

    2016-09-01

    A description of software architecture is a plan of the IT system construction; any gaps in the architecture therefore affect the overall success of the entire project. Most definitions describe software architecture as a set of mutually unrelated, hence potentially inconsistent, views. Software architecture completeness is also often described ambiguously. As a result, most methods of building IT systems contain many gaps and ambiguities, which obstruct the automation of software building. In this article the consistency and completeness of software architecture are defined mathematically, based on the entropy of the architecture description. Following this approach, we also propose a method for automatic verification of the consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects can assess whether ongoing modelling work is ready for IT system building to start, and even assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of this approach is that it facilitates the preparation of complete and consistent software architecture more effectively, and enables assessing and monitoring the status of ongoing modelling work. We demonstrate this with a few industry examples of IT system designs.

  15. Consistency and Derangements in Brane Tilings

    CERN Document Server

    Hanany, Amihay; Ramgoolam, Sanjaye; Seong, Rak-Kyeong

    2015-01-01

    Brane tilings describe Lagrangians (vector multiplets, chiral multiplets, and the superpotential) of four dimensional $\mathcal{N}=1$ supersymmetric gauge theories. These theories, written in terms of a bipartite graph on a torus, correspond to worldvolume theories on $N$ D3-branes probing a toric Calabi-Yau threefold singularity. A pair of permutations compactly encapsulates the data necessary to specify a brane tiling. We show that geometric consistency for brane tilings, which ensures that the corresponding quantum field theories are well behaved, imposes constraints on the pair of permutations, restricting certain products constructed from the pair to have no one-cycles. Permutations without one-cycles are known as derangements. We illustrate this formulation of consistency with known brane tilings. Counting formulas for consistent brane tilings with an arbitrary number of chiral bifundamental fields are written down in terms of delta functions over symmetric groups.
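
    The combinatorial condition stated above reduces to checking that certain permutations are derangements, i.e. have no fixed points (one-cycles). The following generic sketch of derangement testing and counting is editorial and does not reproduce the paper's brane-tiling construction.

```python
from itertools import permutations

def is_derangement(perm):
    """True iff the permutation (a tuple mapping i -> perm[i]) has no
    fixed points, i.e. no one-cycles."""
    return all(p != i for i, p in enumerate(perm))

def count_derangements(n):
    """Number of derangements D(n), via D(n) = (n-1) * (D(n-1) + D(n-2))."""
    if n == 0:
        return 1
    if n == 1:
        return 0
    d_prev2, d_prev = 1, 0  # D(0), D(1)
    for k in range(2, n + 1):
        d_prev2, d_prev = d_prev, (k - 1) * (d_prev + d_prev2)
    return d_prev

# Cross-check the recurrence against brute-force enumeration for small n.
for n in range(1, 7):
    brute = sum(is_derangement(p) for p in permutations(range(n)))
    assert brute == count_derangements(n)
print([count_derangements(n) for n in range(7)])  # D(0)..D(6): [1, 0, 1, 2, 9, 44, 265]
```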

  16. Quantifying the consistency of scientific databases

    CERN Document Server

    Šubelj, Lovro; Boshkoska, Biljana Mileva; Kastrin, Andrej; Levnajić, Zoran

    2015-01-01

    Science is a social process with far-reaching impact on our modern society. In the recent years, for the first time we are able to scientifically study the science itself. This is enabled by massive amounts of data on scientific publications that is increasingly becoming available. The data is contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We found that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in mutual consistency of different databases, which we interpret as recipes for future bibliometric studies.

  17. Self-consistent Green's function approaches

    CERN Document Server

    Barbieri, Carlo

    2016-01-01

    We present the fundamental techniques and working equations of many-body Green's function theory for calculating ground state properties and the spectral strength. Green's function methods closely relate to other polynomial scaling approaches discussed in chapters 8 and 10. However, here we aim directly at a global view of the many-fermion structure. We derive the working equations for calculating many-body propagators, using both the Algebraic Diagrammatic Construction technique and the self-consistent formalism at finite temperature. Their implementation is discussed, as well as the inclusion of three-nucleon interactions. The self-consistency feature is essential to guarantee thermodynamic consistency. The pairing and neutron matter models introduced in previous chapters are solved and compared with the other methods in this book.

  18. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
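
    For orientation, the classic unidirectional mass-diffusion step that this letter generalises can be sketched as follows. This is an editorial reconstruction of the standard ProbS-style update on a toy user-item matrix; the paper's bidirectional, consistence-based variant is not reproduced here.

```python
import numpy as np

def mass_diffusion_scores(A, user):
    """One item -> user -> item mass-diffusion round (the classic
    unidirectional baseline) for a binary user-item matrix A.
    Returns recommendation scores for the given user."""
    item_deg = A.sum(axis=0)           # collectors per item
    user_deg = A.sum(axis=1)           # items collected per user
    f = A[user].astype(float)          # unit resource on the user's items
    # items -> users: each item shares its resource among its collectors
    g = A @ (f / np.maximum(item_deg, 1))
    # users -> items: each user shares back among their collected items
    scores = A.T @ (g / np.maximum(user_deg, 1))
    scores[A[user] > 0] = -np.inf      # never re-recommend collected items
    return scores

# Tiny example: 3 users x 4 items.
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]])
print(mass_diffusion_scores(A, user=0))  # item 2 ranks above item 3
```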

  19. A Revisit to Probability - Possibility Consistency Principles

    Directory of Open Access Journals (Sweden)

    Mamoni Dhar

    2013-03-01

    Full Text Available In this article, our main intention is to highlight the fact that the probable links between probability and possibility which were established by different authors at different points in time on the basis of some well known consistency principles cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility, and finally suggests a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of entropy of fuzzy numbers are also presented in this article.

  20. Consistent matter couplings for Plebanski gravity

    CERN Document Server

    Tennie, Felix

    2010-01-01

    We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein--Cartan gravity. As a byproduct we also show the consistency of some previous suggestions for matter actions.

  1. Consistent matter couplings for Plebanski gravity

    Science.gov (United States)

    Tennie, Felix; Wohlfarth, Mattias N. R.

    2010-11-01

    We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein-Cartan gravity. As a by-product we also show the consistency of some previous suggestions for matter actions.

  2. Pb II

    African Journals Online (AJOL)

    Windows User

    ISSN 1684–5315 ©2012 Academic Journals ... Exposure to Pb above permissible limit (50 ppb in water) .... taken and analyzed for residual metal concentration determination. ..... loss in Pb(II) sorption capacity up to five cycles of reuse of.

  3. Container II

    OpenAIRE

    Baraklianou, Stella

    2016-01-01

    Container II, self-published artists' book. The book was made on the occasion of the artists' residency at the Banff Arts Centre in Alberta, Canada. Container II is a performative piece; it worked in conjunction with the photographic installation "Stage Set: Cool Tone" (photographic floor installation, reclaimed wood, frames, 130x145cm, 2016). The photographic installation was also part of the artists' residency titled "New Materiality" at the Banff Arts Centre. Limited E...

  4. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  5. Consistency and stability of recombinant fermentations.

    Science.gov (United States)

    Wiebe, M E; Builder, S E

    1994-01-01

    Production of proteins of consistent quality in heterologous, genetically-engineered expression systems is dependent upon identifying the manufacturing process parameters which have an impact on product structure, function, or purity, validating acceptable ranges for these variables, and performing the manufacturing process as specified. One of the factors which may affect product consistency is genetic instability of the primary product sequence, as well as instability of genes which code for proteins responsible for post-translational modification of the product. Approaches have been developed for mammalian expression systems to assure that product quality is not changing through mechanisms of genetic instability. Sensitive protein analytical methods, particularly peptide mapping, are used to evaluate product structure directly, and are more sensitive in detecting genetic instability than is direct genetic analysis by nucleotide sequencing of the recombinant gene or mRNA. These methods are being employed to demonstrate that the manufacturing process consistently yields a product of defined structure from cells cultured through the range of cell ages used in the manufacturing process and well beyond the maximum cell age defined for the process. The combination of well designed validation studies which demonstrate consistent product quality as a function of cell age, and rigorous quality control of every product lot by sensitive protein analytical methods provide the necessary assurance that product structure is not being altered through mechanisms of mutation and selection.

  6. Developing consistent time series landsat data products

    Science.gov (United States)

    The Landsat series of satellites has provided a continuous earth observation data record since the early 1970s. There are increasing demands for a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...

  7. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available from a lexicon containing variants. In this paper we (the authors) address both these issues by creating ‘pseudo-phonemes’ associated with sets of ‘generation restriction rules’ to model those pronunciations that are consistently realised as two or more...

  8. On Consistency Maintenance In Service Discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan

    2005-01-01

    Communication and node failures degrade the ability of a service discovery protocol to ensure Users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We

  9. On Consistency Maintenance In Service Discovery

    NARCIS (Netherlands)

    Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan

    Communication and node failures degrade the ability of a service discovery protocol to ensure Users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency. We

  10. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, P.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  11. Computing Rooted and Unrooted Maximum Consistent Supertrees

    CERN Document Server

    van Iersel, Leo

    2009-01-01

    A chief problem in phylogenetics and database theory is the computation of a maximum consistent tree from a set of rooted or unrooted trees. Standard inputs are triplets (rooted binary trees on three leaves) or quartets (unrooted binary trees on four leaves). We give exact algorithms constructing rooted and unrooted maximum consistent supertrees in time O(2^n n^5 m^2 log(m)) for a set of m triplets (quartets), each one distinctly leaf-labeled by some subset of n labels. The algorithms extend to weighted triplets (quartets). We further present fast exact algorithms for constructing rooted and unrooted maximum consistent trees in polynomial space. Finally, for a set T of m rooted or unrooted trees with maximum degree D and distinctly leaf-labeled by some subset of a set L of n labels, we compute, in O(2^{mD} n^m m^5 n^6 log(m)) time, a tree distinctly leaf-labeled by a maximum-size subset X of L such that all trees in T, when restricted to X, are consistent with it.
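
    The basic primitive behind such algorithms, deciding whether a rooted triplet ab|c is consistent with a given tree, can be sketched with LCA comparisons. The helper below is a hypothetical illustration, not the authors' routine.

```python
def ancestors(parent, x):
    """Path from node x up to the root (inclusive), given a child -> parent map."""
    path = [x]
    while x in parent:
        x = parent[x]
        path.append(x)
    return path

def lca(parent, a, b):
    """Lowest common ancestor of a and b."""
    anc_a = set(ancestors(parent, a))
    for node in ancestors(parent, b):
        if node in anc_a:
            return node
    raise ValueError("nodes are not in the same tree")

def triplet_consistent(parent, a, b, c):
    """True iff the rooted triplet ab|c holds in the tree: the pair (a, b)
    joins strictly below the point where c joins them."""
    pair, out = lca(parent, a, b), lca(parent, a, c)
    return pair != out and out in ancestors(parent, pair)

# Tree ((a,b),c): leaves a and b meet at v, which hangs below the root r.
parent = {"a": "v", "b": "v", "v": "r", "c": "r"}
print(triplet_consistent(parent, "a", "b", "c"))  # True: ab|c holds
print(triplet_consistent(parent, "a", "c", "b"))  # False: ac|b does not
```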

  12. Addendum to "On the consistency of MPS"

    CERN Document Server

    Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L

    2013-01-01

    The analogies between the Moving Particle Semi-implicit method (MPS) and Incompressible Smoothed Particle Hydrodynamics method (ISPH) are established in this note, as an extension of the MPS consistency analysis conducted in "Souto-Iglesias et al., Computer Physics Communications, 184(3), 2013."

  13. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis is discussed. The conversion of α s1 -casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  14. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  15. A self-consistent Maltsev pulse model

    Science.gov (United States)

    Buneman, O.

    1985-04-01

    A self-consistent model for an electron pulse propagating through a plasma is presented. In this model, the charge imbalance between plasma ions, plasma electrons and pulse electrons creates the travelling potential well in which the pulse electrons are trapped.

  16. Consistent implementation of decisions in the brain.

    Directory of Open Access Journals (Sweden)

    James A R Marshall

    Full Text Available Despite the complexity and variability of decision processes, motor responses are generally stereotypical and independent of decision difficulty. How is this consistency achieved? Through an engineering analogy we consider how and why a system should be designed to realise not only flexible decision-making, but also consistent decision implementation. We specifically consider neurobiologically-plausible accumulator models of decision-making, in which decisions are made when a decision threshold is reached. To trade-off between the speed and accuracy of the decision in these models, one can either adjust the thresholds themselves or, equivalently, fix the thresholds and adjust baseline activation. Here we review how this equivalence can be implemented in such models. We then argue that manipulating baseline activation is preferable as it realises consistent decision implementation by ensuring consistency of motor inputs, summarise empirical evidence in support of this hypothesis, and suggest that it could be a general principle of decision making and implementation. Our goal is therefore to review how neurobiologically-plausible models of decision-making can manipulate speed-accuracy trade-offs using different mechanisms, to consider which of these mechanisms has more desirable decision-implementation properties, and then review the relevant neuroscientific data on which mechanism brains actually use.
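
    The threshold/baseline equivalence discussed above can be illustrated with a minimal accumulator simulation. This is an editorial sketch with illustrative parameters: because only the distance between baseline and threshold matters, lowering the threshold by some amount produces the same decision times as raising the baseline by that amount.

```python
import numpy as np

def accumulate(drift, threshold, baseline, rng, dt=0.001, noise=0.1, max_t=10.0):
    """Single noisy accumulator: integrate evidence from `baseline` until
    it reaches `threshold`; return the decision time."""
    x, t = baseline, 0.0
    while x < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return t

# Lowering the threshold by 0.2 ...
rng = np.random.default_rng(3)
t_low_threshold = [accumulate(1.0, 0.8, 0.0, rng) for _ in range(200)]
# ... versus raising the baseline by 0.2, with the same noise sequence:
rng = np.random.default_rng(3)
t_high_baseline = [accumulate(1.0, 1.0, 0.2, rng) for _ in range(200)]
print(np.mean(t_low_threshold), np.mean(t_high_baseline))
```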

  17. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder's design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint. Thi

  18. Properties and Update Semantics of Consistent Views

    Science.gov (United States)

    1985-09-01

    PROPERTIES AND UPDATE SEMANTICS OF CONSISTENT VIEWS. G. Gottlob, Institute for Applied Mathematics, C.N.R., Genova, Italy; Computer Scien... Gottlob G., Paolini P., Zicari R., "Proving Properties of Programs on Database Views", Dipartimento di Elettronica, Politecnico di Milano (in

  19. Consistency Analysis of Network Traffic Repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for var

  20. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Pras, A.

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for vario

  1. Consistency and variability in functional localisers

    Science.gov (United States)

    Duncan, Keith J.; Pattamadilok, Chotiga; Knierim, Iris; Devlin, Joseph T.

    2009-01-01

    A critical assumption underlying the use of functional localiser scans is that the voxels identified as the functional region-of-interest (fROI) are essentially the same as those activated by the main experimental manipulation. Intra-subject variability in the location of the fROI violates this assumption, reducing the sensitivity of the analysis and biasing the results. Here we investigated consistency and variability in fROIs in a set of 45 volunteers. They performed two functional localiser scans to identify word- and object-sensitive regions of ventral and lateral occipito-temporal cortex, respectively. In the main analyses, fROIs were defined as the category-selective voxels in each region and consistency was measured as the spatial overlap between scans. Consistency was greatest when minimally selective thresholds were used to define “active” voxels (p < 0.05 uncorrected), revealing that approximately 65% of the voxels were commonly activated by both scans. In contrast, highly selective thresholds (p < 10⁻⁴ to 10⁻⁶) yielded the lowest consistency values with less than 25% overlap of the voxels active in both scans. In other words, intra-subject variability was surprisingly high, with between one third and three quarters of the voxels in a given fROI not corresponding to those activated in the main task. This level of variability stands in striking contrast to the consistency seen in retinotopically-defined areas and has important implications for designing robust but efficient functional localiser scans. PMID:19289173
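
    The overlap analysis described above can be sketched as follows. This editorial example uses Jaccard overlap on synthetic statistic maps; the study's exact overlap definition is only partly specified in the abstract, so the metric here is an assumption.

```python
import numpy as np

def overlap_consistency(map1, map2, threshold):
    """Fraction of voxels active in BOTH maps relative to those active in
    EITHER (Jaccard overlap), after thresholding two statistic maps."""
    a, b = map1 > threshold, map2 > threshold
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return float(np.logical_and(a, b).sum()) / union

# Two synthetic "scans": a shared activation pattern plus independent noise.
rng = np.random.default_rng(1)
base = rng.normal(size=(20, 20, 20))
scan1 = base + rng.normal(scale=0.5, size=base.shape)
scan2 = base + rng.normal(scale=0.5, size=base.shape)

# Compare overlap under a lenient and a strict activation threshold.
print(overlap_consistency(scan1, scan2, 1.0))
print(overlap_consistency(scan1, scan2, 3.0))
```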

  2. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  3. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
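
    The regression described above can be sketched with synthetic data. This is an editorial illustration, not the study's data set: variable names are hypothetical and the coefficients are constructed so the simulated data mirror the qualitative findings (consistency and motivation matter, raw effort does not).

```python
import numpy as np

rng = np.random.default_rng(7)
n = 212  # same number of students as the study; the data here are synthetic

total_minutes = rng.normal(600, 150, n)    # "effort": total minutes online
time_variation = rng.normal(60, 20, n)     # "consistency": lower = more consistent
gpa = rng.normal(3.0, 0.5, n)              # motivation proxy
# Grades depend on consistency and GPA but not on raw effort.
grade = 70 - 0.15 * time_variation + 8 * gpa + rng.normal(0, 3, n)

X = np.column_stack([np.ones(n), total_minutes, time_variation, gpa])
coef, *_ = np.linalg.lstsq(X, grade, rcond=None)
for name, c in zip(["intercept", "total_minutes", "time_variation", "gpa"], coef):
    print(f"{name:>14}: {c:+.4f}")
```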

  4. Consistence beats causality in recommender systems

    CERN Document Server

    Zhu, Xuzhen; Hu, Zheng; Zhang, Ping; Zhou, Tao

    2015-01-01

    The explosive growth of information challenges people's capability of finding items that fit their own interests. Recommender systems provide an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. Recommendation algorithms usually embody the causality from what has been collected to what should be recommended. In this article, we argue that in many cases a user's interests are stable, and thus the previous and future preferences are highly consistent. The temporal order of collections then does not necessarily imply a causality relationship. We further propose a consistence-based algorithm that outperforms the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.

  5. A supersymmetric consistent truncation for conifold solutions

    CERN Document Server

    Cassani, Davide

    2010-01-01

    We establish a supersymmetric consistent truncation of type IIB supergravity on the T^{1,1} coset space, based on extending the Papadopoulos-Tseytlin ansatz to the full set of SU(2)xSU(2) invariant Kaluza-Klein modes. The five-dimensional model is a gauged N=4 supergravity with three vector multiplets, which incorporates various conifold solutions and is suitable for the study of their dynamics. By analysing the scalar potential we find a family of new non-supersymmetric AdS_5 extrema interpolating between a solution obtained long ago by Romans and a solution employing an Einstein metric on T^{1,1} different from the standard one. Finally, we discuss some simple consistent subtruncations preserving N=2 supersymmetry. One of them is compatible with the inclusion of smeared D7-branes.

  6. Temporally consistent segmentation of point clouds

    Science.gov (United States)

    Owens, Jason L.; Osteen, Philip R.; Daniilidis, Kostas

    2014-06-01

    We consider the problem of generating temporally consistent point cloud segmentations from streaming RGB-D data, where every incoming frame extends existing labels to new points or contributes new labels while maintaining the labels for pre-existing segments. Our approach generates an over-segmentation based on voxel cloud connectivity, where a modified k-means algorithm selects supervoxel seeds and associates similar neighboring voxels to form segments. Given the data stream from a potentially mobile sensor, we solve for the camera transformation between consecutive frames using a joint optimization over point correspondences and image appearance. The aligned point cloud may then be integrated into a consistent model coordinate frame. Previously labeled points are used to mask incoming points from the new frame, while new and previous boundary points extend the existing segmentation. We evaluate the algorithm on newly-generated RGB-D datasets.

  7. Foundations of consistent couple stress theory

    CERN Document Server

    Hadjesfandiari, Ali R

    2015-01-01

    In this paper, we examine the recently developed skew-symmetric couple stress theory and demonstrate its inner consistency, natural simplicity and fundamental connection to classical mechanics. This hopefully will help the scientific community to overcome any ambiguity and skepticism about this theory, especially the validity of the skew-symmetric character of the couple-stress tensor. We demonstrate that in a consistent continuum mechanics, the response of infinitesimal elements of matter at each point decomposes naturally into a rigid body portion, plus the relative translation and rotation of these elements at adjacent points of the continuum. This relative translation and rotation captures the deformation in terms of stretches and curvatures, respectively. As a result, the continuous displacement field and its corresponding rotation field are the primary variables, which remarkably is in complete alignment with rigid body mechanics, thus providing a unifying basis. For further clarification, we also exami...

  8. Consistent Linearized Gravity in Brane Backgrounds

    CERN Document Server

    Aref'eva, I Ya; Mück, W; Viswanathan, K S; Volovich, I V

    2000-01-01

    A globally consistent treatment of linearized gravity in the Randall-Sundrum background with matter on the brane is formulated. Using a novel gauge, in which the transverse components of the metric are non-vanishing, the brane is kept straight. We analyze the gauge symmetries and identify the physical degrees of freedom of gravity. Our results underline the necessity for non-gravitational confinement of matter to the brane.

  9. Self-consistent model of fermions

    CERN Document Server

    Yershov, V N

    2002-01-01

    We discuss a composite model of fermions based on three-flavoured preons. We show that the opposite character of the Coulomb and strong interactions between these preons lead to formation of complex structures reproducing three generations of quarks and leptons with all their quantum numbers and masses. The model is self-consistent (it doesn't use input parameters). Nevertheless, the masses of the generated structures match the experimental values.

  10. Consistent formulation of the spacelike axial gauge

    Energy Technology Data Exchange (ETDEWEB)

    Burnel, A.; Van der Rest-Jaspers, M.

    1983-12-15

    The usual formulation of the spacelike axial gauge is afflicted with the difficulty that the metric is indefinite while no ghost is involved. We solve this difficulty by introducing a ghost whose elimination is such that the metric becomes positive for physical states. The technique consists in replacing the gauge condition n·A = 0 by the weaker one ∂₀(n·A) ≈ 0.

  11. Security Policy: Consistency, Adjustments and Restraining Factors

    Institute of Scientific and Technical Information of China (English)

    Yang; Jiemian

    2004-01-01

    In the 2004 U.S. presidential election, despite well-divided domestic opinions and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. It is obvious that, based on the author's analysis, security agenda such as counter-terrorism and Iraqi issue has contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will basically be consistent.……

  12. Self-consistent structure of metallic hydrogen

    Science.gov (United States)

    Straus, D. M.; Ashcroft, N. W.

    1977-01-01

    A calculation is presented of the total energy of metallic hydrogen for a family of face-centered tetragonal lattices carried out within the self-consistent phonon approximation. The energy of proton motion is large and proper inclusion of proton dynamics alters the structural dependence of the total energy, causing isotropic lattices to become favored. For the dynamic lattice the structural dependence of terms of third and higher order in the electron-proton interaction is greatly reduced from static lattice equivalents.

  13. Radiometric consistency assessment of hyperspectral infrared sounders

    OpenAIRE

    Wang, L.; Han, Y.; Jin, X.; Chen, Y.; Tremblay, D. A.

    2015-01-01

    The radiometric and spectral consistency among the Atmospheric Infrared Sounder (AIRS), the Infrared Atmospheric Sounding Interferometer (IASI), and the Cross-track Infrared Sounder (CrIS) is fundamental for the creation of long-term infrared (IR) hyperspectral radiance benchmark datasets for both inter-calibration and climate-related studies. In this study, the CrIS radiance measurements on Suomi National Polar-orbiting Partnership (SNPP) satellite are directly com...

  14. The internal consistency of perfect competition

    OpenAIRE

    Jakob Kapeller; Stephan Pühringer

    2010-01-01

    This article surveys some arguments brought forward in defense of the theory of perfect competition. While some critics propose that the theory of perfect competition, and thus also the theory of the firm, are logically flawed, (mainstream) economists defend their most popular textbook model by a series of apparently different arguments. Here it is examined whether these arguments are comparable, consistent and convincing from the point of view of philosophy of science.

  15. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  16. Dynamic consistency for Stochastic Optimal Control problems

    CERN Document Server

    Carpentier, Pierre; Cohen, Guy; De Lara, Michel; Girardeau, Pierre

    2010-01-01

    For a sequence of dynamic optimization problems, we aim at discussing a notion of consistency over time. This notion can be informally introduced as follows. At the very first time step $t_0$, the decision maker formulates an optimization problem that yields optimal decision rules for all the forthcoming time steps $t_0, t_1, ..., T$; at the next time step $t_1$, he is able to formulate a new optimization problem starting at time $t_1$ that yields a new sequence of optimal decision rules. This process can be continued until final time $T$ is reached. A family of optimization problems formulated in this way is said to be time consistent if the optimal strategies obtained when solving the original problem remain optimal for all subsequent problems. The notion of time consistency, well-known in the field of Economics, has been recently introduced in the context of risk measures, notably by Artzner et al. (2007) and studied in the Stochastic Programming framework by Shapiro (2009) and for Markov Decision Processes...
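    The time-consistency notion sketched above can be illustrated with a toy finite-horizon dynamic program: solving by backward induction from the horizon and then re-solving from a later stage yields the same tail policy. All states, actions, dynamics and rewards below are invented purely for illustration.

```python
# Time consistency in a deterministic finite-horizon dynamic program:
# the optimal policy computed from t0 = 0 agrees, on stages t >= 1,
# with the policy obtained by re-solving the problem from t1 = 1.
T = 3
states = [0, 1]
actions = [0, 1]

def reward(t, s, a):
    return (s + 1) * (a + 1) + t     # arbitrary illustrative reward

def step(s, a):
    return (s + a) % 2               # arbitrary deterministic dynamics

def solve(t0):
    """Backward induction from horizon T down to t0; returns {(t, s): action}."""
    value = {s: 0.0 for s in states}
    policy = {}
    for t in reversed(range(t0, T)):
        new_value = {}
        for s in states:
            best_a, best_v = max(
                ((a, reward(t, s, a) + value[step(s, a)]) for a in actions),
                key=lambda av: av[1],
            )
            policy[(t, s)] = best_a
            new_value[s] = best_v
        value = new_value
    return policy

full = solve(0)   # policy for stages 0, 1, 2
tail = solve(1)   # policy for stages 1, 2 only
print(all(full[k] == tail[k] for k in tail))  # -> True
```

    With an expected-value objective and additive rewards, the Bellman recursion guarantees this agreement; the abstract's point is that it can fail for other optimality principles, e.g. certain risk measures.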

  17. CMB lens sample covariance and consistency relations

    Science.gov (United States)

    Motloch, Pavel; Hu, Wayne; Benoit-Lévy, Aurélien

    2017-02-01

    Gravitational lensing information from the two and higher point statistics of the cosmic microwave background (CMB) temperature and polarization fields are intrinsically correlated because they are lensed by the same realization of structure between last scattering and observation. Using an analytic model for lens sample covariance, we show that there is one mode, separately measurable in the lensed CMB power spectra and lensing reconstruction, that carries most of this correlation. Once these measurements become lens sample variance dominated, this mode should provide a useful consistency check between the observables that is largely free of sampling and cosmological parameter errors. Violations of consistency could indicate systematic errors in the data and lens reconstruction or new physics at last scattering, any of which could bias cosmological inferences and delensing for gravitational waves. A second mode provides a weaker consistency check for a spatially flat universe. Our analysis isolates the additional information supplied by lensing in a model-independent manner but is also useful for understanding and forecasting CMB cosmological parameter errors in the extended Λ cold dark matter parameter space of dark energy, curvature, and massive neutrinos. We introduce and test a simple but accurate forecasting technique for this purpose that neither double counts lensing information nor neglects lensing in the observables.

  18. Evaluation of the computerized procedures Manual II (COPMA II)

    Energy Technology Data Exchange (ETDEWEB)

    Converse, S.A. [North Carolina State Univ., Raleigh, NC (United States)

    1995-11-01

    The purpose of this study was to evaluate the effects of a computerized procedure system, the Computerized Procedure Manual II (COPMA-II), on the performance and mental workload of licensed reactor operators. To evaluate COPMA-II, eight teams of two operators were trained to operate a scaled pressurized water reactor facility (SPWRF) with traditional paper procedures and with COPMA-II. Following training, each team operated the SPWRF under normal operating conditions with both paper procedures and COPMA-II. The teams then performed one of two accident scenarios with paper procedures, but performed the remaining accident scenario with COPMA-II. Performance measures and subjective estimates of mental workload were recorded for each performance trial. The most important finding of the study was that the operators committed only half as many errors during the accident scenarios with COPMA-II as they committed with paper procedures. However, time to initiate a procedure was fastest for paper procedures for accident scenario trials. For performance under normal operating conditions, there was no difference in time to initiate or to complete a procedure, or in the number of errors committed with paper procedures and with COPMA-II. There were no consistent differences in the mental workload ratings operators recorded for trials with paper procedures and COPMA-II.

  19. TBscore II

    DEFF Research Database (Denmark)

    Rudolf, Frauke; Lemvik, Grethe; Abate, Ebba;

    2013-01-01

    Abstract Background: The TBscore, based on simple signs and symptoms, was introduced to predict unsuccessful outcome in tuberculosis patients on treatment. A recent inter-observer variation study showed profound variation in some variables. Further, some variables depend on a physician assessing...... them, making the score less applicable. The aim of the present study was to simplify the TBscore. Methods: Inter-observer variation assessment and exploratory factor analysis were combined to develop a simplified score, the TBscore II. To validate TBscore II we assessed the association between start...

  20. Consistent constraints on the Standard Model Effective Field Theory

    CERN Document Server

    Berthier, Laure

    2015-01-01

    We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEPI and LEP II, as well as low energy precision data. We fit one hundred observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut off scale is assumed to be large, $\\Lambda \\gtrsim \\, 3 \\, {\\rm TeV}$. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an $\\rm S,T$ analysis is modified by the theory errors we include as an illustrative example.

  1. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
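    The diversity-threshold step of the approach above can be sketched in a few lines: compute Shannon's diversity index per grid cell and flag cells above the mean threshold as hotspots. The species counts below are invented for illustration; the actual analysis uses eight years of trawl data and a spatial frequency distribution method.

```python
import math

def shannon_diversity(counts):
    """Shannon's index H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical species counts for three grid cells (illustrative only).
cells = {
    "A": [30, 30, 30, 10],   # fairly even community -> higher H'
    "B": [95, 3, 1, 1],      # dominated by one species -> lower H'
    "C": [25, 25, 25, 25],   # perfectly even -> maximum H' = ln(4)
}

diversity = {cell: shannon_diversity(c) for cell, c in cells.items()}

# Designate hotspots as cells whose diversity exceeds the mean threshold,
# a one-year simplification of the paper's mean-threshold approach.
threshold = sum(diversity.values()) / len(diversity)
hotspots = [cell for cell, h in diversity.items() if h > threshold]
print(hotspots)  # -> ['A', 'C']
```

    Repeating such a designation per year and counting how often each cell qualifies is the essence of the temporal-consistency test the study performs.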

  2. Consistency Relations for the Conformal Mechanism

    CERN Document Server

    Creminelli, Paolo; Khoury, Justin; Simonović, Marko

    2012-01-01

    We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB mu-distortion.

  3. Consistency relations for the conformal mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34151, Trieste (Italy); Joyce, Austin; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: joyceau@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu, E-mail: marko.simonovic@sissa.it [SISSA, via Bonomea 265, 34136, Trieste (Italy)

    2013-04-01

    We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB μ-distortion.

  4. Improving analytical tomographic reconstructions through consistency conditions

    CERN Document Server

    Arcadu, Filippo; Stampanoni, Marco; Marone, Federica

    2016-01-01

    This work introduces and characterizes a fast parameterless filter based on the Helgason-Ludwig consistency conditions, used to improve the accuracy of analytical reconstructions of tomographic undersampled datasets. The filter, acting in the Radon domain, extrapolates intermediate projections between those existing. The resulting sinogram, doubled in views, is then reconstructed by a standard analytical method. Experiments with simulated data prove that the peak-signal-to-noise ratio of the results computed by filtered backprojection is improved up to 5-6 dB, if the filter is used prior to reconstruction.
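    The view-doubling step can be pictured with a deliberately naive stand-in: linear interpolation of intermediate projections between adjacent views. The actual filter extrapolates the intermediate projections from the Helgason-Ludwig consistency conditions rather than by averaging; the function and the tiny sinogram below are purely illustrative.

```python
import numpy as np

def double_views(sinogram):
    """Interleave averaged projections: (n_views, n_det) -> (2*n_views - 1, n_det).

    Naive linear interpolation between adjacent views; a stand-in for the
    consistency-condition-based extrapolation described in the abstract.
    """
    n_views, n_det = sinogram.shape
    out = np.empty((2 * n_views - 1, n_det))
    out[0::2] = sinogram                              # keep measured views
    out[1::2] = 0.5 * (sinogram[:-1] + sinogram[1:])  # insert interpolated views
    return out

sino = np.array([[1.0, 2.0],
                 [3.0, 4.0],
                 [5.0, 6.0]])
print(double_views(sino).shape)  # -> (5, 2)
```

    The doubled sinogram would then be passed to a standard analytical method such as filtered backprojection.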

  5. Consistency of non-minimal renormalisation schemes

    CERN Document Server

    Jack, I

    2016-01-01

    Non-minimal renormalisation schemes such as the momentum subtraction scheme (MOM) have frequently been used for physical computations. The consistency of such a scheme relies on the existence of a coupling redefinition linking it to MSbar. We discuss the implementation of this procedure in detail for a general theory and show how to construct the relevant redefinition up to three-loop order, for the case of a general theory of fermions and scalars in four dimensions and a general scalar theory in six dimensions.

  6. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  7. Consistent Predictions of Future Forest Mortality

    Science.gov (United States)

    McDowell, N. G.

    2014-12-01

    We examined empirical and model-based estimates of current and future forest mortality of conifers in the northern hemisphere. Consistent water potential thresholds were found that resulted in mortality of our case study species, pinon pine and one-seed juniper. Extending these results with IPCC climate scenarios suggests that most existing trees in this region (SW USA) will be dead by 2050. Further, independent estimates of future mortality for the entire coniferous biome suggest widespread mortality by 2100. The validity, assumptions, and implications of these results are discussed.

  8. Surface consistent finite frequency phase corrections

    Science.gov (United States)

    Kimman, W. P.

    2016-07-01

    Static time-delay corrections are frequency independent and ignore velocity variations away from the assumed vertical ray path through the subsurface. There is therefore a clear potential for improvement if the finite frequency nature of wave propagation can be properly accounted for. Such a method is presented here based on the Born approximation, the assumption of surface consistency and the misfit of instantaneous phase. The concept of instantaneous phase lends itself very well to sweep-like signals, hence these are the focus of this study. Analytical sensitivity kernels are derived that accurately predict frequency-dependent phase shifts due to P-wave anomalies in the near surface. They are quick to compute and robust near the source and receivers. An additional correction is presented that re-introduces the nonlinear relation between model perturbation and phase delay, which becomes relevant for stronger velocity anomalies. The phase shift as a function of frequency is a slowly varying signal; its computation therefore does not require fine sampling, even for broad-band sweeps. The kernels reveal interesting features of the sensitivity of seismic arrivals to the near surface: small anomalies can have a relatively large impact resulting from the medium field term that is dominant near the source and receivers. Furthermore, even simple velocity anomalies can produce a distinct frequency-dependent phase behaviour. Unlike statics, the predicted phase corrections are smooth in space. Verification with spectral element simulations shows an excellent match for the predicted phase shifts over the entire seismic frequency band. Applying the phase shift to the reference sweep corrects for wavelet distortion, making the technique akin to surface consistent deconvolution, even though no division in the spectral domain is involved. As long as multiple scattering is mild, surface consistent finite frequency phase corrections outperform traditional statics for moderately large
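    The instantaneous phase at the heart of this method can be sketched with a generic analytic-signal computation on a synthetic linear sweep. This illustrates only the concept, not the authors' sensitivity-kernel code; the sampling rate and sweep parameters are invented.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: zero negative frequencies, double positives."""
    n = len(x)
    spectrum = np.fft.fft(x)
    weights = np.zeros(n)
    weights[0] = 1.0
    weights[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        weights[n // 2] = 1.0
    return np.fft.ifft(spectrum * weights)

fs = 1000.0                        # sampling rate [Hz] (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s record
f0, f1 = 10.0, 80.0               # linear sweep 10 -> 80 Hz (illustrative)
k = (f1 - f0) / t[-1]             # sweep rate [Hz/s]
sweep = np.sin(2 * np.pi * (f0 * t + 0.5 * k * t**2))

phase = np.unwrap(np.angle(analytic_signal(sweep)))   # instantaneous phase [rad]
inst_freq = np.diff(phase) / (2 * np.pi) * fs         # instantaneous frequency [Hz]
print(inst_freq[len(inst_freq) // 2])  # mid-sweep frequency, close to 45 Hz
```

    A frequency-dependent phase correction would appear as a slowly varying difference between two such phase curves (reference versus perturbed sweep), which is why coarse sampling in frequency suffices.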

  9. Are there consistent models giving observable NSI ?

    CERN Document Server

    Martinez, Enrique Fernandez

    2013-01-01

    While the existing direct bounds on neutrino NSI are rather weak, order 10^-1 for propagation and 10^-2 for production and detection, the close connection between these interactions and new NSI affecting the better-constrained charged lepton sector through gauge invariance makes these bounds hard to saturate in realistic models. Indeed, Standard Model extensions leading to neutrino NSI typically imply constraints at the 10^-3 level. The question of whether consistent models leading to observable neutrino NSI exist naturally arises and was discussed in a dedicated session at NUFACT 11. Here we summarize that discussion.

  10. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works[1-3] have indicated a lack of experimental data for pure components and also for their mixtures...... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging experimental databank of lipids systems data in order to improve...

  11. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts...... are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Work Bench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault...

  12. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated for fulfilling regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or solid. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)

  13. Probability-consistent spectrum and code spectrum

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by Code for Seismic Design of Buildings (GB50011-2001). Sometimes, there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for the key projects would be lower than that for the general industry and civil buildings. In the paper, the relation between PCS and CDS is discussed by using the ideal simple potential seismic source. The results show that in the most areas influenced mainly by the potential sources of the epicentral earthquakes and the regional earthquakes, PCS is generally lower than CDS in the long periods. We point out that the long-period response spectra of the code should be further studied and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than CDS.

  14. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
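    The lysine-to-arginine pattern can be pictured with a toy alignment comparison. The sequences below are invented for illustration; the actual analysis operates on phylogenetically guided whole-proteome alignments across many species.

```python
from collections import Counter

# Toy aligned sequences (invented): count per-site substitutions between a
# hypothetical mesophile and thermophile ortholog, looking for K -> R.
mesophile   = "MKTAYKLLGK"
thermophile = "MRTAYRLLGR"

subs = Counter(
    (a, b)
    for a, b in zip(mesophile, thermophile)
    if a != b
)
print(subs[("K", "R")])  # -> 3 lysine-to-arginine substitutions in this toy pair
```

    Aggregating such counts over many ortholog pairs, and checking that the same direction of substitution recurs across independent thermophilic lineages, is the kind of consistency signal the study exploits.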

  15. Viewpoint Consistency: An Eye Movement Study

    Directory of Open Access Journals (Sweden)

    Filipe Cristino

    2012-05-01

    Full Text Available Eye movements have been widely studied, using images and videos in laboratories or portable eye trackers in the real world. Although a good understanding of the saccadic system and extensive models of gaze have been developed over the years, only a few studies have focused on the consistency of eye movements across viewpoints. We have developed a new technique to compute and map the depth of collected eye movements on stimuli rendered from 3D mesh objects using a traditional corneal reflection eye tracker (SR Eyelink 1000). Having eye movements mapped into 3D space (and not on an image space) allowed us to compare fixations across viewpoints. Fixation sequences (scanpaths) were also studied across viewpoints using the ScanMatch method (Cristino et al 2010, Behavioural and Research Methods 42, 692–700), extended to work with 3D eye movements. In a set of experiments where participants were asked to perform a recognition task on either a set of objects or faces, we recorded their gaze while performing the task. Participants either viewed the stimuli in 2D or using anaglyph glasses. The stimuli were shown from different viewpoints during the learning and testing phases. A high degree of gaze consistency was found across the different viewpoints, particularly between learning and testing phases. Scanpaths were also similar across viewpoints, suggesting not only that the gazed spatial locations are alike, but also their temporal order.

  16. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the cooperation duration. It is due to the lack of such guarantees that cooperative schemes fail to last until the end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering up-to-date state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...

  17. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. A large number of numerical simulations show that the ratios of the corresponding evacuee clusters evolve to consistent states from 11 markedly different initial conditions, which may be largely due to a self-organization effect. Moreover, an appropriate proportion of initial defectors exhibiting herding behavior, coupled with an appropriate proportion of initial defectors exhibiting rationally independent thinking, are two necessary factors for a short evacuation time.
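    For intuition about why the cooperator/defector ratios can become independent of the initial mix, consider the snowdrift game underlying the model: in the well-mixed replicator limit the cooperator fraction converges to an interior equilibrium whatever the starting proportions. The payoff parameterization below is a standard textbook one, not taken from the paper's CA model.

```python
# Snowdrift game payoffs with benefit b and cost c (0 < c < b).
# Illustrative textbook values, not the paper's parameters.
b, c = 1.0, 0.4

payoff = {
    ("C", "C"): b - c / 2,  # cooperators share the cost of clearing the drift
    ("C", "D"): b - c,      # a lone cooperator bears the full cost
    ("D", "C"): b,          # the defector free-rides
    ("D", "D"): 0.0,        # nobody gets through
}

# In the well-mixed replicator limit the cooperator fraction converges to
# the interior equilibrium x* = 1 - c / (2b - c), regardless of the initial
# mix: a mean-field analogue of the initial-condition-independent states
# the CA simulations report.
x_star = 1 - c / (2 * b - c)
print(round(x_star, 2))  # -> 0.75
```

    The spatial CA with herding adds local structure that the replicator limit ignores, which is why the simulations, not this formula, determine the actual consistent states.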

  18. Consistency of warm k-inflation

    CERN Document Server

    Peng, Zhi-Peng; Zhang, Xiao-Min; Zhu, Jian-Yang

    2016-01-01

    We extend the k-inflation which is a type of kinetically driven inflationary model under the standard inflationary scenario to a possible warm inflationary scenario. The dynamical equations of this warm k-inflation model are obtained. We rewrite the slow-roll parameters which are different from the usual potential driven inflationary models and perform a linear stability analysis to give the proper slow-roll conditions in the warm k-inflation. Two cases, a power-law kinetic function and an exponential kinetic function, are studied, when the dissipative coefficient $\\Gamma=\\Gamma_0$ and $\\Gamma=\\Gamma(\\phi)$, respectively. A proper number of e-folds is obtained in both concrete cases of warm k-inflation. We find a constant dissipative coefficient ($\\Gamma=\\Gamma_0$) is not a workable choice for these two cases while the two cases with $\\Gamma=\\Gamma(\\phi)$ are self-consistent warm inflationary models.

  19. Compact difference approximation with consistent boundary condition

    Institute of Scientific and Technical Information of China (English)

    FU Dexun; MA Yanwen; LI Xinliang; LIU Mingyu

    2003-01-01

    For simulating multi-scale complex flow fields it should be noted that all the physical quantities we are interested in must be simulated well. Given limited computer resources, high-order accurate difference schemes are preferred. Because of their high accuracy and small grid-point stencils, computational fluid dynamics (CFD) researchers have recently paid more attention to compact schemes. For simulating complex flow fields, the treatment of boundary conditions at the far-field boundary points and near the far-field boundary points is very important. Based on the authors' experience and published results, some aspects of boundary condition treatment for the far-field boundary are presented, with emphasis on the treatment of boundary conditions for upwind compact schemes. The consistent treatment of boundary conditions at the near-boundary points is also discussed. Some numerical examples are given at the end of the paper. The computed results with the presented method are satisfactory.

  20. Reliability and Consistency of Surface Contamination Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Rouppert, F.; Rivoallan, A.; Largeron, C.

    2002-02-26

    Surface contamination evaluation is a tough problem since it is difficult to isolate the radiation emitted by the surface, especially in a highly irradiating atmosphere. In that case the only possibility is to evaluate smearable (removable) contamination, since ex-situ counting is possible. Unfortunately, according to our experience at CEA, these values are not consistent and thus not relevant. In this study, we show, using in-situ Fourier Transform Infra-Red spectrometry on contaminated metal samples, that fixed contamination seems to be chemisorbed and removable contamination seems to be physisorbed. The distribution between fixed and removable contamination appears to be variable. Chemical equilibria and reversible ion exchange mechanisms are involved and are closely linked to environmental conditions such as humidity and temperature. Measurements of smearable contamination only give an indication of the state of these equilibria between fixed and removable contamination at the time and in the environmental conditions the measurements were made.

  1. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    López, Oliver

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
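
    The degree-correlation diagnostic described in this abstract can be sketched numerically. The function below is a minimal illustration under the assumption that both fields have already been expanded in spherical harmonics; the coefficient layout and field names are hypothetical, not taken from the paper.

```python
import numpy as np

def degree_correlation(A, B):
    """Per-degree correlation of spherical-harmonic coefficients.

    A, B map degree l -> 1-D array of the 2l+1 coefficients over order m.
    """
    r = {}
    for l in A:
        a, b = np.asarray(A[l], float), np.asarray(B[l], float)
        r[l] = float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))
    return r

# Synthetic "evaporation" and "precipitation" coefficient sets (hypothetical).
rng = np.random.default_rng(0)
evap = {l: rng.standard_normal(2 * l + 1) for l in range(1, 6)}
precip = {l: evap[l] + 0.5 * rng.standard_normal(2 * l + 1) for l in range(1, 6)}

r = degree_correlation(evap, precip)   # one correlation per spherical degree
```

    A field compared with itself gives a correlation of 1 at every degree; hydrological consistency would show up as persistently high per-degree correlation between evaporation and precipitation anomalies.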

  2. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  3. Trisomy 21 consistently activates the interferon response.

    Science.gov (United States)

    Sullivan, Kelly D; Lewis, Hannah C; Hill, Amanda A; Pandey, Ahwan; Jackson, Leisa P; Cabral, Joseph M; Smith, Keith P; Liggett, L Alexander; Gomez, Eliana B; Galbraith, Matthew D; DeGregori, James; Espinosa, Joaquín M

    2016-07-29

    Although it is clear that trisomy 21 causes Down syndrome, the molecular events acting downstream of the trisomy remain ill defined. Using complementary genomics analyses, we identified the interferon pathway as the major signaling cascade consistently activated by trisomy 21 in human cells. Transcriptome analysis revealed that trisomy 21 activates the interferon transcriptional response in fibroblast and lymphoblastoid cell lines, as well as circulating monocytes and T cells. Trisomy 21 cells show increased induction of interferon-stimulated genes and decreased expression of ribosomal proteins and translation factors. An shRNA screen determined that the interferon-activated kinases JAK1 and TYK2 suppress proliferation of trisomy 21 fibroblasts, and this defect is rescued by pharmacological JAK inhibition. Therefore, we propose that interferon activation, likely via increased gene dosage of the four interferon receptors encoded on chromosome 21, contributes to many of the clinical impacts of trisomy 21, and that interferon antagonists could have therapeutic benefits.

  4. On the consistent use of Constructed Observables

    CERN Document Server

    Trott, Michael

    2015-01-01

    We define "constructed observables" as relating experimental measurements to terms in a Lagrangian while simultaneously making assumptions about possible deviations from the Standard Model (SM) in other Lagrangian terms. Ensuring that the SM effective field theory (EFT) is constrained correctly when using constructed observables requires that their defining conditions are imposed on the EFT in a manner that is consistent with the equations of motion. Failing to do so can result in a "functionally redundant" operator basis and the wrong expectation as to how experimental quantities are related in the EFT. We illustrate the issues involved considering the $\rm S$ parameter and the off-shell triple gauge coupling (TGC) vertices. We show that the relationships between $h \rightarrow V \bar{f} \, f$ decay and the off-shell TGC vertices are subject to these subtleties, and how the connections between these observables vanish in the limit of strong bounds due to LEP. The challenge of using constructed observables...

  5. Consistently weighted measures for complex network topologies

    CERN Document Server

    Heitzig, Jobst; Zou, Yong; Marwan, Norbert; Kurths, Jürgen

    2011-01-01

    When network and graph theory are used in the study of complex systems, a typically finite set of nodes of the network under consideration is frequently either explicitly or implicitly considered representative of a much larger finite or infinite set of objects of interest. The selection procedure, e.g., formation of a subset or some kind of discretization or aggregation, typically results in individual nodes of the studied network representing quite differently sized parts of the domain of interest. This heterogeneity may induce substantial bias and artifacts in derived network statistics. To avoid this bias, we propose an axiomatic scheme based on the idea of {\\em node splitting invariance} to derive consistently weighted variants of various commonly used statistical network measures. The practical relevance and applicability of our approach is demonstrated for a number of example networks from different fields of research, and is shown to be of fundamental importance in particular in the study of climate n...

  6. Consistent 4-form fluxes for maximal supergravity

    CERN Document Server

    Godazgar, Hadi; Krueger, Olaf; Nicolai, Hermann

    2015-01-01

    We derive new ansaetze for the 4-form field strength of D=11 supergravity corresponding to uplifts of four-dimensional maximal gauged supergravity. In particular, the ansaetze directly yield the components of the 4-form field strength in terms of the scalars and vectors of the four-dimensional maximal gauged supergravity---in this way they provide an explicit uplift of all four-dimensional consistent truncations of D=11 supergravity. The new ansaetze provide a substantially simpler method for uplifting d=4 flows compared to the previously available method using the 3-form and 6-form potential ansaetze. The ansatz for the Freund-Rubin term allows us to conjecture a `master formula' for the latter in terms of the scalar potential of d=4 gauged supergravity and its first derivative. We also resolve a long-standing puzzle concerning the antisymmetry of the flux obtained from uplift ansaetze.

  7. Quantum cosmological consistency condition for inflation

    Energy Technology Data Exchange (ETDEWEB)

    Calcagni, Gianluca [Instituto de Estructura de la Materia, CSIC, calle Serrano 121, 28006 Madrid (Spain); Kiefer, Claus [Institut für Theoretische Physik, Universität zu Köln, Zülpicher Strasse 77, 50937 Köln (Germany); Steinwachs, Christian F., E-mail: calcagni@iem.cfmac.csic.es, E-mail: kiefer@thp.uni-koeln.de, E-mail: christian.steinwachs@physik.uni-freiburg.de [Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, Hermann-Herder-Str. 3, 79104 Freiburg (Germany)

    2014-10-01

    We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.

  8. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...... velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave...... height from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional...
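
    The Maximum Likelihood calibration step mentioned above can be illustrated for the simplest case. The sketch below fits a univariate Gumbel distribution, a common model for annual maxima, to synthetic significant-wave-height data via the standard fixed-point form of the MLE equations; the paper's actual model (directional, multivariate, with statistical uncertainty) is considerably richer.

```python
import numpy as np

def gumbel_mle(x, tol=1e-10, max_iter=500):
    """Fit Gumbel(mu, beta) by maximum likelihood (fixed-point iteration)."""
    x = np.asarray(x, float)
    beta = x.std() * np.sqrt(6.0) / np.pi          # method-of-moments start
    for _ in range(max_iter):
        w = np.exp(-x / beta)
        new_beta = x.mean() - np.sum(x * w) / np.sum(w)
        if abs(new_beta - beta) < tol:
            beta = new_beta
            break
        beta = new_beta
    mu = -beta * np.log(np.mean(np.exp(-x / beta)))
    return mu, beta

# Synthetic annual-maximum significant wave heights (m); parameters assumed.
rng = np.random.default_rng(1)
sample = rng.gumbel(loc=6.0, scale=0.8, size=5000)
mu_hat, beta_hat = gumbel_mle(sample)
```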

  9. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non......-normative and constitutive approach to internal branding by proposing an enablement-oriented communication approach. The conceptual background presents a holistic model of the inside-out process of brand building. This model adopts a theoretical approach to internal branding as a nonnormative practice that facilitates...... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers...

  10. Quantum cosmological consistency condition for inflation

    CERN Document Server

    Calcagni, Gianluca; Steinwachs, Christian F

    2014-01-01

    We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.

  11. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Background: The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results: We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions: TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
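
    The kind of thermodynamic constraint the abstract refers to can be illustrated with the Wegscheider cycle condition (our choice of example, not necessarily the paper's exact formulation): around any closed reaction cycle, the product of forward rate constants must equal the product of reverse rate constants.

```python
import numpy as np

def cycle_consistent(kf, kr, tol=1e-9):
    """Wegscheider condition: log-products of forward/reverse rates match."""
    gap = np.sum(np.log(kf)) - np.sum(np.log(kr))
    return abs(gap) < tol

# A three-reaction cycle A <-> B <-> C <-> A (rate constants hypothetical).
kf = np.array([2.0, 5.0, 0.3])
kr = np.array([1.0, 3.0, 1.0])
kr[2] = np.prod(kf) / (kr[0] * kr[1])   # repair kr[2] to enforce the condition
```

    A calibration routine in the spirit of TCMC would treat this equality as a constraint during the fit rather than repairing one rate constant after the fact.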

  12. Quininium tetrachloridozinc(II).

    Science.gov (United States)

    Chen, Li-Zhuang

    2009-09-05

    The asymmetric unit of the title compound {systematic name: 2-[hydroxy(6-methoxyquinolin-1-ium-4-yl)methyl]-8-vinylquinuclidin-1-ium tetrachloridozinc(II)}, (C20H26N2O2)[ZnCl4], consists of a doubly protonated quininium cation and a tetrachloridozinc(II) anion. The Zn(II) ion is in a slightly distorted tetrahedral coordination environment. The crystal structure is stabilized by intermolecular N-H⋯Cl and O-H⋯Cl hydrogen bonds.

  13. Vection Induced by Consistent and Inconsistent Multisensory Stimulation

    Directory of Open Access Journals (Sweden)

    April E. Ash

    2011-05-01

    This study examined physical and simulated self-motion along the horizontal and depth axes. Subjects viewed optic flow consisting of: (i) a component that simulated constant-velocity self-motion; and (ii) a component that simulated oscillation of their viewpoint. In active self-motion conditions, the latter flow component was either in- or out-of-phase with the observer's own (tracked) oscillatory head movements. In passive self-motion conditions, stationary subjects simply viewed playbacks of the oscillating displays generated in previous physical self-motion conditions. We found that adding active in-phase horizontal oscillation to radial flow resulted in a modest vection advantage compared to active out-of-phase horizontal oscillation and passive horizontal display oscillation conditions. By contrast, when actively generated fore-aft display oscillation was added to either radial or lamellar flow, we found similar vection strength ratings for both in-phase and out-of-phase depth oscillation. We conclude that multisensory input can enhance the visual perception of self-motion in some situations, but need not be consistent (i.e., ecological) to generate compelling vection in depth.

  14. Can Co(II) or Cd(II) substitute for Zn(II) in zinc fingers?

    Indian Academy of Sciences (India)

    P Rabindra Reddy; M Radhika

    2001-02-01

    Zinc finger domains consist of sequences of amino acids containing cysteine and histidine residues tetrahedrally coordinated to a zinc ion. The role of zinc in a DNA-binding finger was considered purely structural, owing to the absence of redox chemistry in zinc. However, whether other metals, e.g. Co(II) or Cd(II), can substitute for Zn(II) is not settled. To answer this, the detailed interactions of Co(II) and Cd(II) with cysteine methyl ester and histidine methyl ester have been investigated as a model for the zinc core in zinc fingers. The study was extended to different temperatures to evaluate the thermodynamic parameters associated with these interactions. The results suggest that zinc has a unique role.

  15. Consistent lattice Boltzmann equations for phase transitions.

    Science.gov (United States)

    Siebert, D N; Philippi, P C; Mattila, K K

    2014-11-01

    Unlike conventional computational fluid dynamics methods, the lattice Boltzmann method (LBM) describes the dynamic behavior of fluids in a mesoscopic scale based on discrete forms of kinetic equations. In this scale, complex macroscopic phenomena like the formation and collapse of interfaces can be naturally described as related to source terms incorporated into the kinetic equations. In this context, a novel athermal lattice Boltzmann scheme for the simulation of phase transition is proposed. The continuous kinetic model obtained from the Liouville equation using the mean-field interaction force approach is shown to be consistent with diffuse interface model using the Helmholtz free energy. Density profiles, interface thickness, and surface tension are analytically derived for a plane liquid-vapor interface. A discrete form of the kinetic equation is then obtained by applying the quadrature method based on prescribed abscissas together with a third-order scheme for the discretization of the streaming or advection term in the Boltzmann equation. Spatial derivatives in the source terms are approximated with high-order schemes. The numerical validation of the method is performed by measuring the speed of sound as well as by retrieving the coexistence curve and the interface density profiles. The appearance of spurious currents near the interface is investigated. The simulations are performed with the equations of state of Van der Waals, Redlich-Kwong, Redlich-Kwong-Soave, Peng-Robinson, and Carnahan-Starling.
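
    The coexistence curve that such simulations retrieve can be computed independently from the equation of state. The sketch below performs the Maxwell equal-area construction for the reduced van der Waals EOS, p(v, T) = 8T/(3v - 1) - 3/v^2; the bracketing pressures are chosen by hand for T = 0.9 and are an assumption of this illustration.

```python
import numpy as np

def vdw_coexistence(T, p_lo, p_hi, iters=80):
    """Return (p_sat, v_liq, v_vap) for the reduced van der Waals EOS.

    p_lo and p_hi must bracket the saturation pressure at temperature T < 1.
    """
    def outer_roots(p):
        # Volumes with p(v) == p solve 3p v^3 - (p + 8T) v^2 + 9v - 3 = 0.
        r = np.roots([3.0 * p, -(p + 8.0 * T), 9.0, -3.0])
        r = np.sort(r[np.abs(r.imag) < 1e-9].real)
        return r[0], r[-1]

    def area(p):
        # Integral of (p(v) - p) dv between the outer roots; the
        # antiderivative of p(v) is (8T/3) ln(3v - 1) + 3/v.
        v1, v3 = outer_roots(p)
        F = lambda v: (8.0 * T / 3.0) * np.log(3.0 * v - 1.0) + 3.0 / v
        return F(v3) - F(v1) - p * (v3 - v1)

    for _ in range(iters):
        p = 0.5 * (p_lo + p_hi)
        if area(p) > 0.0:      # vapor lobe dominates: raise the pressure
            p_lo = p
        else:
            p_hi = p
    v_liq, v_vap = outer_roots(p)
    return p, v_liq, v_vap

p_sat, v_liq, v_vap = vdw_coexistence(T=0.9, p_lo=0.46, p_hi=0.72)
```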

  16. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Computer services are normally assumed to work well all the time. This usually holds for crucial services such as bank electronic services, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and looked for clues that would help predict the consistency of their behavior and the quality of harvesting, a task made harder by transient conditions, the large number of services, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant portion of the OAI services have ceased working, while many others occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we see no way to overcome that. Others fail always, or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.

  17. Consistent quadrupole-octupole collective model

    Science.gov (United States)

    Dobrowolski, A.; Mazurek, K.; Góźdź, A.

    2016-11-01

    Within this work we present a consistent approach to quadrupole-octupole collective vibrations coupled with the rotational motion. A realistic collective Hamiltonian with variable mass-parameter tensor and potential, obtained through the macroscopic-microscopic Strutinsky-like method with a particle-number-projected BCS (Bardeen-Cooper-Schrieffer) approach in the full vibrational and rotational, nine-dimensional collective space, is diagonalized in the basis of projected harmonic oscillator eigensolutions. This orthogonal basis of zero-, one-, two-, and three-phonon oscillator-like functions in the vibrational part, coupled with the corresponding Wigner function, is in addition symmetrized with respect to the so-called symmetrization group appropriate to the collective space of the model. In the present model it is the D4 group acting in the body-fixed frame. This symmetrization procedure is applied in order to ensure the uniqueness of the Hamiltonian eigensolutions with respect to the laboratory coordinate system. The symmetrization is obtained using the projection onto the irreducible representation technique. The model generates the quadrupole ground-state spectrum as well as the lowest negative-parity spectrum in the 156Gd nucleus. The interband and intraband B(E1) and B(E2) reduced transition probabilities are also calculated within those bands and compared with recent experimental results for this nucleus. Such a collective approach is helpful in searching for fingerprints of possible high-rank symmetries (e.g., octahedral and tetrahedral) in nuclear collective bands.

  18. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  19. Volume Haptics with Topology-Consistent Isosurfaces.

    Science.gov (United States)

    Corenthy, Loïc; Otaduy, Miguel A; Pastor, Luis; Garcia, Marcos

    2015-01-01

    Haptic interfaces offer an intuitive way to interact with and manipulate 3D datasets, and may simplify the interpretation of visual information. This work proposes an algorithm to provide haptic feedback directly from volumetric datasets, as an aid to regular visualization. The haptic rendering algorithm lets users perceive isosurfaces in volumetric datasets, and it relies on several design features that ensure a robust and efficient rendering. A marching tetrahedra approach enables the dynamic extraction of a piecewise linear continuous isosurface. Robustness is achieved using a continuous collision detection step coupled with state-of-the-art proxy-based rendering methods over the extracted isosurface. The introduced marching tetrahedra approach guarantees that the extracted isosurface will match the topology of an equivalent isosurface computed using trilinear interpolation. The proposed haptic rendering algorithm improves the consistency between haptic and visual cues by computing a second proxy on the isosurface displayed on screen. Our experiments demonstrate the improvements on the isosurface extraction stage as well as the robustness and the efficiency of the complete algorithm.

  20. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Interferometer (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  1. Retrocausation, Consistency, and the Bilking Paradox

    Science.gov (United States)

    Dobyns, York H.

    2011-11-01

    Retrocausation seems to admit of time paradoxes in which events prevent themselves from occurring and thereby create a physical instance of the liar's paradox, an event which occurs iff it does not occur. The specific version in which a retrocausal event is used to trigger an intervention which prevents its own future cause is called the bilking paradox (the event is bilked of its cause). The analysis of Echeverria, Klinkhammer, and Thorne (EKT) suggests time paradoxes cannot arise even in the presence of retrocausation. Any self-contradictory event sequence will be replaced in reality by a closely related but noncontradictory sequence. The EKT analysis implies that attempts to create bilking must instead produce logically consistent sequences wherein the bilked event arises from alternative causes. Bilking a retrocausal information channel of limited reliability usually results only in failures of signaling. An exception applies when the bilking is conducted in response only to some of the signal values that can be carried on the channel. Theoretical analysis based on EKT predicts that, since some of the channel outcomes are not bilked, the channel is capable of transmitting data with its normal reliability, and the paradox-avoidance effects will instead suppress the outcomes that would lead to forbidden (bilked) transmissions. A recent parapsychological experiment by Bem displays a retrocausal information channel of sufficient reliability to test this theoretical model of physical reality's response to retrocausal effects. A modified version with partial bilking would provide a direct test of the generality of the EKT mechanism.

  2. Ciliate communities consistently associated with coral diseases

    Science.gov (United States)

    Sweet, M. J.; Séré, M. G.

    2016-07-01

    Incidences of coral disease are increasing. Most studies which focus on diseases in these organisms routinely assess variations in bacterial associates. However, other microorganism groups such as viruses, fungi and protozoa are only recently starting to receive attention. This study aimed at assessing the diversity of ciliates associated with coral diseases over a wide geographical range. Here we show that a wide variety of ciliates are associated with all nine coral diseases assessed. Many of these ciliates such as Trochilia petrani and Glauconema trihymene feed on the bacteria which are likely colonizing the bare skeleton exposed by the advancing disease lesion or the necrotic tissue itself. Others such as Pseudokeronopsis and Licnophora macfarlandi are common predators of other protozoans and will be attracted by the increase in other ciliate species to the lesion interface. However, a few ciliate species (namely Varistrombidium kielum, Philaster lucinda, Philaster guamense, a Euplotes sp., a Trachelotractus sp. and a Condylostoma sp.) appear to harbor symbiotic algae, potentially from the coral themselves, a result which may indicate that they play some role in the disease pathology at the very least. Although, from this study alone we are not able to discern what roles any of these ciliates play in disease causation, the consistent presence of such communities with disease lesion interfaces warrants further investigation.

  3. Improving electrofishing catch consistency by standardizing power

    Science.gov (United States)

    Burkhardt, Randy W.; Gutreuter, Steve

    1995-01-01

    The electrical output of electrofishing equipment is commonly standardized by using either constant voltage or constant amperage. However, simplified circuit and wave theories of electricity suggest that standardization of the power (wattage) available for transfer from water to fish may be critical for effective standardization of electrofishing. Electrofishing with standardized power ensures that constant power is transferable to fish regardless of water conditions. The in situ performance of standardized power output is poorly known. We used data collected by the interagency Long Term Resource Monitoring Program (LTRMP) in the upper Mississippi River system to assess the effectiveness of standardizing power output. The data consisted of 278 electrofishing collections, comprising 9,282 fishes in eight species groups, obtained during 1990 from main channel border, backwater, and tailwater aquatic areas in four reaches of the upper Mississippi River and one reach of the Illinois River. Variation in power output explained an average of 14.9% of catch variance for night electrofishing and 12.1% for day electrofishing. Three patterns in catch per unit effort were observed for different species: increasing catch with increasing power, decreasing catch with increasing power, and no power-related pattern. Therefore, in addition to reducing catch variation, controlling power output may provide some capability to select particular species. The LTRMP adopted standardized power output beginning in 1991; standardized power output is adjusted for variation in water conductivity and water temperature by reference to a simple chart. Our data suggest that by standardizing electrofishing power output, the LTRMP has eliminated substantial amounts of catch variation at virtually no additional cost.
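
    The conductivity adjustment mentioned in the last paragraph can be sketched using the impedance-matching form from power-transfer theory; both the matching-factor formula and the effective fish conductivity of 115 µS/cm are assumptions of this illustration, not values taken from the LTRMP chart.

```python
FISH_CONDUCTIVITY = 115.0   # uS/cm; assumed effective fish conductivity

def applied_power_goal(target_watts, water_conductivity):
    """Applied power needed to keep the power transferable to fish constant."""
    cf, cw = FISH_CONDUCTIVITY, water_conductivity
    matching = 4.0 * cf * cw / (cf + cw) ** 2   # equals 1 when cw == cf
    return target_watts / matching

# Matched water needs exactly the target; mismatched water needs more power.
p_matched = applied_power_goal(3000.0, 115.0)   # water matches fish
p_cold_soft = applied_power_goal(3000.0, 40.0)  # low-conductivity water
```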

  4. Felipe II

    Directory of Open Access Journals (Sweden)

    Carlos Restrepo Canal

    1962-04-01

    Full Text Available As part of the monumental Historia de España that, under the distinguished and most able direction of don Ramón Menéndez Pidal, began to be published in 1954 by Editorial Espasa Calpe S. A., two volumes devoted to the reign of Felipe II appeared in 1958; the era in which the Spanish empire achieved its peninsular unity together with the far-reaching power that made it the foremost power in Europe.

  5. Structural Consistency: Enabling XML Keyword Search to Eliminate Spurious Results Consistently

    CERN Document Server

    Lee, Ki-Hoon; Han, Wook-Shin; Kim, Min-Soo

    2009-01-01

    XML keyword search is a user-friendly way to query XML data using only keywords. In XML keyword search, to achieve high precision without sacrificing recall, it is important to remove spurious results not intended by the user. Efforts to eliminate spurious results have enjoyed some success by using the concepts of LCA or its variants, SLCA and MLCA. However, existing methods can still return many spurious results. The fundamental cause of spurious results is that existing methods try to eliminate them locally, without global examination of all the query results; accordingly, some spurious results are not consistently eliminated. In this paper, we propose a novel keyword search method that removes spurious results consistently by exploiting the new concept of structural consistency.
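
    The LCA machinery underlying these methods can be stated concretely. Below is a minimal, hypothetical sketch (not the paper's algorithm): XML nodes are encoded as Dewey IDs (tuples of child positions), the LCA of two nodes is their longest common Dewey prefix, and the SLCA variant keeps only those LCAs that have no other LCA as a descendant.

    ```python
    def lca(dewey_a, dewey_b):
        """Lowest common ancestor of two XML nodes encoded as Dewey IDs
        (tuples of child indices): the longest common prefix."""
        prefix = []
        for x, y in zip(dewey_a, dewey_b):
            if x != y:
                break
            prefix.append(x)
        return tuple(prefix)

    def slca(matches_k1, matches_k2):
        """Smallest LCAs of two keyword match lists: keep only LCAs with
        no other LCA strictly below them (naive O(n^2) sketch)."""
        lcas = {lca(a, b) for a in matches_k1 for b in matches_k2}
        return {l for l in lcas
                if not any(m != l and m[:len(l)] == l for m in lcas)}
    ```

    For instance, if "XML" matches nodes (1,1) and (2,1) and "John" matches (1,2), the LCAs are (1,) and the root (), and SLCA keeps only (1,), discarding the root as a spurious, too-coarse result.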

  6. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming

    2013-07-26

    Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified by these methods in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) global association testing (are any of the variants associated with the disease?) and (ii) causal variant detection (which variants, if any, are driving the association?). The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than the existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  7. Light harvesting in photosystem II

    NARCIS (Netherlands)

    Amerongen, van H.; Croce, R.

    2013-01-01

    Water oxidation in photosynthesis takes place in photosystem II (PSII). This photosystem is built around a reaction center (RC) where sunlight-induced charge separation occurs. This RC consists of various polypeptides that bind only a few chromophores or pigments, next to several other cofactors. It

  9. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  10. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora.
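
    The α = 0.72 consistency figure is a Cronbach's alpha computed across each participant's ten coded selfies. A minimal sketch of that computation (the −1/0/+1 coding below is a hypothetical choice for illustration: left, midline, right):

    ```python
    def cronbach_alpha(scores):
        """Cronbach's alpha for rows of participants, each a list of k item
        scores (e.g. k selfies coded -1 left / 0 midline / +1 right):
        alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
        k = len(scores[0])

        def var(xs):  # population variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / len(xs)

        item_vars = [var([row[i] for row in scores]) for i in range(k)]
        total_var = var([sum(row) for row in scores])
        return k / (k - 1) * (1 - sum(item_vars) / total_var)
    ```

    Participants who always pose the same way (rows of identical codes) drive alpha toward 1; participants who switch orientations from selfie to selfie pull it toward 0.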

  12. Spectral and thermodynamic properties of Ag(I), Au(III), Cd(II), Co(II), Fe(III), Hg(II), Mn(II), Ni(II), Pb(II), U(IV), and Zn(II) binding by methanobactin from Methylosinus trichosporium OB3b.

    Science.gov (United States)

    Choi, Dong W; Do, Young S; Zea, Corbin J; McEllistrem, Marcus T; Lee, Sung-W; Semrau, Jeremy D; Pohl, Nicola L; Kisting, Clint J; Scardino, Lori L; Hartsel, Scott C; Boyd, Eric S; Geesey, Gill G; Riedel, Theran P; Shafe, Peter H; Kranski, Kim A; Tritsch, John R; Antholine, William E; DiSpirito, Alan A

    2006-12-01

    Methanobactin (mb) is a novel chromopeptide that appears to function as the extracellular component of a copper acquisition system in methanotrophic bacteria. To examine this potential physiological role, and to distinguish it from iron binding siderophores, the spectral (UV-visible absorption, circular dichroism, fluorescence, and X-ray photoelectron) and thermodynamic properties of metal binding by mb were examined. In the absence of Cu(II) or Cu(I), mb will bind Ag(I), Au(III), Co(II), Cd(II), Fe(III), Hg(II), Mn(II), Ni(II), Pb(II), U(VI), or Zn(II), but not Ba(II), Ca(II), La(II), Mg(II), and Sr(II). The results suggest metals such as Ag(I), Au(III), Hg(II), Pb(II) and possibly U(VI) are bound by a mechanism similar to Cu, whereas the coordination of Co(II), Cd(II), Fe(III), Mn(II), Ni(II) and Zn(II) by mb differs from Cu(II). Consistent with its role as a copper-binding compound or chalkophore, the binding constants of all the metals examined were less than those observed with Cu(II) and copper displaced other metals except Ag(I) and Au(III) bound to mb. However, the binding of different metals by mb suggests that methanotrophic activity also may play a role in either the solubilization or immobilization of many metals in situ.

  13. Elizabeth II's new art gallery

    Index Scriptorium Estoniae

    1999-01-01

    To mark the 50th anniversary of her accession to the throne, Elizabeth II will open a new art gallery at Buckingham Palace on 6 February 2002, to be built as a wing of the palace. Architect: John Simpson. From Elizabeth II's art collection

  15. The Belle II Detector

    Science.gov (United States)

    Piilonen, Leo; Belle Collaboration, II

    2017-01-01

    The Belle II detector is now under construction at the KEK laboratory in Japan. This project represents a substantial upgrade of the Belle detector (and the KEKB accelerator). The Belle II experiment will record 50 ab⁻¹ of data, a factor of 50 more than that recorded by Belle. This large data set, combined with the low backgrounds and high trigger efficiencies characteristic of an e+e- experiment, should provide unprecedented sensitivity to new physics signatures in B and D meson decays, and in τ lepton decays. The detector comprises many forefront subsystems. The vertex detector consists of two inner layers of silicon DEPFET pixels and four outer layers of double-sided silicon strips. These layers surround a beryllium beam pipe having a radius of only 10 mm. Outside of the vertex detector is a large-radius, small-cell drift chamber, an "imaging time-of-propagation" detector based on Cerenkov radiation for particle identification, and scintillating fibers and resistive plate chambers used to identify muons. The detector will begin commissioning in 2017.

  16. Selected methods for dissolved iron (II, III) and dissolved sulfide (-II) determinations in geothermal waters

    Science.gov (United States)

    Vivit, D.V.; Jenne, E.A.

    1985-01-01

    Dissolved sulfide (-II) and dissolved iron (II, III) were determined in geothermal well water samples collected at Cerro Prieto, Mexico. Most samples consisted of liquid and gas (two phases) at the instant of collection; a subset of samples, referred to as 'flashed' samples, consisted of pressurized steam samples which were allowed to condense. Sulfide was determined by sulfide-specific ion electrode; Fe(II) and Fe(III) plus Fe(II) were determined spectrophotometrically. The precision and accuracy of the methods were evaluated for these high-silica waters with replicate analyses, spike recoveries, and an alternate method. Direct current (d.c.) argon plasma emission spectrometry was the alternate method used for Fe(III)-plus-Fe(II) analyses. Mean dissolved iron concentrations ranged from 20.2 to 834 micrograms/L (ug/L) as Fe(II) and 26.8 to 904 ug/L as Fe(III) plus Fe(II). Mean sulfide concentrations ranged from about 0.01 to 5.3 mg/L as S(-II). Generally, higher S(-II) values and larger Fe(II)/Fe(III) ratios were found in the two-phase samples. These findings suggest that the 'flashed' samples are at a less reduced state than the two-phase samples. (Author's abstract)

  17. Consistent Evolution of Software Artifacts and Non-Functional Models

    Science.gov (United States)

    2014-11-14

    Ruscio D., Pierantonio A., Arcelli D., Eramo R., Trubiani C., Tucci M. Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica ... Models (SRMs), and (ii) antipattern solutions as Target Role Models (TRMs). Hence, SRM-TRM pairs represent new instruments in the hands of developers to ... helps to identify the antipatterns that more heavily contribute to the violation of performance requirements [10], and (ii) another one aimed at

  18. Characterization of consistent triggers of migraine with aura

    DEFF Research Database (Denmark)

    Hauge, Anne Werner; Kirchmann, Malene; Olesen, Jes

    2011-01-01

    The aim of the present study was to characterize perceived consistent triggers of migraine with aura (MA).

  20. The utility of theory of planned behavior in predicting consistent ...

    African Journals Online (AJOL)

    admin

    outcomes of the behavior and the evaluations of these outcomes (behavioral beliefs) ... belief towards consistent condom use and motivation for compliance with .... consistency of the items used before constructing a scale. Results. All of the ...

  1. Generalized contexts and consistent histories in quantum mechanics

    Science.gov (United States)

    Losada, Marcelo; Laura, Roberto

    2014-05-01

    We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.

  2. Study on consistent query answering in inconsistent databases

    Institute of Scientific and Technical Information of China (English)

    XIE Dong; YANG Luming

    2007-01-01

    Consistent query answering is an approach to retrieving consistent answers from databases that might be inconsistent with respect to some given integrity constraints. The approach is based on the concept of a repair. This paper surveys several recent studies on obtaining consistent information from inconsistent databases, covering the underlying semantic model, a number of approaches to computing consistent query answers, and the computational complexity of this problem. Furthermore, the work outlines potential research directions in this area.
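
    The repair-based semantics can be made concrete with a toy example. The following is an illustrative sketch, not an algorithm from the survey: for a relation violating a primary-key constraint, a repair keeps exactly one tuple per key value, and a consistent answer is a tuple returned by the query in every repair.

    ```python
    from itertools import product

    def repairs(table, key):
        """All repairs of a table violating a primary-key constraint:
        choose exactly one tuple from each group sharing a key value."""
        groups = {}
        for row in table:
            groups.setdefault(row[key], []).append(row)
        return [list(choice) for choice in product(*groups.values())]

    def consistent_answers(table, key, query):
        """Tuples returned by `query` in *every* repair of the table."""
        answer_sets = [set(query(r)) for r in repairs(table, key)]
        return set.intersection(*answer_sets)
    ```

    For example, if the key-violating table is [("alice", "hr"), ("alice", "it"), ("bob", "it")] with the name as key, the query "who works in it?" returns {"bob"} as the only consistent answer: "alice" is in "it" in one repair but not the other. Real systems avoid this exponential enumeration via query rewriting, which is part of what the surveyed complexity results address.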

  3. CONSISTENCY OF LS ESTIMATOR IN SIMPLE LINEAR EV REGRESSION MODELS

    Institute of Scientific and Technical Information of China (English)

    Liu Jixue; Chen Xiru

    2005-01-01

    Consistency of the LS estimate of the simple linear EV model is studied. It is shown that, under some common assumptions on the model, weak and strong consistency of the estimate are equivalent, but this is not so for quadratic-mean consistency.
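
    Why consistency of LS in an errors-in-variables (EV) model is delicate can be seen in a quick simulation. This is a generic illustration of the classical attenuation effect, not the paper's setting: with classical measurement error on the regressor, the LS slope converges not to the true β but to β·σ_x²/(σ_x² + σ_u²).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, beta = 200_000, 2.0
    sigma_x, sigma_u = 1.0, 1.0           # true-regressor and measurement-error s.d.

    xi = rng.normal(0.0, sigma_x, n)      # true (unobserved) regressor
    x = xi + rng.normal(0.0, sigma_u, n)  # observed regressor with error
    y = beta * xi + rng.normal(0.0, 0.5, n)

    b_ls = np.cov(x, y)[0, 1] / np.var(x)           # LS slope on observed data
    attenuation = sigma_x**2 / (sigma_x**2 + sigma_u**2)
    # b_ls converges to beta * attenuation (= 1.0 here), not to beta = 2.0
    ```

    Consistency results like those in the paper therefore hinge on assumptions that control this attenuation.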

  4. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...

  5. Non-Numeric Intrajudge Consistency Feedback in an Angoff Procedure

    Science.gov (United States)

    Harrison, George M.

    2015-01-01

    The credibility of standard-setting cut scores depends in part on two sources of consistency evidence: intrajudge and interjudge consistency. Although intrajudge consistency feedback has often been provided to Angoff judges in practice, more evidence is needed to determine whether it achieves its intended effect. In this randomized experiment with…

  6. On consistency of the weighted arithmetical mean complex judgement matrix

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The weighted arithmetical mean complex judgement matrix (WAMCJM) is the most common method for aggregating group opinions, but it has a shortcoming: the WAMCJM of perfectly consistent judgement matrices given by experts cannot guarantee its own perfect consistency. An upper bound on the WAMCJM's consistency is presented. Simultaneously, a compatibility index for judging the aggregating extent of group opinions is also introduced. The WAMCJM is proved to be of acceptable consistency provided the compatibilities of all judgement matrices given by experts are smaller than the threshold value of acceptable consistency. These conclusions are important for group decision making.
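
    The shortcoming the abstract describes is easy to reproduce numerically with the standard Saaty consistency machinery for pairwise judgement matrices (a sketch using Saaty's consistency ratio and random indices; the paper's own index may differ): two perfectly consistent matrices whose arithmetic mean is no longer perfectly consistent.

    ```python
    import numpy as np

    RI = {3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random consistency indices

    def consistency_ratio(A):
        """Saaty consistency ratio of an n x n pairwise judgement matrix:
        CI = (lambda_max - n) / (n - 1), CR = CI / RI[n].
        CR <= 0.1 is the usual acceptable-consistency threshold."""
        n = A.shape[0]
        lam_max = max(np.linalg.eigvals(A).real)
        return ((lam_max - n) / (n - 1)) / RI[n]

    # Two perfectly consistent matrices (a_ij = w_i / w_j):
    A = np.array([[1.0, 2.0, 4.0],
                  [0.5, 1.0, 2.0],
                  [0.25, 0.5, 1.0]])   # weights (4, 2, 1)
    B = np.array([[1.0, 4.0, 4.0],
                  [0.25, 1.0, 1.0],
                  [0.25, 1.0, 1.0]])   # weights (4, 1, 1)
    M = (A + B) / 2.0  # their arithmetic mean: no longer perfectly consistent
    ```

    In the mean matrix M, M[0][1]·M[1][2] = 3 × 1.5 = 4.5 ≠ 4 = M[0][2], so transitivity is broken and CR(M) > 0 even though CR(A) = CR(B) = 0; this is exactly why an upper bound and an acceptability threshold are needed.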

  7. On multidimensional consistent systems of asymmetric quad-equations

    CERN Document Server

    Boll, Raphael

    2012-01-01

    Multidimensional Consistency becomes more and more important in the theory of discrete integrable systems. Recently, we gave a classification of all 3D consistent 6-tuples of equations with the tetrahedron property, where several novel asymmetric systems have been found. In the present paper we discuss higher-dimensional consistency for 3D consistent systems coming up with this classification. In addition, we will give a classification of certain 4D consistent systems of quad-equations. The results of this paper allow for a proof of the Bianchi permutability among other applications.

  8. NSLS-II Radio Frequency Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rose J.; Gao F.; Goel, A.; Holub, B.; Kulpin, J.; Marques, C.; Yeddulla, M.

    2015-05-03

    The National Synchrotron Light Source II is a 3 GeV X-ray user facility commissioned in 2014. The NSLS-II RF system consists of the master oscillator, digital low level RF controllers, linac, booster and storage ring RF sub-systems, as well as a supporting cryogenic system. Here we will report on RF commissioning and early operation experience of the system.

  9. A CONSISTENT STUDY OF METALLICITY EVOLUTION AT 0.8 < z < 2.6

    Energy Technology Data Exchange (ETDEWEB)

    Wuyts, Eva; Kurk, Jaron; Förster Schreiber, Natascha M.; Genzel, Reinhard; Wisnioski, Emily; Bandara, Kaushala; Wuyts, Stijn; Beifiori, Alessandra; Bender, Ralf; Buschkamp, Peter; Chan, Jeffrey; Davies, Ric; Eisenhauer, Frank; Fossati, Matteo; Kulkarni, Sandesh K.; Lang, Philipp [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstr. 1, D-85741 Garching (Germany); Brammer, Gabriel B. [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Burkert, Andreas [Universitäts-Sternwarte München, Scheinerstr. 1, D-81679 München (Germany); Carollo, C. Marcella; Lilly, Simon J., E-mail: evawuyts@mpe.mpg.de [Institute of Astronomy, Department of Physics, Eidgensösische Technische Hochschule, ETH Zürich, CH-8093 (Switzerland); and others

    2014-07-10

    We present the correlations between stellar mass, star formation rate (SFR), and the [N II]/Hα flux ratio as an indicator of gas-phase metallicity for a sample of 222 galaxies at 0.8 < z < 2.6 and log(M*/M☉) = 9.0-11.5 from the LUCI, SINS/zC-SINF, and KMOS^3D surveys. This sample provides a unique analysis of the mass-metallicity relation (MZR) over an extended redshift range using consistent data analysis techniques and a uniform strong-line metallicity indicator. We find a constant slope at the low-mass end of the relation and can fully describe its redshift evolution through the evolution of the characteristic turnover mass where the relation begins to flatten at the asymptotic metallicity. At a fixed mass and redshift, our data do not show a correlation between the [N II]/Hα ratio and SFR, which disagrees with the 0.2-0.3 dex offset in [N II]/Hα predicted by the "fundamental relation" between stellar mass, SFR, and metallicity discussed in recent literature. However, the overall evolution toward lower [N II]/Hα at earlier times does broadly agree with these predictions.
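
    The [N II]/Hα ratio maps to an oxygen abundance through a strong-line calibration. As a sketch, one widely used linear calibration is the N2 index of Pettini & Pagel (2004), 12 + log(O/H) = 8.90 + 0.57·log([N II]λ6584/Hα); this is shown for illustration and is not necessarily the exact calibration adopted in the paper.

    ```python
    import math

    def oh_from_n2(n2):
        """12 + log(O/H) from the [N II]6584/Halpha flux ratio via the
        Pettini & Pagel (2004) linear N2 calibration (valid roughly for
        -2.5 < log10(n2) < -0.3)."""
        return 8.90 + 0.57 * math.log10(n2)
    ```

    A measured ratio of 0.1, for instance, corresponds to 12 + log(O/H) ≈ 8.33; the monotonic mapping is why the redshift evolution of [N II]/Hα can stand in directly for metallicity evolution in the MZR.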

  10. Small Diameter Bomb Increment II (SDB II)

    Science.gov (United States)

    2013-12-01

    been further delays to the F-35 System Development and Demonstration (SDD) program. As a result, the SDB II integration will be accomplished as a ... follow-on integration to the F-35 SDD. SDB II OT&E on the F-35 will not be completed by the FRP threshold of October 2019, thus delaying the FRP decision

  11. Preferred crystallographic orientation in the ice I → II transformation and the flow of ice II

    Science.gov (United States)

    Bennett, K.; Wenk, H.-R.; Durham, W.B.; Stern, L.A.; Kirby, S.H.

    1997-01-01

    The preferred crystallographic orientation developed during the ice I → II transformation and during the plastic flow of ice II was measured in polycrystalline deuterium oxide (D2O) specimens using low-temperature neutron diffraction. Samples partially transformed from ice I to II under a non-hydrostatic stress developed a preferred crystallographic orientation in the ice II. Samples of pure ice II transformed from ice I under a hydrostatic stress and then compressed axially developed a strong preferred orientation of compression axes parallel to (1010). A match to the observed preferred orientation using the viscoplastic self-consistent theory was obtained only when (1010) [0001] was taken as the predominant slip system in ice II.

  12. Incompatible multiple consistent sets of histories and measures of quantumness

    Science.gov (United States)

    Halliwell, J. J.

    2017-07-01

    In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.

  13. Multi-Arts Resource Guide. Revised [and] [videotapes I and II].

    Science.gov (United States)

    Pascale, Louise

    This resource guide and the accompanying videotapes are based on the artist-generated lesson plans of the Very Special Arts Massachusetts artist residency program. Six arts units are presented: Collage, Puppetry, Movement I & II, Printmaking I & II, Theater Arts I & II, and Self Concept I & II. Each of these units consists of…

  14. Factor II deficiency

    Science.gov (United States)

    ... if one or more of these factors are missing or are not functioning like they should. Factor II is one such coagulation factor. Factor II deficiency runs in families (inherited) and is very rare. Both parents must ...

  15. Integrable Heisenberg Ferromagnet Equations with self-consistent potentials

    CERN Document Server

    Zhunussova, Zh Kh; Tungushbaeva, D I; Mamyrbekova, G K; Nugmanova, G N; Myrzakulov, R

    2013-01-01

    In this paper, we consider some integrable Heisenberg Ferromagnet Equations with self-consistent potentials. We study their Lax representations. In particular we give their equivalent counterparts which are nonlinear Schrödinger type equations. We present the integrable reductions of the Heisenberg Ferromagnet Equations with self-consistent potentials. These integrable Heisenberg Ferromagnet Equations with self-consistent potentials describe nonlinear waves in ferromagnets with magnetic fields.

  16. Behavioural consistency and life history of Rana dalmatina tadpoles

    OpenAIRE

    Urszán, Tamás Janós; Török, János; Hettyey, Attila; Garamszegi, László Z; Herczeg, Gábor

    2015-01-01

    The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, w...

  17. Students’ conceptual understanding consistency of heat and temperature

    Science.gov (United States)

    Slamet Budiarti, Indah; Suparmi; Sarwanto; Harjana

    2017-01-01

    The aims of the research were to explore and describe the consistency of students' understanding of the concepts of heat and temperature. The sample, taken using a purposive random sampling technique, consisted of 99 high school students from 3 senior high schools in Jayapura city. A descriptive qualitative method was employed in this study. The data were collected using tests and interviews on the subject matter of heat and temperature. Based on the results of the data analysis, it was concluded that 3.03% of the students were consistent with correct answers, 79.80% were consistent but with wrong answers, and 17.17% were inconsistent.

  18. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.

  19. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787
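
    The pooled r = 0.43 reported above is the kind of estimate a meta-analysis of correlations produces. A minimal sketch of generic fixed-effect Fisher-z pooling (the authors' actual model may be more elaborate, e.g. random-effects with moderators): transform each study's r to Fisher z, average with weights n − 3, and back-transform.

    ```python
    import math

    def meta_mean_correlation(studies):
        """Fixed-effect pooled correlation from (r, n) pairs: average the
        Fisher z transforms weighted by n - 3, then back-transform."""
        zs = [(math.atanh(r), n - 3) for r, n in studies]
        z_bar = sum(z * w for z, w in zs) / sum(w for _, w in zs)
        return math.tanh(z_bar)
    ```

    The Fisher transform is used because z is approximately normal with variance 1/(n − 3), so larger studies get proportionally more weight and the back-transformed pooled value stays inside (−1, 1).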

  20. Affective-cognitive consistency and thought-induced attitude polarization.

    Science.gov (United States)

    Chaiken, S; Yates, S

    1985-12-01

    Subjects whose preexperimental attitudes toward either capital punishment or censorship were high or low in affective-cognitive consistency were identified. These four groups thought about their attitudes by writing two essays, one on the topic for which consistency had been assessed (relevant essay) and one on the unassessed topic (distractor essay). In accord with the hypothesis that thought-induced attitude polarization requires the presence of a well-developed knowledge structure, high-consistency subjects evidenced greater polarization than low-consistency subjects only on the relevant topic after writing the relevant essay. Content analyses of subjects' relevant essays yielded additional data confirming Tesser's ideas regarding mediation: high- (vs. low-) consistency subjects expressed a greater proportion of cognitions that were evaluatively consistent with their prior affect toward the attitude object and a smaller proportion of evaluatively inconsistent and neutral cognitions. Moreover, although high- and low-consistency subjects did not differ in the amount of attitudinally relevant information they possessed or their awareness of inconsistent cognitions, their methods of dealing with discrepant information diverged: high-consistency subjects evidenced a greater tendency to assimilate discrepant information by generating refutational thoughts that discredited or minimized the importance of inconsistent information.

  1. Philosophical and Methodological Problem of Consistency of Mathematical Theories

    Directory of Open Access Journals (Sweden)

    Michailova N. V.

    2013-01-01

    Full Text Available The increased abstraction of modern mathematical theories has revived interest in the traditional philosophical and methodological problem of an internally consistent system of axioms, one from which no pair of mutually contradictory statements can be deduced. For axioms describing a well-known domain of mathematical objects, the problem appears less pressing from the standpoint of local consistency, but it remains bound up with the formalists' various attempts to explain mathematical existence through consistency. For example, the problem of establishing the consistency of mathematical analysis, whose solution would clarify the fate of Hilbert's proof theory, has not yet been solved, nor has the problem of the consistency of axiomatic set theory. It can therefore be assumed that the criterion of consistency, despite its essential role in axiomatic systems of both formal and substantive nature, is an auxiliary logical criterion in the same way that mathematical provability is. An adequate solution of the problem of the consistency of mathematics can be achieved through methodological and substantive arguments that reveal the mechanism by which contradictions appear in a mathematical theory. The paper shows that, from a systemic point of view, in the context of a philosophical and methodological synthesis of the various directions in the justification of modern mathematics, one cannot insist on the rationale of consistency of mathematical theories alone.

  2. MANUFACTURE OF THE FERMENTED SAUSAGES WITH THE SMEARED CONSISTENCE

    Directory of Open Access Journals (Sweden)

    Nesterenko A. A.

    2014-10-01

    Full Text Available Smoked sausage products with a smeared (spreadable) consistency are in great demand in foreign practice. The article presents the basic aspects of manufacturing smoked sausages with a smeared consistency: the choice of spices and starter cultures, and the method of preparing the forcemeat.

  3. Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series

    DEFF Research Database (Denmark)

    Gao, Jiti; Kanaya, Shin; Li, Degui

    2015-01-01

    This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our...... results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series....

  4. Delimiting Coefficient α from Internal Consistency and Unidimensionality

    Science.gov (United States)

    Sijtsma, Klaas

    2015-01-01

    I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…

  5. Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.

    Science.gov (United States)

    Edwards, H. P.; And Others

    1982-01-01

    Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)

  6. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which

  7. The Self-Consistency Model of Subjective Confidence

    Science.gov (United States)

    Koriat, Asher

    2012-01-01

    How do people monitor the correctness of their answers? A self-consistency model is proposed for the process underlying confidence judgments and their accuracy. In answering a 2-alternative question, participants are assumed to retrieve a sample of representations of the question and base their confidence on the consistency with which the chosen…

  8. Dynamic Consistency between Value and Coordination Models - Research Issues.

    NARCIS (Netherlands)

    Bodenstaff, L.; Wombacher, Andreas; Reichert, M.U.; meersman, R; Tari, Z; herrero, p

    Inter-organizational business cooperations can be described from different viewpoints each fulfilling a specific purpose. Since all viewpoints describe the same system they must not contradict each other, thus, must be consistent. Consistency can be checked based on common semantic concepts of the

  9. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, A.

    2006-01-01

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which utiliz

  10. Logical consistency and sum-constrained linear models

    NARCIS (Netherlands)

    van Perlo -ten Kleij, Frederieke; Steerneman, A.G.M.; Koning, Ruud H.

    2006-01-01

    A topic that has received quite some attention in the seventies and eighties is logical consistency of sum-constrained linear models. Loosely defined, a sum-constrained model is logically consistent if the restrictions on the parameters and explanatory variables are such that the sum constraint is a

  11. Structural basis of transcription initiation by RNA polymerase II.

    Science.gov (United States)

    Sainsbury, Sarah; Bernecky, Carrie; Cramer, Patrick

    2015-03-01

    Transcription of eukaryotic protein-coding genes commences with the assembly of a conserved initiation complex, which consists of RNA polymerase II (Pol II) and the general transcription factors, at promoter DNA. After two decades of research, the structural basis of transcription initiation is emerging. Crystal structures of many components of the initiation complex have been resolved, and structural information on Pol II complexes with general transcription factors has recently been obtained. Although mechanistic details await elucidation, available data outline how Pol II cooperates with the general transcription factors to bind to and open promoter DNA, and how Pol II directs RNA synthesis and escapes from the promoter.

  12. The construction and combined operation for fuzzy consistent matrixes

    Institute of Scientific and Technical Information of China (English)

    YAO Min; SHEN Bin; LUO Jian-hua

    2005-01-01

    Fuzziness is one of the general characteristics of human thinking and objective things. Introducing fuzzy techniques into decision-making yields very good results. The fuzzy consistent matrix has many excellent characteristics, especially center-division transitivity conforming to the reality of the human thinking process in decision-making. This paper presents a new approach for creating a fuzzy consistent matrix from a mutual supplementary matrix in fuzzy decision-making. At the same time, based on the distance between an individual fuzzy consistent matrix and the average fuzzy consistent matrix, a combined operation for several fuzzy consistent matrixes is presented, which reflects the majority opinion of experienced experts. Finally, a practical example further shows its flexibility and practicability.
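
    A widely used transformation of this kind (assumed here for illustration; the paper's exact construction may differ) converts a mutual supplementary (complementary) matrix F, with f_ij + f_ji = 1, into a fuzzy consistent matrix via its row sums, R_ij = (r_i - r_j)/(2n) + 0.5:

```python
# Build a fuzzy consistent matrix R from a complementary matrix F.
# r_i = sum_j f_ij; R_ij = (r_i - r_j) / (2n) + 0.5.
def to_consistent(F):
    n = len(F)
    r = [sum(row) for row in F]
    return [[(r[i] - r[j]) / (2 * n) + 0.5 for j in range(n)]
            for i in range(n)]

F = [[0.5, 0.7, 0.8],   # example complementary judgment matrix
     [0.3, 0.5, 0.6],
     [0.2, 0.4, 0.5]]
R = to_consistent(F)
# Additive consistency holds by construction:
# R[i][j] = R[i][k] - R[j][k] + 0.5 for all i, j, k.
print(all(abs(R[i][j] - (R[i][k] - R[j][k] + 0.5)) < 1e-9
          for i in range(3) for j in range(3) for k in range(3)))
```

    The resulting R remains complementary (R_ij + R_ji = 1) and additionally satisfies the transitivity property that characterises fuzzy consistent matrixes.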

  13. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  14. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  15. Multi-Graph Matching via Affinity Optimization with Graduated Consistency Regularization.

    Science.gov (United States)

    Yan, Junchi; Cho, Minsu; Zha, Hongyuan; Yang, Xiaokang; Chu, Stephen M

    2016-06-01

    This paper addresses the problem of matching common node correspondences among multiple graphs referring to an identical or related structure. This multi-graph matching problem involves two correlated components: i) the local pairwise matching affinity across pairs of graphs; ii) the global matching consistency that measures the uniqueness of the pairwise matchings by different composition orders. Previous studies typically either enforce the matching consistency constraints in the beginning of an iterative optimization, which may propagate matching error both over iterations and across graph pairs; or separate affinity optimization and consistency enforcement into two steps. This paper is motivated by the observation that matching consistency can serve as a regularizer in the affinity objective function especially when the function is biased due to noises or inappropriate modeling. We propose composition-based multi-graph matching methods to incorporate the two aspects by optimizing the affinity score, meanwhile gradually infusing the consistency. We also propose two mechanisms to elicit the common inliers against outliers. Compelling results on synthetic and real images show the competency of our algorithms.
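
    The cycle-consistency notion in (ii) can be sketched with plain permutations (a simplification of the paper's consistency measure, not its algorithm): pairwise matchings are consistent when the direct matching between two graphs agrees with the composition through any intermediate graph.

```python
def compose(p, q):
    """(p ∘ q)[x] = p[q[x]] for permutations given as lists."""
    return [p[x] for x in q]

def consistency(p_ij, p_ik, p_kj):
    """Fraction of nodes on which the direct matching agrees with
    the composition through the intermediate graph k."""
    via_k = compose(p_ik, p_kj)
    return sum(a == b for a, b in zip(p_ij, via_k)) / len(p_ij)

p_ik = [2, 0, 1]             # matching: graph k -> graph i
p_kj = [0, 2, 1]             # matching: graph j -> graph k
p_ij = compose(p_ik, p_kj)   # a perfectly cycle-consistent direct matching
print(consistency(p_ij, p_ik, p_kj))  # 1.0
print(round(consistency([0, 1, 2], p_ik, p_kj), 3))
```

    Averaging such scores over all intermediate graphs gives a consistency measure that can act as the regularizer described above.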

  16. Consistent assignment of nurse aides: association with turnover and absenteeism.

    Science.gov (United States)

    Castle, Nicholas G

    2013-01-01

    Consistent assignment refers to the same caregivers consistently caring for the same residents almost every time caregivers are on duty. This article examines the association of consistent assignment of nurse aides with turnover and absenteeism. Data came from a survey of nursing home administrators, the Online Survey Certification and Reporting data, and the Area Resource File. The measures were from 2007 and came from 3,941 nursing homes. Multivariate logistic regression models were used to examine turnover and absenteeism. An average of 68% of nursing homes reported using consistent assignment, with 28% of nursing homes using consistent assignment of nurse aides at the often-recommended level of 85% (or more). Nursing homes using recommended levels of consistent assignment had significantly lower rates of turnover and of absenteeism. In the multivariate analyses, consistent assignment was significantly associated with both lower turnover and lower absenteeism. Consistent assignment is a practice recommended by many policy makers, government agencies, and industry advocates. The findings presented here provide some evidence that the use of this staffing practice can be beneficial.

  17. Co(II), Ni(II), Cu(II) and Zn(II) complexes of a bipyridine bis-phenol conjugate: generation and properties of coordinated radical species.

    Science.gov (United States)

    Arora, Himanshu; Philouze, Christian; Jarjayes, Olivier; Thomas, Fabrice

    2010-11-14

    Four bis-phenolate complexes [Zn(II)L], [Ni(II)L], [Cu(II)L] and [Co(II)L] (where H(2)L = 2,2'-[2,2']bipyridinyl-6-yl-bis-4,6-di-tert-butylphenol) have been synthesized. The copper(II) and nickel(II) complexes have been characterized by X-ray diffraction, showing a metal ion within a square planar geometry, slightly distorted towards tetrahedral. The cyclic voltammetry (CV) curve of [Zn(II)L] consists of a single bi-electronic reversible wave at 0.06 V vs. Fc/Fc(+). The electrochemically generated dication is a diradical species [Zn(II)L˙˙](2+) that exhibits the typical phenoxyl π-π* band at 395 nm. It is EPR-silent due to magnetic interactions between the phenoxyl moieties. The CV curves of [Ni(II)L] and [Cu(II)L] exhibit two distinct ligand-centred one-electron oxidation waves. The first one is observed at E(1/2)(1) = 0.38 and 0.40 V for the nickel and copper complex, respectively, and corresponds to the formation of M(II)-coordinated phenoxyl radicals. Accordingly, [Ni(II)L˙](+) exhibits a strong absorption band at 960 nm and an (S = ½) EPR signal centred at g(iso) = 2.02. [Cu(II)L˙](+) is EPR-silent, in agreement with a magnetic coupling between the metal and the radical spin. In contrast with the other complexes, [Co(II)L] was found to react with dioxygen (mostly in the presence of pyridine), giving rise to a stable (S = ½) superoxo radical complex [Co(III)L(Py)(O(2)˙)]. One-electron oxidation of [Co(II)L] at -0.01 V affords a diamagnetic cobalt(III) complex [Co(III)L](+) that is inert towards O(2) binding, whereas two-electron oxidation leads to the paramagnetic phenoxyl radical species [Co(III)L˙](+) whose EPR spectrum features an (S = ½) signal at g(iso) = 2.00.

  18. THE CONSISTENCY OF STATISTICAL ESTIMATES OF THURSTONE-MOSTELLER

    Directory of Open Access Journals (Sweden)

    Y. V. Bugaev

    2015-01-01

    Full Text Available The traditional analysis of collective choice procedures involves three different approaches: investigating the voting operator against characteristic conditions, investigating the properties of the choice function, and analysing the possibility of manipulation (verifying the stability of the voting process under negative influences from voters or the organizer). The research team of the ITMU department of VSUET proposed and implemented a fourth approach: studying the probabilistic characteristics of the results of the procedures (the bias of the estimate of an alternative's usefulness relative to its true value, the standard deviation of that estimate from the true value, the probability of a correct ranking of the alternatives at the output of the choice procedure, etc.). This article analyses the consistency of the usefulness estimates of the compared alternatives obtained at the output of the traditional Thurstone-Mosteller procedure and of its generalizations created by the authors. In general, consistency of a statistical estimator means that the estimation error tends to zero as the sample size increases. Depending on how the "estimation error" is interpreted, the main types of consistency are: weak consistency, based on convergence in probability; strong consistency, based on convergence with probability one; and mean-square consistency, in which the variance of the estimate tends to zero. The article proves a theorem according to which, under assumptions of a rather general nature, the usefulness estimates of the ranked alternatives obtained by the Thurstone-Mosteller procedure satisfy mean-square consistency.
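
    As background, the classical Thurstone (Case V) scaling step that the procedure above builds on can be sketched from a matrix of pairwise preference proportions (the data matrix below is invented for illustration; the authors' generalizations are not reproduced):

```python
from statistics import NormalDist

def thurstone_case_v(P):
    """Scale values from pairwise preference proportions:
    z_ij = Phi^{-1}(P_ij); scale_i = mean of row i of Z
    (defined up to an additive constant)."""
    n = len(P)
    inv = NormalDist().inv_cdf
    return [sum(inv(P[i][j]) for j in range(n)) / n for i in range(n)]

# P[i][j]: proportion of judges preferring alternative i over j (invented).
P = [[0.5, 0.7, 0.9],
     [0.3, 0.5, 0.8],
     [0.1, 0.2, 0.5]]
scales = thurstone_case_v(P)
print([round(s, 3) for s in scales])
```

    Because P is complementary (P_ij + P_ji = 1), the z-matrix is antisymmetric and the scale values sum to zero; consistency of these estimates as the number of judges grows is exactly what the theorem above addresses.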

  19. Reflection symmetries of Isolated Self-consistent Stellar Systems

    OpenAIRE

    An, J; Evans, N.W.; Sanders, J. L.

    2016-01-01

    Isolated, steady-state galaxies correspond to equilibrium solutions of the Poisson--Vlasov system. We show that (i) all galaxies with a distribution function depending on energy alone must be spherically symmetric and (ii) all axisymmetric galaxies with a distribution function depending on energy and the angular momentum component parallel to the symmetry axis must also be reflection-symmetric about the plane $z=0$. The former result is Lichtenstein's Theorem, derived here by a method exploit...

  20. The Consistent Preferences Approach to Deductive Reasoning in Games

    CERN Document Server

    Asheim, Geir B

    2006-01-01

    "The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif

  1. On the Consistency of ZFn in ZFn+3

    Institute of Scientific and Technical Information of China (English)

    李旭华

    1993-01-01

    By restricting the common replacement axiom schema of ZF to ∑n-formulae, Professor Zhang Jinwen constructed a series of subsystems of Zermelo-Fraenkel set theory ZF, which he called ZFn. Zhao Xishun showed that the consistency of ZFn can be deduced from ZF. Professor Zhang Jinwen raised the question of whether the consistency of ZFn can be deduced from ZFn+m(n) for some m(n) ≥ 1. In this paper, we give a positive solution to Professor Zhang's problem. Moreover, we show that the consistency of ZFn can be deduced from ZFn+3.

  2. Model Checking Data Consistency for Cache Coherence Protocols

    Institute of Scientific and Technical Information of China (English)

    Hong Pan; Hui-Min Lin; Yi Lv

    2006-01-01

    A method for automatic verification of cache coherence protocols is presented, in which cache coherence protocols are modeled as concurrent value-passing processes, and control and data consistency requirements are described as formulas in first-order μ-calculus. A model checker is employed to check if the protocol under investigation satisfies the required properties. Using this method, a data consistency error has been revealed in a well-known cache coherence protocol. The error has been corrected, and the revised protocol has been shown free from data consistency errors for any data domain size, by appealing to the data independence technique.
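
    The core model-checking idea can be illustrated on a much smaller scale than the paper's value-passing processes: exhaustively explore every reachable state of a toy two-cache protocol and assert a coherence invariant in each (this toy protocol and invariant are our own illustration, not the protocol analysed in the paper):

```python
from collections import deque

def step(state, actor):
    """Successor states when cache `actor` performs a read or a write."""
    other = 1 - actor
    # Read: actor moves to S(hared); a M(odified) peer is downgraded to S.
    t = list(state)
    t[actor] = 'S'
    if t[other] == 'M':
        t[other] = 'S'
    yield tuple(t)
    # Write: actor moves to M; the peer is I(nvalidated).
    t = list(state)
    t[actor] = 'M'
    t[other] = 'I'
    yield tuple(t)

def check():
    """BFS over all reachable states, asserting the coherence invariant."""
    init = ('I', 'I')
    seen, queue = {init}, deque([init])
    while queue:
        state = queue.popleft()
        # Invariant: at most one M, and never M alongside S.
        assert state.count('M') <= 1 and not ('M' in state and 'S' in state), state
        for actor in (0, 1):
            for nxt in step(state, actor):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return len(seen)

print(check())  # number of reachable states explored
```

    Real checkers add data values to the states (hence the paper's appeal to data independence, which lets a small data domain stand in for all domain sizes).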

  3. Consistency of assertive, aggressive, and submissive behavior for children.

    Science.gov (United States)

    Deluty, R H

    1985-10-01

    The interpersonal behavior of 50 third- through fifth-grade children was assessed over an 8-month period in a wide variety of naturally occurring school activities. The consistency of the children's behavior was found to vary as a function of the child's sex, the class of behavior examined, and the similarity/dissimilarity of the contexts in which the behaviors occurred. Boys demonstrated remarkable consistency in their aggressive expression; 46 of 105 intercorrelations for the aggressiveness dimensions were statistically significant. In general, the consistency of assertive behavior for both boys and girls was unexpectedly high.

  4. Transport with Astra in TJ-II; Transporte con Astra en TJ-II

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Bruna, D.; Castejon, F.; Fontdecaba, J. M.

    2004-07-01

    This report describes the adaptation of the numerical transport shell ASTRA for performing plasma calculations in the TJ-II stellarator device. Firstly, an approximation to the TJ-II geometry is made and a simple transport model is shared with two other codes (PROCTR, PRETOR-Stellarator) in order to compare these codes with ASTRA. Two facets of ASTRA as a calculation tool for TJ-II plasmas are then provided: interpretative and predictive transport. The first consists in estimating the transport coefficients from real experimental data, these being taken from three TJ-II discharges. The predictive facet is illustrated using a model that is able to include self-consistently the dynamics of transport barriers. The report includes this model, written in the ASTRA programming language, to illustrate the use of ASTRA. (Author) 26 refs.

  5. [C II] and [N II] from dense ionized regions in the Galaxy

    CERN Document Server

    Langer, W D; Pineda, J L

    2016-01-01

    The interstellar medium (ISM) consists of highly ionized and neutral atomic, as well as molecular, components. Knowledge of their distribution is important for tracing the structure and lifecycle of the ISM. Here we determine the properties of the highly ionized and neutral weakly ionized gas in the Galaxy traced by the fine-structure lines of ionized nitrogen, [N II], and ionized carbon, [C II]. To analyze the ionized ISM we utilize [C II] 158 micron and [N II] 205 micron lines taken with the high spectral resolution Heterodyne Instrument in the Far-Infrared (HIFI) on the Herschel Space Observatory along ten lines of sight towards the inner Galaxy. [N II] emission can be used to estimate the contribution of the highly ionized gas to the [C II] emission and separate the highly ionized and weakly ionized neutral gas. We find that [N II] has strong emission in distinct spectral features along all lines of sight associated with strong [C II] emission. The [N II] arises from moderate density extended HII regions ...

  6. On exact triangles consisting of stable vector bundles on tori

    CERN Document Server

    Kobayashi, Kazushi

    2016-01-01

    In this paper, we consider the exact triangles consisting of stable holomorphic vector bundles on one-dimensional complex tori, and discuss their relations with the corresponding Fukaya category via the homological mirror symmetry.

  7. A new insight into the consistency of smoothed particle hydrodynamics

    CERN Document Server

    Sigalotti, Leonardo Di G; Klapp, Jaime; Vargas, Carlos A; Campos, Kilver

    2016-01-01

    In this paper the problem of consistency of smoothed particle hydrodynamics (SPH) is solved. A novel error analysis is developed in $n$-dimensional space using the Poisson summation formula, which enables the treatment of the kernel and particle approximation errors in combined fashion. New consistency integral relations are derived for the particle approximation which correspond to the cosine Fourier transform of the classically known consistency conditions for the kernel approximation. The functional dependence of the error bounds on the SPH interpolation parameters, namely the smoothing length $h$ and the number of particles within the kernel support ${\cal N}$, is demonstrated explicitly, from which consistency conditions are seen to follow naturally. As ${\cal N}\to\infty$, the particle approximation converges to the kernel approximation independently of $h$ provided that the particle mass scales with $h$ as $m\propto h^{\beta}$, with $\beta > n$. This implies that as $h\to 0$, the joint limit $m\to 0$, $...
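
    The "classically known consistency conditions for the kernel approximation" mentioned above can be checked numerically. A small sketch (our own illustration, not the paper's Poisson-summation analysis) verifies by quadrature that the standard 1-D cubic spline kernel has zeroth moment 1 and first moment 0, making the kernel approximation second-order accurate in h:

```python
def w(q):
    """Shape of the standard 1-D cubic spline kernel; W(x) = w(|x|/h)/h."""
    if q < 1.0:
        return (2.0 / 3.0) * (1.0 - 1.5 * q * q + 0.75 * q ** 3)
    if q < 2.0:
        return (2.0 / 3.0) * 0.25 * (2.0 - q) ** 3
    return 0.0

def moment(k, h=0.5, steps=8000):
    """Trapezoidal estimate of the k-th moment of W over its support [-2h, 2h]."""
    a = -2.0 * h
    dx = 4.0 * h / steps
    total = 0.0
    for i in range(steps + 1):
        x = a + i * dx
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * (x ** k) * w(abs(x) / h) / h
    return total * dx

# Zeroth-order consistency: M0 = 1; first-order consistency: M1 = 0.
print(f"M0 = {moment(0):.6f}, M1 negligible: {abs(moment(1)) < 1e-9}")
```

    The non-vanishing second moment is what sets the O(h^2) leading error of the kernel approximation; the paper's contribution is the analogous analysis for the discrete particle sums.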

  8. Island of Stability for Consistent Deformations of Einstein's Gravity

    DEFF Research Database (Denmark)

    Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan;

    2012-01-01

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...

  9. Consistency of Trend Break Point Estimator with Underspecified Break Number

    Directory of Open Access Journals (Sweden)

    Jingjing Yang

    2017-01-01

    Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
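
    For context, the break point estimator under study is the argmin of the sum of squared residuals over candidate break dates. A minimal well-specified sketch (our own synthetic, noise-free data; one break, correctly specified) in which the estimator does recover the true break:

```python
def solve3(A, b):
    """Gauss-Jordan solve of a 3x3 linear system with partial pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * x for a, x in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]

def ssr_at(tau, y):
    """SSR of the OLS fit y_t ~ 1 + t + (t - tau)_+, a trend kink at tau."""
    T = len(y)
    X = [[1.0, float(t), float(max(t - tau, 0))] for t in range(T)]
    AtA = [[sum(X[t][i] * X[t][j] for t in range(T)) for j in range(3)]
           for i in range(3)]
    Aty = [sum(X[t][i] * y[t] for t in range(T)) for i in range(3)]
    beta = solve3(AtA, Aty)
    return sum((y[t] - sum(b * x for b, x in zip(beta, X[t]))) ** 2
               for t in range(T))

true_break = 60
y = [1.0 + 0.2 * t + (0.5 * (t - true_break) if t > true_break else 0.0)
     for t in range(100)]
est = min(range(10, 90), key=lambda tau: ssr_at(tau, y))
print(est)  # recovers the true break date, 60
```

    The paper's point is that this convergence cannot be taken for granted: with two trend breaks but only one candidate kink in the model, the analogous argmin estimator fails to converge to either true break date.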

  10. Consistency in experiments on multistable driven delay systems

    Science.gov (United States)

    Oliver, Neus; Larger, Laurent; Fischer, Ingo

    2016-10-01

    We investigate the consistency properties in the responses of a nonlinear delay optoelectronic intensity oscillator subject to different drives, in particular, harmonic and self-generated waveforms. This system, an implementation of the Ikeda oscillator, is operating in a closed-loop configuration, exhibiting its autonomous dynamics while the drive signals are additionally introduced. Applying the same drive multiple times, we compare the dynamical responses of the optoelectronic oscillator and quantify the degree of consistency among them via their correlation. Our results show that consistency is not restricted to conditions close to the first Hopf bifurcation but can be found in a broad range of dynamical regimes, even in the presence of multistability. Finally, we discuss the dependence of consistency on the nature of the drive signal.
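
    The correlation-based consistency measure described above can be sketched with a toy Ikeda-like map standing in for the optoelectronic oscillator (the map, parameters and noise levels are invented for illustration): drive the system twice with the identical waveform but independent noise, then correlate the two responses.

```python
import math
import random

def respond(drive, noise_level, seed, beta=0.9, x0=0.3):
    """Response of a noisy Ikeda-like map x -> beta*sin^2(x + d) to a drive."""
    rng = random.Random(seed)
    x, out = x0, []
    for d in drive:
        x = beta * math.sin(x + d) ** 2 + rng.gauss(0.0, noise_level)
        out.append(x)
    return out[200:]  # discard the transient

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

rng = random.Random(1)
drive = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(2000)]
# Same drive, independent noise realisations: high correlation = consistency.
r1 = respond(drive, noise_level=0.01, seed=2)
r2 = respond(drive, noise_level=0.01, seed=3)
print(pearson(r1, r2) > 0.9)
```

    Repeating this across operating points maps out where the system responds consistently, which is how the broad consistency regions reported above are quantified.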

  11. Energy-Consistent Multiscale Algorithms for Granular Flows

    Science.gov (United States)

    2014-08-07

    Final report for the AFOSR Young Investigator Program (YIP) project "Energy-Consistent Multiscale Algorithms for Granular Flows" (performance period 01-MAY-2011 to 30-APR-2014; report dated 30-07-2014). The report documents the achievements of the project: the development of multiscale, energy-consistent algorithms to simulate and capture flow phenomena in granular materials.

  12. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    OpenAIRE

    Hong Li; Haifei Zhuang; Weihao Geng

    2012-01-01

    The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with the research on the flow inside and shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the sh...

  13. Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images

    OpenAIRE

    Li, Gang; Nie, Jingxin; Shen, Dinggang

    2011-01-01

    Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying subtle morphological changes of the cerebral cortex. This paper presents a new deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstructed ...

  14. Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images

    OpenAIRE

    Li, Gang; Nie, Jingxin; Wu, Guorong; Wang, Yaping; Shen, Dinggang

    2011-01-01

    Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying subtle longitudinal changes of the cerebral cortex. This paper presents a novel deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal brain MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstr...

  15. On the consistency of coset space dimensional reduction

    Energy Technology Data Exchange (ETDEWEB)

    Chatzistavrakidis, A. [Institute of Nuclear Physics, NCSR DEMOKRITOS, GR-15310 Athens (Greece); Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: cthan@mail.ntua.gr; Manousselis, P. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece); Department of Engineering Sciences, University of Patras, GR-26110 Patras (Greece)], E-mail: pman@central.ntua.gr; Prezas, N. [CERN PH-TH, 1211 Geneva (Switzerland)], E-mail: nikolaos.prezas@cern.ch; Zoupanos, G. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: george.zoupanos@cern.ch

    2007-11-15

    In this Letter we consider higher-dimensional Yang-Mills theories and examine their consistent coset space dimensional reduction. Utilizing a suitable ansatz and imposing a simple set of constraints we determine the four-dimensional gauge theory obtained from the reduction of both the higher-dimensional Lagrangian and the corresponding equations of motion. The two reductions yield equivalent results and hence they constitute an example of a consistent truncation.

  16. Truncations driven by constraints: consistency and conditions for correct upliftings

    CERN Document Server

    Pons, J M; Pons, Josep M.; Talavera, Pere

    2004-01-01

We discuss the mechanism of truncations driven by the imposition of constraints. We show how the consistency of such truncations is controlled, and give general theorems that establish conditions for the correct uplifting of solutions. We show in particular examples how one can obtain correct upliftings from 7d supergravities to 10d type IIB supergravity, even in cases where the truncation is not by itself initially consistent.

  17. S Matrix Proof of Consistency Condition Derived from Mixed Anomaly

    Science.gov (United States)

    Bhansali, Vineer

For a confining quantum field theory with conserved current J and stress tensor T, the ⟨JJJ⟩ and ⟨TJJ⟩ anomalies computed in terms of elementary quanta must be precisely equal to the same anomalies computed in terms of the exact physical spectrum if the conservation law corresponding to J is unbroken. These conditions strongly constrain the allowed representations of the low-energy spectrum. We present a proof of the latter consistency condition based on the proof by Coleman and Grossman of the former consistency condition.

  18. Consistent histories, quantum truth functionals, and hidden variables

    Science.gov (United States)

    Griffiths, Robert B.

    2000-01-01

    A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of `truth functionals' defined on a Boolean algebra of classical or quantum properties.

  19. Consistent histories, quantum truth functionals, and hidden variables

    CERN Document Server

    Griffiths, R B

    1999-01-01

A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of "truth functionals" defined on a Boolean algebra of classical or quantum properties.

  20. Behavioural consistency and life history of Rana dalmatina tadpoles.

    Science.gov (United States)

    Urszán, Tamás János; Török, János; Hettyey, Attila; Garamszegi, László Zsolt; Herczeg, Gábor

    2015-05-01

The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across two or more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, we tested whether behavioural consistency and POLS could be detected during the early ontogenesis of this amphibian. We targeted two ontogenetic stages and measured activity, exploration and risk-taking in a common garden experiment, assessing both individual behavioural type and intra-individual behavioural variation. We observed that activity was consistent in all tadpoles, exploration only became consistent with advancing age and risk-taking only became consistent in tadpoles that had been tested, and thus disturbed, earlier. Only previously tested tadpoles showed trends indicative of behavioural syndromes. We found an activity-age at metamorphosis POLS in the previously untested tadpoles irrespective of age. Relative growth rate correlated positively with the intra-individual variation of activity of the previously untested older tadpoles. In previously tested older tadpoles, intra-individual variation of exploration correlated negatively and intra-individual variation of risk-taking correlated positively with relative growth rate. We provide evidence for behavioural consistency and POLS in predator- and conspecific-naive tadpoles. Intra-individual behavioural variation was also correlated with life history, suggesting its relevance for the POLS theory. The strong effect of moderate disturbance related to standard behavioural testing on later behaviour draws attention to the pitfalls embedded in repeated testing.

  1. TWO APPROACHES TO IMPROVING THE CONSISTENCY OF COMPLEMENTARY JUDGEMENT MATRIX

    Institute of Scientific and Technical Information of China (English)

    Xu Zeshui

    2002-01-01

Using the transformation relations between complementary judgement matrices and reciprocal judgement matrices, this paper proposes two methods for improving the consistency of a complementary judgement matrix and gives two simple, practical iterative algorithms. These two algorithms are easy to implement on a computer, and the modified complementary judgement matrices retain most of the information contained in the original matrix. Thus the methods supplement and develop the theory and methodology for improving the consistency of complementary judgement matrices.
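The transformation relations the abstract builds on can be sketched in a few lines. This is our own minimal illustration, not the authors' iterative algorithms: the transform r_ij = a_ij / a_ji for a complementary matrix with entries in (0, 1) is one standard choice, and the helper names are ours.

```python
def complementary_to_reciprocal(a):
    """Map a complementary judgement matrix (a[i][j] + a[j][i] == 1, entries
    in (0, 1)) to a reciprocal judgement matrix via r_ij = a_ij / a_ji."""
    n = len(a)
    return [[a[i][j] / a[j][i] for j in range(n)] for i in range(n)]

def is_reciprocal(r, tol=1e-9):
    """Check the reciprocal property r_ij * r_ji == 1 for all entries."""
    n = len(r)
    return all(abs(r[i][j] * r[j][i] - 1.0) < tol
               for i in range(n) for j in range(n))
```

Consistency-improvement schemes of the kind described typically iterate between the two representations, adjusting entries while preserving as much of the original preference information as possible.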

  2. Autonomous Navigation with Constrained Consistency for C-Ranger

    Directory of Open Access Journals (Sweden)

    Shujing Zhang

    2014-06-01

Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their synthetic ability to carry out localization autonomously and concurrently build an environmental map, in other words, simultaneous localization and mapping (SLAM), is considered to be a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been largely ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
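The core idea of fixing the linearization point for a local period, rather than relinearizing at the latest estimate, can be illustrated with a toy one-dimensional EKF. This is a sketch under our own assumptions, not the C-Ranger implementation: the quadratic measurement model and all names are hypothetical.

```python
# Toy 1-D illustration of the "frozen linearization point" idea, with a
# nonlinear measurement model h(x) = x^2. Not the authors' code.

def ekf_update(x, P, z, R, x_lin):
    """One scalar EKF measurement update; the Jacobian H = dh/dx is
    evaluated at x_lin, which need not equal the current estimate x."""
    H = 2.0 * x_lin
    S = H * P * H + R            # innovation covariance
    K = P * H / S                # Kalman gain
    x_new = x + K * (z - x * x)  # innovation uses h() at the current estimate
    P_new = (1.0 - K * H) * P
    return x_new, P_new

def run_filter(zs, x0, P0, R, freeze=True):
    """Process measurements zs; with freeze=True the linearization point is
    fixed at the start of the (single) local period, LC-EKF style."""
    x, P = x0, P0
    x_lin = x0
    for z in zs:
        x, P = ekf_update(x, P, z, R, x_lin if freeze else x)
    return x, P
```

In the full SLAM setting the same device is applied to the landmark estimates, and the payoff is not convergence speed but the consistency of the reported covariance.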

  3. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, the contribution of mood, and the type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previously published normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced fewer episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and the contribution of persisting depressive symptoms.

  5. Consistency analysis of accelerated degradation mechanism based on gray theory

    Institute of Scientific and Technical Information of China (English)

    Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang

    2014-01-01

A fundamental premise of accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated test. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure for applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement test is conducted to demonstrate and validate the proposed method.
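The gray prediction models named in the abstract are extensions of the classic GM(1,1) gray model. A minimal sketch of that base model follows; this is our own illustration, and the paper's gray dynamic and NIED models refine it in ways the abstract does not specify.

```python
import math

def gm11_fit(x0):
    """Least-squares fit of the classic GM(1,1) gray model to a positive
    series x0: solves x0[k] + a*z1[k] = b, where z1 is the background value
    of the accumulated (1-AGO) series."""
    x1 = [sum(x0[:i + 1]) for i in range(len(x0))]               # 1-AGO series
    z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, len(x0))]  # background values
    y = x0[1:]
    n = len(z1)
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    sy = sum(y)
    szy = sum(z * v for z, v in zip(z1, y))
    det = n * szz - sz * sz
    a = (sz * sy - n * szy) / det   # development coefficient
    b = (szz * sy - sz * szy) / det  # gray input
    return a, b

def gm11_predict(x0, a, b, steps):
    """Forecast the next `steps` values by differencing the fitted 1-AGO curve."""
    f = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
    n = len(x0)
    return [f(k) - f(k - 1) for k in range(n, n + steps)]
```

In a consistency analysis, a model of this kind fitted at one stress level can be compared against degradation data collected at another; a large divergence signals a change of failure mechanism.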

  6. Self-consistency in Bicultural Persons: Dialectical Self-beliefs Mediate the Relation between Identity Integration and Self-consistency

    Science.gov (United States)

    Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.

    2017-01-01

    Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052

  7. Structural basis of transcription initiation by RNA polymerase II.

    OpenAIRE

    Sainsbury, S.; Bernecky, C.; Cramer, P

    2015-01-01

    Transcription of eukaryotic protein-coding genes commences with the assembly of a conserved initiation complex, which consists of RNA polymerase II (Pol II) and the general transcription factors, at promoter DNA. After two decades of research, the structural basis of transcription initiation is emerging. Crystal structures of many components of the initiation complex have been resolved, and structural information on Pol II complexes with general transcription factors has recently been obtaine...

  8. Does object view influence the scene consistency effect?

    Science.gov (United States)

    Sastyin, Gergo; Niimi, Ryosuke; Yokosawa, Kazuhiko

    2015-04-01

    Traditional research on the scene consistency effect only used clearly recognizable object stimuli to show mutually interactive context effects for both the object and background components on scene perception (Davenport & Potter in Psychological Science, 15, 559-564, 2004). However, in real environments, objects are viewed from multiple viewpoints, including an accidental, hard-to-recognize one. When the observers named target objects in scenes (Experiments 1a and 1b, object recognition task), we replicated the scene consistency effect (i.e., there was higher accuracy for the objects with consistent backgrounds). However, there was a significant interaction effect between consistency and object viewpoint, which indicated that the scene consistency effect was more important for identifying objects in the accidental view condition than in the canonical view condition. Therefore, the object recognition system may rely more on the scene context when the object is difficult to recognize. In Experiment 2, the observers identified the background (background recognition task) while the scene consistency and object views were manipulated. The results showed that object viewpoint had no effect, while the scene consistency effect was observed. More specifically, the canonical and accidental views both equally provided contextual information for scene perception. These findings suggested that the mechanism for conscious recognition of objects could be dissociated from the mechanism for visual analysis of object images that were part of a scene. The "context" that the object images provided may have been derived from its view-invariant, relatively low-level visual features (e.g., color), rather than its semantic information.

  9. Inter-laboratory consistency of gait analysis measurements.

    Science.gov (United States)

    Benedetti, M G; Merlo, A; Leardini, A

    2013-09-01

The dissemination of gait analysis as a clinical assessment tool requires the results to be consistent, irrespective of the laboratory. In this work a baseline assessment of between-site consistency for one healthy subject examined at 7 different laboratories is presented. Anthropometric and spatio-temporal parameters, pelvis and lower limb joint rotations, joint sagittal moments and powers, and ground reaction forces were compared. The consistency between laboratories was assessed by the median absolute deviation and maximum difference for single parameters, and by linear regression for curves. Twenty-one lab-to-lab comparisons were performed and averaged. Large differences were found between the characteristics of the laboratories (i.e. motion capture systems and protocols). Different values for the anthropometric parameters were found, with the largest variability for a pelvis measurement. The spatio-temporal parameters were in general consistent. Segment and joint kinematics consistency was in general high (R2>0.90), except for hip and knee joint rotations. The main difference among curves was a vertical shift associated with the corresponding value in the static position. The consistency between joint sagittal moments ranged from R2=0.90 at the ankle to R2=0.66 at the hip, the latter increasing when comparing separately laboratories using the same protocol. Pattern similarity was good for ankle power but not satisfactory for knee and hip power. The ground reaction force was found to be the most consistent, as expected. The differences found were in general lower than the established minimum detectable changes for gait kinematics and kinetics of healthy adults.
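The two consistency measures named in the abstract are standard statistics and easy to sketch: the median absolute deviation for single parameters, and the coefficient of determination from a simple linear regression for curves. These are our own minimal versions, not the study's code.

```python
def r_squared(y, x):
    """Coefficient of determination of a simple linear regression of y on x
    (equal-length sampled curves); insensitive to a pure vertical shift."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)

def median_absolute_deviation(values):
    """Median of the absolute deviations from the median."""
    def median(s):
        s = sorted(s)
        n = len(s)
        return s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    med = median(values)
    return median(abs(v - med) for v in values)
```

Note that R² between two curves is blind to the vertical offsets the study reports as the main between-lab difference, which is why the static-position value has to be inspected separately.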

  10. Astrometric Monitoring of the HR 8799 Planets: Orbit Constraints from Self-consistent Measurements

    Science.gov (United States)

    Konopacky, Q. M.; Marois, C.; Macintosh, B. A.; Galicher, R.; Barman, T. S.; Metchev, S. A.; Zuckerman, B.

    2016-08-01

We present new astrometric measurements from our ongoing monitoring campaign of the HR 8799 directly imaged planetary system. These new data points were obtained with NIRC2 on the W.M. Keck II 10 m telescope between 2009 and 2014. In addition, we present updated astrometry from previously published observations in 2007 and 2008. All data were reduced using the SOSIE algorithm, which accounts for systematic biases present in previously published observations. This allows us to construct a self-consistent data set derived entirely from NIRC2 data alone. From this data set, we detect acceleration for two of the planets (HR 8799b and e) at >3σ. We also assess possible orbital parameters for each of the four planets independently. We find no statistically significant difference in the allowed inclinations of the planets. Fitting the astrometry while forcing coplanarity also returns χ2 values consistent to within 1σ of the best-fit values, suggesting that if inclination offsets of ≲20° are present, they are not detectable with current data. Our orbital fits also favor low eccentricities, consistent with predictions from dynamical modeling. We also find period distributions consistent to within 1σ with a 1:2:4:8 resonance between all planets. This analysis demonstrates the importance of minimizing astrometric systematics when fitting for solutions to highly undersampled orbits.

  11. Hydrosol II Project; El Proyecto Hydrosol II

    Energy Technology Data Exchange (ETDEWEB)

    Lopez Martinez, A.

    2008-07-01

At present, energy production is based largely on the combustion of fossil fuels and is the main source of greenhouse gas emissions, and hence the main driver of the climate change affecting the planet. On a worldwide scale, the coupling of solar concentration systems with systems capable of dissociating water is considered, from both an energy and an economic standpoint, the most important long-term goal in the production of solar fuels, reducing the cost of hydrogen and ensuring practically zero carbon dioxide emissions. The Hydrosol II project has the largest pilot plant of its kind, and the Hydrosol II reactors will be capable of splitting the water molecule on the basis of thermochemical cycles at moderate temperatures. The Hydrosol II pilot plant is now a reality, located in the SSPS heliostat field of the Almeria Solar Platform. (Author)

  12. Intrafibrillar Mineral May be Absent in Dentinogenesis Imperfecta Type II (DI-II)

    Energy Technology Data Exchange (ETDEWEB)

    Pople, John A.

    2001-03-29

    High-resolution synchrotron radiation computed tomography (SRCT) and small angle x-ray scattering (SAXS) were performed on normal and dentinogenesis imperfecta type II (DI-II) teeth. Three normal and three DI-II human third molars were used in this study. The normal molars were unerupted and had intact enamel; donors were female and ranged in age from 18-21y. The DI-II specimens, which were also unerupted with intact enamel, came from a single female donor age 20y. SRCT showed that the mineral concentration was 33% lower on average in the DI-II dentin with respect to normal dentin. The SAXS spectra from normal dentin exhibited low-angle diffraction peaks at harmonics of 67.6 nm, consistent with nucleation and growth of the apatite phase within gaps in the collagen fibrils (intrafibrillar mineralization). In contrast, the low-angle peaks were almost nonexistent in the DI-II dentin. Crystallite thickness was independent of location in both DI-II and normal dentin, although the crystallites were significantly thicker in DI-II dentin (6.8 nm (s.d. = 0.5) vs 5.1 nm (s.d. = 0.6)). The shape factor of the crystallites, as determined by SAXS, showed a continuous progression in normal dentin from roughly one-dimensional (needle-like) near the pulp to two-dimensional (plate-like) near the dentin-enamel junction. The crystallites in DI-II dentin, on the other hand, remained needle-like throughout. The above observations are consistent with an absence of intrafibrillar mineral in DI-II dentin.

  14. Urotensin II in cardiovascular regulation

    Directory of Open Access Journals (Sweden)

    Fraser D Russell

    2008-08-01

Full Text Available Fraser D Russell, School of Health and Sport Sciences, Faculty of Science, Health and Education, University of the Sunshine Coast, Sippy Downs, Queensland, Australia. Abstract: Cardiovascular function is modulated by neuronal transmitters, circulating hormones, and factors that are released locally from tissues. Urotensin II (UII) is an 11-amino-acid peptide that stimulates its obligatory G protein-coupled urotensin II receptors (UT) to modulate cardiovascular function in humans and in other animal species, and has been implicated in both vasculoprotective and vasculopathic effects. For example, tissue and circulating concentrations of UII have been reported to increase in some studies involving patients with atherosclerosis, heart failure, hypertension, preeclampsia, diabetes, renal disease and liver disease, raising the possibility that the UT receptor system is involved in the development and/or progression of these conditions. Consistent with this hypothesis, administration of UT receptor antagonists to animal models of cardiovascular disease has revealed improvements in cardiovascular remodelling and hemodynamics. However, recent studies have questioned this contributory role of UII in disease, and have instead postulated a protective effect on the cardiovascular system. For example, high concentrations of circulating UII correlated with improved clinical outcomes in patients with renal disease or myocardial infarction. The purpose of this review is to consider the regulation of the cardiovascular system by UII, giving consideration to methodologies for measurement of plasma concentrations, sites of synthesis and triggers for release. Keywords: urotensin II, cardiovascular disease, heart failure, hypertension

  15. NOAA Next Generation Radar (NEXRAD) Level II Base Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Level II weather radar data collected from Next-Generation Radar (NEXRAD) stations located in the contiguous United States, Alaska, Hawaii,...

  16. Near-resonant absorption in the time-dependent self-consistent field and multiconfigurational self-consistent field approximations

    DEFF Research Database (Denmark)

    Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa;

    2001-01-01

    Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown...

  17. Self-consistent pseudopotentials in the thermodynamic limit. II. The state-dependent one-body field

    Science.gov (United States)

    Hernández, E. S.; Plastino, A.; Szybisz, L.

    1980-07-01

We explore the concept of the pseudopotential, introduced in a previous paper, within the context of the exact boundary condition for a system of fermions interacting through a pair-wise, hard-core potential, in the thermodynamic limit. We discuss several Ansätze for the Lagrange multipliers that allow the inclusion of the boundary conditions into the variational principle. It is found that under a given Ansatz, a dynamical, microscopic interpretation of the pseudopotential can be put forward. A comparison between this situation and the coherent approximation induced by the use of the correlation function is also presented. Keywords: hard-core interactions; boundary condition; variational principle; constrained Hartree-Fock problem; Ansatz; structure of the pseudopotential; state dependence.

  18. Self-consistent pseudopotentials in the thermodynamic limit. II. The state-dependent one-body field

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, E.S.; Plastino, A.; Szybisz, L.

    1980-07-01

    We explore the concept of the pseudopotential, introduced in a previous paper, within the context of the exact boundary condition for a system of fermions interacting through a pair-wise, hard-core potential, in the thermodynamic limit. We discuss several Ansatze for the Lagrange multipliers that allow the inclusion of the boundary conditions into the variational principle. It is found that under a given Ansatz, a dynamical, microscopic interpretation of the pseudopotential can be put forward. A comparison between this situation and the coherent approximation induced by the use of the correlation function is also presented.

  19. Burkina Faso - BRIGHT II

    Data.gov (United States)

    Millennium Challenge Corporation — Millennium Challenge Corporation hired Mathematica Policy Research to conduct an independent evaluation of the BRIGHT II program. The three main research questions...

  20. Martial arts striking hand peak acceleration, accuracy and consistency.

    Science.gov (United States)

    Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A

    2013-01-01

The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
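The accuracy and consistency estimates described above are simple geometric statistics over the strike impact points. A minimal sketch follows (our own helper names; the study's coordinate conventions may differ, and lower values mean better accuracy and better consistency):

```python
import math

def accuracy_and_consistency(strikes, target):
    """Accuracy: radial distance between the centroid of the strike points
    and the target. Consistency: root-mean-square distance of the strikes
    from their own centroid."""
    n = len(strikes)
    cx = sum(p[0] for p in strikes) / n
    cy = sum(p[1] for p in strikes) / n
    accuracy = math.hypot(cx - target[0], cy - target[1])
    consistency = math.sqrt(
        sum((px - cx) ** 2 + (py - cy) ** 2 for px, py in strikes) / n
    )
    return accuracy, consistency
```

Separating the two quantities this way distinguishes a systematic aiming bias (centroid offset) from scatter around the aim point, which is exactly the distinction the paper's correlations rely on.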

  1. Cognitive consistency and math-gender stereotypes in Singaporean children.

    Science.gov (United States)

    Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu

    2014-01-01

    In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metric for the failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  3. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi [Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics, Stanford University, Stanford, CA 94305 (United States); Busha, Michael T. [Institute for Theoretical Physics, University of Zurich, CH-8006 Zurich (Switzerland); Klypin, Anatoly A. [Astronomy Department, New Mexico State University, Las Cruces, NM 88003 (United States); Primack, Joel R., E-mail: behroozi@stanford.edu, E-mail: rwechsler@stanford.edu [Department of Physics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States)

    2013-01-20

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  4. Self-consistent generalized Langevin equation for colloidal mixtures.

    Science.gov (United States)

    Chávez-Rojo, Marco Antonio; Medina-Noyola, Magdaleno

    2005-09-01

    A self-consistent theory of collective and tracer diffusion in colloidal mixtures is presented. This theory is based on exact results for the partial intermediate scattering functions derived within the framework of the generalized Langevin equation formalism, plus a number of conceptually simple and sensible approximations. The first of these consists of a Vineyard-like approximation between collective and tracer diffusion, which writes the collective dynamics in terms of the memory function related to tracer diffusion. The second consists of interpolating this only unknown memory function between its two exact limits at small and large wave vectors; for this, a phenomenologically determined, but not arbitrary, interpolating function is introduced: a Lorentzian with its inflection point located at the first minimum of the partial static structure factor. The small wave-vector exact limit involves a time-dependent friction function, for which we take a general approximate result, previously derived within the generalized Langevin equation formalism. This general result expresses the time-dependent friction function in terms of the partial intermediate scattering functions, thus closing the system of equations into a fully self-consistent scheme. This extends to mixtures a recently proposed self-consistent theory developed for monodisperse suspensions [Yeomans-Reyna and Medina-Noyola, Phys. Rev. E 64, 066114 (2001)]. As an illustration of its quantitative accuracy, its application to a simple model of a binary dispersion in the absence of hydrodynamic interactions is reported.

  5. Pulsed laser photoacoustic monitoring of paper pulp consistency

    Science.gov (United States)

    Zhao, Zuomin; Törmänen, Matti; Myllylä, Risto

    2008-06-01

    This study involves measurements of pulp consistency in a cuvette and with an online apparatus, using a novel scattering photoacoustic (SPA) method. The theoretical aspects are described first. Then, several kinds of wood fiber suspensions with consistencies from 0.5% to 5% were studied in a cuvette. After that, a pilot online apparatus was built to measure suspensions with a fiber consistency lower than 1% and a filler content up to 3%. The results showed that although there were many fiber flocs in the cuvette, which strongly affected the measurement accuracy of the sample consistencies, the apparatus can sense fiber types with different optical and acoustic properties. The measurement accuracy can be greatly improved in the online apparatus by pumping suspension fluids in a circulating system to improve the suspension homogeneity. The results demonstrated that wood fibers cause larger attenuation of acoustic waves but fillers do not. On the other hand, fillers cause stronger scattering of the incident light. Therefore, our SPA apparatus has the potential to simultaneously determine fiber and filler fractions in pulp suspensions with consistencies up to 5%.

  6. Consistency of Scalar Potentials from Quantum de Sitter Space

    CERN Document Server

    Espinosa, José R; Trépanier, Maxime

    2015-01-01

    We derive constraints on the scalar potential of a quantum field theory in de Sitter space. The constraints, which we argue should be understood as consistency conditions for quantum field theories in dS space, originate from a consistent interpretation of quantum de Sitter space through its Coleman-De Luccia tunneling rate. Indeed, consistency of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom suggests the tunneling rates to vacua with negative cosmological constants be interpreted as Poincaré recurrences. Demanding the tunneling rate to be a Poincaré recurrence imposes two constraints, or consistency conditions, on the scalar potential. Although the exact consistency conditions depend on the shape of the scalar potential, generically they correspond to: the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger; and the fourth root of the vacuum energ...

  7. Gravitationally Consistent Halo Catalogs and Merger Trees for Precision Cosmology

    Science.gov (United States)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  8. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metric for the failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  9. A Dynamical Mechanism for Large Volumes with Consistent Couplings

    CERN Document Server

    Abel, Steven

    2016-01-01

    A mechanism for addressing the 'decompactification problem' is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft-terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is nett Bose-Fermi degeneracy in the massless sector. In such cases, because th...

  10. Consistent group selection in high-dimensional linear regression

    CERN Document Server

    Wei, Fengrong; 10.3150/10-BEJ252

    2010-01-01

    In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selects a model whose dimension is comparable with the underlying model with high probability and is estimation consistent. However, the group Lasso is, in general, not selection consistent and also tends to select groups that are not important in the model. To improve the selection results, we propose an adaptive group Lasso method which is a generalization of the adaptive Lasso and requires an initial estimator. We show that the adaptive group Lasso is consistent in group selection under certain conditions if the group Lasso is used as the initial estimator.
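    As a concrete illustration of the group Lasso and its adaptive variant described above, here is a minimal proximal-gradient sketch. The solver, step size, and group encoding are illustrative assumptions, not the authors' method:

    ```python
    import numpy as np

    def group_lasso(X, y, groups, lam, weights=None, n_iter=500):
        """Minimize 0.5*||y - X b||^2 + lam * sum_g w_g ||b_g||_2 by proximal
        gradient descent. `groups` is a list of index arrays; supplying `weights`
        (e.g. inverse group norms from an initial fit) gives the adaptive variant."""
        b = np.zeros(X.shape[1])
        weights = np.ones(len(groups)) if weights is None else weights
        step = 1.0 / np.linalg.norm(X, 2) ** 2      # 1/L, L = Lipschitz constant of the gradient
        for _ in range(n_iter):
            z = b - step * (X.T @ (X @ b - y))      # gradient step on the squared loss
            for g, w in zip(groups, weights):       # group-wise soft thresholding (prox)
                norm = np.linalg.norm(z[g])
                z[g] = 0.0 if norm <= step * lam * w else (1 - step * lam * w / norm) * z[g]
            b = z
        return b

    # Toy data: only the first of two groups carries signal.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 4))
    y = 2.0 * X[:, 0] - 1.5 * X[:, 1]
    b = group_lasso(X, y, groups=[np.array([0, 1]), np.array([2, 3])], lam=5.0)
    ```

    Because the penalty acts on whole-group norms, the irrelevant second group is driven to exactly zero while the first group's coefficients stay near (2, -1.5), up to the shrinkage induced by `lam`.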

  11. Lightness constancy through transparency: internal consistency in layered surface representations.

    Science.gov (United States)

    Singh, Manish

    2004-01-01

    Asymmetric lightness matching was employed to measure how the visual system assigns lightness to surface patches seen through partially-transmissive surfaces. Observers adjusted the luminance of a comparison patch seen through transparency, in order to match the lightness of a standard patch seen in plain view. Plots of matched-to-standard luminance were linear, and their slopes were consistent with Metelli's alpha. A control experiment confirmed that these matches were indeed transparency based. Consistent with recent results, however, when observers directly matched the transmittance of transparent surfaces, their matches deviated strongly and systematically from Metelli's alpha. Although the two sets of results appear to be contradictory, formal analysis reveals a deeper mutual consistency in the representation of the two layers. A ratio-of-contrasts model is shown to explain both the success of Metelli's model in predicting lightness through transparency, and its failure to predict perceived transmittance--and hence is seen to play the primary role in perceptual transparency.

  12. A Consistent Semantics of Self-Adjusting Computation

    CERN Document Server

    Acar, Umut A; Donham, Jacob

    2011-01-01

    This paper presents a semantics of self-adjusting computation and proves that the semantics are correct and consistent. The semantics integrate change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a computation via memoization triggers a change propagation that adjusts the reused computation to reflect the mutated memory. Since the semantics integrate memoization and change propagation, they involve both non-determinism (due to memoization) and mutation (due to change propagation). Our consistency theorem states that the non-determinism is not harmful: any two evaluations of the same program starting at the same state yield the same result. Our correctness theorem states that mutation is not harmful: self-adjusting programs are consistent with purely functional programming. We formalize the semantics and their meta-theory in the LF logical framework and machine-check our proofs using Twelf.
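    The interplay of memoization and change propagation described above can be illustrated with a toy dependency graph, in which a mutable cell invalidates the cached computations that read it. This is a didactic sketch under simplified assumptions, not the paper's LF-formalized semantics:

    ```python
    class Mod:
        """A mutable input cell; writing to it triggers change propagation."""
        def __init__(self, value):
            self.value = value
            self.readers = set()          # thunks that have read this cell

        def read(self, thunk):
            self.readers.add(thunk)       # record the dependency
            return self.value

        def write(self, value):
            if value != self.value:       # mutation: invalidate dependent thunks
                self.value = value
                for t in list(self.readers):
                    t.invalidate()

    class Thunk:
        """A memoized computation, re-evaluated only when an input changed."""
        def __init__(self, fn):
            self.fn, self.cache, self.valid = fn, None, False

        def invalidate(self):
            self.valid = False

        def force(self):
            if not self.valid:            # memoization: reuse unless invalidated
                self.cache = self.fn(self)
                self.valid = True
            return self.cache

    a, b = Mod(1), Mod(2)
    total = Thunk(lambda t: a.read(t) + b.read(t))
    first = total.force()                 # computes 1 + 2
    a.write(10)                           # change propagation invalidates `total`
    second = total.force()                # recomputes: 10 + 2
    ```

    A second `force()` without an intervening `write()` would return the cached value without recomputation, which is the reuse-under-mutation behavior the consistency theorem reasons about.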

  13. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

    Full Text Available The aim of this study was to identify the problems and determine the conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and with inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves a technocratic system, that is, both top-down and bottom-up participation. Both must be balanced, and should neither overlap nor dominate each other. Keywords: regional, development, planning, consistency, reconciliation

  14. The consistency approach for the quality control of vaccines.

    Science.gov (United States)

    Hendriksen, Coenraad; Arciniega, Juan L; Bruckner, Lukas; Chevalier, Michel; Coppens, Emmanuelle; Descamps, Johan; Duchêne, Michel; Dusek, David Michael; Halder, Marlies; Kreeftenberg, Hans; Maes, Alexandrine; Redhead, Keith; Ravetkar, Satish D; Spieser, Jean-Marc; Swam, Hanny

    2008-01-01

    Current lot release testing of conventional vaccines emphasizes quality control of the final product and is characterized by its extensive use of laboratory animals. This report, which is based on the outcome of an ECVAM (European Centre for Validation of Alternative Methods, Institute for Health and Consumer Protection, European Commission Joint Research Centre, Ispra, Italy) workshop, discusses the concept of consistency testing as an alternative approach for lot release testing. The consistency approach for the routine release of vaccines is based upon the principle that the quality of vaccines is a consequence of a quality system and of consistent production of lots with similar characteristics to those lots that have been shown to be safe and effective in humans or the target species. The report indicates why and under which circumstances this approach can be applied, the role of the different stakeholders, and the need for international harmonization. It also gives recommendations for its implementation.

  15. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated for the applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges...

  16. Model-Consistent Sparse Estimation through the Bootstrap

    CERN Document Server

    Bach, Francis

    2009-01-01

    We consider the least-squares linear regression problem with regularization by the $\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
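    The bootstrap-intersection idea behind the Bolasso can be sketched in a few lines: fit the Lasso on bootstrap resamples of the rows and keep only the variables selected in every run. The plain ISTA Lasso solver and all parameter choices here are illustrative assumptions for self-containment:

    ```python
    import numpy as np

    def lasso_ista(X, y, lam, n_iter=400):
        """ISTA iteration for 0.5*||y - X b||^2 + lam*||b||_1."""
        step = 1.0 / np.linalg.norm(X, 2) ** 2
        b = np.zeros(X.shape[1])
        for _ in range(n_iter):
            z = b - step * (X.T @ (X @ b - y))
            b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
        return b

    def bolasso_support(X, y, lam, n_boot=16, seed=0):
        """Intersect the Lasso supports over bootstrap resamples of the rows."""
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        support = np.ones(X.shape[1], dtype=bool)
        for _ in range(n_boot):
            idx = rng.integers(0, n, size=n)          # bootstrap resample
            b = lasso_ista(X[idx], y[idx], lam)
            support &= np.abs(b) > 1e-6               # keep only always-selected variables
        return support

    # Toy data: variables 0 and 1 are relevant, 2-4 are noise.
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 5))
    y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.standard_normal(200)
    sup = bolasso_support(X, y, lam=10.0)
    ```

    Intersecting supports exploits the asymmetry stated in the abstract: relevant variables are selected in every bootstrap run with probability tending to one, while each irrelevant variable is dropped in at least some runs.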

  17. Consistency of the group Lasso and multiple kernel learning

    CERN Document Server

    Bach, Francis

    2007-01-01

    We consider the least-squares regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it is commonly referred to as the Lasso. In this paper, we study the asymptotic model consistency of the group Lasso. We derive necessary and sufficient conditions for the consistency of the group Lasso under practical assumptions, such as model misspecification. When the linear predictors and Euclidean norms are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for nonlinear variable selection. Using tools from functional analysis, and in particular covariance operators, we extend the consistency results to this infinite-dimensional case and also propose an adaptive scheme to obt...

  18. The consistent histories approach to loop quantum cosmology

    CERN Document Server

    Craig, David A

    2016-01-01

    We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on "measurements" to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories -- as determined by the system's decoherence functional -- that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with ...

  19. One-particle-irreducible consistency relations for cosmological perturbations

    CERN Document Server

    Goldberger, Walter D; Nicolis, Alberto

    2013-01-01

    We derive consistency relations for correlators of scalar cosmological perturbations which hold in the "squeezed limit" in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle-irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory, and do not rely on approximate de Sitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at horizon crossing). The consistency relations apply model-independently to cosmological scenarios where the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow-roll limit, we verify that our consistency relations hold more generally, for instance in ghost condensate models in flat space. We comment on possible...

  20. Multiscale Parameter Regionalization for consistent global water resources modelling

    Science.gov (United States)

    Wanders, Niko; Wood, Eric; Pan, Ming; Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Sutanudjaja, Edwin; van Beek, Rens; Bierkens, Marc F. P.

    2017-04-01

    Due to an increasing demand for high- and hyper-resolution water resources information, it has become increasingly important to ensure consistency in model simulations across scales. This consistency can be ensured by a scale-independent parameterization of the land surface processes, even after calibration of the water resource model. Here, we use the Multiscale Parameter Regionalization technique (MPR, Samaniego et al. 2010, WRR) to allow for a novel, spatially consistent, scale-independent parameterization of the global water resource model PCR-GLOBWB. The implementation of MPR in PCR-GLOBWB allows for calibration at coarse resolutions and subsequent parameter transfer to the hyper-resolution. In this study, the model was calibrated at 50 km resolution over Europe and validation was carried out at resolutions of 50 km, 10 km and 1 km. MPR allows for a direct transfer of the calibrated transfer-function parameters across scales, and we find that we can maintain consistent land-atmosphere fluxes across scales. Here we focus on the 2003 European drought and show that the new parameterization allows for high-resolution calibrated simulations of water resources during the drought. For example, we find a reduction from 29% to 9.4% in the percentile difference in the annual evaporative flux across scales when compared against default simulations. Soil moisture errors are reduced from 25% to 6.9%, clearly indicating the benefits of the MPR implementation. This new parameterization allows us to show more spatial detail in water resources simulations that are consistent across scales, and it also allows validation of discharge for smaller catchments, even with calibrations at a coarse 50 km resolution. The implementation of MPR allows for novel high-resolution calibrated simulations of a global water resources model, providing calibrated high-resolution model simulations with transferred parameter sets from coarse resolutions. The applied methodology can be transferred to other

  1. Neighborhood consistency in mental arithmetic: Behavioral and ERP evidence

    Directory of Open Access Journals (Sweden)

    Verguts Tom

    2007-12-01

    Full Text Available Abstract Background: Recent cognitive and computational models (e.g. the Interacting Neighbors Model) state that in simple multiplication the decade and unit digits of the candidate answers (including the correct result) are represented separately. Thus, these models challenge holistic views of number representation as well as traditional accounts of the classical problem size effect in simple arithmetic (i.e. the finding that large problems are answered more slowly and less accurately than small problems). Empirical data supporting this view are still scarce. Methods: Data of 24 participants who performed a multiplication verification task with Arabic digits (e.g. 8 × 4 = 36: true or false?) are reported. Behavioral (i.e. RT and errors) and EEG (i.e. ERP) measures were recorded in parallel. Results: We provide evidence for neighborhood-consistency effects in the verification of simple multiplication problems (e.g. 8 × 4). Behaviorally, we find that decade-consistent lures, which share their decade digit with the correct result (e.g. 36), are harder to reject than matched inconsistent lures, which differ in both digits from the correct result (e.g. 28). This neighborhood consistency effect in product verification is similar to recent observations in the production of multiplication results. With respect to event-related potentials, we find significant differences for consistent compared to inconsistent lures in the N400 (increased negativity) and the Late Positive Component (reduced positivity). In this respect, consistency effects in our paradigm resemble lexico-semantic effects earlier found in simple arithmetic and in orthographic input processing. Conclusion: Our data suggest that neighborhood consistency effects in simple multiplication stem at least partly from central ('lexico-semantic') stages of processing. These results are compatible with current models on the representation of simple multiplication facts, in particular with the Interacting Neighbors Model.

  2. Stiffnites. Part II

    Directory of Open Access Journals (Sweden)

    Maria Teresa Pareschi

    2011-06-01

    Full Text Available

    The dynamics of a stiffnite are here inferred. A stiffnite is a sheet-shaped, gravity-driven submarine sediment flow, with a fabric made up of marine ooze. To infer stiffnite dynamics, order-of-magnitude estimations are used. Field deposits and experiments on materials taken from the literature are also used. Stiffnites can be tens or hundreds of kilometers wide, and a few centimeters/meters thick. They move on the sea slopes over hundreds of kilometers, reaching submarine velocities as high as 100 m/s. Hard grain friction favors grain fragmentation and the formation of triboelectrically electrified particles and triboplasma (i.e., ions + electrons). Marine lipids favor isolation of electrical charges. At first, two basic assumptions are introduced, and checked a posteriori: (a) in a flowing stiffnite, magnetic dipole moments develop, with the magnetization proportional to the shear rate (I have named those dipoles Ambigua); (b) Ambigua are 'vertically frozen' along stiffnite streamlines. From (a) and (b), it follows that: (i) Ambigua create a magnetic field (at peak, >1 T); (ii) Lorentz forces sort stiffnite particles into two superimposed sheets: the lower sheet, L+, has a sandy granulometry and a net positive electrical charge density; the upper sheet, L–, has a silty muddy granulometry and a net negative electrical charge density, with the grains of sheet L– becoming finer upwards; (iii) Faraday forces push ferromagnetic grains towards the base of a stiffnite, so that a peak of magnetic susceptibility characterizes a stiffnite deposit; (iv) stiffnites harden considerably during their motion, due to magnetic confinement. Stiffnite deposits and inferred stiffnite characteristics are compatible with a stable flow behavior against bending, pinch, or other macro instabilities. In the present report, a consistent hypothesis about the nature of Ambigua is provided.

  3. Agent-Based Context Consistency Management in Smart Space Environments

    Science.gov (United States)

    Jih, Wan-Rong; Hsu, Jane Yung-Jen; Chang, Han-Wen

    Context-aware systems in smart space environments must be aware of the context of their surroundings and adapt to changes in highly dynamic environments. Data management of contextual information differs from traditional approaches because contextual information is dynamic, transient, and fallible in nature. Consequently, the capability to detect context inconsistencies and to maintain consistent contextual information are two key issues for context management. We propose an ontology-based model for representing, deducing, and managing consistent contextual information. In addition, we use ontology reasoning to detect and resolve context inconsistency problems, which we illustrate with a Smart Alarm Clock scenario.

  4. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component and fairly well for the γ component, except for cooling times longer than 4000 s. (author)

  5. The consistency service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2011-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  6. The Consistency Service of the ATLAS Distributed Data Management system

    CERN Document Server

    Serfon, C; The ATLAS collaboration

    2010-01-01

    With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.

  7. Towards consistent nuclear models and comprehensive nuclear data evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Bouland, O [Los Alamos National Laboratory; Hale, G M [Los Alamos National Laboratory; Lynn, J E [Los Alamos National Laboratory; Talou, P [Los Alamos National Laboratory; Bernard, D [FRANCE; Litaize, O [FRANCE; Noguere, G [FRANCE; De Saint Jean, C [FRANCE; Serot, O [FRANCE

    2010-01-01

    The essence of this paper is to enlighten the consistency achieved nowadays in nuclear data and uncertainty assessments in terms of compound nucleus reaction theory from neutron separation energy to continuum. By making the theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges continuous through the generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements and associated covariances are therefore calculated over the whole energy range. This paper recalls, in particular, recent advances on fission cross section calculations and suggests some hints for future developments.

  8. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens;

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....

  9. Remark on the Consistent Gauge Anomaly in Supersymmetric Theories

    CERN Document Server

    Ohshima, Y; Suzuki, H; Yasuta, H; Ohshima, Yoshihisa; Okuyama, Kiyoshi; Suzuki, Hiroshi; Yasuta, Hirofumi

    1999-01-01

    We present a direct field theoretical calculation of the consistent gauge anomaly in the superfield formalism, on the basis of a definition of the effective action through the covariant gauge current. The scheme is conceptually and technically simple and the gauge covariance in intermediate steps reduces calculational labors considerably. The resultant superfield anomaly, being proportional to the anomaly $d^{abc}=\\tr T^a\\{T^b,T^c\\}$, is minimal even without supplementing any counterterms. Our anomaly coincides with the anomaly obtained by Marinkovi\\'c as the solution of the Wess-Zumino consistency condition.

  10. A Van Atta reflector consisting of half-wave dipoles

    DEFF Research Database (Denmark)

    Appel-Hansen, Jørgen

    1966-01-01

    The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...

  11. HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES

    Institute of Scientific and Technical Information of China (English)

    Chunhui Zhang; Menghua Qin

    2004-01-01

    The mechanical and chemical effect on the pulping properties of the old newsprint was studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and the flotation conditions such as velocity of air flow, air pressure and flotation time were also discussed with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and pulping consistency is more important than pulping time and rotation speed in the mechanical effect during the high consistency pulping of the ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.

  12. Island of Stability for Consistent Deformations of Einstein's Gravity

    CERN Document Server

    Berkhahn, Felix; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin

    2011-01-01

    We construct explicitly deformations of Einstein's theory of gravity that are consistent and phenomenologically viable since they respect, in particular, cosmological backgrounds. We show that these deformations have unique symmetries in accordance with unitarity requirements, and give rise to a curvature induced self-stabilizing mechanism. As a consequence, any nonlinear completed deformation must incorporate self-stabilization on generic spacetimes already at lowest order in perturbation theory. Furthermore, our findings include the possibility of consistent and phenomenologically viable deformations of general relativity that are solely operative on curved spacetime geometries, reducing to Einstein's theory on the Minkowski background.

  13. Quantum monadology: a consistent world model for consciousness and physics.

    Science.gov (United States)

    Nakagomi, Teruaki

    2003-04-01

    The NL world model presented in the previous paper is embodied by use of relativistic quantum mechanics, which reveals the significance of the reduction of quantum states and the relativity principle, and locates consciousness and the concept of flowing time consistently in physics. This model provides a consistent framework to solve apparent incompatibilities between consciousness (as our interior experience) and matter (as described by quantum mechanics and relativity theory). Does matter have an inside? What is the flowing time now? Does physics allow the indeterminism by volition? The problem of quantum measurement is also resolved in this model.

  14. The cluster bootstrap consistency in generalized estimating equations

    KAUST Repository

    Cheng, Guang

    2013-03-01

    The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
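
    The resampling scheme described above is easy to illustrate. The sketch below is a generic toy (synthetic clustered data and a pooled OLS slope rather than a full GEE fit), not the authors' implementation: the key point is that whole clusters, never individual rows, are drawn with replacement.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy clustered data: 20 clusters of 5 observations, with a shared
# per-cluster random effect inducing within-cluster dependence.
n_clusters, m = 20, 5
cluster_effect = rng.normal(0, 0.5, n_clusters)
x = rng.normal(size=(n_clusters, m))
y = 1.0 + 2.0 * x + cluster_effect[:, None] + rng.normal(0, 0.2, (n_clusters, m))

def ols_slope(xs, ys):
    """Pooled OLS slope over all retained observations."""
    xs, ys = xs.ravel(), ys.ravel()
    return np.cov(xs, ys, bias=True)[0, 1] / np.var(xs)

# Cluster bootstrap: resample clusters with replacement so the dependence
# structure within each cluster is preserved in every replicate.
B = 500
slopes = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n_clusters, n_clusters)
    slopes[b] = ols_slope(x[idx], y[idx])

lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope ~ {slopes.mean():.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

The percentile interval here stands in for the consistent confidence-set approximation the paper establishes; the exchangeable-weights generalization replaces the multinomial resampling counts with other weight distributions.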

  15. Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics

    CERN Document Server

    Zhang, Jian-Zu

    2009-01-01

    In two-dimensional noncommutive space for the case of both position - position and momentum - momentum noncommuting, the consistent deformed bosonic algebra at the non-perturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg - Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.

  16. Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics

    Science.gov (United States)

    Zhang, Jian-Zu

    In two-dimensional noncommutative space for the case of both position-position and momentum-momentum noncommuting, the consistent deformed bosonic algebra at the nonperturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg-Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.

  17. HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES

    Institute of Scientific and Technical Information of China (English)

    Chunhui Zhang; Menghua Qin

    2004-01-01

    The mechanical and chemical effect on the pulping properties of the old newsprint was studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and the flotation conditions such as velocity of air flow, air pressure and flotation time were also discussed with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and pulping consistency is more important than pulping time and rotation speed in the mechanical effect during the high consistency pulping of the ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.

  18. Dynamically Consistent Nonlinear Evaluations with Their Generating Functions in Lp

    Institute of Scientific and Technical Information of China (English)

    Feng HU

    2013-01-01

    In this paper, we study dynamically consistent nonlinear evaluations in Lp (1 < p < 2). One of our aims is to obtain the following result: under a domination condition, an Ft-consistent evaluation is an εg-evaluation in Lp. Furthermore, without the assumption that the generating function g(t,ω,y,z) is continuous with respect to t, we provide some useful characterizations of an εg-evaluation by g and give some applications. These results include and extend some existing results.

  19. Consistency of Social Sensing Signatures Across Major US Cities

    CERN Document Server

    Soliman, Aiman; Padmanabhan, Anand; Wang, Shaowen

    2016-01-01

    Previous studies have shown that Twitter users have biases to tweet from certain locations (locational bias) and during certain hours (temporal bias). We used three years of geolocated Twitter data to quantify these biases and test our central hypothesis that Twitter users' biases are consistent across US cities. Our results suggest that the temporal and locational biases of Twitter users are inconsistent between three US metropolitan cities. We derive conclusions about the role of the complexity of the underlying data-producing process on its consistency and argue for the potential research avenue for Geospatial Data Science to test and quantify these inconsistencies in the class of organically evolved Big Data.

  20. Small Diameter Bomb Increment II (SDB II)

    Science.gov (United States)

    2015-12-01

    Equipment and the Joint Mission Planning System. The SDB II Program will develop and field a single-weapon USAF storage container and a dual DoN weapon...weapon directly impacted the target but did not detonate. Due to a lack of telemetry data, because live fire test assets are not equipped with telemetry

  1. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
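
    Consistency of rankings over time, of the kind this study examines, is commonly quantified with a rank correlation such as Kendall's tau. The sketch below is a generic illustration with hypothetical site names, not the study's actual methodology:

```python
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    """Kendall rank correlation between two rankings of the same items:
    +1 for identical orderings, -1 for fully reversed ones."""
    concordant = discordant = 0
    for x, y in combinations(rank_a, 2):
        # Same relative order in both rankings -> concordant pair.
        s = (rank_a.index(x) - rank_a.index(y)) * (rank_b.index(x) - rank_b.index(y))
        concordant += s > 0
        discordant += s < 0
    n = len(rank_a)
    return (concordant - discordant) / (n * (n - 1) / 2)

# Hypothetical result pages for the same query on two days.
day1 = ["siteA", "siteB", "siteC", "siteD"]
day2 = ["siteA", "siteC", "siteB", "siteD"]
print(kendall_tau(day1, day2))  # one swapped pair out of six -> 0.666...
```

Repeating this comparison across queries and dates gives a simple longitudinal consistency measure for a search engine's rankings.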

  2. Consistency Within Diversity: Guidelines for Programs to Honor Exemplary Teaching.

    Science.gov (United States)

    Svinicki, Marilla D.; Menges, Robert J.

    1996-01-01

    Good programs for recognizing exemplary college teaching are consistent with institutional mission and values, are grounded in research-based competencies and practices, recognize all significant facets of instruction, reward both collaborative and individual achievements, neither preclude nor replace the institutional reward system, call on those…

  3. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, Berend; Schumacher, Johannes M.

    2016-01-01

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal repr

  4. A Consistent Procedure for Pseudo-Component Delumping

    DEFF Research Database (Denmark)

    Leibovici, Claude; Stenby, Erling Halfdan; Knudsen, Kim

    1996-01-01

    . Thereby infinite dilution K-values can be obtained exactly without any further computation.Based on these results a consistent procedure for the estimation of equilibrium constants in the more classical cases of finite dilution has been developed. It can be used when moderate binary interaction parameters...

  5. Weakly time consistent concave valuations and their dual representations

    NARCIS (Netherlands)

    Roorda, B.; Schumacher, Hans

    2016-01-01

    We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal repre

  6. Hippocampography Guides Consistent Mesial Resections in Neocortical Temporal Lobe Epilepsy

    Directory of Open Access Journals (Sweden)

    Marcus C. Ng

    2016-01-01

    Full Text Available Background. The optimal surgery in lesional neocortical temporal lobe epilepsy is unknown. Hippocampal electrocorticography maximizes seizure freedom by identifying normal-appearing epileptogenic tissue for resection and minimizes neuropsychological deficit by limiting resection to demonstrably epileptogenic tissue. We examined whether standardized hippocampal electrocorticography (hippocampography) guides resection for more consistent hippocampectomy than unguided resection in conventional electrocorticography focused on the lesion. Methods. Retrospective chart review of any kind of electrocorticography (including hippocampography) as part of combined lesionectomy, anterolateral temporal lobectomy, and hippocampectomy over 8 years. Patients were divided into mesial (i.e., hippocampography) and lateral electrocorticography groups. Primary outcome was deviation from mean hippocampectomy length. Results. Of 26 patients, fourteen underwent hippocampography-guided mesial temporal resection. Hippocampography was associated with 2.6 times more consistent resection. The range of hippocampal resection was 0.7 cm in the mesial group and 1.8 cm in the lateral group (p=0.01). 86% of mesial group versus 42% of lateral group patients achieved seizure freedom (p=0.02). Conclusions. By rationally tailoring excision to demonstrably epileptogenic tissue, hippocampography significantly reduces resection variability for more consistent hippocampectomy than unguided resection in conventional electrocorticography. More consistent hippocampal resection may avoid overresection, which poses greater neuropsychological risk, and underresection, which jeopardizes postoperative seizure freedom.

  7. Discrete anomalies in supergravity and consistency of string backgrounds

    Science.gov (United States)

    Minasian, Ruben; Sasmal, Soumya; Savelli, Raffaele

    2017-02-01

    We examine SL(2, ℤ) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important rôle in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.

  8. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  9. Checking Consistency of Pedigree Information is NP-complete

    DEFF Research Database (Denmark)

    Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna

    arose originally from the geneticists' need to filter their input data from erroneous information, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even in the presence of three alleles. Several other results...
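
    The consistency check in question can be illustrated by brute force on a tiny pedigree. This sketch assumes unordered genotype pairs and Mendelian inheritance (one allele from each parent); the exhaustive search over genotypes of unobserved individuals is exponential in general, which is consistent with the NP-completeness result:

```python
from itertools import combinations_with_replacement, product

ALLELES = "ABO"  # three alleles, matching the bound in the abstract
GENOTYPES = list(combinations_with_replacement(ALLELES, 2))

def mendelian_ok(child, mother, father):
    """Child must inherit one allele from each parent (order-insensitive)."""
    return any(tuple(sorted((a, b))) == tuple(sorted(child))
               for a in mother for b in father)

def consistent(pedigree, observed):
    """Brute-force consistency check.
    pedigree: person -> (mother, father), or None for founders.
    observed: person -> genotype, for the typed individuals."""
    free = [p for p in pedigree if p not in observed]
    for combo in product(GENOTYPES, repeat=len(free)):
        g = dict(observed)
        g.update(zip(free, combo))
        if all(parents is None or mendelian_ok(g[p], g[parents[0]], g[parents[1]])
               for p, parents in pedigree.items()):
            return True
    return False

# Hypothetical trio: the child is typed A/A but the mother can only pass B or O.
ped = {"m": None, "f": None, "c": ("m", "f")}
print(consistent(ped, {"m": ("B", "O"), "c": ("A", "A")}))  # False: no valid assignment
print(consistent(ped, {"c": ("A", "A")}))                   # True: parents unconstrained
```

For a single trio the check is trivial; the hardness arises from large pedigrees with loops and many untyped individuals, where no polynomial-time check is known.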

  10. An algebraic method for constructing stable and consistent autoregressive filters

    Energy Technology Data Exchange (ETDEWEB)

    Harlim, John, E-mail: jharlim@psu.edu [Department of Mathematics, the Pennsylvania State University, University Park, PA 16802 (United States); Department of Meteorology, the Pennsylvania State University, University Park, PA 16802 (United States); Hong, Hoon, E-mail: hong@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Robbins, Jacob L., E-mail: jlrobbi3@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States)

    2015-02-15

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
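
    The classical AR stability condition referred to above can be checked mechanically from the coefficients. A minimal sketch (this is the standard companion-matrix test, not the paper's algebraic construction): an AR(p) model is stable when all eigenvalues of its companion matrix lie strictly inside the unit circle.

```python
import numpy as np

def ar_is_stable(phi):
    """Classical stability test for an AR(p) model
    x_t = phi[0]*x_{t-1} + ... + phi[p-1]*x_{t-p} + noise:
    all companion-matrix eigenvalues must lie inside the unit circle."""
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    companion = np.zeros((p, p))
    companion[0] = phi              # first row holds the AR coefficients
    if p > 1:
        companion[1:, :-1] = np.eye(p - 1)  # subdiagonal shifts the state
    return bool(np.all(np.abs(np.linalg.eigvals(companion)) < 1.0))

print(ar_is_stable([0.5, 0.3]))   # True: eigenvalues ~0.85 and ~-0.35
print(ar_is_stable([1.2, -0.1]))  # False: one eigenvalue ~1.11
```

The paper's contribution is to choose the discretization time step so that the fitted coefficients automatically satisfy this test together with the order-two consistency constraints; the sketch only shows the stability half.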

  11. Consistency of the Takens estimator for the correlation dimension

    NARCIS (Netherlands)

    Borovkova, S; Burton, R; Dehling, H

    1999-01-01

    Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
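
    The Takens estimator itself is short to state in code: it is the maximum-likelihood estimate D = -1 / mean(log(d/r0)) over all pairwise distances d below a cutoff r0. A minimal sketch on synthetic plane-filling data, where the estimate should come out near 2 (illustrative only, not the paper's setting of dependent sequences):

```python
import numpy as np

rng = np.random.default_rng(1)

def takens_dimension(points, r0):
    """Takens' estimator of the correlation dimension from all pairwise
    distances smaller than the cutoff r0."""
    # Full pairwise distance matrix (fine for a few hundred points).
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(len(points), k=1)   # each pair counted once
    small = d[iu][d[iu] < r0]
    return -1.0 / np.mean(np.log(small / r0))

pts = rng.random((800, 2))            # a cloud uniformly filling a 2-D square
print(takens_dimension(pts, r0=0.2))  # close to 2 for plane-filling data
```

For a strange attractor one would feed in delay-embedded trajectory points instead of i.i.d. samples, which is exactly where the mixing-sequence consistency result of the paper is needed.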

  12. Usability problem reports for comparative studies: consistency and inspectability

    NARCIS (Netherlands)

    Vermeeren, A.P.O.S.; Attema, J.; Akar, E.; De Ridder, H.; Van Doorn, A.J.; Erburg, Ç.; Berkman, A.E.; Maguire, M.

    2008-01-01

    This study explores issues of consistency and inspectability in usability test data analysis processes and reports. Problem reports resulting from usability tests performed by three professional usability labs in three different countries are compared. Each of the labs conducted a usability test on

  13. Body saccades of Drosophila consist of stereotyped banked turns

    NARCIS (Netherlands)

    Muijres, F.T.; Elzinga, M.J.; Iwasaki, N.A.; Dickinson, M.H.

    2015-01-01

    The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing

  14. Assessing atmospheric bias correction for dynamical consistency using potential vorticity

    Science.gov (United States)

    Rocheta, Eytan; Evans, Jason P.; Sharma, Ashish

    2014-12-01

    Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exists between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications.

  15. Discrete anomalies in supergravity and consistency of string backgrounds

    CERN Document Server

    Minasian, Ruben; Savelli, Raffaele

    2016-01-01

    We examine SL(2, Z) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important role in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.

  16. New sequential quadratic programming algorithm with consistent subproblems

    Institute of Scientific and Technical Information of China (English)

    贺国平; 高自友; 赖炎连

    1997-01-01

    One of the most interesting topics related to sequential quadratic programming algorithms is how to guarantee the consistency of all quadratic programming subproblems. In this decade, much work trying to change the form of constraints to obtain the consistency of the subproblems has been done. The method proposed by De O. Pantoja J F A and coworkers solves the consistency problem of the SQP method, and is the best to the authors' knowledge. However, the scale and complexity of the subproblems in De O. Pantoja's work will be increased greatly since all equality constraints have to be changed into absolute form. A new sequential quadratic programming type algorithm is presented by means of a special ε-active set scheme and a special penalty function. Subproblems of the new algorithm are all consistent, and the form of constraints of the subproblems is as simple as that of the general SQP type algorithms. It can be proved that the new method keeps global convergence and local superlinear convergence.

  17. Personalities in great tits, Parus major : stability and consistency

    NARCIS (Netherlands)

    Carere, C; Drent, Piet J.; Privitera, Lucia; Koolhaas, Jaap M.; Groothuis, TGG

    2005-01-01

    We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations

  18. Self-Consistence of Semi-Classical Gravity

    CERN Document Server

    Suen, W M

    1992-01-01

    Simon argued that the semi-classical theory of gravity, unless with some of its solutions excluded, is unacceptable for reasons of both self-consistency and experiment, and that it has to be replaced by a constrained semi-classical theory. We examined whether the evidence is conclusive.

  19. SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.

    Science.gov (United States)

    MORSE, STANLEY J.; GERGEN, KENNETH J.

    TO DISCOVER HOW A PERSON'S (P) SELF-CONCEPT IS AFFECTED BY THE CHARACTERISTICS OF ANOTHER (O) WHO SUDDENLY APPEARS IN THE SAME SOCIAL ENVIRONMENT, SEVERAL QUESTIONNAIRES, INCLUDING THE GERGEN-MORSE (1967) SELF-CONSISTENCY SCALE AND HALF THE COOPERSMITH SELF-ESTEEM INVENTORY, WERE ADMINISTERED TO 78 UNDERGRADUATE MEN WHO HAD ANSWERED AN AD FOR WORK…

  20. Plant functional traits have globally consistent effects on competition

    NARCIS (Netherlands)

    Kunstler, Georges; Falster, Daniel; Coomes, David A.; Poorter, Lourens

    2016-01-01

    Phenotypic traits and their associated trade-offs have been shown to have globally consistent effects on individual plant physiological functions, but how these effects scale up to influence competition, a key driver of community assembly in terrestrial vegetation, has remained unclear. Here we

  1. Fully self-consistent GW calculations for molecules

    DEFF Research Database (Denmark)

    Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer

    2010-01-01

    We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier...

  2. On ZRP wind input term consistency in Hasselmann equation

    CERN Document Server

    Zakharov, Vladimir; Pushkarev, Andrei

    2016-01-01

    The new ZRP wind input source term (Zakharov et al. 2012) is checked for its consistency via numerical simulation of the Hasselmann equation. The results are compared to field experimental data, collected at different sites around the world, and to theoretical predictions of self-similarity analysis. Good agreement is obtained for limited fetch and time domain statements.

  3. Consistency in behavior of the CEO regarding corporate social responsibility

    NARCIS (Netherlands)

    Elving, W.J.L.; Kartal, D.

    2012-01-01

    Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was

  4. An Intuitionistic Epistemic Logic for Sequential Consistency on Shared Memory

    Science.gov (United States)

    Hirai, Yoichi

    In the celebrated Gödel Prize winning papers, Herlihy, Shavit, Saks and Zaharoglou gave topological characterization of waitfree computation. In this paper, we characterize waitfree communication logically. First, we give an intuitionistic epistemic logic k∨ for asynchronous communication. The semantics for the logic k∨ is an abstraction of Herlihy and Shavit's topological model. In the same way Kripke model for intuitionistic logic informally describes an agent increasing its knowledge over time, the semantics of k∨ describes multiple agents passing proofs around and developing their knowledge together. On top of the logic k∨, we give an axiom type that characterizes sequential consistency on shared memory. The advantage of intuitionistic logic over classical logic then becomes apparent as the axioms for sequential consistency are meaningless for classical logic because they are classical tautologies. The axioms are similar to the axiom type for prelinearity (ϕ ⊃ ψ) ∨ (ψ ⊃ ϕ). This similarity reflects the analogy between sequential consistency for shared memory scheduling and linearity for Kripke frames: both require total order on schedules or models. Finally, under sequential consistency, we give soundness and completeness between a set of logical formulas called waitfree assertions and a set of models called waitfree schedule models.

  5. Noncommuting Electric Fields and Algebraic Consistency in Noncommutative Gauge theories

    CERN Document Server

    Banerjee, R

    2003-01-01

    We show that noncommuting electric fields occur naturally in noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a hamiltonian generalisation of the Seiberg-Witten Map, the algebraic consistency in the lagrangian and hamiltonian formulations of these theories, is established. The stability of the Poisson algebra, under this generalised map, is studied.

  6. Efficient self-consistent quantum transport simulator for quantum devices

    Energy Technology Data Exchange (ETDEWEB)

    Gao, X., E-mail: xngao@sandia.gov; Mamaluy, D.; Nielsen, E.; Young, R. W.; Lilly, M. P.; Bishop, N. C.; Carroll, M. S.; Muller, R. P. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); Shirkhorshidian, A. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); University of New Mexico, Albuquerque, New Mexico 87131 (United States)

    2014-04-07

    We present a self-consistent one-dimensional (1D) quantum transport simulator based on the Contact Block Reduction (CBR) method, aiming for very fast and robust transport simulation of 1D quantum devices. Applying the general CBR approach to 1D open systems results in a set of very simple equations that are derived and given in detail for the first time. The charge self-consistency of the coupled CBR-Poisson equations is achieved by using the predictor-corrector iteration scheme with the optional Anderson acceleration. In addition, we introduce a new way to convert an equilibrium electrostatic barrier potential calculated from an external simulator to an effective doping profile, which is then used by the CBR-Poisson code for transport simulation of the barrier under non-zero biases. The code has been applied to simulate the quantum transport in a double barrier structure and across a tunnel barrier in a silicon double quantum dot. Extremely fast self-consistent 1D simulations of the differential conductance across a tunnel barrier in the quantum dot show better qualitative agreement with experiment than non-self-consistent simulations.
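The predictor-corrector loop with optional Anderson acceleration described above can be sketched in miniature. The toy map below is a hypothetical stand-in for one coupled CBR-Poisson sweep; the accelerated update uses Anderson mixing of depth 1, which for a scalar reduces to a secant step on the residual:

```python
import math

def g(x):
    # Toy contractive fixed-point map standing in for one
    # transport/Poisson sweep (assumption: any contractive map
    # illustrates the iteration structure, not the real physics).
    return math.cos(x)

def anderson1(g, x0, tol=1e-12, max_iter=100):
    """Fixed-point iteration accelerated with Anderson mixing of depth 1,
    driving the residual f(x) = g(x) - x to zero."""
    x_prev, x = x0, g(x0)
    f_prev = g(x_prev) - x_prev
    for _ in range(max_iter):
        f = g(x) - x
        if abs(f) < tol:
            return x
        denom = f - f_prev
        gamma = f / denom if denom != 0 else 0.0
        # Anderson(1) update: mix the two most recent map evaluations.
        x_new = g(x) - gamma * (g(x) - g(x_prev))
        x_prev, f_prev, x = x, f, x_new
    return x

root = anderson1(g, 1.0)
print(abs(math.cos(root) - root) < 1e-10)  # True
```

For contractive maps the plain Picard iteration also converges, but the accelerated update typically needs far fewer sweeps, which is the motivation for adding Anderson acceleration to a self-consistent loop.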

  7. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    , whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...

  8. FINITE DEFORMATION ELASTO-PLASTIC THEORY AND CONSISTENT ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Liu Xuejun; Li Mingrui; Huang Wenbin

    2001-01-01

By using the logarithmic strain, a finite deformation plasticity theory corresponding to the infinitesimal plastic theory is established. A consistent plastic algorithm with first-order accuracy for the finite element method (FEM) is developed. Numerical examples are presented to illustrate the validity of the theory and the effectiveness of the algorithm.
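As an illustration of what a consistent algorithm means in computational plasticity, here is a generic small-strain, one-dimensional return-mapping step with linear isotropic hardening. This is a textbook sketch, not the paper's logarithmic-strain finite-deformation formulation, and all material constants are invented:

```python
def return_map_1d(eps, eps_p, alpha, E=200e3, H=10e3, sy=250.0):
    """One predictor-corrector step of the classical 1D return mapping
    with linear isotropic hardening. Returns the updated stress, plastic
    strain, hardening variable, and the consistent (algorithmic) tangent.
    Units: stresses/moduli in MPa, strains dimensionless (illustrative)."""
    sig_trial = E * (eps - eps_p)                 # elastic predictor
    f_trial = abs(sig_trial) - (sy + H * alpha)   # trial yield function
    if f_trial <= 0.0:
        return sig_trial, eps_p, alpha, E         # elastic step, tangent = E
    dgamma = f_trial / (E + H)                    # plastic corrector
    sign = 1.0 if sig_trial >= 0.0 else -1.0
    sig = sig_trial - E * dgamma * sign           # return to the yield surface
    eps_p_new = eps_p + dgamma * sign
    alpha_new = alpha + dgamma
    C_ep = E * H / (E + H)                        # consistent elastoplastic tangent
    return sig, eps_p_new, alpha_new, C_ep

sig, ep, a, C = return_map_1d(eps=0.005, eps_p=0.0, alpha=0.0)
```

After a plastic step the stress sits exactly on the updated yield surface, |σ| = σ_y + Hα, and using the algorithmic tangent C_ep (rather than the continuum one) is what preserves the quadratic convergence of a global Newton loop.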

  9. Consistent measurements comparing the drift features of noble gas mixtures

    CERN Document Server

    Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y

    1999-01-01

    We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.

  10. A Nonparametric Approach to Estimate Classification Accuracy and Consistency

    Science.gov (United States)

    Lathrop, Quinn N.; Cheng, Ying

    2014-01-01

    When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…

  11. Evaluating Reflective Writing for Appropriateness, Fairness, and Consistency.

    Science.gov (United States)

    Kennison, Monica Metrick; Misselwitz, Shirley

    2002-01-01

    Samples from 17 reflective journals of nursing students were evaluated by 6 faculty. Results indicate a lack of consistency in grading reflective writing, lack of consensus regarding evaluation, and differences among faculty regarding their view of such exercises. (Contains 26 references.) (JOW)

  12. Improving consistency in student evaluation at affiliated family practice centers.

    Science.gov (United States)

    Rabinowitz, H K

    1986-01-01

    The Department of Family Medicine at Jefferson Medical College has since 1974 been successful in administering a required third-year family medicine clerkship, providing students with a structured, didactic, and experiential curriculum in six affiliated family practice centers. Prior analysis (1976-1981) had indicated, however, that variation existed in evaluating similar students, depending on the clerkship training site, i.e., three sites graded students in a significantly different fashion than the three other sites. Utilizing these data to focus on the evaluation process, a comprehensive and specific six-point plan was developed to improve consistency in evaluations at the different training sites. This plan consisted of a yearly meeting of affiliate faculty, assigning predoctoral training administrative responsibility to one faculty member at each training site, increased telephone communication, affiliate-faculty attendance at the university site evaluation session, faculty rotation to spend time at other training sites, and financial reimbursement to the affiliate training sites. After intervention, analysis (1981-1983) indicated that five of the six clerkship sites now grade students in a consistent fashion, with only one affiliate using different grading standards. The intervention was therefore judged to be successful for five of the six training sites, allowing for better communication and more critical and consistent evaluation of medical students.

  13. [Consistent presentation of medical images based on CPI integration profile].

    Science.gov (United States)

    Jiang, Tao; An, Ji-ye; Chen, Zhong-yong; Lu, Xu-dong; Duan, Hui-long

    2007-11-01

Because of different display parameters and other factors, digital medical images appear in different display states in different departments of a hospital. Based on the CPI integration profile of IHE, this paper implements consistent presentation of medical images, which helps doctors carry out medical treatment as a team.

  14. Measures of Consistency for Holland-Type Codes.

    Science.gov (United States)

    Strahan, Robert F.

    1987-01-01

    Describes two new measures of consistency which refer to the extent to which more closely related scale types are found together in Holland's Self-Directed Search sort. One measure is based on the hexagonal model for use with three-point codes. The other is based on conditional probabilities for use with two-point codes. (Author/ABL)

  15. On Cu(II) Cu(II) distance measurements using pulsed electron electron double resonance

    Science.gov (United States)

    Yang, Zhongyu; Becker, James; Saxena, Sunil

    2007-10-01

The effects of orientational selectivity on the 4-pulse electron electron double resonance (PELDOR) ESR spectra of coupled Cu(II)-Cu(II) spins are presented. The data were collected at four magnetic fields on a poly-proline peptide containing two Cu(II) centers. The Cu(II)-PELDOR spectra of this peptide do not change appreciably with magnetic field at X-band. The data were analyzed by adapting the theory of Maryasov, Tsvetkov, and Raap [A.G. Maryasov, Y.D. Tsvetkov, J. Raap, Weakly coupled radical pairs in solids: ELDOR in ESE structure studies, Appl. Magn. Reson. 14 (1998) 101-113]. Simulations indicate that orientational effects are important for Cu(II)-PELDOR. Based on simulations, the field independence of the PELDOR data for this peptide is likely due to two effects. First, for this peptide, the Cu(II) g-tensor(s) are in a very specific orientation with respect to the interspin vector. Second, the flexibility of the peptide washes out the orientation effects. These effects reduce the suitability of the poly-proline-based peptide as a good model system to experimentally probe orientational effects in such experiments. An average Cu(II)-Cu(II) distance of 2.1-2.2 nm was determined, which is consistent with earlier double quantum coherence ESR results.

  16. Synthesis, characterization, and photoactivated DNA cleavage by copper (II)/cobalt (II) mediated macrocyclic complexes.

    Science.gov (United States)

    Naik, H R Prakash; Naik, H S Bhojya; Aravinda, T; Lamani, D S

    2010-01-01

We report the synthesis of a new photonuclease consisting of two Co(II)/Cu(II) complexes of macrocyclic fused quinoline. The metal complexes are of the type [MLX2], where M = Co(II) (5) or Cu(II) (6) and X = Cl, and are well characterized by elemental analysis, Fourier transform infrared spectroscopy, (1)H-NMR and electronic spectra. We have shown that photocleavage of plasmid DNA is markedly enhanced when this ligand is irradiated in the presence of Cu(II), and more so than with cobalt. The chemistry of ternary and binary Co(II) complexes showing efficient light-induced (360 nm) DNA cleavage activity is summarized. The role of the metal in photoinduced DNA cleavage reactions is explored by designing complex molecules having a macrocyclic structure. The mechanistic pathways are found to depend on the concentration of the Co(II)/Cu(II) complexes and on the photoexcitation energy of the photoredox chemistry. The highly effective DNA cleavage ability of 6 is attributed to the effective cooperation of the metal moiety.

  17. Rom II-forordningen

    DEFF Research Database (Denmark)

    Pii, Tine; Nielsen, Peter Arnt

    2008-01-01

The article sets out the most important rules of the European Parliament and Council regulation on the law applicable to non-contractual obligations (Rome II) and compares them with Danish law.

  18. Ovarian Cancer Stage II

    Science.gov (United States)

Title: Ovarian Cancer Stage II. Description: Three-panel drawing of stage ...

  19. Summary of CPAS Gen II Parachute Analysis

    Science.gov (United States)

    Morris, Aaron L.; Bledsoe, Kristin J.; Fraire, Usbaldo, Jr.; Moore, James W.; Olson, Leah M.; Ray, Eric

    2011-01-01

    The Orion spacecraft is currently under development by NASA and Lockheed Martin. Like Apollo, Orion will use a series of parachutes to slow its descent and splashdown safely. The Orion parachute system, known as the CEV Parachute Assembly System (CPAS), is being designed by NASA, the Engineering and Science Contract Group (ESCG), and Airborne Systems. The first generation (Gen I) of CPAS testing consisted of thirteen tests and was executed in the 2007-2008 timeframe. The Gen I tests provided an initial understanding of the CPAS parachutes. Knowledge gained from Gen I testing was used to plan the second generation of testing (Gen II). Gen II consisted of six tests: three singleparachute tests, designated as Main Development Tests, and three Cluster Development Tests. Gen II required a more thorough investigation into parachute performance than Gen I. Higher fidelity instrumentation, enhanced analysis methods and tools, and advanced test techniques were developed. The results of the Gen II test series are being incorporated into the CPAS design. Further testing and refinement of the design and model of parachute performance will occur during the upcoming third generation of testing (Gen III). This paper will provide an overview of the developments in CPAS analysis following the end of Gen I, including descriptions of new tools and techniques as well as overviews of the Gen II tests.

  20. Consistent Data Assimilation of Isotopes: 242Pu and 105Pd

    Energy Technology Data Exchange (ETDEWEB)

    G. Palmiotti; H. Hiruta; M. Salvatores

    2012-09-01

In this annual report we illustrate the methodology of consistent data assimilation, which allows information from integral experiments to be used to improve the basic nuclear parameters entering cross-section evaluation. A series of integral experiments are analyzed using the EMPIRE evaluated files for 242Pu and 105Pd. In particular, irradiation experiments (PROFIL-1 and -2, TRAPU-1, -2 and -3) provide information about capture cross sections, and a critical configuration, COSMO, where fission spectral indices were measured, provides information about the fission cross section. The observed discrepancies between calculated and experimental results are used in conjunction with the computed sensitivity coefficients and the covariance matrix for nuclear parameters in a consistent data assimilation. The results indicate that relatively modest modifications of some key identified nuclear parameters yield reasonable C/E values. However, for some parameters these variations fall outside 1σ of their initial standard deviation. This can indicate a possible conflict between the differential measurements (used to calculate the initial standard deviations) and the integral measurements used in the statistical data adjustment. Moreover, an inconsistency between the C/E values of two sets of irradiation experiments (PROFIL and TRAPU) is observed for 242Pu. This is the end of this project, funded by the Nuclear Physics Program of the DOE Office of Science. A proof of principle has been demonstrated for a few isotopes with this innovative methodology, but we are still far from having explored all its possibilities and from establishing it as proved and robust. In particular, many issues warrant further investigation:
• Non-linear effects
• Flexibility of nuclear parameters in describing cross sections
• Multi-isotope consistent assimilation
• Consistency between differential and integral
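The adjustment machinery described above, sensitivities plus covariances driving a statistical update of nuclear parameters, has the structure of a generalized least-squares fit. A hedged numerical sketch (all matrices and discrepancies below are invented for illustration, not taken from the PROFIL/TRAPU/COSMO analyses):

```python
import numpy as np

# Generalized least-squares data adjustment:
#   S : sensitivities d(response)/d(parameter)   (illustrative values)
#   M : prior parameter covariance               (rel. std. 5% and 10%)
#   V : experimental covariance of the C/E data  (2% uncertainties)
#   d : measured-minus-calculated discrepancies, (E - C)/C
S = np.array([[0.8, 0.1],
              [0.2, 0.9]])
M = np.diag([0.05**2, 0.10**2])
V = np.diag([0.02**2, 0.02**2])
d = np.array([0.03, -0.04])

G = S @ M @ S.T + V                                # innovation covariance
dp = M @ S.T @ np.linalg.solve(G, d)               # parameter adjustments
M_post = M - M @ S.T @ np.linalg.solve(G, S @ M)   # reduced posterior covariance
```

Adjustments much larger than the prior standard deviations stored in M are the signal, noted in the report, of tension between the differential and integral data.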

  1. Performance and consistency of indicator groups in two biodiversity hotspots.

    Directory of Open Access Journals (Sweden)

    Joaquim Trindade-Filho

Full Text Available BACKGROUND: In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. METHODOLOGY/PRINCIPAL FINDINGS: We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. CONCLUSIONS/SIGNIFICANCE: We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
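Finding the best sets of sites able to maximize the representation of a species group is an instance of the maximum-coverage problem, commonly attacked with a greedy complementarity heuristic. A minimal sketch with hypothetical site and species names:

```python
# Greedy complementarity heuristic for site selection: at each step pick
# the site that adds the most not-yet-represented species. A generic
# sketch of the representation problem; the data are hypothetical.
def greedy_sites(site_species, n_sites):
    covered, chosen = set(), []
    for _ in range(n_sites):
        best = max(site_species, key=lambda s: len(site_species[s] - covered))
        if not site_species[best] - covered:
            break  # no site adds anything new
        chosen.append(best)
        covered |= site_species[best]
    return chosen, covered

sites = {
    "A": {"sp1", "sp2", "sp3"},
    "B": {"sp3", "sp4"},
    "C": {"sp4", "sp5", "sp6"},
}
chosen, covered = greedy_sites(sites, 2)
print(chosen, covered)
```

The greedy rule is a standard shortcut because exact maximum coverage is NP-hard; it carries a well-known (1 − 1/e) approximation guarantee.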

  2. The internal consistency of the North Sea carbonate system

    Science.gov (United States)

    Salt, Lesley A.; Thomas, Helmuth; Bozec, Yann; Borges, Alberto V.; de Baar, Hein J. W.

    2016-05-01

In 2002 (February) and 2005 (August), the full suite of carbonate system parameters (total alkalinity (AT), dissolved inorganic carbon (DIC), pH, and partial pressure of CO2 (pCO2)) was measured on two re-occupations of the entire North Sea basin, with three parameters (AT, DIC, pCO2) measured on four additional re-occupations, covering all four seasons, allowing an assessment of the internal consistency of the carbonate system. For most of the year, there is a similar level of internal consistency, with AT being calculated to within ± 6 μmol kg-1 using DIC and pH, DIC to ± 6 μmol kg-1 using AT and pH, pH to ± 0.008 using AT and pCO2, and pCO2 to ± 8 μatm using DIC and pH, with the dissociation constants of Millero et al. (2006). In spring, however, we observe a significant decline in the ability to accurately calculate the carbonate system. Lower consistency is observed with an increasing fraction of Baltic Sea water, caused by the high contribution of organic alkalinity in this water mass, which is not accounted for in the carbonate system calculations. Attempts to improve the internal consistency by accounting for the unconventional salinity-borate relationships in freshwater and the Baltic Sea, and through application of the new North Atlantic salinity-boron relationship (Lee et al., 2010), resulted in no significant difference in the internal consistency.

  3. Consistency of accuracy assessment indices for soft classification: Simulation analysis

    Science.gov (United States)

    Chen, Jin; Zhu, Xiaolin; Imura, Hidefumi; Chen, Xuehong

Accuracy assessment plays a crucial role in the implementation of soft classification. Even though many indices of accuracy assessment for soft classification have been proposed, the consistencies among these indices are not clear, and the impact of sample size on these consistencies has not been investigated. This paper examines two kinds of indices: map-level indices, including root mean square error (rmse), kappa, and overall accuracy (oa) from the sub-pixel confusion matrix (SCM); and category-level indices, including crmse, user accuracy (ua) and producer accuracy (pa). A careful simulation was conducted to investigate the consistency of these indices and the effect of sample size. The major findings were as follows: (1) The map-level indices are highly consistent with each other, whereas the category-level indices are not. (2) The consistency among map-level and category-level indices becomes weaker when the sample size decreases. (3) The rmse is more affected by error distribution among classes than are kappa and oa. Based on these results, we recommend that rmse can be used for map-level accuracy due to its simplicity, although kappa and oa may be better alternatives when the sample size is limited because the two indices are affected less by the error distribution among classes. We also suggest that crmse should be provided when map users are not concerned about the error source, whereas ua and pa are more useful when complete information about different errors is required. The results of this study will be of benefit to the development and application of soft classifiers.
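For concreteness, the map-level indices discussed above can be computed from a confusion matrix in a few lines. This is a generic sketch of oa, kappa, and an rmse over class fractions, not the paper's exact sub-pixel (SCM) formulation:

```python
import numpy as np

def map_level_indices(cm, true_frac, est_frac):
    """Generic map-level accuracy indices from a confusion matrix plus
    class-fraction errors (illustrative, not the paper's SCM variant)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                       # overall accuracy
    pe = (cm.sum(0) * cm.sum(1)).sum() / n**2   # chance agreement
    kappa = (oa - pe) / (1.0 - pe)              # Cohen's kappa
    rmse = np.sqrt(np.mean((np.asarray(true_frac) - np.asarray(est_frac)) ** 2))
    return oa, kappa, rmse

# Hypothetical two-class example.
cm = [[40, 5],
      [10, 45]]
oa, kappa, rmse = map_level_indices(cm, [0.5, 0.5], [0.45, 0.55])
```

For these invented counts the arithmetic gives oa = 0.85 and kappa = 0.70, illustrating how kappa discounts the chance-agreement term pe.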

  4. Consistent SPH Simulations of Protostellar Collapse and Fragmentation

    Science.gov (United States)

    Gabbasov, Ruslan; Sigalotti, Leonardo Di G.; Cruz, Fidel; Klapp, Jaime; Ramírez-Velasquez, José M.

    2017-02-01

We study the consistency and convergence of smoothed particle hydrodynamics (SPH) as a function of the interpolation parameters, namely the number of particles N, the number of neighbors n, and the smoothing length h, using simulations of the collapse and fragmentation of protostellar rotating cores. The calculations are made using a modified version of the GADGET-2 code that employs an improved scheme for the artificial viscosity and power-law dependences of n and h on N, as was recently proposed by Zhu et al., which comply with the combined limit N → ∞, h → 0, and n → ∞ with n/N → 0 for full SPH consistency as the domain resolution is increased. We apply this realization to the “standard isothermal test case” in the variant calculated by Burkert & Bodenheimer and the Gaussian cloud model of Boss to investigate the response of the method to adaptive smoothing lengths in the presence of large density and pressure gradients. The degree of consistency is measured by tracking how well the estimates of the consistency integral relations reproduce their continuous counterparts. In particular, C0 and C1 particle consistency is demonstrated, meaning that the calculations are close to second-order accuracy. As long as n is increased with N, mass resolution also improves as the minimum resolvable mass M_min ∼ n^(-1). This aspect allows proper calculation of small-scale structures in the flow associated with the formation and instability of protostellar disks around the growing fragments, which are seen to develop a spiral structure and fragment into close binary/multiple systems as supported by recent observations.

  5. Reflection symmetries of Isolated Self-consistent Stellar Systems

    CERN Document Server

    An, J; Sanders, J L

    2016-01-01

    Isolated, steady-state galaxies correspond to equilibrium solutions of the Poisson--Vlasov system. We show that (i) all galaxies with a distribution function depending on energy alone must be spherically symmetric and (ii) all axisymmetric galaxies with a distribution function depending on energy and the angular momentum component parallel to the symmetry axis must also be reflection-symmetric about the plane $z=0$. The former result is Lichtenstein's Theorem, derived here by a method exploiting symmetries of solutions of elliptic partial differential equations, while the latter result is new. These results are subsumed into the Symmetry Theorem, which specifies how the symmetries of the distribution function in configuration or velocity space can control the planes of reflection symmetries of the ensuing stellar system.

  6. Flood damage: a model for consistent, complete and multipurpose scenarios

    Science.gov (United States)

    Menoni, Scira; Molinari, Daniela; Ballio, Francesco; Minucci, Guido; Mejri, Ouejdane; Atun, Funda; Berni, Nicola; Pandolfo, Claudia

    2016-12-01

    Effective flood risk mitigation requires the impacts of flood events to be much better and more reliably known than is currently the case. Available post-flood damage assessments usually supply only a partial vision of the consequences of the floods as they typically respond to the specific needs of a particular stakeholder. Consequently, they generally focus (i) on particular items at risk, (ii) on a certain time window after the occurrence of the flood, (iii) on a specific scale of analysis or (iv) on the analysis of damage only, without an investigation of damage mechanisms and root causes. This paper responds to the necessity of a more integrated interpretation of flood events as the base to address the variety of needs arising after a disaster. In particular, a model is supplied to develop multipurpose complete event scenarios. The model organizes available information after the event according to five logical axes. This way post-flood damage assessments can be developed that (i) are multisectoral, (ii) consider physical as well as functional and systemic damage, (iii) address the spatial scales that are relevant for the event at stake depending on the type of damage that has to be analyzed, i.e., direct, functional and systemic, (iv) consider the temporal evolution of damage and finally (v) allow damage mechanisms and root causes to be understood. All the above features are key for the multi-usability of resulting flood scenarios. The model allows, on the one hand, the rationalization of efforts currently implemented in ex post damage assessments, also with the objective of better programming financial resources that will be needed for these types of events in the future. On the other hand, integrated interpretations of flood events are fundamental to adapting and optimizing flood mitigation strategies on the basis of thorough forensic investigation of each event, as corroborated by the implementation of the model in a case study.

  7. Stable functional networks exhibit consistent timing in the human brain.

    Science.gov (United States)

    Chapeton, Julio I; Inati, Sara K; Zaghloul, Kareem A

    2017-03-01

    Despite many advances in the study of large-scale human functional networks, the question of timing, stability, and direction of communication between cortical regions has not been fully addressed. At the cellular level, neuronal communication occurs through axons and dendrites, and the time required for such communication is well defined and preserved. At larger spatial scales, however, the relationship between timing, direction, and communication between brain regions is less clear. Here, we use a measure of effective connectivity to identify connections between brain regions that exhibit communication with consistent timing. We hypothesized that if two brain regions are communicating, then knowledge of the activity in one region should allow an external observer to better predict activity in the other region, and that such communication involves a consistent time delay. We examine this question using intracranial electroencephalography captured from nine human participants with medically refractory epilepsy. We use a coupling measure based on time-lagged mutual information to identify effective connections between brain regions that exhibit a statistically significant increase in average mutual information at a consistent time delay. These identified connections result in sparse, directed functional networks that are stable over minutes, hours, and days. Notably, the time delays associated with these connections are also highly preserved over multiple time scales. We characterize the anatomic locations of these connections, and find that the propagation of activity exhibits a preferred posterior to anterior temporal lobe direction, consistent across participants. Moreover, networks constructed from connections that reliably exhibit consistent timing between anatomic regions demonstrate features of a small-world architecture, with many reliable connections between anatomically neighbouring regions and few long range connections. 
Together, our results demonstrate
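The coupling measure described above, time-lagged mutual information, can be approximated with a simple histogram estimator. A sketch on synthetic signals (the lag, noise level, and bin count below are arbitrary choices for illustration, not the study's parameters):

```python
import numpy as np

def lagged_mutual_info(x, y, lag, bins=8):
    """Histogram estimate of I(x(t); y(t+lag)) in bits -- a simplified
    stand-in for the time-lagged mutual-information coupling measure."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y
    nz = p_xy > 0                           # avoid log(0)
    return float((p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])).sum())

# Synthetic pair: y is a noisy copy of x delayed by 3 samples.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 3) + 0.1 * rng.standard_normal(5000)

mi_lag3 = lagged_mutual_info(x, y, lag=3)  # at the true delay: large
mi_lag0 = lagged_mutual_info(x, y, lag=0)  # at the wrong delay: near zero
```

Scanning the lag and testing for a statistically significant peak against surrogate data is the usual way such an estimator is turned into an effective-connectivity detector.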

8. Clustering pattern consistency of corn cultivars

    Directory of Open Access Journals (Sweden)

    Alberto Cargnelutti Filho

    2011-09-01

Full Text Available The objective of this research was to evaluate the consistency of the clustering pattern obtained from the combination of two dissimilarity measures and four clustering methods, in scenarios formed by combinations of number of cultivars and number of variables, using real data from corn cultivars (Zea mays L.) and simulated data. The real data comprised five variables measured in 69 corn cultivar trials, in which the number of cultivars evaluated ranged from 9 to 40. In order to investigate the results with larger numbers of cultivars and variables, 1,000 experiments were simulated under the standard normal distribution for each of the 54 scenarios formed by the combination of number of cultivars (20, 30, 40, 50, 60, 70, 80, 90 and 100) and number of variables (5, 6, 7, 8, 9 and 10). Correlation analysis, multicollinearity diagnostics and cluster analysis were carried out. The consistency of the clustering pattern was evaluated by means of the cophenetic correlation coefficient. The consistency of the clustering pattern decreases as the number of cultivars and variables increases. The Euclidean distance provides greater consistency of the clustering pattern than the Manhattan distance. The consistency of the clustering pattern among the methods increases in the following order: Ward, complete linkage, single linkage and average linkage.
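The cophenetic correlation coefficient used above measures how faithfully a dendrogram preserves the original pairwise dissimilarities. A brief sketch with SciPy on hypothetical data standing in for a cultivar-by-variable matrix:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

# Hypothetical data: 30 "cultivars" x 5 variables drawn from a standard
# normal, mirroring the study's simulated scenarios only in spirit.
rng = np.random.default_rng(42)
X = rng.standard_normal((30, 5))

d = pdist(X, metric="euclidean")  # condensed Euclidean dissimilarities

coph = {}
for method in ("ward", "complete", "single", "average"):
    Z = linkage(d, method=method)   # hierarchical clustering
    c, _ = cophenet(Z, d)           # cophenetic correlation vs. original d
    coph[method] = c
```

A coefficient near 1 means the dendrogram distorts the dissimilarity matrix little; comparing the coefficients across methods is exactly the consistency ranking reported in the abstract.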

  9. Dinuclear cadmium(II), zinc(II), and manganese(II), trinuclear nickel(II), and pentanuclear copper(II) complexes with novel macrocyclic and acyclic Schiff-base ligands having enantiopure or racemic camphoric diamine components.

    Science.gov (United States)

    Jiang, Jue-Chao; Chu, Zhao-Lian; Huang, Wei; Wang, Gang; You, Xiao-Zeng

    2010-07-05

Four novel [3 + 3] Schiff-base macrocyclic ligands I-IV condensed from 2,6-diformyl-4-substituted phenols (R = CH(3) or Cl) and enantiopure or racemic camphoric diamines have been synthesized and characterized. Metal-ion complexations of these enantiopure and racemic [3 + 3] macrocyclic ligands with different cadmium(II), zinc(II), manganese(II), nickel(II), and copper(II) salts lead to the cleavage of Schiff-base C=N double bonds and subsequent ring contraction of the macrocyclic ligands due to the size effects and the spatial restrictions of the coordination geometry of the central metals, the steric hindrance of ligands, and the counterions used. As a result, five [2 + 2] and one [1 + 2] dinuclear cadmium(II) complexes (1-6), two [2 + 2] dinuclear zinc(II) (7 and 8), and two [2 + 2] dinuclear manganese(II) (9 and 10) complexes together with one [1 + 1] trinuclear nickel(II) complex (11) and one [1 + 2] pentanuclear copper(II) complex (12), bearing enantiopure or racemic ligands, different substituent groups in the phenyl rings, and different anionic ligands (Cl(-), Br(-), OAc(-), and SCN(-)), have been obtained in which the chiral carbon atoms in the camphoric backbones are arranged in different ways (RRSS for the enantiopure ligands in 1, 2, 4, 5, and 7-10 and RSRS for the racemic ligands in 3, 6, 11, and 12). The steric hindrance effects of the methyl group bonded to one of the chiral carbon atoms of camphoric diamine units are believed to play important roles in the formation of the acyclic [1 + 1] trinuclear complex 11 and [1 + 2] dinuclear and pentanuclear complexes 6 and 12. In dinuclear cadmium(II), zinc(II), and manganese(II) complexes 1-10, the sequence of separations between the metal centers is consistent with that of the ionic radii shortened from cadmium(II) to manganese(II) to zinc(II) ions. Furthermore, UV-vis, circular dichroism, (1)H NMR, and fluorescence spectra have been used to characterize and compare the structural

10. Biologically active new Fe(II), Co(II), Ni(II), Cu(II), Zn(II) and Cd(II) complexes of N-(2-thienylmethylene)methanamine

    Directory of Open Access Journals (Sweden)

    C. SPÎNU

    2008-04-01

Full Text Available Iron(II), cobalt(II), nickel(II), copper(II), zinc(II) and cadmium(II) complexes of the type ML2Cl2, where M is a metal and L is the Schiff base N-(2-thienylmethylene)methanamine (TNAM) formed by the condensation of 2-thiophenecarboxaldehyde and methylamine, were prepared and characterized by elemental analysis as well as magnetic and spectroscopic measurements. The elemental analyses suggest the stoichiometry to be 1:2 (metal:ligand). Magnetic susceptibility data coupled with electronic, ESR and Mössbauer spectra suggest a distorted octahedral structure for the Fe(II), Co(II) and Ni(II) complexes, a square-planar geometry for the Cu(II) compound and a tetrahedral geometry for the Zn(II) and Cd(II) complexes. The infrared and NMR spectra of the complexes agree with co-ordination to the central metal atom through the nitrogen and sulphur atoms. Conductance measurements suggest the non-electrolytic nature of the complexes, except for the Cu(II), Zn(II) and Cd(II) complexes, which are 1:2 electrolytes. The Schiff base and its metal chelates were screened for their biological activity against Escherichia coli, Staphylococcus aureus and Pseudomonas aeruginosa, and the metal chelates were found to possess better antibacterial activity than that of the uncomplexed Schiff base.

  11. Tetraaquabis(pyridine-κN)nickel(II) dinitrate

    Directory of Open Access Journals (Sweden)

    Mario Wriedt

    2010-07-01

    Full Text Available In the title compound, [Ni(C5H5N)2(H2O)4](NO3)2, the NiII ion is coordinated by two N-bonded pyridine ligands and four water molecules in an octahedral coordination mode. The asymmetric unit consists of one NiII ion located on an inversion center, as well as one pyridine ligand, one nitrate anion and two water molecules in general positions. In the crystal structure, the discrete complex cations and nitrate anions are connected by O—H...O and C—H...O hydrogen bonds.

  12. Removal of Ni(II), Co(II) and Pb(II) ions from aqueous media using Starch Stabilized Magnetic ...

    African Journals Online (AJOL)

    Joshua Konne

    Removal of Ni (II), Co (II) and Pb (II) ions from aqueous media using Starch. Stabilized Magnetic ... initial metal concentration and contact time on the removal processes was investigated. The results .... India) supplied NaOH and the Fe salts.

  13. Violation of consistency relations and the protoinflationary transition

    CERN Document Server

    Giovannini, Massimo

    2014-01-01

    If we posit the validity of the consistency relations, the tensor spectral index and the relative amplitude of the scalar and tensor power spectra are both fixed by a single slow roll parameter. The physics of the protoinflationary transition can break explicitly the consistency relations causing a reduction of the inflationary curvature scale in comparison with the conventional lore. After a critical scrutiny, we argue that the inflationary curvature scale, the total number of inflationary efolds and, ultimately, the excursion of the inflaton across its Planckian boundary are all characterized by a computable theoretical error. While these considerations ease some of the tensions between the Bicep2 data and the other satellite observations, they also demand an improved understanding of the protoinflationary transition whose physical features may be assessed, in the future, through a complete analysis of the spectral properties of the B mode autocorrelations.

  14. Design of a Turbulence Generator of Medium Consistency Pulp Pumps

    Directory of Open Access Journals (Sweden)

    Hong Li

    2012-01-01

    Full Text Available The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. The structural dimensions of the generator affect its hydraulic performance; the radius and the blade laying angle are two important ones. Starting from research on the internal flow and shearing characteristics of medium consistency (MC) pulp, a simple mathematical model of the flow section of the shearing chamber is built, and a formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is taken by reference from the turbine agitator, which has a shape similar to that of the turbulence generator, and CFD simulation is applied to study the flow fields obtained with different blade laying angles. The recommended blade laying angle of the turbulence generator is found to lie between 60° and 75°.

  15. Non-trivial checks of novel consistency relations

    Energy Technology Data Exchange (ETDEWEB)

    Berezhiani, Lasha; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Wang, Junpu, E-mail: lashaber@gmail.com, E-mail: jkhoury@sas.upenn.edu, E-mail: jwang217@jhu.edu [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-06-01

    Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.

  16. Non-Trivial Checks of Novel Consistency Relations

    CERN Document Server

    Berezhiani, Lasha; Wang, Junpu

    2014-01-01

    Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of slow-roll single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.

  17. Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation

    DEFF Research Database (Denmark)

    Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans

    1995-01-01

    We present multiconfigurational self-consistent reaction field theory and implementation for solvent effects on a solute molecular system that is not in equilibrium with the outer solvent. The approach incorporates two different polarization vectors for studying the influence of the solvent...... states influenced by the two types of polarization vectors. The general treatment of the correlation problem through the use of complete and restricted active space methodologies makes the present multiconfigurational self-consistent reaction field approach general in that it can handle any type of state......, open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity...

  18. Branch dependence in the "consistent histories" approach to quantum mechanics

    CERN Document Server

    Müller, T

    2005-01-01

    In the consistent histories formalism one specifies a family of histories as an exhaustive set of pairwise exclusive descriptions of the dynamics of a quantum system. We define branching families of histories, which strike a middle ground between the two available mathematically precise definitions of families of histories, viz., product families and Isham's history projector operator formalism. The former are too narrow for applications, and the latter's generality comes at a certain cost, barring an intuitive reading of the ``histories''. Branching families retain the intuitiveness of product families, they allow for the interpretation of a history's weight as a probability, and they allow one to distinguish two kinds of coarse-graining. It is shown that for branching families, the ``consistency condition'' is not a precondition for assigning probabilities, but for a specific kind of coarse-graining.

  19. Structures, profile consistency, and transport scaling in electrostatic convection

    DEFF Research Database (Denmark)

    Bian, N.H.; Garcia, O.E.

    2005-01-01

    that for interchange modes, profile consistency is in fact due to mixing by persistent large-scale convective cells. This mechanism is not a turbulent diffusion, cannot occur in collisionless systems, and is the analog of the well-known laminar "magnetic flux expulsion" in magneiohydrodynamics. This expulsion process...... involves a "pinch" across closed streamlines and further results in the formation of pressure fingers along the-separatrix of the convective cells. By nature, these coherent structures are dissipative because the mixing process that leads to their formation relies on a finite amount of collisional...... diffusion. Numerical simulations of two-dimensional interchange modes confirm the role of laminar expulsion by convective cells, for profile consistency and structure formation. They also show that the fingerlike pressure structures ultimately control the rate of heat transport across the plasma layer...

  20. Turbulent MHD transport coefficients - An attempt at self-consistency

    Science.gov (United States)

    Chen, H.; Montgomery, D.

    1987-01-01

    In this paper, some multiple scale perturbation calculations of turbulent MHD transport coefficients begun in earlier papers are first completed. These generalize 'alpha effect' calculations by treating the velocity field and magnetic field on the same footing. Then the problem of rendering such calculations self-consistent is addressed, generalizing an eddy-viscosity hypothesis similar to that of Heisenberg for the Navier-Stokes case. The method also borrows from Kraichnan's direct interaction approximation. The output is a set of integral equations relating the spectra and the turbulent transport coefficients. Previous 'alpha effect' and 'beta effect' coefficients emerge as limiting cases. A treatment of the inertial range can also be given, consistent with a -5/3 energy spectrum power law. In the Navier-Stokes limit, a value of 1.72 is extracted for the Kolmogorov constant. Further applications to MHD are possible.

  1. Consistent return mapping algorithm for plane stress elastoplasticity

    Energy Technology Data Exchange (ETDEWEB)

    Simo, J.C.; Taylor, R.L.

    1985-05-01

    An unconditionally stable algorithm for plane stress elastoplasticity is developed, based upon the notion of an elastic predictor/return mapping (plastic corrector) scheme. Enforcement of the consistency condition is shown to reduce to the solution of a simple nonlinear equation. Consistent elastoplastic tangent moduli are obtained by exact linearization of the algorithm. Use of these moduli is essential in order to preserve the asymptotic rate of quadratic convergence of Newton methods. An exact solution for constant strain rate over the typical time step is derived. On the basis of this solution the accuracy of the algorithm is assessed by means of iso-error maps. The excellent performance of the algorithm for large time steps is illustrated in numerical experiments.
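The elastic predictor/plastic corrector logic is easiest to see in a one-dimensional analogue with linear isotropic hardening, where the consistency condition yields the plastic multiplier in closed form. This is a simplified sketch, not the paper's plane stress algorithm; all names and parameter values below are illustrative:

```python
def return_map_1d(eps, eps_p, alpha, E, H, sigma_y):
    """One step of return mapping for 1D elastoplasticity with linear isotropic hardening."""
    sig_tr = E * (eps - eps_p)                   # elastic predictor (trial stress)
    f_tr = abs(sig_tr) - (sigma_y + H * alpha)   # trial yield function
    if f_tr <= 0.0:
        return sig_tr, eps_p, alpha, E           # elastic step: tangent modulus is E
    # plastic corrector: enforcing consistency f = 0 gives dgamma in closed form
    dgamma = f_tr / (E + H)
    sign = 1.0 if sig_tr >= 0.0 else -1.0
    sig = sig_tr - E * dgamma * sign
    # consistent tangent E*H/(E+H) preserves quadratic Newton convergence
    return sig, eps_p + dgamma * sign, alpha + dgamma, E * H / (E + H)
```

In the genuine plane stress case the scalar equation for the plastic multiplier is nonlinear and is solved iteratively rather than in closed form, but the predictor/corrector structure is the same.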

  2. Strong Consistency of the Empirical Martingale Simulation Option Price Estimator

    Institute of Scientific and Technical Information of China (English)

    Zhu-shun Yuan; Ge-mai Chen

    2009-01-01

    A simulation technique known as empirical martingale simulation (EMS) was proposed to improve simulation accuracy. By an adjustment to the standard Monte Carlo simulation, EMS ensures that the simulated price satisfies the rational option pricing bounds and that the estimated derivative contract price is strongly consistent for payoffs that satisfy a Lipschitz condition. However, for some currently used contracts such as self-quanto options and asymmetric or symmetric power options, it is open whether the above asymptotic result holds. In this paper, we prove that the strong consistency of the EMS option price estimator holds for a wider class of univariate payoffs than those restricted by the Lipschitz condition. Numerical experiments demonstrate that EMS can also substantially increase simulation accuracy in the extended setting.
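As a rough illustration of the EMS idea, simulated terminal prices can be rescaled so that their discounted empirical mean equals the spot price exactly, making the empirical measure a martingale at maturity. This sketch assumes a one-step geometric Brownian motion setup and does not reproduce the general multi-step adjustment; all function names and parameters are illustrative:

```python
import numpy as np

def ems_terminal_prices(s0, r, sigma, T, n_paths, rng):
    """Terminal GBM prices, rescaled so the discounted empirical mean is exactly s0."""
    z = rng.standard_normal(n_paths)
    st = s0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return st * s0 / (np.exp(-r * T) * st.mean())   # EMS martingale adjustment

def ems_call_price(s0, k, r, sigma, T, n_paths=100_000, seed=0):
    """European call price from EMS-adjusted Monte Carlo terminal prices."""
    st = ems_terminal_prices(s0, r, sigma, T, n_paths, np.random.default_rng(seed))
    return np.exp(-r * T) * np.maximum(st - k, 0.0).mean()
```

The rescaling enforces the no-arbitrage bound exactly in the sample, which is the source of the variance reduction the abstract refers to.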

  3. A correlation consistency based multivariate alarm thresholds optimization approach.

    Science.gov (United States)

    Gao, Huihui; Liu, Feifei; Zhu, Qunxiong

    2016-11-01

    Different alarm thresholds can generate different alarm data, resulting in different correlations. A new multivariate alarm thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of the process data are calculated by Pearson correlation analysis, while the correlation coefficients of the alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using a particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
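A toy sketch of the correlation-consistency objective follows. Plain Pearson correlation on binarized alarm sequences stands in for the paper's kernel-density-based estimate, and simple random search replaces particle swarm optimization; all names and parameter choices are illustrative:

```python
import numpy as np

def correlation_gap(thresholds, X):
    """Sum of |corr(process) - corr(alarm)| over all variable pairs."""
    A = (X > thresholds).astype(float)        # binary alarm sequences from thresholds
    cp = np.corrcoef(X, rowvar=False)         # process-data correlations (Pearson)
    ca = np.corrcoef(A, rowvar=False)         # alarm-data correlations (toy stand-in for KDE)
    iu = np.triu_indices(X.shape[1], k=1)
    return float(np.abs(cp[iu] - ca[iu]).sum())

def optimize_thresholds(X, n_trials=200, seed=0):
    """Random search over percentile thresholds (stand-in for particle swarm optimization)."""
    rng = np.random.default_rng(seed)
    best_t = np.percentile(X, 75, axis=0)     # initial guess: per-variable 75th percentile
    best_g = correlation_gap(best_t, X)
    for _ in range(n_trials):
        t = np.array([np.percentile(X[:, j], rng.uniform(10.0, 90.0))
                      for j in range(X.shape[1])])
        g = correlation_gap(t, X)
        if g < best_g:
            best_t, best_g = t, g
    return best_t, best_g
```

Restricting candidates to the 10th–90th percentile range keeps every alarm sequence non-degenerate, so the correlation matrices stay well defined.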

  4. Current Status Of Velocity Field Surveys: A Consistency Check

    CERN Document Server

    Sarkar, Devdeep; Feldman, Hume A.; Watkins, Richard

    2006-01-01

    We present a statistical analysis comparing the bulk-flow measurements for six recent peculiar velocity surveys, namely, ENEAR, SFI, RFGC, SBF and the Mark III singles and group catalogs. We study whether the bulk-flow estimates are consistent with each other and construct the full three-dimensional bulk-flow vectors. The method we discuss could be used to test the consistency of all velocity field surveys. We show that although these surveys differ in their geometry and measurement errors, their bulk-flow vectors are expected to be highly correlated and in fact show impressive agreement in all cases. Our results suggest that even though the surveys we study target galaxies of different morphology and use different distance measures, they all reliably reflect the same underlying large-scale flow.

  5. Stochastic multi-configurational self-consistent field theory

    CERN Document Server

    Thomas, Robert E; Alavi, Ali; Booth, George H

    2015-01-01

    The multi-configurational self-consistent field theory is considered the standard starting point for almost all multireference approaches required for strongly-correlated molecular problems. The limitation of the approach is generally given by the number of strongly-correlated orbitals in the molecule, as its cost will grow exponentially with this number. We present a new multi-configurational self-consistent field approach, wherein linear determinant coefficients of a multi-configurational wavefunction are optimized via the stochastic full configuration interaction quantum Monte Carlo technique at greatly reduced computational cost, with non-linear orbital rotation parameters updated variationally based on this sampled wavefunction. This extends this approach to strongly-correlated systems with far larger active spaces than it is possible to treat by conventional means. By comparison with this traditional approach, we demonstrate that the introduction of stochastic noise in both the determinant amplitudes an...

  6. Consistency Across Standards or Standards in a New Business Model

    Science.gov (United States)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality and appendices of government and non-governmental human factors standards.

  7. Exceptional generalised geometry for massive IIA and consistent reductions

    CERN Document Server

    Cassani, Davide; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel

    2016-01-01

    We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p,7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d=4,3,2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.

  8. Exceptional generalised geometry for massive IIA and consistent reductions

    Science.gov (United States)

    Cassani, Davide; de Felice, Oscar; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel

    2016-08-01

    We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S 6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO( p, 7 - p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S d , d = 4 , 3 , 2, leading to a maximally supersymmetric reduction with gauge group SO( d + 1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.

  9. Bolasso: model consistent Lasso estimation through the bootstrap

    CERN Document Server

    Bach, Francis

    2008-01-01

    We consider the least-square linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, is compared favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning rep...
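The core of the Bolasso procedure described above (fit the Lasso on bootstrap resamples of the data and intersect the selected supports) can be sketched as follows. The coordinate-descent Lasso solver and all parameter choices here are illustrative, not the paper's implementation:

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Lasso via cyclic coordinate descent on (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_sweeps):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]   # partial residual excluding j
            beta[j] = soft_threshold(X[:, j] @ r_j / n, lam) / col_sq[j]
    return beta

def bolasso(X, y, lam, n_boot=16, seed=0):
    """Intersect Lasso supports across bootstrap resamples (Bolasso selection)."""
    rng = np.random.default_rng(seed)
    support = None
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), size=len(y))   # bootstrap resample
        s = set(np.flatnonzero(np.abs(lasso_cd(X[idx], y[idx], lam)) > 1e-8))
        support = s if support is None else support & s
    return sorted(int(j) for j in support)
```

Irrelevant variables enter each bootstrap support only with some probability, so intersecting over replications drives them out while the true support survives, which is the consistency mechanism the abstract describes.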

  10. Consistency Relations for an Implicit n-dimensional Regularization Scheme

    CERN Document Server

    Scarpelli, A P B; Nemes, M C

    2001-01-01

    We extend an implicit regularization scheme to be applicable in the $n$-dimensional space-time. Within this scheme divergences involving parity violating objects can be consistently treated without resorting to dimensional continuation. Special attention is paid to differences between integrals of the same degree of divergence, typical of one loop calculations, which are in principle undetermined. We show how to use symmetries in order to fix these quantities consistently. We illustrate with examples in which regularization plays a delicate role in order to both corroborate and elucidate the results in the literature for the case of CPT violation in extended $QED_4$, topological mass generation in 3-dimensional gauge theories and the Schwinger Model and its chiral version.

  11. Collisional decoherence of tunneling molecules: a consistent histories treatment

    CERN Document Server

    Coles, Patrick J; Griffiths, Robert B

    2012-01-01

    The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\hbar \omega$ the separation in energy of the levels in the isolated molecule and $\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\gamma \gg \omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process. Analogous consistent families are constructed for $\gamma \sim \omega$ and for $\gamma < \omega$. In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.

  12. A New Heuristic for Feature Selection by Consistent Biclustering

    CERN Document Server

    Mucherino, Antonio

    2010-01-01

    Given a set of data, biclustering aims at finding simultaneous partitions in biclusters of its samples and of the features which are used for representing the samples. Consistent biclusterings allow one to obtain correct classifications of the samples from the known classification of the features, and vice versa, and they are very useful for performing supervised classifications. The problem of finding consistent biclusterings can be seen as a feature selection problem, where the features that are not relevant for classification purposes are removed from the set of data, while the total number of features is maximized in order to preserve information. This feature selection problem can be formulated as a linear fractional 0-1 optimization problem. We propose a reformulation of this problem as a bilevel optimization problem, and we present a heuristic algorithm for an efficient solution of the reformulated problem. Computational experiments show that the presented algorithm is able to find better solutions with re...

  13. Viscoelastic models with consistent hypoelasticity for fluids undergoing finite deformations

    Science.gov (United States)

    Altmeyer, Guillaume; Rouhaud, Emmanuelle; Panicaud, Benoit; Roos, Arjen; Kerner, Richard; Wang, Mingchuan

    2015-08-01

    Constitutive models of viscoelastic fluids are written with rate-form equations when considering finite deformations. Trying to extend the approach used to model these effects from an infinitesimal deformation to a finite transformation framework, one has to ensure that the tensors and their rates are indifferent with respect to the change of observer and to the superposition with rigid body motions. Frame-indifference problems can be solved with the use of an objective stress transport, but the choice of such an operator is not obvious and the use of certain transports usually leads to physically inconsistent formulation of hypoelasticity. The aim of this paper is to present a consistent formulation of hypoelasticity and to combine it with a viscosity model to construct a consistent viscoelastic model. In particular, the hypoelastic model is reversible.

  14. Sparse motion segmentation using multiple six-point consistencies

    CERN Document Server

    Zografos, Vasileios; Ellis, Liam

    2010-01-01

    We present a method for segmenting an arbitrary number of moving objects in image sequences using the geometry of 6 points in 2D to infer motion consistency. The method has been evaluated on the Hopkins 155 database and surpasses current state-of-the-art methods such as SSC, both in terms of overall performance on two and three motions but also in terms of maximum errors. The method works by finding initial clusters in the spatial domain, and then classifying each remaining point as belonging to the cluster that minimizes a motion consistency score. In contrast to most other motion segmentation methods that are based on an affine camera model, the proposed method is fully projective.

  15. Consistency and axiomatization of a natural extensional combinatory logic

    Institute of Scientific and Technical Information of China (English)

    蒋颖

    1996-01-01

    In the light of a question of J. L. Krivine about the consistency of an extensional λ-theory, an extensional combinatory logic ECL+U(G)+RU_∞+ is established, with its consistency proved model-theoretically, and it is shown that it is not equivalent to any system of universal axioms. It is expressed by the theory in first order logic that, for every given group G of order n, there simultaneously exist infinitely many universal retractions and a surjective n-tuple notion, such that each element of G acts as a permutation of the components of the n-tuple, and as an Ap-automorphism of the model; further, each of the universal retractions is invariant under the action of the Ap-automorphisms induced by G. The difference between this theory and that of Krivine is that the group G need not be a symmetric group.

  16. A minimal model of self-consistent partial synchrony

    Science.gov (United States)

    Clusella, Pau; Politi, Antonio; Rosenblum, Michael

    2016-09-01

    We show that self-consistent partial synchrony in globally coupled oscillatory ensembles is a general phenomenon. We analyze in detail appearance and stability properties of this state in possibly the simplest setup of a biharmonic Kuramoto-Daido phase model as well as demonstrate the effect in limit-cycle relaxational Rayleigh oscillators. Such a regime extends the notion of splay state from a uniform distribution of phases to an oscillating one. Suitable collective observables such as the Kuramoto order parameter allow detecting the presence of an inhomogeneous distribution. The characteristic and most peculiar property of self-consistent partial synchrony is the difference between the frequency of single units and that of the macroscopic field.
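The Kuramoto order parameter mentioned above, $R = |N^{-1}\sum_j e^{i\theta_j}|$, distinguishes splay, partially synchronous, and fully synchronous phase distributions. A minimal sketch on synthetic phase samples (not a simulation of the paper's biharmonic model; the von Mises distribution is just an illustrative unimodal choice):

```python
import numpy as np

def order_parameter(theta):
    """Kuramoto order parameter R = |mean(exp(i*theta))|."""
    return float(abs(np.exp(1j * np.asarray(theta)).mean()))

n = 1000
splay = 2.0 * np.pi * np.arange(n) / n                     # uniform splay state: R ~ 0
synced = np.full(n, 0.3)                                   # identical phases: R = 1
partial = np.random.default_rng(0).vonmises(0.0, 2.0, n)   # unimodal, partially synchronous
```

An inhomogeneous (partially synchronous) distribution yields an intermediate value 0 < R < 1, which is how such a collective observable detects the state.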

  17. Planck 2013 results. XXXI. Consistency of the Planck data

    DEFF Research Database (Denmark)

    Ade, P. A. R.; Arnaud, M.; Ashdown, M.

    2014-01-01

    by deviation of the ratio from unity) between 70 and 100 GHz power spectra averaged over 70 ≤ ℓ ≤ 390 at the 0.8% level, and agreement between 143 and 100 GHz power spectra of 0.4% over the same ℓ range. These values are within and consistent with the overall uncertainties in calibration given in the Planck 2013...... foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse..../100 ratio. Correcting for this, the 70, 100, and 143 GHz power spectra agree to 0.4% over the first two acoustic peaks. The likelihood analysis that produced the 2013 cosmological parameters incorporated uncertainties larger than this. We show explicitly that correction of the missing near sidelobe power

  18. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  19. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati

  20. Developing consistent Landsat data sets for large area applications: the MRLC 2001 protocol

    Science.gov (United States)

    Chander, G.; Huang, C.; Yang, L.; Homer, C.; Larson, C.

    2009-01-01

    One of the major efforts in large area land cover mapping over the last two decades was the completion of two U.S. National Land Cover Data sets (NLCD), developed with nominal 1992 and 2001 Landsat imagery under the auspices of the MultiResolution Land Characteristics (MRLC) Consortium. Following the successful generation of NLCD 1992, a second generation MRLC initiative was launched with two primary goals: (1) to develop a consistent Landsat imagery data set for the U.S. and (2) to develop a second generation National Land Cover Database (NLCD 2001). One of the key enhancements was the formulation of an image preprocessing protocol and implementation of a consistent image processing method. The core data set of the NLCD 2001 database consists of Landsat 7 Enhanced Thematic Mapper Plus (ETM+) images. This letter details the procedures for processing the original ETM+ images and more recent scenes added to the database. NLCD 2001 products include Anderson Level II land cover classes, percent tree canopy, and percent urban imperviousness at 30-m resolution derived from Landsat imagery. The products are freely available for download to the general public from the MRLC Consortium Web site at http://www.mrlc.gov.

  1. Self-Consistent Density Functional Including Long-Range van der Waals Interactions

    Science.gov (United States)

    Ferri, Nicola; Distasio, Robert A., Jr.; Car, Roberto; Scheffler, Matthias; Tkatchenko, Alexandre

    2013-03-01

    Van der Waals (vdW) interactions are significant for a wide variety of systems, from noble-gas dimers to organic/inorganic interfaces. The long-range vdW energy is a tiny fraction (0.001%) of the total energy, hence it is typically assumed not to change electronic properties. Although the vdW-DF functional includes the effect of vdW energy on electronic structure, the influence of ``true'' long-range vdW interactions is difficult to assess since a significant part of vdW-DF energy arises from short distances. Here, we present a self-consistent (SC) implementation of the long-range Tkatchenko-Scheffler (TS) functional, including its extension to surfaces. The analysis of self-consistency for rare-gas dimers allows us to reconcile two different views on vdW interactions: (i) Feynman's view, in which vdW binding changes the electron density, and (ii) the picture of atoms separated by an infinite barrier. In agreement with previous work, we find negligible contribution from self-consistency in the structure and stability of vdW-bound complexes. However, a closer look at organic/inorganic interfaces reveals notable modification of energy levels when using the SC-TS vdW density functional.

  2. Non-autonomous discrete Boussinesq equation: Solutions and consistency

    Science.gov (United States)

    Nong, Li-Juan; Zhang, Da-Juan

    2014-07-01

    A non-autonomous 3-component discrete Boussinesq equation is discussed. Its spacing parameters pn and qm depend on the independent variables n and m, respectively. We derive its bilinear form and solutions in Casoratian form. The plane wave factor is defined through the cubic roots of unity, and it also leads to an extended non-autonomous discrete Boussinesq equation containing a parameter δ. Three-dimensional consistency and the Lax pair of the obtained equation are discussed.

  3. An Algebraic Characterization of Inductive Soundness in Proof by Consistency

    Institute of Scientific and Technical Information of China (English)

    邵志清; 宋国新

    1995-01-01

    Kapur and Musser studied the theoretical basis for proof by consistency and obtained an inductive completeness result: p=q if and only if p=q is true in every inductive model. However, there is a loophole in their proof for the soundness part: p=q implies p=q is true in every inductive model. The aim of this paper is to give a correct characterization of inductive soundness from an algebraic view by introducing strong inductive models.

  4. Consistency analysis of a nonbirefringent Lorentz-violating planar model

    CERN Document Server

    Casana, Rodolfo; Moreira, Roemir P M

    2011-01-01

    In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, both affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor $\\kappa_{\\mu\

  5. Incomplete Lineage Sorting: Consistent Phylogeny Estimation From Multiple Loci

    CERN Document Server

    Mossel, Elchanan

    2008-01-01

    We introduce a simple algorithm for reconstructing phylogenies from multiple gene trees in the presence of incomplete lineage sorting, that is, when the topology of the gene trees may differ from that of the species tree. We show that our technique is statistically consistent under standard stochastic assumptions, that is, it returns the correct tree given sufficiently many unlinked loci. We also show that it can tolerate moderate estimation errors.
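The consistency claim above (the correct species tree is recovered given sufficiently many unlinked loci) can be illustrated for three taxa, where estimation reduces to a majority vote over rooted triplet topologies. The sketch below uses a toy generative model of incomplete lineage sorting that is entirely our own assumption, not the authors' algorithm; topology labels, function names, and probabilities are illustrative.

```python
import random
from collections import Counter

def majority_triplet(gene_triplets):
    """Majority vote over rooted triplet topologies across loci."""
    return Counter(gene_triplets).most_common(1)[0][0]

def simulate_gene_triplets(species_topology, n_loci, p_extra, rng):
    """Toy ILS model on three taxa: with probability p_extra a locus shows
    the species topology outright; otherwise one of the three topologies
    is drawn uniformly, so the species topology is merely the most likely."""
    topologies = ["AB|C", "AC|B", "BC|A"]
    draws = []
    for _ in range(n_loci):
        if rng.random() < p_extra:
            draws.append(species_topology)
        else:
            draws.append(rng.choice(topologies))
    return draws

rng = random.Random(42)
gene_trees = simulate_gene_triplets("AB|C", n_loci=500, p_extra=0.4, rng=rng)
estimate = majority_triplet(gene_trees)
```

With 500 loci the species topology dominates the vote with overwhelming probability, which is the statistical-consistency statement in miniature.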

  6. Consistent 4D Brain Extraction of Serial Brain MR Images

    OpenAIRE

    Wang, Yaping; Li, Gang; Nie, Jingxin; Yap, Pew-Thian; Guo, Lei; Shen, Dinggang

    2013-01-01

    Accurate and consistent skull stripping of serial brain MR images is of great importance in longitudinal studies that aim to detect subtle brain morphological changes. To avoid inconsistency and the potential bias introduced by independently performing skull-stripping for each time-point image, we propose an effective method that is capable of skull-stripping serial brain MR images simultaneously. Specifically, all serial images of the same subject are first affine aligned in a groupwise mann...

  7. Consistency argument and classification problem in λ-calculus

    Institute of Scientific and Technical Information of China (English)

    王驹; 赵希顺; 黄且圆; 蒋颖

    1999-01-01

    Enlightened by Mal’cev’s theorem in universal algebra, a new criterion for consistency arguments in λ-calculus is introduced. It is equivalent to those of Jacopini and Baeten-Boerboom, but more convenient to use. Based on the new criterion, an enhanced technique is used to show a few results that provide deeper insight into the classification problem for λ-terms with no normal forms.

  8. Security Policy:Consistency,Adjustments and Restraining Factors

    Institute of Scientific and Technical Information of China (English)

    Yang Jiemian

    2004-01-01

    In the 2004 U.S. presidential election, despite deeply divided domestic opinion and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. Based on the author's analysis, it is clear that security issues such as counter-terrorism and Iraq contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will remain basically consistent.

  9. Measuring Consistent Poverty in Ireland with EU SILC Data

    OpenAIRE

    Whelan, Christopher T.; Nolan, Brian; Maitre, Bertrand

    2006-01-01

    In this paper we seek to make use of the newly available Irish component of the European Union Statistics on Income and Living Conditions (EU-SILC) in order to develop a measure of consistent poverty that overcomes some of the difficulties associated with the original indicators employed as targets in the Irish National Anti-Poverty Strategy. Our analysis leads us to propose a set of economic strain indicators that cover a broader range than the original basic deprivation set. The accumulated...

  10. Personalities in great tits, Parus major: stability and consistency

    OpenAIRE

    Carere, C; Drent, PJ; Privitera, L; Koolhaas, JM; Groothuis, TGG; Drent, Piet J; Koolhaas, Jaap M.; Groothuis, Ton G.G.

    2005-01-01

    We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations involving exploratory and sociosexual behaviour. Exploratory behaviour was assessed both in the juvenile phase and in adulthood (2-3-year interval) by means of a novel object test and an open field...

  11. Noncommuting electric fields and algebraic consistency in noncommutative gauge theories

    Science.gov (United States)

    Banerjee, Rabin

    2003-05-01

    We show that noncommuting electric fields occur naturally in θ-expanded noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a Hamiltonian generalization of the Seiberg-Witten map, the algebraic consistency in the Lagrangian and Hamiltonian formulations of these theories is established. A comparison of results in different descriptions shows that this generalized map acts as a canonical transformation in the physical subspace only. Finally, we apply the Hamiltonian formulation to derive the gauge symmetries of the action.

  12. Identification of consistency in rating curve data: Bidirectional Reach (BReach)

    Science.gov (United States)

    Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.

    2016-04-01

    Before calculating rating curve discharges, it is crucial to identify possible interruptions in data consistency. In this research, a methodology to perform this preliminary analysis is developed and validated. This methodology, called Bidirectional Reach (BReach), evaluates in each data point the results of a rating curve model with randomly sampled parameter sets. The combination of a parameter set and a data point is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds the observational uncertainty. Moreover, a tolerance degree that defines satisfactory behavior of a sequence of model results is chosen; it equals the percentage of observations that are allowed to have non-acceptable model results. Subsequently, the results of this classification are used to assess the maximum left and right reach for each data point of a chronologically sorted time series. The maximum left and right reach of a gauging point represent the data points, in the direction of the preceding and the following observations respectively, beyond which none of the sampled parameter sets is both satisfactory and within an acceptable deviation. This analysis is repeated for a variety of tolerance degrees. Plotting the results for all data points and all tolerance degrees in a combined BReach plot enables the detection of changes in data consistency; if consistent periods are detected, their limits can be derived. The methodology is validated with various synthetic stage-discharge data sets and proves to be a robust technique for investigating the temporal consistency of rating curve data. It provides satisfying results despite low data availability, large errors in the estimated observational uncertainty, and a rating curve model that is known to cover only a limited part of the observations.
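The reach computation described above can be sketched in a few lines. Assuming a precomputed boolean table `acceptable[p][t]` (parameter set p acceptable at data point t) and the tolerance degree given as a fraction, the maximum left and right reaches per data point follow by scanning outward; the data layout and edge-case handling are our illustrative assumptions, not the authors' implementation.

```python
def breach_reaches(acceptable, tolerance):
    """For each data point t, find the farthest index toward the past (left)
    and future (right) up to which at least one sampled parameter set keeps
    its fraction of non-acceptable points within the tolerance degree."""
    n_sets = len(acceptable)
    n_pts = len(acceptable[0])
    left, right = [], []
    for t in range(n_pts):
        best_r, best_l = t, t
        for p in range(n_sets):
            bad, r = 0, t
            for j in range(t, n_pts):          # scan toward the future
                bad += not acceptable[p][j]
                if bad > tolerance * (j - t + 1):
                    break
                r = j
            best_r = max(best_r, r)
            bad, l = 0, t
            for j in range(t, -1, -1):         # scan toward the past
                bad += not acceptable[p][j]
                if bad > tolerance * (t - j + 1):
                    break
                l = j
            best_l = min(best_l, l)
        right.append(best_r)
        left.append(best_l)
    return left, right

# Two parameter sets: one fits the first half of the record, one the second.
acc = [[True, True, True, False, False, False],
       [False, False, False, True, True, True]]
left, right = breach_reaches(acc, 0.0)
```

For tolerance 0 the reaches stop at index 2/3, exposing the break in consistency between the two periods, which is exactly the pattern a BReach plot would reveal.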

  13. Consistent Algorithm for Multi-value Constraint with Continuous Variables

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Mature algorithms for the Constraint Satisfaction Problem (CSP) with binary constraints over discrete variables are already available for applications. For multi-value constraints with continuous variables, the approach is quite different and the difficulty increases considerably. This paper presents an algorithm for realizing global consistency of continuous variables, which can also be applied to multi-value constraints.
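For contrast with the continuous multi-value case, the "mature algorithms" for the discrete binary case include AC-3 arc consistency. The sketch below is a textbook AC-3, not the paper's algorithm; the constraint encoding (one predicate per ordered variable pair) is our own choice.

```python
from collections import deque

def ac3(domains, constraints):
    """AC-3 arc consistency for binary CSPs over discrete domains.
    `domains` maps variable -> set of values; `constraints` maps an
    ordered pair (x, y) -> predicate on (value_of_x, value_of_y)."""
    queue = deque(constraints.keys())
    while queue:
        x, y = queue.popleft()
        pred = constraints[(x, y)]
        revised = False
        for vx in set(domains[x]):
            # remove vx if no value of y supports it
            if not any(pred(vx, vy) for vy in domains[y]):
                domains[x].discard(vx)
                revised = True
        if revised:
            if not domains[x]:
                return False  # a domain wiped out: inconsistent
            for (a, b) in constraints:
                if b == x and a != y:
                    queue.append((a, b))  # re-check arcs into x
    return True

domains = {"X": {1, 2, 3}, "Y": {1, 2, 3}}
constraints = {("X", "Y"): lambda a, b: a < b,   # X < Y
               ("Y", "X"): lambda a, b: b < a}   # same constraint, other arc
ok = ac3(domains, constraints)
```

After propagation the domains shrink to X ∈ {1, 2} and Y ∈ {2, 3}: every remaining value has a support, which is the local-consistency property the continuous-variable algorithm generalizes.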

  14. Influence of Sensor Ingestion Timing on Consistency of Temperature Measures

    Science.gov (United States)

    2009-01-01

    Influence of Sensor Ingestion Timing on Consistency of Temperature Measures. Med. Sci. Sports Exerc., Vol. 41, No. 3, pp. 597–602, 2009. Purpose: The validity and reliability of using intestinal temperature (Tint) via ingestible temperature sensors (ITS) to measure core body temperature have been demonstrated. However

  15. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.;

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  16. GW method with the self-consistent Sternheimer equation

    OpenAIRE

    2010-01-01

    We propose a novel approach to quasiparticle GW calculations which does not require the computation of unoccupied electronic states. In our approach the screened Coulomb interaction is evaluated by solving self-consistent linear-response Sternheimer equations, and the noninteracting Green's function is evaluated by solving inhomogeneous linear systems. The frequency-dependence of the screened Coulomb interaction is explicitly taken into account. In order to avoid the singularities of the scre...

  17. Internal consistency reliability is a poor predictor of responsiveness

    Directory of Open Access Journals (Sweden)

    Heels-Ansdell Diane

    2005-05-01

    Full Text Available Background: Whether responsiveness represents a measurement property of health-related quality of life (HRQL) instruments that is distinct from reliability and validity is an issue of debate. We addressed the claims of a recent study, which suggested that investigators could rely on internal consistency to reflect instrument responsiveness. Methods: 516 patients with chronic obstructive pulmonary disease or knee injury participating in four longitudinal studies completed generic and disease-specific HRQL questionnaires before and after an intervention that impacted on HRQL. We used Pearson correlation coefficients and linear regression to assess the relationship between internal consistency reliability (expressed as Cronbach's alpha), instrument type (generic and disease-specific) and responsiveness (expressed as the standardised response mean, SRM). Results: Mean Cronbach's alpha was 0.83 (SD 0.08) and mean SRM was 0.59 (SD 0.33). The correlation between Cronbach's alpha and SRMs was 0.10 (95% CI -0.12 to 0.32) across all studies. Cronbach's alpha alone did not explain variability in SRMs (p = 0.59, r2 = 0.01) whereas the type of instrument was a strong predictor of the SRM (p = 0.012, r2 = 0.37). In multivariable models applied to individual studies Cronbach's alpha consistently failed to predict SRMs (regression coefficients between -0.45 and 1.58, p-values between 0.15 and 0.98) whereas the type of instrument did predict SRMs (regression coefficients between -0.25 and -0.59, p-values between ...). Conclusion: Investigators must look to data other than internal consistency reliability to select a responsive instrument for use as an outcome in clinical trials.
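The two quantities this study correlates are straightforward to compute. The sketch below uses the standard formulas for Cronbach's alpha and the standardised response mean; the tiny data set is invented purely for illustration.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of
    the total score), given one list of scores per item."""
    k = len(items)
    item_vars = [statistics.variance(col) for col in items]
    totals = [sum(vals) for vals in zip(*items)]
    return k / (k - 1) * (1 - sum(item_vars) / statistics.variance(totals))

def standardized_response_mean(before, after):
    """SRM: mean change score divided by the SD of the change scores."""
    changes = [a - b for b, a in zip(before, after)]
    return statistics.mean(changes) / statistics.stdev(changes)

# Hypothetical data: 3 questionnaire items scored by 4 patients,
# plus before/after totals for a responsiveness estimate.
items = [[2, 4, 6, 8], [1, 3, 5, 7], [2, 5, 6, 9]]
alpha = cronbach_alpha(items)
srm = standardized_response_mean([10, 12, 11, 13], [14, 15, 13, 18])
```

The study's point is precisely that a high `alpha` (internal consistency) tells you little about whether `srm` (responsiveness) will be large.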

  18. Consistent deniable lying : privacy in mobile social networks

    OpenAIRE

    Belle, Sebastian Kay; Waldvogel, Marcel

    2008-01-01

    Social networking is moving to mobile phones. This not only means continuous access, but also allows linking virtual and physical neighbourhoods in novel ways. To make such systems useful, personal data such as lists of friends and interests need to be shared with more, and frequently unknown, people, posing a risk to your privacy. In this paper, we present our approach to social networking, Consistent Deniable Lying (CDL). Using easy-to-understand mechanisms and tuned to this environment, i...

  19. On the scalar consistency relation away from slow roll

    CERN Document Server

    Sreenath, V; Sriramkumar, L

    2014-01-01

    As is well known, the non-Gaussianity parameter $f_{_{\\rm NL}}$, which is often used to characterize the amplitude of the scalar bi-spectrum, can be expressed completely in terms of the scalar spectral index $n_{\\rm s}$ in the squeezed limit, a relation that is referred to as the consistency condition. This relation, while it is largely discussed in the context of slow roll inflation, is actually expected to hold in any single field model of inflation, irrespective of the dynamics of the underlying model. In this work, we explicitly examine the validity of the consistency relation, analytically as well as numerically, away from slow roll. Analytically, we first arrive at the relation in the simple case of power law inflation. We also consider the non-trivial example of the Starobinsky model involving a linear potential with a sudden change in its slope (which leads to a brief period of fast roll), and establish the condition completely analytically. We then numerically examine the validity of the consistency ...

  20. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme is presented, called double topological relationship consistency (DCTR), which combines the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, overcoming many problems of traditional methods through its powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras are located in very different orientations. Epipolar geometry can also be recovered using RANSAC, by far the most widely adopted method. With this method, we can obtain correspondences with high precision in wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
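Since the abstract leans on RANSAC for discarding mismatches, the core loop is worth showing. The sketch below runs RANSAC on a deliberately simple model (a 2-D line fitted from 2-point samples, standing in for fundamental-matrix estimation, which would need an 8-point solver); the data and parameters are invented for illustration.

```python
import random

def ransac_line(points, n_iter, threshold, rng):
    """Repeatedly fit a line to a random minimal (2-point) sample and keep
    the model whose inlier set (residual < threshold) is largest."""
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # degenerate sample, skip
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

rng = random.Random(0)
points = [(x, 2 * x + 1) for x in range(20)]               # true correspondences on y = 2x + 1
points += [(3, 30), (7, -4), (11, 80), (15, 0), (18, 55)]  # gross mismatches (outliers)
(a, b), inliers = ransac_line(points, n_iter=50, threshold=0.5, rng=rng)
```

The outliers never accumulate enough support, so the consensus model is the true line and the five mismatches are discarded, which is the same robustness argument used for epipolar geometry.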

  1. Consistency Property of Finite FC-Normal Logic Programs

    Institute of Scientific and Technical Information of China (English)

    Yi-Song Wang; Ming-Yi Zhang; Yu-Ping Shen

    2007-01-01

    Marek's forward-chaining construction is one of the important techniques for investigating non-monotonic reasoning. By introducing a consistency property over a logic program, Marek et al. proposed a class of logic programs, FC-normal programs, each of which has at least one stable model. However, it is not clear how to choose an appropriate consistency property for deciding whether or not a logic program is FC-normal. In this paper, we first show that, for any finite logic program ∏, there exists a least consistency property LCon(∏) over ∏, which depends only on ∏ itself, such that ∏ is FC-normal if and only if ∏ is FC-normal with respect to (w.r.t.) LCon(∏). In fact, to determine the FC-normality of a logic program, it suffices to check the monotonic closed sets in LCon(∏) for all non-monotonic rules, that is, LFC(∏). Second, we present an algorithm for computing LFC(∏). Finally, we show that the brave and cautious reasoning tasks for FC-normal logic programs are of the same difficulty as those for normal logic programs.

  2. Context-specific metabolic networks are consistent with experiments.

    Directory of Open Access Journals (Sweden)

    Scott A Becker

    2008-05-01

    Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data is available.
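The quantitative inconsistency score mentioned above can be sketched compactly. GIMME's full formulation solves a linear program over the stoichiometric network; the fragment below only illustrates the scoring idea (penalise flux through reactions whose expression falls below the threshold, in proportion to the shortfall), with made-up reaction names, fluxes, and expression values.

```python
def gimme_inconsistency(expression, fluxes, threshold):
    """Sketch of the GIMME scoring idea: each reaction whose expression
    is below the threshold contributes (threshold - expression) * |flux|
    to the inconsistency score; well-expressed reactions contribute 0."""
    score = 0.0
    for rxn, flux in fluxes.items():
        shortfall = threshold - expression[rxn]
        if shortfall > 0:
            score += shortfall * abs(flux)
    return score

# Hypothetical 3-reaction pathway forced to carry flux 5.0 throughout;
# R2 is poorly expressed, so the required flux through it is penalised.
expression = {"R1": 12.0, "R2": 2.0, "R3": 9.0}
fluxes = {"R1": 5.0, "R2": 5.0, "R3": 5.0}
score = gimme_inconsistency(expression, fluxes, threshold=8.0)
```

A score of zero would mean the expression data are fully consistent with the presupposed metabolic objective; here the poorly expressed R2 contributes (8 − 2) × 5 = 30.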

  3. On the kernel and particle consistency in smoothed particle hydrodynamics

    CERN Document Server

    Sigalotti, Leonardo Di G; Rendón, Otto; Vargas, Carlos A; Peña-Polo, Franklin

    2016-01-01

    The problem of consistency of smoothed particle hydrodynamics (SPH) has demanded considerable attention in the past few years due to the ever increasing number of applications of the method in many areas of science and engineering. A loss of consistency leads to an inevitable loss of approximation accuracy. In this paper, we revisit the issue of SPH kernel and particle consistency and demonstrate that SPH has a limiting second-order convergence rate. Numerical experiments with suitably chosen test functions validate this conclusion. In particular, we find that when using the root mean square error as a model evaluation statistic, well-known corrective SPH schemes, which were thought to converge to second, or even higher, order are actually first-order accurate, or at best close to second order. We also find that observing the joint limit when $N\to\infty$, $h\to 0$, and $n\to\infty$, as was recently proposed by Zhu et al., where $N$ is the total number of particles, $h$ is the smoothing length, and $n$ is th...
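The second-order behaviour of the plain SPH kernel approximation is easy to reproduce numerically. The sketch below measures the interior RMSE of a 1-D SPH function approximation with a Gaussian kernel at two resolutions (keeping h/Δx fixed); halving h should cut the error by roughly a factor of four. The kernel, test function, and domain are our own assumptions, not the paper's test cases.

```python
import math

def sph_interpolate(f_vals, xs, x, h, dx):
    """Plain SPH estimate <f>(x) = sum_j f(x_j) W(x - x_j, h) dx with a
    normalised 1-D Gaussian kernel W(r, h) = exp(-(r/h)^2) / (h sqrt(pi))."""
    norm = 1.0 / (h * math.sqrt(math.pi))
    return sum(fj * norm * math.exp(-((x - xj) / h) ** 2) * dx
               for fj, xj in zip(f_vals, xs))

def interior_rmse(h_over_dx, dx):
    h = h_over_dx * dx
    n = round(1.0 / dx)
    xs = [i * dx for i in range(n + 1)]          # uniform particles on [0, 1]
    f_vals = [x * x for x in xs]                 # test function f(x) = x^2
    pts = [x for x in xs if 0.25 <= x <= 0.75]   # avoid boundary deficiency
    errs = [(sph_interpolate(f_vals, xs, x, h, dx) - x * x) ** 2 for x in pts]
    return math.sqrt(sum(errs) / len(errs))

e_coarse = interior_rmse(2.0, 0.02)  # h = 0.04
e_fine = interior_rmse(2.0, 0.01)    # h = 0.02
```

The error ratio lands near 4, i.e. the O(h²) smoothing error of the kernel estimate dominates, consistent with the limiting second-order rate the paper identifies.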

  4. Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control

    Directory of Open Access Journals (Sweden)

    Y.A. Ahmed

    2015-09-01

    In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. The consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution outputs. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge the networks’ real-time response for the Esso Osaka 3-m model ship. The networks’ behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.

  5. A dynamical mechanism for large volumes with consistent couplings

    Science.gov (United States)

    Abel, Steven

    2016-11-01

    A mechanism for addressing the "decompactification problem" is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries gives various consistency checks, and allow one to follow soft-terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is nett Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.

  6. The consistent histories approach to loop quantum cosmology

    Science.gov (United States)

    Craig, David A.

    2016-06-01

    We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on “measurements” to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories — as determined by the system’s decoherence functional — that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with scalar matter. We show how the theory may be used to make definite physical predictions in the absence of “observers”. As an application, we demonstrate how the theory predicts that loop quantum models “bounce” from large volume to large volume, while conventional “Wheeler-DeWitt”-quantized universes are invariably singular. We also briefly indicate the relation to other work.

  7. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    Energy Technology Data Exchange (ETDEWEB)

    Peplow, Douglas E. [ORNL; Mosher, Scott W [ORNL; Evans, Thomas M [ORNL

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
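CADIS derives its biased source and weight windows from a deterministic adjoint solution, which is beyond a short sketch; but the underlying principle (bias the sampling distribution while carrying a consistent statistical weight so the tally's expectation is preserved) can be shown with a one-dimensional integral estimate. The integrand and the biased density below are arbitrary stand-ins, not CADIS quantities.

```python
import math
import random

def tally(n, biased, rng):
    """Estimate I = integral of exp(-10 x) over [0, 1]. The biased source
    q(x) = 2(1 - x) favours small x, where the integrand is large; the
    weight w = p(x)/q(x) keeps the estimator unbiased (the 'consistent'
    part of consistent source biasing)."""
    total = 0.0
    for _ in range(n):
        if biased:
            u = rng.random()
            x = 1.0 - math.sqrt(1.0 - u)   # inverse CDF of q(x) = 2(1 - x)
            w = 1.0 / (2.0 * (1.0 - x))    # w = p/q with analog p(x) = 1
        else:
            x = rng.random()               # analog (unbiased) source
            w = 1.0
        total += w * math.exp(-10.0 * x)
    return total / n

exact = (1.0 - math.exp(-10.0)) / 10.0
analog = tally(100_000, biased=False, rng=random.Random(1))
weighted = tally(100_000, biased=True, rng=random.Random(2))
```

Both estimates agree with the exact value; the gain from biasing shows up as a lower variance for the same number of histories, which is what the figure-of-merit comparisons above quantify.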

  8. Mechanism of Consistent Gyrus Formation: an Experimental and Computational Study

    Science.gov (United States)

    Zhang, Tuo; Razavi, Mir Jalil; Li, Xiao; Chen, Hanbo; Liu, Tianming; Wang, Xianqiao

    2016-11-01

    As a significant type of cerebral cortical convolution pattern, the gyrus is widely preserved across species. Although many hypotheses have been proposed to study the underlying mechanisms of gyrus formation, it is currently still far from clear which factors contribute to the regulation of consistent gyrus formation. In this paper, we employ a joint analysis scheme of experimental data and computational modeling to investigate the fundamental mechanism of gyrus formation. Experimental data on mature human brains and fetal brains show that thicker cortices are consistently found in gyral regions and gyral cortices have higher growth rates. We hypothesize that gyral convolution patterns might stem from heterogeneous regional growth in the cortex. Our computational simulations show that gyral convex patterns may occur in locations where the cortical plate grows faster than the cortex of the brain. Global differential growth can only produce a random gyrification pattern, but it cannot guarantee gyrus formation at certain locations. Based on extensive computational modeling and simulations, it is suggested that a special area in the cerebral cortex with a relatively faster growth speed could consistently engender gyri.

  9. The Different Varieties of the Suyama-Yamaguchi Consistency Relation

    CERN Document Server

    Rodriguez, Yeinzon

    2013-01-01

    We present the different consistency relations that can be seen as variations of the well known Suyama-Yamaguchi (SY) consistency relation \\tau_{NL} \\geqslant (\\frac{6}{5} f_{NL})^2, the latter involving the levels of non-gaussianity f_{NL} and \\tau_{NL} in the primordial curvature perturbation \\zeta, both taken to be scale-invariant. We explicitly state under which conditions the SY consistency relation has been claimed to hold in its different varieties (implicitly) presented in the literature since its inception back in 2008; as a result, we show for the first time that the variety \\tau_{NL} ({\\bf k}_1, {\\bf k}_1) \\geqslant (\\frac{6}{5} f_{NL} ({\\bf k}_1))^2, which we call "the fifth variety", is always satisfied even when there is strong scale-dependence and high levels of statistical anisotropy, as long as statistical homogeneity holds: thus, an observed violation of this specific variety would prevent the comparison between theory and observation, thereby shaking the foundations of cosmology as a science.

  10. Temporal and contextual consistency of leadership in homing pigeon flocks.

    Directory of Open Access Journals (Sweden)

    Carlos D Santos

    Organized flight of homing pigeons (Columba livia) was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience; and flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but such consistency is affected by the heterogeneity of individual flight experiences and/or age. Similar evidence from other species suggests leadership as an important mechanism for coordinated motion in small groups of animals with strong social bonds.

  11. Dynamic self-consistent field theory for unentangled homopolymer fluids

    Science.gov (United States)

    Mihajlovic, Maja; Lo, Tak Shing; Shnidman, Yitzhak

    2005-10-01

    We present a lattice formulation of a dynamic self-consistent field (DSCF) theory that is capable of resolving interfacial structure, dynamics, and rheology in inhomogeneous, compressible melts and blends of unentangled homopolymer chains. The joint probability distribution of all the Kuhn segments in the fluid, interacting with adjacent segments and walls, is approximated by a product of one-body probabilities for free segments interacting solely with an external potential field that is determined self-consistently. The effect of flow on ideal chain conformations is modeled with finitely extensible, nonlinearly elastic dumbbells in the Peterlin approximation, and related to stepping probabilities in a random walk. Free segment and stepping probabilities generate statistical weights for chain conformations in a self-consistent field, and determine local volume fractions of chain segments. Flux balance across unit lattice cells yields mean field transport equations for the evolution of free segment probabilities and of momentum densities on the Kuhn length scale. Diffusive and viscous contributions to the fluxes arise from segmental hops modeled as a Markov process, with transition rates reflecting changes in segmental interaction, kinetic energy, and entropic contributions to the free energy under flow. We apply the DSCF equations to study both transient and steady-state interfacial structure, flow, and rheology in a sheared planar channel containing either a one-component melt or a phase-separated, two-component blend.

  12. One-particle-irreducible consistency relations for cosmological perturbations

    Science.gov (United States)

    Goldberger, Walter D.; Hui, Lam; Nicolis, Alberto

    2013-05-01

    We derive consistency relations for correlators of scalar cosmological perturbations that hold in the “squeezed limit” in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle-irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory and do not rely on approximate de Sitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at the horizon crossing). The consistency relations apply model-independently to cosmological scenarios in which the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow-roll limit, we verify that our consistency relations hold more generally, for instance, in ghost condensate models in flat space. We comment on possible extensions of our results to multifield models.

  13. Consistency of scalar potentials from quantum de Sitter space

    Science.gov (United States)

    Espinosa, José R.; Fortin, Jean-François; Trépanier, Maxime

    2016-06-01

    Consistency of the unconventional view of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom requires that Coleman-De Luccia tunneling rates to vacua with negative cosmological constant should be interpreted as recurrences to low-entropy states. This demand translates into two constraints, or consistency conditions, on the scalar potential that are generically as follows: (1) the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger and (2) the fourth root of the vacuum energy density of the de Sitter vacuum must be smaller than the fourth root of the typical scale of the scalar potential. These consistency conditions shed a different light on both outstanding hierarchy problems of the standard model of particle physics: the scale of electroweak symmetry breaking and the scale of the cosmological constant. Beyond the unconventional interpretation of quantum de Sitter space, we complete the analytic understanding of the thin-wall approximation of Coleman-De Luccia tunneling, extend its numerical analysis to generic potentials and discuss the role of gravity in stabilizing the standard model potential.

  14. Towards a self-consistent halo model for the nonlinear large-scale structure

    CERN Document Server

    Schmidt, Fabian

    2015-01-01

    The halo model is a theoretically and empirically well-motivated framework for predicting the statistics of the nonlinear matter distribution in the Universe. However, current incarnations of the halo model suffer from two major deficiencies: $(i)$ they do not enforce the stress-energy conservation of matter; $(ii)$ they are not guaranteed to recover exact perturbation theory results on large scales. Here, we provide a formulation of the halo model ("EHM") that remedies both drawbacks in a consistent way, while attempting to maintain the predictivity of the approach. In the formulation presented here, mass and momentum conservation are guaranteed, and results of perturbation theory and the effective field theory can in principle be matched to any desired order on large scales. We find that a key ingredient in the halo model power spectrum is the halo stochasticity covariance, which has been studied to a much lesser extent than other ingredients such as mass function, bias, and profiles of halos. As written he...

  15. General second order complete active space self-consistent-field solver for large-scale systems

    CERN Document Server

    Sun, Qiming

    2016-01-01

    One challenge for a complete active space self-consistent field (CASSCF) program is treating transition metal complexes, which are typically medium- or large-size molecular systems with large active spaces. We present an AO-driven second-order CASSCF solver that efficiently handles systems with a large number of AO functions and many active orbitals. This solver allows the user to replace the active-space Full CI solver with any multiconfigurational solver without breaking the quadratic convergence feature. We demonstrate the capability of the CASSCF solver with a study of the Fe(ii)-porphine ground state using the DMRG-CASSCF method for 22 electrons in 27 active orbitals and 3000 basis functions.

  16. ALK, the key gene for gelatinization temperature, is a modifier gene for gel consistency in rice.

    Science.gov (United States)

    Gao, Zhenyu; Zeng, Dali; Cheng, Fangmin; Tian, Zhixi; Guo, Longbiao; Su, Yan; Yan, Meixian; Jiang, Hua; Dong, Guojun; Huang, Yuchen; Han, Bin; Li, Jiayang; Qian, Qian

    2011-09-01

    Gelatinization temperature (GT) is an important parameter in evaluating the cooking and eating quality of rice, and the phenotype, biochemistry and inheritance of GT have been widely studied. Previous map-based cloning revealed that GT is controlled by the ALK gene, which encodes a putative soluble starch synthase II-3. Complementation and RNAi vectors were constructed and transformed into Nipponbare via Agrobacterium-mediated transformation. Phenotypic and molecular analyses of the transgenic lines provided direct evidence that ALK is a key gene for GT. Meanwhile, amylose content, gel consistency and pasting properties were also affected in the transgenic lines. Two of four nonsynonymous single nucleotide polymorphisms (SNPs) in the coding sequence of ALK were identified as essential for GT. Based on these SNPs, two new sets of SNP markers combined with one cleaved amplified polymorphic sequence marker were developed for application in rice quality breeding. © 2011 Institute of Botany, Chinese Academy of Sciences.

  17. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Institute of Scientific and Technical Information of China (English)

    Laura R. STEIN; Alison M. BELL

    2012-01-01

    There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45-52, 2012].

  18. Flood damage curves for consistent global risk assessments

    Science.gov (United States)

    de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek

    2016-04-01

    Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves which are based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to different methodologies employed for various damage models in different countries, damage assessments cannot be directly compared with each other, obstructing also supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth as well as maximum damage values for a variety of assets and land use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building material, etc. This dataset can be used for consistent supra
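    The depth-damage approach described here reduces to a simple calculation: interpolate a damage fraction from the curve at the observed water depth, then scale it by the country-specific maximum damage value. A minimal sketch with hypothetical curve points and a hypothetical maximum damage value (not values from the dataset):

    ```python
    import numpy as np

    # Hypothetical concave depth-damage curve: damage fraction vs. water depth (m).
    # Concave shape: damage rises steeply for the first metre, then flattens.
    DEPTHS = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 6.0])
    FRACTIONS = np.array([0.00, 0.25, 0.40, 0.60, 0.80, 1.00])

    def flood_damage(depth_m: float, max_damage: float) -> float:
        """Direct damage = interpolated damage fraction x maximum damage value."""
        frac = np.interp(depth_m, DEPTHS, FRACTIONS)  # clamps outside the curve range
        return float(frac * max_damage)

    # e.g. a residential asset with a 200,000 (currency-unit) maximum damage
    # value under 1.5 m of water:
    loss = flood_damage(1.5, 200_000.0)
    ```

    Because `np.interp` clamps to the curve's end points, depths beyond the last tabulated value simply return the maximum damage, which matches the usual convention for such curves.
    
    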

  19. Consistency and sealing of advanced bipolar tissue sealers

    Directory of Open Access Journals (Sweden)

    Chekan EG

    2015-04-01

    Full Text Available Edward G Chekan, Mark A Davison, David W Singleton, John Z Mennone, Piet Hinoul Ethicon, Inc., Cincinnati, OH, USA Objectives: The aim of this study was to evaluate two commonly used advanced bipolar devices (ENSEAL® G2 Tissue Sealers and LigaSure™ Blunt Tip) for compression uniformity, vessel sealing strength, and consistency in bench-top analyses. Methods: Compression analysis was performed with a foam pad/sensor apparatus inserted between the closed jaws of the instruments. Average pressures (psi) were recorded across the entire inside surface of the jaws, and over the distal one-third of the jaws. To test vessel sealing strength, ex vivo pig carotid arteries were sealed and transected, and the left and right (sealed) halves of the vessels were subjected to burst pressure testing. The maximum bursting pressures of the two halves of each vessel were averaged to obtain single data points for analysis. The absence or presence of tissue sticking to the device jaws was noted for each transected vessel. Results: Statistically higher average compression values were found for the ENSEAL® instruments (curved jaw and straight jaw) compared to LigaSure™, P<0.05. Moreover, the ENSEAL® devices retained full compression at the distal end of the jaws. Significantly higher and more consistent median burst pressures were noted for the ENSEAL® devices relative to LigaSure™ through 52 firings of each device (P<0.05). LigaSure™ showed a significant reduction in median burst pressure for the final three firings (cycles 50-52) versus the first three firings (cycles 1-3), P=0.027. Tissue sticking was noted for 1.39% and 13.3% of vessels transected with ENSEAL® and LigaSure™, respectively. Conclusion: In bench-top testing, ENSEAL® G2 sealers produced more uniform compression, stronger and more consistent vessel sealing, and reduced tissue sticking relative to LigaSure™. Keywords: ENSEAL, sealing, burst pressure, laparoscopic, compression, LigaSure

  20. Consistency of EEG source localization and connectivity estimates.

    Science.gov (United States)

    Mahjoory, Keyvan; Nikulin, Vadim V; Botrel, Loïc; Linkenkaer-Hansen, Klaus; Fato, Marco M; Haufe, Stefan

    2017-05-15

    As the EEG inverse problem does not have a unique solution, the sources reconstructed from EEG and their connectivity properties depend on forward and inverse modeling parameters such as the choice of an anatomical template and electrical model, prior assumptions on the sources, and further implementational details. In order to use source connectivity analysis as a reliable research tool, there is a need for stability across a wider range of standard estimation routines. Using resting state EEG recordings of N=65 participants acquired within two studies, we present the first comprehensive assessment of the consistency of EEG source localization and functional/effective connectivity metrics across two anatomical templates (ICBM152 and Colin27), three electrical models (BEM, FEM and spherical harmonics expansions), three inverse methods (WMNE, eLORETA and LCMV), and three software implementations (Brainstorm, Fieldtrip and our own toolbox). Source localizations were found to be more stable across reconstruction pipelines than subsequent estimations of functional connectivity, while effective connectivity estimates were the least consistent. All results were relatively unaffected by the choice of the electrical head model, while the choice of the inverse method and source imaging package induced considerable variability. In particular, a relatively strong difference was found between LCMV beamformer solutions on one hand and eLORETA/WMNE distributed inverse solutions on the other hand. We also observed a gradual decrease of consistency when results are compared between studies, within individual participants, and between individual participants. In order to provide reliable findings in the face of the observed variability, additional simulations involving interacting brain sources are required. Meanwhile, we encourage verification of the obtained results using more than one source imaging procedure. Copyright © 2017 Elsevier Inc. All rights reserved.
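    A consistency check of this kind can be sketched numerically: reconstruct the same recording with two pipelines and correlate the resulting source-power maps. Everything below is synthetic and purely illustrative; the pipeline names are placeholders for, e.g., eLORETA- vs. WMNE-based reconstructions:

    ```python
    import numpy as np

    # Synthetic "ground truth" source power over 100 cortical regions,
    # plus two noisy pipeline estimates of it (all values are made up).
    rng = np.random.default_rng(0)
    truth = rng.random(100)
    pipeline_a = truth + 0.05 * rng.random(100)  # e.g. eLORETA-style estimate
    pipeline_b = truth + 0.05 * rng.random(100)  # e.g. WMNE-style estimate

    # Consistency between pipelines: Pearson correlation of the two maps.
    consistency = np.corrcoef(pipeline_a, pipeline_b)[0, 1]
    ```

    With small pipeline-specific noise the correlation is close to 1; the study's point is that real estimation choices (inverse method, software package) push such agreement measures well below that, especially for connectivity metrics.
    
    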

  1. Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus

    Directory of Open Access Journals (Sweden)

    Laura R. STEIN, Alison M. BELL

    2012-02-01

    Full Text Available There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1: 45–52, 2012].

  2. Leo II PC

    Data.gov (United States)

    Kansas Data Access and Support Center — LEO II is a second-generation software system developed for use on the PC, which is designed to convert location references accurately between legal descriptions and...

  3. NNDSS - Table II. Vibriosis

    Data.gov (United States)

    U.S. Department of Health & Human Services — NNDSS - Table II. Vibriosis - 2017. In this Table, provisional cases of selected notifiable diseases (≥1,000 cases reported during the preceding year), and...

  4. Strongly luminescing ruthenium(II)/ruthenium(II) and ruthenium(II)/platinum(II) binuclear complexes

    Energy Technology Data Exchange (ETDEWEB)

    Sahai, R.; Baucom, D.A.; Rillema, D.P.

    1986-10-08

    Two strongly luminescing complexes, a ruthenium(II)/ruthenium(II) homobinuclear complex and a ruthenium(II)/platinum(II) heterobinuclear complex, have been prepared and characterized. The organic part of the complex is a 4,4'-dimethyl-2,2'-bipyridine dimer. The luminescence behavior of the homobinuclear and heterobinuclear complexes was found to be comparable to that of Ru(bpy)₃²⁺, although the luminescence maxima were shifted from 615 to 620 nm. These complexes exhibit good stability due to the bidentate chelating capability of the bridging ligand. These new complexes can provide the opportunity for detailed photophysical studies related to donor-acceptor interactions and to the possibility of two simultaneous single-electron transfer events. 17 references, 2 figures.

  5. Gamble II Facility

    Data.gov (United States)

    Federal Laboratory Consortium — FUNCTION: Gamble II produces a high-voltage (2 MV), high-current (1 MA), short (100 ns) pulse of energy of either positive or negative polarity. This terawatt power...

  6. Poor consistency in evaluating South African adults with neurogenic dysphagia.

    Science.gov (United States)

    Andrews, Mckinley; Pillay, Mershen

    2017-01-23

    Speech-language therapists are specifically trained in clinically evaluating swallowing in adults with acute stroke. Incidence of dysphagia following acute stroke is high in South Africa, and health implications can be fatal, making optimal management of this patient population crucial. However, despite training and guidelines for best practice in clinically evaluating swallowing in adults with acute stroke, there are low levels of consistency in these practice patterns. The aim was to explore the clinical practice activities of speech-language therapists in the clinical evaluation of swallowing in adults with acute stroke. Practice activities reviewed included the use and consistency of clinical components and resources utilised. Clinical components were the individual elements evaluated in the clinical evaluation of swallowing (e.g. lip seal, vocal quality, etc.). Methods: The questionnaire used in the study was replicated and adapted from a study increasing content- and criterion-related validity. A narrative literature review determined what practice patterns existed in the clinical evaluation of swallowing in adults. A pilot study was conducted to increase validity and reliability. Purposive sampling was used by sending a self-administered, electronic questionnaire to members of the South African Speech-Language-Hearing Association. Thirty-eight participants took part in the study. Descriptive statistics were used to analyse the data and the small qualitative component was subjected to textual analysis. There was high frequency of use of 41% of the clinical components in more than 90% of participants (n = 38). Less than 50% of participants frequently assessed sensory function and gag reflex and used pulse oximetry, cervical auscultation and indirect laryngoscopy. Approximately a third of participants showed high (30.8%), moderate (35.9%) and poor (33.3%) consistency of practice each. Nurses, food and liquids and medical consumables were used usually and

  7. Poor consistency in evaluating South African adults with neurogenic dysphagia

    Directory of Open Access Journals (Sweden)

    Mckinley Andrews

    2017-01-01

    Full Text Available Background: Speech-language therapists are specifically trained in clinically evaluating swallowing in adults with acute stroke. Incidence of dysphagia following acute stroke is high in South Africa, and health implications can be fatal, making optimal management of this patient population crucial. However, despite training and guidelines for best practice in clinically evaluating swallowing in adults with acute stroke, there are low levels of consistency in these practice patterns. Objective: The aim was to explore the clinical practice activities of speech-language therapists in the clinical evaluation of swallowing in adults with acute stroke. Practice activities reviewed included the use and consistency of clinical components and resources utilised. Clinical components were the individual elements evaluated in the clinical evaluation of swallowing (e.g. lip seal, vocal quality, etc.). Methods: The questionnaire used in the study was replicated and adapted from a study increasing content- and criterion-related validity. A narrative literature review determined what practice patterns existed in the clinical evaluation of swallowing in adults. A pilot study was conducted to increase validity and reliability. Purposive sampling was used by sending a self-administered, electronic questionnaire to members of the South African Speech-Language-Hearing Association. Thirty-eight participants took part in the study. Descriptive statistics were used to analyse the data and the small qualitative component was subjected to textual analysis. Results: There was high frequency of use of 41% of the clinical components in more than 90% of participants (n = 38). Less than 50% of participants frequently assessed sensory function and gag reflex and used pulse oximetry, cervical auscultation and indirect laryngoscopy. Approximately a third of participants showed high (30.8%), moderate (35.9%) and poor (33.3%) consistency of practice each. Nurses, food and liquids and

  8. A detailed self-consistent vertical Milky Way disc model

    Directory of Open Access Journals (Sweden)

    Gao S.

    2012-02-01

    Full Text Available We present a self-consistent vertical disc model of the thin and thick discs in the solar vicinity. The model is optimized to fit the local kinematics of main sequence stars by varying the star formation history and the dynamical heating function. The star formation history and the dynamical heating function are not uniquely determined by the local kinematics alone. For four different pairs of input functions we calculate star count predictions at high galactic latitude as a function of colour. The comparison with North Galactic Pole data of SDSS/SEGUE leads to significant constraints on the local star formation history.

  9. Testing the consistency between cosmological measurements of distance and age

    Directory of Open Access Journals (Sweden)

    Remya Nair

    2015-05-01

    Full Text Available We present a model-independent method to test the consistency between cosmological measurements of distance and age, assuming the distance duality relation. We use type Ia supernovae, baryon acoustic oscillations, and observational Hubble data to reconstruct the luminosity distance DL(z), the angle-averaged distance DV(z) and the Hubble rate H(z), using the Gaussian process regression technique. We obtain an estimate of the distance duality relation in the redshift range 0.1
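    For reference, the distance duality relation being assumed, and the standard definition of the angle-averaged BAO distance in terms of the angular diameter distance D_A(z) and the Hubble rate H(z), take the usual forms:

    ```latex
    \eta(z) \equiv \frac{D_L(z)}{(1+z)^2\,D_A(z)} = 1,
    \qquad
    D_V(z) = \left[ (1+z)^2 D_A^2(z)\,\frac{c\,z}{H(z)} \right]^{1/3}
    ```

    A Gaussian-process reconstruction of D_L(z), D_V(z) and H(z) then lets \eta(z) be estimated without committing to a specific cosmological model; a significant deviation from unity would indicate a violation of photon number conservation or of the metric-theory assumptions behind the duality relation.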

  10. Equations of Electromagnetic Self-Consistency in a Plasma

    Institute of Scientific and Technical Information of China (English)

    Evangelos Chaliasos

    2003-01-01

    The set of equations governing a system consisting of an electromagnetic field plus charges in it is obtained by varying the appropriate action. It is not assumed that the currents are given, which in fact leads to the Maxwell equations governing the fields. Nor is it assumed that the fields are given, which in fact would lead to the determination of the motions of the charges (the currents) through the Lorentz force. On the contrary, currents and fields are left free to interplay, and they can be found simultaneously from the equations obtained.
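    The standard action for charges plus field (Gaussian units; a textbook form, included only to make the variational setup concrete, not quoted from the paper) is:

    ```latex
    S = -\sum_a m_a c \int ds_a
        \;-\; \sum_a \frac{e_a}{c} \int A_\mu \, dx_a^\mu
        \;-\; \frac{1}{16\pi c} \int F_{\mu\nu} F^{\mu\nu} \, d^4x
    ```

    Varying A_\mu with the trajectories held fixed yields the inhomogeneous Maxwell equations \partial_\nu F^{\mu\nu} = (4\pi/c)\, j^\mu; varying the trajectories with the field held fixed yields the Lorentz force law m c\, \mathrm{d}u^\mu/\mathrm{d}s = (e/c)\, F^{\mu\nu} u_\nu. Varying both at once gives the coupled, simultaneous system the abstract describes.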

  11. Simplified Models for Dark Matter Face their Consistent Completions

    Energy Technology Data Exchange (ETDEWEB)

    Goncalves, Dorival [Pittsburgh U.; Machado, Pedro N. [Madrid, IFT; No, Jose Miguel [Sussex U.

    2016-11-14

    Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent ${SU(2)_{\\mathrm{L}} \\times U(1)_{\\mathrm{Y}}}$ gauge invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at $13$ TeV LHC.

  12. Simplified Models for Dark Matter Face their Consistent Completions

    CERN Document Server

    Goncalves, Dorival; No, Jose Miguel

    2016-01-01

    Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent ${SU(2)_{\\mathrm{L}} \\times U(1)_{\\mathrm{Y}}}$ gauge invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at $13$ TeV LHC.

  13. Exploiting N=2 in consistent coset reductions of type IIA

    CERN Document Server

    Cassani, Davide

    2009-01-01

    We study compactifications of type IIA supergravity on cosets exhibiting SU(3) structure. We establish the consistency of the truncation based on left-invariance, providing a justification for the choice of expansion forms which yields gauged N=2 supergravity in 4 dimensions. We explore N=1 solutions of these theories, emphasizing the requirements of flux quantization, as well as their non-supersymmetric companions. In particular, we obtain a no-go result for de Sitter solutions at string tree level, and, exploiting the enhanced leverage of the N=2 setup, provide a preliminary analysis of the existence of de Sitter vacua at all string loop orders.

  14. Radio data and synchrotron emission in consistent cosmic ray models

    CERN Document Server

    Bringmann, Torsten; Lineros, Roberto A

    2011-01-01

    We consider the propagation of electrons in phenomenological two-zone diffusion models compatible with cosmic-ray nuclear data and compute the diffuse synchrotron emission resulting from their interaction with galactic magnetic fields. We find models in agreement not only with cosmic ray data but also with radio surveys at essentially all frequencies. Requiring such a globally consistent description strongly disfavors both a very large (L>15 kpc) and small (L<1 kpc) effective size of the diffusive halo. This has profound implications for, e.g., indirect dark matter searches.

  15. Large-Scale Self-Consistent Nuclear Mass Calculations

    CERN Document Server

    Stoitsov, M V; Dobaczewski, J; Nazarewicz, W

    2006-01-01

    The program of systematic large-scale self-consistent nuclear mass calculations that is based on the nuclear density functional theory represents a rich scientific agenda that is closely aligned with the main research directions in modern nuclear structure and astrophysics, especially the radioactive nuclear beam physics. The quest for the microscopic understanding of the phenomenon of nuclear binding represents, in fact, a number of fundamental and crucial questions of the quantum many-body problem, including the proper treatment of correlations and dynamics in the presence of symmetry breaking. Recent advances and open problems in the field of nuclear mass calculations are presented and discussed.

  16. Designing apps for success developing consistent app design practices

    CERN Document Server

    David, Matthew

    2014-01-01

    In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: Apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently

  17. Consistency of FMEA used in the validation of analytical procedures

    DEFF Research Database (Denmark)

    Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten

    2011-01-01

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define their own ranking scales for the probability of severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and we identified the failure modes
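    The ranking step in an FMEA is straightforward to reproduce: each failure mode gets a Risk Priority Number RPN = S x O x D, and the two teams' resulting rankings can then be compared. A minimal sketch with hypothetical failure modes and scores (not data from the study):

    ```python
    def rpn(severity: int, occurrence: int, detection: int) -> int:
        """Risk Priority Number: product of the three FMEA scores."""
        return severity * occurrence * detection

    # Hypothetical HPLC failure modes scored by two independent teams,
    # each on its own freely chosen scales (as in the study).
    team_a = {"column degradation": (7, 4, 3), "wrong mobile phase": (8, 2, 2)}
    team_b = {"column degradation": (6, 5, 3), "wrong mobile phase": (9, 2, 2)}

    # Rank failure modes by RPN, highest risk first, for each team.
    rank_a = sorted(team_a, key=lambda fm: rpn(*team_a[fm]), reverse=True)
    rank_b = sorted(team_b, key=lambda fm: rpn(*team_b[fm]), reverse=True)
    consistent = rank_a == rank_b  # do the teams prioritize the same failure modes?
    ```

    Because the absolute RPNs depend on each team's self-defined scales, only the ordering of failure modes, not the raw numbers, is meaningfully comparable between teams.
    
    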

  18. Consistency of the IGS contribution to ITRF2008

    Science.gov (United States)

    Ferland, R.

    2009-04-01

    The reanalysis of historical IGS data is one important activity that has been undertaken for the contribution to the new ITRF2008. It includes data available from the IGS Data Centers going back to approximately 1994. The main objective of this exercise is to provide consistent products using the latest modeling and analysis tools. Nine Analysis Centers (ACs) have so far contributed weekly solutions and more are expected in the final reanalysis. The years 2000 to 2008 have been combined and contributed to ITRF2008. The procedure used for the combination of the reanalysis is very similar to the one used for the official weekly IGS combination products. Most of the ACs' weekly solutions include station coordinates, implicit apparent geocenter estimates and daily Earth rotation parameters, together with their fully populated covariance matrices. The ACs include between approximately 100 and 300 stations in their weekly solutions. The combined weekly solutions generally include about 400 stations as well as explicit apparent geocenter positions and daily ERPs. The combined solutions include complete covariance information and are aligned to the IGS realization of ITRF2005 (IGS05). For the period between 2000 and 2008, over 600 stations are available. The majority of stations are in North America and Europe. The ACs' station consistency (horizontal/vertical) with respect to: 1) IGS05 is about 2-3mm/8mm; 2) the IGS weekly solutions is about 1-2mm/3mm; and 3) the IGS cumulative solution is about 2mm/5mm. The consistency of the ACs' pole positions and rates approaches 0.02-0.03mas and 0.05mas/d, respectively. For weekly estimates of the apparent geocenter, the ACs' consistency is about 4mm in the X and Y components and 7mm in the Z component, while the combined estimates agree with the ITRF2005 origin at about 5mm. One important improvement of this contribution to ITRF is the use of absolute antenna phase centers. Previous contributions used only relative antenna phase

  19. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    We propose a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index. Our model reproduces various empirically observed properties of variance swap dynamics and enables volatility derivatives and options on the underlying index to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...

  20. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    2013-01-01

    We propose a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index. Our model reproduces various empirically observed properties of variance swap dynamics and enables volatility derivatives and options on the underlying index to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...
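
    As background to the forward variance swap rates the model is written on: the fair strike of a variance swap is the time average of the forward variance curve over the swap's life. A minimal sketch for a piecewise-constant curve follows; the numbers are illustrative and this is not the paper's affine Lévy specification.

    ```python
    import math

    def variance_swap_strike(boundaries, xi, T):
        """Fair variance swap strike K_var = (1/T) * integral_0^T xi(t) dt
        for a piecewise-constant forward variance curve:
        xi[i] holds on [boundaries[i], boundaries[i+1])."""
        total = 0.0
        for i, x in enumerate(xi):
            lo, hi = boundaries[i], min(boundaries[i + 1], T)
            if hi > lo:
                total += x * (hi - lo)       # accumulate variance over segment
            if boundaries[i + 1] >= T:
                break
        return total / T

    # Illustrative curve: 4% variance for the first half-year, 5% thereafter
    k_var = variance_swap_strike([0.0, 0.5, 1.0], [0.04, 0.05], 1.0)   # 0.045
    k_vol = math.sqrt(k_var)   # volatility-units strike, about 21.2%
    ```

    The one-year strike is the arithmetic average of the two segment variances because the segments have equal length; for a half-year swap only the first segment contributes.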

  1. Hydrodynamic self-consistent field theory for inhomogeneous polymer melts.

    Science.gov (United States)

    Hall, David M; Lookman, Turab; Fredrickson, Glenn H; Banerjee, Sanjoy

    2006-09-15

    We introduce a mesoscale technique for simulating the structure and rheology of block-copolymer melts and blends in hydrodynamic flows. The technique couples dynamic self-consistent field theory with continuum hydrodynamics and flow penalization to simulate polymeric fluid flows in channels of arbitrary geometry. We demonstrate the method by studying phase separation of an ABC triblock copolymer melt in a submicron channel with neutral wall wetting conditions. We find that surface wetting effects and shear effects compete, producing wall-perpendicular lamellae in the absence of flow and wall-parallel lamellae in cases where the shear rate exceeds some critical Weissenberg number.

  2. Self consistent modeling of accretion columns in accretion powered pulsars

    Science.gov (United States)

    Falkner, Sebastian; Schwarm, Fritz-Walter; Wolff, Michael Thomas; Becker, Peter A.; Wilms, Joern

    2016-04-01

    We combine three physical models to self-consistently derive the observed flux and pulse profiles of neutron stars' accretion columns. From the thermal and bulk Comptonization model by Becker & Wolff (2006) we obtain seed photon continua produced in the dense inner regions of the accretion column. In a thin outer layer these seed continua are imprinted with cyclotron resonant scattering features calculated using Monte Carlo simulations. The observed phase and energy dependent flux corresponding to these emission profiles is then calculated, taking relativistic light bending into account. We present simulated pulse profiles and the predicted dependency of the observable X-ray spectrum as a function of pulse phase.

  3. The Consistency Of High Attorney Of Papua In Corruption Investigation

    Directory of Open Access Journals (Sweden)

    Samsul Tamher

    2015-08-01

    This study aimed to determine the consistency of the High Attorney of Papua in corruption investigations and in efforts to recover state financial losses. The study is normative-juridical and empirical-juridical in type. The results showed that the corruption investigations of the High Attorney of Papua are not optimal, owing to political interference in cases involving local officials, so that the High Attorney's decisions are not in accordance with the rule of law. The High Attorney of Papua pursues the recovery of state financial losses through the State Auction Body under civil and criminal law.

  4. Consistent gravitino couplings in non-supersymmetric strings

    CERN Document Server

    Dudas, E A

    2001-01-01

    The massless spectrum of the ten dimensional USp(32) type I string has an N=1 supergravity multiplet coupled to non-supersymmetric matter. This raises the question of the consistency of the gravitino interactions. We resolve this apparent puzzle by arguing for the existence of a local supersymmetry which is non-linearly realised in the open sector. We determine the leading order low energy effective Lagrangian. Similar results are expected to apply to lower dimensional type I models where supergravity is coupled to non-supersymmetric branes.

  5. Island of stability for consistent deformations of Einstein's gravity.

    Science.gov (United States)

    Berkhahn, Felix; Dietrich, Dennis D; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin

    2012-03-30

    We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incorporate a background curvature induced self-stabilizing mechanism. Self-stabilization is essential in order to guarantee hyperbolic evolution in and unitarity of the covariantized theory, as well as the deformation's uniqueness. We show that the deformation's parameter space contains islands of absolute stability that are persistent through the entire cosmic evolution.

  6. Emergent Dynamics of a Thermodynamically Consistent Particle Model

    Science.gov (United States)

    Ha, Seung-Yeal; Ruggeri, Tommaso

    2017-03-01

    We present a thermodynamically consistent particle (TCP) model motivated by the theory of multi-temperature mixtures of fluids in the case of spatially homogeneous processes. The proposed model incorporates the Cucker-Smale (C-S) type flocking model as its isothermal approximation. However, it is more complex than the C-S model, because the mutual interactions are not only "mechanical" but are also affected by the "temperature effect", as individual particles may exhibit distinct internal energies. We develop a framework for asymptotic weak and strong flocking in the context of the proposed model.

  7. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Kokholm, Thomas

    We propose a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index. Our model reproduces various empirically observed properties of variance swap dynamics and enables volatility derivatives and options on the underlying index to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options...

  8. Mean-field theory and self-consistent dynamo modeling

    Energy Technology Data Exchange (ETDEWEB)

    Yoshizawa, Akira; Yokoi, Nobumitsu [Tokyo Univ. (Japan). Inst. of Industrial Science; Itoh, Sanae-I [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka [National Inst. for Fusion Science, Toki, Gifu (Japan)

    2001-12-01

    Mean-field dynamo theory is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as the geomagnetic field and sunspots, and on the occurrence of flow driven by magnetic fields in planetary and fusion phenomena. (author)

  9. Consistent Initial Conditions for the DNS of Compressible Turbulence

    Science.gov (United States)

    Ristorcelli, J. R.; Blaisdell, G. A.

    1996-01-01

    Relationships between diverse thermodynamic quantities appropriate to weakly compressible turbulence are derived. It is shown that for turbulence of a finite turbulent Mach number there is a finite element of compressibility. A methodology for generating initial conditions for the fluctuating pressure, density and dilatational velocity is given which is consistent with finite Mach number effects. Use of these initial conditions gives rise to a smooth development of the flow, in contrast to cases in which these fields are specified arbitrarily or set to zero. Comparisons of the effect of different types of initial conditions are made using direct numerical simulation of decaying isotropic turbulence.

  10. Mod II engine development

    Science.gov (United States)

    Karl, David W.

    1987-01-01

    The Mod II engine, a four-cylinder, automotive Stirling engine utilizing the Siemens-Rinia double-acting concept, was assembled and became operational in January 1986. This paper describes the Mod II engine, its first assembly, and the subsequent development work done on engine components up to the point that engine performance characterization testing took place. Performance data for the engine are included.

  11. DUMAND II status report

    Energy Technology Data Exchange (ETDEWEB)

    Aoki, T. (ICRR, University of Tokyo, Japan (JP)); Becker-Szendy, R.; Bosetti, P.; Boynton, P.E.; Bradner, H.; Camerini, U.; Clem, J.; Commichau, V.; Dau, D.; Dye, S.; Grieder, P.K.F.; Hayashino, T.; Hazen, E.; Jaworski, M.; Kitamura, T.; Kobayakawa, K.; Koske, P.; Learned, J.G.; Ley, C.; Lord, J.J.; March, R.; Matsuno, S.; Minkowski, P.; Mitsui, K.; O' Connor, D.; Ohashi, Y.; Okada, A.; Peterson, V.Z.; Rathlev, J.; Roberts, A.; Roos, C.E.; Sakuda, M.; Samm, D.; Stenger, V.J.; Tanaka, S.; Uehara, S.; Webster, M.; Wilkins, G.; Wilkes, R.J.; Yamaguchi, A.; Yamamoto, I.; Young, K.K. (University of Bern, Switzerland (CH) Boston University, (USA) University of Hawaii, (USA) University of Kiel, Germany (DE) Kobe University, Japan (JP) Kinki University, Japan (JP) Okayama Science University, Japan (JP) Scripps Institute of Oceanography, (USA) Tohoku University, Japan (JP) ICRR, University of tokyo, Japan (JP) NLHEP Tsukuba, Japan (JP) Vanderbilt University, (USA) University of Washington, (US

    1991-04-05

    The scientific goals, design, capabilities, and status of the DUMAND II detector system are described. In June, 1989, the High Energy Physics Advisory Panel recommended support for construction of DUMAND II to the U.S. Department of Energy. Funding began in 1990, and prototype development for various detector subsystems is under way. Current plans include deployment of the shore cable, junction box and three strings of optical detector modules in 1992, and expansion to the full 9-string configuration in 1993.

  12. DUMAND II status report

    Science.gov (United States)

    Aoki, T.; Becker-Szendy, R.; Bosetti, P.; Boynton, P. E.; Bradner, H.; Camerini, U.; Clem, J.; Commichau, V.; Dau, D.; Dye, S.; Grieder, P. K. F.; Hayashino, T.; Hazen, E.; Jaworski, M.; Kitamura, T.; Kobayakawa, K.; Koske, P.; Learned, J. G.; Ley, C.; Lord, J. J.; March, R.; Matsuno, S.; Minkowski, P.; Mitsui, K.; O'Connor, D.; Ohashi, Y.; Okada, A.; Peterson, V. Z.; Rathlev, J.; Roberts, A.; Roos, C. E.; Sakuda, M.; Samm, D.; Stenger, V. J.; Tanaka, S.; Uehara, S.; Webster, M.; Wilkins, G.; Wilkes, R. J.; Yamaguchi, A.; Yamamoto, I.; Young, K. K.

    1991-04-01

    The scientific goals, design, capabilities, and status of the DUMAND II detector system are described. In June, 1989, the High Energy Physics Advisory Panel recommended support for construction of DUMAND II to the U.S. Department of Energy. Funding began in 1990, and prototype development for various detector subsystems is under way. Current plans include deployment of the shore cable, junction box and three strings of optical detector modules in 1992, and expansion to the full 9-string configuration in 1993.

  13. Ecuaciones Diferenciales II

    OpenAIRE

    Mañas Baena, Manuel; Martínez Alonso, Luis

    2015-01-01

    This manual reviews various aspects of partial differential equations of use to physicists. It was prepared as lecture notes for the course Ecuaciones II in the 1993 curriculum of the Licenciatura de Física at the UCM. It currently covers about 75% of the course Métodos Matemáticos II of the Grado de Física at the UCM.

  14. ASTRID II satellit projekt

    DEFF Research Database (Denmark)

    Jørgensen, John Leif; Primdahl, Fritz

    1997-01-01

    The report describes the instruments developed for the Swedish micro-satellite "ASTRID II". Specifications of the two instruments realized under this contract, a Stellar Compass and a CSC magnetometer, are given, followed by a description of the project status and plan.

  15. A thermodynamically consistent model of the post-translational Kai circadian clock

    Science.gov (United States)

    Lubensky, David K.; ten Wolde, Pieter Rein

    2017-01-01

    The principal pacemaker of the circadian clock of the cyanobacterium S. elongatus is a protein phosphorylation cycle consisting of three proteins, KaiA, KaiB and KaiC. KaiC forms a homohexamer, with each monomer consisting of two domains, CI and CII. Both domains can bind and hydrolyze ATP, but only the CII domain can be phosphorylated, at two residues, in a well-defined sequence. While this system has been studied extensively, how the clock is driven thermodynamically has remained elusive. Inspired by recent experimental observations and building on ideas from previous mathematical models, we present a new, thermodynamically consistent, statistical-mechanical model of the clock. At its heart are two main ideas: i) ATP hydrolysis in the CI domain provides the thermodynamic driving force for the clock, switching KaiC between an active conformational state in which its phosphorylation level tends to rise and an inactive one in which it tends to fall; ii) phosphorylation of the CII domain provides the timer for the hydrolysis in the CI domain. The model also naturally explains how KaiA, by acting as a nucleotide exchange factor, can stimulate phosphorylation of KaiC, and how the differential affinity of KaiA for the different KaiC phosphoforms generates the characteristic temporal order of KaiC phosphorylation. As the phosphorylation level in the CII domain rises, the release of ADP from CI slows down, making the inactive conformational state of KaiC more stable. In the inactive state, KaiC binds KaiB, which not only stabilizes this state further, but also leads to the sequestration of KaiA, and hence to KaiC dephosphorylation. Using a dedicated kinetic Monte Carlo algorithm, which makes it possible to efficiently simulate this system consisting of more than a billion reactions, we show that the model can describe a wealth of experimental data. PMID:28296888
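
    The dedicated kinetic Monte Carlo algorithm in the paper is far more elaborate, but the underlying idea is the standard Gillespie stochastic simulation algorithm: draw an exponential waiting time from the total reaction propensity, then pick a reaction in proportion to its propensity. A sketch on a hypothetical two-state phosphorylation cycle (rates and molecule counts are illustrative, not the Kai system's):

    ```python
    import math
    import random

    def gillespie(rates, x0, t_end, seed=1):
        """Minimal Gillespie SSA for a toy cycle:
        U -> P (rate k_p per U molecule), P -> U (rate k_d per P molecule).
        Returns the final counts (U, P)."""
        random.seed(seed)
        k_p, k_d = rates
        U, P = x0
        t = 0.0
        while t < t_end:
            a1 = k_p * U            # propensity of phosphorylation
            a2 = k_d * P            # propensity of dephosphorylation
            a0 = a1 + a2
            if a0 == 0:
                break
            t += -math.log(random.random()) / a0   # exponential waiting time
            if random.random() * a0 < a1:          # choose reaction by propensity
                U, P = U - 1, P + 1
            else:
                U, P = U + 1, P - 1
        return U, P

    U, P = gillespie((1.0, 1.0), (100, 0), 50.0)
    ```

    With equal rates the cycle equilibrates near a 50/50 split while conserving the total number of molecules; the real model adds conformational states, nucleotide exchange, and KaiA/KaiB binding on top of this elementary machinery.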

  16. Investigating consistency of a pro-market perspective amongst conservationists

    Directory of Open Access Journals (Sweden)

    Libby Blanchard

    2016-01-01

    While biodiversity conservation has had a longstanding relationship with markets, the recent past has seen a proliferation of novel market-based instruments in conservation, such as payments for ecosystem services. Whilst a number of conservation organisations have aligned themselves with this 'neoliberal' shift, relatively few studies interrogate the extent to which this move resonates with the values held by conservation professionals. An earlier study of the views of conservationists participating in the 2011 Society for Conservation Biology conference found both supportive and critical perspectives on the use of markets in conservation (Sandbrook et al. 2013b). This paper investigates the consistency of the perspectives identified in the earlier study by applying the same Q methodology survey to a group of Cambridge, UK-based conservationists. While both studies reveal supportive and more sceptical perspectives on the use of markets in conservation, the pro-market perspective in each sample is nearly identical. This finding provides empirical confirmation of a growing body of research suggesting that a relatively consistent set of pro-market perspectives has permeated the thinking of decision makers and staff of conservation organisations. It also lends some support to the suggestion that a transnational conservation elite may be driving this uptake of market approaches.

  17. Cortical orofacial motor representation: effect of diet consistency.

    Science.gov (United States)

    Avivi-Arber, L; Lee, J C; Sessle, B J

    2010-10-01

    Jaw and tongue motor alterations may occur following changes in food consistency, but whether such changes are associated with re-organization of motor representations within the facial sensorimotor cortex is unclear. We used intracortical microstimulation (ICMS) and recordings of evoked electromyographic responses to determine jaw (anterior digastric) and tongue (genioglossus) motor representations within the histologically defined face primary motor cortex (face-M1) and adjacent somatosensory cortex (face-S1) of rats fed hard (N = 6) or soft (N = 6) diet for 2 to 3 weeks. ICMS evoked jaw and tongue responses from an extensive area within the face-M1 and a smaller area within the face-S1. A significant contralateral predominance was reflected in the number and latency of ICMS-evoked jaw responses (p < 0.05). No significant differences were found between the diet groups in jaw and tongue motor representations, suggesting that the rat's ability to adapt to changes in diet consistency may not be associated with significant neuroplasticity of sensorimotor cortex motor outputs.

  18. ECB MONETARY POLICY CONSISTENCY AND INTERBANK INTEREST RATES FORECASTS

    Directory of Open Access Journals (Sweden)

    GIOVANNI VERGA

    2011-03-01

    The European Central Bank has often declared that it has two main monetary policy tools: the official interest rate (Repo) and its communications to the public (above all the monthly President's Conferences). In this paper, an ECB reaction function formed by a system of two non-linear equations is employed to explain both the ECB's Repo and its communications, and to verify whether the two policy instruments are used consistently. The estimated system turns out to be particularly robust, and the consistency is confirmed. During the financial crisis, however, an index of money market risk must also enter the equations in order to keep the other parameters stable. By employing these two monetary policy tools as regressors, along with risk and liquidity, a good deal of the future changes in interbank interest rates can be explained. During the crisis such forecasts are much better than those obtained by applying the usual term structure theory.

  19. Are paleoclimate model ensembles consistent with the MARGO data synthesis?

    Directory of Open Access Journals (Sweden)

    J. C. Hargreaves

    2011-03-01

    We investigate the consistency of various ensembles of model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day; however, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.

  20. Are paleoclimate model ensembles consistent with the MARGO data synthesis?

    Directory of Open Access Journals (Sweden)

    J. C. Hargreaves

    2011-08-01

    We investigate the consistency of various ensembles of climate model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day. However, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.

  1. Measuring consistent masses for 25 Milky Way globular clusters

    Energy Technology Data Exchange (ETDEWEB)

    Kimmig, Brian; Seth, Anil; Ivans, Inese I.; Anderton, Tim; Gregersen, Dylan [Physics and Astronomy Department, University of Utah, SLC, UT 84112 (United States); Strader, Jay [Department of Physics and Astronomy, Michigan State University, East Lansing, MI 48824 (United States); Caldwell, Nelson [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States)

    2015-02-01

    We present central velocity dispersions, masses, mass-to-light ratios (M/Ls), and rotation strengths for 25 Galactic globular clusters (GCs). We derive radial velocities of 1951 stars in 12 GCs from single-order spectra taken with Hectochelle on the MMT telescope. To this sample we add an analysis of available archival data of individual stars. For the full set of data we fit King models to derive consistent dynamical parameters for the clusters. We find good agreement between single-mass King models and the observed radial dispersion profiles. The large, uniform sample of dynamical masses we derive enables us to examine trends of M/L with cluster mass and metallicity. The overall values of M/L and the trends with mass and metallicity are consistent with existing measurements from a large sample of M31 clusters. This includes a clear trend of increasing M/L with cluster mass and lower-than-expected M/Ls for the metal-rich clusters. We find no clear trend of increasing rotation with increasing cluster metallicity, as suggested in previous work.
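
    Dynamical masses of this kind relate a measured velocity dispersion and a characteristic radius to a mass. As a simplified stand-in for the paper's King-model fits, a common virial-type estimator is M ≈ η σ² r_h / G with η ≈ 7.5; the sketch below uses this estimator with purely illustrative inputs.

    ```python
    G = 4.301e-3  # gravitational constant in pc (km/s)^2 / M_sun

    def dynamical_mass(sigma_kms, r_h_pc, eta=7.5):
        """Virial-type cluster mass estimate M ~ eta * sigma^2 * r_h / G,
        where sigma is the velocity dispersion and r_h the half-light radius."""
        return eta * sigma_kms ** 2 * r_h_pc / G

    def mass_to_light(mass_msun, lum_lsun):
        """Dynamical mass-to-light ratio in solar units."""
        return mass_msun / lum_lsun

    # Illustrative cluster: sigma = 10 km/s, r_h = 3 pc, L = 2.5e5 L_sun
    M = dynamical_mass(10.0, 3.0)   # a few times 1e5 M_sun
    ml = mass_to_light(M, 2.5e5)
    ```

    A proper King-model fit instead matches the full radial dispersion profile, but the scaling M ∝ σ² r_h is the same physics that drives the trends of M/L with cluster mass discussed in the abstract.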

  2. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera

    2014-12-31

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
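
    The key point, that applying a transfer function to a neighborhood pdf differs from applying it to a pre-filtered (mean) value, can be shown in a toy example. A two-bin histogram stands in here for the paper's sparse 4D Gaussian-mixture pdfs; bins, weights, and the transfer function are all illustrative.

    ```python
    import numpy as np

    def apply_tf_to_pdf(pdf_bins, pdf_weights, tf):
        """Render a downsampled voxel from its intensity pdf:
        the value is E_pdf[tf(intensity)], i.e. a weighted sum of tf over the bins,
        not tf applied to a single filtered intensity."""
        return float(np.dot(pdf_weights, tf(pdf_bins)))

    # Toy neighborhood: half the fine voxels at intensity 0.2, half at 0.8
    bins = np.array([0.2, 0.8])
    weights = np.array([0.5, 0.5])
    tf = lambda x: (x > 0.5).astype(float)   # binary "iso-value" transfer function

    pdf_value = apply_tf_to_pdf(bins, weights, tf)     # half the mass maps to 1
    mean_value = tf(np.array([bins.mean()]))[0]        # tf(0.5) on the mean
    ```

    Standard down-sampling averages the intensities first and then applies the transfer function, which here misses the bright material entirely, while the pdf-based value stays consistent with the fine resolution.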

  3. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering.

    Science.gov (United States)

    Sicat, Ronell; Krüger, Jens; Möller, Torsten; Hadwiger, Markus

    2014-12-01

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.

  4. Enhanced data consistency of a portable gait measurement system

    Science.gov (United States)

    Lin, Hsien-I.; Chiang, Y. P.

    2013-11-01

    A gait measurement system is a useful tool for rehabilitation applications. Such a system is used to conduct gait experiments in large workplaces such as laboratories, where gait measurement equipment can be permanently installed. However, a gait measurement system should be portable if it is to be used in clinics or community centers for aged people. In a portable gait measurement system, the workspace is limited and landmarks on a subject may not be visible to the cameras during experiments. Thus, we propose a virtual-marker function to obtain positions of unseen landmarks for maintaining data consistency. This work develops a portable clinical gait measurement system consisting of lightweight motion capture devices, force plates, and a walkway assembled from plywood boards. We evaluated the portable clinical gait system with 11 normal subjects on three consecutive days in a limited experimental space. Results of gait analysis, based on the verification of within-day and between-day coefficients of multiple correlation, show that the proposed portable gait system is reliable.

  5. Effect of irradiation on Brazilian honeys' consistency and their acceptability

    Science.gov (United States)

    Matsuda, A. H.; Sabato, S. F.

    2004-09-01

    Contamination of bee products may occur during packing or even during the process of collection. Gamma irradiation was found to decrease the number of bacteria and fungi. However, little information is available on the effects of gamma irradiation on viscosity, which is an important property of honey. In this work the viscosity of two varieties of Brazilian honey was measured after irradiation at 5 and 10 kGy. The viscosity was measured at four temperatures (25°C, 30°C, 35°C and 40°C) for both samples and compared with the control and between the doses. The sensory evaluation was carried out for the parameters color, odor, taste and consistency, using a 9-point hedonic scale. All the data were treated with a statistical tool (Statistica 5.1, StatSoft, 1998). The viscosity was not impaired significantly by gamma irradiation at doses of 5 and 10 kGy (p > 0.05). The sensory evaluation of the parameters (taste and consistency) is presented. The taste for the Parana type indicated a significant difference among irradiation doses (p < 0.05). Organic honey presented a taste parameter for 10 kGy significantly lower than the control mean, but it did not differ significantly from the 5 kGy value.

  6. Consistency analysis of a nonbirefringent Lorentz-violating planar model

    Energy Technology Data Exchange (ETDEWEB)

    Casana, Rodolfo; Ferreira, Manoel M.; Moreira, Roemir P.M. [Universidade Federal do Maranhao (UFMA), Departamento de Fisica, Sao Luis, MA (Brazil)

    2012-07-15

    In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the analysis of the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor κ_μν. The propagator of the gauge field is explicitly evaluated and expressed in terms of linearly independent symmetric tensors, presenting only one physical mode. The same holds for the scalar propagator. A consistency analysis is performed based on the poles of the propagators. The isotropic parity-even sector is stable, causal, and unitary for 0 ≤ κ_00 < 1. On the other hand, the anisotropic sector is stable and unitary but in general noncausal. Finally, it is shown that this planar model interacting with a λ|φ|⁴-Higgs field supports compact-like vortex configurations. (orig.)

  7. Consistency analysis of a nonbirefringent Lorentz-violating planar model

    Science.gov (United States)

    Casana, Rodolfo; Ferreira, Manoel M.; Moreira, Roemir P. M.

    2012-07-01

    In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the analysis of the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor κ_μν. The propagator of the gauge field is explicitly evaluated and expressed in terms of linearly independent symmetric tensors, presenting only one physical mode. The same holds for the scalar propagator. A consistency analysis is performed based on the poles of the propagators. The isotropic parity-even sector is stable, causal, and unitary for 0 ≤ κ_00 < 1. On the other hand, the anisotropic sector is stable and unitary but in general noncausal. Finally, it is shown that this planar model interacting with a λ|φ|⁴-Higgs field supports compact-like vortex configurations.

  8. On consistent truncations in ${\\cal N}=2^*$ holography

    CERN Document Server

    Balasubramanian, Venkat

    2013-01-01

    Although the Pilch-Warner (PW) gravitational renormalization group flow [arXiv:hep-th/0004063] passes a number of important consistency checks to be identified as a holographic dual to a large-$N$ $SU(N)$ ${\cal N}=2^*$ supersymmetric gauge theory, it fails to reproduce the free energy of the theory on $S^4$, computed with localization techniques. This disagreement points to the existence of a larger dual gravitational consistent truncation, which in the gauge-theory flat-space limit reduces to a PW flow. Such a truncation was recently identified by Bobev-Elvang-Freedman-Pufu (BEFP) [arXiv:1311.1508]. Additional bulk scalars of the BEFP gravitational truncation might lead to destabilization of the finite-temperature deformed PW flows, and thus modify the low-temperature thermodynamics and hydrodynamics of ${\cal N}=2^*$ plasma. We compute the quasinormal spectrum of these bulk scalar fields in the thermal PW flows and demonstrate that these modes do not condense, as long as the masses of the ${\cal N}=2^*$ hyper...

  9. Testing consistency of general relativity with kinematic and dynamical probes

    CERN Document Server

    Duan, Xiao-Wei; Zhang, Tong-Jie

    2016-01-01

    In this work, we test consistency relations between a kinematic probe, the observational Hubble data, and a dynamical probe, the growth rate of cosmic large-scale structure, which should hold if general relativity is the correct theory of gravity on cosmological scales. Moreover, we summarize the development history of parametrization in such tests and improve upon it. Taking advantage of the Hubble parameter obtained from both parametric and non-parametric methods, we propose three equations and test two of them by means of two-dimensional parameterizations, including one using trigonometric functions that we propose. As a result, the consistency relations are found to be well satisfied at the $1\\sigma$ CL, and trigonometric functions turn out to be efficient tools in parameterizations. Furthermore, in order to confirm the validity of our test, we introduce a model of modified gravity, the DGP model, and compare the testing results in the cases of $\\Lambda$CDM, "DGP in GR", and the DGP model with mock data. It...

  10. Self-consistent field theory for obligatory coassembly

    Science.gov (United States)

    Voets, I. K.; Leermakers, F. A. M.

    2008-12-01

    We present a first-order model for obligatory coassembly of block copolymers via an associative driving force in a nonselective solvent, making use of the classical self-consistent field (SCF) theory. The key idea is to use a generic associative driving force to bring two polymer blocks together into the core of the micelle and to employ one block of the copolymer(s) to provide a classical stopping mechanism for micelle formation. The driving force is generated by assuming a negative value for the relevant short-range Flory-Huggins interaction parameter. Hence, the model may be adopted to study micellization via H bonding, acceptor-donor interactions, and electrostatic interactions. Here, we limit ourselves to systems that resemble experimental ones where the mechanism of coassembly is electrostatic attraction leading to charge compensation. The resulting micelles are termed complex coacervate core micelles (CCCMs). We show that the predictions are qualitatively consistent with a wide variety of experimentally observed phenomena, even though the model does not yet account for the charges explicitly. For example, it successfully mimics the effect of salt on CCCMs. In the absence of salt CCCMs are far more stable than in excess salt, where the driving force for self-assembly is screened. The main limitations of the SCF model are related to the occurrence of soluble complexes, i.e., soluble, charged particles that coexist with the CCCMs.
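The self-consistent field machinery underlying such calculations is an iterative fixed point: the potential felt by a segment depends on the local density, the density responds to the potential through Boltzmann weights, and the two are iterated with mixing until they agree. The toy 1-D lattice sketch below illustrates the loop only; the parameters (`chi`, `bulk`, a surface field at site 0) are hypothetical, and this is not the actual Scheutjens-Fleer box model used in the paper:

```python
import math

def scf_solve(n=10, chi=-0.8, bulk=0.1, ext_strength=-2.0,
              mix=0.2, tol=1e-12, max_iter=100000):
    """Toy self-consistent field loop on a 1-D lattice.

    The local potential is a mean-field term chi*phi plus an external
    surface field at site 0; the density responds via Boltzmann
    weights normalized to a fixed total amount; linear mixing is
    applied until the update is self-consistent to within tol."""
    ext = [ext_strength if i == 0 else 0.0 for i in range(n)]
    phi = [bulk] * n
    amount = bulk * n  # total amount of material, held fixed
    for _ in range(max_iter):
        # potential from current density profile
        u = [chi * p + e for p, e in zip(phi, ext)]
        # density response: normalized Boltzmann weights
        w = [math.exp(-ui) for ui in u]
        z = sum(w)
        new = [amount * wi / z for wi in w]
        delta = max(abs(a - b) for a, b in zip(new, phi))
        # simple linear mixing for stability
        phi = [mix * a + (1 - mix) * b for a, b in zip(new, phi)]
        if delta < tol:
            return phi
    raise RuntimeError("SCF loop did not converge")
```

With an attractive surface field, the converged profile shows the expected density enhancement at site 0; the mixing parameter trades convergence speed against stability, exactly as in production SCF codes.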

  11. Consistency of FMEA used in the validation of analytical procedures.

    Science.gov (United States)

    Oldenhof, M T; van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Vredenbregt, M J; Weda, M; Barends, D M

    2011-02-20

    In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection-Mass Spectrometry (HPLC-DAD-MS) analytical procedure used in the quality control of medicines. Each team was free to define its own ranking scales for the severity (S), occurrence (O), and detection (D) of failure modes. We calculated Risk Priority Numbers (RPNs) and identified the failure modes above the 90th percentile of RPN values as failure modes needing urgent corrective action; failure modes falling between the 75th and 90th percentiles of RPN values were identified as failure modes needing necessary corrective action. Team 1 and Team 2 identified five and six failure modes needing urgent corrective action, respectively, with two being commonly identified. Of the failure modes needing necessary corrective action, about a third were commonly identified by both teams. These results show inconsistency in the outcome of the FMEA. To improve consistency, we recommend that an FMEA always be carried out under the supervision of an experienced FMEA facilitator and that the FMEA team have at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating that this inconsistency is not always a drawback.
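The screening rule described above (RPN = S × O × D, with cutoffs at the 90th and 75th percentiles of the RPN distribution) is easy to sketch. The failure modes and scores below are hypothetical illustrations, not data from the study, and the interpolated-percentile convention is one plausible reading of the paper:

```python
def percentile(values, p):
    """Linearly interpolated percentile (0 <= p <= 100)."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    f = int(k)
    c = min(f + 1, len(xs) - 1)
    return xs[f] + (xs[c] - xs[f]) * (k - f)

def classify_failure_modes(scores):
    """scores: {failure mode: (S, O, D)}.

    Returns {failure mode: (action class, RPN)}, flagging modes above
    the 90th percentile as urgent and those between the 75th and 90th
    percentiles as needing necessary corrective action."""
    rpn = {name: s * o * d for name, (s, o, d) in scores.items()}
    p90 = percentile(list(rpn.values()), 90)
    p75 = percentile(list(rpn.values()), 75)
    out = {}
    for name, value in rpn.items():
        if value > p90:
            out[name] = ("urgent", value)
        elif value >= p75:
            out[name] = ("necessary", value)
        else:
            out[name] = ("no action", value)
    return out

# Hypothetical failure modes for an HPLC-type procedure
ranking = classify_failure_modes({
    "wrong mobile phase": (9, 2, 3),   # RPN 54
    "column degradation": (6, 5, 4),   # RPN 120
    "detector drift":     (4, 6, 2),   # RPN 48
    "sample carryover":   (7, 3, 5),   # RPN 105
    "integration error":  (3, 4, 3),   # RPN 36
})
```

Because the cutoffs are percentiles of each team's own RPN distribution, two teams with different S/O/D scales can still be compared on which modes they flag, which is exactly the comparison the study makes.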

  12. [Promotion of the systematization of consistent education for medical technologists].

    Science.gov (United States)

    Shiba, Kiyoko; Sato, Kenji

    2006-03-01

    Although only about 35 years have passed since the birth of medical technology, marked advances have been made in the clinical laboratory science field. For a long period, however, the educational system for technologists emphasized only the learning of techniques, because medical technologist education was provided primarily by special training schools. With the passing of time, the need for advanced knowledge increased, and a plan to move medical technologist education to 4-year colleges was evaluated. In 1989, the Course of Laboratory Sciences, a 4-year program of medical technologist education, was established in the Department of Medicine, Tokyo Medical & Dental University. The Doctoral Course of the Graduate School (first term) was established in 1993 and the Doctoral Course of the Graduate School (second term) in 1995. In 2001, these courses were reorganized into the Division of Biomedical Laboratory Sciences of the Graduate School of Allied Health Sciences. Thus, a consistent educational system for medical technologists was established. By March 2005, about 500 students had graduated from this division. Based on this experience, we produced a 4-stage developmental program and provide an advanced educational system for the promotion of the systematization of consistent medical technologist education.

  13. High accuracy and visibility-consistent dense multiview stereo.

    Science.gov (United States)

    Vu, Hoang-Hiep; Labatut, Patrick; Pons, Jean-Philippe; Keriven, Renaud

    2012-05-01

    Since the initial comparison of Seitz et al., the accuracy of dense multiview stereovision methods has been increasing steadily. A number of limitations, however, make most of these methods unsuitable for outdoor scenes taken under uncontrolled imaging conditions. The present work is a complete dense multiview stereo pipeline that circumvents these limitations and is able to handle large-scale scenes without sacrificing accuracy. Highly detailed reconstructions are produced in very reasonable time thanks to two key stages in our pipeline: a minimum s-t cut optimization over an adaptive domain that robustly and efficiently filters a quasidense point cloud from outliers and reconstructs an initial surface by integrating visibility constraints, followed by a mesh-based variational refinement that captures small details, smartly handling photo-consistency, regularization, and adaptive resolution. The pipeline has been tested over a wide range of scenes: from classic compact objects taken in a laboratory setting, to outdoor architectural scenes, landscapes, and cultural heritage sites. The accuracy of its reconstructions has also been measured on the dense multiview benchmark proposed by Strecha et al., showing the results to compare more than favorably with the current state-of-the-art methods.

  14. Consistency and sealing of advanced bipolar tissue sealers.

    Science.gov (United States)

    Chekan, Edward G; Davison, Mark A; Singleton, David W; Mennone, John Z; Hinoul, Piet

    2015-01-01

    The aim of this study was to evaluate two commonly used advanced bipolar devices (ENSEAL(®) G2 Tissue Sealers and LigaSure™ Blunt Tip) for compression uniformity, vessel sealing strength, and consistency in bench-top analyses. Compression analysis was performed with a foam pad/sensor apparatus inserted between the closed jaws of the instruments. Average pressures (psi) were recorded across the entire inside surface of the jaws, and over the distal one-third of the jaws. To test vessel sealing strength, ex vivo pig carotid arteries were sealed and transected, and the left and right (sealed) halves of the vessels were subjected to burst pressure testing. The maximum bursting pressures of the two halves of each vessel were averaged to obtain single data points for analysis. The absence or presence of tissue sticking to the device jaws was noted for each transected vessel. Statistically higher average compression values were found for ENSEAL(®) instruments (curved jaw and straight jaw) compared to LigaSure™. In bench-top testing, ENSEAL(®) G2 sealers produced more uniform compression, stronger and more consistent vessel sealing, and reduced tissue sticking relative to LigaSure™.

  15. Towards a fully size-consistent method of increments

    CERN Document Server

    Fertitta, E; Paulus, B; Barcza, G; Legeza, Ö

    2016-01-01

    The method of increments (MoI) allows one to calculate cohesive energies of bulk materials with high accuracy, but it encounters difficulties when calculating whole dissociation curves. The reason is that its standard formalism is based on a single Hartree-Fock (HF) configuration whose orbitals are localized and used for the many-body expansion. Therefore, in those situations where HF does not allow a size-consistent description of the dissociation, the MoI cannot yield proper results either. Herein we address the problem by employing a size-consistent multiconfigurational reference for the MoI formalism. This leads to a matrix equation where a coupling derived from the reference itself is employed. In principle, such an approach allows one to evaluate approximate values for the ground- as well as excited-state energies. While the latter are accurate close to the avoided crossing only, the ground-state results are very promising for the whole dissociation curve, as shown by the comparison with density...

  16. Self consistent tight binding model for dissociable water

    Science.gov (United States)

    Lin, You; Wynveen, Aaron; Halley, J. W.; Curtiss, L. A.; Redfern, P. C.

    2012-05-01

    We report results of the development of a self-consistent tight binding model for water. The model explicitly describes the electrons of the liquid self-consistently, allows dissociation of the water, and permits fast direct-dynamics molecular dynamics calculations of the fluid properties. It is parameterized by fitting to first-principles calculations on water monomers, dimers, and trimers. We report calculated radial distribution functions of the bulk liquid, a phase diagram, and the structure of solvated protons within the model, as well as the ac conductivity of a system of 96 water molecules of which one is dissociated. Structural properties and the phase diagram are in good agreement with experiment and first-principles calculations. The estimated DC conductivity of a computational sample containing a dissociated water molecule was an order of magnitude larger than that reported from experiment, though the calculated ratio of proton to hydroxyl contributions to the conductivity is very close to the experimental value. The conductivity results suggest a Grotthuss-like mechanism for the proton component of the conductivity.

  17. Inconsistent handers show higher psychopathy than consistent handers.

    Science.gov (United States)

    Shobe, Elizabeth; Desimone, Kailey

    2016-01-01

    Three hundred and forty-two university students completed the Short Dark Triad (SD3) and the Edinburgh Handedness Inventory (EHI). Inconsistent handers showed higher psychopathy scores than consistent handers, and no handedness differences were observed for narcissism or Machiavellianism. Participants were further subdivided by quartile into low, moderately low, moderately high, and high psychopathy groups (non-clinical). Absolute EHI scores were equally distributed among the low and moderate groups, but were significantly lower for the high psychopathy group. These findings suggest that inconsistent handedness is only associated with the upper quartile of psychopathy scores. Also, males showed significantly higher psychopathy scores than females, and the ratio of male to female inconsistent handers decreased as psychopathy score increased. The absence of a gender × handedness interaction indicated that both female and male inconsistent handers have higher psychopathy scores than consistent handers. Although significant, the effects were small, and 99.6% of participants were not in the range of a potential clinical diagnosis. The reader, therefore, is strongly cautioned against equating inconsistent handedness with psychopathy.
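The quartile subdivision used above (low / moderately low / moderately high / high groups by score) can be sketched as follows; the group labels follow the abstract, while the interpolated-quartile convention is an assumption, since the paper does not specify one:

```python
def quartile_groups(scores):
    """Label each score by its quartile in the sample, mirroring the
    low / moderately low / moderately high / high subdivision."""
    xs = sorted(scores)
    n = len(xs)

    def pct(p):
        # linearly interpolated quantile, 0 <= p <= 1
        k = (n - 1) * p
        f = int(k)
        c = min(f + 1, n - 1)
        return xs[f] + (xs[c] - xs[f]) * (k - f)

    q1, q2, q3 = pct(0.25), pct(0.5), pct(0.75)
    labels = []
    for s in scores:
        if s <= q1:
            labels.append("low")
        elif s <= q2:
            labels.append("moderately low")
        elif s <= q3:
            labels.append("moderately high")
        else:
            labels.append("high")
    return labels
```

Group-wise comparisons (e.g. of absolute EHI scores) then reduce to aggregating over these labels.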

  18. Self-consistent conversion of a viscous fluid to particles

    Science.gov (United States)

    Molnar, Denes; Wolff, Zack

    2017-02-01

    Comparison of hydrodynamic and "hybrid" hydrodynamics+transport calculations with heavy-ion data inevitably requires the conversion of the fluid to particles. For dissipative fluids the conversion is ambiguous without additional theory input complementing hydrodynamics. We obtain self-consistent shear viscous phase-space corrections from linearized Boltzmann transport theory for a gas of hadrons. These corrections depend on the particle species, and incorporating them in Cooper-Frye freeze-out affects identified particle observables. For example, with additive quark model cross sections, proton elliptic flow is larger than pion elliptic flow at moderately high pT in Au+Au collisions at the BNL Relativistic Heavy Ion Collider. This is in contrast to Cooper-Frye freeze-out with the commonly used "democratic Grad" ansatz that assumes no species dependence. Various analytic and numerical results are also presented for massless and massive two-component mixtures to better elucidate how species dependence arises. For convenient inclusion in pure hydrodynamic and hybrid calculations, Appendix G contains self-consistent viscous corrections for each species both in tabulated and parametrized form.

  19. Consistent linguistic fuzzy preference relations method with ranking fuzzy numbers

    Science.gov (United States)

    Ridzuan, Siti Amnah Mohd; Mohamad, Daud; Kamis, Nor Hanimah

    2014-12-01

    Multi-Criteria Decision Making (MCDM) methods have been developed to help decision makers select the best criteria or alternatives from the options given. One of the well-known methods in MCDM is the Consistent Fuzzy Preference Relation (CFPR) method, which essentially utilizes a pairwise comparison approach. This method was later improved to cater for subjectivity in the data by using fuzzy sets, and is known as the Consistent Linguistic Fuzzy Preference Relations (CLFPR) method. The CLFPR method uses the additive transitivity property in the evaluation of pairwise comparison matrices. However, the calculation involved is lengthy and cumbersome. To overcome this problem, a method of defuzzification was introduced by researchers. Nevertheless, the defuzzification process has a major setback: some information may be lost due to the simplification. In this paper, we propose a method of CLFPR that preserves the fuzzy-number form throughout the process. In obtaining the desired ordering result, a method of ranking fuzzy numbers is utilized in the procedure. This improved CLFPR procedure is applied to a case study to verify its effectiveness. The method is useful for solving decision-making problems and can be applied to many areas of application.
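Additive transitivity, the consistency property the CFPR/CLFPR family relies on, states that p_ik = p_ij + p_jk - 0.5 for an additive preference relation, so a full n × n matrix can be completed from just the n - 1 adjacent comparisons. A minimal crisp-valued sketch (the paper's contribution keeps the fuzzy-number form throughout, which this toy omits):

```python
def complete_preference_matrix(adjacent):
    """Complete an n x n additive preference relation from the n-1
    adjacent comparisons p[i][i+1], using additive transitivity
        p[i][k] = p[i][j] + p[j][k] - 0.5
    and the reciprocity property p[k][i] = 1 - p[i][k]."""
    n = len(adjacent) + 1
    p = [[0.5] * n for _ in range(n)]  # self-comparisons are 0.5
    for i, v in enumerate(adjacent):
        p[i][i + 1] = v
        p[i + 1][i] = 1.0 - v
    # fill in increasingly distant pairs by chaining through i+1
    for gap in range(2, n):
        for i in range(n - gap):
            k = i + gap
            p[i][k] = p[i][i + 1] + p[i + 1][k] - 0.5
            p[k][i] = 1.0 - p[i][k]
    return p

m = complete_preference_matrix([0.7, 0.6])
# p[0][2] = 0.7 + 0.6 - 0.5 = 0.8
```

Completed entries can fall outside [0, 1]; CFPR-style methods then apply a normalizing transformation, which this sketch omits.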

  20. Plant functional traits have globally consistent effects on competition.

    Science.gov (United States)

    Kunstler, Georges; Falster, Daniel; Coomes, David A; Hui, Francis; Kooyman, Robert M; Laughlin, Daniel C; Poorter, Lourens; Vanderwel, Mark; Vieilledent, Ghislain; Wright, S Joseph; Aiba, Masahiro; Baraloto, Christopher; Caspersen, John; Cornelissen, J Hans C; Gourlet-Fleury, Sylvie; Hanewinkel, Marc; Herault, Bruno; Kattge, Jens; Kurokawa, Hiroko; Onoda, Yusuke; Peñuelas, Josep; Poorter, Hendrik; Uriarte, Maria; Richardson, Sarah; Ruiz-Benito, Paloma; Sun, I-Fang; Ståhl, Göran; Swenson, Nathan G; Thompson, Jill; Westerlund, Bertil; Wirth, Christian; Zavala, Miguel A; Zeng, Hongcheng; Zimmerman, Jess K; Zimmermann, Niklaus E; Westoby, Mark

    2016-01-14

    Phenotypic traits and their associated trade-offs have been shown to have globally consistent effects on individual plant physiological functions, but how these effects scale up to influence competition, a key driver of community assembly in terrestrial vegetation, has remained unclear. Here we use growth data from more than 3 million trees in over 140,000 plots across the world to show how three key functional traits--wood density, specific leaf area and maximum height--consistently influence competitive interactions. Fast maximum growth of a species was correlated negatively with its wood density in all biomes, and positively with its specific leaf area in most biomes. Low wood density was also correlated with a low ability to tolerate competition and a low competitive effect on neighbours, while high specific leaf area was correlated with a low competitive effect. Thus, traits generate trade-offs between performance with competition versus performance without competition, a fundamental ingredient in the classical hypothesis that the coexistence of plant species is enabled via differentiation in their successional strategies. Competition within species was stronger than between species, but an increase in trait dissimilarity between species had little influence in weakening competition. No benefit of dissimilarity was detected for specific leaf area or wood density, and only a weak benefit for maximum height. Our trait-based approach to modelling competition makes generalization possible across the forest ecosystems of the world and their highly diverse species composition.