WorldWideScience

Sample records for methods study consisted

  1. Measuring consistency in translation memories: a mixed-methods case study

    OpenAIRE

    Moorkens, Joss

    2012-01-01

    Introduced in the early 1990s, translation memory (TM) tools have since become widely used as an aid to human translation based on commonly‐held assumptions that they save time, reduce cost, and maximise consistency. The purpose of this research is twofold: it aims to develop a method for measuring consistency in TMs; and it aims to use this method to interrogate selected TMs from the localisation industry in order to find out whether the use of TM tools does, in fact, promote consistency in ...

  2. Self-consistent study of nuclei far from stability with the energy density method

    CERN Document Server

    Tondeur, F

    1981-01-01

    The self-consistent energy density method has been shown to give good results with a small number of parameters for the calculation of nuclear masses, radii, deformations, neutron skins, shell and sub-shell effects. It is here used to study the properties of nuclei far from stability, like densities, shell structure, even-odd mass differences, single-particle potentials and nuclear deformations. A few possible consequences of the results for astrophysical problems are briefly considered. The predictions of the model in the super-heavy region are summarised. (34 refs).

  3. Quasiparticle self-consistent GW method: a short summary

    International Nuclear Information System (INIS)

    Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios

    2007-01-01

    We have developed the quasiparticle self-consistent GW method (QSGW), a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated on the idea of a self-consistent perturbation: the non-interacting Green function G0, which is the starting point for the GW approximation (GWA) to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by the GWA. After self-consistency is attained, we have G0, W (the screened Coulomb interaction) and G self-consistently. This G0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We summarize some theoretical arguments to justify QSGW and then survey results obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV, and self-consistency including the off-diagonal part is required for NiO and MnO. Some disagreements with experiment remain; however, they are very systematic and can be explained by the neglect of excitonic effects.

  4. Using network screening methods to determine locations with specific safety issues: A design consistency case study.

    Science.gov (United States)

    Butsick, Andrew J; Wood, Jonathan S; Jovanis, Paul P

    2017-09-01

    The Highway Safety Manual provides multiple methods that can be used to identify sites with promise (SWiPs) for safety improvement. However, most of these methods cannot be used to identify sites with specific problems. Furthermore, given that infrastructure funding is often earmarked for specific problems/programs, a method for identifying SWiPs related to those programs would be very useful. This research establishes a method for identifying SWiPs with specific issues, accomplished using two safety performance functions (SPFs). The method is applied to identifying SWiPs with geometric design consistency issues. Mixed effects negative binomial regression was used to develop two SPFs using 5 years of crash data and over 8754 km of two-lane rural roadway. The first SPF contained typical roadway elements, while the second contained additional geometric design consistency parameters. After empirical Bayes adjustments, SWiPs were identified. The disparity between the SWiPs identified by the two SPFs was evident: 40 unique sites were identified by each model out of the top 220 segments. By comparing sites across the two models, candidate road segments can be identified where a lack of design consistency may be contributing to an increase in expected crashes. Practitioners can use this method to more effectively identify roadway segments suffering from reduced safety performance due to geometric design inconsistency, with detailed engineering studies of identified sites required to confirm the initial assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
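
    The empirical Bayes step the abstract relies on can be sketched as follows. This is a minimal illustration of the Highway Safety Manual form of the adjustment, not the authors' code; the function name and the overdispersion parameter k are ours.

```python
def eb_expected_crashes(n_predicted, n_observed, k):
    """Empirical Bayes blend of an SPF prediction with a site's observed
    crash count (HSM form). k is the negative binomial overdispersion
    parameter estimated alongside the SPF; larger k shifts weight toward
    the observed count."""
    w = 1.0 / (1.0 + k * n_predicted)  # weight on the SPF prediction
    return w * n_predicted + (1.0 - w) * n_observed

# A segment predicted to have 2 crashes that recorded 5:
eb = eb_expected_crashes(2.0, 5.0, 0.5)
```

    Ranking segments by this EB estimate under each of the two SPFs, and comparing the resulting top lists, reproduces the kind of disparity the study reports.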

  5. Self-consistent Bayesian analysis of space-time symmetry studies

    International Nuclear Information System (INIS)

    Davis, E.D.

    1996-01-01

    We introduce a Bayesian method for the analysis of epithermal neutron transmission data on space-time symmetries in which unique assignment of the prior is achieved by maximisation of the cross entropy and the imposition of a self-consistency criterion. Unlike the maximum likelihood method used in previous analyses of parity-violation data, our method is freed of an ad hoc cutoff parameter. Monte Carlo studies indicate that our self-consistent Bayesian analysis is superior to the maximum likelihood method when applied to the small data samples typical of symmetry studies. (orig.)

  6. Consistent forcing scheme in the cascaded lattice Boltzmann method.

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  8. Linear augmented plane wave method for self-consistent calculations

    International Nuclear Information System (INIS)

    Takeda, T.; Kuebler, J.

    1979-01-01

    O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)

  9. On Consistency Test Method of Expert Opinion in Ecological Security Assessment.

    Science.gov (United States)

    Gong, Zaiwu; Wang, Lihong

    2017-09-04

    Ecological safety assessment is of great value for proactive safety management and early warning in human security. In a comprehensive evaluation of regional ecological security with the participation of experts, each expert's individual judgment level and ability, and the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency measures and consensus measures based on the multiplicative and additive consistency properties of fuzzy preference relations (FPRs). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure computed as the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure computed as the distance between the original collective judgment and the optimal collective estimation. Finally, we present a case study on the ecological security of five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgments and the consensus degree of collective judgments.
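
    The distance-based measures described above can be illustrated with a small sketch. The construction below uses one common form of an additively consistent FPR built from a priority vector; it is an illustrative stand-in, not the paper's optimization models.

```python
import numpy as np

def additive_consistent_fpr(w):
    """Additively consistent fuzzy preference relation from a priority
    vector w, using the common form p_ij = 0.5 + (w_i - w_j); this
    satisfies the additive consistency condition p_ij = p_ik - p_jk + 0.5."""
    w = np.asarray(w, dtype=float)
    return 0.5 + (w[:, None] - w[None, :])

def consistency_distance(P, P_opt):
    """Mean absolute deviation between an original FPR P and a consistent
    estimate P_opt, taken over the upper triangle (since p_ij + p_ji = 1,
    the upper triangle carries all the information)."""
    iu = np.triu_indices(P.shape[0], k=1)
    return float(np.abs(P[iu] - P_opt[iu]).mean())

P_opt = additive_consistent_fpr([0.3, 0.2, 0.1])
P = P_opt.copy()
P[0, 1] += 0.1; P[1, 0] -= 0.1  # perturb one pairwise judgment
d = consistency_distance(P, P_opt)
```

    A consensus measure follows the same pattern, with the collective judgment in place of the individual one.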

  10. An eigenvalue approach to quantum plasmonics based on a self-consistent hydrodynamics method.

    Science.gov (United States)

    Ding, Kun; Chan, C T

    2018-02-28

    Plasmonics has attracted much attention not only because it has useful properties, such as strong field enhancement, but also because it reveals the quantum nature of matter. To handle quantum plasmonic effects, ab initio packages or empirical Feibelman d-parameters have been used to explore the quantum corrections of plasmonic resonances. However, most of these methods are formulated within the quasi-static framework. The self-consistent hydrodynamics model offers a reliable approach to studying quantum plasmonics because it can incorporate the quantum effects of the electron gas into classical electrodynamics in a consistent manner. Instead of the standard scattering method, we formulate the self-consistent hydrodynamics method as an eigenvalue problem to study quantum plasmonics with electrons and photons treated on the same footing. We find that the eigenvalue approach must involve a global operator, which originates from the energy functional of the electron gas. This manifests the intrinsic nonlocality of the response of quantum plasmonic resonances. Our model gives the analytical forms of quantum corrections to plasmonic modes, incorporating quantum electron spill-out effects and electrodynamic retardation. We apply our method to study the quantum surface plasmon polariton for a single flat interface.

  11. An algebraic method for constructing stable and consistent autoregressive filters

    International Nuclear Information System (INIS)

    Harlim, John; Hong, Hoon; Robbins, Jacob L.

    2015-01-01

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods; it takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristic decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces improved short-time predictions relative to linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
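
    The "stable" in the abstract is the classical AR stability condition, which is easy to check numerically. The helper below is our illustration; the paper's contribution, obtaining the coefficients algebraically from long-time statistics, is not reproduced here.

```python
import numpy as np

def is_stable_ar(coeffs):
    """Stability check for an AR(p) model
    x_t = a_1 x_{t-1} + ... + a_p x_{t-p} + noise:
    stable iff every root of z^p - a_1 z^{p-1} - ... - a_p
    lies strictly inside the unit circle."""
    poly = np.concatenate(([1.0], -np.asarray(coeffs, dtype=float)))
    return bool(np.all(np.abs(np.roots(poly)) < 1.0))
```

    For example, is_stable_ar([0.5, 0.3]) holds, while an AR(1) model with coefficient 1.1 fails the check.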

  12. A Benchmark Estimate for the Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2001-01-01

    There are alternative methods to estimate a capital stock for a benchmark year. These methods, however, do not allow for an independent check, which could establish whether the estimated benchmark level is too high or too low. I propose here an optimal consistency method (OCM), which may allow estimating a capital stock level for a benchmark year and/or checking the consistency of alternative estimates of a benchmark capital stock.

  13. Self-consistent studies of magnetic thin film Ni (001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations

  14. Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.

    Science.gov (United States)

    Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth

    2017-02-01

    Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
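
    The uncorrected statistic in question, the squared Euclidean distance between k-mer frequency vectors, can be sketched in a few lines (DNA alphabet assumed; the paper's model-based distance corrections are not reproduced here):

```python
from collections import Counter
from itertools import product

def kmer_vector(seq, k=2, alphabet="ACGT"):
    """Relative frequencies of all |alphabet|**k possible k-mers in seq."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return [counts["".join(km)] / total for km in product(alphabet, repeat=k)]

def sq_euclidean(u, v):
    """Squared Euclidean distance between two k-mer frequency vectors --
    the quantity shown to be a statistically inconsistent proxy for the
    tree metric unless corrected."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

d = sq_euclidean(kmer_vector("ACGTACGTAC"), kmer_vector("ACGTTCGTAC"))
```

    Identical sequences give distance zero; the inconsistency the paper proves concerns how this distance relates to the underlying tree metric as sequences evolve.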

  15. Bootstrap embedding: An internally consistent fragment-based method

    Energy Technology Data Exchange (ETDEWEB)

    Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy [Department of Chemistry, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States)

    2016-08-21

    Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments “embedded” in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed “Bootstrap Embedding,” a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.

  16. Self-consistent collective coordinate method for large amplitude collective motions

    International Nuclear Information System (INIS)

    Sakata, F.; Hashimoto, Y.; Marumori, T.; Une, T.

    1982-01-01

    A recent development of the self-consistent collective coordinate method is described. The self-consistent collective coordinate method was proposed on the basis of the fundamental principle called the invariance principle of the Schroedinger equation. If this is formulated within the framework of time-dependent Hartree-Fock (TDHF) theory, a classical version of the theory is obtained. A quantum version of the theory is deduced by formulating it within the framework of the unitary transformation method with auxiliary bosons. In this report, the discussion concentrates on the relation between the classical theory and the quantum theory, and on the applicability of the classical theory. The aim of the classical theory is to extract a maximally decoupled collective subspace out of the huge-dimensional 1p-1h parameter space introduced by TDHF theory. An intimate similarity between the classical theory and the full quantum boson expansion method (BEM) was clarified. The discussion concentrated on a simple Lipkin model. A relation between the BEM and the unitary transformation method with auxiliary bosons was then discussed. It became clear that the quantum version of the theory is strongly related to the BEM, and that the BEM is nothing but a quantum analogue of the present classical theory. The present theory was compared with a full TDHF calculation using a simple model. (Kato, T.)

  17. Homogenization of Periodic Masonry Using Self-Consistent Scheme and Finite Element Method

    Science.gov (United States)

    Kumar, Nitin; Lambadi, Harish; Pandey, Manoj; Rajagopal, Amirtham

    2016-01-01

    Masonry is a heterogeneous anisotropic continuum, made up of the brick and mortar arranged in a periodic manner. Obtaining the effective elastic stiffness of the masonry structures has been a challenging task. In this study, the homogenization theory for periodic media is implemented in a very generic manner to derive the anisotropic global behavior of the masonry, through rigorous application of the homogenization theory in one step and through a full three-dimensional behavior. We have considered the periodic Eshelby self-consistent method and the finite element method. Two representative unit cells that represent the microstructure of the masonry wall exactly are considered for calibration and numerical application of the theory.

  18. Method used to test the imaging consistency of binocular camera's left-right optical system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained based on a multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and the imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems does not exceed 5%, the design requirements are considered to have been achieved. This method is effective and paves the way for imaging consistency testing of binocular cameras.
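
    The metric and acceptance rule quoted above are simple to state in code. The sketch below assumes registered, same-size 8-bit grayscale frames; the function name is ours.

```python
import numpy as np

def imaging_consistency(left, right):
    """Per-pixel grayscale difference D(x, y) between the left and right
    frames, and the standard deviation sigma used as the consistency metric."""
    D = left.astype(float) - right.astype(float)
    return D, float(D.std())

# Two identical synthetic frames: a perfectly consistent pair.
left = np.full((64, 64), 120, dtype=np.uint8)
right = np.full((64, 64), 120, dtype=np.uint8)
D, sigma = imaging_consistency(left, right)

# Acceptance criterion from the abstract: 3*sigma within 5% of full scale.
passes = 3.0 * sigma <= 0.05 * 255.0
```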

  19. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  20. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    International Nuclear Information System (INIS)

    Kutepov, A. L.

    2017-01-01

    We present a code implementing the linearized self-consistent quasiparticle GW method (QSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary time representation in the same way as in the space time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N^3 scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.

  1. DFTB3: Extension of the self-consistent-charge density-functional tight-binding method (SCC-DFTB).

    Science.gov (United States)

    Gaus, Michael; Cui, Qiang; Elstner, Marcus

    2012-04-10

    The self-consistent-charge density-functional tight-binding method (SCC-DFTB) is an approximate quantum chemical method derived from density functional theory (DFT) based on a second-order expansion of the DFT total energy around a reference density. In the present study we combine earlier extensions and improve them consistently with, first, an improved Coulomb interaction between atomic partial charges, and second, the complete third-order expansion of the DFT total energy. These modifications lead us to the next generation of the DFTB methodology called DFTB3, which substantially improves the description of charged systems containing elements C, H, N, O, and P, especially regarding hydrogen binding energies and proton affinities. As a result, DFTB3 is particularly applicable to biomolecular systems. Remaining challenges and possible solutions are also briefly discussed.

  2. Self-consistent field variational cellular method as applied to the band structure calculation of sodium

    International Nuclear Information System (INIS)

    Lino, A.T.; Takahashi, E.K.; Leite, J.R.; Ferraz, A.C.

    1988-01-01

    The band structure of metallic sodium is calculated, using for the first time the self-consistent field variational cellular method. In order to implement self-consistency in the variational cellular theory, the crystal electronic charge density was calculated within the muffin-tin approximation. Comparison between our results and those derived from other calculations leads to the conclusion that the proposed self-consistent version of the variational cellular method is fast and accurate. (author)

  3. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Problem solving for physics concepts through consistency of argumentation can improve students' thinking skills and is important in science. This study aims to assess the consistency of students' argumentation about fluids. The population of the study comprises college students from PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews, with the fluid problems modified from [9] and [1]. On average, 4.85% of the arguments were correctly consistent, 29.93% were consistent but wrong, and 65.23% were inconsistent. These results point to a lack of understanding of the fluid material, since full consistency of argumentation would ideally accompany an expanded understanding of the concept. The results serve as a reference for improvements in future studies aimed at obtaining a positive change in the consistency of argumentation.

  4. The study of consistent properties of gelatinous shampoo with minoxidil

    Directory of Open Access Journals (Sweden)

    I. V. Gnitko

    2016-04-01

    The aim of the work is the study of the consistency properties of a gelatinous shampoo with minoxidil 1% for the complex therapy and prevention of alopecia. This shampoo with minoxidil was selected on the basis of complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistency properties of the gelatinous minoxidil 1% shampoo and its «mechanical stability» (1.70) describe the formulation as an exceptionally thixotropic composition capable of restoration after mechanical loads. This fact also makes it possible to predict the stability of the consistency properties during long storage. Conclusion. The factors of dynamic flowing for the foam detergent gel with minoxidil (Kd1=38.9%; Kd2=78.06%) quantitatively confirm a sufficient degree of distribution at the moment of spreading the composition on the skin surface of the hairy part of the head or during technological operations of manufacturing. The insignificant difference in «mechanical stability» between the gelatinous minoxidil 1% shampoo and its base indicates the absence of interactions between the active substance and the base.

  5. Consistency analysis of subspace identification methods based on a linear regression approach

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2001-01-01

    In the literature, results can be found which claim consistency for the subspace method under certain quite weak assumptions. Unfortunately, a new result gives a counterexample showing inconsistency under these assumptions, and then gives new, stricter sufficient assumptions which, however, do not include important model structures, e.g. Box-Jenkins. Based on a simple least squares approach, this paper shows the possible inconsistency under the weak assumptions and develops only slightly stricter assumptions which are sufficient for consistency and which include any model structure...

  6. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    Science.gov (United States)

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but there is a paucity of data on outcome definitions and the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe these two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis; studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify study covariates potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement between the statistical tests reported in the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the adjusted odds ratio (aOR) for the third group compared to the first was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. We therefore encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.

  7. A Study on the Consistency of Discretization Equation in Unsteady Heat Transfer Calculations

    Directory of Open Access Journals (Sweden)

    Wenhua Zhang

    2013-01-01

    The previous studies on the consistency of discretization equations mainly focused on the finite difference method, but the issue of consistency still remains, with several problems far from solved in actual numerical computation. For instance, a consistency problem arises in the numerical case where the boundary variables are solved explicitly while the variables away from the boundary are solved implicitly. And when the coefficients of the discretization equations in a nonlinear case are functions of the variables, calculating the coefficients explicitly and the variables implicitly may also give rise to consistency problems. Thus the present paper mainly investigates the consistency problems involved in the explicit treatment of the second and third boundary conditions and that of thermal conductivity as a function of temperature. The numerical results indicate that the consistency problem deserves more attention and should not be neglected in practical computation.

  8. Integrating the Toda Lattice with Self-Consistent Source via Inverse Scattering Method

    International Nuclear Information System (INIS)

    Urazboev, Gayrat

    2012-01-01

    In this work, it is shown that the solutions of the Toda lattice with a self-consistent source can be found by the inverse scattering method for the discrete Sturm-Liouville operator. For the problem considered, the one-soliton solution is obtained.

  9. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    Science.gov (United States)

    Kutepov, A. L.; Oudovenko, V. S.; Kotliar, G.

    2017-10-01

    We present a code implementing the linearized quasiparticle self-consistent GW method (LQSGW) in the LAPW basis. Our approach is based on a linearization of the self-energy around zero frequency, which distinguishes it from existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains from switching to the imaginary-time representation, in the same way as in the space-time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method. Program Files doi: http://dx.doi.org/10.17632/cpchkfty4w.1 Licensing provisions: GNU General Public License Programming language: Fortran 90 External routines/libraries: BLAS, LAPACK, MPI (optional) Nature of problem: A direct implementation of the GW method scales as N⁴ with the system size, which quickly becomes prohibitively time-consuming even on modern computers. Solution method: We implemented the GW approach using a method that switches between real-space and momentum-space representations. Some operations are faster in real space, whereas others are more computationally efficient in reciprocal space. This makes our approach scale as N³. Restrictions: The limiting factor is usually the memory available in a computer. Using 10 GB/core of memory allows us to study systems with up to 15 atoms per unit cell.

  10. Dosimetric Consistency of a Co-60 Teletherapy Unit: A Ten-Year Study.

    Science.gov (United States)

    Baba, Misba H; Mohib-Ul-Haq, M; Khan, Aijaz A

    2013-01-01

    The goal of radiation standards and dosimetry is to ensure that the output of a teletherapy unit is within ±2% of the stated value and that the outputs of treatment dose calculation methods are within ±5%. In the present paper, we studied the dosimetry of the Cobalt-60 (Co-60) teletherapy unit at the Sher-I-Kashmir Institute of Medical Sciences (SKIMS) over the last 10 years. Radioactivity is the phenomenon of disintegration of unstable nuclides called radionuclides. Among these radionuclides, Cobalt-60, incorporated in the telecobalt unit, is commonly used in the therapeutic treatment of cancer. Cobalt-60 is unstable and decays continuously into Ni-60 with a half-life of 5.27 years, resulting in a decrease in its activity and hence in its dose rate (output). It is therefore mandatory to measure the dose rate of the Cobalt-60 source regularly, so that the patient receives the same dose every time, as prescribed by the radiation oncologist. Underdosage may lead to unsatisfactory treatment of the cancer, and overdosage may cause radiation hazards. Our study examines the consistency between the actual output and the output obtained using the decay method. The methodology of the present study is the calculation of the actual dose rate of the Co-60 teletherapy unit by the two standard techniques used for external beam radiotherapy of various cancers, Source-to-Surface Distance (SSD) and Source-to-Axis Distance (SAD). A year-wise comparison was then made between the average actual dosimetric output (dose rate) and the average expected output values (obtained using the decay method for Co-60). The present study shows that the average output (dose rate) obtained by actual dosimetry is consistent with the expected output obtained using the decay method: the values obtained by actual dosimetry are within ±2% of the expected values. The results thus obtained in a year wise comparison of average output by actual dosimetry done regularly as a part of
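    The decay method mentioned above can be sketched in a few lines: the expected output falls by a factor of two every half-life (5.27 years for Co-60). The initial dose rate below is an invented figure for illustration only.

    ```python
    HALF_LIFE_YEARS = 5.27  # Co-60 half-life, as stated in the abstract

    def expected_output(initial_output, elapsed_years):
        """Expected dose rate after elapsed_years, by the radioactive decay law."""
        return initial_output * 0.5 ** (elapsed_years / HALF_LIFE_YEARS)

    d0 = 200.0  # hypothetical initial output in cGy/min, not a measured value
    print(expected_output(d0, 5.27))  # after one half-life: 100.0
    print(expected_output(d0, 1.0))   # about 87.7% of d0 remains after one year
    ```

    The consistency check described in the study then amounts to verifying that the measured output stays within ±2% of this computed expectation, year by year.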

  11. Simplified DFT methods for consistent structures and energies of large systems

    Science.gov (United States)

    Caldeweyher, Eike; Gerit Brandenburg, Jan

    2018-05-01

    Kohn–Sham density functional theory (DFT) is routinely used for the fast electronic structure computation of large systems and will most likely continue to be the method of choice for the generation of reliable geometries in the foreseeable future. Here, we present a hierarchy of simplified DFT methods designed for consistent structures and non-covalent interactions of large systems, with particular focus on molecular crystals. The covered methods are a minimal basis set Hartree–Fock (HF-3c), a small basis set screened exchange hybrid functional (HSE-3c), and a generalized gradient approximated functional evaluated in a medium-sized basis set (B97-3c), all augmented with semi-classical correction potentials. We give an overview of the methods' design, a comprehensive evaluation on established benchmark sets for geometries and lattice energies of molecular crystals, and highlight some realistic applications to large organic crystals with several hundred atoms in the primitive unit cell.

  12. An Economical Approach to Estimate a Benchmark Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2003-01-01

    There are alternative methods of estimating capital stock for a benchmark year. However, these methods are costly and time-consuming, requiring the gathering of much basic information as well as the use of convenient assumptions and guesses. In addition, a way is needed to check whether the estimated benchmark is at the correct level. This paper proposes an optimal consistency method (OCM), which enables a capital stock to be estimated for a benchmark year, and which can also be used ...

  13. Self-consistent DFT +U method for real-space time-dependent density functional theory calculations

    Science.gov (United States)

    Tancogne-Dejean, Nicolas; Oliveira, Micael J. T.; Rubio, Angel

    2017-12-01

    We implemented various DFT+U schemes, including the self-consistent density-functional version of the DFT+U method due to Agapito, Curtarolo, and Buongiorno Nardelli (the ACBN0 functional) [Phys. Rev. X 5, 011006 (2015), 10.1103/PhysRevX.5.011006], within the massively parallel real-space time-dependent density functional theory (TDDFT) code octopus. We further extended the method to the calculation of response functions with real-time TDDFT+U and to the description of noncollinear spin systems. The implementation is tested by investigating the ground-state and optical properties of various transition-metal oxides, bulk topological insulators, and molecules. Our results are found to be in good agreement with previously published results for both the electronic band structure and structural properties. The self-consistently calculated values of U and J are also in good agreement with the values commonly used in the literature. We found that the time-dependent extension of the self-consistent DFT+U method yields improved optical properties when compared to the empirical TDDFT+U scheme. This work thus opens a different theoretical framework to address the nonequilibrium properties of correlated systems.

  14. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment linking the unit and system levels is the most important content in the multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity together with the principle of equivalence of information quantity, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information-quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  15. A Combined Self-Consistent Method to Estimate the Effective Properties of Polypropylene/Calcium Carbonate Composites

    Directory of Open Access Journals (Sweden)

    Zhongqiang Xiong

    2018-01-01

    Full Text Available In this work, to avoid the difficulty of application caused by irregular filler shapes in experiments, the self-consistent and differential self-consistent methods were combined to obtain a decoupled equation. The combined method yields a tensor γ, independent of filler content, which provides an important connection between high and low filler contents. On the one hand, this constant parameter can be calculated by Eshelby's inclusion theory or the Mori–Tanaka method to predict effective properties of composites consistent with their hypotheses. On the other hand, the parameter can be calculated from several experimental results to estimate the effective properties of prepared composites at other filler contents. In addition, an evaluation index σ_f′ of the interaction strength between matrix and fillers is proposed based on experiments. In the experiments, a hyper-dispersant was synthesized to prepare well-dispersed polypropylene/calcium carbonate (PP/CaCO3) composites with filler contents up to 70 wt %, at a dosage of only 5 wt % of the CaCO3 content. Based on several verifications, it is hoped that the combined self-consistent method is valid for other two-phase composites, following the same application procedure as in this work.

  16. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.

  17. A fast inverse consistent deformable image registration method based on symmetric optical flow computation

    International Nuclear Information System (INIS)

    Yang Deshan; Li Hua; Low, Daniel A; Deasy, Joseph O; Naqa, Issam El

    2008-01-01

    Deformable image registration is widely used in various radiation therapy applications, including daily treatment planning adaptation to map planned tissue or dose to changing anatomy. In this work, a simple and efficient inverse-consistent deformable registration method is proposed with the aims of higher registration accuracy and faster convergence speed. Instead of registering image I to a second image J, the two images are symmetrically deformed toward one another in multiple passes, until both deformed images are matched and correct registration is therefore achieved. In each pass, a delta motion field is computed by minimizing a symmetric optical flow system cost function using modified optical flow algorithms. The images are then further deformed with the delta motion field in the positive and negative directions, respectively, and then used for the next pass. The magnitude of the delta motion field is forced to be less than 0.4 voxel in every pass in order to guarantee smoothness and invertibility of the two overall motion fields that accumulate the delta motion fields in the positive and negative directions, respectively. The final motion fields to register the original images I and J, in either direction, are calculated by inverting one overall motion field and combining the inversion result with the other overall motion field. The final motion fields are inversely consistent, and this is ensured by the symmetric way in which registration is carried out. The proposed method is demonstrated with phantom images, artificially deformed patient images and 4D-CT images. Our results suggest that the proposed method is able to improve the overall accuracy (reducing registration error by 30% or more, compared to the original and inversely inconsistent optical flow algorithms), reduce the inverse consistency error (by 95% or more) and increase the convergence rate (by 100% or more). The overall computation speed may slightly decrease, or increase in most cases

  18. Consistent calculation of the polarization electric dipole moment by the shell-correction method

    International Nuclear Information System (INIS)

    Denisov, V.Yu.

    1992-01-01

    Macroscopic calculations of the polarization electric dipole moment which arises in nuclei with an octupole deformation are discussed in detail. This dipole moment is shown to depend on the position of the center of gravity. The conditions of consistency of the radii of the proton and neutron potentials and the radii of the proton and neutron surfaces, respectively, are discussed. These conditions must be incorporated in a shell-correction calculation of this dipole moment. A correct calculation of this moment by the shell-correction method is carried out. Dipole transitions between (on the one hand) levels belonging to an octupole vibrational band and (on the other) the ground state in rare-earth nuclei with a large quadrupole deformation are studied. 19 refs., 3 figs

  19. A self-consistent nodal method in response matrix formalism for the multigroup diffusion equations

    International Nuclear Information System (INIS)

    Malambu, E.M.; Mund, E.H.

    1996-01-01

    We develop a nodal method for the multigroup diffusion equations, based on the transverse integration procedure (TIP). The efficiency of the method rests upon the convergence properties of a high-order multidimensional nodal expansion and upon numerical implementation aspects. The discrete 1D equations are cast in response matrix formalism. The derivation of the transverse leakage moments is self-consistent i.e. does not require additional assumptions. An outstanding feature of the method lies in the linear spatial shape of the local transverse leakage for the first-order scheme. The method is described in the two-dimensional case. The method is validated on some classical benchmark problems. (author)

  20. Quasiparticle self-consistent GW method for the spectral properties of complex materials.

    Science.gov (United States)

    Bruneval, Fabien; Gatti, Matteo

    2014-01-01

    The GW approximation to the formally exact many-body perturbation theory has been applied successfully to materials for several decades. Since practical calculations are extremely cumbersome, the GW self-energy is most commonly evaluated using a first-order perturbative approach: this is the so-called G0W0 scheme. However, the G0W0 approximation depends heavily on the mean-field theory that is employed as a basis for the perturbation theory. Recently, a procedure to reach a kind of self-consistency within the GW framework has been proposed. The quasiparticle self-consistent GW (QSGW) approximation retains some positive aspects of a self-consistent approach but circumvents the intricacies of the complete GW theory, which is inconveniently based on a non-Hermitian and dynamical self-energy. This new scheme allows one to surmount most of the flaws of the usual G0W0 at a moderate calculation cost and a reasonable implementation burden. In particular, the issues of small-band-gap semiconductors, of large-band-gap insulators, and of some transition metal oxides are then cured. The QSGW method broadens the range of materials for which the spectral properties can be predicted with confidence.

  1. Bosons system with finite repulsive interaction: self-consistent field method

    International Nuclear Information System (INIS)

    Renatino, M.M.B.

    1983-01-01

    Some static properties of a boson system (at T = 0 K) under the action of a repulsive potential are studied. For the repulsive potential, a model was adopted consisting of a region where it is constant (up to r_c) and a 1/r decay (r > r_c). The self-consistent field approximation used takes short-range correlations into account through a local field correction, which leads to an effective field. The static structure factor S(q) and the effective potential ψ(q) are obtained through a self-consistent calculation. The pair-correlation function g(r) and the energy of the collective excitations E(q) are also obtained from the structure factor. The density of the system and the parameters of the repulsive potential, that is, its height and the size of the constant region, are used as variables of the problem. The results obtained for S(q), g(r) and E(q) for a fixed ratio r_o/r_c and variable λ indicate the emergence of structure in the system, which becomes more noticeable as the potential becomes more repulsive. (author)

  2. VLE measurements using a static cell vapor phase manual sampling method accompanied with an empirical data consistency test

    International Nuclear Information System (INIS)

    Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru

    2015-01-01

    Highlights: • We use a new, simple static cell vapor phase manual sampling (SCVMS) method for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid-phase-forming azeotropic binaries. • The method is validated by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that accurate VLE near ambient temperature can be measured with the new SCVMS method. • Moreover, the consistency test confirms that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant-temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), the asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and the two-liquid-phase-forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was confirmed by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same ln β vs. ln P straight line determined from a conventional distillation-still method and a headspace gas chromatography method
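    The consistency test described above amounts to checking that (β, P) pairs from different methods fall on one common straight line in log-log coordinates. A minimal sketch, with invented β and P values standing in for real measurements:

    ```python
    import numpy as np

    # Invented (P, beta) pairs lying on a power law beta = a * P**b, for illustration.
    P = np.array([10.0, 20.0, 40.0, 80.0])  # system pressure (hypothetical units)
    beta = 0.5 * P ** -0.8                  # polarity exclusion factor (hypothetical)

    # Least-squares straight line in ln(beta) vs ln(P) coordinates.
    slope, intercept = np.polyfit(np.log(P), np.log(beta), 1)
    print(slope, np.exp(intercept))  # recovers b = -0.8 and a = 0.5

    # Residuals near zero indicate the data converge to one line (consistency).
    residuals = np.log(beta) - (slope * np.log(P) + intercept)
    print(np.max(np.abs(residuals)))
    ```

    With real data, points from the SCVMS, distillation-still, and headspace methods would each be checked for small residuals against the same fitted line.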

  3. New exact solutions of the (2+1)-dimensional Broer-Kaup equation by the consistent Riccati expansion method

    Directory of Open Access Journals (Sweden)

    Jiang Ying

    2017-01-01

    Full Text Available In this work, we study the (2+1)-dimensional Broer-Kaup equation. The composite periodic breather wave, the exact composite kink breather wave and the solitary wave solutions are obtained by using the coupled degradation technique and the consistent Riccati expansion method. These results may help us investigate complex dynamical behaviors and the interaction between composite nonlinear waves in high-dimensional models

  4. Consistency analysis of Keratograph and traditional methods to evaluate tear film function

    Directory of Open Access Journals (Sweden)

    Pei-Yang Shen

    2015-05-01

    Full Text Available AIM: To investigate the repeatability and accuracy of a recent Keratograph for evaluating tear film stability, and to compare its measurements with those of traditional examination methods. METHODS: The noninvasive tear film break-up time (NI-BUT), comprising the first tear film break-up time (BUT-f) and the average tear film break-up time (BUT-ave), was measured by Keratograph. The repeatability of the measurements was evaluated by the coefficient of variation (CV) and the intraclass correlation coefficient (ICC). The Wilcoxon signed-rank test was used to compare NI-BUT with the fluorescein tear film break-up time (FBUT), and the correlations between NI-BUT and FBUT and Schirmer I test values were examined. Bland-Altman analysis was used to evaluate consistency. RESULTS: The study recruited 48 subjects (48 eyes; mean age 38.7±15.2 years). The CV and ICC of BUT-f were 12.6% and 0.95, respectively; those of BUT-ave were 9.8% and 0.96. The value of BUT-f was lower than that of FBUT, and the difference was statistically significant (6.16±2.46s vs 7.46±1.92s). CONCLUSION: The Keratograph provides NI-BUT data with better repeatability and reliability, which has great application prospects in the diagnosis and treatment of dry eye and in refractive corneal surgery.
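    As a hypothetical sketch of the two repeatability statistics quoted above, the following computes the coefficient of variation and a one-way random-effects ICC(1,1) from an invented subjects × repeats matrix (this is one common ICC variant; the study does not state which form it used).

    ```python
    import numpy as np

    def cv_percent(x):
        """Coefficient of variation of repeated measurements, in percent."""
        x = np.asarray(x, dtype=float)
        return 100.0 * x.std(ddof=1) / x.mean()

    def icc_1_1(data):
        """One-way random-effects ICC(1,1) for an n-subjects x k-repeats matrix."""
        data = np.asarray(data, dtype=float)
        n, k = data.shape
        grand = data.mean()
        row_means = data.mean(axis=1)
        msb = k * ((row_means - grand) ** 2).sum() / (n - 1)            # between subjects
        msw = ((data - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within subjects
        return (msb - msw) / (msb + (k - 1) * msw)

    # Invented break-up times (seconds), 4 subjects x 3 repeated measurements.
    data = np.array([[6.1, 6.3, 6.0],
                     [8.2, 8.0, 8.1],
                     [4.9, 5.1, 5.0],
                     [7.0, 7.2, 7.1]])
    print(icc_1_1(data))        # close to 1: repeats agree well across subjects
    print(cv_percent(data[0]))  # within-subject variation for the first subject
    ```

    ICC values near 1 and small CVs, as reported for BUT-f and BUT-ave, indicate that repeated Keratograph readings are highly reproducible.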

  5. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis.

    NARCIS (Netherlands)

    Steultjens, M.P.M.; Dekker, J.; Baar, M.E. van; Oostendorp, R.A.B.; Bijlsma, J.W.J.

    1999-01-01

    Objective: To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Methods: Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results

  6. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  7. A Preliminary Study toward Consistent Soil Moisture from AMSR2

    NARCIS (Netherlands)

    Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.

    2015-01-01

    A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global

  8. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Directory of Open Access Journals (Sweden)

    Nederhof Esther

    2012-07-01

    Full Text Available Conclusions: First, extensive recruitment effort at the first assessment wave of a prospective population-based cohort study has long-lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  9. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Science.gov (United States)

    2012-01-01

    recruitment effort at the first assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods. PMID:22747967

  10. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
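    The hindered-settling flux entering the convection-diffusion PDE above is typically discretized with a monotone numerical flux. The sketch below uses a Godunov flux with an assumed Kynch-type batch flux and invented parameters; it is not the paper's constitutive relations or full scheme, which also includes compression and dispersion terms.

    ```python
    import numpy as np

    def batch_flux(u, v0=2.0, umax=1.0):
        """Kynch-type hindered settling flux (assumed functional form)."""
        return v0 * u * max(0.0, 1.0 - u / umax) ** 2

    def godunov_flux(ul, ur, f=batch_flux):
        """Godunov numerical flux between left/right concentrations ul, ur for
        the scalar conservation law u_t + f(u)_x = 0 (sampled extremum)."""
        s = np.linspace(min(ul, ur), max(ul, ur), 101)
        vals = np.array([f(x) for x in s])
        # min of f over [ul, ur] if ul <= ur, else max of f over [ur, ul]
        return vals.min() if ul <= ur else vals.max()

    # Example: flux across a cell interface between dilute and dense sludge layers.
    print(godunov_flux(0.1, 0.6))
    ```

    A monotone flux of this kind is what allows the discontinuous solutions mentioned in the abstract (sludge blanket, feed layer) to be captured reliably.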

  11. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    Science.gov (United States)

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model extended hierarchical service-finite state automata (EHS-FSA) is constructed based on finite state automata (FSA), which formally depict overall changing processes of service consistency states. And also the service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that the bad reusability (17.93% on average) is the biggest influential factor, the noncomposition of atomic services (13.12%) is the second biggest one, and the service version's confusion (1.2%) is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  12. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    Directory of Open Access Journals (Sweden)

    Linjun Fan

    2014-01-01

    Full Text Available This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA. Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service’s evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model extended hierarchical service-finite state automata (EHS-FSA is constructed based on finite state automata (FSA, which formally depict overall changing processes of service consistency states. And also the service consistency evolution algorithms (SCEAs based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that the bad reusability (17.93% on average is the biggest influential factor, the noncomposition of atomic services (13.12% is the second biggest one, and the service version’s confusion (1.2% is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  13. Self-consistent study of localization

    International Nuclear Information System (INIS)

    Brezini, A.; Olivier, G.

    1981-08-01

    The localization models of Abou-Chacra et al. and Kumar et al. are critically re-examined in the limit of weak disorder. By using an improved method of approximation, we have studied the displacement of the band edge and the mobility edge as function of disorder and compared the results of Abou-Chacra et al. and Kumar et al. in the light of the present approximation. (author)

  14. Solvent effects in time-dependent self-consistent field methods. II. Variational formulations and analytical gradients

    International Nuclear Information System (INIS)

    Bjorgaard, J. A.; Velizhanin, K. A.; Tretiak, S.

    2015-01-01

    This study describes variational energy expressions and analytical excited state energy gradients for time-dependent self-consistent field methods with polarizable solvent effects. Linear response, vertical excitation, and state-specific solvent models are examined. Enforcing a variational ground state energy expression in the state-specific model is found to reduce it to the vertical excitation model. Variational excited state energy expressions are then provided for the linear response and vertical excitation models and analytical gradients are formulated. Using semiempirical model chemistry, the variational expressions are verified by numerical and analytical differentiation with respect to a static external electric field. Lastly, analytical gradients are further tested by performing microcanonical excited state molecular dynamics with p-nitroaniline

  15. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study.

    Science.gov (United States)

    Nederhof, Esther; Jörg, Frederike; Raven, Dennis; Veenstra, René; Verhulst, Frank C; Ormel, Johan; Oldehinkel, Albertine J

    2012-07-02

    assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  16. Physically consistent data assimilation method based on feedback control for patient-specific blood flow analysis.

    Science.gov (United States)

    Ii, Satoshi; Adib, Mohd Azrul Hisham Mohd; Watanabe, Yoshiyuki; Wada, Shigeo

    2018-01-01

    This paper presents a novel data assimilation method for patient-specific blood flow analysis based on feedback control theory, called the physically consistent feedback control-based data assimilation (PFC-DA) method. In the PFC-DA method, the signal, which is the residual error term of the velocity when comparing the numerical and reference measurement data, is cast as a source term in a Poisson equation for the scalar potential field that induces flow in a closed system. The pressure values at the inlet and outlet boundaries are recursively calculated from this scalar potential field. Hence, the flow field is physically consistent because it is driven by the calculated inlet and outlet pressures, without any artificial body forces. Although this PFC-DA method does not guarantee the optimal solution, compared with existing variational approaches it requires only one additional Poisson equation for the scalar potential field, a remarkable gain for such a small additional computational cost at every iteration. Through numerical examples for 2D and 3D exact flow fields, with both noise-free and noisy reference data, as well as a blood flow analysis on a cerebral aneurysm using actual patient data, the robustness and accuracy of this approach are shown. Moreover, the feasibility of a patient-specific practical blood flow analysis is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
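    The recursive pressure update at the heart of the PFC-DA idea can be illustrated with a deliberately simplified scalar toy model. This is not the paper's formulation: the Poisson-equation step is replaced by a single made-up linear flow law, and all names and gains are illustrative. The boundary pressure drop is corrected each iteration by a feedback term proportional to the residual between computed and measured flow.

    ```python
    # Toy feedback-control loop standing in for the PFC-DA pressure update.
    # Model assumption (illustrative only): flow responds linearly to the
    # pressure drop, q = k * dp.  The residual (q_ref - q) acts as the
    # "signal" that recursively corrects the boundary pressure.
    k = 2.0          # made-up flow conductance
    gain = 0.3       # made-up feedback gain
    q_ref = 10.0     # "measured" reference flow
    dp = 0.0         # boundary pressure drop, updated recursively
    for _ in range(100):
        q = k * dp                 # compute flow for the current pressure
        dp += gain * (q_ref - q)   # feedback correction from the residual
    assert abs(k * dp - q_ref) < 1e-6   # computed flow matches the measurement
    ```

    In this toy the iteration contracts toward the pressure that reproduces the measured flow, which is the qualitative behaviour the residual-driven pressure update is meant to achieve.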

  17. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis

    NARCIS (Netherlands)

    Steultjens, M. P.; Dekker, J.; van Baar, M. E.; Oostendorp, R. A.; Bijlsma, J. W.

    1999-01-01

    To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results of self-report

  18. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated … applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges …, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes …

  19. Multiphase flows of N immiscible incompressible fluids: A reduction-consistent and thermodynamically-consistent formulation and associated algorithm

    Science.gov (United States)

    Dong, S.

    2018-05-01

    We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.

  20. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned which may approach this degree of precision. (orig.)

  1. Analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field method

    Directory of Open Access Journals (Sweden)

    N.Yoshida

    2007-09-01

    An analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field (MOZ-SCF) method is presented. MOZ-SCF theory is one of the theories for treating solvent effects on the solute electronic structure in solution [Yoshida N. et al., J. Chem. Phys., 2000, 113, 4974]. Molecular geometries of water, formaldehyde, acetonitrile and acetone in water are optimized with the analytical energy gradient formula, and the results are compared with those from the polarizable continuum model (PCM), the reference interaction site model (RISM-SCF) and the three-dimensional (3D) RISM-SCF.

  2. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  3. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models with human interaction restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  4. Assessment of rigid multi-modality image registration consistency using the multiple sub-volume registration (MSR) method

    International Nuclear Information System (INIS)

    Ceylan, C; Heide, U A van der; Bol, G H; Lagendijk, J J W; Kotte, A N T J

    2005-01-01

    Registration of different imaging modalities such as CT, MRI, functional MRI (fMRI), positron (PET) and single photon (SPECT) emission tomography is used in many clinical applications. Determining the quality of an automatic registration procedure has been challenging because no gold standard is available to evaluate the registration. In this note we present a method, called the 'multiple sub-volume registration' (MSR) method, for assessing the consistency of a rigid registration. This is done by registering sub-images of one data set on the other data set, performing a crude non-rigid registration. By analysing the deviations (local deformations) of the sub-volume registrations from the full registration we obtain a measure of the consistency of the rigid registration. Registration of 15 data sets, which include CT, MR and PET images for brain, head and neck, cervix, prostate and lung, was performed utilizing a rigid body registration with normalized mutual information as the similarity measure. The resulting registrations were classified as good or bad by visual inspection, and also classified using our MSR method. The results of our MSR method agree with the classification obtained from visual inspection for all cases (p < 0.02 based on ANOVA of the good and bad groups). The proposed method is independent of the registration algorithm and similarity measure. It can be used for multi-modality image data sets and different anatomic sites of the patient. (note)
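    The consistency check described in this abstract can be sketched in a few lines: after a full rigid registration, each sub-volume is registered separately, and the deviation of each sub-volume's transform from the full-volume transform is measured. The sketch below reduces the transforms to translation vectors and uses illustrative numbers and names; the actual MSR method operates on full rigid transforms of image sub-volumes.

    ```python
    # Hedged sketch of the MSR consistency idea: large deviations of the
    # sub-volume registrations (local deformations) from the full rigid
    # registration flag an inconsistent registration.  Transforms are
    # simplified to 3D translation vectors here.
    import math

    def msr_deviation(full_shift, sub_shifts):
        """Euclidean deviation of each sub-volume shift from the full shift."""
        return [math.dist(s, full_shift) for s in sub_shifts]

    full = (1.0, 0.0, 0.0)                       # full-volume registration shift (mm)
    subs = [(1.1, 0.0, 0.0),                     # per-sub-volume registration shifts
            (0.9, 0.2, 0.0),
            (1.0, 0.0, 0.1)]
    devs = msr_deviation(full, subs)
    assert max(devs) < 0.3   # small local deformations -> consistent registration
    ```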

  5. The method and program system CABEI for adjusting consistency between natural element and its isotopes data

    Energy Technology Data Exchange (ETDEWEB)

    Tingjin, Liu; Zhengjun, Sun [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    To meet the requirements of nuclear engineering, especially for fusion reactors, the data in the major evaluated libraries are now given not only for a natural element but also for its isotopes. Inconsistency between element and isotope data is one of the main problems in present evaluated neutron libraries. Formulas for adjusting the data to satisfy the two kinds of consistency relationships simultaneously were derived by means of the least-squares method, and the program system CABEI was developed. The program was tested by calculating the Fe data in CENDL-2.1; the results show that the adjusted values satisfy both consistency relationships.
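    The core of such an adjustment can be sketched as a constrained least-squares problem: shift the isotope values as little as possible (weighted by their uncertainties) so that the abundance-weighted sum reproduces the element value. This is an illustrative simplification with made-up numbers, not the CABEI formulas themselves, which handle the full set of consistency relationships.

    ```python
    # Hypothetical sketch of the element/isotope consistency adjustment:
    # minimize sum(((x_i - x0_i)/sigma_i)^2) subject to sum(a_i * x_i) = E,
    # solved in closed form with a single Lagrange multiplier.

    def adjust(x0, sigma, a, E):
        """Adjust isotope values x0 so the abundance-weighted sum equals E."""
        mismatch = E - sum(ai * xi for ai, xi in zip(a, x0))
        denom = sum((ai * si) ** 2 for ai, si in zip(a, sigma))
        lam = mismatch / denom
        return [xi + lam * ai * si ** 2 for xi, ai, si in zip(x0, a, sigma)]

    # Example with two isotopes (all numbers illustrative):
    x0 = [2.0, 3.0]      # evaluated isotope cross sections (barn)
    sigma = [0.1, 0.3]   # their uncertainties
    a = [0.9, 0.1]       # isotopic abundances
    E = 2.2              # evaluated element cross section (barn)
    x = adjust(x0, sigma, a, E)
    assert abs(sum(ai * xi for ai, xi in zip(a, x)) - E) < 1e-12
    ```

    The isotope with the larger relative uncertainty absorbs more of the mismatch, which is the intended behaviour of an uncertainty-weighted least-squares adjustment.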

  6. Does Methodological Guidance Produce Consistency? A Review of Methodological Consistency in Breast Cancer Utility Value Measurement in NICE Single Technology Appraisals.

    Science.gov (United States)

    Rose, Micah; Rice, Stephen; Craig, Dawn

    2017-07-05

    Since 2004, National Institute for Health and Care Excellence (NICE) methodological guidance for technology appraisals has emphasised a strong preference for using the validated EuroQol 5-Dimensions (EQ-5D) quality-of-life instrument, measuring patient health status from patients or carers, and using the general public's preference-based valuation of different health states when assessing health benefits in economic evaluations. The aim of this study was to review all NICE single technology appraisals (STAs) for breast cancer treatments to explore consistency in the use of utility scores in light of NICE methodological guidance. A review of all published breast cancer STAs was undertaken using all publicly available STA documents for each included assessment. Utility scores were assessed for consistency with NICE-preferred methods and original data sources. Furthermore, academic assessment group work undertaken during the STA process was examined to evaluate the emphasis of NICE-preferred quality-of-life measurement methods. Twelve breast cancer STAs were identified, and many STAs used evidence that did not follow NICE's preferred utility score measurement methods. Recent STA submissions show companies using EQ-5D and mapping. Academic assessment groups rarely emphasized NICE-preferred methods, and queries about preferred methods were rare. While there appears to be a trend in recent STA submissions towards following NICE methodological guidance, historically STA guidance in breast cancer has generally not used NICE's preferred methods. Future STAs in breast cancer and reviews of older guidance should ensure that utility measurement methods are consistent with the NICE reference case to help produce consistent, equitable decision making.

  7. Geometry of the self-consistent collective-coordinate method for the large-amplitude collective motion

    International Nuclear Information System (INIS)

    Sakata, Fumihiko; Marumori, Toshio; Hashimoto, Yukio; Une, Tsutomu.

    1983-05-01

    The geometry of the self-consistent collective-coordinate (SCC) method formulated within the framework of the time-dependent Hartree-Fock (TDHF) theory is investigated by associating the variational parameters with a symplectic manifold (a TDHF manifold). With the use of a canonical-variables parametrization, it is shown that the TDHF equation is equivalent to the canonical equations of motion of classical mechanics on the TDHF manifold. This enables us to investigate the geometrical structure of the SCC method in the language of classical mechanics. The SCC method turns out to give a prescription for dynamically extracting a ''maximally-decoupled'' collective submanifold (hypersurface) out of the TDHF manifold, in such a way that a certain kind of trajectories corresponding to the large-amplitude collective motion under consideration can be reproduced on the hypersurface as precisely as possible. The stability of the hypersurface at each point on it is investigated, in order to see whether the hypersurface obtained by the SCC method is really an approximate integral surface in the TDHF manifold. (author)

  8. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  9. A Dynamic Linear Hashing Method for Redundancy Management in Train Ethernet Consist Network

    Directory of Open Access Journals (Sweden)

    Xiaobo Nie

    2016-01-01

    Massive transportation systems like trains are considered critical systems because they use the communication network to control essential subsystems on board. A critical system requires zero recovery time when a failure occurs in the communication network. The newly published IEC 62439-3 defines the high-availability seamless redundancy (HSR) protocol, which fulfils this requirement and ensures no frame loss in the presence of an error. This paper adopts the protocol for the train Ethernet consist network. The challenge is the management of the circulating frames, which must cope with real-time processing requirements, fast switching times, high throughput, and deterministic behavior. The main contributions of this paper are an in-depth analysis of the network parameters imposed by applying the protocol to the train control and monitoring system (TCMS), and a redundant circulating-frame discarding method based on dynamic linear hashing.
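    The duplicate-discard task that the hashing addresses can be sketched as follows: each frame is identified by its (source, sequence number) pair, the first copy is forwarded, and the redundant copy arriving over the other ring direction is dropped after a hash-table lookup. This is an illustrative toy with a naive aging policy, not the paper's dynamic linear hashing structure.

    ```python
    # Illustrative HSR-style duplicate discard (not the paper's algorithm):
    # a hash table keyed by (source, sequence number) remembers recently
    # forwarded frames; a second copy with the same key is discarded.

    class DuplicateDiscard:
        def __init__(self, window=400):
            self.window = window   # max entries kept before aging out
            self.seen = {}         # hash table: (src, seq) -> seen flag

        def accept(self, src, seq):
            """Return True if the frame should be forwarded, False if discarded."""
            key = (src, seq)
            if key in self.seen:
                return False       # redundant copy: discard
            if len(self.seen) >= self.window:
                # age out the oldest entry (dicts preserve insertion order)
                self.seen.pop(next(iter(self.seen)))
            self.seen[key] = True
            return True

    dd = DuplicateDiscard()
    assert dd.accept("aa:bb", 1) is True    # first copy forwarded
    assert dd.accept("aa:bb", 1) is False   # duplicate from other ring dropped
    assert dd.accept("aa:bb", 2) is True    # new sequence number forwarded
    ```

    A real implementation must bound lookup time and memory deterministically, which is exactly what motivates the dynamic linear hashing scheme analysed in the paper.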

  10. Parquet equations for numerical self-consistent-field theory

    International Nuclear Information System (INIS)

    Bickers, N.E.

    1991-01-01

    In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs

  11. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  12. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the virial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de

  13. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Sanshui; He Sailing

    2002-12-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods with the high-frequency metallic model in air are studied, and a complete band gap for some nonzero off-plane wave number k_z is found.

  14. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    International Nuclear Information System (INIS)

    Xiao Sanshui; He Sailing

    2002-01-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods with the high-frequency metallic model in air are studied, and a complete band gap for some nonzero off-plane wave number k_z is found

  15. Consistent interactive segmentation of pulmonary ground glass nodules identified in CT studies

    Science.gov (United States)

    Zhang, Li; Fang, Ming; Naidich, David P.; Novak, Carol L.

    2004-05-01

    Ground glass nodules (GGNs) have proved especially problematic in lung cancer diagnosis, as despite frequently being malignant they characteristically have extremely slow rates of growth. This problem is further magnified by the small size of many of these lesions now being routinely detected following the introduction of multislice CT scanners capable of acquiring contiguous high resolution 1 to 1.25 mm sections throughout the thorax in a single breathhold period. Although segmentation of solid nodules can be used clinically to determine volume doubling times quantitatively, reliable methods for segmentation of pure ground glass nodules have yet to be introduced. Our purpose is to evaluate a newly developed computer-based segmentation method for rapid and reproducible measurements of pure ground glass nodules. 23 pure or mixed ground glass nodules were identified in a total of 8 patients by a radiologist and subsequently segmented by our computer-based method using Markov random field and shape analysis. The computer-based segmentation was initialized by a click point. Methodological consistency was assessed using the overlap ratio between 3 segmentations initialized by 3 different click points for each nodule. The 95% confidence interval on the mean of the overlap ratios proved to be [0.984, 0.998]. The computer-based method failed on two nodules that were difficult to segment even manually either due to especially low contrast or markedly irregular margins. While achieving consistent manual segmentation of ground glass nodules has proven problematic most often due to indistinct boundaries and interobserver variability, our proposed method introduces a powerful new tool for obtaining reproducible quantitative measurements of these lesions. It is our intention to further document the value of this approach with a still larger set of ground glass nodules.
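    The click-point consistency measure described above can be sketched as an overlap ratio between the voxel sets produced by different initializations. The sketch below uses intersection over union; the exact ratio used by the authors may differ, and the voxel sets are illustrative.

    ```python
    # Minimal sketch of segmentation consistency: overlap ratio between two
    # segmentations of the same nodule started from different click points,
    # computed as intersection over union of their voxel sets.

    def overlap_ratio(seg_a, seg_b):
        """Jaccard overlap of two voxel sets; 1.0 means identical segmentations."""
        a, b = set(seg_a), set(seg_b)
        return len(a & b) / len(a | b)

    # Two nearly identical toy segmentations on a single slice (z = 0):
    seg1 = {(x, y, 0) for x in range(10) for y in range(10)}      # 100 voxels
    seg2 = {(x, y, 0) for x in range(1, 10) for y in range(10)}   # 90 voxels
    assert abs(overlap_ratio(seg1, seg2) - 0.9) < 1e-12
    ```

    Averaging this ratio over the three pairwise comparisons per nodule yields the kind of consistency statistic whose confidence interval is reported in the abstract.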

  16. Direct numerical simulation of interfacial instabilities: A consistent, conservative, all-speed, sharp-interface method

    Science.gov (United States)

    Chang, Chih-Hao; Deng, Xiaolong; Theofanous, Theo G.

    2013-06-01

    We present a conservative and consistent numerical method for solving the Navier-Stokes equations in flow domains that may be separated by any number of material interfaces, at arbitrarily-high density/viscosity ratios and acoustic-impedance mismatches, subjected to strong shock waves and flow speeds that can range from highly supersonic to near-zero Mach numbers. A principal aim is prediction of interfacial instabilities under superposition of multiple potentially-active modes (Rayleigh-Taylor, Kelvin-Helmholtz, Richtmyer-Meshkov) as found for example with shock-driven, immersed fluid bodies (locally oblique shocks); accordingly we emphasize fidelity supported by physics-based validation, including experiments. Consistency is achieved by satisfying the jump discontinuities at the interface within a conservative 2nd-order scheme that is coupled, in a conservative manner, to the bulk-fluid motions. The jump conditions are embedded into a Riemann problem, solved exactly to provide the pressures and velocities along the interface, which is tracked by a level set function to accuracy of O(Δx^5, Δt^4). Subgrid representation of the interface is achieved by allowing curvature of its constituent interfacial elements to obtain O(Δx^3) accuracy in cut-cell volume, with attendant benefits in calculating cell-geometric features and interface curvature (O(Δx^3)). Overall the computation converges at near-theoretical O(Δx^2). Spurious currents are down to machine error and there is no time-step restriction due to surface tension. Our method is built upon a quadtree-like adaptive mesh refinement infrastructure. When necessary, this is supplemented by body-fitted grids to enhance resolution of the gas dynamics, including flow separation, shear layers, slip lines, and critical layers. Comprehensive comparisons with exact solutions for the linearized Rayleigh-Taylor and Kelvin-Helmholtz problems demonstrate excellent performance. Sample simulations of liquid drops subjected to

  17. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  18. A novel method for identification and quantification of consistently differentially methylated regions.

    Directory of Open Access Journals (Sweden)

    Ching-Lin Hsiao

    Advances in biotechnology have resulted in large-scale studies of DNA methylation. A differentially methylated region (DMR) is a genomic region with multiple adjacent CpG sites that exhibit different methylation statuses among multiple samples. Many so-called "supervised" methods have been established to identify DMRs between two or more comparison groups. Methods for the identification of DMRs without reference to phenotypic information are, however, less well studied. We propose an alternative "unsupervised" approach, in which DMRs in the studied samples are identified taking into account the natural dependence structure of methylation measurements between neighboring probes on tiling arrays. Through a simulation study, we investigated the effect of dependencies between neighboring probes on DMR determination: many spurious signals are produced if the methylation data are analyzed as if probes were independent, whereas our newly proposed method corrects for this effect with a well-controlled false positive rate and comparable sensitivity. By applying the method to two real datasets, we demonstrate that it can provide a global picture of methylation variation in the studied samples. R source code implementing the proposed method is freely available at http://www.csjfann.ibms.sinica.edu.tw/eag/programlist/ICDMR/ICDMR.html.
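    The difference between per-probe calls and region calls that respect neighboring-probe dependence can be illustrated with a toy region caller: instead of flagging isolated probes, it only reports runs of adjacent probes that all exceed a methylation-difference threshold. This is a deliberately simple sketch with made-up thresholds, not the authors' statistical model.

    ```python
    # Toy DMR caller (illustrative, not the paper's method): a region is a
    # run of at least min_probes adjacent CpG probes whose methylation
    # difference all exceeds a threshold, so isolated noisy probes are
    # not called as regions.

    def call_dmrs(diffs, threshold=0.2, min_probes=3):
        """Return (start, end) index pairs of qualifying probe runs."""
        regions, start = [], None
        for i, d in enumerate(diffs):
            if abs(d) >= threshold:
                start = i if start is None else start
            else:
                if start is not None and i - start >= min_probes:
                    regions.append((start, i - 1))
                start = None
        if start is not None and len(diffs) - start >= min_probes:
            regions.append((start, len(diffs) - 1))
        return regions

    # Probes 1-3 form a run; the isolated probe 5 is ignored as noise.
    assert call_dmrs([0.0, 0.3, 0.4, 0.25, 0.0, 0.3]) == [(1, 3)]
    ```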

  19. Consistency Study About Critical Thinking Skill of PGSD Students (Teacher Candidate of Elementary School) on Energy Material

    Science.gov (United States)

    Wijayanti, M. D.; Raharjo, S. B.; Saputro, S.; Mulyani, S.

    2017-09-01

    This study aims to examine the consistency of the critical thinking ability of PGSD students (elementary school teacher candidates) on the Energy material. The study population is PGSD students at UNS Surakarta; a sample of 101 students was obtained using the cluster random sampling technique. The consistency of students' responses can be used as a benchmark of their understanding, revealing whether they recognize equivalent science problems, especially in the Energy material, when these are presented through various phenomena. This research uses a descriptive method, with data obtained through questionnaires and interviews. The results show that critical thinking ability in this study falls into three levels: level 1 (54.85%), level 2 (19.93%), and level 3 (25.23%). These results point to students' weak understanding of the Energy material; in addition, the indicators for identifying assumptions and analysing arguments are also still low. Ideally, consistency of critical thinking ability as a whole broadens students' conceptual understanding. The results of the study may serve as a reference for subsequent research aimed at positive changes in students' critical thinking ability, which in turn improves their understanding of concepts, especially the Energy material in various real problems.

  20. Fully self-consistent multiparticle-multi-hole configuration mixing method - Applications to a few light nuclei

    International Nuclear Information System (INIS)

    Robin, Caroline

    2014-01-01

    This thesis contributes to the development of the multiparticle-multi-hole configuration mixing method aiming to describe the structure of atomic nuclei. Based on a double variational principle, this approach allows one to determine the expansion coefficients of the wave function and the single-particle states at the same time. In this work we apply for the first time the fully self-consistent formalism of the mp-mh method to the description of a few p- and sd-shell nuclei, using the D1S Gogny interaction. A first study of the 12 C nucleus is performed in order to test the doubly iterative convergence procedure when different types of truncation criteria are applied to select the many-body configurations included in the wave function. A detailed analysis of the effect caused by the orbital optimization is conducted; in particular, its impact on the one-body density and on the fragmentation of the ground state wave function is analyzed. A systematic study of sd-shell nuclei is then performed. A careful analysis of the correlation content of the ground state is first conducted, and observable quantities such as binding and separation energies, as well as charge radii, are calculated and compared to experimental data; satisfactory results are found. Spectroscopic properties are also studied: excitation energies of low-lying states are found in very good agreement with experiment, and the study of magnetic dipole features is also satisfactory. Calculations of electric quadrupole properties, and in particular transition probabilities B(E2), however reveal a clear lack of collectivity of the wave function, due to the reduced valence space used to select the many-body configurations. Although the renormalization of orbitals leads to an important fragmentation of the ground state wave function, only little effect is observed on the B(E2) probabilities. A tentative explanation is given. Finally, the structure description of nuclei provided by the multiparticle

  1. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  2. Electronic structure of thin films by the self-consistent numerical-basis-set linear combination of atomic orbitals method: Ni(001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    We present the self-consistent numerical-basis-set linear combination of atomic orbitals (LCAO) discrete variational method for treating the electronic structure of thin films. As in the case of bulk solids, this method provides for thin films accurate solutions of the one-particle local density equations with a non-muffin-tin potential. Hamiltonian and overlap matrix elements are evaluated accurately by means of a three-dimensional numerical Diophantine integration scheme. Application of this method is made to the self-consistent solution of one-, three-, and five-layer Ni(001) unsupported films. The LCAO Bloch basis set consists of valence orbitals (3d, 4s, and 4p states for transition metals) orthogonalized to the frozen-core wave functions. The self-consistent potential is obtained iteratively within the superposition of overlapping spherical atomic charge density model with the atomic configurations treated as adjustable parameters. Thus the crystal Coulomb potential is constructed as a superposition of overlapping spherically symmetric atomic potentials and, correspondingly, the local density Kohn-Sham (α = 2/3) potential is determined from a superposition of atomic charge densities. At each iteration in the self-consistency procedure, the crystal charge density is evaluated using a sampling of 15 independent k points in (1/8)th of the irreducible two-dimensional Brillouin zone. The total density of states (DOS) and projected local DOS (by layer plane) are calculated using an analytic linear energy triangle method (presented as an Appendix) generalized from the tetrahedron scheme for bulk systems. Distinct differences are obtained between the surface and central plane local DOS. The central plane DOS is found to converge rapidly to the DOS of bulk paramagnetic Ni obtained by Wang and Callaway. Only a very small surplus charge (0.03 electron/atom) is found on the surface planes, in agreement with jellium model calculations

  3. Solution of degenerate hypergeometric system of Horn consisting of three equations

    Science.gov (United States)

    Tasmambetov, Zhaksylyk N.; Zhakhina, Ryskul U.

    2017-09-01

The possibilities of constructing normal-regular solutions of a system consisting of three partial differential equations of the second order are studied by the Frobenius-Latysheva method. The method of determining the unknown coefficients is shown, and the relationship of the studied system to the system whose solution is Laguerre's polynomial of three variables is indicated. The generalization of the Frobenius-Latysheva method to the case of a system consisting of three equations makes it possible to clarify the relationship of such systems whose solutions are special functions of three variables. These systems include the functions of Whittaker and Bessel, 205 special functions of three variables from the list of M. Srivastava and P.W. Carlsson, as well as orthogonal polynomials of three variables. All this contributes to the further development of the analytic theory of systems consisting of three partial differential equations of the second order.

  4. Self-consistent study of space-charge-dominated beams in a misaligned transport system

    International Nuclear Information System (INIS)

    Sing Babu, P.; Goswami, A.; Pandit, V.S.

    2013-01-01

A self-consistent particle-in-cell (PIC) simulation method is developed to investigate the dynamics of space-charge-dominated beams through a misaligned solenoid-based transport system. The evolution of the beam centroid, beam envelope and emittance is studied as a function of the misalignment parameters for various types of beam distributions. Simulations performed for proton beams of up to 40 mA indicate that the centroid oscillations induced by the displacement and rotational misalignments of the solenoids do not depend on the beam distribution. It is shown that the beam envelope around the centroid is independent of the centroid motion for small centroid oscillations. In addition, we have estimated the beam loss during transport caused by the misalignment for various beam distributions.
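
The reported distribution-independence of the centroid motion can be illustrated directly: for a linear external force and internal (space-charge-like) forces that cancel in the sum, the centroid obeys the single-particle equation x_c'' = -k(x_c - d) regardless of the particle distribution. The toy force model and parameters below are assumptions for illustration, not the paper's PIC field solve:

```python
import random

def evolve_centroid(xs, k=1.0, d=0.2, dt=0.01, steps=1000):
    """Symplectic-Euler push of all particles; returns the centroid history."""
    n = len(xs)
    vs = [0.0] * n
    history = []
    for _ in range(steps):
        xc = sum(xs) / n
        history.append(xc)
        # External linear force about the offset d, plus internal linear
        # kicks that cancel in the sum (momentum conservation), so the
        # centroid never feels them.
        acc = [-k * (x - d) + 0.05 * (x - xc) for x in xs]
        vs = [v + a * dt for v, a in zip(vs, acc)]
        xs = [x + v * dt for x, v in zip(xs, vs)]
    return history

random.seed(0)
beams = [[random.uniform(-1, 1) for _ in range(200)],
         [random.gauss(0, 0.5) for _ in range(200)]]
for beam in beams:                      # start both beams at centroid 0
    c = sum(beam) / len(beam)
    for i in range(len(beam)):
        beam[i] -= c
h_uniform, h_gauss = (evolve_centroid(b) for b in beams)
# The two centroid histories coincide: the oscillation about the solenoid
# offset d depends only on the initial centroid, not on the distribution.
```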

  5. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses.

  6. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

The purpose of the present article is to call attention to a realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systematize the available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.
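
For context, the consistency condition referred to here is commonly written in the Gorenstein-Yang form; this standard formulation is supplied as background and may differ from the article's own notation:

```latex
% Ideal quasiparticle pressure with T-dependent mass m(T),
% supplemented by a bag-like function B(T):
p(T) \;=\; p_{\mathrm{id}}\bigl(T, m(T)\bigr) \;-\; B(T).
% Thermodynamic consistency (entropy and energy density keep their
% standard forms) requires B to absorb the explicit m-dependence:
\frac{\mathrm{d}B}{\mathrm{d}T}
  \;=\; \left.\frac{\partial p_{\mathrm{id}}}{\partial m}\right|_{T}
        \,\frac{\mathrm{d}m}{\mathrm{d}T}
\qquad\Longleftrightarrow\qquad
\frac{\delta p}{\delta m} \;=\; 0 .
```

The second relation states that the total pressure is stationary under variations of the quasiparticle mass, which is what keeps s = ∂p_id/∂T and ε = Ts - p in their standard forms.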

  7. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

Hull consistency is a known technique for improving the efficiency of iterative interval methods for solving nonlinear systems that describe steady states in various circuits. At present, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
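
The scalar procedure described above (contracting the box one equation and one variable at a time, then iterating to a fixed point) can be sketched with toy interval arithmetic; the linear system below is an illustrative assumption:

```python
def sub(a, b):
    """Interval subtraction a - b."""
    return (a[0] - b[1], a[1] - b[0])

def halve(a):
    """Multiply an interval by the positive scalar 1/2."""
    return (a[0] * 0.5, a[1] * 0.5)

def intersect(a, b):
    """Intersect two intervals; an empty result proves there is no root."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: no solution in the box")
    return (lo, hi)

# Toy system: 2x + y = 7 and x + 2y = 8, initial box x, y in [0, 10].
x, y = (0.0, 10.0), (0.0, 10.0)
for _ in range(40):
    x = intersect(x, halve(sub((7.0, 7.0), y)))  # solve eq. 1 for x
    y = intersect(y, halve(sub((8.0, 8.0), x)))  # solve eq. 2 for y
# The box contracts onto the unique solution x = 2, y = 3.
```

On well-conditioned systems like this one the contractions converge to the root; in general, scalar hull consistency can stall on a box that is still wide, which is the limitation the simultaneous treatment aims at.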

  8. A self consistent study of the phase transition in the scalar electroweak theory at finite temperature

    International Nuclear Information System (INIS)

    Kerres, U.; Mack, G.; Palma, G.

    1994-12-01

We propose the study of the phase transition in the scalar electroweak theory at finite temperature by a two-step method. It combines i) dimensional reduction to a 3-dimensional lattice theory via a perturbative blockspin transformation, and ii) either further real-space renormalization group transformations, or the solution of gap equations, for the 3d lattice theory. A gap equation can be obtained by using the Peierls inequality to find the best quadratic approximation to the 3d action. This method avoids the lack of self-consistency of the usual treatments, which do not separate infrared and UV problems by introducing a lattice cutoff. The effective 3d lattice action could also be used in computer simulations. (orig.)

  9. A self consistent study of the phase transition in the scalar electroweak theory at finite temperature

    International Nuclear Information System (INIS)

    Kerres, U.

    1995-01-01

We propose the study of the phase transition in the scalar electroweak theory at finite temperature by a two-step method. It combines i) dimensional reduction to a 3-dimensional lattice theory via a perturbative blockspin transformation, and ii) either further real-space renormalization group transformations, or the solution of gap equations, for the 3d lattice theory. A gap equation can be obtained by using the Peierls inequality to find the best quadratic approximation to the 3d action. This method avoids the lack of self-consistency of the usual treatments, which do not separate infrared and UV problems by introducing a lattice cutoff. The effective 3d lattice action could also be used in computer simulations. (orig.)

  10. An exact and consistent adjoint method for high-fidelity discretization of the compressible flow equations

    Science.gov (United States)

    Subramanian, Ramanathan Vishnampet Ganapathi

    , and can be tailored to achieve global conservation up to arbitrary orders of accuracy. We again confirm that the sensitivity gradient for turbulent jet noise computed using our dual-consistent method is only limited by computing precision.

  11. Fully consistent CFD methods for incompressible flow computations

    DEFF Research Database (Denmark)

    Kolmogorov, Dmitry; Shen, Wen Zhong; Sørensen, Niels N.

    2014-01-01

Nowadays collocated-grid-based CFD methods are one of the most efficient tools for computations of the flows past wind turbines. To ensure the robustness of the methods they require special attention to the well-known problem of pressure-velocity coupling. Many commercial codes to ensure the pressure...

  12. Parental depressive and anxiety symptoms during pregnancy and attention problems in children: A cross-cohort consistency study

    OpenAIRE

    Batenburg-Eddes, Tamara; Brion, Maria; Henrichs, Jens; Jaddoe, Vincent; Hofman, Albert; Verhulst, Frank; Lawlor, Debbie; Davey-Smith, George; Tiemeier, Henning

    2013-01-01

Background: Maternal depression and anxiety during pregnancy have been associated with offspring attention-deficit problems. Aim: We explored possible intrauterine effects by comparing maternal and paternal symptoms during pregnancy, by investigating cross-cohort consistency, and by investigating whether parental symptoms in early childhood may explain any observed intrauterine effect. Methods: This study was conducted in two cohorts (Generation R, n = 2,280 and ALSPAC, n = 3,442). Pregnant w...

  13. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

The purpose of the present article is to call attention to a realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systematize the available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.

  14. Self-Consistent-Field Method and τ-Functional Method on Group Manifold in Soliton Theory: a Review and New Results

    Directory of Open Access Journals (Sweden)

    Seiya Nishiyama

    2009-01-01

The maximally-decoupled method has been considered as a theory that applies the basic idea of an integrability condition to certain multiply parametrized symmetries. The method is regarded as a mathematical tool to describe a symmetry of a collective submanifold in which a canonicity condition makes the collective variables an orthogonal coordinate system. For this aim we adopt a concept of curvature unfamiliar in the conventional time-dependent (TD) self-consistent field (SCF) theory. Our basic idea lies in the introduction of a Lagrange-like manner, familiar from fluid dynamics, to describe a collective coordinate system. This manner enables us to take a one-form which is linearly composed of a TD SCF Hamiltonian and infinitesimal generators induced by collective variable differentials of a canonical transformation on a group. The integrability condition of the system reads: the curvature C = 0. Our method is constructed so as to manifest the structure of the group under consideration. To go beyond the maximally-decoupled method, we have aimed to construct an SCF theory, i.e., a υ (external parameter)-dependent Hartree-Fock (HF) theory. Toward such an ultimate goal, the υ-HF theory has been reconstructed on an affine Kac-Moody algebra along the lines of soliton theory, using infinite-dimensional fermions. An infinite-dimensional fermion operator is introduced through a Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent potential with a Υ-periodicity. A bilinear equation for the υ-HF theory has been transcribed onto the corresponding τ-function using the regular representation of the group and Schur polynomials. The υ-HF SCF theory on an infinite-dimensional Fock space F∞ leads to a dynamics on an infinite-dimensional Grassmannian Gr∞ and may describe more precisely such a dynamics on the group manifold. A finite-dimensional Grassmannian is identified with a Gr

  15. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

The guidelines presented in the US Department of Energy General Design Criteria (DOE 6430.1A) and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well-defined approach to determining the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of load combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and the overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify a unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  16. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an
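
The backward-iteration construction mentioned above can be sketched on a binomial tree; the one-period standard-deviation premium principle and the numbers below are illustrative assumptions, not the paper's specification:

```python
import math

def one_period_value(up, down, p=0.5, beta=0.25):
    """Standard-deviation premium principle for a two-point payoff."""
    mean = p * up + (1 - p) * down
    var = p * (up - mean) ** 2 + (1 - p) * (down - mean) ** 2
    return mean + beta * math.sqrt(var)

def backward_valuation(terminal):
    """Backward iteration of the one-period valuation on a binomial tree."""
    layer = list(terminal)
    while len(layer) > 1:
        layer = [one_period_value(layer[i + 1], layer[i])
                 for i in range(len(layer) - 1)]
    return layer[0]

# Two-period claim, terminal payoffs listed by number of up-moves.
terminal = [0.0, 1.0, 4.0]
v = backward_valuation(terminal)

# One-shot ("static") valuation of the same terminal distribution.
probs = [0.25, 0.5, 0.25]
m = sum(p * x for p, x in zip(probs, terminal))
sd = math.sqrt(sum(p * (x - m) ** 2 for p, x in zip(probs, terminal)))
static = m + 0.25 * sd
# v != static: the static principle is not time-consistent.
```

Here the iterated value (2.03125) exceeds the static one (1.875): valuing period by period and valuing in one shot disagree, which is exactly the failure of time consistency that backward iteration repairs.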

  17. From virtual clustering analysis to self-consistent clustering analysis: a mathematical study

    Science.gov (United States)

    Tang, Shaoqiang; Zhang, Lei; Liu, Wing Kam

    2018-03-01

In this paper, we propose a new homogenization algorithm, virtual clustering analysis (VCA), and provide a mathematical framework for the recently proposed self-consistent clustering analysis (SCA) (Liu et al. in Comput Methods Appl Mech Eng 306:319-341, 2016). In the mathematical theory, we clarify the key assumptions and ideas of VCA and SCA, and derive the continuous and discrete Lippmann-Schwinger equations. Based on a key postulate of "once response similarly, always response similarly", clustering is performed in an offline stage by machine learning techniques (k-means and SOM), which facilitates a substantial reduction of computational complexity in an online predictive stage. The clear mathematical setup allows for the first time a convergence study of clustering refinement in one space dimension. Convergence is proved rigorously, and is found to be of second order in numerical investigations. Furthermore, we propose to suitably enlarge the domain in VCA so that the boundary terms may be neglected in the Lippmann-Schwinger equation, by virtue of Saint-Venant's principle. In contrast, these terms were not obtained in the original SCA paper, and we discover that they may well be responsible for the numerical dependency on the choice of the reference material property. Since VCA enhances accuracy by overcoming the modeling error, and reduces the numerical cost by avoiding the outer-loop iteration needed in SCA to attain material-property consistency, its efficiency is expected to be even higher than that of the recently proposed SCA algorithm.
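
The offline clustering stage can be sketched with a plain k-means on toy one-dimensional "response" data; points whose responses are similar land in the same cluster and are then treated collectively in the online stage. The data and k below are illustrative assumptions:

```python
def kmeans_1d(data, k, iters=100):
    """Plain 1-D k-means; initial centers are spread over the sorted data."""
    centers = sorted(data)[:: max(1, len(data) // k)][:k]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda j: abs(x - centers[j]))
            clusters[nearest].append(x)
        new_centers = [sum(c) / len(c) if c else centers[j]
                       for j, c in enumerate(clusters)]
        if new_centers == centers:
            break
        centers = new_centers
    return centers, clusters

# Toy "response" data: two well-separated groups of material-point responses.
responses = [0.1, 0.2, 0.15, 5.0, 5.1, 4.9]
centers, clusters = kmeans_1d(responses, k=2)
# Points with similar responses share a cluster and are reduced to a single
# degree of freedom in the online stage.
```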

  18. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we are evaluating only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
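
Fisher's exact test, one of the two tests used in the study, can be computed for a 2×2 table directly from the hypergeometric distribution. The table entries below are illustrative, not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact p-value for the table [[a, b], [c, d]]."""
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

    def hyper(x):
        # Probability of x in the top-left cell with all margins fixed.
        return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

    p_obs = hyper(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # Two-sided p: sum the probabilities of all tables at least as extreme
    # (probability no larger than the observed table's).
    return sum(hyper(x) for x in range(lo, hi + 1)
               if hyper(x) <= p_obs * (1 + 1e-9))

# Hypothetical table: consistent/inconsistent responses split by
# recognized/not-recognized images.
p_value = fisher_exact_2x2(8, 2, 1, 5)
```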

  19. RPA method based on the self-consistent cranking model for 168Er and 158Dy

    International Nuclear Information System (INIS)

    Kvasil, J.; Cwiok, S.; Chariev, M.M.; Choriev, B.

    1983-01-01

The low-lying nuclear states in 168Er and 158Dy are analysed within the random phase approximation (RPA) method based on the self-consistent cranking model (SCCM). The moment of inertia, the value of the chemical potential, and the strength constant k1 have been obtained from the symmetry condition. The pairing strength constants Gsub(tau) have been determined from the experimental values of the neutron and proton pairing energies for nonrotating nuclei. Quite good agreement with the experimental energies of positive-parity states was obtained without introducing the two-phonon vibrational states.

  20. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
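
The nearest-neighbor variant of the idea is easy to sketch: running k-nearest-neighbor regression on 0/1 responses returns an estimate of P(Y = 1 | x) rather than a hard classification. The data and k below are illustrative assumptions:

```python
def knn_probability(train, query, k):
    """k-NN regression on 0/1 labels: an estimate of P(Y = 1 | query)."""
    nearest = sorted(train, key=lambda xy: abs(xy[0] - query))[:k]
    return sum(y for _, y in nearest) / k

# Toy 1-D training data with a step at x = 0 (labels are deterministic here).
train = [(-1.0, 0), (-0.8, 0), (-0.5, 0), (-0.2, 0),
         (0.2, 1), (0.5, 1), (0.8, 1), (1.0, 1)]
p_pos = knn_probability(train, 0.7, k=3)  # deep inside the y = 1 region
p_mid = knn_probability(train, 0.0, k=4)  # on the boundary
```

Averaging the labels of the k nearest points is exactly nonparametric regression on a binary response, which is why consistency of the regression estimator carries over to the probability estimate.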

  1. Can ancestry be consistently determined from the skeleton?

    Directory of Open Access Journals (Sweden)

    Sierp Ingrid

    2015-03-01

Although the concept of race has been thoroughly criticised in biological anthropology, forensic anthropology still uses a number of methods to determine the ‘race’ of a skeleton. The methods must be evaluated to see how effective they are given large individual variation. This study used 20 cases of skeletons of varied provenance to test whether the nine published methods of ‘race’ determination, using a range of different approaches, were able to consistently identify the ethnic origin. No individual was identified as belonging to just one ‘major racial class’, e.g. European, meaning that complete consistency across all nine methods was not observed. In 14 cases (70%), various methods identified the same individual as belonging to all three racial classes. This suggests that the existing methods for the determination of ‘race’ are compromised. The very concept of ‘race’ is inapplicable to variation that occurs between populations only in small ways, and the methods are limited by the geographic population from which their discriminant functions or observations of morphological traits were derived. Methods of multivariate linear discriminant analysis, e.g. CRANID, are supposed to allocate an individual skull to a specific population rather than a ‘major race’. In our analysis CRANID did not produce convincing allocations of individual skeletons to specific populations. The findings of this study show that great caution must be taken when attempting to ascertain the ‘race’ of a skeleton, as the outcome is not only dependent on which skeletal sites are available for assessment, but also on the degree to which the unknown skeleton’s population of origin has been investigated.

  2. Self-consistent ab initio Calculations for Photoionization and Electron-Ion Recombination Using the R-Matrix Method

    Science.gov (United States)

    Nahar, S. N.

    2003-01-01

Most astrophysical plasmas entail a balance between ionization and recombination. We present new results from a unified method for self-consistent and ab initio calculations of the inverse processes of photoionization and (e + ion) recombination. The treatment of (e + ion) recombination subsumes the non-resonant radiative recombination and the resonant dielectronic recombination processes in a unified scheme (S.N. Nahar and A.K. Pradhan, Phys. Rev. A 49, 1816 (1994); H.L. Zhang, S.N. Nahar, and A.K. Pradhan, J. Phys. B 32, 1459 (1999)). Calculations are carried out using the R-matrix method in the close-coupling approximation with an identical wavefunction expansion for both processes to ensure self-consistency. The results for photoionization and recombination cross sections may also be compared with state-of-the-art experiments on synchrotron radiation sources for photoionization, and on heavy-ion storage rings for recombination. The new experiments display heretofore unprecedented detail in terms of resonances and background cross sections and thereby calibrate the theoretical data precisely. We find a level of agreement between theory and experiment of about 10% for not only the ground state but also the metastable states. The recent experiments therefore verify the estimated accuracy of the vast amount of photoionization data computed under the OP, IP and related works. The present work also reports photoionization cross sections including relativistic effects in the Breit-Pauli R-matrix (BPRM) approximation. Detailed features in the calculated cross sections exhibit the missing resonances due to fine structure. Self-consistent datasets for photoionization and recombination have so far been computed for approximately 45 atoms and ions. These are being reported in a continuing series of publications in Astrophysical J. Supplements (e.g. references below).
These data will also be available from the electronic database TIPTOPBASE (http://heasarc.gsfc.nasa.gov)

  3. ANTHEM: a two-dimensional multicomponent self-consistent hydro-electron transport code for laser-matter interaction studies

    International Nuclear Information System (INIS)

    Mason, R.J.

    1982-01-01

The ANTHEM code for the study of CO2-laser-generated transport is outlined. ANTHEM treats the background plasma as coupled Eulerian thermal and ion fluids, and the suprathermal electrons as either a third fluid or a body of evolving collisional PIC particles. The electrons scatter off the ions; the suprathermals drag against the thermal background. Self-consistent E- and B-fields are computed by the Implicit Moment Method. The current status of the code is described. Typical output from ANTHEM is discussed with special application to Augmented-Return-Current CO2-laser-driven targets.

  4. The nuclear N-body problem and the effective interaction in self-consistent mean-field methods

    International Nuclear Information System (INIS)

    Duguet, Thomas

    2002-01-01

This work deals with two aspects of the mean-field-type methods extensively used in low-energy nuclear structure. The first study is at the mean-field level. The link between the wave function describing an even-even nucleus and that of its odd-even neighbor is revisited. To obtain a coherent description as a function of the pairing intensity in the system, the utility of formalizing this link as a two-step process is demonstrated. This two-step process makes it possible to identify the role played by the different channels of the force when a nucleon is added to the system. In particular, perturbative formulae evaluating the contribution of the time-odd components of the functional to the nucleon separation energy are derived for zero and realistic pairing intensities. Self-consistent calculations validate the developed scheme as well as the derived perturbative formulae. This first study ends with an extended analysis of the odd-even mass staggering in nuclei. The new scheme allows the contributions to this observable coming from different channels of the force to be identified. The necessity of a better understanding of the time-odd terms, in order to decide which odd-even mass formula extracts the pairing gap most properly, is identified. These terms being nowadays more or less out of control, extended studies are needed to refine the fit of a pairing force through the comparison of theoretical and experimental odd-even mass differences. The second study deals with beyond-mean-field methods that take care of the correlations associated with large-amplitude oscillations in nuclei. Their effects are usually incorporated through the GCM or the projected mean-field method. We derive, for the first time, a perturbation theory motivating such variational calculations from a diagrammatic point of view. Resumming two-body correlations in the energy expansion, we obtain an effective interaction removing the hard-core problem in the context of configuration-mixing calculations.
Proceeding to a

  5. Application of the adiabatic self-consistent collective coordinate method to a solvable model of prolate-oblate shape coexistence

    International Nuclear Information System (INIS)

    Kobayasi, Masato; Matsuyanagi, Kenichi; Nakatsukasa, Takashi; Matsuo, Masayuki

    2003-01-01

    The adiabatic self-consistent collective coordinate method is applied to an exactly solvable multi-O(4) model that is designed to describe nuclear shape coexistence phenomena. The collective mass and dynamics of large amplitude collective motion in this model system are analyzed, and it is shown that the method yields a faithful description of tunneling motion through a barrier between the prolate and oblate local minima in the collective potential. The emergence of the doublet pattern is clearly described. (author)

  6. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
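
The bookkeeping behind the principle can be checked on any ensemble: with energy variables, the ensemble-mean energy equals the energy of the mean plus the total variance (the trace of the sample covariance), as an exact algebraic identity. A sketch:

```python
import random

random.seed(1)
N, d = 500, 3
# An arbitrary ensemble of d-dimensional "energy variable" states.
ensemble = [[random.gauss(0, 1) for _ in range(d)] for _ in range(N)]

mean = [sum(x[j] for x in ensemble) / N for j in range(d)]
mean_energy = sum(sum(xj * xj for xj in x) for x in ensemble) / N
energy_of_mean = sum(m * m for m in mean)
total_variance = sum(
    sum((x[j] - mean[j]) ** 2 for x in ensemble) / N for j in range(d))
# Exact identity (up to floating-point rounding):
#   mean_energy == energy_of_mean + total_variance
```

Because the identity holds for any ensemble size, a spurious decay of `total_variance` under a dissipative model, with the total energy conserved, would show up directly as an error budget in this decomposition.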

  7. Preventing syndemic Zika virus, HIV/STIs and unintended pregnancy: dual method use and consistent condom use among Brazilian women in marital and civil unions.

    Science.gov (United States)

    Tsuyuki, Kiyomi; Gipson, Jessica D; Barbosa, Regina Maria; Urada, Lianne A; Morisky, Donald E

    2017-12-12

    Syndemic Zika virus, HIV and unintended pregnancy call for an urgent understanding of dual method (condoms with another modern non-barrier contraceptive) and consistent condom use. Multinomial and logistic regression analysis using data from the Pesquisa Nacional de Demografia e Saúde da Criança e da Mulher (PNDS), a nationally representative household survey of reproductive-aged women in Brazil, identified the socio-demographic, fertility and relationship context correlates of exclusive non-barrier contraception, dual method use and condom use consistency. Among women in marital and civil unions, half reported dual protection (30% condoms, 20% dual methods). In adjusted models, condom use was associated with older age and living in the northern region of Brazil or in urban areas, whereas dual method use (versus condom use) was associated with younger age, living in the southern region of Brazil, living in non-urban areas and relationship age homogamy. Among condom users, consistent condom use was associated with reporting Afro-religion or other religion, not wanting (more) children and using condoms only (versus dual methods). Findings highlight that integrated STI prevention and family planning services should target young married/in union women, couples not wanting (more) children and heterogamous relationships to increase dual method use and consistent condom use.

  8. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
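
The G-gap computation described above can be sketched as follows: regress HbA1c on fructosamine in the cohort, use the fit to predict HbA1c, and take G-gap = measured minus predicted. The paired values below are toy numbers, not the study's data:

```python
def least_squares(xs, ys):
    """Slope and intercept of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Toy cohort data (illustrative values only):
fructosamine = [250.0, 300.0, 350.0, 400.0, 450.0]  # umol/L
hba1c = [5.5, 6.4, 7.2, 8.1, 9.3]                   # %
slope, intercept = least_squares(fructosamine, hba1c)

def g_gap(fruct, measured_hba1c):
    """G-gap = measured HbA1c minus the fructosamine-predicted HbA1c."""
    return measured_hba1c - (slope * fruct + intercept)

# A patient whose HbA1c runs above the fructosamine prediction
# has a positive G-gap.
gap = g_gap(350.0, 7.8)
```

Consistency of the G-gap over time is then a question of whether repeated paired measurements in the same patient yield similar values of `gap`.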

  9. MRI Study on the Functional and Spatial Consistency of Resting State-Related Independent Components of the Brain Network

    Energy Technology Data Exchange (ETDEWEB)

Jeong, Bum Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Choi, Jee Wook [Daejeon St. Mary's Hospital, The Catholic University of Korea College of Medicine, Daejeon (Korea, Republic of); Kim, Ji Woong [College of Medical Science, Konyang University, Daejeon (Korea, Republic of)

    2012-06-15

    Resting-state networks (RSNs), including the default mode network (DMN), have been considered as markers of brain status such as consciousness, developmental change, and treatment effects. The consistency of functional connectivity among RSNs has not been fully explored, especially among resting-state-related independent components (RSICs). This resting-state fMRI study addressed the consistency of functional connectivity among RSICs, as well as their spatial consistency, between 'at day 1' and 'after 4 weeks' in 13 healthy volunteers. We found that most RSICs, especially the DMN, are reproducible across time, whereas some RSICs were variable in either their spatial characteristics or their functional connectivity. Relatively low spatial consistency was found in the basal ganglia, a parietal region of the left frontoparietal network, and the supplementary motor area. The functional connectivity between two independent components, the bilateral angular/supramarginal gyri/intraparietal lobule and the bilateral middle temporal/occipital gyri, decreased across time regardless of the correlation analysis method employed (Pearson's or partial correlation). The RSICs showing variable consistency differed between the spatial and functional-connectivity analyses. To understand the brain as a dynamic network, we recommend further investigation of both changes in the activation of specific regions and the modulation of functional connectivity in the brain network.

  10. MRI Study on the Functional and Spatial Consistency of Resting State-Related Independent Components of the Brain Network

    International Nuclear Information System (INIS)

    Jeong, Bum Seok; Choi, Jee Wook; Kim, Ji Woong

    2012-01-01

    Resting-state networks (RSNs), including the default mode network (DMN), have been considered as markers of brain status such as consciousness, developmental change, and treatment effects. The consistency of functional connectivity among RSNs has not been fully explored, especially among resting-state-related independent components (RSICs). This resting-state fMRI study addressed the consistency of functional connectivity among RSICs, as well as their spatial consistency, between 'at day 1' and 'after 4 weeks' in 13 healthy volunteers. We found that most RSICs, especially the DMN, are reproducible across time, whereas some RSICs were variable in either their spatial characteristics or their functional connectivity. Relatively low spatial consistency was found in the basal ganglia, a parietal region of the left frontoparietal network, and the supplementary motor area. The functional connectivity between two independent components, the bilateral angular/supramarginal gyri/intraparietal lobule and the bilateral middle temporal/occipital gyri, decreased across time regardless of the correlation analysis method employed (Pearson's or partial correlation). The RSICs showing variable consistency differed between the spatial and functional-connectivity analyses. To understand the brain as a dynamic network, we recommend further investigation of both changes in the activation of specific regions and the modulation of functional connectivity in the brain network.
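
The record above distinguishes Pearson's correlation from partial correlation when assessing functional connectivity. As a minimal illustration of the difference (synthetic data and variable names of my own, not the study's fMRI components), the full partial-correlation matrix can be read off the precision (inverse covariance) matrix:

```python
import numpy as np

def partial_corr(data):
    """Partial correlation of each pair of columns, controlling for all
    others, obtained from the precision (inverse covariance) matrix."""
    prec = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pc = -prec / np.outer(d, d)
    np.fill_diagonal(pc, 1.0)
    return pc

# x and y are correlated only through a common driver z, so their Pearson
# correlation is large while their partial correlation (given z) is near zero.
rng = np.random.default_rng(2)
z = rng.normal(size=5000)
x = z + rng.normal(size=5000)
y = z + rng.normal(size=5000)
data = np.column_stack([x, y, z])
pearson_xy = np.corrcoef(x, y)[0, 1]
partial_xy = partial_corr(data)[0, 1]
```

Here Pearson's r(x, y) comes out near 0.5, while the partial correlation given z collapses toward zero; this is why the two methods can disagree about which connections are "real".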

  11. Optimum collective submanifold in resonant cases by the self-consistent collective-coordinate method for large-amplitude collective motion

    International Nuclear Information System (INIS)

    Hashimoto, Y.; Marumori, T.; Sakata, F.

    1987-01-01

    With the purpose of clarifying characteristic difference of the optimum collective submanifolds in nonresonant and resonant cases, we develop an improved method of solving the basic equations of the self-consistent collective-coordinate (SCC) method for large-amplitude collective motion. It is shown that, in the resonant cases, there inevitably arise essential coupling terms which break the maximal-decoupling property of the collective motion, and we have to extend the optimum collective submanifold so as to properly treat the degrees of freedom which bring about the resonances

  12. Decentralized Method for Load Sharing and Power Management in a Hybrid Single/Three-Phase-Islanded Microgrid Consisting of Hybrid Source PV/Battery Units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Oraee, Hashem; Guerrero, Josep M.

    2017-01-01

    This paper proposes a new decentralized power management and load sharing method for a photovoltaic-based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method is not limited to the systems with separate PV...... in different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units....

  13. Self-consistent simulation studies of periodically focused intense charged-particle beams

    International Nuclear Information System (INIS)

    Chen, C.; Jameson, R.A.

    1995-01-01

    A self-consistent two-dimensional model is used to investigate intense charged-particle beam propagation through a periodic solenoidal focusing channel, particularly in the regime in which there is a mismatch between the beam and the focusing channel. The present self-consistent studies confirm that mismatched beams exhibit nonlinear resonances and chaotic behavior in the envelope evolution, as predicted by an earlier envelope analysis [C. Chen and R. C. Davidson, Phys. Rev. Lett. 72, 2195 (1994)]. Transient effects due to emittance growth are studied, and halo formation is investigated. The halo size is estimated. The halo characteristics for a periodic focusing channel are found to be qualitatively the same as those for a uniform focusing channel. A threshold condition is obtained numerically for halo formation in mismatched beams in a uniform focusing channel, which indicates that relative envelope mismatch must be kept well below 20% to prevent space-charge-dominated beams from developing halos

  14. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sanjoy Kumar, E-mail: sanjoydasju@gmail.com; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, a simplex lattice mixture design was applied for the formulation development and optimization of a controlled-release dosage form of ketoprofen microspheres consisting of the polymers ethylcellulose and Eudragit® RL 100, formed by an oil-in-oil emulsion solvent evaporation method. The investigation examined the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release over 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Numerical optimization was carried out based on the desirability function approach. The optimized formulation (KTF-O) showed a close match between actual and predicted responses, with a desirability factor of 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show the discreteness of the microspheres (149.2 ± 1.25 μm) and their surface conditions during pre- and post-dissolution operations. The drug release pattern from KTF-O was best explained by the Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres showed maximum entrapment (~ 90%), minimum loss (~ 10%) and prolonged drug release for 8 h (91.25%), which may be considered favourable criteria for a controlled-release dosage form.
    - Graphical abstract: Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design.
    - Highlights:
      • Simplex lattice design was used to optimize ketoprofen-loaded microspheres.
      • Polymeric blend (ethylcellulose and Eudragit® RL 100) was used.
      • Microspheres were prepared by oil-in-oil emulsion solvent evaporation method.
      • Optimized formulation depicted favourable

  15. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known...... to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct...... from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed....

  16. Methods for regionalization of impacts of non-toxic air pollutants in life-cycle assessments often tell a consistent story

    DEFF Research Database (Denmark)

    Djomo, Sylvestre Njakou; Knudsen, Marie Trydeman; Andersen, Mikael Skou

    2017-01-01

    There is an ongoing debate regarding the influence of the source location of pollution on the fate of pollutants and their subsequent impacts. Several methods have been developed to derive site-dependent characterization factors (CFs) for use in life-cycle assessment (LCA). Consistent, precise, a...

  17. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x......-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  18. Image velocimetry for clouds with relaxation labeling based on deformation consistency

    International Nuclear Information System (INIS)

    Horinouchi, Takeshi; Murakami, Shin-ya; Yamazaki, Atsushi; Kouyama, Toru; Ogohara, Kazunori; Yamada, Manabu; Watanabe, Shigeto

    2017-01-01

    Correlation-based cloud tracking has been extensively used to measure atmospheric winds, but difficulties remain. In this study, aiming at developing a cloud tracking system for Akatsuki, an artificial satellite orbiting Venus, a formulation is developed for improving the relaxation labeling technique to select appropriate peaks of cross-correlation surfaces, which tend to have multiple peaks. The formulation makes explicit use of a consistency inherent in the type of cross-correlation method where template sub-images are slid without deformation: if the resultant motion vectors indicate a too-large deformation, they contradict the assumption of the method. The deformation consistency is exploited further to develop two post-processes: one clusters the motion vectors into groups within each of which the consistency is perfect, and the other extends the groups using the original candidate lists. These processes are useful for eliminating erroneous vectors, distinguishing motion vectors at different altitudes, and detecting phase velocities of waves in fluids such as atmospheric gravity waves. As a basis of the relaxation labeling and the post-processes, as well as of uncertainty estimation, the necessity of finding isolated (well-separated) peaks of cross-correlation surfaces is argued, and an algorithm to realize it is presented. All the methods are implemented, and their effectiveness is demonstrated with initial images obtained by the ultraviolet imager onboard Akatsuki. Since the deformation consistency reflects a logical consistency inherent in template matching methods, it should have broad application beyond cloud tracking. (paper)

  19. Image velocimetry for clouds with relaxation labeling based on deformation consistency

    Science.gov (United States)

    Horinouchi, Takeshi; Murakami, Shin-ya; Kouyama, Toru; Ogohara, Kazunori; Yamazaki, Atsushi; Yamada, Manabu; Watanabe, Shigeto

    2017-08-01

    Correlation-based cloud tracking has been extensively used to measure atmospheric winds, but difficulties remain. In this study, aiming at developing a cloud tracking system for Akatsuki, an artificial satellite orbiting Venus, a formulation is developed for improving the relaxation labeling technique to select appropriate peaks of cross-correlation surfaces, which tend to have multiple peaks. The formulation makes explicit use of a consistency inherent in the type of cross-correlation method where template sub-images are slid without deformation: if the resultant motion vectors indicate a too-large deformation, they contradict the assumption of the method. The deformation consistency is exploited further to develop two post-processes: one clusters the motion vectors into groups within each of which the consistency is perfect, and the other extends the groups using the original candidate lists. These processes are useful for eliminating erroneous vectors, distinguishing motion vectors at different altitudes, and detecting phase velocities of waves in fluids such as atmospheric gravity waves. As a basis of the relaxation labeling and the post-processes, as well as of uncertainty estimation, the necessity of finding isolated (well-separated) peaks of cross-correlation surfaces is argued, and an algorithm to realize it is presented. All the methods are implemented, and their effectiveness is demonstrated with initial images obtained by the ultraviolet imager onboard Akatsuki. Since the deformation consistency reflects a logical consistency inherent in template matching methods, it should have broad application beyond cloud tracking.
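
The core step of such correlation-based tracking is template matching: a sub-image from one frame is slid over a search window in the next frame, and the displacement is read off the cross-correlation peak. A toy sketch of that step (the paper's relaxation labeling and deformation-consistency post-processes are not reproduced; array sizes and names are illustrative):

```python
import numpy as np

def ncc_displacement(img1, img2, top, left, tsize, search):
    """Estimate the motion of a square template cut from img1 by locating
    the peak of normalized cross-correlation within a search window of img2."""
    t = img1[top:top + tsize, left:left + tsize]
    t = (t - t.mean()) / t.std()
    best, best_dy, best_dx = -np.inf, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            w = img2[top + dy:top + dy + tsize, left + dx:left + dx + tsize]
            w = (w - w.mean()) / w.std()
            c = (t * w).mean()                 # normalized cross-correlation
            if c > best:
                best, best_dy, best_dx = c, dy, dx
    return best_dy, best_dx, best

# Second frame is the first shifted by a known (3, -2); the peak recovers it.
rng = np.random.default_rng(1)
frame1 = rng.normal(size=(64, 64))
frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))
dy, dx, score = ncc_displacement(frame1, frame2, 20, 20, 16, 5)
```

With multiple plausible peaks (as with real cloud imagery), the paper's relaxation labeling chooses among candidate peaks by penalizing vector fields that imply a large deformation of the template.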

  20. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154 Sm, of the first excited 2 + states of Sn isotopes and of the first excited 3 - states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B (Eλ) values. (author)

  1. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.

  2. Consistency Analysis of Ultrasound Echoes within a Dual Symmetric Path Inspection Framework

    Directory of Open Access Journals (Sweden)

    VASILE, C.

    2015-05-01

    Non-destructive ultrasound inspection of metallic structures is a perpetual high-interest area of research because of its well-known benefits in industrial applications, especially from an economic point of view, where detection and localisation of defects in their most initial stages can help maintain high production capabilities for any enterprise. This paper is aimed at providing further validation of a new technique for detecting and localising defects in metals, the Matched Filter-based Dual Symmetric Path Inspection (MF-DSPI). This validation consists in demonstrating the consistency of the useful ultrasound echoes within the framework of the MF-DSPI. A description of the MF-DSPI method and the authors' related work are presented in this paper, along with the experimental setup used to obtain the data with which the useful echo consistency was studied. The four proposed methods are: signal envelope analysis, the L2-norm criterion, the correlation coefficient criterion and sliding bounding rectangle analysis. The aim of this paper is to verify the useful echo consistency (with the help of these four approaches), as the MF-DSPI method strongly relies on this feature. The results and their implications are discussed in the latter portion of this study.

  3. Providing Consistent (A)ATSR Solar Channel Radiometry for Climate Studies

    Science.gov (United States)

    Smith, D.; Latter, B. G.; Poulsen, C.

    2012-04-01

    Data from the solar reflectance channels of the Along Track Scanning Radiometer (ATSR) series of instruments are being used in applications for monitoring trends in clouds and aerosols. In order to provide quantitative information, the radiometric calibrations of the sensors must be consistent, stable and ideally traced to international standards. This paper describes the methods used to monitor the calibrations of the ATSR instruments to provide consistent Level 1b radiometric data sets. Comparisons of the in-orbit calibrations are made by reference to data from quasi stable sites such as DOME-C in Antarctica or Saharan Desert sites. Comparisons are performed either by time coincident match-ups of the radiometric data for sensors having similar spectral bands and view/solar geometry and overpass times as for AATSR and MERIS; or via a reference BRDF model derived from averages of measurements over the site from a reference sensor where there is limited or no temporal overlap (e.g. MODIS-Aqua, ATSR-1 and ATSR-2). The results of the intercomparisons provide values of the long term calibration drift and systematic biases between the sensors. Look-up tables based on smoothed averages of the drift measurements are used to provide the corrections to Level 1b data. The uncertainty budgets for the comparisons are provided. It is also possible to perform comparisons of measurements against high spectral resolution instruments that are co-located on the same platform, i.e. AATSR/SCIA on ENVISAT and ATSR-2/GOME on ERS-2. The comparisons are performed by averaging the spectrometer data to the spectral response of the filter radiometer, and averaging the radiometer data to the spatial resolution of the spectrometer. In this paper, the authors present the results of the inter-comparisons to achieve a consistent calibration for the solar channels of the complete ATSR dataset. An assessment of the uncertainties associated with the techniques will be discussed. 

  4. Optical forces, torques, and force densities calculated at a microscopic level using a self-consistent hydrodynamics method

    Science.gov (United States)

    Ding, Kun; Chan, C. T.

    2018-04-01

    The calculation of optical force density distribution inside a material is challenging at the nanoscale, where quantum and nonlocal effects emerge and macroscopic parameters such as permittivity become ill-defined. We demonstrate that the microscopic optical force density of nanoplasmonic systems can be defined and calculated using the microscopic fields generated using a self-consistent hydrodynamics model that includes quantum, nonlocal, and retardation effects. We demonstrate this technique by calculating the microscopic optical force density distributions and the optical binding force induced by external light on nanoplasmonic dimers. This approach works even in the limit when the nanoparticles are close enough to each other so that electron tunneling occurs, a regime in which classical electromagnetic approach fails completely. We discover that an uneven distribution of optical force density can lead to a light-induced spinning torque acting on individual particles. The hydrodynamics method offers us an accurate and efficient approach to study optomechanical behavior for plasmonic systems at the nanoscale.

  5. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    Science.gov (United States)

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

    In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations by applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na⁺, K⁺, and Ca²⁺ solutions based on nanosecond-scale sampling, 100 times longer than that of conventional QM-based samplings. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
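
One of the dynamic properties mentioned, the diffusion coefficient, is conventionally extracted from an MD trajectory through the Einstein relation D = MSD(t)/(2·d·t), where MSD is the mean-squared displacement and d the dimensionality. A self-contained sketch on a synthetic 3-D random walk (not the study's QM/MM trajectories; for the walk the exact D is known, so the estimate can be checked):

```python
import numpy as np

def msd(traj, lag):
    """Mean-squared displacement at a given lag, averaged over time origins."""
    d = traj[lag:] - traj[:-lag]
    return np.mean(np.sum(d**2, axis=1))

# Synthetic 3-D random walk: per-step displacement ~ N(0, sigma²) in each axis,
# for which the exact diffusion coefficient is D = sigma² / (2·dt).
rng = np.random.default_rng(3)
dt, sigma = 1.0, 0.1
traj = np.cumsum(rng.normal(0.0, sigma, size=(100_000, 3)), axis=0)

lag = 10
D_est = msd(traj, lag) / (2 * 3 * lag * dt)   # Einstein relation, d = 3
D_true = sigma**2 / (2 * dt)
```

In practice one fits MSD(t) over a range of lags in the linear diffusive regime rather than using a single lag as done here.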

  6. Standard test methods for determining chemical durability of nuclear, hazardous, and mixed waste glasses and multiphase glass ceramics: The product consistency test (PCT)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2002-01-01

    1.1 These product consistency test methods A and B evaluate the chemical durability of homogeneous glasses, phase separated glasses, devitrified glasses, glass ceramics, and/or multiphase glass ceramic waste forms hereafter collectively referred to as “glass waste forms” by measuring the concentrations of the chemical species released to a test solution. 1.1.1 Test Method A is a seven-day chemical durability test performed at 90 ± 2°C in a leachant of ASTM-Type I water. The test method is static and conducted in stainless steel vessels. Test Method A can specifically be used to evaluate whether the chemical durability and elemental release characteristics of nuclear, hazardous, and mixed glass waste forms have been consistently controlled during production. This test method is applicable to radioactive and simulated glass waste forms as defined above. 1.1.2 Test Method B is a durability test that allows testing at various test durations, test temperatures, mesh size, mass of sample, leachant volume, a...

  7. Assessment of consistency of the whole tumor and single section perfusion imaging with 256-slice spiral CT: a preliminary study

    International Nuclear Information System (INIS)

    Sun Hongliang; Xu Yanyan; Hu Yingying; Tian Yuanjiang; Wang Wu

    2014-01-01

    Objective: To determine the consistency between quantitative CT perfusion measurements of colorectal cancer obtained from a single section with maximal tumor dimension and from the average of the whole tumor, and to compare the intra- and inter-observer consistency of the two analysis methods. Methods: Twenty-two patients with histologically proven colorectal cancer were examined prospectively with 256-slice CT, and whole-tumor perfusion images were obtained. Perfusion parameters were obtained from a region of interest (ROI) placed in the single section showing the maximal tumor dimension, then from ROIs placed in all tumor-containing sections, by two radiologists. Consistency between values of blood flow (BF), blood volume (BV) and time to peak (TTP) calculated by the two methods was assessed. Intra-observer consistency was evaluated by comparing repeated measurements done by the same radiologist using both methods after 3 months. Perfusion measurements were done by another radiologist independently to assess the inter-observer consistency of both methods. The results from the different methods were compared using the paired t test and Bland-Altman plots. Results: Twenty-two patients were examined successfully. The perfusion parameters BF, BV and TTP obtained by whole-tumor perfusion and single-section analysis were (35.59 ± 14.59) ml · min⁻¹ · 100 g⁻¹, (17.55 ± 4.21) ml · 100 g⁻¹, (21.30 ± 7.57) s and (34.64 ± 13.29) ml · min⁻¹ · 100 g⁻¹, (17.61 ± 6.39) ml · 100 g⁻¹, (19.82 ± 9.01) s, respectively. No significant differences were observed between the means of the perfusion parameters (BF, BV, TTP) calculated by the two methods (t=0.218, -0.033, -0.668, P>0.05, respectively). The intra-observer 95% limits of consistency of the perfusion parameters were BF -5.3% to 10.0%, BV -13.8% to 10.8%, TTP -15.0% to 12.6% with whole-tumor analysis, and BF -14.3% to 16.5%, BV -24.2% to 22.2%, TTP -19.0% to 16.1% with single-section analysis. The inter-observer 95% limits of
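
The 95% "limits of consistency" quoted above are Bland-Altman limits of agreement: the mean of the paired differences (bias) ± 1.96 × their standard deviation. A generic sketch with synthetic paired measurements (illustrative values, not the study's data):

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bias and 95% limits of agreement between paired measurements a and b."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Two hypothetical readings of the same quantity for 22 subjects
# (mimicking the study's n = 22, but with made-up numbers).
rng = np.random.default_rng(0)
truth = rng.uniform(20, 50, 22)
whole_tumor = truth + rng.normal(0, 2, 22)
single_section = truth + rng.normal(0, 2, 22)
bias, (lo, hi) = bland_altman_limits(whole_tumor, single_section)
```

Since the two synthetic readings share the same underlying truth, the limits bracket zero; a genuine systematic difference between methods would shift the whole interval away from zero.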

  8. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space-charge effects in high-intensity beams. The Poisson equation has to be solved at each time step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of those numerical methods is O(N log(N)) or O(N) instead of O(N²), where N is the total number of grid points used to solve the Poisson equation
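
For periodic boundary conditions, the simplest of the O(N log N) solvers the abstract alludes to is the spectral method: transform the charge density with an FFT, divide each mode by k², and transform back. A minimal 2-D sketch (the grid size and the unit convention ∇²φ = −ρ are my own choices, not the paper's):

```python
import numpy as np

def poisson_fft(rho, L=2 * np.pi):
    """Solve ∇²φ = -ρ on a periodic n×n grid via FFT (O(n² log n))."""
    n = rho.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # integer wavenumbers when L = 2π
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                               # avoid division by zero at k = 0
    phi_hat = np.fft.fft2(rho) / k2
    phi_hat[0, 0] = 0.0                          # zero-mean gauge for the k = 0 mode
    return np.real(np.fft.ifft2(phi_hat))

# Check against an analytic solution: rho = 2 sin x sin y  =>  phi = sin x sin y.
n, L = 64, 2 * np.pi
x = np.linspace(0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = poisson_fft(2 * np.sin(X) * np.sin(Y))
err = np.max(np.abs(phi - np.sin(X) * np.sin(Y)))
```

For a single Fourier mode the spectral solve is exact to machine precision, which makes it a convenient correctness check before feeding in a deposited particle density.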

  9. How consistent are beliefs about the causes and solutions to illness? An experimental study.

    OpenAIRE

    Ogden, J; Jubb, A

    2008-01-01

    Objectives: Research illustrates that people hold beliefs about the causes and solutions to illness. This study aimed to assess the consistency in these beliefs in terms of their variation according to type of problem and whether they are consistent with each other. Further, the study aimed to assess whether they are open to change and whether changing beliefs about cause resulted in a subsequent shift in beliefs about solutions. Design: Experimental factorial 3 (problem) × 2 (manipulated cau...

  10. A consistent response spectrum analysis including the resonance range

    International Nuclear Information System (INIS)

    Schmitz, D.; Simmchen, A.

    1983-01-01

    The report provides a complete, consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA) which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because this method is today preferentially applied as a tool for the earthquake proof of components in nuclear power plants. (orig./HP)

  11. Young women's consistency of contraceptive use--does depression or stress matter?

    Science.gov (United States)

    Stidham Hall, Kelli; Moreau, Caroline; Trussell, James; Barber, Jennifer

    2013-11-01

    We prospectively examined the influence of young women's depression and stress symptoms on their weekly consistency of contraceptive method use. Women ages 18-20 years (n = 689) participating in a longitudinal cohort study completed weekly journals assessing reproductive, relationship and health characteristics. We used data through 12 months of follow-up (n = 8877 journals) to examine relationships between baseline depression (CES-D) and stress (PSS-10) symptoms and consistency of contraceptive method use with sexual activity each week. We analyzed data with random effects multivariable logistic regression. Consistent contraceptive use (72% of weeks) was 10-15 percentage points lower among women with moderate/severe baseline depression and stress symptoms than among those without symptoms. Women with moderate/severe depression and stress symptoms had lower odds of contraceptive consistency each week than those without symptoms (OR 0.53, CI 0.31-0.91 and OR 0.31, CI 0.18-0.52, respectively). Stress predicted inconsistent use of oral contraceptives (OR 0.27, CI 0.12-0.58), condoms (OR 0.40, CI 0.23-0.69) and withdrawal (OR 0.12, CI 0.03-0.50). Women with depression and stress symptoms appear to be at increased risk of user-related contraceptive failures, especially for the most commonly used methods. Our study has shown that young women with elevated depression and stress symptoms appear to be at risk of inconsistent contraceptive use patterns, especially for the most common methods, which require greater user effort and diligence. Based upon these findings, clinicians should consider women's psychological and emotional status when helping patients with contraceptive decision-making and management. User-dependent contraceptive method efficacy is important to address in education and counseling sessions, and women with stress or depression may be ideal candidates for long-acting reversible methods, which offer highly effective options with less user-related burden. Ongoing research will provide a greater understanding of how young women

  12. Consistency of parametric registration in serial MRI studies of brain tumor progression

    International Nuclear Information System (INIS)

    Mang, Andreas; Buzug, Thorsten M.; Schnabel, Julia A.; Crum, William R.; Modat, Marc; Ourselin, Sebastien; Hawkes, David J.; Camara-Rey, Oscar; Palm, Christoph; Caseiras, Gisele Brasil; Jaeger, H.R.

    2008-01-01

    The consistency of parametric registration in multi-temporal magnetic resonance (MR) imaging studies was evaluated. Serial MRI scans of adult patients with a brain tumor (glioma) were aligned by parametric registration. The performance of low-order spatial alignment (6/9/12 degrees of freedom) of different 3D serial MR-weighted images was evaluated. A registration protocol for the alignment of all images to one reference coordinate system at baseline is presented. Registration results were evaluated for both multimodal intra-timepoint and mono-modal multi-temporal registration. The latter case might present a challenge to automatic intensity-based registration algorithms due to ill-defined correspondences. The performance of our algorithm was assessed by testing the inverse registration consistency. Four different similarity measures were evaluated to assess consistency. Careful visual inspection suggests that images are well aligned, but their consistency may be imperfect. Sub-voxel inconsistency within the brain was found for all similarity measures used for parametric multi-temporal registration. T1-weighted images were most reliable for establishing spatial correspondence between different timepoints. The parametric registration algorithm is feasible for use in this application. The sub-voxel resolution mean displacement error of registration transformations demonstrates that the algorithm converges to an almost identical solution for forward and reverse registration. (orig.)
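
The inverse-consistency test described above can be sketched for the affine case: compose the forward and reverse transforms and report the mean residual displacement over a set of sample points. A minimal illustration assuming 4x4 homogeneous matrices (not the paper's registration code):

```python
import numpy as np

def inverse_consistency_error(T_fwd, T_rev, points):
    """Mean residual displacement after applying the forward affine
    transform followed by the reverse one (4x4 homogeneous matrices).
    For a perfectly consistent pair, T_rev @ T_fwd is the identity."""
    homog = np.c_[points, np.ones(len(points))]       # N x 4
    round_trip = (T_rev @ T_fwd @ homog.T).T[:, :3]   # back to N x 3
    return np.linalg.norm(round_trip - points, axis=1).mean()

# Toy check: a rigid transform and its exact inverse give ~zero error.
theta = np.deg2rad(5.0)
T_fwd = np.eye(4)
T_fwd[:2, :2] = [[np.cos(theta), -np.sin(theta)],
                 [np.sin(theta),  np.cos(theta)]]
T_fwd[:3, 3] = [2.0, -1.0, 0.5]
T_rev = np.linalg.inv(T_fwd)

pts = np.random.default_rng(0).uniform(-50, 50, size=(1000, 3))
err = inverse_consistency_error(T_fwd, T_rev, pts)
print(err)  # sub-voxel by construction, ~0
```

In a real study the reverse transform comes from an independent registration run rather than a matrix inverse, so `err` measures algorithmic consistency instead of floating-point error.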

  13. Young women's consistency of contraceptive use – Does depression or stress matter?

    Science.gov (United States)

    Moreau, Caroline; Trussell, James; Barber, Jennifer

    2013-01-01

    Background We prospectively examined the influence of young women's depression and stress symptoms on their weekly consistency of contraceptive method use. Study Design Women ages 18-20 years (n=689) participating in a longitudinal cohort study completed weekly journals assessing reproductive, relationship and health characteristics. We used data through 12 months of follow-up (n=8,877 journals) to examine relationships between baseline depression (CES-D) and stress (PSS-10) symptoms and consistency of contraceptive method use with sexual activity each week. We analyzed data with random effects multinomial logistic regression. Results Consistent contraceptive use (72% of weeks) was 10-15 percentage points lower among women with moderate/severe baseline depression and stress symptoms than among those without symptoms. Women with moderate/severe depression and stress symptoms had lower odds of contraceptive consistency each week than those without symptoms, respectively (OR 0.53, CI 0.31-0.91 and OR 0.31, CI 0.18-0.52). Stress predicted inconsistent use of oral contraceptives (OR 0.27, CI 0.12-0.58), condoms (OR 0.40, CI 0.23-0.69) and withdrawal (OR 0.12, CI 0.03-0.50). Conclusion Women with depression and stress symptoms appear to be at increased risk for user-related contraceptive failures, especially for the most commonly used methods. Implications Our study has shown that young women with elevated depression and stress symptoms appear to be at risk for inconsistent contraceptive use patterns, especially for the most common methods that require greater user effort and diligence. Based upon these findings, clinicians should consider women's psychological and emotional status when helping patients with contraceptive decision-making and management. User-dependent contraceptive method efficacy is important to address in education and counseling sessions, and women with stress or depression may be ideal candidates for long-acting reversible methods, which offer highly effective options with less user-related burden. Ongoing research will…
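
For readers unfamiliar with the reporting convention, the odds ratios and confidence intervals quoted above come from exponentiating logistic-regression coefficients. A small sketch, with a hypothetical standard error chosen so the interval roughly matches the first reported CI (0.31-0.91):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (log-odds scale) and
    its standard error into an odds ratio with a 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical inputs: beta = log(0.53), se = 0.27 (illustrative only).
beta = math.log(0.53)
se = 0.27
or_, lo, hi = odds_ratio_ci(beta, se)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 0.53 0.31 0.9
```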

  14. The consistency assessment of topological relations in cartographic generalization

    Science.gov (United States)

    Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu

    2006-10-01

    Research on generalization assessment has received less attention than the generalization process itself, yet maintaining topological-relation consistency is essential to generalization quality. This paper proposes a methodology for assessing the quality of a generalized map in terms of topological-relation consistency. Taking roads (including railways) and residential areas as examples, some issues of topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistics on inconsistent topological relations can be obtained by comparing two matrices: one is the matrix of topological relations in the generalized map; the other is the theoretical matrix of topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method with an example showing how to evaluate the local topological relations between simple roads and a residential area.
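
The matrix-comparison step described above can be sketched as follows, with hypothetical relation codes (the paper's actual relation taxonomy and fuzzy weighting are not reproduced):

```python
import numpy as np

def topological_inconsistency(observed, expected):
    """Count object pairs whose topological relation changed after
    generalization, comparing the observed relation matrix against the
    theoretical one (each unordered pair counted once)."""
    observed, expected = np.asarray(observed), np.asarray(expected)
    iu = np.triu_indices_from(observed, k=1)   # upper triangle: each pair once
    diffs = observed[iu] != expected[iu]
    return int(diffs.sum()), float(diffs.mean())

# Hypothetical codes: 0 = disjoint, 1 = touches, 2 = overlaps.
expected = [[0, 1, 0],
            [1, 0, 2],
            [0, 2, 0]]
observed = [[0, 1, 0],
            [1, 0, 0],   # the overlap relation was lost in generalization
            [0, 0, 0]]
count, rate = topological_inconsistency(observed, expected)
print(count, rate)  # 1 inconsistent pair out of 3
```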

  15. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    Science.gov (United States)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, such as are well known to occur in spray flames. In this work, the experimental data are used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  16. Using qualitative methods to inform the trade-off between content validity and consistency in utility assessment: the example of type 2 diabetes and Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Gargon Elizabeth

    2010-02-01

    Abstract Background Key stakeholders regard generic utility instruments as suitable tools to inform health technology assessment decision-making regarding allocation of resources across competing interventions. These instruments require a 'descriptor', a 'valuation' and a 'perspective' of the economic evaluation. There are various approaches that can be taken for each of these, creating a potential lack of consistency between instruments (a basic requirement for comparisons across diseases). The 'reference method' has been proposed as a way to address the limitations of the Quality-Adjusted Life Year (QALY). However, the degree to which generic measures can assess patients' specific experiences with their disease would remain unresolved. This issue has been neglected in discussions of methods development, and its impact on the QALY values obtained and the resulting cost-per-QALY estimates has been underestimated. This study explored the content of utility instruments relevant to type 2 diabetes and Alzheimer's disease (AD) as examples, and the role of qualitative research in informing the trade-off between content coverage and consistency. Method A literature review was performed to identify qualitative and quantitative studies regarding patients' experiences with type 2 diabetes or AD, and associated treatments. Conceptual models for each indication were developed. Generic and disease-specific instruments were mapped to the conceptual models. Results Findings showed that published descriptions of relevant concepts important to patients with type 2 diabetes or AD are available for consideration in deciding on the most comprehensive approach to utility assessment. While the 15-dimensional health-related quality of life measure (15D) seemed the most comprehensive measure for both diseases, the Health Utilities Index 3 (HUI 3) seemed to have the least coverage for type 2 diabetes and the EuroQol-5 Dimensions (EQ-5D) for AD. Furthermore, some of the utility instruments…

  17. A consistent method for finite volume discretization of body forces on collocated grids applied to flow through an actuator disk

    DEFF Research Database (Denmark)

    Troldborg, Niels; Sørensen, Niels N.; Réthoré, Pierre-Elouan

    2015-01-01

    This paper describes a consistent algorithm for eliminating the numerical wiggles appearing when solving the finite volume discretized Navier-Stokes equations with discrete body forces in a collocated grid arrangement. The proposed method is a modification of the Rhie-Chow algorithm where the for...

  18. Dispersion Differences and Consistency of Artificial Periodic Structures.

    Science.gov (United States)

    Cheng, Zhi-Bao; Lin, Wen-Kai; Shi, Zhi-Fei

    2017-10-01

    Dispersion differences and consistency of artificial periodic structures, including phononic crystals, elastic metamaterials, and periodic structures composed of phononic crystals and elastic metamaterials, are investigated in this paper. By developing a K(ω) method, complex dispersion relations and group/phase velocity curves of both the single-mechanism periodic structures and the mixing-mechanism periodic structures are first calculated, from which dispersion differences of artificial periodic structures are discussed. Then, based on a unified formulation, dispersion consistency of artificial periodic structures is investigated. Through a comprehensive comparison study, the correctness of the unified formulation is verified. Mathematical derivations of the unified formulation for different artificial periodic structures are presented. Furthermore, physical meanings of the unified formulation are discussed in the energy-state space.
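
As a minimal, textbook-level companion to the dispersion analysis above (this is not the paper's K(ω) method), the dispersion and group/phase velocities of the simplest periodic structure, a monatomic 1D mass-spring chain, can be evaluated directly; stiffness, mass, and lattice constant below are hypothetical:

```python
import numpy as np

# Textbook dispersion relation of a 1D monatomic mass-spring chain:
#   omega(k) = 2 * sqrt(K/m) * |sin(k * a / 2)|
K, m, a = 10.0, 1.0, 1.0                 # hypothetical stiffness, mass, lattice const
k = np.linspace(1e-6, np.pi / a, 200)    # first Brillouin zone (k > 0)
omega = 2.0 * np.sqrt(K / m) * np.abs(np.sin(k * a / 2.0))

v_phase = omega / k                      # phase velocity
v_group = np.gradient(omega, k)          # group velocity (numerical derivative)

# Long-wavelength limit: both velocities approach the sound speed a*sqrt(K/m),
# and the group velocity vanishes at the zone edge k = pi/a.
print(v_phase[0], v_group[0])
```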

  19. Self-consistent molecular dynamics calculation of diffusion in higher n-alkanes.

    Science.gov (United States)

    Kondratyuk, Nikolay D; Norman, Genri E; Stegailov, Vladimir V

    2016-11-28

    Diffusion is one of the key subjects of molecular modeling and simulation studies. However, there is an unresolved lack of consistency between the Einstein-Smoluchowski (E-S) and Green-Kubo (G-K) methods for diffusion coefficient calculations in systems of complex molecules. In this paper, we analyze this problem for the case of liquid n-triacontane. Non-conventional long-time tails of the velocity autocorrelation function (VACF) are found for this system, and the temperature dependence of the VACF tail decay exponent is determined. Proper inclusion of the long-time tail contributions to the diffusion coefficient calculation results in consistency between the G-K and E-S methods. Having considered the major factors influencing the precision of the diffusion rate calculations in comparison with experimental data (system size effects and force field parameters), we point to hydrogen nuclear quantum effects as, presumably, the last obstacle to a fully consistent n-alkane description.
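
The E-S versus G-K consistency discussed above can be illustrated on the simplest possible system, a discrete 1D random walk, where both routes recover the same diffusion coefficient. This is a toy sketch, not an MD calculation; for delta-correlated steps the discrete G-K sum reduces to its lag-zero term:

```python
import numpy as np

# Einstein-Smoluchowski:  D = MSD(t) / (2 t)   (1D)
# Green-Kubo (discrete):  D = dt * (VACF[0]/2 + sum over positive lags)
rng = np.random.default_rng(42)
dt = 1.0
steps = rng.choice([-1.0, 1.0], size=(4000, 1000))   # walkers x time steps
x = np.cumsum(steps, axis=1)                          # trajectories

t = 1000 * dt
D_es = np.mean(x[:, -1] ** 2) / (2.0 * t)             # E-S estimate

v = steps / dt                                        # step velocities
vacf0 = np.mean(v[:, 0] * v[:, 0])                    # lag-0 VACF; other lags vanish
D_gk = 0.5 * vacf0 * dt                               # G-K estimate
print(D_es, D_gk)   # both ~0.5
```

For molecules with slowly decaying VACF tails, truncating the positive-lag sum is exactly what breaks the E-S/G-K agreement the abstract describes.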

  20. Consistent Estimation of Pricing Kernels from Noisy Price Data

    OpenAIRE

    Vladislav Kargin

    2003-01-01

    If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.
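
The constrained least-squares step mentioned above can be sketched with SciPy's `nnls` on a toy state-price system; the payoff matrix and kernel values below are hypothetical, not the paper's data:

```python
import numpy as np
from scipy.optimize import nnls

# Recover a non-negative pricing kernel m from noisy prices p ~ A @ m,
# where A holds (hypothetical) state-contingent payoffs of traded assets.
rng = np.random.default_rng(1)
A = rng.uniform(0.0, 2.0, size=(40, 5))         # 40 assets x 5 states
m_true = np.array([0.9, 0.7, 0.5, 0.3, 0.1])    # true kernel, non-negative
p = A @ m_true + rng.normal(0.0, 0.01, size=40) # noisy observed prices

m_hat, residual = nnls(A, p)                    # least squares s.t. m_hat >= 0
print(np.round(m_hat, 2))
```

The non-negativity constraint is what makes the inverse problem well-posed here; unconstrained least squares can return kernels with negative state prices when the data are noisy.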

  1. Self-consistent descriptions of vector mesons in hot matter reexamined

    International Nuclear Information System (INIS)

    Riek, Felix; Knoll, Joern

    2010-01-01

    Technical concepts are presented that improve the self-consistent treatment of vector mesons in a hot and dense medium. First applications concern an interacting gas of pions and ρ mesons. As an extension of earlier studies, we thereby include random-phase-approximation-type vertex corrections and further use dispersion relations to calculate the real part of the vector-meson self-energy. An improved projection method preserves the four transversality of the vector-meson polarization tensor throughout the self-consistent calculations, thereby keeping the scheme void of kinematical singularities.

  2. Existence and uniqueness of consistent conjectural variation equilibrium in electricity markets

    International Nuclear Information System (INIS)

    Liu, Youfei; Cai, Bin; Ni, Y.X.; Wu, Felix F.

    2007-01-01

    The game-theory based methods are widely applied to analyze the market equilibrium and to study the strategic behavior in the oligopolistic electricity markets. Recently, the conjecture variation approach, one of well-studied methods in game theory, is reported to model the strategic behavior in deregulated electricity markets. Unfortunately, the conjecture variation models have been criticized for the drawback of logical inconsistence and possibility of abundant equilibria. Aiming for this, this paper investigates the existence and uniqueness of consistent conjectural variation equilibrium in electricity markets. With several good characteristics of the electricity market and with an infinite horizon optimization model, it is shown that the consistent conjecture variation will satisfy a set of coupled nonlinear equations and there will be only one equilibrium. This result can provide the fundamentals for further applications of the conjecture variation approach. (author)

  3. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  4. Self-consistent study of local and nonlocal magnetoresistance in a YIG/Pt bilayer

    Science.gov (United States)

    Wang, Xi-guang; Zhou, Zhen-wei; Nie, Yao-zhuang; Xia, Qing-lin; Guo, Guang-hua

    2018-03-01

    We present a self-consistent study of the local spin Hall magnetoresistance (SMR) and nonlocal magnon-mediated magnetoresistance (MMR) in a heavy-metal/magnetic-insulator heterostructure at finite temperature. We find that the thermal fluctuation of magnetization significantly affects the SMR. It appears unidirectional with respect to the direction of electrical current (or magnetization). The unidirectionality of SMR originates from the asymmetry of creation or annihilation of thermal magnons induced by the spin Hall torque. Also, a self-consistent model can well describe the features of MMR.

  5. Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation

    Science.gov (United States)

    Lindell, Annukka K.

    2017-01-01

    Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modeling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora. PMID:28270790
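
The consistency coefficient reported above (α = 0.72) is Cronbach's alpha. A minimal sketch of its computation, on hypothetical binary pose codes rather than the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (observations x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical data: 6 selfie-takers x 4 coded selfies
# (1 = left-cheek pose, 0 = other), fairly consistent within person.
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])
print(round(cronbach_alpha(scores), 2))
```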

  6. Decentralized method for load sharing and power management in a hybrid single/three-phase islanded microgrid consisting of hybrid source PV/battery units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Guerrero, Josep M.; Oraee, Hashem

    2016-01-01

    This paper proposes a new decentralized power management and load sharing method for a photovoltaic-based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method takes into account the available PV power and battery conditions of the units to share the load among them, and power flow among different phases is performed automatically through the three-phase units. Modified active power-frequency droop functions are used according to the operating state of each unit, and the frequency level is used as a trigger for switching between the states. The efficacy of the proposed method under different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units.
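
The droop idea underlying the method can be sketched in its simplest form; the gain, nominal frequency, and loadings below are hypothetical placeholders, not the paper's modified droop functions:

```python
# Classic P-f droop: each unit lowers its output frequency as its
# per-unit active-power loading rises, so units sharing a bus converge
# to one common frequency and share load in proportion to capacity.
def droop_frequency(p_out, p_max, f0=50.0, m=0.5):
    """Frequency setpoint for one unit: f = f0 - m * (P / Pmax)."""
    return f0 - m * (p_out / p_max)

# Two units with different available PV power settle at the same
# frequency when their per-unit loadings are equal.
f_a = droop_frequency(p_out=2.0, p_max=4.0)   # 50% loaded
f_b = droop_frequency(p_out=1.0, p_max=2.0)   # 50% loaded
print(f_a, f_b)  # both 49.75
```

The paper's contribution layers state-dependent modifications on this basic curve (e.g. accounting for battery state of charge) and uses the shared frequency itself as the communication-free trigger for state switching.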

  7. Self-Consistent Study of Conjugated Aromatic Molecular Transistors

    International Nuclear Information System (INIS)

    Jing, Wang; Yun-Ye, Liang; Hao, Chen; Peng, Wang; Note, R.; Mizuseki, H.; Kawazoe, Y.

    2010-01-01

    We study the current through conjugated aromatic molecular transistors modulated by a transverse field. The self-consistent calculation is realized with density functional theory through the standard quantum chemistry software Gaussian03 and the non-equilibrium Green's function formalism. The calculated I-V curves controlled by the transverse field present the characteristics of different organic molecular transistors, whose transverse-field response is improved by substitutions of nitrogen or fluorine atoms. On the other hand, asymmetry of the molecular configuration with respect to the axis connecting the two sulfur atoms favors transverse-field modulation. Suitably designed conjugated aromatic molecular transistors possess different I-V characteristics, some of which are similar to those of metal-oxide-semiconductor field-effect transistors (MOSFET). Some of the calculated molecular devices may work as elements in graphene electronics. Our results present the richness and flexibility of molecular transistors, illustrating the promise of next-generation devices. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  8. A self-consistent theory of the magnetic polaron

    International Nuclear Information System (INIS)

    Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.

    1984-10-01

    A finite-temperature self-consistent theory of the magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on a novel approach to the thermodynamic two-time Green function method. This approach consists of introducing ''irreducible'' Green functions (IGF) and deriving the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for calculating the magnetic polaron states by taking explicitly into account damping effects and finite lifetimes. (author)

  9. Self-consistent approximations beyond the CPA: Part II

    International Nuclear Information System (INIS)

    Kaplan, T.; Gray, L.J.

    1982-01-01

    This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented-space formalism for a binary alloy is sketched, and the notation to be used is derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Extension to short-range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described

  10. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  11. METHODS FOR SPEECH CONTACT ANALYSIS: THE CASE OF THE ‘UBIQUITY OF RHETORIC’. PROOFING THE CONCEPTUAL CONSISTENCY OF SPEECH AS LINGUISTIC MACRO-SETTING

    Directory of Open Access Journals (Sweden)

    Dr. Fee-Alexandra HAASE

    2012-11-01

    In this article we will apply a method of proof for conceptual consistency in a long historical range taking the example of rhetoric and persuasion. We will analyze the evidentially present linguistic features of this concept within three linguistic areas: The Indo-European languages, the Semitic languages, and the Afro-Asiatic languages. We have chosen the case of the concept ‘rhetoric’ / ’persuasion’ as a paradigm for this study. With the phenomenon of ‘linguistic dispersion’ we can explain the development of language as undirected, but with linguistic consistency across the borders of language families. We will prove that the Semitic and Indo-European languages are related. As a consequence, the strict differentiation between the Semitic and the Indo-European language families is outdated following the research positions of Starostin. In contrast to this, we will propose a theory of cultural exchange between the two language families.

  12. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    Inconsistency among firewall/VPN (Virtual Private Network) rules incurs a large maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether on stand-alone hosts or across a network, will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for intelligent management of rule tables. In this paper, a formalization of host and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the method and demonstrate its potential for intelligent management based on rule tables.
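
A minimal set-based rule check in the spirit of the abstract (the rule fields and the shadowing criterion here are illustrative assumptions, not the paper's formalization):

```python
# Model each filtering rule as (src set, dst set, action) and flag a
# conflict when a later rule's traffic is fully covered by an earlier
# rule with the opposite action -- the later rule can never fire.
def shadowed(rules):
    """Return (i, j) pairs where rule j is shadowed by earlier rule i."""
    conflicts = []
    for j, (src_j, dst_j, act_j) in enumerate(rules):
        for i in range(j):
            src_i, dst_i, act_i = rules[i]
            # subset tests (<=) implement the set-theoretic coverage check
            if src_j <= src_i and dst_j <= dst_i and act_i != act_j:
                conflicts.append((i, j))
    return conflicts

rules = [
    ({"10.0.0.1", "10.0.0.2"}, {"any"}, "deny"),
    ({"10.0.0.1"},             {"any"}, "allow"),  # never matches: shadowed
]
print(shadowed(rules))  # [(0, 1)]
```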

  13. A consistent, differential versus integral, method for measuring the delayed neutron yield in fissions

    International Nuclear Information System (INIS)

    Flip, A.; Pang, H.F.; D'Angelo, A.

    1995-01-01

    Due to persistent uncertainties of ∼5% (the uncertainty, here and thereafter, is at 1σ) in the prediction of the 'reactivity scale' (β_eff) for a fast power reactor, an international project was recently initiated in the framework of the OECD/NEA activities for re-evaluation, new measurements and integral benchmarking of delayed neutron (DN) data and related kinetic parameters (principally β_eff). Considering that the major part of this uncertainty is due to uncertainties in the DN yields (ν_d) and the difficulty of further improving the precision of differential (e.g. Keepin's method) measurements, an international cooperative strategy was adopted, aiming at extracting and consistently interpreting information from both differential (nuclear) and integral (in-reactor) measurements. The main problem arises from the integral side; thus the idea was to perform β_eff-like measurements (both deterministic and noise) in 'clean' assemblies. The 'clean' calculational context permitted the authors to develop a theory that links this integral experimental level explicitly with the differential one, via a unified 'Master Model' which relates ν_d and measurable quantities (on both levels) linearly. The combined error analysis is consequently largely simplified and the final uncertainty drastically reduced (theoretically, by a factor of √3). On the other hand, the same theoretical development leading to the 'Master Model' also resulted in a structured scheme of approximations of the general (stochastic) Boltzmann equation, allowing a consistent analysis of the large range of measurements concerned (stochastic, dynamic, static ...). This paper is focused on the main results of this theoretical development and its application to the analysis of the preliminary results of the BERENICE program (β_eff measurements in MASURCA, the first assembly, in Cadarache, France)

  14. Modeling of the 3RS tau protein with self-consistent field method and Monte Carlo simulation

    NARCIS (Netherlands)

    Leermakers, F.A.M.; Jho, Y.S.; Zhulina, E.B.

    2010-01-01

    Using a model with amino acid resolution of the 196 aa N-terminus of the 3RS tau protein, we performed both a Monte Carlo study and a complementary self-consistent field (SCF) analysis to obtain detailed information on conformational properties of these moieties near a charged plane (mimicking the

  15. Multiscale methods framework: self-consistent coupling of molecular theory of solvation with quantum chemistry, molecular simulations, and dissipative particle dynamics.

    Science.gov (United States)

    Kovalenko, Andriy; Gusarov, Sergey

    2018-01-31

    In this work, we address different aspects of self-consistent-field coupling of computational chemistry methods at different time and length scales in modern materials and biomolecular science. A multiscale methods framework yields dramatically improved accuracy, efficiency, and applicability by coupling models and methods on different scales. This field benefits many areas of research and applications by providing fundamental understanding and predictions. It could also play a particular role in commercialization by guiding new developments and by allowing quick evaluation of prospective research projects. We employ the molecular theory of solvation, which allows us to accurately introduce the effect of the environment on complex nano-, macro-, and biomolecular systems. The uniqueness of this method is that it can be naturally coupled with the whole range of computational chemistry approaches, including QM, MM, and coarse graining.

  16. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  17. Consistency and Inconsistency in PhD Thesis Examination

    Science.gov (United States)

    Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy

    2008-01-01

    This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…

  18. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
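
The OLS regressions on which the diagnostics above are built can be sketched on simulated data (the proxy returns, beta, and noise levels below are hypothetical, not S&P500 data):

```python
import numpy as np

# Toy OLS estimation of alpha and beta of one asset against a proxy
# portfolio -- the regression whose self-consistency the paper analyzes.
rng = np.random.default_rng(7)
r_proxy = rng.normal(0.0, 0.02, size=500)             # proxy portfolio returns
beta_true, alpha_true = 1.3, 0.0                       # self-consistency: zero alpha
r_asset = alpha_true + beta_true * r_proxy + rng.normal(0.0, 0.005, size=500)

X = np.c_[np.ones_like(r_proxy), r_proxy]              # [1, proxy] design matrix
alpha_hat, beta_hat = np.linalg.lstsq(X, r_asset, rcond=None)[0]
print(round(alpha_hat, 4), round(beta_hat, 2))
```

When the proxy differs from the true market portfolio, the paper's point is precisely that this regression yields a systematically non-zero intercept, which the orthogonality/normality diagnostics are designed to detect.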

  19. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    Science.gov (United States)

    Banker, J.G.; Anderson, R.C.

    1975-10-21

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure.

  20. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    International Nuclear Information System (INIS)

    Banker, J.G.; Anderson, R.C.

    1975-01-01

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure

  1. Consistency of Higher Education Institutions’ Strategies: A Study Based on the Stakeholders’ Perception using the Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Alexsandra Barcelos Dias

    2016-10-01

    Full Text Available The strategic orientation of a company can be managed through a tool known as the Balanced Scorecard (BSC), which aims to measure and monitor the strategy in action. The objective of this study was to verify strategic consistency as perceived by stakeholders at private Higher Education Institutions (HEIs), through the perspectives of the Balanced Scorecard. The method used was descriptive research with a quantitative approach. Data were collected through a questionnaire applied at four HEIs in the State of Minas Gerais to directors/coordinators, teachers, and students (the stakeholders), in order to identify, based on a Balanced Scorecard model with four indicators in each perspective (financial, customers, learning and growth, and internal processes), the consistency of the strategies as perceived by these groups. The main results pointed to differences in the managers' perception across perspectives, with a greater degree of importance given to the "Learning and Growth" and "Internal Processes" perspectives. The group of teachers attributed less importance to the "Customers" perspective. The main inconsistencies were found in the "Internal Processes" perspective. The "Financial" perspective presented fewer gaps when compared across groups, which reveals strategic inconsistency at the HEIs as perceived by the stakeholders. It is concluded that strategic consistency can contribute to organizational competitiveness by identifying alignment in the actions developed, resulting in greater efficiency in a competitive scenario according to its stakeholders.

  2. Neuroimaging studies of word and pseudoword reading: consistencies, inconsistencies, and limitations.

    Science.gov (United States)

    Mechelli, Andrea; Gorno-Tempini, Maria Luisa; Price, Cathy J

    2003-02-15

    Several functional neuroimaging studies have compared words and pseudowords to test different cognitive models of reading. There are difficulties with this approach, however, because cognitive models do not make clear-cut predictions at the neural level. Therefore, results can only be interpreted on the basis of prior knowledge of cognitive anatomy. Furthermore, studies comparing words and pseudowords have produced inconsistent results. The inconsistencies could reflect false-positive results due to the low statistical thresholds applied or confounds from nonlexical aspects of the stimuli. Alternatively, they may reflect true effects that are inconsistent across subjects; dependent on experimental parameters such as stimulus rate or duration; or not replicated across studies because of insufficient statistical power. In this fMRI study, we investigate consistent and inconsistent differences between word and pseudoword reading in 20 subjects, and distinguish between effects associated with increases and decreases in activity relative to fixation. In addition, the interaction of word type with stimulus duration is explored. We find that words and pseudowords activate the same set of regions relative to fixation, and within this system, there is greater activation for pseudowords than words in the left frontal operculum, left posterior inferior temporal gyrus, and the right cerebellum. The only effects of words relative to pseudowords consistent over subjects are due to decreases in activity for pseudowords relative to fixation; and there are no significant interactions between word type and stimulus duration. Finally, we observe inconsistent but highly significant effects of word type at the individual subject level. These results (i) illustrate that pseudowords place increased demands on areas that have previously been linked to lexical retrieval, and (ii) highlight the importance of including one or more baselines to qualify word type effects. Furthermore, (iii

  3. A semi-nonparametric mixture model for selecting functionally consistent proteins.

    Science.gov (United States)

    Yu, Lianbo; Doerge, RW

    2010-09-28

    High-throughput technologies have led to a new era of proteomics. Although protein microarray experiments are becoming more commonplace, there are a variety of experimental and statistical issues that have yet to be addressed, and that will carry over to new high-throughput technologies unless they are investigated. One of the largest of these challenges is the selection of functionally consistent proteins. We present a novel semi-nonparametric mixture model for classifying proteins as consistent or inconsistent while controlling the false discovery rate and the false non-discovery rate. The performance of the proposed approach is compared to current methods via simulation under a variety of experimental conditions. We provide a statistical method for selecting functionally consistent proteins in the context of protein microarray experiments, but the proposed semi-nonparametric mixture model method can certainly be generalized to solve other mixture data problems. The main advantage of this approach is that it provides the posterior probability of consistency for each protein.
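
    As a rough illustration of the classification idea, the sketch below fits a plain two-component Gaussian mixture by EM (a stand-in for the paper's semi-nonparametric fit, on simulated scores) and selects "consistent" items while controlling an estimated false discovery rate:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical "consistency scores": a null (inconsistent) and a signal (consistent) component
scores = np.concatenate([rng.normal(0, 1, 700), rng.normal(3, 1, 300)])

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

# EM for a two-component Gaussian mixture
pi, mu0, mu1, sd0, sd1 = 0.5, -1.0, 1.0, 1.0, 1.0
for _ in range(200):
    p1 = pi * norm_pdf(scores, mu1, sd1)
    p0 = (1 - pi) * norm_pdf(scores, mu0, sd0)
    resp = p1 / (p0 + p1)                     # posterior probability of "consistent"
    pi = resp.mean()
    mu1 = np.sum(resp * scores) / resp.sum()
    mu0 = np.sum((1 - resp) * scores) / (1 - resp).sum()
    sd1 = np.sqrt(np.sum(resp * (scores - mu1) ** 2) / resp.sum())
    sd0 = np.sqrt(np.sum((1 - resp) * (scores - mu0) ** 2) / (1 - resp).sum())

# Call items "consistent" while the running mean of the local FDR (1 - resp) stays <= 5%
order = np.argsort(-resp)
lfdr = 1 - resp[order]
k = int(np.max(np.where(np.cumsum(lfdr) / np.arange(1, len(lfdr) + 1) <= 0.05)[0], initial=-1)) + 1
print(k, "items called consistent")
```

    The posterior probability `resp` plays the role the abstract describes: a per-protein probability of consistency from which FDR-controlled selections follow directly.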

  4. Designing A Mixed Methods Study In Primary Care

    Science.gov (United States)

    Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.

    2004-01-01

    BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277

  5. A particle method with adjustable transport properties - the generalized consistent Boltzmann algorithm

    International Nuclear Information System (INIS)

    Garcia, A.L.; Alexander, F.J.; Alder, B.J.

    1997-01-01

    The consistent Boltzmann algorithm (CBA) for dense, hard-sphere gases is generalized to obtain the van der Waals equation of state and the corresponding exact viscosity at all densities except at the highest temperatures. A general scheme for adjusting any transport coefficients to higher values is presented
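
    For reference, the target equation of state can be written down directly. The sketch below evaluates the van der Waals pressure with illustrative constants in reduced units (the actual CBA reproduces this equation of state by modifying post-collision particle displacements, which is beyond this sketch):

```python
k_B = 1.0  # reduced units

def vdw_pressure(n, T, a=1.0, b=0.5):
    """van der Waals equation of state: P = n*k_B*T / (1 - n*b) - a*n**2."""
    return n * k_B * T / (1.0 - n * b) - a * n * n

def ideal_pressure(n, T):
    return n * k_B * T

# At low density the two agree; at high density excluded volume and attraction matter
for n in (0.01, 0.5):
    print(n, vdw_pressure(n, T=2.0), ideal_pressure(n, T=2.0))
```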

  6. A consistent and efficient graphical analysis method to improve the quantification of reversible tracer binding in radioligand receptor dynamic PET studies

    OpenAIRE

    Zhou, Yun; Ye, Weiguo; Brašić, James R.; Crabb, Andrew H.; Hilton, John; Wong, Dean F.

    2008-01-01

    The widely used Logan plot in radioligand receptor dynamic PET studies produces marked noise-induced negative biases in the estimates of total distribution volume (DVT) and binding potential (BP). To avoid the inconsistencies in the estimates from the Logan plot, a new graphical analysis method was proposed and characterized in this study. The new plot with plasma input and with reference tissue input was first derived to estimate DVT and BP. A condition was provided to ensure that the estima...
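
    The Logan plot itself is easy to reproduce. The sketch below simulates a one-tissue-compartment tracer with hypothetical rate constants and recovers DVT as the slope of the Logan transformation (noise-free, so the negative bias discussed above does not appear):

```python
import numpy as np

# Synthetic one-tissue-compartment data (hypothetical K1, k2); true DVT = K1/k2 = 2
K1, k2 = 0.1, 0.05
t = np.linspace(0, 90, 901)            # minutes
Cp = np.exp(-0.05 * t) * t             # hypothetical plasma input function
CT = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, len(t)):             # Euler integration of dCT/dt = K1*Cp - k2*CT
    CT[i] = CT[i-1] + dt * (K1 * Cp[i-1] - k2 * CT[i-1])

intCp = np.cumsum(Cp) * dt
intCT = np.cumsum(CT) * dt
# Logan transformation: y = intCT/CT vs x = intCp/CT; slope -> DVT after time t*
mask = t > 30
x, y = intCp[mask] / CT[mask], intCT[mask] / CT[mask]
slope, intercept = np.polyfit(x, y, 1)
print(slope)                           # ≈ K1/k2
```

    With noisy CT the same fit acquires the downward bias the abstract describes, which is what motivates the alternative graphical method.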

  7. A self-consistent MoD-QM/MM structural refinement method: characterization of hydrogen bonding in the Oxytricha nova G-quadruplex

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Enrique R [Los Alamos National Laboratory]; Newcomer, Michael B [YALE UNIV]; Raggin, Christina M [YALE UNIV]; Gascon, Jose A [YALE UNIV]; Loria, J Patrick [YALE UNIV]; Batista, Victor S [YALE UNIV]

    2008-01-01

    This paper generalizes the MoD-QM/MM hybrid method, developed for ab initio computations of protein electrostatic potentials [Gascón, J.A.; Leung, S.S.F.; Batista, E.R.; Batista, V.S. J. Chem. Theory Comput. 2006, 2, 175-186], as a practical algorithm for structural refinement of extended systems. The computational protocol involves a space-domain decomposition scheme for the formal fragmentation of extended systems into smaller, partially overlapping, molecular domains and the iterative self-consistent energy minimization of the constituent domains by relaxation of their geometry and electronic structure. The method accounts for mutual polarization of the molecular domains, modeled as Quantum-Mechanical (QM) layers embedded in the otherwise classical Molecular-Mechanics (MM) environment according to QM/MM hybrid methods. The method is applied to the description of benchmark model systems that allow for direct comparisons with full QM calculations, and subsequently applied to the structural characterization of the DNA Oxytricha nova Guanine quadruplex (G4). The resulting MoD-QM/MM structural model of the DNA G4 is compared to recently reported high-resolution X-ray diffraction and NMR models, and partially validated by direct comparisons between {sup 1}H NMR chemical shifts, which are highly sensitive to hydrogen-bonding and stacking interactions, and the corresponding theoretical values obtained at the density functional theory (DFT) QM/MM (BH&H/6-31G*:Amber) level in conjunction with the gauge-independent atomic orbital (GIAO) method for the ab initio self-consistent field (SCF) calculation of NMR chemical shifts.
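
    The iterative self-consistency cycle can be caricatured with a linear-response toy model. In the sketch below, each "domain" carries a scalar moment that is re-polarized in the field of the others until convergence; the polarizabilities and couplings are illustrative numbers, not QM/MM quantities:

```python
import numpy as np

alpha = np.array([0.3, 0.2, 0.25])          # illustrative domain polarizabilities
q0 = np.array([1.0, -0.5, 0.2])             # unpolarized moments
C = np.array([[0.0, 0.1, 0.05],             # illustrative inter-domain coupling matrix
              [0.1, 0.0, 0.08],
              [0.05, 0.08, 0.0]])

q = q0.copy()
for it in range(100):
    q_new = q0 + alpha * (C @ q)            # re-polarize each domain in the others' field
    if np.max(np.abs(q_new - q)) < 1e-10:   # self-consistency reached
        break
    q = q_new
print(it, q)
```

    The real protocol replaces the linear update with a QM relaxation of each domain's geometry and electronic structure inside the MM environment, but the fixed-point structure of the loop is the same.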

  8. Object-Oriented Hierarchy Radiation Consistency for Different Temporal and Different Sensor Images

    Directory of Open Access Journals (Sweden)

    Nan Su

    2018-02-01

    Full Text Available In this paper, we propose a novel object-oriented hierarchical radiation consistency method for dense matching of different temporal and different sensor data in 3D reconstruction. For different temporal images, our illumination consistency method solves both illumination uniformity for a single image and relative illumination normalization for image pairs. In the relative illumination normalization step in particular, singular value equalization and the linear relationship of the invariant pixels are used in combination, for the initial global illumination normalization and for the detailed object-oriented refined illumination normalization, respectively. For different sensor images, we propose a union group sparse method based on an improvement of the original group sparse model. The different sensor images are set to a similar smoothness level by applying the same singular value threshold to the union group matrix. Our method comprehensively considers the factors influencing dense matching of different temporal and different sensor stereoscopic image pairs, improving illumination consistency and smoothness consistency simultaneously. The radiation consistency experiments verify the effectiveness and superiority of the proposed method by comparison with two other methods. Moreover, in the dense matching experiment on mixed stereoscopic image pairs, our method shows clear advantages for objects in urban areas.
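
    Singular value equalization, used above for the initial global illumination normalization, can be sketched in a few lines. The example below rescales the singular values of a darker target patch so that their sum matches the reference (synthetic data; the object-oriented refinement step is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)
# Two hypothetical co-registered grayscale patches with different global illumination
ref = rng.uniform(50, 200, size=(32, 32))
tgt = 0.6 * ref + 10 + rng.normal(0, 2, size=(32, 32))   # darker, slightly noisy view

# Singular value equalization: rescale the target's singular values so their
# overall magnitude matches the reference (a rough global illumination normalization)
Ur, sr, Vr = np.linalg.svd(ref, full_matrices=False)
Ut, st, Vt = np.linalg.svd(tgt, full_matrices=False)
st_eq = st * (sr.sum() / st.sum())
tgt_eq = Ut @ np.diag(st_eq) @ Vt

print(ref.mean(), tgt.mean(), tgt_eq.mean())   # inspect global brightness before/after
```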

  9. College Students' Justification for Digital Piracy: A Mixed Methods Study

    Science.gov (United States)

    Yu, Szde

    2012-01-01

    A mixed methods project was devoted to understanding college students' justification for digital piracy. The project consisted of two studies, a qualitative one and a quantitative one. Qualitative interviews were conducted to identify main themes in students' justification for digital piracy, and then the findings were tested in a quantitative…

  10. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming

    2013-07-26

    Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  11. Example-Based Image Colorization Using Locality Consistent Sparse Representation.

    Science.gov (United States)

    Bo Li; Fuchen Zhao; Zhuo Su; Xiangguo Liang; Yu-Kun Lai; Rosin, Paul L

    2017-11-01

    Image colorization aims to produce a natural looking color image from a given gray-scale image, which remains a challenging problem. In this paper, we propose a novel example-based image colorization method exploiting a new locality consistent sparse representation. Given a single reference color image, our method automatically colorizes the target gray-scale image by sparse pursuit. For efficiency and robustness, our method operates at the superpixel level. We extract low-level intensity features, mid-level texture features, and high-level semantic features for each superpixel, which are then concatenated to form its descriptor. The collection of feature vectors for all the superpixels from the reference image composes the dictionary. We formulate colorization of target superpixels as a dictionary-based sparse reconstruction problem. Inspired by the observation that superpixels with similar spatial location and/or feature representation are likely to match spatially close regions from the reference image, we further introduce a locality promoting regularization term into the energy formulation, which substantially improves the matching consistency and subsequent colorization results. Target superpixels are colorized based on the chrominance information from the dominant reference superpixels. Finally, to further improve coherence while preserving sharpness, we develop a new edge-preserving filter for chrominance channels with the guidance from the target gray-scale image. To the best of our knowledge, this is the first work on sparse pursuit image colorization from single reference images. Experimental results demonstrate that our colorization method outperforms the state-of-the-art methods, both visually and quantitatively using a user study.
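
    To convey the matching idea only, the sketch below replaces the sparse pursuit with a locality-weighted nearest-match rule over toy superpixel descriptors, then transfers chrominance from the chosen reference superpixel (all data and the locality weight are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy superpixels: feature rows are (x, y, intensity); the reference also carries chrominance
ref_feat = rng.random((50, 3))
ref_chroma = rng.random((50, 2))                       # (Cb, Cr) per reference superpixel
tgt_feat = ref_feat + rng.normal(0, 0.01, ref_feat.shape)   # similar scene, gray-scale only

lam = 0.5   # weight of the locality (spatial) term; an illustrative choice
tgt_chroma = np.empty((len(tgt_feat), 2))
for i, f in enumerate(tgt_feat):
    feat_d = np.abs(ref_feat[:, 2] - f[2])             # intensity/feature distance
    loc_d = np.linalg.norm(ref_feat[:, :2] - f[:2], axis=1)  # spatial distance
    j = int(np.argmin(feat_d + lam * loc_d))           # locality-consistent best match
    tgt_chroma[i] = ref_chroma[j]                      # transfer chrominance from the match
```

    The paper's sparse formulation generalizes this single-best-match rule to a weighted combination of dominant reference superpixels, with the locality term entering as a regularizer in the sparse coding energy.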

  12. Bayesian detection of causal rare variants under posterior consistency.

    Directory of Open Access Journals (Sweden)

    Faming Liang

    Full Text Available Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  13. Bayesian detection of causal rare variants under posterior consistency.

    KAUST Repository

    Liang, Faming; Xiong, Momiao

    2013-01-01

    Identification of causal rare variants that are associated with complex traits poses a central challenge in genome-wide association studies. However, most current research focuses only on testing the global association, i.e., whether the rare variants in a given genomic region are collectively associated with the trait. Although some recent work, e.g., the Bayesian risk index method, has tried to address this problem, it is unclear whether the causal rare variants can be consistently identified in the small-n-large-P situation. We develop a new Bayesian method, the so-called Bayesian Rare Variant Detector (BRVD), to tackle this problem. The new method simultaneously addresses two issues: (i) (Global association test) Are any of the variants associated with the disease, and (ii) (Causal variant detection) Which variants, if any, are driving the association. The BRVD ensures that the causal rare variants are consistently identified in the small-n-large-P situation by imposing appropriate prior distributions on the model and model-specific parameters. The numerical results indicate that the BRVD is more powerful for testing the global association than existing methods, such as the combined multivariate and collapsing test, weighted sum statistic test, RARECOVER, sequence kernel association test, and Bayesian risk index, and also more powerful for identification of causal rare variants than the Bayesian risk index method. The BRVD has also been successfully applied to the Early-Onset Myocardial Infarction (EOMI) Exome Sequence Data. It identified a few causal rare variants that have been verified in the literature.

  14. Personality consistency in dogs: a meta-analysis.

    Science.gov (United States)

    Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
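
    The aggregation step of such a meta-analysis can be sketched with the standard Fisher z approach (illustrative per-study correlations and sample sizes, not the values from the 31 studies):

```python
import numpy as np

# Hypothetical per-study test-retest correlations and sample sizes
r = np.array([0.55, 0.38, 0.47, 0.30, 0.52])
n = np.array([40, 120, 65, 200, 80])

# Fisher z transform; the sampling variance of z is approximately 1/(n - 3),
# so inverse-variance weights are simply n - 3
z = np.arctanh(r)
w = n - 3
z_bar = np.sum(w * z) / np.sum(w)
r_bar = np.tanh(z_bar)          # pooled consistency estimate, back-transformed
print(round(r_bar, 3))
```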

  15. Personality consistency in dogs: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Jamie L Fratkin

    Full Text Available Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.

  16. Personality Consistency in Dogs: A Meta-Analysis

    Science.gov (United States)

    Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.

    2013-01-01

    Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787

  17. Diagnosing a Strong-Fault Model by Conflict and Consistency

    Directory of Open Access Journals (Sweden)

    Wenfeng Zhang

    2018-03-01

    Full Text Available The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Current diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. First, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). The proposed LTMS is then employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness, and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods perform significantly better than best-first and conflict-directed A* search methods.

  18. Diagnosing a Strong-Fault Model by Conflict and Consistency.

    Science.gov (United States)

    Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan

    2018-03-29

    The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Current diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. First, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). The proposed LTMS is then employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidate efficiency based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness, and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods perform significantly better than best-first and conflict-directed A* search methods.
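
    The conflict/consistency idea can be demonstrated on a minimal weak-fault example. The sketch below brute-forces minimal diagnoses for a three-gate toy circuit by checking which sets of "abnormal" components restore consistency with the observation (the paper's LTMS and strong-fault modes are far richer than this):

```python
from itertools import combinations

# Toy system: three gates, inputs a=1, b=1; observed output z=0.
# Normal behavior: x = a and b ; y = not x ; z = y or a  -> predicts z = 1, a conflict.
def consistent(abnormal, obs_z=0, a=1, b=1):
    """Is some internal behavior consistent when gates in `abnormal` are unconstrained?"""
    for x in (0, 1):
        for y in (0, 1):
            for z in (0, 1):
                ok = True
                if 'G1' not in abnormal: ok &= (x == (a and b))
                if 'G2' not in abnormal: ok &= (y == (1 - x))
                if 'G3' not in abnormal: ok &= (z == (y or a))
                if ok and z == obs_z:
                    return True
    return False

gates = ['G1', 'G2', 'G3']
diagnoses = []
for k in range(len(gates) + 1):                 # smallest candidate sets first
    for d in combinations(gates, k):
        # keep d only if it restores consistency and no subset already does
        if consistent(set(d)) and not any(set(m) <= set(d) for m in diagnoses):
            diagnoses.append(d)
print(diagnoses)
```

    Because z = y or a and a = 1, only an abnormal G3 can explain z = 0, so the single minimal diagnosis is {G3}; the LTMS machinery in the paper reaches such conclusions by propositional reasoning rather than enumeration.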

  19. An efficient method to transcription factor binding sites imputation via simultaneous completion of multiple matrices with positional consistency.

    Science.gov (United States)

    Guo, Wei-Li; Huang, De-Shuang

    2017-08-22

    Transcription factors (TFs) are DNA-binding proteins that have a central role in regulating gene expression. Identification of DNA-binding sites of TFs is a key task in understanding transcriptional regulation, cellular processes and disease. Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) enables genome-wide identification of in vivo TF binding sites. However, it is still difficult to map every TF in every cell line owing to cost and biological material availability, which poses an enormous obstacle for integrated analysis of gene regulation. To address this problem, we propose a novel computational approach, TFBSImpute, for predicting additional TF binding profiles by leveraging information from available ChIP-seq TF binding data. TFBSImpute fuses the dataset into a 3-mode tensor and imputes missing TF binding signals via simultaneous completion of multiple TF binding matrices with positional consistency. By assessing imputation methods against observed ChIP-seq TF binding profiles, we show that signals predicted by our method achieve overall similarity with experimental data and that TFBSImpute significantly outperforms baseline approaches. Besides, motif analysis shows that TFBSImpute performs better in capturing binding motifs enriched in observed data compared with baselines, indicating that the higher performance of TFBSImpute is not simply due to averaging related samples. We anticipate that our approach will constitute a useful complement to experimental mapping of TF binding, which is beneficial for further study of regulation mechanisms and disease.
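    The matrix-completion idea can be sketched at rank 1: assume each TF's binding signal is a per-TF scale times a shared positional profile, so the observed positions of one TF let us fill the other's gaps. This is an invented miniature of positional consistency, not the paper's tensor algorithm; the data are made up.

```python
tf1 = [2.0, 4.0, 8.0, 4.0, 2.0]       # fully observed TF over 5 genomic bins
tf2 = [1.0, 2.0, None, 2.0, None]     # second TF with missing bins

# shared positional profile, estimated from the fully observed TF
total = sum(tf1)
profile = [x / total for x in tf1]

# per-TF scale fitted on tf2's observed entries (closed-form least squares)
observed = [(i, y) for i, y in enumerate(tf2) if y is not None]
num = sum(y * profile[i] for i, y in observed)
den = sum(profile[i] ** 2 for i, _ in observed)
scale = num / den

# impute missing bins from the shared profile
imputed = [y if y is not None else scale * profile[i]
           for i, y in enumerate(tf2)]
```

Positional consistency is what licenses sharing `profile` across matrices; the paper generalizes this to many TFs and cell lines completed simultaneously.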

  20. Non-Born-Oppenheimer trajectories with self-consistent decay of mixing

    International Nuclear Information System (INIS)

    Zhu Chaoyuan; Jasper, Ahren W.; Truhlar, Donald G.

    2004-01-01

    A semiclassical trajectory method, called the self-consistent decay of mixing (SCDM) method, is presented for the treatment of electronically nonadiabatic dynamics. The SCDM method is a modification of the semiclassical Ehrenfest (SE) method (also called the semiclassical time-dependent self-consistent-field method) that solves the problem of unphysical mixed final states by including decay-of-mixing terms in the equations for the evolution of the electronic state populations. These terms generate a force, called the decoherent force (or dephasing force), that drives the electronic component of each trajectory toward a pure state. Results for several mixed quantum-classical methods, in particular the SCDM, SE, and natural-decay-of-mixing methods and several trajectory surface hopping methods, are compared to the results of accurate quantum mechanical calculations for 12 cases involving five different fully dimensional triatomic model systems. The SCDM method is found to be the most accurate of the methods tested. The method should be useful for the simulation of photochemical reactions.
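    The decay-of-mixing idea, populations of a mixed electronic state being damped toward a pure state on a decoherence timescale, can be sketched numerically. The damping form dp/dt = -p/tau below is an assumed schematic stand-in, not the SCDM equations of motion, and the parameters are arbitrary.

```python
import math

def decay_of_mixing(p0, tau, dt, steps):
    """Euler-integrate dp/dt = -p / tau: the population of the
    non-dominant electronic state decays toward zero, i.e. the
    trajectory is driven toward a pure state."""
    p = p0
    for _ in range(steps):
        p -= p / tau * dt
    return p

# start from an equal 50/50 mixture, decoherence time tau = 10 (arbitrary units)
p_final = decay_of_mixing(0.5, tau=10.0, dt=0.01, steps=5000)
# analytic solution is p0 * exp(-t / tau) with t = steps * dt = 50
```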

  1. Consistent Conformal Extensions of the Standard Model

    CERN Document Server

    Loebbert, Florian; Plefka, Jan

    The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.

  2. Designing a mixed methods study in primary care.

    Science.gov (United States)

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  3. Studying the Night Shift: A Multi-method Analysis of Overnight Academic Library Users

    Directory of Open Access Journals (Sweden)

    David Schwieder

    2017-09-01

    Objective – This paper reports on a study which assessed the preferences and behaviors of overnight library users at a major state university. The findings were used to guide the design and improvement of overnight library resources and services, and the selection of a future overnight library site. Methods – A multi-method design used descriptive and correlational statistics to analyze data produced by a multi-sample survey of overnight library users. These statistical methods included rankings, percentages, and multiple regression. Results – Results showed a strong consistency across statistical methods and samples. Overnight library users consistently prioritized facilities like power outlets for electronic devices, and group and quiet study spaces, and placed far less emphasis on assistance from library staff. Conclusions – By employing more advanced statistical and sampling procedures than had been found in previous research, this paper strengthens the validity of findings on overnight user preferences and behaviors. The multi-method research design can also serve to guide future work in this area.

  4. A study of Consistency in the Selection of Search Terms and Search Concepts: A Case Study in National Taiwan University

    Directory of Open Access Journals (Sweden)

    Mu-hsuan Huang

    2001-12-01

    This article analyzes the consistency in the selection of search terms and search concepts of college and graduate students at National Taiwan University when using the PsycLIT CD-ROM database. 31 students conducted pre-assigned searches, performing 59 searches and generating 609 search terms. The study finds that consistency in the selection of search terms is 22.14% at the first level and 35% at the second level, results similar to those of other studies. Regarding consistency in search concepts, both the overlap of retrieved articles and the overlap of articles judged relevant are lower than in other studies. [Article content in Chinese]

  5. Frank Gilbreth and health care delivery method study driven learning.

    Science.gov (United States)

    Towill, Denis R

    2009-01-01

    The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods, as applied to healthcare delivery improvement programmes. The article traces the origin of what is now currently and variously called "business process re-engineering", "business process improvement" and "lean healthcare" etc., by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method (saving time plus saving motion) study is best practised as co-joint action learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored into a team approach involving all "players" in the system especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes. 
The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised

  6. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing a Variational Autoencoder (VAE). Instead of using a pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures that the VAE's output preserves the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...

  7. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    OpenAIRE

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan, where suitable propagation practices are not yet well established. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In o...

  8. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study

    OpenAIRE

    Ilic, Dragan; Hart, William; Fiddes, Patrick; Misso, Marie; Villanueva, Elmer

    2013-01-01

    Background Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. Methods A mixed-methods study was conducted consisting of a controlled trial and focus groups with second ye...

  9. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    International Nuclear Information System (INIS)

    Clabo, D.A. Jr.

    1987-04-01

    Inclusion of the anharmonicity of normal mode vibrations [i.e., the third and fourth (and higher) derivatives of a molecular Born-Oppenheimer potential energy surface] is necessary in order to theoretically reproduce experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent by the inclusion of electron correlation within a large basis set for small molecules, in general, molecular fundamental vibrational frequencies are more often available from high resolution vibration-rotation spectra. Recently developed analytic third derivative methods for self-consistent-field (SCF) wavefunctions have made it possible to examine with previously unavailable accuracy and computational efficiency the anharmonic force fields of small molecules.

  10. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    Energy Technology Data Exchange (ETDEWEB)

    Clabo, D.A. Jr.

    1987-04-01

    Inclusion of the anharmonicity of normal mode vibrations (i.e., the third and fourth (and higher) derivatives of a molecular Born-Oppenheimer potential energy surface) is necessary in order to theoretically reproduce experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent by the inclusion of electron correlation within a large basis set for small molecules, in general, molecular fundamental vibrational frequencies are more often available from high resolution vibration-rotation spectra. Recently developed analytic third derivative methods for self-consistent-field (SCF) wavefunctions have made it possible to examine with previously unavailable accuracy and computational efficiency the anharmonic force fields of small molecules.

  11. Measuring consistency of web page design and its effects on performance and satisfaction.

    Science.gov (United States)

    Ozok, A A; Salvendy, G

    2000-04-01

    This study examines the methods for measuring the consistency levels of web pages and the effect of consistency on the performance and satisfaction of the world-wide web (WWW) user. For clarification, a home page is referred to as a single page that is the default page of a web site on the WWW. A web page refers to a single screen that indicates a specific address on the WWW. This study has tested a series of web pages that were mostly hyperlinked. Therefore, the term 'web page' has been adopted for the nomenclature while referring to the objects of which the features were tested. It was hypothesized that participants would perform better and be more satisfied using web pages that have consistent rather than inconsistent interface design; that the overall consistency level of an interface design would significantly correlate with the three elements of consistency, physical, communicational and conceptual consistency; and that physical and communicational consistencies would interact with each other. The hypotheses were tested in a four-group, between-subject design, with 10 participants in each group. The results partially support the hypothesis regarding error rate, but not regarding satisfaction and performance time. The results also support the hypothesis that each of the three elements of consistency significantly contribute to the overall consistency of a web page, and that physical and communicational consistencies interact with each other, while conceptual consistency does not interact with them.
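    The reported relationships between consistency elements are correlations; presumably Pearson product-moment correlations underlie such analyses. A small implementation on hypothetical per-page scores (the data below are invented, not the study's measurements):

```python
import math

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical physical-consistency vs overall-consistency scores per page
physical = [3.1, 4.0, 2.5, 4.8, 3.6]
overall  = [3.0, 4.2, 2.7, 4.9, 3.5]
r = pearson(physical, overall)   # close to 1 for these strongly related scores
```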

  12. Efficient implementation of three-dimensional reference interaction site model self-consistent-field method: application to solvatochromic shift calculations.

    Science.gov (United States)

    Minezawa, Noriyuki; Kato, Shigeki

    2007-02-07

    The authors present an implementation of the three-dimensional reference interaction site model self-consistent-field (3D-RISM-SCF) method. First, they introduce a robust and efficient algorithm for solving the 3D-RISM equation. The algorithm is a hybrid of the Newton-Raphson and Picard methods. The Jacobian matrix is analytically expressed in a computationally useful form. Second, they discuss the solute-solvent electrostatic interaction. For the solute to solvent route, the electrostatic potential (ESP) map on a 3D grid is constructed directly from the electron density. The charge fitting procedure is not required to determine the ESP. For the solvent to solute route, the ESP acting on the solute molecule is derived from the solvent charge distribution obtained by solving the 3D-RISM equation. Matrix elements of the solute-solvent interaction are evaluated by the direct numerical integration. A remarkable reduction in the computational time is observed in both routes. Finally, the authors implement the first derivatives of the free energy with respect to the solute nuclear coordinates. They apply the present method to "solute" water and formaldehyde in aqueous solvent using the simple point charge model, and the results are compared with those from other methods: the six-dimensional molecular Ornstein-Zernike SCF, the one-dimensional site-site RISM-SCF, and the polarizable continuum model. The authors also calculate the solvatochromic shifts of acetone, benzonitrile, and nitrobenzene using the present method and compare them with the experimental and other theoretical results.
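    The hybrid Newton-Raphson/Picard strategy described above can be sketched on a one-dimensional fixed-point problem: a few damped Picard sweeps reach the basin of attraction, then Newton with an analytic Jacobian converges quadratically. This solves x = cos x rather than the 3D-RISM equation; the damping factor and step counts are invented.

```python
import math

def solve_fixed_point(g, dg, x0, picard_steps=5, tol=1e-12):
    x = x0
    for _ in range(picard_steps):        # damped Picard sweeps: x <- (x + g(x)) / 2
        x = 0.5 * x + 0.5 * g(x)
    for _ in range(100):                 # Newton-Raphson on f(x) = g(x) - x
        f = g(x) - x
        if abs(f) < tol:
            break
        x -= f / (dg(x) - 1.0)           # analytic Jacobian of f is dg - 1
    return x

root = solve_fixed_point(math.cos, lambda x: -math.sin(x), 1.0)
```

The division of labor mirrors the paper's design: cheap fixed-point sweeps for robustness far from the solution, and the analytically expressed Jacobian only where it pays off.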

  13. Consistency between Self-Reported and Recorded Values for Clinical Measures

    OpenAIRE

    Thomas, Joseph, III; Paulet, Mindy; Rajpura, Jigar R.

    2016-01-01

    Objectives. This study evaluated consistency between self-reported values for clinical measures and recorded clinical measures. Methods. Self-reported values were collected for the clinical measures: systolic blood pressure, diastolic blood pressure, glucose level, height, weight, and cholesterol from health risk assessments completed by enrollees in a privately insured cohort. Body mass index (BMI) was computed from reported height and weight. Practitioner recorded values for the clinical me...

  14. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. Here, a novel scheme is presented called double topological relationship consistency (DCTR). The combination of double topological configuration includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced model of matching, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods, which depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. By this method, we can obtain correspondences with high precision on wide-baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on image pairs.
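    RANSAC itself is easy to illustrate. The sketch below fits a 2D line in the presence of outliers; the paper instead hypothesizes epipolar geometry from point correspondences, but the hypothesize-score-keep-best loop is the same. All data and thresholds are invented.

```python
import random

def ransac_line(points, iters=200, thresh=0.1, seed=0):
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)   # minimal sample: 2 points
        if x1 == x2:
            continue                                  # skip degenerate hypotheses
        a = (y2 - y1) / (x2 - x1)                     # slope
        b = y1 - a * x1                               # intercept
        inliers = [(x, y) for x, y in points
                   if abs(y - (a * x + b)) < thresh]  # score by inlier count
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

pts = [(x, 2 * x + 1) for x in range(10)] + [(3.0, 40.0), (7.0, -5.0)]  # 2 outliers
model, inliers = ransac_line(pts)
```

For fundamental-matrix estimation the minimal sample is larger (7 or 8 correspondences) and the residual is a point-to-epipolar-line distance, but the structure of the loop is unchanged.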

  15. Daily Behavior Report Cards: An Investigation of the Consistency of On-Task Data across Raters and Methods

    Science.gov (United States)

    Chafouleas, Sandra M.; Riley-Tillman, T. Chris; Sassu, Kari A.; LaFrance, Mary J.; Patwa, Shamim S.

    2007-01-01

    In this study, the consistency of on-task data collected across raters using either a Daily Behavior Report Card (DBRC) or systematic direct observation was examined to begin to understand the decision reliability of using DBRCs to monitor student behavior. Results suggested very similar conclusions might be drawn when visually examining data…

  16. Modeling a Consistent Behavior of PLC-Sensors

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2014-01-01

    The article extends a cycle of papers dedicated to programming and verification of PLC-programs by LTL-specification. This approach enables correctness analysis of PLC-programs by the model checking method. The model checking method requires construction of a finite model of a PLC program. For successful verification of required properties it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact requires more attention in the construction of the PLC-program model. In this paper we propose to describe a consistent behavior of sensors by three groups of LTL-formulas. They will affect the program model, approximating it to the actual behavior of the PLC program. The idea of the LTL-requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, the approach to modeling a consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions without extending the program model by additional structures for realizing a realistic behavior of sensors. The consistent behavior of sensors is taken into account only at the stage of checking conformity of the program model to required properties, i.e. a property satisfaction proof for the constructed model occurs under the condition that the model contains only those executions of the program that comply with the consistent behavior of sensors.
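    The restriction to consistent sensor combinations can be pictured as filtering the raw input space before model checking. Below is a schematic example with an assumed mutual-exclusion constraint between two limit switches of one actuator; the paper expresses such constraints as LTL-formulas over the model rather than as a Python predicate.

```python
from itertools import product

def consistent(top, bottom):
    # assumed plant constraint: the two limit switches of one actuator
    # are never pressed at the same time
    return not (top and bottom)

raw = list(product([0, 1], repeat=2))                # all 4 raw input vectors
states = [(t, b) for t, b in raw if consistent(t, b)]
# the model checker then explores only these 3 of the 4 raw combinations
```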

  17. Personality and Situation Predictors of Consistent Eating Patterns

    OpenAIRE

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.

    2015-01-01

    Introduction A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...

  18. Microemulsion Electrokinetic Chromatography in Combination with Chemometric Methods to Evaluate the Holistic Quality Consistency and Predict the Antioxidant Activity of Ixeris sonchifolia (Bunge) Hance Injection.

    Directory of Open Access Journals (Sweden)

    Lanping Yang

    In this paper, microemulsion electrokinetic chromatography (MEEKC) fingerprints combined with quantification were successfully developed to monitor the holistic quality consistency of Ixeris sonchifolia (Bge.) Hance Injection (ISHI). ISHI is a Chinese traditional patent medicine used for its anti-inflammatory and hemostatic effects. The effects of five crucial experimental variables on MEEKC were optimized by central composite design. Under the optimized conditions, the MEEKC fingerprints of 28 ISHIs were developed. Quantitative determination of seven marker compounds was employed simultaneously; then 28 batches of samples from two manufacturers were clearly divided into two clusters by principal component analysis. In fingerprint assessments, a systematic quantitative fingerprint method was established for the holistic quality consistency evaluation of ISHI from qualitative and quantitative perspectives, by which the qualities of the 28 samples were well differentiated. In addition, the fingerprint-efficacy relationship between the fingerprints and the antioxidant activities was established utilizing orthogonal projection to latent structures, which provided important medicinal efficacy information for quality control. The present study offers a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations.

  19. Multi-view 3D echocardiography compounding based on feature consistency

    Science.gov (United States)

    Yao, Cheng; Simpson, John M.; Schaeffter, Tobias; Penney, Graeme P.

    2011-09-01

    Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.
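    Consistency-weighted compounding can be sketched in one dimension. Here each sample is weighted by its agreement with the median across views; this is an assumed stand-in for the paper's local feature coherence weights, and the "images" are invented three-sample signals.

```python
def compound(views):
    """Fuse aligned 1D 'images': weight each view's sample by its agreement
    with the per-position median, then take the weighted average."""
    fused = []
    for i in range(len(views[0])):
        column = sorted(v[i] for v in views)
        median = column[len(column) // 2]
        weights = [1.0 / (1.0 + abs(v[i] - median)) for v in views]
        fused.append(sum(w * v[i] for w, v in zip(weights, views)) / sum(weights))
    return fused

views = [[10, 10, 10], [10, 10, 10], [10, 10, 90]]   # third view has an artefact
fused = compound(views)
# the artefact sample is strongly down-weighted relative to a plain mean (36.7)
```

This captures the paper's trade-off in miniature: consistent samples average down the noise (like the mean method) while outlying artefact samples contribute little (like the maximum method suppresses dropouts).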

  20. Multi-view 3D echocardiography compounding based on feature consistency

    International Nuclear Information System (INIS)

    Yao Cheng; Schaeffter, Tobias; Penney, Graeme P; Simpson, John M

    2011-01-01

    Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.

  1. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm in the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
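    For full constraint sets, the literature characterizes satisfiable orthology relations as exactly those whose orthology graph is a cograph, i.e. has no induced path on four vertices (P4). A brute-force P4 check for small graphs is sketched below; this is illustrative only (practical tools use linear-time cograph recognition, and the gene names are invented).

```python
from itertools import combinations, permutations

def has_induced_p4(vertices, edges):
    """Return True if the undirected graph contains an induced P4,
    i.e. it is not a cograph."""
    E = {frozenset(e) for e in edges}
    adj = lambda a, b: frozenset((a, b)) in E
    for quad in combinations(vertices, 4):
        for a, b, c, d in permutations(quad):
            if (adj(a, b) and adj(b, c) and adj(c, d)
                    and not adj(a, c) and not adj(a, d) and not adj(b, d)):
                return True
    return False

# a-b-c-d path: not realizable as a full orthology relation
p4 = has_induced_p4("abcd", [("a", "b"), ("b", "c"), ("c", "d")])
# 4-cycle: P4-free (a cograph), hence realizable
c4 = has_induced_p4("abcd", [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
```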

  2. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  3. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  4. Communication: On the consistency of approximate quantum dynamics simulation methods for vibrational spectra in the condensed phase.

    Science.gov (United States)

    Rossi, Mariana; Liu, Hanchao; Paesani, Francesco; Bowman, Joel; Ceriotti, Michele

    2014-11-14

    Including quantum mechanical effects on the dynamics of nuclei in the condensed phase is challenging, because the complexity of exact methods grows exponentially with the number of quantum degrees of freedom. Efforts to circumvent these limitations can be traced back to two approaches: methods that treat a small subset of the degrees of freedom with rigorous quantum mechanics, considering the rest of the system as a static or classical environment, and methods that treat the whole system quantum mechanically, but using approximate dynamics. Here, we perform a systematic comparison between these two philosophies for the description of quantum effects in vibrational spectroscopy, taking the Embedded Local Monomer model and a mixed quantum-classical model as representatives of the first family of methods, and centroid molecular dynamics and thermostatted ring polymer molecular dynamics as examples of the latter. We use as benchmarks D2O doped with HOD and pure H2O at three distinct thermodynamic state points (ice Ih at 150 K, and the liquid at 300 K and 600 K), modeled with the simple q-TIP4P/F potential energy and dipole moment surfaces. With few exceptions the different techniques yield IR absorption frequencies that are consistent with one another within a few tens of cm^-1. Comparison with classical molecular dynamics demonstrates the importance of nuclear quantum effects up to the highest temperature, and a detailed discussion of the discrepancies between the various methods lets us draw some (circumstantial) conclusions about the impact of the very different approximations that underlie them. Such cross validation between radically different approaches could indicate a way forward to further improve the state of the art in simulations of condensed-phase quantum dynamics.

  5. Quantum Chemistry, an Eclectic Mix: From Silicon Carbide to Size Consistency

    Energy Technology Data Exchange (ETDEWEB)

    Rintelman, Jamie Marie [Iowa State Univ., Ames, IA (United States)

    2004-12-19

    Chemistry is a field of great breadth and variety. It is this diversity that makes for both an interesting and challenging field. My interests have spanned three major areas of theoretical chemistry: applications, method development, and method evaluation. The topics presented in this thesis are as follows: (1) a multi-reference study of the geometries and relative energies of four atom silicon carbide clusters in the gas phase; (2) the reaction of acetylene on the Si(100)-(2x1) surface; (3) an improvement to the Effective Fragment Potential (EFP) solvent model to enable the study of reactions in both aqueous and nonaqueous solution; and (4) an evaluation of the size consistency of Multireference Perturbation Theory (MRPT). In the following section, the author briefly discusses two topics central to, and present throughout, this thesis: Multi-reference methods and Quantum Mechanics/Molecular Mechanics (QM/MM) methods.

  6. Self-consistent-field method and τ-functional method on group manifold in soliton theory. II. Laurent coefficients of soliton solutions for sln and for sun

    International Nuclear Information System (INIS)

    Nishiyama, Seiya; Providencia, Joao da; Komatsu, Takao

    2007-01-01

    To go beyond perturbative method in terms of variables of collective motion, using infinite-dimensional fermions, we have aimed to construct the self-consistent-field (SCF) theory, i.e., time dependent Hartree-Fock theory on associative affine Kac-Moody algebras along the lines of soliton theory. In this paper, toward such an ultimate goal we will reconstruct a theoretical frame for a υ (external parameter)-dependent SCF method to describe more precisely the dynamics on the infinite-dimensional fermion Fock space. An infinite-dimensional fermion operator is introduced through Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent and a Υ-periodic potential. As an illustration, we derive explicit expressions for the Laurent coefficients of soliton solutions for sl n and for su n on infinite-dimensional Grassmannian. The associative affine Kac-Moody algebras play a crucial role to determine the dynamics on the infinite-dimensional fermion Fock space

  7. Internal consistency and content validity of a questionnaire aimed to assess the stages of behavioral lifestyle changes in Colombian schoolchildren: The Fuprecol study

    Directory of Open Access Journals (Sweden)

    Yasmira CARRILLO-BERNATE

    Objective: To assess the internal consistency and content validity of a questionnaire aimed to assess the stages of behavioural lifestyle changes in a sample of school-aged children and adolescents aged 9 to 17 years. Methods: This validation study involved 675 schoolchildren from three official schools in the city of Bogota, Colombia. A self-administered questionnaire called Behavioural Lifestyle Changes was designed to explore stages of change regarding physical activity/exercise, fruit and vegetable consumption, alcohol abuse, tobacco use, and drug abuse. Cronbach's α, the Kappa index and exploratory factor analysis were used to evaluate internal consistency and content validity. Results: The study population consisted of 51.1% males, and the participants' average age was 12.7±2.4 years. The Behavioural Lifestyle Changes questionnaire scored 0.720 (range 0.691 to 0.730) on Cronbach's α, and intra-observer reproducibility was good (Kappa=0.71). Exploratory factor analysis determined two factors (factor 1: physical activity/exercise and fruit and vegetable consumption; factor 2: alcohol abuse, tobacco use and drug abuse), explaining 67.78% of the variance of the items and six interactions (χ2/gL=11649.833; p<0.001). Conclusion: The Behavioural Lifestyle Changes questionnaire was seen to have suitable internal consistency and validity. This instrument can be recommended, mainly within the context of primary care, for studying the stages involved in the lifestyle behavioural change model in a school-based population.
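    The Cronbach's α reported in the Results can be computed directly from a respondents-by-items score matrix; a minimal generic sketch (not tied to the Fuprecol data):

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for a (respondents x items) score matrix:
        alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_var = scores.var(axis=0, ddof=1).sum()   # per-item sample variances
        total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
        return k / (k - 1) * (1.0 - item_var / total_var)

    # perfectly correlated items give alpha = 1
    print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # -> 1.0
    ```

    Values around 0.7, as in this study, are conventionally read as acceptable internal consistency.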

  8. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of {sup 232}Th, {sup 233}U, {sup 235}U, {sup 238}U and {sup 239}Pu at YAYOI. The consistency among the measured values is found to be satisfied for the {beta} component and fairly well for the {gamma} component, except for cooling times longer than 4000 s. (author)

  9. Precommitted Investment Strategy versus Time-Consistent Investment Strategy for a General Risk Model with Diffusion

    Directory of Open Access Journals (Sweden)

    Lidong Zhang

    2014-01-01

    We mainly study a general risk model and investigate the precommitted strategy and the time-consistent strategy under the mean-variance criterion. A Lagrange method is proposed to derive the precommitted investment strategy. Meanwhile, from the game-theoretic perspective, we find the time-consistent investment strategy by solving the extended Hamilton-Jacobi-Bellman equations. By comparing the precommitted strategy with the time-consistent strategy, we find that the company under the time-consistent strategy has to give up the better current utility in order to keep a consistent satisfaction over the whole time horizon. Furthermore, we theoretically and numerically provide the effect of the parameters on these two optimal strategies and the corresponding value functions.

  10. Creation of Consistent Burn Wounds: A Rat Model

    Directory of Open Access Journals (Sweden)

    Elijah Zhengyang Cai

    2014-07-01

    Background: Burn infliction techniques are poorly described in rat models. An accurate study can only be achieved with wounds that are uniform in size and depth. We describe a simple reproducible method for creating consistent burn wounds in rats. Methods: Ten male Sprague-Dawley rats were anesthetized and the dorsum shaved. A 100 g cylindrical stainless-steel rod (1 cm diameter) was heated to 100℃ in boiling water. Temperature was monitored using a thermocouple. We performed two consecutive toe-pinch tests on different limbs to assess the depth of sedation. Burn infliction was limited to the loin. The skin was pulled upwards, away from the underlying viscera, creating a flat surface. The rod rested under its own weight for 5, 10, and 20 seconds at three different sites on each rat. Wounds were evaluated for size, morphology and depth. Results: Average wound size was 0.9957 cm2 (standard deviation [SD] 0.1845; n=30). Wounds created with a duration of 5 seconds were pale, with an indistinct margin of erythema. Wounds of 10 and 20 seconds were well-defined, uniformly brown with a rim of erythema. Average depths of tissue damage were 1.30 mm (SD 0.424), 2.35 mm (SD 0.071), and 2.60 mm (SD 0.283) for durations of 5, 10, and 20 seconds respectively. Burn duration of 5 seconds resulted in partial-thickness damage. Burn durations of 10 seconds and 20 seconds resulted in full-thickness damage, involving subjacent skeletal muscle. Conclusions: This is a simple reproducible method for creating burn wounds consistent in size and depth in a rat burn model.

  11. Gate-controlled current and inelastic electron tunneling spectrum of benzene: a self-consistent study.

    Science.gov (United States)

    Liang, Y Y; Chen, H; Mizuseki, H; Kawazoe, Y

    2011-04-14

    We use a density functional theory based nonequilibrium Green's function method to self-consistently study the current through 1,4-benzenedithiol (BDT). The elastic and inelastic tunneling properties through this Au-BDT-Au molecular junction are simulated. For the elastic tunneling case, it is found that the current through the tilted molecule can be modulated effectively by an external gate field perpendicular to the phenyl ring. The gate voltage amplification comes from the modulation of the interaction between the electrodes and the molecules in the junction. For the inelastic case, electron tunneling scattered by the molecular vibrational modes is considered within the self-consistent Born approximation scheme, and the inelastic electron tunneling spectrum is calculated.

  12. Bidirectional associations between mothers' and fathers' parenting consistency and child BMI

    OpenAIRE

    Jansen, Pauline; Giallo, Rebecca; Westrupp, Elizabeth; Wake, Melissa; Nicholson, Jan

    2013-01-01

    BACKGROUND: Research suggests that general parenting dimensions and styles are associated with children's BMI, but directionality in this relationship remains unknown. Moreover, there has been little attention to the influences of both mothers' and fathers' parenting. We aimed to examine reciprocal relationships between maternal and paternal parenting consistency and child BMI. METHODS: Participants were 4002 children and their parents in the population-based Longitudinal Study of...

  13. A Thermodynamically-Consistent Non-Ideal Stochastic Hard-Sphere Fluid

    Energy Technology Data Exchange (ETDEWEB)

    Donev, A; Alder, B J; Garcia, A L

    2009-08-03

    A grid-free variant of the Direct Simulation Monte Carlo (DSMC) method is proposed, named the Isotropic DSMC (I-DSMC) method, that is suitable for simulating collision-dominated dense fluid flows. The I-DSMC algorithm eliminates all grid artifacts from the traditional DSMC algorithm and is Galilean invariant and microscopically isotropic. The stochastic collision rules in I-DSMC are modified to introduce a non-ideal structure factor that gives consistent compressibility, as first proposed in [Phys. Rev. Lett. 101:075902 (2008)]. The resulting Stochastic Hard Sphere Dynamics (SHSD) fluid is empirically shown to be thermodynamically identical to a deterministic Hamiltonian system of penetrable spheres interacting with a linear core pair potential, well-described by the hypernetted chain (HNC) approximation. We develop a kinetic theory for the SHSD fluid to obtain estimates for the transport coefficients that are in excellent agreement with particle simulations over a wide range of densities and collision rates. The fluctuating hydrodynamic behavior of the SHSD fluid is verified by comparing its dynamic structure factor against theory based on the Landau-Lifshitz Navier-Stokes equations. We also study the Brownian motion of a nano-particle suspended in an SHSD fluid and find a long-time power-law tail in its velocity autocorrelation function consistent with hydrodynamic theory and molecular dynamics calculations.

  14. Physical Model Method for Seismic Study of Concrete Dams

    Directory of Open Access Journals (Sweden)

    Bogdan Roşca

    2008-01-01

    The study of the dynamic behaviour of concrete dams by means of the physical model method is very useful for understanding the failure mechanism of these structures under strong earthquakes. The physical model method consists of two main processes. First, a study model must be designed by a physical modelling process using dynamic modelling theory; the result is a system of equations for dimensioning the physical model. After construction and instrumentation of the scaled physical model, a structural analysis based on experimental means is performed, and the experimental results are gathered and made available for analysis. Depending on the aim of the research, either an elastic or a failure physical model may be designed. The requirements for constructing an elastic model are easier to meet than those for a failure model, but the results obtained provide only limited information. To study the behaviour of concrete dams under strong seismic action, failure physical models are required that can accurately simulate the possible opening of joints, sliding between concrete blocks and the cracking of concrete. The design relations for both elastic and failure physical models are based on dimensional analysis and consist of similitude relations among the physical quantities involved in the phenomenon. The use of physical models of large or medium dimensions, as well as their instrumentation, offers great advantages, but this operation involves a large amount of financial, logistic and time resources.

  15. Efficient self-consistency for magnetic tight binding

    Science.gov (United States)

    Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.

    2011-06-01

    Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO. Program summaryProgram title:PLATO Catalogue identifier: AEFC_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 228 747 No. of bytes in distributed program, including test data, etc.: 1 880 369 Distribution format: tar.gz Programming language: C and PERL Computer: Apple Macintosh, PC, Unix machines Operating system: Unix, Linux, Mac OS X, Windows XP Has the code been vectorised or parallelised?: Yes. Up to 256 processors tested RAM: Up to 2 Gbytes per processor Classification: 7.3 External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW Catalogue identifier of previous version: AEFC_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616 Does the new version supersede the previous version?: Yes Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very
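    The Newton-Raphson search for charge and spin self-consistency described above amounts to finding a stationary point of a fixed-point problem q = F(q). A toy sketch of the idea (F here is an arbitrary contraction with a finite-difference Jacobian, not PLATO's actual Harris-Foulkes update):

    ```python
    import numpy as np

    def newton_scf(F, q0, tol=1e-10, max_iter=50):
        """Solve q = F(q) by Newton-Raphson on the residual R(q) = q - F(q),
        using a finite-difference Jacobian. A toy stand-in for the
        stationary-point search of a Harris-Foulkes-like functional."""
        q = np.asarray(q0, dtype=float)
        for _ in range(max_iter):
            R = q - F(q)
            if np.linalg.norm(R) < tol:
                break
            n, h = q.size, 1e-7
            J = np.empty((n, n))
            for j in range(n):
                dq = np.zeros(n)
                dq[j] = h
                # column j of the Jacobian of the residual
                J[:, j] = ((q + dq) - F(q + dq) - R) / h
            q = q - np.linalg.solve(J, R)   # Newton step
        return q

    # toy "charge/spin update" map with a unique fixed point
    F = lambda q: 0.5 * np.tanh(q) + np.array([0.2, -0.1])
    q_star = newton_scf(F, np.zeros(2))
    ```

    Compared with plain linear mixing (q ← (1−α)q + αF(q)), the Newton step typically converges in far fewer iterations, which is the efficiency gain the abstract reports.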

  16. Methods for Analyzing Multivariate Phenotypes in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Qiong Yang

    2012-01-01

    Multivariate phenotypes are frequently encountered in genetic association studies. The purpose of analyzing multivariate phenotypes usually includes the discovery of novel genetic variants with pleiotropic effects, that is, affecting multiple phenotypes, with the ultimate goal of uncovering the underlying genetic mechanism. In recent years, there has been new method development and application of existing statistical methods to such phenotypes. In this paper, we provide a review of the available methods for analyzing association between a single marker and a multivariate phenotype consisting of the same type of components (e.g., all continuous or all categorical) or different types of components (e.g., some are continuous and others are categorical). We also review causal inference methods designed to test whether the detected association with the multivariate phenotype is truly pleiotropy or whether the genetic marker exerts its effects on some phenotypes through affecting the others.

  17. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised, and single-curve models are extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  18. Consistency between referral diagnosis and post-ENMG diagnosis in children

    International Nuclear Information System (INIS)

    Komur, M.; Okuyaz, C.; Makharoblidze, K.

    2014-01-01

    Objective: To evaluate the degree of consistency between the referral diagnosis and that based on electroneuromyography. Methods: The retrospective study was conducted at the Paediatric Neurology Laboratory of Mersin University School of Medicine, Turkey, and comprised all electroneuromyographies carried out between January 2005 and December 2010. Demographic data, referral diagnosis and post-procedure diagnosis were recorded for each patient, and were classified into groups. Consistency between the two groups was compared using SPSS 13. Results: Of the total 294 patients, polyneuropathy was the reason for referral in 104 (35.4%), peripheral nerve injury in 54 (18.4%), brachial plexus injury in 52 (17.7%), myopathy in 52 (17.7%), hypotonia in 23 (7.8%), and facial paralysis in 9 (3.0%) patients. There was consistency between the two diagnoses in 179 (60.9%) patients. Conclusion: Electroneuromyography is an unpleasant, painful and stressful procedure for children, and, therefore, it should be recommended only in cases where the result may be beneficial in the diagnosis, treatment and follow-up of a patient. (author)

  19. Studies on the consistency of internally taken contrast medium for pancreas CT

    Energy Technology Data Exchange (ETDEWEB)

    Matsushima, Kishio; Mimura, Seiichi; Tahara, Seiji; Kitayama, Takuichi; Inamura, Keiji; Mikami, Yasutaka; Hashimoto, Keiji; Hiraki, Yoshio; Aono, Kaname

    1985-02-01

    A problem of pancreatic CT scanning is the discrimination between the pancreas and the adjacent gastrointestinal tract. Generally we administer a dilution of gastrografin internally to make the discrimination. The degree of dilution has been decided by experience at each hospital. When the concentration of the contrast medium is too low, an enhancement effect cannot be expected, but when the concentration is too high, artifacts appear. We have experimented on the degree of dilution and the CT number to decide the optimum concentration of gastrografin for the diagnosis of pancreatic disease. Statistical analysis of the results shows the optimum dilution of gastrografin to be 1.5%.

  20. Personality consistency analysis in cloned quarantine dog candidates

    Directory of Open Access Journals (Sweden)

    Jin Choi

    2017-01-01

    In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups followed by the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned ones. This study implies that personality consistency could be one of the ways to analyse traits of puppies.
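    Test-retest agreement of the kind examined here (the same subtest scored at 7–10 weeks and again at 16 weeks) is often summarized with Cohen's kappa; a generic sketch in numpy (illustrative only, not the scoring scheme of the Puppy Aptitude Test):

    ```python
    import numpy as np

    def cohens_kappa(session1, session2):
        """Cohen's kappa: chance-corrected agreement between two ratings
        of the same subjects, e.g. categorical subtest scores at two ages."""
        r1, r2 = np.asarray(session1), np.asarray(session2)
        p_obs = np.mean(r1 == r2)                        # observed agreement
        p_exp = sum(np.mean(r1 == c) * np.mean(r2 == c)
                    for c in np.union1d(r1, r2))         # agreement expected by chance
        return (p_obs - p_exp) / (1.0 - p_exp)
    ```

    Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, so higher values indicate more consistent responses across the two test ages.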

  1. EVALUATION OF CONSISTENCY AND SETTING TIME OF IRANIAN DENTAL STONES

    Directory of Open Access Journals (Sweden)

    F GOL BIDI

    2000-09-01

    Introduction: Dental stones are widely used in dentistry, and the success or failure of many dental treatments depends on the accuracy of these gypsum products. The purpose of this study was the evaluation of Iranian dental stones and a comparison between Iranian and foreign ones. In this investigation, consistency and setting time were compared between Pars Dandan, Almas and Hinrizit stones; the latter is accepted by the ADA (American Dental Association). Consistency and setting time are 2 of 5 properties required by both ADA specification No. 25 and Iranian Standard Organization specification No. 2569 for the evaluation of dental stones. Methods: In this study, the number and preparation of specimens and the test conditions followed ADA specification No. 25, and all measurements were made with a Vicat apparatus. Results: The results of this study showed that the standard consistency of Almas stone was obtained with 42 ml water per 100 g powder, and the setting time of this stone was 11±0.03 min, which was within the limits of the ADA specification (12±4 min). The standard consistency of Pars Dandan stone was obtained with 31 ml water per 100 g powder, but the setting time of this stone was 5±0.16 min, which was not within the limits of the ADA specification. Discussion: Comparison of the properties of the Iranian and Hinrizit stones suggests two probable problems with the Iranian stones: (1) inhomogeneity of the Iranian stone powder, caused by uncontrolled temperature, pressure and humidity in the production process; and (2) impurities such as sodium chloride, responsible for the shortening of Pars Dandan's setting time.

  2. Day-to-day Consistency in Positive Parent-Child Interactions and Youth Well-Being.

    Science.gov (United States)

    Lippold, Melissa A; Davis, Kelly D; Lawson, Katie M; McHale, Susan M

    2016-12-01

    The frequency of positive parent-child interactions is associated with youth adjustment. Yet, little is known about daily parent-child interactions and how day-to-day consistency in positive parent-child interactions may be linked to youth well-being. Using a daily diary approach, this study added to this literature to investigate whether and how day-to-day consistency in positive parent-child interactions was linked to youth depressive symptoms, risky behavior, and physical health. Participants were youth whose parents were employed in the IT division of a Fortune 500 company (N = 129, mean age = 13.39, 55% female), who participated in an 8-day daily diary study. Analyses revealed that, controlling for cross-day mean levels of positive parent-child interactions, older (but not younger) adolescents who experienced more consistency in positive interactions with parents had fewer depressive and physical health symptoms (e.g., colds, flu). The discussion focuses on the utility of daily diary methods for assessing the correlates of consistency in parenting, possible processes underlying these associations, and intervention implications.

  3. Application to ion exchange study of an interferometry method

    International Nuclear Information System (INIS)

    Platzer, R.

    1960-01-01

    The numerous experiments carried out on ion exchange between clay suspensions and solutions have so far been done by studying the equilibrium between the two phases; by this method it is very difficult to obtain the kinetic properties of the exchange reactions. A method consisting of observation with an interferential microscope using polarised white light shows up the variations in concentration which take place during the ion exchange between an ionic solution and a montmorillonite slab, as well as between an ionic solution and a grain of organic ion exchanger. By analysing the results it will be possible to compare the exchange constants of organic ion exchangers with those of mineral ion exchangers. (author) [fr

  4. Study on the scope of fault tree method applicability

    International Nuclear Information System (INIS)

    Ito, Taiju

    1980-03-01

    In fault tree analyses of the reliability of nuclear safety systems, including reliability analyses of nuclear protection systems, there seem to be some documents in which the application of the fault tree method is unreasonable. In the fault tree method, the addition rule and the multiplication rule are usually used, and both must hold exactly or at least practically. The addition rule poses no problem, but the multiplication rule occasionally does. Whether the multiplication rule holds has been studied comprehensively for the unreliability, mean unavailability and instantaneous unavailability of the elements. Between the unreliabilities of elements without maintenance, the multiplication rule holds. Between the instantaneous unavailabilities of elements, with or without maintenance, the multiplication rule also holds. Between the unreliabilities of subsystems with maintenance, however, the multiplication rule does not hold, because the product value is larger than the unreliability of a parallel system consisting of the two subsystems with maintenance. Between the mean unavailabilities of elements without maintenance, the multiplication rule also does not hold, because the product value is smaller than the mean unavailability of a parallel system consisting of the two elements without maintenance. In these cases, therefore, the fault tree method may not be applied by rote for reliability analysis of the system. (author)
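    The failure of the multiplication rule for mean unavailability is easy to reproduce numerically. A sketch under assumed parameters (constant failure rate, no repair, so the instantaneous unavailability is U(t) = 1 - exp(-λt); the rate and horizon below are arbitrary illustrative values):

    ```python
    import numpy as np

    lam, T = 1.0e-3, 1000.0                  # assumed failure rate (/h) and horizon (h)
    t = np.linspace(0.0, T, 200001)          # uniform time grid over [0, T]
    U = 1.0 - np.exp(-lam * t)               # instantaneous unavailability of one element

    # mean unavailability of a 2-element parallel system: time-average of U(t)^2
    mean_parallel = (U * U).mean()
    # multiplication rule would instead use the product of the mean unavailabilities
    product_of_means = U.mean() ** 2

    # as the abstract states, the product underestimates the true mean
    assert product_of_means < mean_parallel
    ```

    Since U(t) is increasing, the time-average of U(t)^2 exceeds the square of the time-average (Jensen's inequality), which is exactly why the rule fails for mean unavailability while it holds for the instantaneous quantity.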

  5. Analytical method of waste allocation in waste management systems: Concept, method and case study

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    2017-01-15

    Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to exploit, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process”, defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP contains a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated into the framework of impact assessment methods and environmental systems analysis tools. Its originality comes from the interdisciplinary analysis of the WAP and the development of the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland). It demonstrates that this method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste

  6. Analytical method of waste allocation in waste management systems: Concept, method and case study

    International Nuclear Information System (INIS)

    Bergeron, Francis C.

    2017-01-01

    Waste is no longer merely a rejected item to be disposed of but increasingly a secondary resource to be exploited, influencing waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process”, defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP comprises a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality lies in the interdisciplinary analysis used to study the WAP and to develop the conceptual framework. AMWAP is applied in the framework of an illustrative case study on the household WM system of Geneva (Switzerland), demonstrating that the method provides an in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste

  7. Full self-consistency versus quasiparticle self-consistency in diagrammatic approaches: exactly solvable two-site Hubbard model.

    Science.gov (United States)

    Kutepov, A L

    2015-08-12

    Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity (Γ = 1 (GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E)). Comparison is made between the cases when an additional quasiparticle (QP) approximation for Green's functions is applied during the self-consistent iterative solution of HE and when it is not. The results obtained with the exact vertex bear directly on the present open question of which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. The application of QP self-consistency combined with PT vertex corrections shows problems similar to the case when the exact vertex is combined with QP self-consistency. An analysis of Ward identity violation is performed for all approximations studied in this work, and its relation to the overall accuracy of the schemes is discussed.
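    For context, the closed set of Hedin's equations that such self-consistency schemes iterate can be written (in the standard condensed index notation, where 1 stands for the combined space, spin, and time arguments) as:

    ```latex
    \begin{align*}
    \Sigma(1,2) &= i \int G(1,4)\, W(1^{+},3)\, \Gamma(4,2;3)\, d(3,4) \\
    W(1,2)      &= v(1,2) + \int v(1,3)\, P(3,4)\, W(4,2)\, d(3,4) \\
    P(1,2)      &= -i \int G(2,3)\, G(4,2^{+})\, \Gamma(3,4;1)\, d(3,4) \\
    \Gamma(1,2;3) &= \delta(1,2)\,\delta(1,3)
        + \int \frac{\delta \Sigma(1,2)}{\delta G(4,5)}\, G(4,6)\, G(7,5)\, \Gamma(6,7;3)\, d(4,5,6,7) \\
    G(1,2)      &= G_{0}(1,2) + \int G_{0}(1,3)\, \Sigma(3,4)\, G(4,2)\, d(3,4)
    \end{align*}
    ```

    Setting Γ(1,2;3) = δ(1,2)δ(1,3) recovers the GW approximation (the "Γ = 1" case in the abstract's notation); the abstract's Γ1 and exact Γ(E) correspond to progressively fuller treatments of the vertex equation.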

  8. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  9. Large scale Bayesian nuclear data evaluation with consistent model defects

    International Nuclear Information System (INIS)

    Schnabel, G

    2015-01-01

    The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply, and accelerator driven systems for nuclear waste incineration. The efficiency and safety of such future facilities is dependent on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods, which is the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, neglecting this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Due to this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and makes it possible to estimate the magnitude of the model deficiency explicitly. Both features have been missing from available evaluation methods so far. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. 
A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
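    The thesis's specific proposal distribution is not given in the abstract, but the general Metropolis-Hastings scheme it builds on can be sketched as follows. This is a minimal illustration with a symmetric Gaussian random-walk proposal and a toy standard-normal target, both assumptions for the example, not the thesis's actual setup:

    ```python
    import math
    import random

    def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=0):
        """Random-walk Metropolis-Hastings sampler.

        log_post: unnormalized log-posterior density.
        A symmetric Gaussian proposal is assumed here purely for
        illustration; the thesis uses a specific tailored proposal.
        """
        rng = random.Random(seed)
        x, lp = x0, log_post(x0)
        chain = []
        for _ in range(n_samples):
            cand = x + rng.gauss(0.0, step)
            lp_cand = log_post(cand)
            # Accept with probability min(1, posterior ratio); a symmetric
            # proposal cancels out of the Hastings ratio.
            if rng.random() < math.exp(min(0.0, lp_cand - lp)):
                x, lp = cand, lp_cand
            chain.append(x)
        return chain

    # Toy target: standard normal log-density (up to a constant).
    chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=5000)
    ```

    In a real evaluation the log-posterior would combine the experimental likelihood with the prior over model parameters and the Gaussian-process defect.
    
    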

  10. From SOPs to Reports to Evaluations: Learning and Memory as a Case Study of how Missing Data and Methods Impact Interpretation

    Science.gov (United States)

    In an era of global trade and regulatory cooperation, consistent and scientifically based interpretation of developmental neurotoxicity (DNT) studies is essential. Because there is flexibility in the selection of test method(s), consistency can be especially challenging for lea...

  11. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results...... of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted...... in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  12. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Science.gov (United States)

    Bernabe-Ortiz, Antonio; Carcamo, Cesar P; Scott, John D; Hughes, James P; Garcia, Patricia J; Holmes, King K

    2011-01-01

    Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination) especially in the jungle
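    The adjusted odds ratios above come from a multivariable regression that cannot be reproduced from the abstract alone, but the crude (unadjusted) odds ratio and its Woolf log-based confidence interval can be computed directly from a 2×2 table. The counts below are purely hypothetical, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude odds ratio and Woolf (log-based) 95% CI for a 2x2 table:

                     exposed  unexposed
            cases       a         b
            controls    c         d
        """
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts for illustration only: 10 anti-HBc-positive and
    # 30 negative among consistent condom users, 90 and 70 among the rest.
    or_, lo, hi = odds_ratio_ci(10, 90, 30, 70)
    ```

    An OR below 1 with an interval excluding 1, as in the study's adjusted estimate of 0.34 (0.15-0.79), indicates a protective association.
    
    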

  13. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Directory of Open Access Journals (Sweden)

    Antonio Bernabe-Ortiz

    Full Text Available Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination) especially in the

  14. Joint local and global consistency on interdocument and interword relationships for co-clustering.

    Science.gov (United States)

    Bao, Bing-Kun; Min, Weiqing; Li, Teng; Xu, Changsheng

    2015-01-01

    Co-clustering has recently received a lot of attention due to its effectiveness in simultaneously partitioning words and documents by exploiting the relationships between them. However, most of the existing co-clustering methods neglect or only partially reveal the interword and interdocument relationships. To fully utilize those relationships, the local and global consistencies on both word and document spaces need to be considered. Local consistency indicates that the label of a word/document can be predicted from its neighbors, while global consistency enforces a smoothness constraint on word/document labels over the whole data manifold. In this paper, we propose a novel co-clustering method, called co-clustering via local and global consistency, to not only make use of the relationship between words and documents, but also jointly explore the local and global consistency on both word and document spaces. The proposed method has the following characteristics: 1) the word-document relationship is modeled following information-theoretic co-clustering (ITCC); 2) the local consistency on both interword and interdocument relationships is revealed by a local predictor; and 3) the global consistency on both interword and interdocument relationships is explored by a global smoothness regularization. All the fitting errors from these three components are finally integrated together to formulate an objective function, which is iteratively optimized by a provably convergent updating procedure. The extensive experiments on two benchmark document datasets validate the effectiveness of the proposed co-clustering method.
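    The local-plus-global consistency idea can be illustrated with the classic single-space label propagation update F ← αSF + (1-α)Y, where S is a similarity matrix and Y the initial labels. This is only a sketch of the smoothness principle on one space; the paper's method couples word and document spaces inside an ITCC-style objective and optimizes it differently:

    ```python
    def propagate_labels(S, Y, alpha=0.9, n_iter=200):
        """Iterative label propagation F <- alpha*S*F + (1-alpha)*Y.

        S: row-normalized similarity matrix (nested lists).
        Y: initial label matrix (one row per item, one column per class).
        The fixed point balances fitting the initial labels (global term)
        against agreeing with neighbors (local smoothness term).
        """
        n, k = len(Y), len(Y[0])
        F = [row[:] for row in Y]
        for _ in range(n_iter):
            SF = [[sum(S[i][j] * F[j][c] for j in range(n)) for c in range(k)]
                  for i in range(n)]
            F = [[alpha * SF[i][c] + (1 - alpha) * Y[i][c] for c in range(k)]
                 for i in range(n)]
        return F

    # Three items; the first two are strongly connected neighbours.
    S = [[0.0, 1.0, 0.0],
         [1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0]]
    Y = [[1.0, 0.0],   # labelled class 0
         [0.0, 0.0],   # unlabelled
         [0.0, 1.0]]   # labelled class 1
    F = propagate_labels(S, Y)
    ```

    After convergence the unlabelled middle item inherits the label of its neighbor, which is exactly the "label predicted from its neighbors" behavior the local predictor formalizes.
    
    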

  15. Consistent estimate of ocean warming, land ice melt and sea level rise from Observations

    Science.gov (United States)

    Blazquez, Alejandro; Meyssignac, Benoît; Lemoine, Jean Michel

    2016-04-01

    Based on the sea level budget closure approach, this study investigates the consistency of observed Global Mean Sea Level (GMSL) estimates from satellite altimetry, observed Ocean Thermal Expansion (OTE) estimates from in-situ hydrographic data (based on Argo for depth above 2000m and oceanic cruises below) and GRACE observations of land water storage and land ice melt for the period January 2004 to December 2014. The consistency between these datasets is a key issue if we want to constrain missing contributions to sea level rise such as the deep ocean contribution. Numerous previous studies have addressed this question by summing up the different contributions to sea level rise and comparing it to satellite altimetry observations (see for example Llovel et al. 2015, Dieng et al. 2015). Here we propose a novel approach which consists in correcting GRACE solutions over the ocean (essentially corrections of stripes and leakage from ice caps) with mass observations deduced from the difference between satellite altimetry GMSL and in-situ hydrographic data OTE estimates. We check that the resulting GRACE corrected solutions are consistent with original GRACE estimates of the geoid spherical harmonic coefficients within error bars and we compare the resulting GRACE estimates of land water storage and land ice melt with independent results from the literature. This method provides a new mass redistribution from GRACE consistent with observations from Altimetry and OTE. We test the sensibility of this method to the deep ocean contribution and the GIA models and propose best estimates.
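    The budget closure relation behind this approach is simply that total sea level change is the sum of the steric (thermal expansion) and barystatic (mass) components, so the mass term follows by subtraction. A minimal sketch, with illustrative trend values only (not the study's estimates):

    ```python
    def ocean_mass_trend(gmsl_trend, steric_trend):
        """Sea level budget closure: GMSL = steric + mass, so the
        barystatic (mass) trend is the altimetry GMSL trend minus the
        in-situ steric (OTE) trend. Units are whatever the inputs use."""
        return gmsl_trend - steric_trend

    # Illustrative numbers in mm/yr, chosen for the example.
    mass = ocean_mass_trend(gmsl_trend=3.3, steric_trend=1.1)
    ```

    In the paper this difference is what corrects the GRACE ocean mass field; any residual between the two sides of the budget constrains the unobserved deep ocean contribution.
    
    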

  16. Context-specific metabolic networks are consistent with experiments.

    Directory of Open Access Journals (Sweden)

    Scott A Becker

    2008-05-01

    Full Text Available Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data is available.
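    The core of GIMME's inconsistency score is a penalty on flux through reactions whose expression falls below a threshold. The sketch below only evaluates that score for a given flux distribution; the actual algorithm minimizes it with a linear program subject to achieving a required fraction of the metabolic objective. Reaction names, expression values, and the threshold are all hypothetical:

    ```python
    def gimme_penalties(expression, threshold):
        """GIMME-style penalty weights: reactions whose expression falls
        below the threshold are penalized in proportion to the shortfall;
        reactions at or above it carry no penalty."""
        return {rxn: max(0.0, threshold - x) for rxn, x in expression.items()}

    def inconsistency_score(penalties, fluxes):
        """Sum of penalty * |flux| over reactions - the quantity GIMME
        minimizes (here merely evaluated, not optimized)."""
        return sum(penalties[r] * abs(v) for r, v in fluxes.items())

    expression = {"R1": 12.0, "R2": 3.0, "R3": 7.5}   # hypothetical data
    penalties = gimme_penalties(expression, threshold=5.0)
    score = inconsistency_score(penalties, {"R1": 1.0, "R2": 0.5, "R3": 2.0})
    ```

    A score of zero means the flux distribution uses no lowly-expressed reactions, i.e. the expression data are fully consistent with the stated objective.
    
    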

  17. Longitudinal tDCS: Consistency across Working Memory Training Studies

    Directory of Open Access Journals (Sweden)

    Marian E. Berryhill

    2017-04-01

    Full Text Available There is great interest in enhancing and maintaining cognitive function. In recent years, advances in noninvasive brain stimulation devices, such as transcranial direct current stimulation (tDCS), have targeted working memory in particular. Despite controversy surrounding the outcomes of single-session studies, a growing body of working memory training studies incorporates multiple sessions of tDCS. It is useful to take stock of these findings because of the diversity of paradigms employed and outcomes observed between research groups. This will be important in assessing cognitive training programs paired with stimulation techniques and in identifying the more useful and less effective approaches. Here, we treat the tDCS + working memory training field as a case example, but also survey training benefits of other neuromodulatory techniques (e.g., tRNS, tACS). There are challenges associated with the broad parameter space, including: individual differences, stimulation intensity, duration, montage, session number, session spacing, training task selection, timing of follow-up testing, and near and far transfer tasks. In summary, although the field of assisted cognitive training is young, some design choices are more favorable than others. By way of heuristic, the current evidence supports including more training/tDCS sessions (5+), applying anodal tDCS targeting prefrontal regions, and including follow-up testing on trained and transfer tasks after a period of no contact. What remains unclear, but important for future translational value, is continuing work to pinpoint optimal values for the tDCS parameters on a per cognitive task basis. Importantly, the emerging literature shows notable consistency in the application of tDCS for WM across various participant populations compared to single-session experimental designs.

  18. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable...... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how...... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  19. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  20. A Critical Study of Agglomerated Multigrid Methods for Diffusion

    Science.gov (United States)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2011-01-01

    Agglomerated multigrid techniques used in unstructured-grid methods are studied critically for a model problem representative of laminar diffusion in the incompressible limit. The studied target-grid discretizations and discretizations used on agglomerated grids are typical of current node-centered formulations. Agglomerated multigrid convergence rates are presented using a range of two- and three-dimensional randomly perturbed unstructured grids for simple geometries with isotropic and stretched grids. Two agglomeration techniques are used within an overall topology-preserving agglomeration framework. The results show that multigrid with an inconsistent coarse-grid scheme using only the edge terms (also referred to in the literature as a thin-layer formulation) provides considerable speedup over single-grid methods but its convergence deteriorates on finer grids. Multigrid with a Galerkin coarse-grid discretization using piecewise-constant prolongation and a heuristic correction factor is slower and also grid-dependent. In contrast, grid-independent convergence rates are demonstrated for multigrid with consistent coarse-grid discretizations. Convergence rates of multigrid cycles are verified with quantitative analysis methods in which parts of the two-grid cycle are replaced by their idealized counterparts.
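    The Galerkin coarse-grid discretization whose grid-independent convergence the paper highlights is the triple product A_c = R A P of restriction, fine-grid operator, and prolongation. A minimal sketch on a 1D Poisson stencil with piecewise-constant prolongation (the matrices and scaling of R are illustrative choices, not the paper's agglomeration scheme):

    ```python
    def matmul(A, B):
        """Dense matrix product on nested lists."""
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]

    def galerkin_coarse_operator(R, A, P):
        """Galerkin coarse-grid discretization A_c = R * A * P. Because the
        coarse operator is built algebraically from the fine one, it stays
        consistent with the fine-grid discretization, unlike the thin-layer
        (edge-terms-only) coarse schemes discussed in the paper."""
        return matmul(matmul(R, A), P)

    # 1D Poisson stencil [-1, 2, -1] on 3 fine points, agglomerated into a
    # single coarse point: piecewise-constant prolongation, R scaled as P^T/2.
    A = [[ 2.0, -1.0,  0.0],
         [-1.0,  2.0, -1.0],
         [ 0.0, -1.0,  2.0]]
    P = [[1.0], [1.0], [1.0]]
    R = [[0.5, 0.5, 0.5]]
    Ac = galerkin_coarse_operator(R, A, P)
    ```

    With R proportional to P transposed, A_c inherits symmetry from A, one reason the Galerkin construction is a natural baseline for consistency studies.
    
    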

  1. Consistent two-dimensional visualization of protein-ligand complex series

    Directory of Open Access Journals (Sweden)

    Stierand Katrin

    2011-06-01

    Full Text Available Abstract Background: The comparative two-dimensional graphical representation of protein-ligand complex series featuring different ligands bound to the same active site offers a quick insight into their binding mode differences. In comparison to arbitrary orientations of the residue molecules in the individual complex depictions, a consistent placement improves the legibility and comparability within the series. The automatic generation of such consistent layouts offers the possibility to apply it to large data sets originating from computer-aided drug design methods. Results: We developed a new approach, which automatically generates a consistent layout of interacting residues for a given series of complexes. Based on the structural three-dimensional input information, a global two-dimensional layout for all residues of the complex ensemble is computed. The algorithm incorporates the three-dimensional adjacencies of the active site residues in order to find a universally valid circular arrangement of the residues around the ligand. Subsequent to a two-dimensional ligand superimposition step, a global placement for each residue is derived from the set of already placed ligands. The method generates high-quality layouts, showing mostly overlap-free solutions with molecules which are displayed as structure diagrams providing interaction information in atomic detail. Application examples document an improved legibility compared to series of diagrams whose layouts are calculated independently from each other. Conclusions: The presented method extends the field of complex series visualizations. A series of molecules binding to the same protein active site is drawn in a graphically consistent way. Compared to existing approaches these drawings substantially simplify the visual analysis of large compound series.

  2. The effect on dose accumulation accuracy of inverse-consistency and transitivity error reduced deformation maps

    International Nuclear Information System (INIS)

    Hardcastle, Nicholas; Bender, Edward T.; Tomé, Wolfgang A.

    2014-01-01

    It has previously been shown that deformable image registrations (DIRs) often result in deformation maps that are neither inverse-consistent nor transitive, and that the dose accumulation based on these deformation maps can be inconsistent if different image pathways are used for dose accumulation. A method presented to reduce inverse consistency and transitivity errors has been shown to result in more consistent dose accumulation, regardless of the image pathway selected for dose accumulation. The present study investigates the effect on the dose accumulation accuracy of deformation maps processed to reduce inverse consistency and transitivity errors. A set of lung 4DCT phases were analysed, consisting of four images on which a dose grid was created. Dose to 75 corresponding anatomical locations was manually tracked. Dose accumulation was performed between all image sets with Demons derived deformation maps as well as deformation maps processed to reduce inverse consistency and transitivity errors. The ground truth accumulated dose was then compared with the accumulated dose derived from DIR. Two dose accumulation image pathways were considered. The post-processing method to reduce inverse consistency and transitivity errors had minimal effect on the dose accumulation accuracy. There was a statistically significant improvement in dose accumulation accuracy for one pathway, but for the other pathway there was no statistically significant difference. A post-processing technique to reduce inverse consistency and transitivity errors has a positive, yet minimal effect on the dose accumulation accuracy. Thus the post-processing technique improves consistency of dose accumulation with minimal effect on dose accumulation accuracy.
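    Inverse consistency of a deformation pair can be checked by composing the forward and backward maps and measuring the residual against the identity. The 1D sketch below uses constant toy displacement fields and linear interpolation; real DIR fields are 3D and dense, so this only illustrates the error metric:

    ```python
    def compose(disp_a, disp_b, grid):
        """Compose two 1D displacement fields: apply disp_a at each grid
        point, then sample disp_b at the mapped (off-grid) position by
        linear interpolation (clamped at the grid ends)."""
        def interp(disp, x):
            if x <= grid[0]:
                return disp[0]
            if x >= grid[-1]:
                return disp[-1]
            i = max(j for j in range(len(grid)) if grid[j] <= x)
            t = (x - grid[i]) / (grid[i + 1] - grid[i])
            return (1 - t) * disp[i] + t * disp[i + 1]
        return [a + interp(disp_b, x + a) for x, a in zip(grid, disp_a)]

    def inverse_consistency_error(fwd, bwd, grid):
        """Maximum residual of the composition fwd then bwd against the
        identity map: zero for a perfectly inverse-consistent pair."""
        comp = compose(fwd, bwd, grid)
        return max(abs(c) for c in comp)

    # Toy fields on a 5-point grid: bwd is the exact inverse of a
    # constant shift, so the residual vanishes.
    grid = [0.0, 1.0, 2.0, 3.0, 4.0]
    fwd = [0.5] * 5           # shift right by 0.5
    bwd = [-0.5] * 5          # shift left by 0.5
    err = inverse_consistency_error(fwd, bwd, grid)
    ```

    Demons-style registrations generally leave this residual nonzero; the post-processing studied in the paper drives it (and the analogous transitivity residual over three images) toward zero.
    
    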

  3. The consistency between scientific papers presented at the Orthopaedic Trauma Association and their subsequent full-text publication.

    Science.gov (United States)

    Preston, Charles F; Bhandari, Mohit; Fulkerson, Eric; Ginat, Danial; Egol, Kenneth A; Koval, Kenneth J

    2006-02-01

    To determine the consistency of conclusions/statements made in podium presentations at the annual meeting of the Orthopaedic Trauma Association (OTA) with those in subsequent full-text publications. Also, to evaluate the nature and consistency of study design, methods, sample sizes, results and assign a corresponding level of evidence. Abstracts of the scientific programs of the OTA from 1994 to 1997 (N = 254) were queried by using the PubMed database to identify those studies resulting in a peer-reviewed, full-text publication. Of the 169 articles retrieved, 137 studies were the basis of our study after the exclusion criteria were applied: non-English language, basic science studies, anatomic dissection studies, and articles published in non-peer-reviewed journals. Information was abstracted onto a data form: first from the abstract published in the final meeting program, and then from the published journal article. Information was recorded regarding study issues, including the study design, primary objective, sample size, and statistical methods. We provided descriptive statistics about the frequency of consistent results between abstracts and full-text publications. The results were recorded as percentages and a 95% confidence interval was applied to each value. Study results were recorded for the abstract and full-text publication comparing results and the overall conclusion. A level of scientific-based evidence was assigned to each full-text publication. The final conclusion of the study remained the same 93.4% of the time. The method of study was an observational case series 52% of the time and a statement regarding the rate of patient follow-up was reported 42% of the time. Of the studies published, 18.2% consisted of a sample size smaller than the previously presented abstract. When the published papers had their level of evidence graded, 11% were level I, 16% level II, 17% level III, and 56% level IV. 
Authors' conclusions were consistent with those in full
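    The abstract states that a 95% confidence interval was applied to each percentage but not which interval. One common choice is the normal-approximation (Wald) interval, sketched here; whether the authors used this particular formula is an assumption for illustration:

    ```python
    import math

    def wald_ci(p_hat, n, z=1.96):
        """Normal-approximation (Wald) 95% CI for a proportion:
        p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n)."""
        se = math.sqrt(p_hat * (1 - p_hat) / n)
        return p_hat - z * se, p_hat + z * se

    # E.g. 93.4% of the 137 included studies had an unchanged conclusion.
    lo, hi = wald_ci(0.934, 137)
    ```

    For proportions near 0 or 1 with small n, a Wilson or exact interval would be preferable to the Wald form.
    
    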

  4. Consistency of ocular coherence tomography fast macular thickness mapping in diabetic diffuse macular edema

    International Nuclear Information System (INIS)

    Saraiva, Fabio Petersen; Costa, Patricia Grativol; Inomata, Daniela Lumi; Melo, Carlos Sergio Nascimento; Helal Junior, John; Nakashima, Yoshitaka

    2007-01-01

    Objectives: To investigate optical coherence tomography consistency on foveal thickness, foveal volume, and macular volume measurements in patients with and without diffuse diabetic macular edema. Introduction: Optical coherence tomography represents an objective technique that provides cross-sectional tomographs of retinal structure in vivo. However, it is expected that poor fixation ability, as seen in diabetic macular edema, could alter its results. Several authors have discussed the reproducibility of optical coherence tomography, but only a few have addressed the topic with respect to diabetic maculopathy. Methods: The study recruited diabetic patients without clinically evident retinopathy (control group) and with diffuse macular edema (case group). Only one eye of each patient was evaluated. Five consecutive fast macular scans were taken using Ocular Coherence Tomography 3; the 6 mm macular map was chosen. The consistency in measurements of foveal thickness, foveal volume, and total macular volume for both groups was evaluated using Pearson's coefficient of variation. The T-test for independent samples was used in order to compare measurements of both groups. Results: Each group consisted of 20 patients. All measurements had a coefficient of variation less than 10%. The most consistent parameter for both groups was the total macular volume. Discussion: Consistency in measurement is a mainstay of any test. A test is unreliable if its measurements cannot be correctly repeated. We found a good index of consistency, even considering patients with an unstable gaze. Conclusions: Optical coherence tomography is a consistent method for diabetic subjects with diffuse macular edema. (author)
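    Pearson's coefficient of variation used above is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch with hypothetical repeated foveal-thickness readings (not the study's data):

    ```python
    import statistics

    def coefficient_of_variation(values):
        """Pearson's coefficient of variation: sample standard deviation
        as a percentage of the mean - the repeatability index applied to
        the five consecutive scans in the study."""
        return statistics.stdev(values) / statistics.mean(values) * 100.0

    # Hypothetical five repeated foveal-thickness readings (micrometres).
    scans = [252.0, 249.0, 255.0, 251.0, 253.0]
    cv = coefficient_of_variation(scans)
    ```

    A CV below the study's 10% bound indicates that repeated scans agree to within a small fraction of the measured value.
    
    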

  5. Consistency of ocular coherence tomography fast macular thickness mapping in diabetic diffuse macular edema

    Energy Technology Data Exchange (ETDEWEB)

    Saraiva, Fabio Petersen; Costa, Patricia Grativol; Inomata, Daniela Lumi; Melo, Carlos Sergio Nascimento; Helal Junior, John; Nakashima, Yoshitaka [Universidade de Sao Paulo (USP), SP (Brazil). Hospital das Clinicas. Dept. de Oftalmologia]. E-mail: fabiopetersen@yahoo.com.br

    2007-07-01

    Objectives: To investigate the consistency of optical coherence tomography measurements of foveal thickness, foveal volume, and macular volume in patients with and without diffuse diabetic macular edema. Introduction: Optical coherence tomography represents an objective technique that provides cross-sectional tomographs of retinal structure in vivo. However, it is expected that poor fixation ability, as seen in diabetic macular edema, could alter its results. Several authors have discussed the reproducibility of optical coherence tomography, but only a few have addressed the topic with respect to diabetic maculopathy. Methods: The study recruited diabetic patients without clinically evident retinopathy (control group) and with diffuse macular edema (case group). Only one eye of each patient was evaluated. Five consecutive fast macular scans were taken using Ocular Coherence Tomography 3; the 6 mm macular map was chosen. The consistency of measurements of foveal thickness, foveal volume, and total macular volume for both groups was evaluated using Pearson's coefficient of variation. The t-test for independent samples was used to compare measurements of both groups. Results: Each group consisted of 20 patients. All measurements had a coefficient of variation of less than 10%. The most consistent parameter for both groups was the total macular volume. Discussion: Consistency in measurement is a mainstay of any test. A test is unreliable if its measurements cannot be correctly repeated. We found a good index of consistency, even considering patients with an unstable gaze. Conclusions: Optical coherence tomography is a consistent method for diabetic subjects with diffuse macular edema. (author)

  6. Semi-discrete approximations to nonlinear systems of conservation laws; consistency and L(infinity)-stability imply convergence. Final report

    International Nuclear Information System (INIS)

    Tadmor, E.

    1988-07-01

    A convergence theory for semi-discrete approximations to nonlinear systems of conservation laws is developed. It is shown, by a series of scalar counter-examples, that consistency with the conservation law alone does not guarantee convergence. Instead, a notion of consistency which takes into account both the conservation law and its augmenting entropy condition is introduced. In this context it is concluded that consistency and L(infinity)-stability guarantee, for a relevant class of admissible entropy functions, that their entropy production rate belongs to a compact subset of H^(-1)_loc(x,t). One can now use compensated compactness arguments in order to turn this conclusion into a convergence proof. The current state of the art for these arguments includes the scalar case and a wide class of 2 x 2 systems of conservation laws. The general framework of the vanishing viscosity method is studied as an effective way to meet the consistency and L(infinity)-stability requirements. How this method is utilized to enforce consistency and stability for scalar conservation laws is shown. In this context we prove, under the appropriate assumptions, the convergence of finite difference approximations (e.g., the high resolution TVD and UNO methods), finite element approximations (e.g., the Streamline-Diffusion methods) and spectral and pseudospectral approximations (e.g., the Spectral Viscosity methods)
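
The class of schemes covered by this theory can be illustrated with a standard semi-discrete conservative discretization. The sketch below applies a local Lax-Friedrichs numerical flux to the inviscid Burgers equation u_t + (u^2/2)_x = 0 with forward-Euler time stepping; it is a generic textbook instance of the framework, not the report's own construction. The conservative form keeps total mass constant up to round-off, while the added numerical viscosity is what supplies the L(infinity)-stability and entropy dissipation discussed in the abstract:

```python
import numpy as np

def semi_discrete_rhs(u, dx, f=lambda u: 0.5 * u**2):
    """Semi-discrete conservative update du_i/dt = -(F_{i+1/2} - F_{i-1/2})/dx
    with the Lax-Friedrichs numerical flux for the Burgers flux f(u) = u^2/2.
    Periodic boundary conditions via np.roll."""
    up = np.roll(u, -1)            # u_{i+1}
    alpha = np.max(np.abs(u))      # max wave speed |f'(u)| = |u|
    # numerical flux at interface i+1/2: central part + dissipation
    F = 0.5 * (f(u) + f(up)) - 0.5 * alpha * (up - u)
    return -(F - np.roll(F, 1)) / dx

# Forward-Euler time stepping on a periodic sine profile
N = 200
dx = 2 * np.pi / N
dt = 0.005                         # CFL number alpha*dt/dx ~ 0.16 < 1
x = np.arange(N) * dx
u = np.sin(x)
mass0 = u.sum() * dx
for _ in range(200):
    u = u + dt * semi_discrete_rhs(u, dx)
# Conservation: the telescoping flux differences preserve total mass
print("mass drift:", abs(u.sum() * dx - mass0))
print("max |u|:", float(np.max(np.abs(u))))
```

The dissipative term `-0.5 * alpha * (up - u)` is the numerical analogue of the vanishing viscosity discussed in the report.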

  7. Design study of fuel circulating system using Pd-alloy membrane isotope separation method

    International Nuclear Information System (INIS)

    Naito, T.; Yamada, T.; Yamanaka, T.; Aizawa, T.; Kasahara, T.; Nishikawa, M.; Asami, N.

    1980-01-01

    A design study of the fuel circulating system (FCS) for a tokamak experimental fusion reactor (JXFR) has been carried out to establish the system concept, to plan the development program, and to evaluate the feasibility of the diffusion system. The FCS consists of a main vacuum system, fuel gas refiners, isotope separators, fuel feeders, and auxiliary systems. In the system design, the Pd-alloy membrane permeation method is adopted for fuel refining and isotope separation. All impurities are effectively removed and hydrogen isotopes are sufficiently separated by the Pd-alloy membrane. The isotope separation system consists of a 1st cascade (47 separators) for removing protium and a 2nd cascade (46 separators) for separating deuterium. In the FCS, while the cryogenic distillation method appears to be practicable, the Pd-alloy membrane diffusion method is attractive for isotope separation and refining of the fuel gas. The choice will have to be based on reliability, economic, and safety analyses

  8. Study of Experiment on Rock-like Material Consist of fly-ash, Cement and Mortar

    Science.gov (United States)

    Nan, Qin; Hongwei, Wang; Yongyan, Wang

    2018-03-01

    We studied the uniaxial compression behavior of a rock-like material consisting of fly ash, cement, and mortar by varying the sand-cement ratio, the fly-ash replacement ratio, the grain diameter, the water-binder ratio, and the height-diameter ratio. From these tests we obtained the laws and quantitative relations by which the factors above govern the material's uniaxial compression characteristics. The effects can be summarized as follows: the specimens' uniaxial compressive strength and elastic modulus tend to decrease with increasing sand-cement ratio, fly-ash replacement ratio, and water-binder ratio, following a power-function relation. As the height-diameter ratio increases, the uniaxial compressive strength and elastic modulus decrease, following an inverse-function curve; the tensile strength of the specimens decreases gradually with increasing fly-ash content. The uniaxial compression failure mode of the specimens is consistent with the common failure patterns of real rock.

  9. Gene-Environment Interplay in Internalizing Disorders: Consistent Findings across Six Environmental Risk Factors

    Science.gov (United States)

    Hicks, Brian M.; DiRago, Ana C.; Iacono, William G.; McGue, Matt

    2009-01-01

    Background Newer behavior genetic methods can better elucidate gene-environment (G-E) interplay in the development of internalizing (INT) disorders (i.e., major depression and anxiety disorders). However, no study to date has conducted a comprehensive analysis examining multiple environmental risks with the purpose of delineating how general G-E mechanisms influence the development of INT disorders. Methods The sample consisted of 1315 male and female twin pairs participating in the age 17 assessment of the Minnesota Twin Family Study. Quantitative G-E interplay models were used to examine how genetic and environmental risk for INT disorders changes as a function of environmental context. Multiple measures and informants were employed to construct composite measures of INT disorders and 6 environmental risk factors including: stressful life events, mother-child and father-child relationship problems, antisocial and prosocial peer affiliation, and academic achievement and engagement. Results Significant moderation effects were detected between each environmental risk factor and INT such that in the context of greater environmental adversity, nonshared environmental factors became more important in the etiology of INT symptoms. Conclusion Our results are consistent with the interpretation that environmental stressors have a causative effect on the emergence of INT disorders. The consistency of our results suggests a general mechanism of environmental influence on INT disorders regardless of the specific form of environmental risk. PMID:19594836

  10. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  11. Short-Cut Estimators of Criterion-Referenced Test Consistency.

    Science.gov (United States)

    Brown, James Dean

    1990-01-01

    Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…

  12. Self-consistent approach for neutral community models with speciation

    Science.gov (United States)

    Haegeman, Bart; Etienne, Rampal S.

    2010-03-01

    Hubbell’s neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event in the birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical data of distributions of species’ abundances surprisingly well. More realistic speciation models have been proposed, such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.
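
The basic point-mutation model that this paper generalizes is straightforward to simulate directly. Below is a sketch of Hubbell-style zero-sum neutral dynamics (all parameter values illustrative): at each step one individual dies and is replaced either by the offspring of a randomly chosen individual or, with speciation probability nu, by the founder of a brand-new species:

```python
import random
from collections import Counter

def neutral_community(J=200, nu=0.05, steps=20000, seed=1):
    """Zero-sum neutral dynamics with point-mutation speciation:
    one random individual dies per step and is replaced either by a
    copy of a randomly chosen individual (prob 1 - nu) or by a new
    species (prob nu). Returns the species abundance distribution."""
    rng = random.Random(seed)
    community = [0] * J          # every individual starts as species 0
    next_species = 1
    for _ in range(steps):
        dead = rng.randrange(J)
        if rng.random() < nu:                        # speciation event
            community[dead] = next_species
            next_species += 1
        else:                                        # birth of an existing lineage
            community[dead] = community[rng.randrange(J)]
    return Counter(community)

abundances = neutral_community()
print("species richness:", len(abundances))
print("most abundant species:", abundances.most_common(1))
```

Running many replicates and histogramming `abundances.values()` gives the stationary species abundance distribution that the paper's self-consistent formulas approximate analytically.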

  13. Two new integrable couplings of the soliton hierarchies with self-consistent sources

    International Nuclear Information System (INIS)

    Tie-Cheng, Xia

    2010-01-01

    A kind of integrable coupling of soliton equation hierarchies with self-consistent sources associated with s-tilde l(4) has been presented (Yu F J and Li L 2009 Appl. Math. Comput. 207 171; Yu F J 2008 Phys. Lett. A 372 6613). Based on this method, we construct two integrable couplings of the soliton hierarchy with self-consistent sources by using the loop algebra s-tilde l(4). In this paper, we also point out some errors in these references, correct them, and set up new formulas. The method can be generalized to other soliton hierarchies with self-consistent sources. (general)

  14. Women and postfertilization effects of birth control: consistency of beliefs, intentions and reported use

    Directory of Open Access Journals (Sweden)

    Kim Han S

    2005-11-01

    Full Text Available Abstract Background This study assesses the consistency of responses among women regarding their beliefs about the mechanisms of actions of birth control methods, beliefs about when human life begins, the intention to use or not use birth control methods that they believe may act after fertilization or implantation, and their reported use of specific methods. Methods A questionnaire was administered in family practice and obstetrics and gynecology clinics in Salt Lake City, Utah, and Tulsa, Oklahoma. Participants included women ages 18–50 presenting for any reason and women under age 18 presenting for family planning or pregnancy care. Analyses were based on key questions addressing beliefs about whether specific birth control methods may act after fertilization, beliefs about when human life begins, intention to use a method that may act after fertilization, and reported use of specific methods. The questionnaire contained no information about the mechanism of action of any method of birth control. Responses were considered inconsistent if actual use contradicted intentions, if one intention contradicted another, or if intentions contradicted beliefs. Results Of all respondents, 38% gave consistent responses about intention to not use or to stop use of any birth control method that acted after fertilization, while 4% gave inconsistent responses. The corresponding percentages for birth control methods that work after implantation were 64% consistent and 2% inconsistent. Of all respondents, 34% reported they believed that life begins at fertilization and would not use any birth control method that acts after fertilization (a consistent response), while 3% reported they believed that life begins at fertilization but would use a birth control method that acts after fertilization (inconsistent). For specific methods of birth control, less than 1% of women gave inconsistent responses. A majority of women (68% or greater) responded accurately about the

  15. Association between consistent purchase of anticonvulsants or lithium and suicide risk: a longitudinal cohort study from Denmark, 1995-2001.

    Science.gov (United States)

    Smith, Eric G; Søndergård, Lars; Lopez, Ana Garcia; Andersen, Per Kragh; Kessing, Lars Vedel

    2009-10-01

    Prior studies suggest anticonvulsant purchasers may be at greater risk of suicide than lithium purchasers. Longitudinal, retrospective cohort study of all individuals in Denmark purchasing anticonvulsants (valproic acid, carbamazepine, oxcarbazepine or lamotrigine) (n=9952) or lithium (n=6693) from 1995-2001 who also purchased antipsychotics at least once (to select out nonpsychiatric anticonvulsant use). Poisson regression of suicides by medication purchased (anticonvulsants or lithium) was conducted, controlling for age, sex, and calendar year. Confounding by indication was addressed by restricting the comparison to individuals prescribed the same medication: individuals with minimal medication exposure (e.g., who purchased only a single prescription of anticonvulsants) were compared to those individuals with more consistent medication exposure (i.e., purchasing > or = 6 prescriptions of anticonvulsants). Demographics and frequency of anticonvulsant, lithium, or antipsychotic use were similar between lithium and anticonvulsant purchasers. Among patients who also purchased an antipsychotic at least once during the study period, purchasing anticonvulsants more consistently (> or = 6 prescriptions) was associated with a substantial reduction in the risk of suicide (RR=0.22, 95% CI=0.11-0.42). Suicide risks of consistent anticonvulsant and consistent lithium purchasers were similar. Lack of information about diagnoses and potential confounders, as well as other covariates that may differ between minimal and consistent medication purchasers, are limitations of this study. In this longitudinal study of anticonvulsant purchasers likely to have psychiatric disorders, consistent anticonvulsant treatment was associated with decreased risk of completed suicide.
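
The headline result is an adjusted rate ratio from Poisson regression. As a rough illustration of how such a ratio and its confidence interval are formed, the sketch below computes a crude (unadjusted) incidence-rate ratio with a log-scale 95% CI; the event counts and person-time figures are invented for illustration and are not the study's data:

```python
import math

def rate_ratio(events_a, persontime_a, events_b, persontime_b):
    """Crude incidence-rate ratio with a 95% CI formed on the log scale
    (no covariate adjustment -- the study itself used Poisson regression
    adjusted for age, sex, and calendar year)."""
    rr = (events_a / persontime_a) / (events_b / persontime_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)   # SE of log(RR)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: suicides / person-years among consistent (>= 6
# prescriptions) vs minimal (single prescription) purchasers
rr, lo, hi = rate_ratio(10, 5000.0, 25, 2750.0)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A value of RR well below 1 with a CI excluding 1 is what supports the "substantial reduction in risk" conclusion above.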

  16. Context-dependent individual behavioral consistency in Daphnia

    DEFF Research Database (Denmark)

    Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe

    2017-01-01

    The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals......, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...

  17. Personality and Situation Predictors of Consistent Eating Patterns.

    Science.gov (United States)

    Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K

    2015-01-01

    A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  18. Personality and Situation Predictors of Consistent Eating Patterns.

    Directory of Open Access Journals (Sweden)

    Uku Vainik

    Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.

  19. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult....... This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its...... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  20. Analysis Method of Transfer Pricing Used by Multinational Companies Related to Tax Avoidance and its Consistencies to the Arm's Length Principle

    Directory of Open Access Journals (Sweden)

    Nuraini Sari

    2015-12-01

    Full Text Available The purpose of this study is to evaluate how Starbucks Corporation uses transfer pricing to minimize its tax bill. In addition, the study also evaluates how Indonesia's domestic rules could handle the case if the Starbucks UK case happened in Indonesia. Three steps were conducted in this study. First, using information provided by UK Her Majesty's Revenue and Customs (HMRC) and other related articles, identify the methods used by Starbucks UK to minimize the tax bill. Second, establish the Organisation for Economic Co-Operation and Development (OECD) viewpoint regarding the Starbucks Corporation case. Third, analyze how Indonesia's transfer pricing rules would work if Starbucks UK's case happened in Indonesia. The results showed that there were three inter-company transactions that helped Starbucks UK to minimize the tax bill: coffee costs, royalties on intangible property, and interest on inter-company loans. Through a study of the OECD's BEPS action plans, it is recommended to improve the OECD Model Tax Convention, including Indonesia's domestic tax rules, in order to produce fair and transparent judgments on transfer pricing. This study concluded that under current tax rules, although UK HMRC has been disadvantaged by the transfer pricing practices of many multinational companies, it still cannot prove that those practices are inconsistent with the arm's length principle. Therefore, current international tax rules need to be improved.

  1. New geometric design consistency model based on operating speed profiles for road safety evaluation.

    Science.gov (United States)

    Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo

    2013-12-01

    To assist in the on-going effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which will be a surrogate measure of the safety level of the two-lane rural road segment. The consistency model presented in this paper is based on the consideration of continuous operating speed profiles. The models used for their construction were obtained by using an innovative GPS-data collection method that is based on continuous operating speed profiles recorded from individual drivers. This new methodology allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation to the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measurements based on the global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also some indexes that consider both local speed decelerations and speeds over posted speeds as well. For the development of the consistency model, the crash frequency for each study site was considered, which allowed estimating the number of crashes on a road segment by means of the calculation of its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment. Copyright © 2012 Elsevier Ltd. All rights reserved.
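
Consistency indexes of the kind the model combines (global dispersion of the operating speed, local decelerations, exceedance of the posted speed) can be read directly off an operating speed profile. A minimal sketch with a hypothetical GPS-derived profile; the indexes, thresholds, and data here are illustrative, not the paper's fitted model:

```python
import statistics

def consistency_indexes(profile_kmh, posted_kmh):
    """Three simple indicators computable from a continuous operating
    speed profile (illustrative only):
    - global dispersion: population SD of the operating speed
    - local decelerations: largest drop between successive points
    - exceedance: fraction of points above the posted speed."""
    sigma = statistics.pstdev(profile_kmh)
    max_decel = max(
        (a - b for a, b in zip(profile_kmh, profile_kmh[1:])), default=0.0
    )
    over = sum(v > posted_kmh for v in profile_kmh) / len(profile_kmh)
    return sigma, max_decel, over

# Hypothetical operating speeds (km/h) along a segment posted at 90 km/h
profile = [92, 95, 97, 88, 76, 74, 80, 91, 96, 94]
sigma, max_decel, over = consistency_indexes(profile, 90)
print(f"sigma = {sigma:.1f} km/h, max drop = {max_decel} km/h, "
      f"over posted = {over:.0%}")
```

In the paper's approach, indexes like these are then regressed against observed crash frequency to turn geometric design consistency into a surrogate safety estimate.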

  2. Smoothing of Fused Spectral Consistent Satellite Images with TV-based Edge Detection

    DEFF Research Database (Denmark)

    Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli

    2007-01-01

    based on satellite data. Additionally, most conventional methods are loosely connected to the image forming physics of the satellite image, giving these methods an ad hoc feel. Vesteinsson et al. [1] proposed a method of fusion of satellite images that is based on the properties of imaging physics...... in a statistically meaningful way and was called spectral consistent pansharpening (SCP). In this paper we improve this framework for satellite image fusion by introducing a better image prior, via data-dependent image smoothing. The dependency is obtained via a total variation edge detection method.......Several widely used methods have been proposed for fusing high resolution panchromatic data and lower resolution multi-channel data. However, many of these methods fail to maintain the spectral consistency of the fused high resolution image, which is of high importance to many of the applications

  3. Thin composite films consisting of polypyrrole and polyparaphenylene

    International Nuclear Information System (INIS)

    Golovtsov, I.; Bereznev, S.; Traksmaa, R.; Opik, A.

    2007-01-01

    This study demonstrates that the combined method for the formation of thin composite films, consisting of polypyrrole (PPy) as a film forming agent and polyparaphenylene (PPP) with controlled electrical properties and high stability, enables one to avoid the low processability of PPP and to extend the possibilities for the development of electronic devices. The high temperature (250-600 deg. C) doping method was used for PPP preparation. The crystallinity and grindability of PPP was found to be increasing with the thermochemical modification. Thin composite films were prepared onto the light transparent substrates using the simple electropolymerization technique. The properties of films were characterized by the optical transmittance and temperature-dependent conductivity measurements. The morphology and thickness of the prepared films were determined using the scanning electron microscopy. The composite films showed a better adhesion to an inorganic substrate. It was found to be connected mostly with the improved properties of the high temperature doped PPP. The current-voltage characteristics of indium tin oxide/film/Au hybrid organic-inorganic structures showed the influence of the doping conditions of PPP inclusions in the obtained films

  4. Accuracy and Consistency of Grass Pollen Identification by Human Analysts Using Electron Micrographs of Surface Ornamentation

    Directory of Open Access Journals (Sweden)

    Luke Mander

    2014-08-01

    Full Text Available Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in computational identifications. We have measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced considerably different identification schemes. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias.
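
The per-analyst statistics this study reports (coverage, accuracy, and consistency on duplicate image pairs) are simple proportions. A sketch with hypothetical identifications; the species names, answers, and duplicate pairs below are invented for illustration:

```python
def analyst_metrics(identifications, truth):
    """Coverage (fraction of images given any identification) and
    accuracy (fraction of answered images identified correctly).
    `None` marks an image the analyst declined to identify."""
    answered = [i for i, label in enumerate(identifications) if label is not None]
    coverage = len(answered) / len(identifications)
    accuracy = sum(identifications[i] == truth[i] for i in answered) / len(answered)
    return coverage, accuracy

def duplicate_consistency(first_pass, second_pass):
    """Fraction of duplicate image pairs given the same species twice."""
    same = sum(a == b for a, b in zip(first_pass, second_pass))
    return same / len(first_pass)

# Hypothetical ground truth and one analyst's answers for six images
truth = ["Poa", "Poa", "Zea", "Avena", "Zea", "Poa"]
answers = ["Poa", None, "Zea", "Zea", "Zea", "Avena"]
cov, acc = analyst_metrics(answers, truth)
# The same three images shown twice, unbeknownst to the analyst
cons = duplicate_consistency(["Poa", "Zea", "Avena"], ["Poa", "Zea", "Zea"])
print(f"coverage = {cov:.0%}, accuracy = {acc:.0%}, consistency = {cons:.0%}")
```

These are the same quantities the computational identification methods are benchmarked against.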

  5. Criteria for the generation of spectra consistent time histories

    International Nuclear Information System (INIS)

    Lin, C.-W.

    1977-01-01

    Several methods are available for the seismic analysis of nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs that involve structural or material nonlinearities, such as frequency-dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: 1) soil-structure interaction studies and 2) coupled reactor coolant system and building analyses, performed either to generate floor response spectra or to compute nonlinear system time history response. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported on the effect of the dynamic characteristics of the time history input upon the system response; in fact, the only available discussion in this respect concerns the statistical independence of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories that are conservative and, more importantly, realistic. (Auth.)

  6. Approximate self-consistent potentials for density-functional-theory exchange-correlation functionals

    International Nuclear Information System (INIS)

    Cafiero, Mauricio; Gonzalez, Carlos

    2005-01-01

    We show that potentials for exchange-correlation functionals within the Kohn-Sham density-functional-theory framework may be written as potentials for simpler functionals multiplied by a factor close to unity, and in a self-consistent field calculation, these effective potentials find the correct self-consistent solutions. This simple theory is demonstrated with self-consistent exchange-only calculations of the atomization energies of some small molecules using the Perdew-Kurth-Zupan-Blaha (PKZB) meta-generalized-gradient-approximation (meta-GGA) exchange functional. The atomization energies obtained with our method agree with or surpass previous meta-GGA calculations performed in a non-self-consistent manner. The results of this work suggest the utility of this simple theory to approximate exchange-correlation potentials corresponding to energy functionals too complicated to generate closed forms for their potentials. We hope that this method will encourage the development of complex functionals which have correct boundary conditions and are free of self-interaction errors without the worry that the functionals are too complex to differentiate to obtain potentials

  7. The Use of Qualitative Case Studies as an Experiential Teaching Method in the Training of Pre-Service Teachers

    Science.gov (United States)

    Arseven, Ilhami

    2018-01-01

    This study presents the suitability of the case study, a qualitative research method that can be used as a teaching method in the training of pre-service teachers, for experiential learning theory. The basic view of experiential learning theory on learning and the qualitative case study paradigm are consistent with each other within the…

  8. Convergence of quasiparticle self-consistent GW calculations of transition metal monoxides

    OpenAIRE

    Das, Suvadip; Coulter, John E.; Manousakis, Efstratios

    2014-01-01

    Finding an accurate ab initio approach for calculating the electronic properties of transition metal oxides has been a problem for several decades. In this paper, we investigate the electronic structure of the transition metal monoxides MnO, CoO, and NiO in their undistorted rock-salt structure within a fully iterated quasiparticle self-consistent GW (QPscGW) scheme. We study the convergence of the QPscGW method, i.e., how the quasiparticle energy eigenvalues and wavefunctions converge as a f...

  9. Consistent method of truncating the electron self-energy in nonperturbative QED

    International Nuclear Information System (INIS)

    Rembiesa, P.

    1986-01-01

    A nonperturbative method of solving the Dyson-Schwinger equations for the fermion propagator is considered. The solution satisfies the Ward-Takahashi identity, allows multiplicative regularization, and exhibits a physical-mass pole

  10. Flexibility of Gender Stereotypes: Italian Study on Comparative Gender-consistent and Gender-inconsistent Information

    Directory of Open Access Journals (Sweden)

    Elisabetta Sagone

    2018-05-01

    Full Text Available The topic of this study is flexibility in gender stereotyping linked to the attribution of toys, socio-cognitive traits, and occupations in 160 Italian children aged 6 to 12 years. We used the Gender Toys Choice, the Gender Traits Choice, and the Gender Jobs Choice, a selected set of colored cards containing masculine and feminine stimuli to assign to a male, female, or both male and female silhouette (the flexible-choice technique). In order to verify the change of flexibility in gender stereotyping, we made use of four cartoon stories with male and female characters with typical or atypical traits and performing gender-consistent or gender-inconsistent activities. Results indicated that exposure to cartoon stories with gender-inconsistent information, rather than cartoon stories with gender-consistent information, increased flexibility in gender stereotyping, showing age differences in favor of children aged 11-12. Implications in relation to the developmental-constructivist approach were noted.

  11. An RTS-based method for direct and consistent calculation of intermittent peak cooling loads

    International Nuclear Information System (INIS)

    Chen Tingyao; Cui, Mingxian

    2010-01-01

    The RTS method currently recommended by the ASHRAE Handbook is based on continuous operation. However, most, if not all, air-conditioning systems in commercial buildings are operated intermittently in practice. The application of the current RTS method to intermittent air-conditioning in nonresidential buildings could result in largely underestimated design cooling loads and inconsistently sized air-conditioning systems. Improperly sized systems could seriously deteriorate the performance of system operation and management. Therefore, a new method based on both the current RTS method and the principles of heat transfer has been developed. The first part of the new method is the same as the current RTS method in principle, but its calculation procedure is simplified by the derived equations in closed form. The technical data available in the current RTS method can be utilized to compute zone responses to a change in space air temperature, so no effort is needed to regenerate new technical data. Both the overall RTS coefficients and the hourly cooling loads computed in the first part are used to estimate the additional peak cooling load due to a change from continuous to intermittent operation. Only one more step after the current RTS method is needed to determine the intermittent peak cooling load. The new RTS-based method has been validated against EnergyPlus simulations. The root mean square deviation (RMSD) between the relative additional peak cooling loads (RAPCLs) computed by the two methods is 1.8%. The deviation of the RAPCL varies from -3.0% to 5.0%, and the mean deviation is 1.35%.

  12. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
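
    The thermodynamic constraints described above are, in log space, linear relations among the rate constants (Wegscheider cycle conditions), which is what makes the calibration a tractable constrained optimization problem. A minimal numpy sketch of the core idea, using a hypothetical three-reaction cycle with invented rate constants (not the EGF/ERK model from the paper):

```python
import numpy as np

# Hypothetical three-reaction cycle A<->B<->C<->A with forward/reverse rate constants.
kf = np.array([2.0, 5.0, 1.0])
kr = np.array([1.0, 2.0, 4.0])

# Thermodynamic (Wegscheider) condition for the cycle: prod(kf / kr) == 1,
# i.e. sum(log kf) - sum(log kr) == 0 -- a linear constraint in log space.
theta = np.log(np.concatenate([kf, kr]))
c = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])    # constraint row: c @ theta = 0
residual = c @ theta                               # how far the fit is from feasibility
theta_feasible = theta - c * residual / (c @ c)    # least-squares projection onto c @ theta = 0
kf2, kr2 = np.exp(theta_feasible[:3]), np.exp(theta_feasible[3:])
print(np.prod(kf2 / kr2))   # ~1.0: constraint now satisfied
```

    The full TCMC method solves a constrained optimization against data rather than a bare projection, but the feasible set it searches is exactly the one this linear condition defines.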

  13. Self-consistent equilibria in the pulsar magnetosphere

    International Nuclear Information System (INIS)

    Endean, V.G.

    1976-01-01

    For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating-frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity-of-light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity-of-light cylinder, but they cannot be accelerated to very high energies near the star. (author)

  14. Evaluation of the quality consistency of powdered poppy capsule extractive by an averagely linear-quantified fingerprint method in combination with antioxidant activities and two compounds analyses.

    Science.gov (United States)

    Zhang, Yujing; Sun, Guoxiang; Hou, Zhifei; Yan, Bo; Zhang, Jing

    2017-12-01

    A novel averagely linear-quantified fingerprint method was proposed and successfully applied to monitor the quality consistency of alkaloids in powdered poppy capsule extractive. The averagely linear-quantified fingerprint method provides accurate qualitative and quantitative similarities for chromatographic fingerprints of Chinese herbal medicines. The stability and operability of the method were verified by the parameter r. The average linear qualitative similarity SL (improved from the conventional qualitative "Similarity") was used as the qualitative criterion in the averagely linear-quantified fingerprint method, and the average linear quantitative similarity PL was introduced as the quantitative one. PL was able to identify differences in the content of all the chemical components. In addition, PL was found to be highly correlated with the contents of two alkaloid compounds (morphine and codeine). A simple flow injection analysis was developed for the determination of antioxidant capacity in Chinese herbal medicines, based on the scavenging of the 2,2-diphenyl-1-picrylhydrazyl radical by antioxidants. The fingerprint-efficacy relationship linking chromatographic fingerprints and antioxidant activities was investigated using the orthogonal projection to latent structures method, which provided important pharmacodynamic information for Chinese herbal medicine quality control. In summary, quantitative fingerprinting based on the averagely linear-quantified fingerprint method can be applied to monitor the quality consistency of Chinese herbal medicines, and the constructed orthogonal projection to latent structures model is particularly suitable for investigating the fingerprint-efficacy relationship. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
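
    The distinction the abstract draws between qualitative (shape) and quantitative (content) similarity can be illustrated with two simple surrogate measures. These are not the authors' SL and PL formulas, just a sketch of the underlying idea on an invented five-peak fingerprint:

```python
import numpy as np

def shape_similarity(f, ref):
    """Qualitative (shape-only) similarity: cosine of the two profiles."""
    return float(f @ ref / (np.linalg.norm(f) * np.linalg.norm(ref)))

def content_ratio(f, ref):
    """Quantitative (content) measure: total signal relative to the reference."""
    return float(f.sum() / ref.sum())

ref = np.array([1.0, 4.0, 2.0, 7.0, 3.0])   # reference fingerprint (invented peak areas)
sample = 0.8 * ref                          # identical shape, only 80% of the content
print(shape_similarity(sample, ref))        # ~1.0: a shape measure misses the dilution
print(content_ratio(sample, ref))           # ~0.8: a content measure catches it
```

    A diluted sample scores perfectly on shape alone, which is why the method pairs the qualitative criterion SL with the quantitative criterion PL.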

  15. Modifications of imaging spectroscopy methods for increased spatial and temporal consistency: A case study of change in leafy spurge distribution between 1999 and 2001 in Theodore Roosevelt National Park, North Dakota

    Science.gov (United States)

    Dudek, Kathleen Burke

    The noxious weed leafy spurge (Euphorbia esula L.) has spread throughout the northern Great Plains of North America since it was introduced in the early 1800s, and it is currently a significant management concern. Accurate, rapid location and repeatable measurements are critical for successful temporal monitoring of infestations. Imaging spectroscopy is well suited for identification of spurge; however, the development and dissemination of standardized hyperspectral mapping procedures that produce consistent multi-temporal maps has been absent. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data, collected in 1999 and 2001 over Theodore Roosevelt National Park, North Dakota, were used to locate leafy spurge. Published image-processing methods were tested to determine the most successful for consistent maps. Best results were obtained using: (1) NDVI masking; (2) cross-track illumination correction; (3) image-derived spectral libraries; and (4) mixture-tuned matched filtering algorithm. Application of the algorithm was modified to standardize processing and eliminate threshold decisions; the image-derived library was refined to eliminate additional variability. Primary (spurge dominant), secondary (spurge non-dominant), abundance, and area-wide vegetation maps were produced. Map accuracies were analyzed with point, polygon, and grid reference sets, using confusion matrices and regression between field-measured and image-derived abundances. Accuracies were recalculated after applying a majority filter, and buffers ranging from 1-5 pixels wide around classified pixels, to accommodate poor reference-image alignment. Overall accuracy varied from 39% to 82%, however, regression analyses yielded r2 = 0.725, indicating a strong relationship between field and image-derived densities. Accuracy was sensitive to: (1) registration offsets between field and image locations; (2) modification of analytical methods; and (3) reference data quality. Sensor viewing angle

  16. The Internal Consistency and Validity of the Vaccination Attitudes Examination Scale: A Replication Study.

    Science.gov (United States)

    Wood, Louise; Smith, Michael; Miller, Christopher B; O'Carroll, Ronan E

    2018-06-19

    Vaccinations are important preventative health behaviors. The recently developed Vaccination Attitudes Examination (VAX) Scale aims to measure the reasons behind refusal/hesitancy regarding vaccinations. The aim of this replication study is to conduct an independent test of the newly developed VAX Scale in the UK. We tested (a) internal consistency (Cronbach's α); (b) convergent validity by assessing its relationships with beliefs about medication, medical mistrust, and perceived sensitivity to medicines; and (c) construct validity by testing how well the VAX Scale discriminated between vaccinators and nonvaccinators. A sample of 243 UK adults completed the VAX Scale, the Beliefs About Medicines Questionnaire, the Perceived Sensitivity to Medicines Scale, and the Medical Mistrust Index, in addition to demographics of age, gender, education levels, and social deprivation. Participants were asked (a) whether they received an influenza vaccination in the past year and (b) if they had a young child, whether they had vaccinated the young child against influenza in the past year. The VAX (a) demonstrated high internal consistency (α = .92); (b) was positively correlated with medical mistrust and beliefs about medicines, and less strongly correlated with perceived sensitivity to medicines; and (c) successfully differentiated parental influenza vaccinators from nonvaccinators. The VAX demonstrated good internal consistency, convergent validity, and construct validity in an independent UK sample. It appears to be a useful measure to help us understand the health beliefs that promote or deter vaccination behavior.
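
    The internal consistency figure reported above (Cronbach's α = .92) follows from the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the total score). A short sketch on simulated item responses (the data are invented, not the study's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulate 243 respondents (the study's sample size) answering 4 highly
# correlated items: a shared latent score plus a little item-specific noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(243, 1))
items = np.repeat(latent, 4, axis=1) + 0.1 * rng.normal(size=(243, 4))
alpha = cronbach_alpha(items)
print(round(alpha, 2))   # near 1, since the items mostly measure the same thing
```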

  17. Consistent assignment of nursing staff to residents in nursing homes: a critical review of conceptual and methodological issues.

    Science.gov (United States)

    Roberts, Tonya; Nolet, Kimberly; Bowers, Barbara

    2015-06-01

    Consistent assignment of nursing staff to residents is promoted by a number of national organizations as a strategy for improving nursing home quality and is included in pay for performance schedules in several states. However, research has shown inconsistent effects of consistent assignment on quality outcomes. In order to advance the state of the science of research on consistent assignment and inform current practice and policy, a literature review was conducted to critique conceptual and methodological understandings of consistent assignment. Twenty original research reports of consistent assignment in nursing homes were found through a variety of search strategies. Consistent assignment was conceptualized and operationalized in multiple ways with little overlap from study to study. There was a lack of established methods to measure consistent assignment. Methodological limitations included a lack of control and statistical analyses of group differences in experimental-level studies, small sample sizes, lack of attention to confounds in multicomponent interventions, and outcomes that were not theoretically linked. Future research should focus on developing a conceptual understanding of consistent assignment focused on definition, measurement, and links to outcomes. To inform current policies, testing consistent assignment should include attention to contexts within and levels at which it is most effective. Published by Oxford University Press on behalf of the Gerontological Society of America 2013.

  18. Correlates for Consistency of Contraceptive Use Among Sexually Active Female Adolescents

    Directory of Open Access Journals (Sweden)

    Ruey-Hsia Wang

    2004-04-01

    Full Text Available This study explored the correlates of consistency of contraceptive use among sexually active female adolescents in Kaohsiung County, Taiwan. Overall, 164 female adolescents who had engaged in sexual behavior within the last 6 months and were not pregnant at the time of the study were selected from two vocational high schools in Kaohsiung County, Taiwan. An anonymous questionnaire was used to measure demographic data, contraceptive attitudes, contraceptive knowledge, contraceptive self-efficacy, perception of peers' use of contraceptives, sexual history, and contraceptive use. The results showed that 45.7% of subjects had sex once or more per week, and that 39.6% of subjects always used contraceptives while 15.2% never used contraceptives. Condoms were the most popular contraceptive (51.2%), and the withdrawal method was the second most popular (23.8%). Stepwise logistic regression showed that more positive contraceptive attitudes (odds ratio [OR], 1.148) and previous contraceptive education in school (OR, 3.394) increased the probability of consistently using contraceptives, correctly classifying 67.2% of the sample.

  19. Identification of the Consistently Altered Metabolic Targets in Human Hepatocellular Carcinoma

    Directory of Open Access Journals (Sweden)

    Zeribe Chike Nwosu

    2017-09-01

    Full Text Available Background & Aims: Cancer cells rely on metabolic alterations to enhance proliferation and survival. Metabolic gene alterations that repeatedly occur in liver cancer are largely unknown. We aimed to identify metabolic genes that are consistently deregulated, and are of potential clinical significance, in human hepatocellular carcinoma (HCC). Methods: We studied the expression of 2,761 metabolic genes in 8 microarray datasets comprising 521 human HCC tissues. Genes exclusively up-regulated or down-regulated in 6 or more datasets were defined as consistently deregulated. The consistent genes that correlated with tumor progression markers (ECM2 and MMP9; Pearson correlation P < .05) were used for Kaplan-Meier overall survival analysis in a patient cohort. We further compared proteomic expression of metabolic genes in 19 tumors vs adjacent normal liver tissues. Results: We identified 634 consistent metabolic genes, ∼60% of which are not yet described in HCC. The down-regulated genes (n = 350) are mostly involved in physiologic hepatocyte metabolic functions (e.g., xenobiotic, fatty acid, and amino acid metabolism). In contrast, among the consistently up-regulated metabolic genes (n = 284) are those involved in glycolysis, the pentose phosphate pathway, nucleotide biosynthesis, the tricarboxylic acid cycle, oxidative phosphorylation, proton transport, membrane lipid, and glycan metabolism. Several metabolic genes (n = 434) correlated with progression markers, and of these, 201 predicted overall survival outcome in the patient cohort analyzed. Over 90% of the metabolic targets significantly altered at the protein level were similarly up- or down-regulated as in the genomic profile. Conclusions: We provide the first exposition of the consistently altered metabolic genes in HCC and show that these genes are potentially relevant targets for onward studies in preclinical and clinical contexts. Keywords: Liver Cancer, HCC, Tumor Metabolism
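
    The consistency criterion used here, exclusively up- or down-regulated in 6 or more of the 8 datasets, reduces to a simple count over per-dataset direction calls. A sketch with invented genes and calls (not data from the study):

```python
# Each gene maps to its direction call in 8 datasets: +1 up, -1 down, 0 unchanged.
calls = {
    "GENE_A": [+1, +1, +1, +1, +1, +1, 0, +1],   # up in 7/8, never down
    "GENE_B": [-1, -1, -1, -1, -1, 0, -1, +1],   # down in 6/8 but also up once
    "GENE_C": [+1, -1, +1, -1, +1, 0, 0, +1],    # mixed directions
}

def classify(directions, threshold=6):
    """Consistent only if one direction reaches the threshold *exclusively*,
    i.e. with zero calls in the opposite direction."""
    ups = directions.count(+1)
    downs = directions.count(-1)
    if ups >= threshold and downs == 0:
        return "up"
    if downs >= threshold and ups == 0:
        return "down"
    return "inconsistent"

for gene, d in calls.items():
    print(gene, classify(d))
```

    Note how the exclusivity requirement rejects GENE_B even though it is down-regulated in 6 datasets, because a single opposite call breaks consistency.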

  20. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

    Full Text Available Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. The effectiveness of the various management systems depends on many factors, one of which is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency in this case is based on the homogeneous definition of attributes relating to the performance indicator as a basic element of the PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group various attributes of performance indicators. The main research results were achieved through an empirical study carried out on a sample of Slovak companies. The criterion for selection was the existence of certified management systems according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ2 test) against the above standards. Findings: Drawing on a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of the PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in the PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of the performance indicator, but most of the performance indicators are determined differently; we identified the common attributes for the whole sample of companies. Practical implications: The research results have got an implication for

  1. WE-AB-207A-02: John’s Equation Based Consistency Condition and Incomplete Projection Restoration Upon Circular Orbit CBCT

    International Nuclear Information System (INIS)

    Ma, J; Qi, H; Wu, S; Xu, Y; Zhou, L; Yan, H

    2016-01-01

    Purpose: In transmission X-ray tomography imaging, projections are sometimes incomplete for a variety of reasons, such as geometry inaccuracy, defective detector cells, etc. To address this issue, we have derived a direct consistency condition based on John's equation and proposed a method to effectively restore incomplete projections based on this condition. Methods: Through parameter substitutions, we derived a direct consistency condition equation from John's equation, in which the left side involves only the derivative of the projection with respect to view angle and the right side involves projection derivatives with respect to the other geometrical parameters. Based on this consistency condition, a projection restoration method is proposed, which comprises five steps: 1) forward-project the reconstructed image and use linear interpolation to estimate the incomplete projections as the initial result; 2) perform a Fourier transform on the projections; 3) restore the incomplete frequency data using the consistency condition equation; 4) perform an inverse Fourier transform; 5) repeat steps 2)-4) until the termination criterion is met. Results: A beam-blocking-based scatter correction case and a bad-pixel correction case were used to demonstrate the efficacy and robustness of our restoration method. The mean absolute error (MAE), signal-to-noise ratio (SNR) and mean square error (MSE) were employed as evaluation metrics for the reconstructed images. For the scatter correction case, the MAE is reduced from 63.3% to 71.7% with 4 iterations. Compared with the existing Patch's method, the MAE of our method is further reduced by 8.72%. For the bad-pixel case, the SNR of the reconstructed image by our method is increased from 13.49% to 21.48%, with the MSE being decreased by 45.95%, compared with the linear interpolation method. Conclusion: Our studies have demonstrated that our restoration method based on the new consistency condition can effectively restore incomplete projections

  2. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  3. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  4. Water supply management using an extended group fuzzy decision-making method: a case study in north-eastern Iran

    Science.gov (United States)

    Minatour, Yasser; Bonakdari, Hossein; Zarghami, Mahdi; Bakhshi, Maryam Ali

    2015-09-01

    The purpose of this study was to develop a group fuzzy multi-criteria decision-making method to be applied to rating problems in water resources management. To this end, Chen's group fuzzy TOPSIS method was extended with a difference technique to handle the uncertainties of group decision making, and the extended method was then combined with a consistency check. In the presented method, linguistic judgments are first screened through a consistency-checking process and are afterward used in the extended Chen's fuzzy TOPSIS method. Each expert's opinion is converted to precise numbers and, to incorporate uncertainty, the group's opinions are converted to fuzzy numbers using three mathematical operators. The proposed method is applied to select the optimal strategy for the rural water supply of Nohoor village in north-eastern Iran, as a case study and illustrative example. Sensitivity analyses of the results, and a comparison of the results with the project's actual outcome, showed that the proposed method gives good results for water resources projects.
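
    The TOPSIS core that Chen's method builds on, ranking alternatives by their relative closeness to an ideal and an anti-ideal solution, can be sketched in crisp (non-fuzzy) form; the fuzzy extension replaces matrix entries with fuzzy numbers and the distances with fuzzy distance measures. The decision matrix below is hypothetical, not the Nohoor case study data:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    matrix: (alternatives x criteria); benefit[j] is True if larger is better."""
    norm = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize each criterion
    v = norm * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)        # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)         # distance to anti-ideal
    return d_neg / (d_pos + d_neg)                   # closeness: nearer 1 is better

# Three hypothetical supply strategies scored on cost (minimize) and yield (maximize).
m = np.array([[4.0, 7.0],
              [6.0, 9.0],
              [5.0, 5.0]])
scores = topsis(m, weights=np.array([0.5, 0.5]), benefit=np.array([False, True]))
print(scores)           # closeness coefficient per strategy
print(scores.argmax())  # index of the best-ranked strategy
```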

  5. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment, and the conduct of clinical studies. Several reliability studies have been conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are at a preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implications for practice, education, and training. An introduction to reliability estimates and different study designs and statistical analyses is given for future studies in Ayurveda. PMID:23930037

  6. Coupled Dyson-Schwinger equations and effects of self-consistency

    International Nuclear Information System (INIS)

    Wu, S.S.; Zhang, H.X.; Yao, Y.J.

    2001-01-01

    Using the σ-ω model as an effective tool, the effects of self-consistency are studied in some detail. A coupled set of Dyson-Schwinger equations for the renormalized baryon and meson propagators in the σ-ω model is solved self-consistently according to the dressed Hartree-Fock scheme, where the hadron propagators in both the baryon and meson self-energies are required to also satisfy this coupled set of equations. It is found that self-consistency affects the baryon spectral function noticeably if only the interaction with σ mesons is considered. However, there is a cancellation between the effects due to the σ and ω mesons, and the additional contribution of the ω mesons makes the above effect insignificant. In both the σ and σ-ω cases the effects of self-consistency on the meson spectral function are perceptible, but they can nevertheless be taken into account without a self-consistent calculation. Our study indicates that including the meson propagators in the self-consistency requirement is unnecessary, and that one can stop at an early step of an iteration procedure to obtain a good approximation to the fully self-consistent results for all the hadron propagators in the model, if an appropriate initial input is chosen. Vertex corrections and their effects on ghost poles are also studied

  7. Causal inference with missing exposure information: Methods and applications to an obstetric study.

    Science.gov (United States)

    Zhang, Zhiwei; Liu, Wei; Zhang, Bo; Tang, Li; Zhang, Jun

    2016-10-01

    Causal inference in observational studies is frequently challenged by the occurrence of missing data, in addition to confounding. Motivated by the Consortium on Safe Labor, a large observational study of obstetric labor practice and birth outcomes, this article focuses on the problem of missing exposure information in a causal analysis of observational data. This problem can be approached from different angles (i.e. missing covariates and causal inference), and useful methods can be obtained by drawing upon the available techniques and insights in both areas. In this article, we describe and compare a collection of methods based on different modeling assumptions, under standard assumptions for missing data (i.e. missing-at-random and positivity) and for causal inference with complete data (i.e. no unmeasured confounding and another positivity assumption). These methods involve three models: one for treatment assignment, one for the dependence of outcome on treatment and covariates, and one for the missing data mechanism. In general, consistent estimation of causal quantities requires correct specification of at least two of the three models, although there may be some flexibility as to which two models need to be correct. Such flexibility is afforded by doubly robust estimators adapted from the missing covariates literature and the literature on causal inference with complete data, and by a newly developed triply robust estimator that is consistent if any two of the three models are correct. The methods are applied to the Consortium on Safe Labor data and compared in a simulation study mimicking the Consortium on Safe Labor. © The Author(s) 2013.
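
    The doubly robust property mentioned above, consistency when either the treatment model or the outcome model is correctly specified, can be illustrated with the classic AIPW (augmented inverse-probability-weighting) estimator of the average treatment effect. This is a simplified complete-data sketch with simulated data, not the paper's triply robust estimator for missing exposures:

```python
import numpy as np

def aipw_ate(y, a, ps, mu1, mu0):
    """Augmented IPW estimate of E[Y(1) - Y(0)].
    ps: estimated propensity P(A=1|X); mu1, mu0: outcome-model predictions."""
    t1 = mu1 + a * (y - mu1) / ps
    t0 = mu0 + (1 - a) * (y - mu0) / (1 - ps)
    return (t1 - t0).mean()

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                    # confounder
ps_true = 1 / (1 + np.exp(-x))            # treatment assignment depends on x
a = rng.binomial(1, ps_true)
y = 2.0 * a + x + rng.normal(size=n)      # true treatment effect = 2

# Correct propensity model paired with a deliberately wrong outcome model (mu = 0):
# the estimate remains close to 2 because only one of the two models must be right.
ate = aipw_ate(y, a, ps_true, np.zeros(n), np.zeros(n))
print(ate)
```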

  8. Consistent descriptions of metal–ligand bonds and spin-crossover in inorganic chemistry

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2013-01-01

    -scale DFT studies of inorganic systems in catalysis and bioinorganic chemistry rely directly on the ability to balance correlation effects in the involved bonds across the s-, p-, and d-blocks. This review concerns recent efforts to describe such bonds accurately and consistently across the s-, p-, and d......-blocks. Physical effects and ingredients in functionals, their systematic errors, and approaches to deal with them are discussed, in order to identify broadly applicable methods for inorganic chemistry....

  9. Vibrational multiconfiguration self-consistent field theory: implementation and test calculations.

    Science.gov (United States)

    Heislbetz, Sandra; Rauhut, Guntram

    2010-03-28

    A state-specific vibrational multiconfiguration self-consistent field (VMCSCF) approach based on a multimode expansion of the potential energy surface is presented for the accurate calculation of anharmonic vibrational spectra. As a special case of this general approach, vibrational complete active space self-consistent field calculations will be discussed. The latter method shows better convergence than the general VMCSCF approach and must be considered the preferred choice within the multiconfigurational framework. Benchmark calculations are provided for a small set of test molecules.

  10. Remaining useful life prediction based on variation coefficient consistency test of a Wiener process

    Directory of Open Access Journals (Sweden)

    Juan LI

    2018-01-01

    Full Text Available High-cost equipment is often reused after maintenance, and whether the information from before the maintenance can be used for the Remaining Useful Life (RUL) prediction after the maintenance is directly determined by the consistency of the degradation pattern before and after the maintenance. Aiming at this problem, an RUL prediction method based on the consistency test of a Wiener process is proposed. Firstly, the parameters of the Wiener process estimated by Maximum Likelihood Estimation (MLE) are proved to be biased, and a modified unbiased estimation method is proposed and verified by derivation and simulations. Then, the h statistic is constructed according to the reciprocal of the variation coefficient of the Wiener process, and its sampling distribution is derived. Meanwhile, a universal method for the consistency test is proposed based on the sampling distribution theorem, which is verified by simulation data and classical crack degradation data. Finally, based on the consistency test of the degradation model, a weighted fusion RUL prediction method is presented for the fuel pump of an airplane, and the validity of the presented method is verified by accurate computation results on real data, which provides theoretical and practical guidance for engineers to predict the RUL of equipment after maintenance.
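    The estimation step described above can be sketched for a simple linear-drift Wiener degradation model X(t) = μt + σB(t) observed at equally spaced times. This is a generic illustration, not the paper's exact estimator: the drift is estimated from the mean increment, the diffusion variance uses an n−1 divisor (the n-divisor MLE is biased, as the abstract notes), and the mean RUL uses the first-passage mean (b − x)/μ for a positive-drift Wiener process.

```python
import numpy as np

def estimate_wiener_params(x, dt):
    """Estimate drift mu and diffusion sigma^2 of X(t) = mu*t + sigma*B(t)
    from equally spaced observations x."""
    inc = np.diff(x)                              # independent Gaussian increments
    mu_hat = inc.mean() / dt                      # drift estimate
    # The MLE of sigma^2 divides by n and is biased; divide by n-1 instead.
    sigma2_hat = np.sum((inc - mu_hat * dt) ** 2) / ((len(inc) - 1) * dt)
    return mu_hat, sigma2_hat

def rul_mean(x_now, threshold, mu):
    """Mean remaining useful life: mean first-passage time of a Wiener
    process with positive drift mu from x_now to the failure threshold."""
    return (threshold - x_now) / mu

# Simulate a degradation path with mu = 0.5, sigma = 0.2.
rng = np.random.default_rng(0)
dt, mu, sigma = 1.0, 0.5, 0.2
steps = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(2000)
x = np.concatenate([[0.0], np.cumsum(steps)])
mu_hat, s2_hat = estimate_wiener_params(x, dt)
```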

  11. A consistency-based feature selection method allied with linear SVMs for HIV-1 protease cleavage site prediction.

    Directory of Open Access Journals (Sweden)

    Orkun Oztürk

    Full Text Available BACKGROUND: Predicting the type-1 Human Immunodeficiency Virus (HIV-1) protease cleavage site in protein molecules and determining its specificity is an important task which has attracted considerable attention in the research community. Achievements in this area are expected to result in effective drug design (especially for HIV-1 protease inhibitors) against this life-threatening virus. However, some drawbacks (like the shortage of the available training data and the high dimensionality of the feature space) turn this task into a difficult classification problem. Thus, various machine learning techniques, and specifically several classification methods, have been proposed in order to increase the accuracy of the classification model. In addition, for several classification problems, which are characterized by having few samples and many features, selecting the most relevant features is a major factor for increasing classification accuracy. RESULTS: We propose for HIV-1 data a consistency-based feature selection approach in conjunction with recursive feature elimination of support vector machines (SVMs). We used various classifiers for evaluating the results obtained from the feature selection process. We further demonstrated the effectiveness of our proposed method by comparing it with a state-of-the-art feature selection method applied on HIV-1 data, and we evaluated the reported results based on attributes which have been selected from different combinations. CONCLUSION: Applying feature selection on training data before realizing the classification task seems to be a reasonable data-mining process when working with types of data similar to HIV-1. On HIV-1 data, some feature selection or extraction operations in conjunction with different classifiers have been tested and noteworthy outcomes have been reported. These facts motivate the work presented in this paper. SOFTWARE AVAILABILITY: The software is available at http
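    The "few samples, many features" setting and the SVM-based recursive feature elimination step can be sketched with scikit-learn on synthetic data. This is a generic stand-in (the paper's consistency-based selection criterion and the actual HIV-1 encoding are not reproduced); dataset shape and parameter values are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Synthetic stand-in for cleavage-site data: few samples, many features.
X, y = make_classification(n_samples=120, n_features=200, n_informative=10,
                           random_state=0)

# Recursive feature elimination driven by linear-SVM weights.
svm = LinearSVC(C=1.0, dual=False, max_iter=5000)
rfe = RFE(estimator=svm, n_features_to_select=20, step=10).fit(X, y)

# Evaluate a classifier on the reduced feature set.
X_sel = X[:, rfe.support_]
score = cross_val_score(svm, X_sel, y, cv=5).mean()
```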

  12. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    Science.gov (United States)

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewing data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions via email or telephone with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded, leaving 29 journals in the sample. Of the 29 journals, 24 (83%) responded to our initial inquiry, of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ: 4 (29%) journals indicated ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or had never been detected by their journal previously. The majority (n = 10/14, 72%) of journals were unwilling to report or make discrepancies found in manuscripts available to the public. Although in the minority, some journals (n = 4/14, 29%) described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB, and its estimated prevalence was much lower than that reported in the literature, suggesting inadequate detection. There exists a potential for overestimation of treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to utilize comprehensive methods for detection of ORB, but overall, the data suggest improvements at the biomedical journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  13. Expert Consensus on Characteristics of Wisdom: A Delphi Method Study

    Science.gov (United States)

    Jeste, Dilip V.; Ardelt, Monika; Blazer, Dan; Kraemer, Helena C.; Vaillant, George; Meeks, Thomas W.

    2010-01-01

    Purpose: Wisdom has received increasing attention in empirical research in recent years, especially in gerontology and psychology, but consistent definitions of wisdom remain elusive. We sought to better characterize this concept via an expert consensus panel using a 2-phase Delphi method. Design and Methods: A survey questionnaire comprised 53…

  14. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
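    The threshold-and-frequency routine described above can be sketched as follows. Synthetic per-cell richness values stand in for the trawl data; the 90th-percentile hotspot threshold and the 50% persistence cutoff are illustrative choices, not the paper's exact parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
n_years, n_cells = 8, 100
# Synthetic surveys: one species-richness value per grid cell per year.
richness = rng.poisson(lam=20, size=(n_years, n_cells)).astype(float)

# Mean hotspot threshold across the time series (upper decile each year).
threshold = np.mean([np.quantile(richness[t], 0.9) for t in range(n_years)])

# Annual hotspot designation, then frequency of designation per cell.
hotspot = richness >= threshold          # (n_years, n_cells) boolean
freq = hotspot.mean(axis=0)              # fraction of years each cell is flagged

# "Consistent" hotspots: cells flagged in more than half of the years.
consistent = np.where(freq > 0.5)[0]
```

With interannually independent data, as here, very few cells exceed the 50% persistence cutoff, mirroring the low temporal consistency the study reports.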

  15. Relative amplitude preservation processing utilizing surface consistent amplitude correction. Part 3; Surface consistent amplitude correction wo mochiita sotai shinpuku hozon shori. 3

    Energy Technology Data Exchange (ETDEWEB)

    Saeki, T [Japan National Oil Corporation, Tokyo (Japan). Technology Research Center

    1996-10-01

    For the seismic reflection method conducted on the ground surface, the generator and geophones are set on the surface, so the observed waveforms are affected by the ground surface and the surface layer. To discuss the physical properties of the deep underground, the influence of the surface layer must therefore be removed beforehand. In the surface consistent amplitude correction, the properties of the generator and geophone are removed by assuming that the observed waveforms can be expressed as equations of convolution; this correction method yields records unaffected by surface conditions. For the analysis and correction of waveforms, the wavelet transform was examined. Using the amplitude patterns after correction, the significant-signal region, the noise-dominant region, and the surface-wave-dominant region can be separated from one another. Since the corrected amplitude values in the significant-signal region show only small variation, a representative value can be assigned and used for analyzing the surface consistent amplitude correction. The efficiency of the process can be enhanced by considering the change of frequency. 3 refs., 5 figs.

  16. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  17. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge prescribe different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)

  18. Self-consistent electron transport in collisional plasmas

    International Nuclear Information System (INIS)

    Mason, R.J.

    1982-01-01

    A self-consistent scheme has been developed to model electron transport in evolving plasmas of arbitrary classical collisionality. The electrons and ions are treated as either multiple donor-cell fluids, or collisional particles-in-cell. Particle suprathermal electrons scatter off ions, and drag against fluid background thermal electrons. The background electrons undergo ion friction, thermal coupling, and bremsstrahlung. The components move in self-consistent advanced E-fields, obtained by the Implicit Moment Method, which permits Δt ≫ ω_p⁻¹ and Δx ≫ λ_D, offering a 10²-10³-fold speed-up over older explicit techniques. The fluid description for the background plasma components permits the modeling of transport in systems spanning more than a 10⁷-fold change in density, and encompassing contiguous collisional and collisionless regions. Results are presented from application of the scheme to the modeling of CO₂ laser-generated suprathermal electron transport in expanding thin foils, and in multi-foil target configurations

  19. Self-consistent beam halo studies & halo diagnostic development in a continuous linear focusing channel

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1994-01-01

    Beam halos are formed via self-consistent motion of the beam particles. Interactions of single particles with time-varying density distributions of other particles are a major source of halo. Aspects of these interactions are studied for an initially equilibrium distribution in a radial, linear, continuous focusing system. When there is a mismatch, it is shown that in the self-consistent system, there is a threshold in space-charge and mismatch, above which a halo is formed that extends to ∼1.5 times the initial maximum mismatch radius. Tools are sought for characterizing the halo dynamics. Testing the particles against the width of the mismatch driving resonance is useful for finding a conservative estimate of the threshold. The exit, entering and transition times, and the time evolution of the halo, are also explored using this technique. Extension to higher dimensions is briefly discussed

  20. Validity test and its consistency in the construction of patient loyalty model

    Science.gov (United States)

    Yanuar, Ferra

    2016-04-01

    The main objective of the present study is to demonstrate the estimation of validity values and their consistency based on a structural equation model. The estimation method was then applied to empirical data in a case study constructing a patient loyalty model. In the hypothesized model, service quality, patient satisfaction and patient loyalty were determined simultaneously, and each factor was measured by several indicator variables. The respondents involved in this study were patients who had received healthcare at Puskesmas (community health centres) in Padang, West Sumatera. All 394 respondents who had complete information were included in the analysis. This study found that each construct (service quality, patient satisfaction and patient loyalty) was valid, meaning that all hypothesized indicator variables were significant in measuring their corresponding latent variable. Service quality was most strongly measured by tangibles, patient satisfaction by satisfaction with service, and patient loyalty by good service quality. Meanwhile, in the structural equations, this study found that patient loyalty was affected positively and directly by patient satisfaction, while service quality affected patient loyalty indirectly, with patient satisfaction as a mediator variable between the two latent variables. Both structural equations were also valid. This study also showed that the validity values obtained here were consistent, based on a simulation study using a bootstrap approach.
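    The bootstrap check of consistency can be sketched generically: resample respondents with replacement, recompute a validity coefficient, and inspect the resulting interval. Here a simple indicator-construct correlation on synthetic data stands in for the patient-survey measures; the sample size of 394 matches the study, but everything else is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 394
# Hypothetical scores: a latent "satisfaction" trait plus measurement noise.
latent = rng.normal(size=n)
indicator = 0.8 * latent + 0.6 * rng.normal(size=n)   # one survey item
composite = latent + 0.3 * rng.normal(size=n)         # proxy for the construct

def validity(ind, comp):
    """Validity coefficient: correlation of an indicator with its construct."""
    return np.corrcoef(ind, comp)[0, 1]

point = validity(indicator, composite)

# Bootstrap: resample respondents, recompute the coefficient each time.
boots = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boots.append(validity(indicator[idx], composite[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
```

A narrow interval containing the point estimate is the kind of evidence of consistency the abstract refers to.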

  1. Robust inverse-consistent affine CT-MR registration in MRI-assisted and MRI-alone prostate radiation therapy.

    Science.gov (United States)

    Rivest-Hénault, David; Dowson, Nicholas; Greer, Peter B; Fripp, Jurgen; Dowling, Jason A

    2015-07-01

    CT-MR registration is a critical component of many radiation oncology protocols. In prostate external beam radiation therapy, it allows the propagation of MR-derived contours to reference CT images at the planning stage, and it enables dose mapping during dosimetry studies. The use of carefully registered CT-MR atlases allows the estimation of patient specific electron density maps from MRI scans, enabling MRI-alone radiation therapy planning and treatment adaptation. In all cases, the precision and accuracy achieved by registration influences the quality of the entire process. Most current registration algorithms do not robustly generalize and lack inverse-consistency, increasing the risk of human error and acting as a source of bias in studies where information is propagated in a particular direction, e.g. CT to MR or vice versa. In MRI-based treatment planning where both CT and MR scans serve as spatial references, inverse-consistency is critical, if under-acknowledged. A robust, inverse-consistent, rigid/affine registration algorithm that is well suited to CT-MR alignment in prostate radiation therapy is presented. The presented method is based on a robust block-matching optimization process that utilises a half-way space definition to maintain inverse-consistency. Inverse-consistency substantially reduces the influence of the order of input images, simplifying analysis, and increasing robustness. An open source implementation is available online at http://aehrc.github.io/Mirorr/. Experimental results on a challenging 35 CT-MR pelvis dataset demonstrate that the proposed method is more accurate than other popular registration packages and is at least as accurate as the state of the art, while being more robust and having an order of magnitude higher inverse-consistency than competing approaches. The presented results demonstrate that the proposed registration algorithm is readily applicable to prostate radiation therapy planning. Copyright © 2015. 

  2. Dictionary-based fiber orientation estimation with improved spatial consistency.

    Science.gov (United States)

    Ye, Chuyang; Prince, Jerry L

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ₁-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that

  3. Self-consistent calculation of atomic structure for mixture

    International Nuclear Information System (INIS)

    Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping

    2000-01-01

    Based on a relativistic Hartree-Fock-Slater self-consistent average atom model, the atomic structure of mixtures is studied by summing up component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed

  4. Consistency and Reconciliation Model In Regional Development Planning

    Directory of Open Access Journals (Sweden)

    Dina Suryawati

    2016-10-01

    Full Text Available The aim of this study was to identify the problems and determine a conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models that were successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves a technocratic system, that is, both top-down and bottom-up participation. The two must be balanced, without overlapping or dominating each other. Keywords: regional, development, planning, consistency, reconciliation

  5. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by applying the argument to more complex systems.

  6. Volume of the steady-state space of financial flows in a monetary stock-flow-consistent model

    Science.gov (United States)

    Hazan, Aurélien

    2017-05-01

    We show that a steady-state stock-flow consistent macro-economic model can be represented as a Constraint Satisfaction Problem (CSP). The set of solutions is a polytope, whose volume depends on the constraints applied and reveals the potential fragility of the economic circuit, with no need to study the dynamics. Several methods to compute the volume, both exact and approximate, are compared, inspired by operations research methods and the analysis of metabolic networks. We also introduce a random transaction matrix, and study the particular case of linear flows with respect to money stocks.
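    A minimal version of the approximate volume computation: treat the flow constraints as a polytope {x ≥ 0, Ax ≤ b} and estimate its volume by Monte Carlo rejection sampling inside a bounding box. This is a toy three-flow example with a known answer (the unit simplex, volume 1/6), not the paper's model or its more sophisticated samplers.

```python
import numpy as np

# Toy constraint set for non-negative "flows" x: x1 + x2 + x3 <= 1, xi >= 0.
# This is a simplex with known volume 1/6, convenient for checking the method.
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

rng = np.random.default_rng(3)
n = 200_000
pts = rng.uniform(0.0, 1.0, size=(n, 3))     # bounding box [0,1]^3, volume 1
inside = np.all(pts @ A.T <= b, axis=1)      # satisfy all inequality constraints
vol_est = inside.mean() * 1.0                # fraction inside times box volume
```

Rejection sampling degrades quickly in higher dimensions, which is why the paper turns to methods from metabolic-network analysis; this sketch only shows the principle.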

  7. A STUDY ON DYNAMIC LOAD HISTORY RECONSTRUCTION USING PSEUDO-INVERSE METHODS

    OpenAIRE

    Santos, Ariane Rebelato Silva dos; Marczak, Rogério José

    2017-01-01

    Considering that the vibratory forces generally cannot be measured directly at the interface of two bodies, an inverse method is studied in the present work to recover the load history in such cases. The proposed technique attempts to reconstruct the dynamic loads history by using a frequency domain analysis and Moore-Penrose pseudo-inverses of the frequency response function (FRF) of the system. The methodology consists in applying discrete dynamic loads on a finite element model in the time...
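    The frequency-domain reconstruction step can be sketched as follows: with more response channels than loads, the FRF matrix H(ω) has a Moore-Penrose pseudo-inverse that recovers the load vector from the measured responses via x(ω) = H(ω)f(ω). This is a noise-free synthetic example at a single frequency; the matrix sizes and values are illustrative, not from the study.

```python
import numpy as np

rng = np.random.default_rng(4)
n_resp, n_loads = 6, 2                      # overdetermined: 6 responses, 2 loads

# Complex-valued FRF matrix H(w) at one frequency (synthetic).
H = rng.normal(size=(n_resp, n_loads)) + 1j * rng.normal(size=(n_resp, n_loads))

# Simulated "measured" responses from a known load vector.
f_true = np.array([1.0 + 0.5j, -0.3 + 2.0j])
x = H @ f_true

# Load reconstruction with the Moore-Penrose pseudo-inverse of the FRF.
f_rec = np.linalg.pinv(H) @ x
```

In the noise-free, full-rank case the recovery is exact; with measurement noise the pseudo-inverse gives the least-squares load estimate, and regularization of small singular values becomes the practical concern.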

  8. A Study of Effectiveness of Rational, Emotive, Behavior Therapy (REBT) with Group Method on Decrease of Stress among Diabetic Patients

    OpenAIRE

    Kianoush Zahrakar

    2012-01-01

    Introduction: The purpose of the present research was to study the effectiveness of Rational Emotive Behavior Therapy (REBT) with a group method in decreasing the stress of diabetic patients. Methods: The population of the research consisted of all diabetic patients who were members of the diabetic patients' association of Karaj city. The sample consisted of 30 diabetic patients (15 in the experimental group and 15 in the control group) selected through random sampling. Research design was experiment...

  9. Comparative study of discretization methods of microarray data for inferring transcriptional regulatory networks

    Directory of Open Access Journals (Sweden)

    Ji Wei

    2010-10-01

    Full Text Available Abstract Background Microarray data discretization is a basic preprocessing step for many algorithms of gene regulatory network inference. Some common discretization methods in informatics are used to discretize microarray data. Selection of the discretization method is often arbitrary, and no systematic comparison of different discretization methods has been conducted in the context of gene regulatory network inference from time series gene expression data. Results In this study, we propose a new discretization method, "bikmeans", and compare its performance with four other widely-used discretization methods using different datasets, modeling algorithms and numbers of intervals. Sensitivities, specificities and total accuracies were calculated and statistical analysis was carried out. The bikmeans method always gave high total accuracies. Conclusions Our results indicate that proper discretization methods can consistently improve gene regulatory network inference independent of network modeling algorithms and datasets. Our new method, bikmeans, resulted in significantly better total accuracies than other methods.
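    A generic k-means-style discretization of a one-dimensional expression profile can be sketched with quantile-initialized Lloyd iterations. This is a plain k-means stand-in to illustrate the preprocessing step, not the paper's bikmeans method; the expression values are made up.

```python
import numpy as np

def kmeans_discretize(values, k=3, iters=50):
    """Discretize a 1-D profile into k ordered levels via Lloyd's k-means.
    Quantile initialization is deterministic and keeps centers ordered."""
    qs = (2 * np.arange(k) + 1) / (2 * k)        # mid-quantiles, e.g. 1/6, 1/2, 5/6
    centers = np.quantile(values, qs)
    for _ in range(iters):
        # Assign each value to its nearest center, then update centers.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels                                # 0 = low ... k-1 = high

expr = np.array([0.1, 0.2, 0.15, 5.0, 5.2, 9.8, 10.1, 0.05])
levels = kmeans_discretize(expr, k=3)
```

With well-separated expression levels the assignment stabilizes after one or two iterations; downstream network-inference algorithms then operate on the integer levels.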

  10. Policy consistency and the achievement of Nigeria's foreign policy ...

    African Journals Online (AJOL)

    This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency, and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...

  11. Consistency maintenance for constraint in role-based access control model

    Institute of Scientific and Technical Information of China (English)

    韩伟力; 陈刚; 尹建伟; 董金祥

    2002-01-01

    Constraint is an important aspect of role-based access control and is sometimes argued to be the principal motivation for role-based access control (RBAC). But so far few authors have discussed consistency maintenance for constraint in RBAC model. Based on researches of constraints among roles and types of inconsistency among constraints, this paper introduces corresponding formal rules, rule-based reasoning and corresponding methods to detect, avoid and resolve these inconsistencies. Finally, the paper introduces briefly the application of consistency maintenance in ZD-PDM, an enterprise-oriented product data management (PDM) system.

  13. Self-consistent Green’s-function technique for surfaces and interfaces

    DEFF Research Database (Denmark)

    Skriver, Hans Lomholt; Rosengaard, N. M.

    1991-01-01

    We have implemented an efficient self-consistent Green’s-function technique for calculating ground-state properties of surfaces and interfaces, based on the linear-muffin-tin-orbitals method within the tight-binding representation. In this approach the interlayer interaction is extremely short ranged, and only a few layers close to the interface need be treated self-consistently via a Dyson equation. For semi-infinite jellium, the technique gives work functions and surface energies that are in excellent agreement with earlier calculations. For the bcc(110) surface of the alkali metals, we find…

  14. COMPARATIVE STUDIES OF THREE METHODS FOR MEASURING PEPSIN ACTIVITY

    Science.gov (United States)

    Loken, Merle K.; Terrill, Kathleen D.; Marvin, James F.; Mosser, Donn G.

    1958-01-01

    Comparison has been made of a simple method originated by Absolon and modified in our laboratories for assay of proteolytic activity using RISA (radioactive iodinated serum albumin—Abbott Laboratories), with the commonly used photometric methods of Anson and Kunitz. In this method, pepsin was incubated with an albumin substrate containing RISA, followed by precipitation of the undigested substrate with trichloroacetic acid and measurement of radioactive digestion products in the supernatant fluid. The I131—albumin bond was shown in the present studies to be altered only by the proteolytic activity, and not by the incubation procedures at various values of pH. Any free iodine present originally in the RISA was removed by a single passage through a resin column (amberlite IRA-400-C1). Pepsin was shown to be most stable in solution at a pH of 5.5. Activity of pepsin was shown to be maximal when it was incubated with albumin at a pH of 2.5. Pepsin activity was shown to be altered in the presence of various electrolytes. Pepsin activity measured by the RISA and Anson methods as a function of concentration or of time of incubation indicated that these two methods are in good agreement and are equally sensitive. Consistently smaller standard errors were obtained by the RISA method of pepsin assay than were obtained with either of the other methods. PMID:13587910

  15. Is the molecular statics method suitable for the study of nanomaterials? A study case of nanowires

    International Nuclear Information System (INIS)

    Chang, I-L; Chen, Y-C

    2007-01-01

    Both molecular statics and molecular dynamics methods were employed to study the mechanical properties of copper nanowires. The size effect on both elastic and plastic properties of square cross-sectional nanowire was examined and compared systematically using two molecular approaches. It was found consistently from both molecular methods that the elastic and plastic properties of nanowires depend on the lateral size of nanowires. As the lateral size of nanowires decreases, the values of Young's modulus decrease and dislocation nucleation stresses increase. However, it was shown that the dislocation nucleation stress would be significantly influenced by the axial periodic length of the nanowire model using the molecular statics method while molecular dynamics simulations at two distinct temperatures (0.01 and 300 K) did not show the same dependence. It was concluded that molecular statics as an energy minimization numerical scheme is quite insensitive to the instability of atomic structure especially without thermal fluctuation and might not be a suitable tool for studying the behaviour of nanomaterials beyond the elastic limit

  16. Self-Consistent Monte Carlo Study of the Coulomb Interaction under Nano-Scale Device Structures

    Science.gov (United States)

    Sano, Nobuyuki

    2011-03-01

    It has been pointed out that the Coulomb interaction between electrons is expected to be of crucial importance for predicting reliable device characteristics. In particular, device performance is greatly degraded by plasmon excitation, represented by dynamical potential fluctuations induced in the highly doped source and drain regions by the channel electrons. We employ self-consistent 3D Monte Carlo (MC) simulations, which reproduce both the correct mobility under various electron concentrations and the collective plasma waves, to study the physical impact of dynamical potential fluctuations on the performance of double-gate MOSFETs. The average force experienced by an electron due to the Coulomb interaction inside the device is evaluated by comparing the self-consistent MC simulations with fixed-potential MC simulations that exclude the Coulomb interaction. In addition, the band-tailing associated with local potential fluctuations in the highly doped source region is quantitatively evaluated, and it is found that the band-tailing becomes strongly dependent on position in real space even inside the uniform source region. This work was partially supported by Grants-in-Aid for Scientific Research B (No. 2160160) from the Ministry of Education, Culture, Sports, Science and Technology in Japan.

  17. Consistência interna da versão em português do Mini-Inventário de Fobia Social (Mini-SPIN) / Internal consistency of the Portuguese version of the Mini-Social Phobia Inventory (Mini-SPIN)

    Directory of Open Access Journals (Sweden)

    Gustavo J. Fonseca D'El Rey

    2007-01-01

    BACKGROUND: Social phobia is a severe anxiety disorder that causes disability and distress. OBJECTIVES: To investigate the internal consistency of the Portuguese version of the Mini-Social Phobia Inventory (Mini-SPIN). METHODS: We conducted a study of the internal consistency of the Mini-SPIN in a sample of 206 college students in the city of São Paulo, SP. RESULTS: The internal consistency of the instrument, assessed with Cronbach's alpha coefficient, was 0.81. CONCLUSIONS: These findings suggest that the Portuguese version of the Mini-SPIN has good internal consistency, similar to that of the original English version.
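    The Cronbach's alpha statistic reported in this record can be reproduced from any item-score matrix. A minimal sketch in Python (the scores below are hypothetical; the Mini-SPIN itself has three items rated 0-4):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 3 items, each item scored 0-4
scores = np.array([
    [0, 1, 0],
    [2, 2, 3],
    [4, 3, 4],
    [1, 1, 2],
    [3, 4, 3],
    [0, 0, 1],
])
alpha = cronbach_alpha(scores)
```

    Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is why the reported 0.81 is interpreted as good.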

  18. Simulation study on unfolding methods for diagnostic X-rays and mixed gamma rays

    International Nuclear Information System (INIS)

    Hashimoto, Makoto; Ohtaka, Masahiko; Ara, Kuniaki; Kanno, Ikuo; Imamura, Ryo; Mikami, Kenta; Nomiya, Seiichiro; Onabe, Hideaki

    2009-01-01

    A photon detector operating in current mode that can sense X-ray energy distribution has been reported. This detector consists of a row of several segment detectors. The energy distribution is derived using an unfolding technique. In this paper, comparisons among the error reduction, spectrum surveillance, and neural network unfolding methods are discussed through simulation studies on the detection of diagnostic X-rays and gamma rays emitted by a mixture of ¹³⁷Cs and ⁶⁰Co. For diagnostic X-ray measurement, the spectrum surveillance and neural network methods appeared promising, while the error reduction method yielded poor results. However, in the case of measuring mixtures of gamma rays, the error reduction method was both sufficient and effective. (author)
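    The unfolding step described above can be illustrated with a generic iterative scheme. The sketch below uses a multiplicative (MLEM-style) update that keeps the spectrum nonnegative; it is not the authors' error-reduction, spectrum-surveillance, or neural-network algorithm, and the response matrix and measurements are hypothetical:

```python
import numpy as np

def unfold(R, m, n_iter=500):
    """Iteratively recover a spectrum x from measurements m ≈ R @ x
    using a multiplicative update that preserves nonnegativity."""
    x = np.full(R.shape[1], m.sum() / R.sum())  # flat initial guess
    efficiency = R.sum(axis=0)                  # per-bin detection efficiency
    for _ in range(n_iter):
        est = R @ x                             # currents predicted by x
        est[est == 0] = 1e-12                   # guard against division by zero
        x *= (R.T @ (m / est)) / efficiency
    return x

# Hypothetical response of a 4-segment detector to 3 energy bins
R = np.array([[0.80, 0.30, 0.10],
              [0.20, 0.60, 0.20],
              [0.05, 0.30, 0.50],
              [0.01, 0.10, 0.60]])
x_true = np.array([5.0, 2.0, 1.0])
m = R @ x_true          # noiseless measured currents
x_est = unfold(R, m)
```

    With noiseless data and a well-conditioned response matrix the iteration converges toward the true bin contents; with real, noisy currents it is usually stopped early to avoid amplifying noise.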

  19. Study of nuclear reactions with the Skyrme interaction; static properties by the self-consistent method; dynamic properties by the generator-coordinate method

    International Nuclear Information System (INIS)

    Flocard, Hubert.

    1975-01-01

    Using the same effective interaction, which depends on only six parameters, a large number of nuclear properties are calculated and the results compared with experiment. Total binding energies of all nuclei in the mass table are reproduced within 5 MeV. It is shown that the remaining discrepancy is consistent with the increase in total binding energy that can be expected from the further inclusion of collective-motion correlations. The monopole, quadrupole and hexadecapole parts of the charge densities are also reproduced with good accuracy. The deformation energy curves of many nuclei, ranging from carbon to superheavy elements, are calculated, and the different features of these curves are discussed. In particular, the fission barriers of actinide nuclei have been obtained, and the results exhibit the well-known two-bump shape. In addition, the fusion energy curve of two ¹⁶O nuclei merging into one ³²S nucleus has been computed. Results concerning the monopole, dipole and quadrupole giant resonances of light nuclei, obtained within the framework of the generator coordinate method, are also presented. The calculated positions of these resonances agree well with presently available data

  20. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    A brief survey of research on the consistent dynamical and statistical description of fission is presented. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression following the Strutinsky method, as given by P. Fröbrich et al., are compared and analyzed. (2 figs.).

  1. Random Sampling of Correlated Parameters – a Consistent Solution for Unfavourable Conditions

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, G., E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Kodeli, I.A. [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Capote, R. [International Atomic Energy Agency, PO Box 100, A-1400 Vienna (Austria); Smith, D.L. [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States)

    2015-01-15

    Two methods for random sampling according to a multivariate lognormal distribution – the correlated sampling method and the method of transformation of correlation coefficients – are briefly presented. The methods are mathematically exact and enable consistent sampling of correlated inherently positive parameters with given information on the first two distribution moments. Furthermore, a weighted sampling method to accelerate the convergence of parameters with extremely large relative uncertainties is described. However, the method is efficient only for a limited number of correlated parameters.
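    The transformation-of-moments idea behind such samplers can be sketched as follows: the mean vector and covariance matrix of the lognormal parameters are mapped exactly onto the moments of the underlying multivariate normal before sampling. This is a generic illustration with hypothetical numbers, not the authors' implementation:

```python
import numpy as np

def sample_lognormal(mean, cov, n, seed=None):
    """Draw n samples of correlated, inherently positive parameters from a
    multivariate lognormal that exactly matches the given mean vector and
    covariance matrix (moment-transformation method)."""
    rng = np.random.default_rng(seed)
    mean = np.asarray(mean, dtype=float)
    cov = np.asarray(cov, dtype=float)
    # Exact moments of the underlying multivariate normal ln(X):
    # Sigma_ij = ln(1 + cov_ij / (mean_i * mean_j)), mu_i = ln(mean_i) - Sigma_ii / 2
    sigma = np.log1p(cov / np.outer(mean, mean))
    mu = np.log(mean) - 0.5 * np.diag(sigma)
    return np.exp(rng.multivariate_normal(mu, sigma, size=n))

# Hypothetical parameters: 20 % and 15 % relative uncertainty, positive correlation
mean = np.array([1.0, 2.0])
cov = np.array([[0.04, 0.02],
                [0.02, 0.09]])
samples = sample_lognormal(mean, cov, 200_000, seed=0)
```

    By construction every sample is positive, and the sample mean and covariance converge to the prescribed first two moments as the sample size grows.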

  2. Migraine patients consistently show abnormal vestibular bedside tests

    Directory of Open Access Journals (Sweden)

    Eliana Teixeira Maranhão

    2015-01-01

    Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and comorbidity of around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective: To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall-risk responses, as measured by 14 bedside tests, are abnormal in migraineurs without vertigo compared with controls. Method: Cross-sectional study including sixty individuals: thirty migraineurs (25 women, aged 19-60 years) and thirty gender- and age-paired healthy controls. Results: Migraineurs tended to perform worse in almost all tests, although only the tandem Romberg test differed statistically from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion: Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.

  3. Consistent momentum space regularization/renormalization of supersymmetric quantum field theories: the three-loop β-function for the Wess-Zumino model

    International Nuclear Information System (INIS)

    Carneiro, David; Sampaio, Marcos; Nemes, Maria Carolina; Scarpelli, Antonio Paulo Baeta

    2003-01-01

    We compute the three-loop β function of the Wess-Zumino model to motivate implicit regularization (IR) as a consistent and practical momentum-space framework for studying supersymmetric quantum field theories. In this framework, which works essentially in the physical dimension of the theory, we show that ultraviolet divergences are clearly disentangled from infrared divergences. We obtain consistent results which motivate the method as a good choice for studying supersymmetry anomalies in quantum field theories. (author)

  4. The Plumber’s Nightmare Phase in Diblock Copolymer/Homopolymer Blends. A Self-Consistent Field Theory Study.

    KAUST Repository

    Martinez-Veracoechea, Francisco J.

    2009-11-24

    Using self-consistent field theory, the Plumber's Nightmare and the double diamond phases are predicted to be stable in a finite region of phase diagrams for blends of AB diblock copolymer (DBC) and A-component homopolymer. To the best of our knowledge, this is the first time that the P phase has been predicted to be stable using self-consistent field theory. The stabilization is achieved by tuning the composition or conformational asymmetry of the DBC chain, and the architecture or length of the homopolymer. The basic features of the phase diagrams are the same in all cases studied, suggesting a general type of behavior for these systems. Finally, it is noted that the homopolymer length should be a convenient variable to stabilize bicontinuous phases in experiments. © 2009 American Chemical Society.

  5. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    International Nuclear Information System (INIS)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.

  6. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)

  7. Bootstrap consistency for general semiparametric M-estimation

    KAUST Repository

    Cheng, Guang

    2010-10-01

    Consider M-estimation in a semiparametric model that is characterized by a Euclidean parameter of interest and an infinite-dimensional nuisance parameter. As a general-purpose approach to statistical inference, the bootstrap has found wide application in semiparametric M-estimation and, because of its simplicity, provides an attractive alternative to inference based on asymptotic distribution theory. The purpose of this paper is to provide theoretical justifications for the use of the bootstrap as a semiparametric inferential tool. We show that, under general conditions, the bootstrap is asymptotically consistent in estimating the distribution of the M-estimate of the Euclidean parameter; that is, the bootstrap distribution asymptotically imitates the distribution of the M-estimate. We also show that the bootstrap confidence set has the asymptotically correct coverage probability. These general conclusions hold, in particular, when the nuisance parameter is not estimable at root-n rate, and apply to a broad class of bootstrap methods with exchangeable bootstrap weights. This paper provides a first general theoretical study of the bootstrap in semiparametric models. © Institute of Mathematical Statistics, 2010.
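    In practice, the bootstrap procedure whose consistency is established here amounts to resampling the data with replacement and re-evaluating the estimator. A minimal percentile-bootstrap sketch for a non-smooth M-estimate (the median), using synthetic data rather than anything from the paper:

```python
import numpy as np

def bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05, seed=None):
    """Percentile bootstrap confidence set: resample with replacement,
    re-estimate, and take the empirical quantiles of the estimates."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data)
    stats = np.array([
        estimator(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_boot)
    ])
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

# Synthetic sample whose true median is 3.0
x = np.random.default_rng(42).normal(loc=3.0, scale=1.0, size=300)
lo, hi = bootstrap_ci(x, np.median, seed=1)
```

    The bootstrap distribution of the re-estimated medians imitates the sampling distribution of the median, so the resulting interval has asymptotically correct coverage under the paper's conditions.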

  8. Studies of the Raman Spectra of Cyclic and Acyclic Molecules: Combination and Prediction Spectrum Methods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taijin; Assary, Rajeev S.; Marshall, Christopher L.; Gosztola, David J.; Curtiss, Larry A.; Stair, Peter C.

    2012-04-02

    A combination of Raman spectroscopy and density functional methods was employed to investigate the spectral features of selected molecules: furfural, 5-hydroxymethyl furfural (HMF), methanol, acetone, acetic acid, and levulinic acid. The computed spectra and measured spectra are in excellent agreement, consistent with previous studies. Using the combination and prediction spectrum method (CPSM), we were able to predict the important spectral features of two platform chemicals, HMF and levulinic acid. The results have shown that CPSM is a useful alternative method for predicting vibrational spectra of complex molecules in the biomass transformation process.

  9. Self-consistent theory of finite Fermi systems and radii of nuclei

    International Nuclear Information System (INIS)

    Saperstein, E. E.; Tolokonnikov, S. V.

    2011-01-01

    Present-day self-consistent approaches in nuclear theory were analyzed from the point of view of describing distributions of nuclear densities. The generalized method of the energy density functional due to Fayans and his coauthors (this is the most successful version of the self-consistent theory of finite Fermi systems) was the first among the approaches under comparison. The second was the most successful version of the Skyrme-Hartree-Fock method with the HFB-17 functional due to Goriely and his coauthors. Charge radii of spherical nuclei were analyzed in detail. Several isotopic chains of deformed nuclei were also considered. Charge-density distributions ρ ch (r) were calculated for several spherical nuclei. They were compared with model-independent data extracted from an analysis of elastic electron scattering on nuclei.

  10. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a nonnormative and constitutive internal branding process. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of this process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain...

  11. Study of Seed Germination by Soaking Method of Cacao (Theobroma cacao L.)

    Directory of Open Access Journals (Sweden)

    Sulistyani Pancaningtyas

    2014-12-01

    A study of germination methods was conducted to obtain information about seed viability based on germination rate, percentage of germination and vigour. Germination methods were studied to find an approach that is efficient and effective, easy to handle, and low-cost while yielding high vigour. The sand and gunny-sack germination methods require extensive space and a germination period of 3-4 days after planting. This research examined soaking as an alternative germination method, which can accelerate germination and use space effectively without decreasing the quality of cacao seedlings. The research was done at the Kaliwining Experimental Station, Indonesian Coffee and Cocoa Research Institute. It consisted of two experiments arranged in a factorial completely randomized design: the first compared germination rates, and the second compared seedling quality between the soaking and wet gunny-sack germination methods. The results showed that radicle length with the soaking method was greater than with the wet gunny-sack method. Radicle growth started from 2 hours after soaking; moreover, radicle length at 4 hours after soaking differed significantly from the gunny-sack method, reaching 3.69 mm at 24 hours after soaking versus 0.681 mm with the wet gunny-sack treatment. Except for hypocotyl length, there was no difference between seedlings obtained from the soaking and wet gunny-sack methods. Hypocotyl length at 36 hours after soaking was 9.15 cm, significantly different from the wet gunny-sack germination method at 5.40 cm. Keywords: seed germination, soaking method, Theobroma cacao L., cocoa seedlings

  12. Second-order perturbation theory with a density matrix renormalization group self-consistent field reference function: theory and application to the study of chromium dimer.

    Science.gov (United States)

    Kurashige, Yuki; Yanai, Takeshi

    2011-09-07

    We present a second-order perturbation theory based on a density matrix renormalization group self-consistent field (DMRG-SCF) reference function. The method reproduces the solution of the complete active space with second-order perturbation theory (CASPT2) when the DMRG reference function is represented by a sufficiently large number of renormalized many-body basis states, and is therefore named the DMRG-CASPT2 method. The DMRG-SCF is able to describe non-dynamical correlation with large active spaces that are intractable for the conventional CASSCF method, while the second-order perturbation theory provides an efficient description of dynamical correlation effects. The capability of our implementation is demonstrated in an application to the potential energy curve of the chromium dimer, one of the most demanding multireference systems, which requires the best electronic-structure treatment of non-dynamical and dynamical correlation as well as large basis sets. The DMRG-CASPT2/cc-pwCV5Z calculations were performed with a large (3d double-shell) active space consisting of 28 orbitals. Our approach using a large DMRG reference addressed the problems of why the dissociation energy is largely overestimated by CASPT2 with the small active space consisting of 12 orbitals (3d4s), and why it is oversensitive to the choice of the zeroth-order Hamiltonian. © 2011 American Institute of Physics

  13. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356

  14. A new self-consistent model for thermodynamics of binary solutions

    Czech Academy of Sciences Publication Activity Database

    Svoboda, Jiří; Shan, Y. V.; Fischer, F. D.

    2015-01-01

    Roč. 108, NOV (2015), s. 27-30 ISSN 1359-6462 R&D Projects: GA ČR(CZ) GA14-24252S Institutional support: RVO:68081723 Keywords : Thermodynamics * Analytical methods * CALPHAD * Phase diagram * Self-consistent model Subject RIV: BJ - Thermodynamics Impact factor: 3.305, year: 2015

  15. Classical and Quantum Consistency of the DGP Model

    CERN Document Server

    Nicolis, A; Nicolis, Alberto; Rattazzi, Riccardo

    2004-01-01

    We study the Dvali-Gabadadze-Porrati model by the method of the boundary effective action. The truncation of this action to the bending mode \\pi consistently describes physics in a wide range of regimes both at the classical and at the quantum level. The Vainshtein effect, which restores agreement with precise tests of general relativity, follows straightforwardly. We give a simple and general proof of stability, i.e. absence of ghosts in the fluctuations, valid for most of the relevant cases, like for instance the spherical source in asymptotically flat space. However we confirm that around certain interesting self-accelerating cosmological solutions there is a ghost. We consider the issue of quantum corrections. Around flat space \\pi becomes strongly coupled below a macroscopic length of 1000 km, thus impairing the predictivity of the model. Indeed the tower of higher dimensional operators which is expected by a generic UV completion of the model limits predictivity at even larger length scales. We outline ...

  16. Experimental investigation on consistency limits of cement and lime-stabilized marine sediments.

    Science.gov (United States)

    Wang, DongXing; Zentar, Rachid; Abriak, Nor Edine; Xu, WeiYa

    2012-06-01

    This paper presents the effects of treatments with cement and lime on the consistency limits of marine sediments dredged from Dunkirk port. The Casagrande percussion test and the fall cone test were used to determine the liquid limits of raw sediments and treated marine sediments. For the evaluation of the plastic limits, the results of the fall cone test were compared with those obtained by the rolling test method. The relationship between the water contents and the penetration depths for the determination of the liquid limit and the plastic limit was explored. Liquid limits at 15.5 mm and plastic limits at 1.55 mm seem to be a more appropriate choice for the studied marine sediments compared with the limits determined by other used prediction methods. Finally, the effect of cement treatment and lime treatment on the Casagrande classification of the studied sediments was investigated according to the different prediction results.
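    The evaluation of a liquid limit at a chosen penetration depth, as in the study's choice of 15.5 mm, can be sketched numerically, assuming the commonly used linear relation between water content and the logarithm of penetration depth. The fall-cone readings below are hypothetical, not data from the study:

```python
import numpy as np

def water_content_at(depths_mm, water_contents, target_mm):
    """Fit water content as a linear function of log penetration depth
    and evaluate the fitted line at the target depth."""
    slope, intercept = np.polyfit(np.log(depths_mm), water_contents, 1)
    return intercept + slope * np.log(target_mm)

# Hypothetical fall-cone readings: penetration depth (mm) vs water content (%)
depths = np.array([12.0, 14.5, 16.8, 19.9])
w = np.array([48.2, 51.5, 54.1, 57.0])
liquid_limit = water_content_at(depths, w, 15.5)  # evaluated at 15.5 mm
```

    Fitting in log-depth rather than interpolating between the two nearest readings uses all four measurements, which damps the effect of scatter in any single cone drop.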

  17. Diagnostic consistency and interchangeability of schizophrenic disorders and bipolar disorders: A 7-year follow-up study.

    Science.gov (United States)

    Hung, Yen-Ni; Yang, Shu-Yu; Kuo, Chian-Jue; Lin, Shih-Ku

    2018-03-01

    The change in psychiatric diagnoses in clinical practice is not an unusual phenomenon. The interchange between the diagnoses of schizophrenic disorders and bipolar disorders is a major clinical issue because of the differences in treatment regimens and long-term prognoses. In this study, we used a nationwide population-based sample to compare the diagnostic consistency and interchange rate between schizophrenic disorders and bipolar disorders. In total, 25 711 and 11 261 patients newly diagnosed as having schizophrenic disorder and bipolar disorder, respectively, were retrospectively enrolled from the Psychiatric Inpatient Medical Claims database between 2001 and 2005. We followed these two cohorts for 7 years to determine whether their diagnoses were consistent throughout subsequent hospitalizations. The interchange between the two diagnoses was analyzed. In the schizophrenic disorder cohort, the overall diagnostic consistency rate was 87.3% and the rate of change to bipolar disorder was 3.0% during the 7-year follow-up. Additional analyses of subtypes revealed that the change rate from schizoaffective disorder to bipolar disorder was 12.0%. In the bipolar disorder cohort, the overall diagnostic consistency rate was 71.9% and the rate of change to schizophrenic disorder was 8.3%. Changes in the diagnosis of a major psychosis are not uncommon. The interchange between the diagnoses of schizophrenic disorders and bipolar disorders might be attributed to the evolution of clinical symptoms and the observation of preserved social functions that contradict the original diagnosis. While making a psychotic diagnosis, clinicians should be aware of the possibility of the change in diagnosis in the future. © 2017 The Authors. Psychiatry and Clinical Neurosciences © 2017 Japanese Society of Psychiatry and Neurology.

  18. A consistent formulation of the finite element method for solving diffusive-convective transport problems

    International Nuclear Information System (INIS)

    Carmo, E.G.D. do; Galeao, A.C.N.R.

    1986-01-01

    A new method specially designed to solve highly convective transport problems is proposed. Using a variational approach it is shown that this weighted residual method belongs to a class of Petrov-Galerkin's approximation. Some examples are presented in order to demonstrate the adequacy of this method in predicting internal or external boundary layers. (Author) [pt

  19. Fully self-consistent GW calculations for molecules

    DEFF Research Database (Denmark)

    Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer

    2010-01-01

    We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier functions augmented by numerical atomic orbitals. The GW self-energy is calculated on the real frequency axis including its full frequency dependence and off-diagonal matrix elements. The mean absolute error of the ionization potential (IP) with respect to experiment is found to be 4.4, 2.6, 0.8, 0.4, and 0...

  20. The numerical multiconfiguration self-consistent field approach for atoms; Der numerische Multiconfiguration Self-Consistent Field-Ansatz fuer Atome

    Energy Technology Data Exchange (ETDEWEB)

    Stiehler, Johannes

    1995-12-15

    The dissertation uses the multiconfiguration self-consistent field approach to determine the electronic wave function of N-electron atoms in a static electric field. It presents numerical approaches to describe the wave functions and introduces new methods to solve the numerical Fock equations. Based on results computed with an implemented computer program, the universal applicability, flexibility and high numerical precision of the presented approach are shown. RHF results and, for the first time, MCSCF results for polarizabilities and hyperpolarizabilities of various states of the atoms He to Kr are discussed. In addition, an application to the interpretation of a plasma spectrum of gallium is presented. (orig.)

  1. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  2. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods.

    Science.gov (United States)

    Trijsburg, Laura; Geelen, Anouk; Hollman, Peter Ch; Hulshof, Paul Jm; Feskens, Edith Jm; Van't Veer, Pieter; Boshuizen, Hendriek C; de Vries, Jeanne Hm

    2017-03-01

    As misreporting, mostly under-reporting, of dietary intake is a generally known problem in nutritional research, we aimed to analyse the association between selected determinants and the extent of misreporting by the duplicate portion method (DP), 24 h recall (24hR) and FFQ by linear regression analysis using the biomarker values as unbiased estimates. For each individual, two DP, two 24hR, two FFQ and two 24 h urinary biomarkers were collected within 1·5 years. Also, for sixty-nine individuals one or two doubly labelled water measurements were obtained. The associations of basic determinants (BMI, gender, age and level of education) with misreporting of energy, protein and K intake of the DP, 24hR and FFQ were evaluated using linear regression analysis. Additionally, associations between other determinants, such as physical activity and smoking habits, and misreporting were investigated. The Netherlands. One hundred and ninety-seven individuals aged 20-70 years. Higher BMI was associated with under-reporting of dietary intake assessed by the different dietary assessment methods for energy, protein and K, except for K by DP. Men tended to under-report protein by the DP, FFQ and 24hR, and persons of older age under-reported K but only by the 24hR and FFQ. When adjusted for the basic determinants, the other determinants did not show a consistent association with misreporting of energy or nutrients and by the different dietary assessment methods. As BMI was the only consistent determinant of misreporting, we conclude that BMI should always be taken into account when assessing and correcting dietary intake.

  3. Study of Y and Lu iron garnets using Bethe-Peierls-Weiss method

    Science.gov (United States)

    Goveas, Neena; Mukhopadhyay, G.; Mukhopadhyay, P.

    1994-11-01

    We study here the magnetic properties of Y- and Lu-iron garnets using the Bethe-Peierls-Weiss method, modified to suit complex systems like these garnets. We consider these garnets as described by a Heisenberg Hamiltonian with two sublattices (a, d) and determine the exchange interaction parameters Jad, Jaa and Jdd by matching the experimental susceptibility curves. We find Jaa and Jdd to be much smaller than those determined by Néel theory, and consistent with those obtained from the study of spin wave spectra; the spin wave dispersion constant obtained using these parameters gives good agreement with the experimental values.

  4. A study on manufacturing and construction method of buffer

    International Nuclear Information System (INIS)

    Chijimatsu, Masakazu; Sugita, Yutaka; Amemiya, Kiyoshi

    1999-09-01

    As an engineered barrier system in the geological disposal of high-level waste, a multibarrier system is considered, consisting of the vitrified waste, the overpack and the buffer. Bentonite is a potential buffer material because of its low water permeability, self-sealing properties, radionuclide adsorption and retardation properties, thermal conductivity, chemical buffering properties, overpack supporting properties, stress buffering properties, etc. Many experiments have been conducted to evaluate these functions of the buffer. The evaluations are based on the assumption that the buffer is emplaced or constructed properly in the disposal tunnel (or disposal pit). Therefore, it is necessary to study the manufacturing and construction method of the buffer. As manufacturing/construction technologies for the buffer, the block installation method and the in-situ compaction method, among others, are being investigated. The block installation method emplaces buffer blocks manufactured in advance at the ground facility; its underground construction processes will be simpler than those of the in-situ compaction method. The in-situ compaction method, on the other hand, introduces buffer material with a specified water content into the disposal tunnel and compacts it to high density at the site using a compaction machine. For the in-situ compaction method, it is necessary to investigate the optimum finished thickness of one layer because the buffer cannot be constructed in a single lift. This report describes the results of a compaction property test and summarizes past investigation results concerning the manufacturing/construction method. It then presents the construction method that will be feasible at an actual disposal site. (J.P.N.)

  5. Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.

    Science.gov (United States)

    Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P

    2015-10-01

    The human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic properties. Simulation is performed on a 3D lung geometry reconstructed from a four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data which collectively provide a consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
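    The coupling step above rests on Tikhonov regularization, which trades off data fidelity against a penalty on the solution. As a hedged illustration only (a generic ridge-style solve with hypothetical names, not the authors' lung-specific implementation):

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam * ||x||^2 via the normal equations.

    A and b encode the data-fidelity term; lam > 0 weights the regularizer.
    (Illustrative sketch; the paper's TR coupling of CFD and DIR fields is
    more elaborate.)
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Toy check: with A = I and lam = 1, the solution is simply b / 2.
x = tikhonov_solve(np.eye(2), np.array([2.0, 3.0]), 1.0)
# x == [1.0, 1.5]
```

    Larger lam pulls the solution toward zero (or, in fused-displacement settings, toward the prior field); lam → 0 recovers the ordinary least-squares fit.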

  6. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Science.gov (United States)

    Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P

    2015-01-01

    This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  7. Facial Mimicry and Emotion Consistency: Influences of Memory and Context.

    Directory of Open Access Journals (Sweden)

    Alexander J Kirkham

    Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.

  8. Orthographic Consistency Affects Spoken Word Recognition at Different Grain-Sizes

    Science.gov (United States)

    Dich, Nadya

    2014-01-01

    A number of previous studies found that the consistency of sound-to-spelling mappings (feedback consistency) affects spoken word recognition. In auditory lexical decision experiments, words that can only be spelled one way are recognized faster than words with multiple potential spellings. Previous studies demonstrated this by manipulating…

  9. Consistency of direct microscopic examination and ELISA in detection of Giardia in stool specimen among children

    Directory of Open Access Journals (Sweden)

    Zohreh Torabi

    2014-09-01

    Full Text Available Objective: To investigate the consistency of direct microscopic examination and ELISA for determination of Giardia in stool specimens. Method: The study population consisted of children with any clinical symptoms of Giardia infestation during the previous two weeks. A fresh stool specimen was collected from each child. The stool specimens were assessed by two methods: direct microscopic examination and ELISA. The degree of agreement between direct stool exam and ELISA was calculated by Cohen's kappa coefficient. Results: In this study, 124 children with an age range of 2-12 years were investigated. A total of 64 (61.7%) and 79 (65.7%) of the children had Giardia by direct stool exam and ELISA test, respectively. There was an association between frequency of constipation and Giardia infection (P=0.036). The Cohen's kappa coefficient calculated for the degree of agreement between direct stool exam and ELISA was κ=0.756 (P<0.001). Conclusions: The frequency of Giardia infection in symptomatic children was high and there was a high rate of agreement between ELISA and direct stool smear.
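    The agreement statistic used here, Cohen's kappa, compares observed agreement with the agreement expected by chance from the marginal totals. A minimal sketch of the computation for a 2×2 table (the counts below are hypothetical; the study's raw cross-tabulation is not given in the abstract):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table.

    a: both tests positive, b: test1+/test2-, c: test1-/test2+,
    d: both tests negative.
    """
    n = a + b + c + d
    p_o = (a + d) / n                                   # observed agreement
    p_e = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts for illustration:
k = cohens_kappa(20, 5, 10, 15)
# k == 0.4
```

    Kappa is 1.0 for perfect agreement and 0 when agreement equals the chance level; values above roughly 0.6-0.8 are conventionally read as substantial agreement, consistent with how κ=0.756 is interpreted above.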

  10. Study of Mechanisms for Development and Strengthening of Water User Cooperatives (Case Study of Aras River Basin): Application of AHP Method

    OpenAIRE

    Rohallah maghabl

    2014-01-01

    Water user cooperatives were formed due to consideration to people's empowerment and participation in water investment and management. The purpose of this study was to investigate the mechanisms of development and strengthening of water user cooperatives in the Aras River Basin. The study population consisted of the management board members of the water user cooperatives in the Aras Basin in the year 2012. Respondents were selected by purposeful stratified sampling method. Having the data col...

  11. Investigating Connectivity and Consistency Criteria for Phrase Pair Extraction in Statistical Machine Translation

    NARCIS (Netherlands)

    Martzoukos, S.; Costa Florêncio, C.; Monz, C.; Kornai, A.; Kuhlmann, M.

    2013-01-01

    The consistency method has been established as the standard strategy for extracting high quality translation rules in statistical machine translation (SMT). However, no attention has been drawn to why this method is successful, other than empirical evidence. Using concepts from graph theory, we

  12. The Enhancement of Consistency of Interpretation Skills on the Newton’s Laws Concept

    Directory of Open Access Journals (Sweden)

    Yudi Kurniawan

    2018-03-01

    Full Text Available Conceptual understanding is more important for students to attain than test achievement alone, and interpretation skill is one aspect of conceptual understanding. The aim of this paper is to assess the consistency of students' interpretation skill and, at the same time, to measure its level of improvement. These variables were taught through common-sense Interactive Lecture Demonstrations (ILD). The method of this research is pre-experimental research with a one-group pretest-posttest design. The sample was taken by cluster random sampling. The results show that 16% of all students have perfectly consistent interpretation skill, and that 84% improved their interpretation skill from "unknown" to "understood". This finding could be used by future researchers to study other aspects of conceptual understanding.

  13. Spectrally Consistent Satellite Image Fusion with Improved Image Priors

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.

    2006-01-01

    Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions about the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.

  14. STP: A mathematically and physically consistent library of steam properties

    International Nuclear Information System (INIS)

    Aguilar, F.; Hutter, A.C.; Tuttle, P.G.

    1982-01-01

    A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package

  15. Measuring consistency of autobiographical memory recall in depression.

    LENUS (Irish Health Repository)

    Semkovska, Maria

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.

  16. A simple, sufficient, and consistent method to score the status of threats and demography of imperiled species

    Directory of Open Access Journals (Sweden)

    Jacob W. Malcom

    2016-07-01

    Full Text Available Managers of large, complex wildlife conservation programs need information on the conservation status of each of many species to help strategically allocate limited resources. Oversimplifying status data, however, runs the risk of missing information essential to strategic allocation. Conservation status consists of two components, the status of threats a species faces and the species' demographic status. Neither component alone is sufficient to characterize conservation status. Here we present a simple key for scoring threat and demographic changes for species using detailed information provided in free-form textual descriptions of conservation status. This key is easy to use (simple), captures the two components of conservation status without the cost of more detailed measures (sufficient), and can be applied by different personnel to any taxon (consistent). To evaluate the key's utility, we performed two analyses. First, we scored the threat and demographic status of 37 species recently recommended for reclassification under the Endangered Species Act (ESA) and 15 control species, then compared our scores to two metrics used for decision-making and reports to Congress. Second, we scored the threat and demographic status of all non-plant ESA-listed species from Florida (54 spp.), and evaluated scoring repeatability for a subset of those. While the metrics reported by the U.S. Fish and Wildlife Service (FWS) are often consistent with our scores in the first analysis, the results highlight two problems with the oversimplified metrics. First, we show that both metrics can mask underlying demographic declines or threat increases; for example, ∼40% of species not recommended for reclassification had changes in threats or demography. Second, we show that neither metric is consistent with either threats or demography alone, but conflates the two. The second analysis illustrates how the scoring key can be applied to a substantial set of species to

  17. The q-deformed mKP hierarchy with self-consistent sources, Wronskian solutions and solitons

    International Nuclear Information System (INIS)

    Lin Runliang; Peng Hua; Manas, Manuel

    2010-01-01

    Based on the eigenfunction symmetry constraint of the q-deformed modified KP hierarchy, a q-deformed mKP hierarchy with self-consistent sources (q-mKPHSCSs) is constructed. The q-mKPHSCSs contain two types of q-deformed mKP equation with self-consistent sources. By the combination of the dressing method and the method of variation of constants, a generalized dressing approach is proposed to solve the q-deformed KP hierarchy with self-consistent sources (q-KPHSCSs). Using the gauge transformation between the q-KPHSCSs and the q-mKPHSCSs, the q-deformed Wronskian solutions for the q-KPHSCSs and the q-mKPHSCSs are obtained. The one-soliton solutions for the q-deformed KP (mKP) equation with a source are given explicitly.

  18. Relative amplitude preservation processing utilizing surface consistent amplitude correction. Part 4; Surface consistent amplitude correction wo mochiita sotai shinpuku hozon shori. 4

    Energy Technology Data Exchange (ETDEWEB)

    Saeki, T [Japan National Oil Corp., Tokyo (Japan). Technology Research Center

    1997-10-22

    Discussions were given on seismic exploration from the ground surface using the reflection method, specifically on surface consistent amplitude correction among the effects imposed by the ground surface and the surface layer. The amplitude distribution in the reflection wave zone is complex. Therefore, multiple items must be considered in an analysis, such as estimation of the spherical divergence effect and the exponential attenuation effect, not only the amplitude change through the surface layer. Taking all of these items into consideration makes the workload excessive. One conceivable way to solve this problem is to utilize the first-arrival amplitude of the diffraction wave. The distribution of the first-arrival amplitude of the diffraction wave shows values relatively close to the distribution over the vibration transmitting and receiving points. This is thought to be because the characteristics of the transmitting and receiving points, related to wave paths near the ground surface, do not differ greatly between diffraction waves and reflection waves. The lecture described in this paper introduces an attempt to improve the efficiency of surface consistent amplitude correction by analyzing the first-arrival amplitude of the diffraction wave. 4 refs., 2 figs.

  19. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that interface consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  20. A consistent description of kinetics and hydrodynamics of quantum Bose-systems

    Directory of Open Access Journals (Sweden)

    P.A.Hlushak

    2004-01-01

    Full Text Available A consistent approach to the description of kinetics and hydrodynamics of many-Boson systems is proposed. The generalized transport equations for strongly and weakly nonequilibrium Bose systems are obtained. Here we use the method of nonequilibrium statistical operator by D.N. Zubarev. New equations for the time distribution function of the quantum Bose system with a separate contribution from both the kinetic and potential energies of particle interactions are obtained. The generalized transport coefficients are determined accounting for the consistent description of kinetic and hydrodynamic processes.

  1. Evaluating the Consistency of the FNA Test in Pathologically Proven Nodules of Thyroidectomy

    Directory of Open Access Journals (Sweden)

    Alireza Khazaei

    2018-02-01

    Full Text Available Fine Needle Aspiration (FNA) is a selective diagnostic technique for the evaluation of non-toxic thyroid nodules. Thyroid FNA results are either undiagnosed or suspicious and indeterminate in 20-30% of cases. Therefore, this study seeks to determine the consistency of the FNA test in pathologically proven nodules of thyroidectomy. This is a descriptive cross-sectional study carried out on a total of 73 candidates for thyroidectomy who had been admitted to Imam Ali Hospital. A census sampling method was used in this study. The FNA samples and pathology samples were evaluated, and the consistency of the FNA test with the pathologically proven nodules was compared. The SPSS software was used for data analysis. The mean age of the patients was 40.1 ± 12.9 years; 23.3% of the participants were male and 76.7% were female. The malignancy rate was 65.8% (48 cases) in the pathology and 53.4% (39 cases) in the FNA. Of the 48 pathology-positive cases, FNA diagnosed 35 cases (72.9%) as positive and 13 cases (27.1%) as negative. Of the 25 pathology-negative cases, FNA diagnosed 21 cases (84%) as negative and 4 cases (16%) as positive. Sensitivity, specificity, positive and negative predictive values of FNA in malignancy diagnosis were 72.92, 84, 89.74, and 61.76%, respectively. The results show that FNA does not have high sensitivity in the diagnosis of malignancy, but has good specificity, and the use of other diagnostic methods before the operation of thyroid nodules seems necessary.
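    The four reported indices follow directly from the 2×2 counts in the abstract (35 true positives, 13 false negatives, 4 false positives, 21 true negatives, taking pathology as the reference standard). A minimal sketch of the standard definitions (the function name is illustrative):

```python
def diagnostic_indices(tp, fn, fp, tn):
    """Standard screening-test indices from a 2x2 confusion matrix,
    expressed as percentages."""
    return {
        "sensitivity": 100 * tp / (tp + fn),  # true positive rate
        "specificity": 100 * tn / (tn + fp),  # true negative rate
        "ppv": 100 * tp / (tp + fp),          # positive predictive value
        "npv": 100 * tn / (tn + fn),          # negative predictive value
    }

# Counts from the study: 48 pathology-positive (35 FNA+, 13 FNA-),
# 25 pathology-negative (21 FNA-, 4 FNA+).
idx = diagnostic_indices(tp=35, fn=13, fp=4, tn=21)
# Rounds to: sensitivity 72.92, specificity 84.00, ppv 89.74, npv 61.76
```

    Note that, unlike sensitivity and specificity, the predictive values depend on the 65.8% malignancy prevalence in this surgical sample and would shift in a population with fewer malignant nodules.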

  2. Hybrid method for consistent model of the Pacific absolute plate motion and a test for inter-hotspot motion since 70Ma

    Science.gov (United States)

    Harada, Y.; Wessel, P.; Sterling, A.; Kroenke, L.

    2002-12-01

    Inter-hotspot motion within the Pacific plate is one of the most controversial issues in recent geophysical studies. However, it is a fact that many geophysical and geological data including ages and positions of seamount chains in the Pacific plate can largely be explained by a simple model of absolute motion derived from assumptions of rigid plates and fixed hotspots. Therefore we take the stand that if a model of plate motion can explain the ages and positions of Pacific hotspot tracks, inter-hotspot motion would not be justified. On the other hand, if any discrepancies between the model and observations are found, the inter-hotspot motion may then be estimated from these discrepancies. To make an accurate model of the absolute motion of the Pacific plate, we combined two different approaches: the polygonal finite rotation method (PFRM) by Harada and Hamano (2000) and the hot-spotting technique developed by Wessel and Kroenke (1997). The PFRM can determine accurate positions of finite rotation poles for the Pacific plate if the present positions of hotspots are known. On the other hand, the hot-spotting technique can predict present positions of hotspots if the absolute plate motion is given. Therefore we can undertake iterative calculations using the two methods. This hybrid method enables us to determine accurate finite rotation poles for the Pacific plate solely from geometry of Hawaii, Louisville and Easter(Crough)-Line hotspot tracks from around 70 Ma to present. Information of ages can be independently assigned to the model after the poles and rotation angles are determined. We did not detect any inter-hotspot motion from the geometry of these Pacific hotspot tracks using this method. The Ar-Ar ages of Pacific seamounts including new age data of ODP Leg 197 are used to test the newly determined model of the Pacific plate motion. 
The ages of Hawaii, Louisville, Easter(Crough)-Line, and Cobb hotspot tracks are quite consistent with each other from 70 Ma to

  3. Consistent three-equation model for thin films

    Science.gov (United States)

    Richard, Gael; Gisclon, Marguerite; Ruyer-Quil, Christian; Vila, Jean-Paul

    2017-11-01

    Numerical simulations of thin films of Newtonian fluids down an inclined plane use reduced models for computational cost reasons. These models are usually derived by averaging the physical equations of fluid mechanics over the fluid depth, with an asymptotic method in the long-wave limit. Two-equation models are based on the mass conservation equation and either on the momentum balance equation or on the work-energy theorem. We show that there is no two-equation model that is both consistent and theoretically coherent, and that a third variable and a three-equation model are required to solve all theoretical contradictions. The linear and nonlinear properties of two- and three-equation models are tested on various practical problems. We present a new consistent three-equation model with a simple mathematical structure which allows an easy and reliable numerical resolution. The numerical calculations agree fairly well with experimental measurements or with direct numerical resolutions for neutral stability curves, speeds of kinematic waves and of solitary waves, and depth profiles of wavy films. The model can also predict the flow reversal at the first capillary trough ahead of the main wave hump.

  4. Study of consistency properties of instillation liniment-gel for therapy of pyoinflammatory diseases of maxillofacial region

    Directory of Open Access Journals (Sweden)

    A. V. Kurinnoy

    2012-12-01

    Full Text Available Using a «Reotest 2» rotary viscometer, the consistency properties of an instillation gel-liniment for antimicrobial therapy of pyoinflammatory diseases of the maxillofacial area were studied. It was found that the consistency properties of the gel-liniment lie within the limits of the rheological optimum of consistency for ointments, and the value of «mechanical stability» (1.33) characterizes the system as exceptionally thixotropic, providing recoverability of the system after loading; this allows the stability of the consistency properties of the gel-liniment during prolonged storage to be predicted.

  5. Diagnostic language consistency among multicultural English-speaking nurses.

    Science.gov (United States)

    Wieck, K L

    1996-01-01

    Cultural differences among nurses may influence the choice of terminology applicable to the use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English language setting. Two diagnoses, pain and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor supple skin.

  6. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented

  7. Potential application of the consistency approach for vaccine potency testing.

    Science.gov (United States)

    Arciniega, J; Sirota, L A

    2012-01-01

    The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows if the vaccine meets the specification has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
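    The notion of process capability invoked above can be made concrete with a standard index. As a minimal sketch (not the paper's actual computation), assuming approximately normally distributed potency values and illustrative specification limits, the two-sided index Cpk compares the distance from the process mean to the nearer specification limit against three process standard deviations:

    ```python
    import statistics

    def cpk(values, lsl, usl):
        """Two-sided process capability index Cpk.

        Compares the distance from the process mean to the nearer
        specification limit against three sample standard deviations.
        Assumes the output is approximately normally distributed.
        """
        mean = statistics.mean(values)
        sd = statistics.stdev(values)  # sample standard deviation
        return min(usl - mean, mean - lsl) / (3 * sd)

    # Hypothetical potency measurements and specification limits
    potencies = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0]
    print(round(cpk(potencies, lsl=9.0, usl=11.0), 2))  # → 2.36
    ```

    A Cpk comfortably above 1.33 is conventionally read as a capable process; as the abstract notes, however, such indices presuppose normality, which fails for a pass/fail single-dilution test.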

  8. Consistent lithological units and its influence on geomechanical stratification in shale reservoir: case study from Baltic Basin, Poland.

    Science.gov (United States)

    Pachytel, Radomir; Jarosiński, Marek; Bobek, Kinga

    2017-04-01

    Geomechanical investigations in shale reservoirs are crucial to understanding rock behavior during hydraulic fracturing treatment and to solving borehole wall stability problems. Anisotropy should be considered a key mechanical parameter when characterizing shale properties at a variety of scales. We are developing a step-by-step approach to characterize and upscale Consistent Lithological Units (CLU) at several scales of analysis. We concluded that the most regional-scale model, comparable to lithostratigraphic formations, is too general for hydraulic fracture propagation studies, so a more detailed description is needed. The hierarchic CLU model aims to upscale elastic properties and their anisotropy based on available data from a vertical borehole. For the purpose of our study we have access to a continuous borehole core profile and a full set of geophysical logs from several wells in the Pomeranian part of the Ordovician and Silurian shale complex belonging to the Baltic Basin. We focus on shale properties that may be crucial for the mechanical response to hydraulic fracturing: mineral components, porosity, density, elastic parameters and natural fracture pattern. To prepare a precise CLU model we compare several methods of determining and upscaling every parameter used for consistent unit selection. Mineralogical data taken from ULTRA logs, GEM logs, X-ray diffraction and X-ray fluorescence were compared with Young's modulus from sonic logs and triaxial compressive strength tests. The results showed that increases in clay content and porosity reduce Young's modulus, while carbonates (both calcite and dolomite) have a stronger impact on elastic modulus growth than quartz, represented here mostly by detrital particles. Comparing shales of similar composition in a few wells at different depths, we concluded that differences in diagenesis and compaction due to variation in formation depth in a range of 1 km have negligible

  9. Research progress of MRI in preoperative evaluation of pituitary adenoma's consistency

    International Nuclear Information System (INIS)

    Lu Yiping; Yin Bo; Geng Daoying

    2013-01-01

    As the most common primary disease of the pituitary fossa, pituitary adenoma ranks third in incidence among primary brain tumors. To remove resectable pituitary adenomas, there are two surgical approaches: trans-sphenoidal endoscopic surgery and craniotomy. Which approach is used depends on the size, invasive extension and consistency of the tumor. Trans-sphenoidal endoscopic surgery is more suitable for tumors with soft consistency, which are easy to remove, while craniotomy is suitable for hard ones. Preoperative evaluation of tumor consistency can therefore help in choosing the best surgical approach and treatment. MRI is not only an ideal method for showing brain structure but can also be used to evaluate tumor consistency. This review describes the mechanisms underlying the different consistencies of pituitary adenomas and the research progress in evaluating consistency. (authors)

  10. THE STUDY OF SOCIAL REPRESENTATIONS BY THE VIGNETTE METHOD: A QUANTITATIVE INTERPRETATION

    Directory of Open Access Journals (Sweden)

    Ж В Пузанова

    2017-12-01

    The article focuses on the prospects of the vignette method in empirical sociology, a good alternative to conventional mass survey methods. The article consists of several sections differing in focus. The vignette method is not popular among Russian scientists but has a long history abroad. A wide range of problems can be addressed by this method (e.g. the prospects for guardianship and its evaluation, international students' adaptation to the educational system, social justice studies, marketing and business research, etc.). The vignette method can be used to study difficult problems, including sensitive questions (e.g. HIV, drugs, psychological trauma, etc.), because it is a projective technique. Projective techniques allow researchers to obtain more reliable information, because respondents project one situation onto another while responding to their own stimulus. The article considers the advantages and disadvantages of the method and provides information about its limitations. It presents the key principles for designing and developing vignettes depending on the research type. The authors provide examples of their own vignettes tested in the course of their own empirical research. They highlight the advantages of logical-combinatorial approaches (especially the JSM method with its dichotomy) for the analysis of data in quantitative research. They also consider a "stepping" technique of analysis, in which the respondent receives new information step by step, extending his or her previous knowledge.

  11. Measuring consistency of autobiographical memory recall in depression.

    Science.gov (United States)

    Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M

    2012-05-15

    Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
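    Retrieval consistency of the kind measured here is commonly expressed as the percentage of initially reported memory items reproduced at follow-up. A toy sketch of such a score (an illustration of the general idea, not the CAMI-SF's actual scoring rules):

    ```python
    def retrieval_consistency(baseline, follow_up):
        """Percentage of baseline memory items reproduced at follow-up.

        `baseline` and `follow_up` are collections of item identifiers.
        A simple illustration of a consistency score, not the CAMI-SF
        scoring algorithm.
        """
        baseline = set(baseline)
        if not baseline:
            raise ValueError("baseline must contain at least one item")
        kept = baseline & set(follow_up)
        return 100.0 * len(kept) / len(baseline)

    # Hypothetical item lists from two interview sessions
    t1 = ["school", "wedding", "first job", "holiday"]
    t2 = ["school", "wedding", "holiday"]
    print(retrieval_consistency(t1, t2))  # → 75.0
    ```

    A score like this drops for healthy controls as well, which is why the study's normative 6-month data matter when attributing consistency loss to ECT.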

  12. Consistent initial conditions for the Saint-Venant equations in river network modeling

    Directory of Open Access Journals (Sweden)

    C.-W. Yu

    2017-09-01

    Initial conditions for flows and depths (cross-sectional areas) throughout a river network are required for any time-marching (unsteady) solution of the one-dimensional (1-D) hydrodynamic Saint-Venant equations. For a river network modeled with several Strahler orders of tributaries, comprehensive and consistent synoptic data are typically lacking and synthetic starting conditions are needed. Because of underlying nonlinearity, poorly defined or inconsistent initial conditions can lead to convergence problems and long spin-up times in an unsteady solver. Two new approaches are defined and demonstrated herein for computing flows and cross-sectional areas (or depths). These methods can produce an initial condition data set that is consistent with modeled landscape runoff and river geometry boundary conditions at the initial time. These new methods are (1) the pseudo time-marching method (PTM), which iterates toward a steady-state initial condition using an unsteady Saint-Venant solver, and (2) the steady-solution method (SSM), which uses graph theory for initial flow rates and solves a steady-state 1-D momentum equation for the channel cross-sectional areas. The PTM is shown to be adequate for short river reaches but is significantly slower and has occasional non-convergent behavior for large river networks. The SSM approach is shown to provide a rapid solution of consistent initial conditions for both small and large networks, albeit with the requirement that additional code must be written rather than applying an existing unsteady Saint-Venant solver.
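    The SSM's graph-theory step for initial flow rates can be illustrated with a toy sketch: treat the network as a directed acyclic graph of reaches and set each reach's initial flow to its local runoff plus the sum of all upstream flows (reach names and runoff values below are hypothetical):

    ```python
    def initial_flows(downstream, runoff):
        """Steady-state initial flow per reach: local runoff plus all inflows.

        `downstream` maps each reach to the reach it drains into (None at
        the outlet); `runoff` maps each reach to its local runoff
        contribution. The network must be acyclic (a tree here).
        """
        # Collect the upstream neighbours of every reach
        upstream = {r: [] for r in runoff}
        for reach, dest in downstream.items():
            if dest is not None:
                upstream[dest].append(reach)

        flows = {}
        def flow(reach):
            # Memoized recursion down the drainage graph
            if reach not in flows:
                flows[reach] = runoff[reach] + sum(flow(u) for u in upstream[reach])
            return flows[reach]

        for reach in runoff:
            flow(reach)
        return flows

    # Hypothetical four-reach network: A and B join at C, which drains to D
    downstream = {"A": "C", "B": "C", "C": "D", "D": None}
    runoff = {"A": 1.0, "B": 1.0, "C": 1.0, "D": 1.0}
    print(initial_flows(downstream, runoff))  # outlet D accumulates the whole network
    ```

    Flows built this way are mass-consistent with the runoff boundary condition at the initial time, which is the property the SSM exploits before solving the steady momentum equation for areas.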

  13. Is Consumer Response to Plain/Standardised Tobacco Packaging Consistent with Framework Convention on Tobacco Control Guidelines? A Systematic Review of Quantitative Studies

    Science.gov (United States)

    Stead, Martine; Moodie, Crawford; Angus, Kathryn; Bauld, Linda; McNeill, Ann; Thomas, James; Hastings, Gerard; Hinds, Kate; O'Mara-Eves, Alison; Kwan, Irene; Purves, Richard I.; Bryce, Stuart L.

    2013-01-01

    Background and Objectives Standardised or ‘plain’ tobacco packaging was introduced in Australia in December 2012 and is currently being considered in other countries. The primary objective of this systematic review was to locate, assess and synthesise published and grey literature relating to the potential impacts of standardised tobacco packaging as proposed by the guidelines for the international Framework Convention on Tobacco Control: reduced appeal, increased salience and effectiveness of health warnings, and more accurate perceptions of product strength and harm. Methods Electronic databases were searched and researchers in the field were contacted to identify studies. Eligible studies were published or unpublished primary research of any design, issued since 1980 and concerning tobacco packaging. Twenty-five quantitative studies reported relevant outcomes and met the inclusion criteria. A narrative synthesis was conducted. Results Studies that explored the impact of package design on appeal consistently found that standardised packaging reduced the appeal of cigarettes and smoking, and was associated with perceived lower quality, poorer taste and less desirable smoker identities. Although findings were mixed, standardised packs tended to increase the salience and effectiveness of health warnings in terms of recall, attention, believability and seriousness, with effects being mediated by the warning size, type and position on pack. Pack colour was found to influence perceptions of product harm and strength, with darker coloured standardised packs generally perceived as containing stronger tasting and more harmful cigarettes than fully branded packs; lighter coloured standardised packs suggested weaker and less harmful cigarettes. Findings were largely consistent, irrespective of location and sample. Conclusions The evidence strongly suggests that standardised packaging will reduce the appeal of packaging and of smoking in general; that it will go some way

  14. Social supports and mental health: a cross-sectional study on the correlation of self-consistency and congruence in China.

    Science.gov (United States)

    Gu, YanMei; Hu, Jie; Hu, YaPing; Wang, JianRong

    2016-06-28

    Psychosocial job characteristics require nursing staff with high self-consistency and good mental health. However, attention to and effort in such study remain very limited in China. A self-administered questionnaire was distributed to bedside nurses at an affiliated hospital of Hebei Medical University, China. Of 218 registered bedside nurses eligible to participate in the anonymous survey, 172 provided usable data, an effective response rate of 79%. The Social Support Rating Scale was used to measure social support, and the Self-Consistency and Congruence Scale was used to measure mental health. Compared with a normal reference group of college students, the sample group had higher self-flexibility scores and lower self-conflict and self-stereotype scores, with statistical significance for the self-conflict scores. Close correlations were observed between participants' social support and Self-Consistency and Congruence Scale scores. Differences in Social Support Rating Scale scores were significant across demographic features including years of work, marital status, only-child family status, and level of cooperation with other health workers. Bedside nurses in this study show good inner harmony, and their self-consistency and congruence closely correlate with levels of social support. Thus, it is essential to improve inner perception of support and external factors, such as workplace support, and to offer a beneficial social environment that improves bedside nurses' sub-health symptoms and decreases the high turnover rate.

  15. Self-consistency and coherent effects in nonlinear resonances

    International Nuclear Information System (INIS)

    Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.

    2003-01-01

    The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping

  16. A study on basic theory for CDCC method for three-body model of deuteron scattering

    International Nuclear Information System (INIS)

    Kawai, Mitsuji

    1988-01-01

    Recent studies have revealed that the CDCC method is valid for treating the breakup process involved in deuteron scattering on the basis of a three-body model. However, theoretical support has not been fully developed for this method. The present study aims to determine whether a solution by the CDCC method can be obtained 'correctly' from a 'realistic' model Hamiltonian for deuteron scattering. Some researchers have recently pointed out problems with the conventional CDCC calculation procedure in view of general scattering theory, associated with the asymptotic forms of the wave functions, the convergence of calculations, and the boundary conditions. Consideration shows that the problem with the asymptotic forms of the wave function is not a fatal defect, though some compromise is necessary. The problem with the convergence of calculations is not very serious either. The handling of boundary conditions is discussed. Thus, the present study indicates that the CDCC method can be applied satisfactorily to actual deuteron scattering, and that the model wave function for the CDCC method is consistent with the model Hamiltonian. (Nogami, K.)

  17. Self-consistent Ginzburg-Landau theory for transport currents in superconductors

    DEFF Research Database (Denmark)

    Ögren, Magnus; Sørensen, Mads Peter; Pedersen, Niels Falsig

    2012-01-01

    We elaborate on boundary conditions for Ginzburg-Landau (GL) theory in the case of external currents. We implement a self-consistent theory within the finite element method (FEM) and present numerical results for a two-dimensional rectangular geometry. We emphasize that our approach can in principle also be used for general geometries in three-dimensional superconductors.

  18. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  19. Study of the Cl2 molecule by the variational cellular method

    International Nuclear Information System (INIS)

    Rosato, A.; Lima, M.A.P.

    1984-01-01

    A self-consistent calculation based on the Variational Cellular Method is performed on the Cl2 molecule. The results obtained for the ground-state potential curve, the first excited state, the dissociation energy, the molecular orbital energies and other related parameters are compared with other calculation methods and with available data, and the agreement is satisfactory. (Author) [pt

  20. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Miguel, Correia; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  1. The consistency effect depends on markedness in less successful but not successful problem solvers: An eye movement study in primary school children

    NARCIS (Netherlands)

    van der Schoot, M.; Bakker Arkema, A.H.; Horsley, T.M.; van Lieshout, E.C.D.M.

    2009-01-01

    This study examined the effects of consistency (relational term consistent vs. inconsistent with required arithmetic operation) and markedness (relational term unmarked ['more than'] vs. marked ['less than']) on word problem solving in 10-12 years old children differing in problem-solving skill. The

  2. SIMPLE AND STRONGLY CONSISTENT ESTIMATOR OF STABLE DISTRIBUTIONS

    Directory of Open Access Journals (Sweden)

    Cira E. Guevara Otiniano

    2016-06-01

    Stable distributions are extensively used to analyze the earnings of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational time is minimal, so it can be used to initialize computationally intensive procedures such as maximum likelihood. With random samples of size n we tested the efficacy of the estimator by the Monte Carlo method. We also include applications to three data sets.
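    The abstract does not reproduce the estimator itself, so as a hedged illustration of the same idea (a fast, quantile-based scale estimate checked by Monte Carlo), the sketch below uses the Cauchy case, the symmetric stable law with α = 1, for which the interquartile range equals twice the scale parameter:

    ```python
    import math
    import random

    def cauchy_sample(scale, n, rng):
        """Draw n values from a centred Cauchy(scale) via the inverse CDF."""
        return [scale * math.tan(math.pi * (rng.random() - 0.5)) for _ in range(n)]

    def iqr_scale_estimate(xs):
        """Scale estimate for a centred Cauchy: half the interquartile range."""
        xs = sorted(xs)
        n = len(xs)
        q25, q75 = xs[n // 4], xs[(3 * n) // 4]
        return (q75 - q25) / 2.0

    # Monte Carlo check with a fixed seed and known true scale of 1.0
    rng = random.Random(0)
    sample = cauchy_sample(scale=1.0, n=20000, rng=rng)
    print(round(iqr_scale_estimate(sample), 2))  # close to the true scale 1.0
    ```

    Quantile-based estimates like this are cheap and robust to the heavy tails that make moment-based estimators useless for stable laws, which is what makes them suitable starting points for maximum likelihood.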

  3. [Consistency study of PowerPlex 21 kit and Goldeneye 20A kit and forensic application].

    Science.gov (United States)

    Ren, He; Liu, Ying; Zhang, Qing-Xia; Jiao, Zhang-Ping

    2014-06-01

    To assess the consistency of genotyping results between the PowerPlex 21 kit and the Goldeneye 20A kit, STR loci were amplified in DNA samples from 205 unrelated individuals of the Beijing Han population, the typing consistency of the 19 overlapping STR loci was examined, and the genetic polymorphism of the D1S1656 locus was obtained. Typing results for all 19 overlapping loci were consistent. The ratio of peak heights at heterozygous loci in the two kits showed no statistically significant difference (P > 0.05). The observed heterozygosity of D1S1656 was 0.878, the discrimination power 0.949, the probability of paternity exclusion 0.751 for trios and 0.506 for duos, and the polymorphism information content 0.810. The PowerPlex 21 kit and Goldeneye 20A kit show good consistency, the primer design is reasonable, and the polymorphism of D1S1656 is good. The two kits can be used for human genetic analysis, paternity testing, and individual identification in forensic practice.

  4. Comparative study of patient doses calculated with two methods for breast digital tomosynthesis

    International Nuclear Information System (INIS)

    Castillo, M.; Chevalier, M.; Calzado, A.; Garayo, J.; Valverde, J.

    2015-01-01

    In this study, the average glandular doses (DG) delivered in breast tomosynthesis examinations were estimated for a sample of 150 patients using two different methods. In method 1, the air kerma to DG conversion factors were those tabulated by Dance et al.; in method 2, those of Feng et al. The examination protocol followed in the unit of this study consists of two views per breast, each view comprising a 2D acquisition and a tomosynthesis scan (3D). The resulting DG values from the two methods present statistically significant differences (p=0.02) for the 2D modality and are similar for the 3D scan (p=0.22). The estimated median DG for the most frequent breasts (thicknesses between 50 and 60 mm) delivered in a single 3D acquisition is 1.7 mGy (36% and 17% higher than the value for the 2D mode estimated with each method), which lies far below the tolerances established by the Spanish Quality Control Protocol in Radiodiagnostics (2011). The total DG for a tomosynthesis examination (6.0 mGy) is a factor of 2.4 higher than the dose delivered in a 2D examination with two views (method 1). (Author)

  5. Novel in situ radiotracer methods for the direct and indirect study of chromate adsorption on alumina

    International Nuclear Information System (INIS)

    Gancs, L.; Nemeth, Z.; Horanyi, G.

    2002-01-01

    Radiotracer methods, particularly the radiotracer thin foil method, provide a unique possibility for in situ monitoring of chromate adsorption on powdered adsorbents. Two different versions of the thin foil method can be distinguished. In the direct method, the species to be studied is labelled and the measured radiation gives direct information on the adsorption of this species. In the indirect method, a different labelled indicator species is added to the system, its adsorption is followed, and the adsorption of the species to be studied is determined from an analysis of the competitive adsorption processes. Both methods were used in the present study. In the in situ methods, the measured radiation consists of two main parts, one coming from the solution background and the other originating from the adsorption layer. In the case of the thin foil method using isotopes emitting soft β-radiation or low-energy X-rays, the solution background is governed and minimised by self-absorption of the radiation. In the direct study we applied an experimental methodology based on energy-selective measurement of the characteristic Kα,β X-radiation emitted by the 51Cr-labelled chromate species, whereas 35S-labelled sulphate ions were used as the indicator species in the indirect study. (P.A.)

  6. Consistency of hand preference: predictions to intelligence and school achievement.

    Science.gov (United States)

    Kee, D W; Gottfried, A; Bathurst, K

    1991-05-01

    Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.

  7. Application of activity theory to analysis of human-related accidents: Method and case studies

    International Nuclear Information System (INIS)

    Yoon, Young Sik; Ham, Dong-Han; Yoon, Wan Chul

    2016-01-01

    This study proposes a new approach to human-related accident analysis based on activity theory. Most of the existing methods seem to be insufficient for comprehensive analysis of human activity-related contextual aspects of accidents when investigating the causes of human errors. Additionally, they identify causal factors and their interrelationships with a weak theoretical basis. We argue that activity theory offers useful concepts and insights to supplement existing methods. The proposed approach gives holistic contextual backgrounds for understanding and diagnosing human-related accidents. It also helps identify and organise causal factors in a consistent, systematic way. Two case studies in Korean nuclear power plants are presented to demonstrate the applicability of the proposed method. Human Factors Analysis and Classification System (HFACS) was also applied to the case studies. The results of using HFACS were then compared with those of using the proposed method. These case studies showed that the proposed approach could produce a meaningful set of human activity-related contextual factors, which cannot easily be obtained by using existing methods. It can be especially effective when analysts think it is important to diagnose accident situations with human activity-related contextual factors derived from a theoretically sound model and to identify accident-related contextual factors systematically. - Highlights: • This study proposes a new method for analysing human-related accidents. • The method was developed based on activity theory. • The concept of activity system model and contradiction was used in the method. • Two case studies in nuclear power plants are presented. • The method is helpful to consider causal factors systematically and comprehensively.

  8. Directional selection in temporally replicated studies is remarkably consistent.

    Science.gov (United States)

    Morrissey, Michael B; Hadfield, Jarrod D

    2012-02-01

    Temporal variation in selection is a fundamental determinant of evolutionary outcomes. A recent paper presented a synthetic analysis of temporal variation in selection in natural populations. The authors concluded that there is substantial variation in the strength and direction of selection over time, but acknowledged that sampling error would result in estimates of selection that were more variable than the true values. We reanalyze their dataset using techniques that account for the necessary effect of sampling error to inflate apparent levels of variation and show that directional selection is remarkably constant over time, both in magnitude and direction. Thus we cannot claim that the available data support the existence of substantial temporal heterogeneity in selection. Nonetheless, we conjecture that temporal variation in selection could be important, but that there are good reasons why it may not appear in the available data. These new analyses highlight the importance of applying techniques that estimate parameters of the distribution of selection, rather than parameters of the distribution of estimated selection (which will reflect both sampling error and "real" variation in selection); indeed, despite the availability of methods for the former, focus on the latter has been common in synthetic reviews of selection in nature, and can lead to serious misinterpretations. © 2011 The Author(s). Evolution © 2011 The Society for the Study of Evolution.
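    The correction at issue can be sketched with a simple method-of-moments estimate: subtract the average squared standard error from the observed variance of the selection estimates to approximate the variance among the true values (an illustration of the principle only, not the authors' mixed-model analysis):

    ```python
    import random
    import statistics

    def true_variance_estimate(estimates, standard_errors):
        """Method-of-moments estimate of real variation among true values.

        Observed variance of noisy estimates = true variance + mean
        sampling variance, so subtracting the latter approximates the
        former (truncated at zero).
        """
        observed = statistics.variance(estimates)
        sampling = statistics.mean(se ** 2 for se in standard_errors)
        return max(0.0, observed - sampling)

    # Synthetic example: a constant true gradient of 0.2 measured with noise
    rng = random.Random(42)
    se = 0.1
    estimates = [0.2 + rng.gauss(0.0, se) for _ in range(2000)]
    corrected = true_variance_estimate(estimates, [se] * 2000)
    print(round(statistics.variance(estimates), 4), round(corrected, 4))
    ```

    When the true gradients are in fact constant, the corrected variance collapses toward zero even though the raw estimates vary considerably, which is precisely the distinction between the distribution of selection and the distribution of estimated selection.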

  9. Consistency and Word-Frequency Effects on Spelling among First- To Fifth-Grade French Children: A Regression-Based Study

    Science.gov (United States)

    Lete, Bernard; Peereman, Ronald; Fayol, Michel

    2008-01-01

    We describe a large-scale regression study that examines the influence of lexical (word frequency, lexical neighborhood) and sublexical (feedforward and feedback consistency) variables on spelling accuracy among first, second, and third- to fifth-graders. The wordset analyzed contained 3430 French words. Predictors in the stepwise regression…

  10. Adjoint-consistent formulations of slip models for coupled electroosmotic flow systems

    KAUST Repository

    Garg, Vikram V

    2014-09-27

    Background: Models based on the Helmholtz 'slip' approximation are often used for the simulation of electroosmotic flows. The objectives of this paper are to construct adjoint-consistent formulations of such models, and to develop adjoint-based numerical tools for adaptive mesh refinement and parameter sensitivity analysis. Methods: We show that the direct formulation of the 'slip' model is adjoint inconsistent and leads to an ill-posed adjoint problem. We propose a modified formulation of the coupled 'slip' model, which is shown to be well-posed and therefore automatically adjoint-consistent. Results: Numerical examples are presented to illustrate the computation and use of the adjoint solution in two-dimensional microfluidics problems. Conclusions: An adjoint-consistent formulation for Helmholtz 'slip' models of electroosmotic flows has been proposed. This formulation provides adjoint solutions that can be reliably used for mesh refinement and sensitivity analysis.

  11. A new δf method for neoclassical transport studies

    International Nuclear Information System (INIS)

    Wang, W.X.; Nakajima, N.; Okamoto, M.; Murakami, S.

    1999-01-01

    A new δf method is presented in detail to solve the drift kinetic equation for the simulation study of neoclassical transport. It is demonstrated that valid results essentially rely on the correct evaluation of the marker density g in the weight calculation. A new weighting scheme is developed without assuming g in the weight equation for advancing particle weights, unlike previous schemes. This scheme employs an additional weight function to directly solve g from its kinetic equation based on the δf method itself. Therefore, the severe constraint that the real marker distribution must be consistent with the initially assumed g is relaxed. An improved like-particle collision scheme is also presented. By compensating for momentum, energy and particle losses, the conservations of all three quantities are greatly improved during collisions. With the improvement in both the like-particle collision scheme and the weighting scheme, the δf simulation shows a significantly improved performance. The new δf method is applied to the study of ion neoclassical transports due to self-collisions, taking the effect of finite orbit width into account. The ion thermal transport near the magnetic axis is shown to be greatly reduced from its conventional neoclassical level, like that of previous δf simulations. On the other hand, the direct particle loss from the confinement region may strongly increase the ion thermal transport near the edge. It is found that the ion parallel flow near the axis is also largely reduced due to non-standard orbit topology. (author)

  12. Amorphization of Ge and InP studied using nuclear hyperfine methods

    International Nuclear Information System (INIS)

    Byrne, A.P.; Bezakova, E.; Glover, C.J.; Ridgway, M.C.

    1999-01-01

    The ion beam amorphization of InP and Ge has been studied using the Perturbed Angular Correlation (PAC) technique. Semiconductor samples were preimplanted with the radioisotope ¹¹¹In using a direct production-recoil implantation method and beams from the ANU Heavy-Ion Facility. Following annealing, samples were amorphized using Ge beams with doses between 2 × 10¹² ions/cm² and 5000 × 10¹² ions/cm². For InP, the PAC spectra identified three distinct probe environments: crystalline, disordered and amorphous, with a smooth transition observed as a function of dose. The dose dependence of the relative fractions of the individual probe environments has been determined. A direct amorphization process consistent with the overlap model was quantified, and evidence for a second amorphization process via the overlap of disordered regions was observed. The PAC method compares favorably with other methods in its ability to differentiate changes at high dose. The results for InP will be compared with those in Ge. The implantation method will be discussed, as will developments in the establishment of a dedicated facility for the implantation of radioisotopes

  13. Consistent use of a combination product versus a single product in a safety trial of the diaphragm and microbicide in Harare, Zimbabwe.

    Science.gov (United States)

    van der Straten, Ariane; Moore, Jie; Napierala, Sue; Clouse, Kate; Mauck, Christine; Hammond, Nii; Padian, Nancy

    2008-06-01

    We examined the use and acceptability of a combination product (diaphragm and gel) compared to a single product (gel) during a 6-month safety trial in Zimbabwe. Women were randomized to the use of a diaphragm with gel or the use of gel alone, in addition to male condoms. Ever use and use of study product on the last act of sexual intercourse were assessed monthly by Audio Computer-Assisted Self-Interviewing. Acceptability, correct use and consistent use (use at every sexual act during the previous 3 months) were measured on the last visit by face-to-face interview. Predictors of consistent use were examined using multivariate logistic regression analyses. In this sample of 117 sexually active, monogamous, contracepting women, rates of consistent use were similar in both groups (59.7% for combination method vs. 56.4% for gel alone). Product acceptability was high, but was not independently associated with consistent use. Independent predictors of consistent use included age [adjusted odds ratio (AOR)=1.08; 95% confidence interval (95% CI)=1.01-1.16], consistent condom use (AOR=3.85; 95% CI=1.54-9.63) and having a partner who approves of product use (AOR=2.66; 95% CI=1.10-6.39). Despite high reported acceptability and few problems with the products, the participants reported only moderate product adherence levels. Consistent use of condoms and consistent use of products were strongly associated. If observed in other studies, this may bias the estimation of product effectiveness in future trials of female-controlled methods.
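The adjusted odds ratios and 95% confidence intervals quoted above come from exponentiating logistic-regression coefficients and their Wald intervals. A minimal sketch, where the coefficient and standard error are hypothetical values chosen only to land near the reported condom-use predictor (AOR = 3.85; 95% CI = 1.54-9.63):

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Adjusted odds ratio and Wald 95% CI from a logistic-regression
    coefficient and its standard error (illustrative inputs only)."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Hypothetical beta and SE, back-calculated to roughly match the paper:
aor, lo, hi = odds_ratio_ci(beta=1.348, se=0.467)
print(f"AOR={aor:.2f}, 95% CI=({lo:.2f}, {hi:.2f})")
```

An interval that excludes 1.0, as here, is what marks the predictor as statistically significant.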

  14. Study on the Analytical Method for Determination of P-32 in Human Hair

    International Nuclear Information System (INIS)

    Syarbaini; Lubis, E.; Sarwani

    1996-01-01

    Neutron doses due to a criticality accident can be estimated by measuring radionuclides of neutron activation products in human hair. In this work, an analytical method for the determination of P-32 in hair samples irradiated with neutrons in the G.A. Siwabessy reactor has been studied. The method consists of dissolving the human hair sample in 10 M HNO3, separating and purifying the P-32 by precipitation as ammonium molybdophosphate; finally, the precipitate was measured with a low-background α/β counter. The minimum detectable activity of P-32 was 0.05 Bq at a background of 4.6 cpm and with a counting efficiency of 55% for a 30 minute counting time
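The reported minimum detectable activity can be cross-checked with a Currie-style detection-limit estimate. The formula below is the common textbook expression, not necessarily the exact one used in the paper; with the stated background (4.6 cpm), efficiency (55%) and counting time (30 min) it lands in the same range as the reported 0.05 Bq:

```python
import math

def currie_mda(background_cpm, count_time_min, efficiency):
    """Currie-style minimum detectable activity (Bq) for a counting
    measurement: L_D = 2.71 + 4.65*sqrt(B) counts, converted to activity."""
    b_counts = background_cpm * count_time_min           # expected background counts
    ld_counts = 2.71 + 4.65 * math.sqrt(b_counts)        # detection limit, counts
    return ld_counts / (efficiency * count_time_min * 60.0)  # counts -> Bq

mda = currie_mda(background_cpm=4.6, count_time_min=30, efficiency=0.55)
print(f"MDA ~ {mda:.3f} Bq")
```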

  15. Theoretical study of hydrogen storage in a truncated triangular pyramid molecule consisting of pyridine and benzene rings bridged by vinylene groups

    Science.gov (United States)

    Ishikawa, Shigeru; Nemoto, Tetsushi; Yamabe, Tokio

    2018-06-01

    Hydrogen storage in a truncated triangular pyramid molecule C33H21N3, which consists of three pyridine rings and one benzene ring bridged by six vinylene groups, is studied by quantum chemical methods. The molecule is derived by substituting three benzene rings in a truncated tetrahedron hydrocarbon C36H24 with pyridine rings. The optimized molecular structure under C3v symmetry shows no imaginary vibrational modes at the B3LYP/cc-pVTZ level of theory. The hydrogen storage process is investigated based on the MP2/cc-pVTZ method. Like the structure before substitution, the C33H21N3 molecule has a cavity that stores a hydrogen molecule with a binding energy of −140 meV. The Langmuir isotherm shows that this cavity can store hydrogen at higher temperatures and lower pressures than usual physisorption materials. The C33H21N3 molecule has a kinetic advantage over the C36H24 molecule because the former has a lower barrier (+560 meV) for the hydrogen molecule entering the cavity than the latter (+730 meV), owing to the lack of hydrogen atoms narrowing the opening.
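The temperature and pressure behaviour claimed for the cavity can be sketched with a single-site Langmuir isotherm. The 140 meV well depth is taken from the abstract, but the equilibrium prefactor k0 is a hypothetical illustrative value, so the numbers show only the qualitative trend (stronger binding sustains occupancy at higher temperature):

```python
import math

KB_EV = 8.617333e-5  # Boltzmann constant, eV/K

def langmuir_occupancy(pressure_bar, temperature_k, binding_ev=0.140, k0=1e-6):
    """Fractional occupancy theta = K*p / (1 + K*p) of a single-site
    Langmuir isotherm; K grows exponentially with the binding energy.
    k0 (per bar) is a made-up prefactor for illustration."""
    k_eq = k0 * math.exp(binding_ev / (KB_EV * temperature_k))
    return k_eq * pressure_bar / (1.0 + k_eq * pressure_bar)

# Occupancy at fixed pressure falls as temperature rises:
theta_cold = langmuir_occupancy(1.0, 150.0)
theta_warm = langmuir_occupancy(1.0, 300.0)
print(theta_cold, theta_warm)
```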

  16. Influence of Water Content on the Flow Consistency of Dredged Marine Soils

    Directory of Open Access Journals (Sweden)

    Rosman M. Z.

    2016-01-01

    At present, dredged marine soils (DMS) are generally considered geo-waste in Malaysia. They are known for high water content and low shear strength. Lightly solidified soils such as soil-cement slurry and flowable fill are known as controlled low strength materials (CLSM). On site, the consistency of CLSM is tested using an open-ended cylinder pipe; the vertical and lateral displacements from this test indicate the quality and workability of the CLSM. In this study, manufactured kaolin powder was mixed with different percentages of water. Cement was also added to compare the natural soil with solidified soil samples. Two flowability test methods were used: the conventional lift method and an innovative drop method. The lateral displacement (soil spread diameter) values were recorded and averaged. Tests showed that the soil spread diameter increased almost linearly with the amount of water. The binder-added samples showed no significant difference from the non-binder samples. The mixing water content and percentage of fines also influenced the soil spread diameter.
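The near-linear relation between mixing water content and spread diameter can be quantified with a least-squares fit; the data points below are made-up numbers used only to illustrate the kind of trend reported:

```python
import numpy as np

# Hypothetical flowability data (not the paper's measurements):
water_content = np.array([80, 100, 120, 140, 160])   # % of liquid limit
spread_mm = np.array([118, 150, 185, 214, 248])      # averaged spread diameters

slope, intercept = np.polyfit(water_content, spread_mm, 1)
r = np.corrcoef(water_content, spread_mm)[0, 1]
print(f"spread ~ {slope:.2f} * w + {intercept:.1f}  (r = {r:.3f})")
```

A correlation coefficient close to 1 is what "corresponded almost linearly" means quantitatively.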

  17. A study on the separation method of total rare earth oxides in Xenotime

    International Nuclear Information System (INIS)

    Shim, Sang Kwon; Park, Hea Kyung; Kim, Kyung Lim

    1990-01-01

    This study concerns the separation of total rare earth oxides from xenotime by acid digestion. Thioacetamide was used as a carrier, tartaric acid as a masking agent and oxalic acid as a precipitant. The effects of three acid digestion methods, the pH of the solution, digestion time, tartaric acid, oxalic acid and aging time were observed. The results showed that the best digestion procedure was sulfuric acid leaching combined with the mixed acid digestion method, with a solution pH of 2, a digestion time of 4 hours, 100 ml of 2% tartaric acid solution, 8 g of oxalic acid and an aging time of 1 hour. Through this experiment, it was confirmed by X-ray diffractometry that the separated total rare earth oxides consisted of yttrium and other rare earth elements: gadolinium, dysprosium, erbium, ytterbium and trace rare earth elements. The pure rare earth oxides separated by this method were dried and ignited at 900 °C (Author)

  18. Time-dependent restricted-active-space self-consistent-field theory for bosonic many-body systems

    International Nuclear Information System (INIS)

    Lévêque, Camille; Madsen, Lars Bojer

    2017-01-01

    We develop an ab initio time-dependent wavefunction-based theory for the description of a many-body system of cold interacting bosons. Like the multi-configurational time-dependent Hartree method for bosons (MCTDHB), the theory is based on a configuration-interaction Ansatz for the many-body wavefunction with time-dependent self-consistent-field orbitals. The theory generalizes the MCTDHB method by incorporating restrictions on the active space of the orbital excitations, specified according to the physical situation at hand. The equations of motion of this time-dependent restricted-active-space self-consistent-field (TD-RASSCF) theory are derived. The similarity between the formal development of the theory for bosons and for fermions is discussed. The restrictions on the active space allow the theory to be evaluated under conditions where other wavefunction-based methods cannot be applied, owing to the exponential scaling of their numerical effort, and allow a clear identification of the excitations that are important for an accurate description, significantly beyond the mean-field approach. For ground state calculations we find it important to allow a few particles the freedom to move in many orbitals, an insight facilitated by the flexibility of the restricted-active-space Ansatz. Moreover, we find that high accuracy can be obtained by including only even excitations in the many-body self-consistent-field wavefunction. Time-dependent simulations of harmonically trapped bosons subject to a quenching of their interaction show failure of the mean-field Gross-Pitaevskii approach within a fraction of a harmonic oscillation period. The TD-RASSCF theory remains accurate at much reduced computational cost compared to the MCTDHB method. Exploring the effect of changes of the restricted active space allows us to identify that even self-consistent-field excitations are mainly responsible for the accuracy of the method. (paper)

  19. A Theoretically Consistent Method for Minimum Mean-Square Error Estimation of Mel-Frequency Cepstral Features

    DEFF Research Database (Denmark)

    Jensen, Jesper; Tan, Zheng-Hua

    2014-01-01

    We propose a method for minimum mean-square error (MMSE) estimation of mel-frequency cepstral features for noise robust automatic speech recognition (ASR). The method is based on a minimum number of well-established statistical assumptions; no assumptions are made which are inconsistent with others. … The strength of the proposed method is that it allows MMSE estimation of mel-frequency cepstral coefficients (MFCCs), cepstral mean-subtracted MFCCs (CMS-MFCCs), velocity, and acceleration coefficients. Furthermore, the method is easily modified to take into account other compressive non-linearities than … the logarithmic which is usually used for MFCC computation. The proposed method shows estimation performance which is identical to or better than state-of-the-art methods. It further shows comparable ASR performance, where the advantage of being able to use mel-frequency speech features based on a power non…
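The role of the compressive non-linearity mentioned above can be seen in how cepstral features are formed from mel filterbank energies: the energies are compressed (logarithmically for standard MFCCs, or for example by a cube root for power-law features) and then decorrelated with a DCT-II. A minimal sketch of that pipeline stage, not the paper's MMSE estimator:

```python
import numpy as np

def cepstral_coeffs(mel_energies, compress=np.log, n_ceps=13):
    """Cepstral coefficients from mel filterbank energies via a DCT-II.
    `compress` is the compressive non-linearity: np.log gives standard
    MFCC-style features; a power law is the alternative alluded to above."""
    e = compress(np.asarray(mel_energies, dtype=float))
    n = e.size
    n_ceps = min(n_ceps, n)
    k = np.arange(n_ceps)[:, None]
    m = np.arange(n)[None, :]
    dct = np.cos(np.pi * k * (2 * m + 1) / (2 * n))   # DCT-II basis
    return dct @ e

# Toy filterbank energies for a single frame:
energies = np.array([1.2, 0.8, 2.5, 3.1, 0.9, 1.7, 2.2, 1.1])
mfcc_log = cepstral_coeffs(energies, np.log)
mfcc_pow = cepstral_coeffs(energies, lambda e: e ** (1.0 / 3.0))
print(mfcc_log[:3], mfcc_pow[:3])
```

The zeroth coefficient is simply the sum of the compressed energies, which is why it tracks overall frame energy.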

  20. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among them under a unified energy framework; however, the results are prone to a consistent but unnatural appearance when the color differences between images are large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First, to obtain reliable intensity correspondences in the overlapping regions between image pairs, we propose a histogram extreme point matching algorithm which is robust to image geometric misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the others via the paths of a graph analysis. Thus, the final results of the global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach can achieve results as good as or better than the state-of-the-art approaches.
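As a much simpler stand-in for the histogram extreme point matching used in the paper, global channel-wise mean/std (Reinhard-style) matching illustrates what "transferring the color characteristics" of a reference image subset to the others means in practice:

```python
import numpy as np

def match_mean_std(source, reference):
    """Transfer the global color statistics of `reference` onto `source`
    channel-wise. A simple illustrative stand-in, not the paper's
    histogram-extreme-point matching algorithm."""
    src = source.astype(float)
    ref = reference.astype(float)
    out = np.empty_like(src)
    for c in range(src.shape[-1]):
        s, r = src[..., c], ref[..., c]
        out[..., c] = (s - s.mean()) / (s.std() + 1e-9) * r.std() + r.mean()
    return np.clip(out, 0, 255)

rng = np.random.default_rng(1)
src_img = rng.integers(0, 120, size=(32, 32, 3))    # toy dark image
ref_img = rng.integers(100, 256, size=(32, 32, 3))  # toy bright reference
corrected = match_mean_std(src_img, ref_img)
print(corrected[..., 0].mean(), ref_img[..., 0].mean())
```

After the transfer, each channel of the corrected image carries (approximately) the reference's mean and spread, which is the per-pair building block a global consistency optimization then balances across the whole sequence.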

  1. Range-efficient consistent sampling and locality-sensitive hashing for polygons

    DEFF Research Database (Denmark)

    Gudmundsson, Joachim; Pagh, Rasmus

    2017-01-01

    Locality-sensitive hashing (LSH) is a fundamental technique for similarity search and similarity estimation in high-dimensional spaces. The basic idea is that similar objects should produce hash collisions with probability significantly larger than objects with low similarity. We consider LSH for … or union of a set of preprocessed polygons. Curiously, our consistent sampling method uses transformation to a geometric problem.
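The LSH principle described above, that similar objects collide with probability tied to their similarity, is easiest to see with MinHash for sets, where the per-slot collision probability equals the Jaccard similarity. This classic consistent-sampling scheme is shown only for illustration; it is not the authors' polygon construction:

```python
import random

def minhash_signature(items, n_hashes=128, seed=7):
    """MinHash signature of a set: for each salted hash function, keep the
    minimum hash over the set. Two sets agree in a given slot with
    probability equal to their Jaccard similarity."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(n_hashes)]
    return [min(hash((salt, x)) for x in items) for salt in salts]

a = set(range(0, 80))
b = set(range(20, 100))   # true Jaccard similarity = 60/100 = 0.6
sig_a, sig_b = minhash_signature(a), minhash_signature(b)
est = sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)
print(est)
```

The fraction of matching slots is an unbiased estimate of the Jaccard similarity, here close to 0.6.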

  2. Orthographic consistency affects spoken word recognition at different grain-sizes

    DEFF Research Database (Denmark)

    Dich, Nadya

    2014-01-01

    A number of previous studies found that the consistency of sound-to-spelling mappings (feedback consistency) affects spoken word recognition. In auditory lexical decision experiments, words that can only be spelled one way are recognized faster than words with multiple potential spellings. Previo...

  3. Investigating the consistency between proxy-based reconstructions and climate models using data assimilation: a mid-Holocene case study

    NARCIS (Netherlands)

    A. Mairesse; H. Goosse; P. Mathiot; H. Wanner; S. Dubinkina (Svetlana)

    2013-01-01

    The mid-Holocene (6 kyr BP; thousand years before present) is a key period to study the consistency between model results and proxy-based reconstruction data, as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of

  4. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions to their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find that the correction to SPT results becomes important at k ≳ 0.05 h/Mpc, and that the suppression from EFT to SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed-limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  5. Columbia River Stock Identification Study; Validation of Genetic Method, 1980-1981 Final Report.

    Energy Technology Data Exchange (ETDEWEB)

    Milner, George B.; Teel, David J.; Utter, Fred M. (Northwest and Alaska Fisheries Science Center, Coastal Zone and Estuarine Studies Division, Seattle, WA)

    1981-06-01

    The reliability of a method for obtaining maximum likelihood estimates of component stocks in mixed populations of salmonids, through the frequency of genetic variants in a mixed population and in potentially contributing stocks, was tested in 1980. A data base of 10 polymorphic loci from 14 hatchery stocks of spring chinook salmon of the Columbia River was used to estimate the proportions of these stocks in four different "blind" mixtures whose true composition was only revealed after estimates had been obtained. The accuracy and precision of these blind tests have validated the genetic method as a valuable means of identifying components of stock mixtures. Properties of the genetic method were further examined by simulation studies using the pooled data of the four blind tests as a mixed fishery. Replicated tests with sample sizes between 100 and 1,000 indicated that actual standard deviations of estimated contributions were consistently lower than calculated standard deviations; this difference diminished as sample size increased. It is recommended that future applications of the method be preceded by simulation studies to identify the levels of sampling required for acceptable accuracy and precision. Variables in such studies include the stocks involved, the loci used, and the genetic differentiation among stocks. 8 refs., 6 figs., 4 tabs.

  6. DIETFITS study (diet intervention examining the factors interacting with treatment success) - Study design and methods.

    Science.gov (United States)

    Stanton, Michael V; Robinson, Jennifer L; Kirkpatrick, Susan M; Farzinkhou, Sarah; Avery, Erin C; Rigdon, Joseph; Offringa, Lisa C; Trepanowski, John F; Hauser, Michelle E; Hartle, Jennifer C; Cherin, Rise J; King, Abby C; Ioannidis, John P A; Desai, Manisha; Gardner, Christopher D

    2017-02-01

    Numerous studies have attempted to identify successful dietary strategies for weight loss, and many have focused on Low-Fat vs. Low-Carbohydrate comparisons. Despite relatively small between-group differences in weight loss found in most previous studies, researchers have consistently observed relatively large between-subject differences in weight loss within any given diet group (e.g., ~25 kg weight loss to ~5 kg weight gain). The primary objective of this study was to identify predisposing individual factors at baseline that help explain differential weight loss achieved by individuals assigned to the same diet, particularly a pre-determined multi-locus genotype pattern and insulin resistance status. Secondary objectives included discovery strategies for further identifying potential genetic risk scores. Exploratory objectives included investigation of an extensive set of physiological, psychosocial, dietary, and behavioral variables as moderating and/or mediating variables and/or secondary outcomes. The target population was generally healthy, free-living adults with BMI 28-40 kg/m² (n = 600). The intervention consisted of a 12-month protocol of 22 one-hour evening instructional sessions led by registered dietitians, with ~15-20 participants per class. Key objectives of dietary instruction included maximizing the dietary quality of both Low-Fat and Low-Carbohydrate diets (i.e., Healthy Low-Fat vs. Healthy Low-Carbohydrate), and maximally differentiating the two diets from one another. Rather than seeking to determine whether one dietary approach was better than the other for the general population, this study sought to examine whether greater overall weight loss success could be achieved by matching different people to different diets. Here we present the design and methods of the study. Copyright © 2016 Elsevier Inc. All rights reserved.

  7. Multinational consistency of a discrete choice model in quantifying health states for the extended 5-level EQ-5D

    NARCIS (Netherlands)

    Krabbe, P.F.M.; Devlin, N.J.; Stolk, E.A.; Shah, K.K.; Oppe, M.; Van Hout, B.; Quik, E.H.; Pickard, A.S.; Xie, F.

    2013-01-01

    Objectives: To investigate the feasibility of choice experiments for EQ-5D-5L states using computer-based data collection, and to examine the consistency of the estimated parameters values derived after modeling the stated preference data across countries in a multinational study. Methods: Similar

  8. A Self Consistent Multiprocessor Space Charge Algorithm that is Almost Embarrassingly Parallel

    International Nuclear Information System (INIS)

    Nissen, Edward; Erdelyi, B.; Manikonda, S.L.

    2012-01-01

    We present a space charge code that is self-consistent, massively parallelizable, and requires very little communication between computer nodes, making the calculation almost embarrassingly parallel. The method is implemented in the code COSY Infinity, where the differential algebras used in this code are important to the algorithm's proper functioning. The method works by calculating the self-consistent space charge distribution using the statistical moments of the test particles, and converting them into polynomial series coefficients. These coefficients are combined with differential algebraic integrals to form the potential and electric fields. The result is a map which contains the effects of space charge. This method allows for massive parallelization, since its statistics-based solver does not require any binning of particles; only a vector containing the partial sums of the statistical moments for the different nodes needs to be passed. All other calculations are done independently. The resulting maps can be used to analyze the system using normal form analysis, as well as to advance particles in numbers and at speeds that were previously impossible.
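The communication pattern that makes the method almost embarrassingly parallel can be sketched directly: each node reduces its own particles to a short vector of moment partial sums, and only these vectors are combined. The sketch below shows the reduction step only, not COSY Infinity's differential-algebra machinery:

```python
import numpy as np

rng = np.random.default_rng(42)
particles = rng.normal(size=(100000, 2))        # toy (x, y) coordinates

def partial_moment_sums(chunk, max_order=2):
    """Sums of x^i * y^j for i + j <= max_order over one node's particles."""
    x, y = chunk[:, 0], chunk[:, 1]
    return np.array([np.sum(x**i * y**j)
                     for i in range(max_order + 1)
                     for j in range(max_order + 1 - i)])

# "Nodes" work independently on disjoint chunks of particles...
chunks = np.array_split(particles, 8)
combined = sum(partial_moment_sums(c) for c in chunks)
# ...and the combined partial sums equal the single-node result:
print(np.allclose(combined, partial_moment_sums(particles)))
```

Because moment sums are associative, only these six numbers per node (for order 2 in two variables) need to cross the network, regardless of the particle count.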

  9. Hands-off and hands-on casting consistency of amputee below knee sockets using magnetic resonance imaging.

    Science.gov (United States)

    Safari, Mohammad Reza; Rowe, Philip; McFadyen, Angus; Buis, Arjan

    2013-01-01

    Residual limb shape capturing (casting) consistency has a great influence on the quality of socket fit. Magnetic resonance imaging was used to establish a reliable reference grid for the intercast and intracast shape and volume consistency of two common casting methods, Hands-off and Hands-on. Residual limbs were cast for twelve people with a unilateral below-knee amputation and scanned twice for each casting concept. Subsequently, all four volume images of each amputee were semiautomatically segmented and registered to a common coordinate system using the tibia, and the shape and volume differences were then calculated. The results show that both casting methods have intracast volume consistency and that there is no significant volume difference between the two methods. Inter- and intracast mean volume differences were not clinically significant based on the volume-of-one-sock criterion. Neither the Hands-off nor the Hands-on method resulted in a consistent residual limb shape, as the coefficient of variation of the shape differences was high. The resultant shape of the residual limb in Hands-off casting was variable, but the differences were not clinically significant. For Hands-on casting, the shape differences were equal to the maximum acceptable limit for a poor socket fit.

  10. Dosimetric comparative study of two methods of intensity modulation performed on the same accelerator; Etude comparative dosimetrique de deux methodes de modulation d'intensite realisees sur le meme accelerateur

    Energy Technology Data Exchange (ETDEWEB)

    Marchesi, V.; Aletti, P.; Madelis, G. [Centre Alexis-Vautrin, Unite de Radiophysique, CRLCC, 54 - Vandoeuvre-les-Nancy (France); Marchal, C.; Bey, P. [Centre Alexis-Vautrin, Service de Radiotherapie, CRLCC, 54 - Vandoeuvre-les-Nancy (France); Wolf, D. [Institut National Polytechnique de Lorraine, UPRES-A 7039, 54 - Vandoeuvre-les-Nancy (France)

    2000-12-01

    Intensity modulated radiation therapy (IMRT) is an advanced method of conformal radiotherapy. It permits optimal dose distribution to the target volume while preserving surrounding normal tissues. IMRT with a multi-leaf collimator can be realised in two different ways: either the segmented mode, which consists of combining small elementary static fields, or the dynamic mode, which consists of moving the leaves during irradiation. The purpose of this work was to study these two methods of modulation on a Varian linear accelerator equipped with a collimator consisting of 40 pairs of one-centimeter-wide leaves. The measurements, obtained using a diode array, showed that the quality of the irradiation in the dynamic mode depends on neither the dose rate nor the duration of the irradiation. In the segmented mode, low-weight segments are preferable, but they increase the errors in the delivered dose. Comparisons of various profiles showed that the measured profiles are consistent with those programmed. Both modes seem to be equivalent for step-shaped profiles. In the case of profiles with constant slope, the segmentation generated by the segmented method deteriorates the profile. Even though the choice of technique is strongly dependent on the equipment available, the dynamic mode offers greater flexibility of use and has been chosen in our institution for IMRT. (authors)
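The dynamic mode described above is commonly realised with "sliding window" leaf trajectories, which a classic 1-D decomposition illustrates: positive gradients of the desired fluence profile go to the trailing (opening) leaf, negative gradients to the leading (closing) leaf. This is a textbook sketch, not the Varian implementation studied in the paper:

```python
import numpy as np

def sweep_trajectories(fluence):
    """Classic 1-D sliding-window decomposition of a fluence profile into
    monotonic opening (trailing) and closing (leading) leaf trajectories,
    such that delivered fluence = trailing - leading."""
    f = np.asarray(fluence, dtype=float)
    inc = np.diff(np.concatenate(([0.0], f)))            # gradient incl. left edge
    trailing = np.cumsum(np.where(inc > 0, inc, 0.0))    # opens the field
    leading = np.cumsum(np.where(inc < 0, -inc, 0.0))    # closes the field
    return trailing, leading

desired = np.array([0.0, 2.0, 5.0, 3.0, 3.0, 1.0, 0.0])
trailing, leading = sweep_trajectories(desired)
delivered = trailing - leading
print(delivered)   # reproduces the desired profile
```

The total monitor units equal the sum of the positive gradients, which is why heavily modulated (steep) profiles cost more beam-on time in the dynamic mode.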

  11. SU-E-J-29: Audiovisual Biofeedback Improves Tumor Motion Consistency for Lung Cancer Patients

    International Nuclear Information System (INIS)

    Lee, D; Pollock, S; Makhija, K; Keall, P; Greer, P; Arm, J; Hunter, P; Kim, T

    2014-01-01

    Purpose: To investigate whether the breathing-guidance system, audiovisual (AV) biofeedback, improves tumor motion consistency for lung cancer patients. This would minimize respiratory-induced tumor motion variations across cancer imaging and radiotherapy procedures. This is the first study to investigate the impact of respiratory guidance on tumor motion. Methods: Tumor motion consistency was investigated in five lung cancer patients (age: 55 to 64), who underwent a training session to become familiar with AV biofeedback, followed by two MRI sessions on different dates (pre and mid treatment). During the training session in a CT room, two patient-specific breathing patterns were obtained before (Breathing-Pattern-1) and after (Breathing-Pattern-2) training with AV biofeedback. In each MRI session, four MRI scans were performed to obtain 2D coronal and sagittal image datasets in free breathing (FB) and with AV biofeedback utilizing Breathing-Pattern-2. Tumor motion was extracted from the image pixel values after normalizing the 2D images per dataset and applying a Gaussian filter to each image. Tumor motion consistency in the superior-inferior (SI) direction was evaluated in terms of the average tumor motion range and period. Results: Audiovisual biofeedback improved tumor motion consistency by 60% (p value = 0.019), from 1.0±0.6 mm (FB) to 0.4±0.4 mm (AV) in SI motion range, and by 86% (p value < 0.001), from 0.7±0.6 s (FB) to 0.1±0.2 s (AV) in period. Conclusion: This study demonstrated that audiovisual biofeedback improves both breathing pattern and tumor motion consistency for lung cancer patients. These results suggest that AV biofeedback has the potential to facilitate reproducible tumor motion towards achieving more accurate medical imaging and radiation therapy procedures.
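The two consistency metrics used above, average SI motion range and breathing period, can be extracted from a 1-D motion trace as sketched below, here on a synthetic sinusoidal trace rather than the patients' MRI data:

```python
import numpy as np

fs = 10.0                                     # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)
trace = 6.0 * np.sin(2 * np.pi * t / 4.0)     # 12 mm peak-to-peak, 4 s period

def motion_range_and_period(x, fs):
    """Peak-to-trough motion range and dominant breathing period of a
    1-D motion trace (period from the FFT of the detrended signal)."""
    rng_mm = x.max() - x.min()
    spec = np.abs(np.fft.rfft(x - x.mean()))
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    period = 1.0 / freqs[spec.argmax()]
    return rng_mm, period

r, p = motion_range_and_period(trace, fs)
print(r, p)   # ~12 mm, ~4 s
```

Consistency is then quantified as the spread of these per-cycle values across scans, which is what the reported ±0.6 mm and ±0.6 s figures summarize.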

  13. Standard Model Vacuum Stability and Weyl Consistency Conditions

    DEFF Research Database (Denmark)

    Antipin, Oleg; Gillioz, Marc; Krog, Jens

    2013-01-01

    At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them … order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions…

  14. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni).

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-12-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan, but suitable propagation practices for it are not yet well established. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. Seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51-40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses, but gamma irradiation did not produce any significant change in seed germination. A great variation in the survival of stem cuttings was observed across the months of 2012; October and November were found to be the most suitable months for stem cutting survival (60%). To enhance survival, stem cuttings were also dipped in different plant growth regulator (PGR) solutions. Only indole butyric acid (IBA; 1000 ppm) treated cuttings showed a higher survival (33%) than the control (11.1%). Furthermore, a simple and feasible indirect regeneration system was established from leaf explants. The best callus induction (84.6%) was observed on MS medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l⁻¹). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l⁻¹) and gibberellic acid (GA3; 0.5 mg l⁻¹). Plantlets were successfully acclimatized in plastic pots. The current results favor micropropagation (85%) over seed germination (25.51-40%) and stem cuttings (60%).

  15. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.

  16. Evaluation of the Consistency of MODIS Land Cover Product (MCD12Q1) Based on Chinese 30 m GlobeLand30 Datasets: A Case Study in Anhui Province, China

    Directory of Open Access Journals (Sweden)

    Dong Liang

    2015-11-01

    Land cover plays an important role in the climate and biogeochemistry of the Earth system. It is of great significance to produce and evaluate global land cover (GLC) data when applying the data in practice at a specific spatial scale. The objective of this study is to evaluate and validate the consistency of the Moderate Resolution Imaging Spectroradiometer (MODIS) land cover product (MCD12Q1) at a provincial scale (Anhui Province, China), based on the Chinese 30 m GLC product (GlobeLand30). A harmonization method is first used to reclassify the land cover types between the five classification schemes of MCD12Q1 (the International Geosphere-Biosphere Programme (IGBP) global vegetation classification, University of Maryland (UMD), MODIS-derived Leaf Area Index and Fractional Photosynthetically Active Radiation (LAI/FPAR), MODIS-derived Net Primary Production (NPP), and Plant Functional Type (PFT)) and the ten classes of GlobeLand30, based on a knowledge rule (KR) and the C4.5 decision tree (DT) classification algorithm. A total of five harmonized land cover types are derived, including woodland, grassland, cropland, wetland and artificial surfaces, and four evaluation indicators are selected, including the area consistency, spatial consistency, classification accuracy and landscape diversity, in the three sub-regions of Wanbei, Wanzhong and Wannan. The results indicate that the consistency of IGBP is the best among the five schemes of MCD12Q1 according to the correlation coefficient (R). The “woodland” class of LAI/FPAR is the worst, with a spatial similarity (O) of 58.17%, due to the misclassification between “woodland” and “others”. The consistency of NPP is the worst among the five schemes, as the agreement varied from 1.61% to 56.23% in the three sub-regions. Furthermore, with the biggest difference of diversity indices between LAI/FPAR and GlobeLand30, the consistency of LAI/FPAR is the weakest. This study provides a methodological reference for evaluating the...
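
    The area and spatial consistency indicators used in such map comparisons can be illustrated with a small sketch. The toy rasters, class codes, and the helper `consistency_metrics` below are ours for illustration, not the MCD12Q1/GlobeLand30 data or the paper's exact harmonization pipeline:

```python
import numpy as np

def consistency_metrics(map_a, map_b, classes):
    """Confusion matrix plus pixel-wise (spatial) agreement and per-class
    area shares for two harmonized land-cover rasters of equal shape."""
    cm = np.zeros((len(classes), len(classes)), dtype=int)
    for i, ca in enumerate(classes):
        for j, cb in enumerate(classes):
            cm[i, j] = np.sum((map_a == ca) & (map_b == cb))
    total = cm.sum()
    spatial_agreement = np.trace(cm) / total   # fraction of identical pixels
    area_a = cm.sum(axis=1) / total            # class shares in map A
    area_b = cm.sum(axis=0) / total            # class shares in map B
    return cm, spatial_agreement, area_a, area_b

# Toy 4x4 rasters with classes 1 (woodland), 2 (cropland), 3 (artificial)
a = np.array([[1, 1, 2, 2], [1, 2, 2, 2], [3, 3, 2, 2], [3, 3, 1, 1]])
b = np.array([[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 2, 2], [3, 3, 2, 1]])
cm, agree, area_a, area_b = consistency_metrics(a, b, [1, 2, 3])
print(agree)  # 0.875: 14 of 16 pixels carry the same label in both maps
```

    Comparing the per-class area shares (`area_a` vs `area_b`) gives an area-consistency check even where the maps disagree pixel by pixel.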

  17. Consistency properties of chaotic systems driven by time-delayed feedback

    Science.gov (United States)

    Jüngling, T.; Soriano, M. C.; Oliver, N.; Porte, X.; Fischer, I.

    2018-04-01

    Consistency refers to the property of an externally driven dynamical system to respond in similar ways to similar inputs. In a delay system, the delayed feedback can be considered as an external drive to the undelayed subsystem. We analyze the degree of consistency in a generic chaotic system with delayed feedback by means of the auxiliary system approach. In this scheme an identical copy of the nonlinear node is driven by exactly the same signal as the original, allowing us to verify complete consistency via complete synchronization. In the past, the phenomenon of synchronization in delay-coupled chaotic systems has been widely studied using correlation functions. Here, we analytically derive relationships between characteristic signatures of the correlation functions in such systems and unequivocally relate them to the degree of consistency. The analytical framework is illustrated and supported by numerical calculations of the logistic map with delayed feedback for different replica configurations. We further apply the formalism to time series from an experiment based on a semiconductor laser with a double fiber-optical feedback loop. The experiment constitutes a high-quality replica scheme for studying consistency of the delay-driven laser and confirms the general theoretical results.
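
    The auxiliary-system approach described above can be sketched for a delayed logistic map. Everything here (the map, the coupling strength `eps`, the delay `tau`) is an illustrative toy choice, not the paper's semiconductor-laser experiment:

```python
import numpy as np

def run(eps=0.9, tau=5, n_steps=2000, seed=1):
    """Auxiliary-system check of consistency for a logistic map with
    delayed feedback: a replica with a different initial condition is
    driven by exactly the same delayed signal as the original."""
    f = lambda x: 4.0 * x * (1.0 - x)
    rng = np.random.default_rng(seed)
    x = list(rng.uniform(0.1, 0.9, tau + 1))   # original trajectory + history
    y = [rng.uniform(0.1, 0.9)]                # replica starts elsewhere
    for n in range(tau, tau + n_steps):
        drive = eps * f(x[n - tau])            # common delayed drive
        x.append((1.0 - eps) * f(x[n]) + drive)
        y.append((1.0 - eps) * f(y[-1]) + drive)
    # complete synchronization of replica and original = full consistency
    return np.max(np.abs(np.array(x[-1000:]) - np.array(y[-1000:])))

print(run(eps=0.9) < 1e-8)  # True: under strong driving the replica collapses
                            # onto the original, verifying full consistency
```

    With a strong drive the undelayed subsystem is contracting, so the replica synchronizes completely; weakening `eps` destroys this contraction and the consistency is lost.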

  18. SU-F-T-26: A Study of the Consistency of Brachytherapy Treatments for Vaginal Cuff

    Energy Technology Data Exchange (ETDEWEB)

    Shojaei, M; Pella, S; Dumitru, N [21st Century Oncology, Boca Raton, FL (United States)]

    2016-06-15

    Purpose: To evaluate treatment consistency over the total number of fractions when treating with HDR brachytherapy using the ML cylinders, while also monitoring the dosimetric impact on the critical organs over the total number of fractions. Methods: A retrospective analysis of 10 patients treated with cylinder applicators from 2015-2016 was considered for this study. The CT scans of these patients, taken before each treatment, were separately imported into the treatment planning system and paired with the initial CT scan after completing the contouring. The two sets of CT images were fused together with respect to the applicator, using landmark registration. The doses of each plan were imported as well, and a cumulative dosimetric analysis was made for bladder, bowels, rectum and PTV. Results: No contour of any of the OARs was exactly the same when the CT images were fused on each other. The PTV volumes varied from fraction to fraction. There was always a difference between the doses received by the OARs between treatments. The maximum dose varied between 5% and 30% in rectum and bladder. The minimum dose varied between 5% and 8% in rectum and bladder. The average dose varied between 15% and 20% in rectum and bladder. Deviations in placement were noticed between fractions. Conclusion: The variation in the volumes of the OARs and in the isodoses near the OARs indicates that the doses to the OARs estimated by the planning system may not be the doses actually delivered to the patient in all fractions. There were no major differences between the prescribed dose and the delivered dose over the total number of fractions. In some cases the critical organs would benefit if consecutive plans were made after registering each treatment-day CT scan with the initial scan and then replanning.

  19. The exact solution of self-consistent equations in the scanning near-field optic microscopy problem

    DEFF Research Database (Denmark)

    Lozovski, Valeri; Bozhevolnyi, Sergey I.

    1999-01-01

    The macroscopic approach that allows one to obtain an exact solution of the self-consistent equation of the Lippmann-Schwinger type is developed. The main idea of our method consists in the usage of a diagram technique for exact summation of the infinite series corresponding to the iteration procedure fo...

  20. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

    The fortune and the risk of a business venture depend on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behaviors rather than debatable assumptions on models and parameters. Simultaneous dep...
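
    The core resampling idea — drawing whole cross-sectional rows of historical data so that simultaneous dependencies between markets are preserved — can be sketched as follows. The data and the function `bootstrap_scenarios` are illustrative toys, not the authors' production algorithm:

```python
import numpy as np

def bootstrap_scenarios(returns, horizon, n_scenarios, seed=0):
    """Resample entire cross-sectional rows of historical returns with
    replacement, preserving simultaneous dependencies between markets,
    and cumulate them into future scenario paths."""
    rng = np.random.default_rng(seed)
    n_obs, _ = returns.shape
    idx = rng.integers(0, n_obs, size=(n_scenarios, horizon))
    return returns[idx].cumsum(axis=1)   # (n_scenarios, horizon, n_assets)

# Two perfectly dependent toy "markets" (r2 = 2 * r1): row-wise
# resampling keeps that relation in every generated scenario.
r1 = np.linspace(-0.02, 0.02, 50)
hist = np.column_stack([r1, 2.0 * r1])
paths = bootstrap_scenarios(hist, horizon=12, n_scenarios=100)
print(np.allclose(paths[:, :, 1], 2.0 * paths[:, :, 0]))  # True
```

    Because each draw is a complete historical row, any contemporaneous dependence present in the data (here the exact 2:1 relation) survives in all simulated scenarios.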

  1. Consistent and efficient processing of ADCP streamflow measurements

    Science.gov (United States)

    Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan

    2016-01-01

    The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a common method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, automated filtering, and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer are being developed in a software program that can process ADCP moving-boat discharge measurements independent of the ADCP used to collect the data.
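
    As a manufacturer-independent illustration of how a discharge is assembled from measured verticals, here is the classic mid-section method (a generic textbook computation, not the specific algorithms under development in the record above; the station data are invented):

```python
def midsection_discharge(stations):
    """Mid-section method: each vertical's depth x mean velocity is applied
    to a width reaching halfway to its neighbouring verticals."""
    q = 0.0
    for i, (dist, depth, vel) in enumerate(stations):
        left = stations[i - 1][0] if i > 0 else dist
        right = stations[i + 1][0] if i < len(stations) - 1 else dist
        width = (right - left) / 2.0
        q += vel * depth * width     # partial discharge of this vertical
    return q

# (distance from bank [m], depth [m], mean velocity [m/s]) per vertical
stations = [(0, 0.0, 0.0), (2, 1.0, 0.5), (4, 2.0, 1.0),
            (6, 1.0, 0.5), (8, 0.0, 0.0)]
print(midsection_discharge(stations))  # 6.0 (m^3/s)
```

    Consistency across instruments then reduces to agreeing on exactly such conventions: the edge treatment at the banks, the widths assigned to each vertical, and how invalid verticals are filled in.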

  2. A Consistent Pricing Model for Index Options and Volatility Derivatives

    DEFF Research Database (Denmark)

    Cont, Rama; Kokholm, Thomas

    We propose and study a flexible modeling framework for the joint dynamics of an index and a set of forward variance swap rates written on this index, allowing options on forward variance swaps and options on the underlying index to be priced consistently. Our model reproduces various empirically observed properties of variance swap dynamics and allows for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for options on variance swaps as well as efficient numerical methods for pricing of European options on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options...

  3. Joint depth map and color consistency estimation for stereo images with different illuminations and cameras.

    Science.gov (United States)

    Heo, Yong Seok; Lee, Kyoung Mu; Lee, Sang Uk

    2013-05-01

    In this paper, we propose a method that infers both accurate depth maps and color-consistent stereo images for radiometrically varying stereo images. In general, stereo matching and enforcing color consistency between stereo images are a chicken-and-egg problem, since it is not a trivial task to achieve both goals simultaneously. Hence, we have developed an iterative framework in which these two processes can boost each other. First, we transform the input color images to log-chromaticity color space, in which a linear relationship can be established while constructing a joint pdf of the transformed left and right color images. From this joint pdf, we can estimate a linear function that relates the corresponding pixels in the stereo images. Based on this linear property, we present a new stereo matching cost by combining Mutual Information (MI), the SIFT descriptor, and segment-based plane-fitting to robustly find correspondences for stereo image pairs that undergo radiometric variations. Meanwhile, we devise a Stereo Color Histogram Equalization (SCHE) method to produce color-consistent stereo image pairs, which conversely boosts the disparity map estimation. Experimental results show that our method produces both accurate depth maps and color-consistent stereo images, even for stereo images with severe radiometric differences.
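
    The key log-chromaticity property — a global radiometric gain between views becomes a simple linear (here additive) relation — can be sketched with toy data. The per-channel gains and helper names are our assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def log_chromaticity(img):
    """Map RGB pixels to log-chromaticity coordinates (log R/G, log B/G)."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    return np.stack([np.log(r / g), np.log(b / g)], axis=-1)

# Toy corresponding pixels; per-channel gains model a global
# radiometric (illumination/camera) difference between the two views.
rng = np.random.default_rng(0)
left = rng.uniform(0.1, 1.0, size=(100, 3))
right = left * np.array([1.5, 0.8, 1.2])

x = log_chromaticity(left)[:, 0]    # log R/G in the left image
y = log_chromaticity(right)[:, 0]   # same pixels in the right image
# A multiplicative gain becomes an additive offset in log space, so
# corresponding pixels satisfy y = a*x + b with slope a = 1.
a, b = np.polyfit(x, y, 1)
print(round(a, 3), round(b, 3))  # 1.0 0.629  (offset = log(1.5/0.8))
```

    It is this linear relation between corresponding pixels that makes a joint-pdf-based estimate tractable and lets the matching cost tolerate radiometric variation.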

  4. How consistent is teachers’ planning, implementation, and assessment in character education?

    Directory of Open Access Journals (Sweden)

    Luh Gd Rahayu Budiarta

    2018-01-01

    This descriptive case study aims at investigating how teachers planned, practiced, and assessed students’ character development in the classroom. The study involved 29 students of Grade 5, a classroom, and an English teacher. The instruments included classroom observation, a checklist, an interview guide, field notes, and video recording. The obtained data were analyzed using the method of Miles and Huberman (1994). The results show that ten character values were detected in the lesson plans; however, only 55% consistently appeared in the teaching and learning process. Teachers were found to use performance rubrics and diaries to keep records of students’ character development. The interviews revealed reluctance regarding the teacher’s role in assessing students’ characters. This fact implies that there should be intensive training and workshops for the teachers to improve the quality of character education practices in elementary schools in Bali.

  5. Fingerprint analysis and quality consistency evaluation of flavonoid compounds for fermented Guava leaf by combining high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry and chemometric methods.

    Science.gov (United States)

    Wang, Lu; Tian, Xiaofei; Wei, Wenhao; Chen, Gong; Wu, Zhenqiang

    2016-10-01

    Guava leaves are used in traditional herbal teas as antidiabetic therapies. Flavonoids are the main active compounds of Guava leaves and have many physiological functions. However, the flavonoid compositions and activities of Guava leaves could change due to microbial fermentation. A high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry method was applied to identify the varieties of the flavonoids in Guava leaves before and after fermentation. High-performance liquid chromatography, hierarchical cluster analysis and principal component analysis were used to quantitatively determine the changes in flavonoid compositions and evaluate the consistency and quality of Guava leaves. Monascus anka and Saccharomyces cerevisiae fermented Guava leaves contained 2.32- and 4.06-fold more total flavonoids and quercetin, respectively, than natural Guava leaves. The flavonoid compounds of the natural Guava leaves had similarities ranging from 0.837 to 0.927. The flavonoid compounds from the Monascus anka and S. cerevisiae fermented Guava leaves had similarities higher than 0.993. This indicated that the quality consistency of the fermented Guava leaves was better than that of the natural Guava leaves. High-performance liquid chromatography fingerprinting and chemometric analysis are promising methods for evaluating the degree of fermentation of Guava leaves based on quality consistency, which could be used in assessing flavonoid compounds for the production of fermented Guava leaves. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
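
    A similarity score of the kind quoted above (0.837-0.993) is commonly computed as the correlation of each batch's fingerprint with a reference profile. The sketch below uses invented peak-area vectors and the mean profile as reference; the paper's exact chemometric pipeline may differ:

```python
import numpy as np

def fingerprint_similarity(profiles):
    """Correlation of each chromatographic fingerprint with the mean
    profile -- one common 'similarity' measure in quality-consistency
    evaluation of herbal products."""
    mean = profiles.mean(axis=0)
    return np.array([np.corrcoef(p, mean)[0, 1] for p in profiles])

# Toy peak-area vectors for 4 batches; the last batch deviates strongly
batches = np.array([
    [10.0, 5.0, 2.0, 8.0],
    [10.5, 5.2, 1.9, 8.3],
    [ 9.8, 4.9, 2.1, 7.9],
    [ 4.0, 9.0, 6.0, 2.0],   # inconsistent batch
])
sims = fingerprint_similarity(batches)
print((sims[:3] > 0.9).all(), sims[3] < 0.5)  # True True
```

    Consistent batches score close to 1 against the reference fingerprint, while an aberrant batch is flagged by a much lower (here even negative) correlation.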

  6. Analysis of self-consistency effects in range-separated density-functional theory with Møller-Plesset perturbation theory

    DEFF Research Database (Denmark)

    Fromager, Emmanuel; Jensen, Hans Jørgen Aagaard

    2011-01-01

    Range-separated density-functional theory combines wave function theory for the long-range part of the two-electron interaction with density-functional theory for the short-range part. When describing the long-range interaction with non-variational methods, such as perturbation or coupled-cluster theories, self-consistency effects are introduced in the density-functional part, which for an exact solution requires iterations. They are generally assumed to be small but no detailed study has been performed so far. Here, the authors analyze self-consistency when using Møller-Plesset-type (MP) perturbation theory for the long-range interaction. The lowest-order self-consistency corrections to the wave function and the energy, which enter the perturbation expansions at the second and fourth order, respectively, are both expressed in terms of the one-electron reduced density matrix. The computational...

  7. Internal consistency & validity of Indian Disability Evaluation and Assessment Scale (IDEAS) in patients with schizophrenia

    Directory of Open Access Journals (Sweden)

    Sandeep Grover

    2014-01-01

    Background & objectives: The Indian Disability Evaluation and Assessment Scale (IDEAS) has been recommended for assessment and certification of disability by the Government of India (GOI). However, the psychometric properties of IDEAS as adopted by the GOI remain understudied. Our aim, thus, was to study the internal consistency and validity of IDEAS in patients with schizophrenia. Methods: A total of 103 consenting patients with residual schizophrenia were assessed for disability, quality of life (QOL) and psychopathology using the IDEAS, WHOQOL-100 and Positive and Negative Syndrome Scale (PANSS), respectively. Internal consistency was calculated using Cronbach's alpha. For construct validity, relations between IDEAS, and psychopathology and QOL were studied. Results: The inter-item correlations for IDEAS were significant, with a Cronbach's alpha of 0.721. All item scores other than the score on communication and understanding, as well as the total and global IDEAS scores, correlated significantly with the positive, negative and general sub-scale scores and the total PANSS score. Communication and understanding was significantly related to the negative sub-scale score only. Total and global disability scores correlated negatively with all the domains of WHOQOL-100 (P<0.01). The individual IDEAS item scores correlated negatively with various WHOQOL-100 domains (P<0.01). Interpretation & conclusions: The study findings showed that the GOI-modified IDEAS has good internal consistency and construct validity as tested in patients with residual schizophrenia. Similar studies need to be done with other groups of patients.
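
    For reference, the internal-consistency statistic used in this study (Cronbach's alpha) can be computed directly from an item-score matrix. The scores below are invented toy data, not the IDEAS ratings:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Toy ratings: 5 subjects x 3 items that track each other closely
scores = [[1, 1, 1],
          [2, 2, 3],
          [3, 3, 2],
          [4, 5, 4],
          [5, 4, 5]]
alpha = cronbach_alpha(scores)
print(round(alpha, 3))  # 0.951 -- high internal consistency
```

    Items that covary strongly inflate the variance of the total score relative to the summed item variances, pushing alpha toward 1; uncorrelated items push it toward 0.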

  8. Combining the Complete Active Space Self-Consistent Field Method and the Full Configuration Interaction Quantum Monte Carlo within a Super-CI Framework, with Application to Challenging Metal-Porphyrins.

    Science.gov (United States)

    Li Manni, Giovanni; Smart, Simon D; Alavi, Ali

    2016-03-08

    A novel stochastic Complete Active Space Self-Consistent Field (CASSCF) method has been developed and implemented in the Molcas software package. A two-step procedure is used, in which the CAS configuration interaction secular equations are solved stochastically with the Full Configuration Interaction Quantum Monte Carlo (FCIQMC) approach, while orbital rotations are performed using an approximated form of the Super-CI method. This new method does not suffer from the strong combinatorial limitations of standard MCSCF implementations using direct schemes and can handle active spaces well in excess of those accessible to traditional CASSCF approaches. The density matrix formulation of the Super-CI method makes this step independent of the size of the CI expansion, depending exclusively on one- and two-body density matrices with indices restricted to the relatively small number of active orbitals. No sigma vectors need to be stored in memory for the FCIQMC eigensolver--a substantial gain in comparison to implementations using the Davidson method, which require three or more vectors of the size of the CI expansion. Further, no orbital Hessian is computed, circumventing limitations on basis set expansions. Like the parent FCIQMC method, the present technique is scalable on massively parallel architectures. We present in this report the method and its application to the free-base porphyrin, Mg(II) porphyrin, and Fe(II) porphyrin. In the present study, active spaces up to 32 electrons and 29 orbitals in orbital expansions containing up to 916 contracted functions are treated with modest computational resources. Results are quite promising even without accounting for the correlation outside the active space. The systems here presented clearly demonstrate that large CASSCF calculations are possible via FCIQMC-CASSCF without limitations on basis set size.

  9. A reduced-scaling density matrix-based method for the computation of the vibrational Hessian matrix at the self-consistent field level

    International Nuclear Information System (INIS)

    Kussmann, Jörg; Luenser, Arne; Beer, Matthias; Ochsenfeld, Christian

    2015-01-01

    An analytical method to calculate the molecular vibrational Hessian matrix at the self-consistent field level is presented. By analysis of the multipole expansions of the relevant derivatives of Coulomb-type two-electron integral contractions, we show that the effect of the perturbation on the electronic structure due to the displacement of nuclei decays at least as r^(-2) instead of r^(-1). The perturbation is asymptotically local, and the computation of the Hessian matrix can, in principle, be performed with O(N) complexity. Our implementation exhibits linear scaling in all time-determining steps, with some rapid but quadratic-complexity steps remaining. Sample calculations illustrate linear or near-linear scaling in the construction of the complete nuclear Hessian matrix for sparse systems. For more demanding systems, scaling is still considerably sub-quadratic to quadratic, depending on the density of the underlying electronic structure

  10. A theoretically consistent stochastic cascade for temporal disaggregation of intermittent rainfall

    Science.gov (United States)

    Lombardo, F.; Volpi, E.; Koutsoyiannis, D.; Serinaldi, F.

    2017-06-01

    Generating fine-scale time series of intermittent rainfall that are fully consistent with any given coarse-scale totals is a key and open issue in many hydrological problems. We propose a stationary disaggregation method that simulates rainfall time series with given dependence structure, wet/dry probability, and marginal distribution at a target finer (lower-level) time scale, preserving full consistency with variables at a parent coarser (higher-level) time scale. We account for the intermittent character of rainfall at fine time scales by merging a discrete stochastic representation of intermittency and a continuous one of rainfall depths. This approach yields a unique and parsimonious mathematical framework providing general analytical formulations of mean, variance, and autocorrelation function (ACF) for a mixed-type stochastic process in terms of mean, variance, and ACFs of both continuous and discrete components, respectively. To achieve full consistency between variables at finer and coarser time scales in terms of marginal distribution and coarse-scale totals, the generated lower-level series are adjusted according to a procedure that does not affect the stochastic structure implied by the original model. To assess model performance, we study the rainfall process as intermittent with both independent and dependent occurrences, where dependence is quantified by the probability that two consecutive time intervals are dry. In either case, we provide analytical formulations of the main statistics of our mixed-type disaggregation model and show their clear accordance with Monte Carlo simulations. An application to rainfall time series from the real world is shown as a proof of concept.
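
    The consistency requirement — fine-scale values summing exactly to the coarse totals — can be sketched with a deliberately simplified generator. The Bernoulli-exponential rainfall model and the plain proportional adjustment below are stand-ins; the paper's adjustment is designed to also preserve the full stochastic structure:

```python
import numpy as np

def disaggregate(coarse_totals, n_sub, seed=0):
    """Simulate intermittent fine-scale rainfall (Bernoulli occurrence x
    exponential depth), then rescale each block proportionally so the
    fine values sum exactly to the given coarse-scale total."""
    rng = np.random.default_rng(seed)
    fine = []
    for total in coarse_totals:
        wet = rng.random(n_sub) < 0.4                # dry/wet intermittency
        depths = wet * rng.exponential(1.0, n_sub)
        if depths.sum() == 0:                        # force one wet interval
            depths[rng.integers(n_sub)] = 1.0
        fine.append(depths * total / depths.sum())   # proportional adjustment
    return np.concatenate(fine)

coarse = np.array([12.0, 0.0, 5.0])    # e.g. daily totals (mm)
fine = disaggregate(coarse, n_sub=24)  # hourly values
print(np.allclose(fine.reshape(3, 24).sum(axis=1), coarse))  # True
```

    Note how the dry day (total 0.0) stays entirely dry and every wet day is reproduced exactly, whatever the simulated fine-scale pattern.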

  11. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  12. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  13. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan, yet suitable times and methods for its propagation are not widely known. The main objective of the present study was therefore to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses, but gamma irradiation did not show any significant change in seed germination. A great variation in survival of stem cuttings was observed in each month of 2012. October and November were found to be the most suitable months for stem cutting survival (60%). In order to enhance survival, stem cuttings were also dipped in different plant growth regulator (PGR) solutions. Only indole butyric acid (IBA; 1000 ppm) treated cuttings showed a higher survival (33%) than the control (11.1%). Furthermore, a simple and feasible indirect regeneration system was established from leaf explants. The best callus induction (84.6%) was observed on MS medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l−1). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l−1) and gibberellic acid (GA3; 0.5 mg l−1). Plantlets were successfully acclimatized in plastic pots. The current results favour micropropagation (85%) over seed germination (25.51–40%) and stem cutting (60%). PMID:25473365

  14. Generation of static solutions of the self-consistent system of Einstein-Maxwell equations

    International Nuclear Information System (INIS)

    Anchikov, A.M.; Daishev, R.A.

    1988-01-01

    A theorem is proved according to which, to each solution of the Einstein equations with an arbitrary energy-momentum tensor on the right-hand side, there corresponds a static solution of the self-consistent system of Einstein-Maxwell equations. As a consequence of this theorem, a method is established for generating static solutions of the self-consistent system of Einstein-Maxwell equations, with a charged grain as a source, from vacuum solutions of the Einstein equations

  15. Self-consistent cluster theories for alloys with diagonal and off-diagonal disorder

    International Nuclear Information System (INIS)

    Gonis, A.; Garland, J.W.

    1978-01-01

    The molecular coherent-potential approximation (MCPA) and other, simpler cluster approximations for disordered alloys are studied both analytically and numerically for alloys with diagonal and off-diagonal disorder (ODD). First, the MCPA for alloys with only diagonal disorder is rederived within the interactor formalism of Blackman, Esterling, and Berk. This formalism, which simplifies the numerical implementation of the MCPA, is then used to generalize the MCPA so as to take account of ODD. It is shown that the analytic properties of the MCPA are preserved under this generalization. Also, two computationally simple cluster approximations, the self-consistent central-site approximation (SCCSA) and the self-consistent boundary-site approximation (SCBSA), are generalized to include the effects of ODD. It is shown that for one-dimensional systems with only nearest-neighbor hopping the SCBSA yields Green's functions which are identical to those given by the MCPA and thus are analytic, even in the presence of ODD. Finally, the results of numerical calculations are reported for one-dimensional systems with only nearest-neighbor hopping but with both diagonal and off-diagonal disorder. These calculations were performed using the single-site approximation of Blackman, Esterling, and Berk and three different cluster approximations: the multishell method previously proposed by the authors, the SCCSA, and the SCBSA. The results of these calculations are compared with exact results and with previous results obtained using the truncated t-matrix approximation and the recent method of Kaplan and Gray. These comparisons suggest that the multishell method and the generalization of the SCBSA given in this paper are more efficient and accurate for the calculation of densities of states for systems with ODD. On the other hand, as expected, the SCCSA was found to yield severely nonanalytic results for the values of band parameters used.

  16. A Bayesian Decision-Theoretic Approach to Logically-Consistent Hypothesis Testing

    Directory of Open Access Journals (Sweden)

    Gustavo Miranda da Silva

    2015-09-01

    This work addresses an important issue regarding the performance of simultaneous test procedures: the construction of multiple tests that are at the same time optimal from a statistical perspective and yield logically-consistent results that are easy to communicate to practitioners of statistical methods. For instance, if hypothesis A implies hypothesis B, is it possible to create optimal testing procedures that reject A whenever they reject B? Unfortunately, several standard testing procedures fail to have such logical consistency. Although this has been deeply investigated under a frequentist perspective, the literature lacks analyses under the Bayesian paradigm. In this work, we contribute to the discussion by investigating three rational relationships from a Bayesian decision-theoretic standpoint: coherence, invertibility and union consonance. We characterize and illustrate, through simple examples, optimal Bayes tests that fulfill each of these requisites separately. We also explore how far one can go by putting these requirements together. We show that, although fairly intuitive tests satisfy both coherence and invertibility, no Bayesian testing scheme meets the desiderata as a whole, strengthening the understanding that logical consistency cannot be combined with statistical optimality in general. Finally, we associate Bayesian hypothesis testing with Bayes point estimation procedures. We prove the performance of logically-consistent hypothesis testing by means of a Bayes point estimator to be optimal only under very restrictive conditions.
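
    The coherence requirement ("reject A whenever B is rejected" for nested A implies B) is satisfied, for example, by tests that reject when the posterior probability of the hypothesis falls below a threshold, since nested hypotheses have ordered posterior probabilities. The sketch below is one simple Bayes test of this kind, not the paper's decision-theoretic construction; the posterior and hypotheses are toys:

```python
import numpy as np

def posterior_test(samples, region, threshold=0.05):
    """Reject a hypothesis when its posterior probability, estimated
    from posterior samples, falls below the threshold."""
    prob = np.mean([region(s) for s in samples])
    return prob < threshold          # True = reject

# Toy posterior for a parameter theta: N(1, 0.2)
rng = np.random.default_rng(0)
samples = rng.normal(1.0, 0.2, 10000)

A = lambda t: 0.0 <= t <= 0.1        # hypothesis A
B = lambda t: 0.0 <= t <= 0.5        # A implies B (A is nested in B)
# Coherence holds by construction: P(A) <= P(B) for nested hypotheses,
# so rejecting B forces rejecting A as well.
print(posterior_test(samples, B), posterior_test(samples, A))  # True True
```

    The interesting content of the record above is that such intuitively coherent rules cannot, in general, satisfy all the logical requisites and statistical optimality simultaneously.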

  17. Consistent robustness analysis (CRA) identifies biologically relevant properties of regulatory network models.

    Science.gov (United States)

    Saithong, Treenut; Painter, Kevin J; Millar, Andrew J

    2010-12-16

    A number of studies have previously demonstrated that "goodness of fit" is insufficient in reliably classifying the credibility of a biological model. Robustness and/or sensitivity analysis is commonly employed as a secondary method for evaluating the suitability of a particular model. The results of such analyses invariably depend on the particular parameter set tested, yet many parameter values for biological models are uncertain. Here, we propose a novel robustness analysis that aims to determine the "common robustness" of the model with multiple, biologically plausible parameter sets, rather than the local robustness for a particular parameter set. Our method is applied to two published models of the Arabidopsis circadian clock (the one-loop [1] and two-loop [2] models). The results reinforce current findings suggesting the greater reliability of the two-loop model and pinpoint the crucial role of TOC1 in the circadian network. Consistent Robustness Analysis can indicate both the relative plausibility of different models and also the critical components and processes controlling each model.
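The idea of "common robustness" across multiple plausible parameter sets can be sketched as follows. This is a toy illustration under our own assumptions, not the authors' implementation: `model_output` stands in for any model observable (e.g. circadian period), local robustness is measured by the relative output change under single-parameter perturbations, and the common robustness is the average over many sampled parameter sets.

```python
import numpy as np

def model_output(params: np.ndarray) -> float:
    """Toy stand-in for a model observable (e.g. an oscillation period)."""
    k1, k2, k3 = params
    return k1 * k2 / (k3 + k1)

def local_robustness(params: np.ndarray, delta: float = 0.1) -> float:
    """1 minus the mean relative output change under +/-10% single-parameter perturbations."""
    base = model_output(params)
    changes = []
    for i in range(len(params)):
        for sign in (+1.0, -1.0):
            p = params.copy()
            p[i] *= 1.0 + sign * delta
            changes.append(abs(model_output(p) - base) / abs(base))
    return 1.0 - float(np.mean(changes))

# "Common robustness": aggregate local robustness over many plausible parameter sets
rng = np.random.default_rng(0)
sets = rng.uniform(0.5, 2.0, size=(50, 3))
common = float(np.mean([local_robustness(p) for p in sets]))
print(f"common robustness: {common:.3f}")
```

Replacing the average with the distribution of `local_robustness` values would also expose which parameter sets (and which parameters) dominate the model's fragility, mirroring the paper's identification of critical components.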

  18. Unilateral Measures addressing Non-Trade Concerns. A Study on WTO Consistency, Relevance of other International Agreements, Economic Effectiveness and Impact on Developing Countries of Measures concerning Non-Product-Related Processes and Production Methods

    International Nuclear Information System (INIS)

    Van den Bossche, P.; Schrijver, N.; Faber, G.

    2007-01-01

    Over the last two years, the debate in the Netherlands on trade measures addressing non-trade concerns has focused on two important and politically sensitive issues, namely: (1) the sustainability of the large-scale production of biomass as an alternative source of energy; and (2) the production of livestock products in a manner that is consistent with animal welfare requirements. In February 2007 a report was issued on the 'Toetsingskader voor Duurzame Biomassa', the so-called Cramer Report. This report discusses the risks associated with large-scale biomass production and establishes a list of criteria for the sustainable production of biomass. These criteria reflect a broad range of non-trade concerns, including environmental protection, global warming, food security, biodiversity, economic prosperity and social welfare. The report recognizes that the implementation of the criteria (including the establishment of a certification system) will require careful consideration of the obligations of the Netherlands under EU and WTO law. Governments called upon to address non-trade concerns may do so by using different types of measures. Prominent among these are measures concerning processes and production methods of products. In the present study, these issues are examined primarily with regard to existing, proposed or still purely hypothetical measures for implementing the Cramer criteria for the sustainable production of biomass. Several other, non-energy-related issues are discussed in this report

  19. Synchronization in node of complex networks consist of complex chaotic system

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Qiang, E-mail: qiangweibeihua@163.com [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024 (China); Xie, Cheng-jun [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Liu, Hong-jun [School of Information Engineering, Weifang Vocational College, Weifang, 261041 (China); Li, Yan-hui [The Library, Weifang Vocational College, Weifang, 261041 (China)

    2014-07-15

A new synchronization method is investigated for nodes of complex networks consisting of complex chaotic systems. When the networks synchronize, different components of the complex state variable synchronize up to different complex scaling functions through a designed complex feedback controller. This paper extends the synchronization scaling function from the real field to the complex field for synchronization in nodes of complex networks with complex chaotic systems. Synchronization in complex networks with constant coupling delay and with time-varying coupling delay is investigated, respectively. Numerical simulations show the effectiveness of the proposed method.

  20. Consistency of GPS and strong-motion records: case study of the Mw9.0 Tohoku-Oki 2011 earthquake

    Science.gov (United States)

    Psimoulis, Panos; Houlié, Nicolas; Michel, Clotaire; Meindl, Michael; Rothacher, Markus

    2014-05-01

High-rate GPS data are today commonly used to supplement seismic data in measuring Earth surface motions, with a focus on earthquake characterisation and rupture modelling. Processing of GPS records using Precise Point Positioning (PPP) can provide real-time information on seismic wave propagation, tsunami early warning and seismic rupture. Most studies have shown differences between the GPS and seismic systems at very long periods (e.g. >100 s) and for static displacements. The aim of this study is to assess the consistency of GPS and strong-motion records by comparing their respective displacement waveforms over several frequency bands. For this purpose, the records of the GPS (GEONET) and strong-motion (KiK-net and K-NET) networks corresponding to the Mw9.0 Tohoku 2011 earthquake were analysed. The comparison of the displacement waveforms of collocated (distance <100 m) GPS and strong-motion sites shows that the consistency between the two datasets depends on the frequency of the excitation. Differences are mainly due to GPS noise at relatively short periods (<3-4 s) and saturation of the strong-motion sensors at relatively long periods (40-80 s). Furthermore, the agreement between the GPS and strong-motion records also depends on the direction of the excitation signal and the distance from the epicentre. In conclusion, velocities and displacements recovered from GPS and strong-motion records are consistent at long periods (3-100 s), proving that GPS networks can contribute to the real-time estimation of the long-period ground-motion map of an earthquake.
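The band-by-band comparison described above can be sketched with a simple band-pass filter and a correlation measure. This is an illustrative sketch, not the study's processing chain: `band_consistency` and the synthetic signals are our own, and the band is specified as a period range in seconds as in the abstract.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def band_consistency(disp_a, disp_b, fs, band):
    """Correlation of two displacement records restricted to a period band.

    band is (t_min, t_max) in seconds; records are sampled at fs Hz.
    """
    sos = butter(4, [1.0 / band[1], 1.0 / band[0]], btype="band", fs=fs, output="sos")
    fa, fb = sosfiltfilt(sos, disp_a), sosfiltfilt(sos, disp_b)
    return float(np.corrcoef(fa, fb)[0, 1])

# Synthetic example: same 10 s ground motion, GPS-like white noise added to one copy
fs = 10.0
t = np.arange(0.0, 600.0, 1.0 / fs)
sig = np.sin(2 * np.pi * t / 10.0)
gps = sig + 0.3 * np.random.default_rng(1).standard_normal(t.size)
strong_motion = sig.copy()

r = band_consistency(gps, strong_motion, fs, band=(3.0, 100.0))
print(f"correlation in the 3-100 s band: {r:.3f}")
```

Repeating the comparison for short-period (<3 s) and very long-period (>100 s) bands would reproduce the kind of frequency-dependent agreement the study reports, with GPS noise degrading the short-period band.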

  1. Radionuclide methods application in cardiac studies

    International Nuclear Information System (INIS)

    Kotina, E.D.; Ploskikh, V.A.; Babin, A.V.

    2013-01-01

    Radionuclide methods are one of the most modern methods of functional diagnostics of diseases of the cardio-vascular system that requires the use of mathematical methods of processing and analysis of data obtained during the investigation. Study is carried out by means of one-photon emission computed tomography (SPECT). Mathematical methods and software for SPECT data processing are developed. This software allows defining physiologically meaningful indicators for cardiac studies

  2. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...

  3. Consistency-dependent optical properties of lubricating grease studied by terahertz spectroscopy

    International Nuclear Information System (INIS)

    Tian Lu; Zhao Kun; Zhou Qing-Li; Shi Yu-Lei; Zhao Dong-Mei; Zhang Cun-Lin; Zhao Song-Qing; Zhao Hui; Bao Ri-Ma; Zhu Shou-Ming; Miao Qing

    2011-01-01

The optical properties of four kinds of lubricating greases (urea, lithium, extreme-pressure lithium and molybdenum disulfide lithium greases) with different NLGI (National Lubricating Grease Institute) numbers were investigated using terahertz time-domain spectroscopy. Greases with different NLGI grades have unique spectral features in the terahertz range. Comparison of the experimental data with predictions based on Lorentz-Lorenz theory showed that the refractive indices of each kind of lubricating grease depend on its consistency. In addition, molybdenum disulfide (MoS2) as a lubricant additive shows strong absorption from 0.2 to 1.4 THz, leading to higher absorption in MoS2-lithium grease than in lithium grease. (general)
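The Lorentz-Lorenz relation linking refractive index to density (and hence to consistency) can be made explicit. The sketch below is illustrative only: the relation (n² - 1)/(n² + 2) = A·ρ/M is solved for n, and the molar refraction, density and molar mass values are invented placeholders, not measured grease properties.

```python
import math

def refractive_index(molar_refraction: float, density: float, molar_mass: float) -> float:
    """Lorentz-Lorenz: (n^2 - 1)/(n^2 + 2) = A * rho / M, solved for n."""
    x = molar_refraction * density / molar_mass
    return math.sqrt((1.0 + 2.0 * x) / (1.0 - x))

# Toy numbers (A in cm^3/mol, rho in g/cm^3, M in g/mol) - purely illustrative:
for rho in (0.85, 0.90, 0.95):   # denser grease ~ stiffer consistency
    n = refractive_index(molar_refraction=65.0, density=rho, molar_mass=450.0)
    print(f"rho = {rho:.2f} g/cm^3  ->  n = {n:.3f}")
```

The monotonic increase of n with ρ is the qualitative behaviour the abstract attributes to the consistency dependence of the measured terahertz refractive indices.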

  4. Self-consistent theory of hadron-nucleus scattering. Application to pion physics

    International Nuclear Information System (INIS)

    Johnson, M.B.

    1981-01-01

The first part of this set of two seminars will consist of a review of several of the important accomplishments made in the last few years in the field of pion-nucleus physics. Next I discuss some questions raised by these accomplishments and show that, for some very natural reasons, the commonly employed theoretical methods cannot be applied to answer these questions. This situation leads to the idea of self-consistency, which is first explained in a general context. The remainder of the seminars is devoted to illustrating the idea within a simple multiple-scattering model for the case of pion scattering. An evaluation of the effectiveness of the self-consistency requirement to produce a solution to the model is made, and a few of the questions raised by recent accomplishments in the field of pion physics are addressed in the model. Finally, the results of the model calculation are compared to experimental data and the implications of the results are discussed. (orig./HSI)

  5. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models

  6. Personality and behavior prediction and consistency across cultures: A multimethod study of Blacks and Whites in South Africa.

    Science.gov (United States)

    Fetvadjiev, Velichko H; Meiring, Deon; van de Vijver, Fons J R; Nel, J Alewyn; Sekaja, Lusanda; Laher, Sumaya

    2018-03-01

    The cross-cultural universality of behavior's consistency and predictability from personality, assumed in trait models though challenged in cultural psychological models, has usually been operationalized in terms of beliefs and perceptions, and assessed using single-instance self-reports. In a multimethod study of actual behavior across a range of situations, we examined predictability and consistency in participants from the more collectivistic Black ethnic group and the more individualistic White group in South Africa. Participants completed personality questionnaires before the behavior measurements. In Study 1, 107 Black and 241 White students kept diaries for 21 days, recording their behaviors and the situations in which they had occurred. In Study 2, 57 Black and 52 White students were video-recorded in 12 situations in laboratory settings, and external observers scored their behaviors. Across both studies, behavior was predicted by personality on average equally well in the 2 groups, and equally well when using trait-adjective- and behavior-based personality measures. The few cultural differences in situational variability were not in line with individualism-collectivism; however, subjective perceptions of variability, operationalized as dialectical beliefs, were more in line with individualism-collectivism: Blacks viewed their behavior as more variable than Whites. We propose drawing a distinction between subjective beliefs and objective behavior in the study of personality and culture. Larger cultural differences can be expected in beliefs and perceptions than in the links between personality and actual behavior. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  7. Consistency in the Reporting of Sensitive Behaviors by Adolescent American Indian Women: A Comparison of Interviewing Methods

    Science.gov (United States)

    Mullany, Britta; Barlow, Allison; Neault, Nicole; Billy, Trudy; Hastings, Ranelda; Coho-Mescal, Valerie; Lorenzo, Sherilyn; Walkup, John T.

    2013-01-01

    Computer-assisted interviewing techniques have increasingly been used in program and research settings to improve data collection quality and efficiency. Little is known, however, regarding the use of such techniques with American Indian (AI) adolescents in collecting sensitive information. This brief compares the consistency of AI adolescent…

  8. No Consistent Evidence for Advancing or Delaying Trends in Spring Phenology on the Tibetan Plateau

    Science.gov (United States)

    Wang, Xufeng; Xiao, Jingfeng; Li, Xin; Cheng, Guodong; Ma, Mingguo; Che, Tao; Dai, Liyun; Wang, Shaoying; Wu, Jinkui

    2017-12-01

    Vegetation phenology is a sensitive indicator of climate change and has significant effects on the exchange of carbon, water, and energy between the terrestrial biosphere and the atmosphere. The Tibetan Plateau, the Earth's "third pole," is a unique region for studying the long-term trends in vegetation phenology in response to climate change because of the sensitivity of its alpine ecosystems to climate and its low-level human disturbance. There has been a debate whether the trends in spring phenology over the Tibetan Plateau have been continuously advancing over the last two to three decades. In this study, we examine the trends in the start of growing season (SOS) for alpine meadow and steppe using the Global Inventory Modeling and Mapping Studies (GIMMS)3g normalized difference vegetation index (NDVI) data set (1982-2014), the GIMMS NDVI data set (1982-2006), the Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI data set (2001-2014), the Satellite Pour l'Observation de la Terre Vegetation (SPOT-VEG) NDVI data set (1999-2013), and the Sea-viewing Wide Field-of-View Sensor (SeaWiFS) NDVI data set (1998-2007). Both logistic and polynomial fitting methods are used to retrieve the SOS dates from the NDVI data sets. Our results show that the trends in spring phenology over the Tibetan Plateau depend on both the NDVI data set used and the method for retrieving the SOS date. There are large discrepancies in the SOS trends among the different NDVI data sets and between the two different retrieval methods. There is no consistent evidence that spring phenology ("green-up" dates) has been advancing or delaying over the Tibetan Plateau during the last two to three decades. Ground-based budburst data also indicate no consistent trends in spring phenology. The responses of SOS to environmental factors (air temperature, precipitation, soil temperature, and snow depth) also vary among NDVI data sets and phenology retrieval methods. The increases in winter and spring
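The logistic retrieval of the start of growing season (SOS) mentioned above can be sketched concretely. This is an illustrative sketch, not the study's code: the seasonal NDVI model, the synthetic trajectory and the "20% of seasonal amplitude" SOS convention are our own assumptions (several SOS definitions exist, which is part of why retrieval methods disagree).

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, base, amp, k, t0):
    """Seasonal NDVI model: base + amp / (1 + exp(-k (t - t0)))."""
    return base + amp / (1.0 + np.exp(-k * (t - t0)))

# Synthetic spring NDVI trajectory, ~8-day composites over day of year 1..180
doy = np.arange(1.0, 181.0, 8.0)
true = logistic(doy, 0.15, 0.45, 0.12, 130.0)
ndvi = true + 0.02 * np.random.default_rng(2).standard_normal(doy.size)

popt, _ = curve_fit(logistic, doy, ndvi, p0=(0.1, 0.5, 0.1, 120.0), maxfev=10000)
base, amp, k, t0 = popt

# One common convention: SOS = day when the logistic reaches 20% of its amplitude,
# i.e. 1/(1 + exp(-k (t - t0))) = 0.2, giving t = t0 - ln(4)/k.
sos = t0 - np.log(4.0) / k
print(f"estimated SOS: day {sos:.0f}")
```

Swapping the 20%-amplitude rule for a maximum-curvature criterion, or the logistic for a polynomial fit, changes the retrieved SOS date, which is exactly the method sensitivity the study documents.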

  9. Calculation of Self-consistent Radial Electric Field in Presence of Convective Electron Transport in a Stellarator

    International Nuclear Information System (INIS)

    Kernbichler, W.; Heyn, M.F.; Kasilov, S.V.

    2003-01-01

Convective transport of supra-thermal electrons can play a significant role in the energy balance of stellarators in the case of high-power electron cyclotron heating. Here, the supra-thermal electron flux should be taken into account, together with the neoclassical thermal particle fluxes, in the flux ambipolarity condition, which defines the self-consistent radial electric field. Since the neoclassical particle fluxes are nonlinear functions of the radial electric field, an iterative procedure is needed to solve the ambipolarity condition, with the supra-thermal electron flux calculated at each iteration. A conventional Monte Carlo method used earlier for the evaluation of supra-thermal electron fluxes is too slow to perform these iterations in reasonable computer time. In the present report, the Stochastic Mapping Technique (SMT), which is more efficient than the conventional Monte Carlo method, is used instead. The problem with a local monoenergetic supra-thermal particle source is considered, and the effect of supra-thermal electron fluxes on both the self-consistent radial electric field and the formation of different roots of the ambipolarity condition is studied.
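The iterative solution of the ambipolarity condition can be sketched with toy flux models. This is a hedged illustration, not the paper's SMT code: the flux functions below are invented smooth stand-ins, and the outer loop freezes the supra-thermal flux, re-solves the ambipolarity condition for the radial field, and repeats until the field stops changing.

```python
from scipy.optimize import brentq

def gamma_i(E):       # toy ion flux, decreasing with |E_r|
    return 1.0 / (1.0 + E * E)

def gamma_e(E):       # toy thermal electron flux
    return 0.4 / (1.0 + 0.5 * E * E)

def gamma_supra(E):   # toy supra-thermal electron flux, weakly E-dependent
    return 0.2 * (1.0 + 0.1 * E)

# Outer iteration: freeze the supra-thermal flux, solve
# gamma_i(E) = gamma_e(E) + gamma_supra for E, then update and repeat.
E = 0.0
for _ in range(20):
    s = gamma_supra(E)
    E_new = brentq(lambda x: gamma_i(x) - gamma_e(x) - s, 0.0, 5.0)
    if abs(E_new - E) < 1e-10:
        break
    E = E_new
print(f"self-consistent E_r (toy units): {E:.4f}")
```

With realistic neoclassical fluxes, the bracketing step is where multiple roots of the ambipolarity condition (electron and ion roots) appear; the toy fluxes here admit only one root in the chosen bracket.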

  10. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  11. Consistently high sports/exercise activity is associated with better sleep quality, continuity and depth in midlife women: the SWAN sleep study.

    Science.gov (United States)

    Kline, Christopher E; Irish, Leah A; Krafty, Robert T; Sternfeld, Barbara; Kravitz, Howard M; Buysse, Daniel J; Bromberger, Joyce T; Dugan, Sheila A; Hall, Martica H

    2013-09-01

Objectives: To examine relationships between different physical activity (PA) domains and sleep, and the influence of consistent PA on sleep, in midlife women. Design: Cross-sectional. Setting: Community-based. Participants: 339 women in the Study of Women's Health Across the Nation Sleep Study (52.1 ± 2.1 y). Interventions: None. Measurements and Results: Sleep was examined using questionnaires, diaries and in-home polysomnography (PSG). PA was assessed in three domains (Active Living, Household/Caregiving, Sports/Exercise) using the Kaiser Physical Activity Survey (KPAS) up to 4 times over the 6 years preceding the sleep assessments. The association between recent PA and sleep was evaluated using KPAS scores immediately preceding the sleep assessments. The association between the historical PA pattern and sleep was examined by categorizing PA in each KPAS domain according to its pattern over the 6 years preceding the sleep assessments (consistently low, inconsistent/consistently moderate, or consistently high). Greater recent Sports/Exercise activity was associated with better sleep quality (diary "restedness"), sleep continuity (diary sleep efficiency [SE; P = 0.02]) and depth (higher NREM delta electroencephalographic [EEG] power [P = 0.04], lower NREM beta EEG power). Consistently high Sports/Exercise activity was also associated with better Pittsburgh Sleep Quality Index scores (P = 0.02) and higher PSG-assessed SE. No associations between sleep and Active Living or Household/Caregiving activity (either recent or historical pattern) were noted. Conclusions: Consistently high levels of recreational physical activity, but not lifestyle- or household-related activity, are associated with better sleep in midlife women. Increasing recreational physical activity early in midlife may protect against sleep disturbance in this population.

  12. A Hierarchical Approach for Measuring the Consistency of Water Areas between Multiple Representations of Tile Maps with Different Scales

    Directory of Open Access Journals (Sweden)

    Yilang Shen

    2017-08-01

Full Text Available In geographic information systems, the reliability of querying, analysing, or reasoning results depends on the data quality. One central criterion of data quality is consistency, and identifying inconsistencies is crucial for maintaining the integrity of spatial data from multiple sources or at multiple resolutions. In traditional methods of consistency assessment, vector data are used as the primary experimental data. In this manuscript, we describe the use of a new type of raster data, tile maps, to assess the consistency of information from multiscale representations of the water bodies that make up drainage systems. We describe a hierarchical methodology to determine the spatial consistency of tile-map datasets that display water areas in a raster format. Three characteristic indices, the degree of global feature consistency, the degree of local feature consistency, and the degree of overlap, are proposed to measure the consistency of multiscale representations of water areas. The perceptual hash algorithm and the scale-invariant feature transform (SIFT) descriptor are applied to extract and measure the global and local features of water areas. By performing combined calculations using these three characteristic indices, the degrees of consistency of multiscale representations of water areas can be divided into five grades: exactly consistent, highly consistent, moderately consistent, less consistent, and inconsistent. For evaluation purposes, the proposed method is applied to several test areas from the Tiandi map of China. In addition, we identify key technologies that are related to the process of extracting water areas from a tile map. The accuracy of the consistency assessment method is evaluated, and our experimental results confirm that the proposed methodology is efficient and accurate.
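The perceptual-hash comparison of global features can be sketched with a basic average hash. This is our own minimal illustration, not the paper's implementation: two rasters of a "water area" at different resolutions are reduced to 8×8 block-mean bit signatures and compared by Hamming similarity.

```python
import numpy as np

def average_hash(tile: np.ndarray, size: int = 8) -> np.ndarray:
    """Perceptual (average) hash of a grayscale tile: block-mean downsample, threshold at mean."""
    h, w = tile.shape
    cropped = tile[:h - h % size, :w - w % size]
    small = cropped.reshape(size, cropped.shape[0] // size,
                            size, cropped.shape[1] // size).mean(axis=(1, 3))
    return (small > small.mean()).ravel()

def hamming_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Fraction of matching hash bits (1.0 = identical signatures)."""
    return float((a == b).mean())

# Two toy "water area" rasters at different scales: a coarse mask and a noisy upscaled copy
rng = np.random.default_rng(3)
coarse = np.zeros((64, 64))
coarse[16:48, 8:56] = 1.0
fine = np.kron(coarse, np.ones((4, 4)))            # same shape at 4x the resolution
fine += 0.05 * rng.standard_normal(fine.shape)     # rendering noise

sim = hamming_similarity(average_hash(coarse), average_hash(fine))
print(f"hash similarity across scales: {sim:.2f}")
```

Thresholding a similarity like this (combined with local SIFT-style matches and an overlap index) is the kind of combined calculation that would place a representation pair into one of the five consistency grades.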

  13. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    Science.gov (United States)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated by applying multiattribute and neural network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume; the target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective of this study is to derive a relationship between a set of attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs; in this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case showed a continuous improvement in the prediction of the porosity from the multiattribute to the neural network analysis, both in training and in validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results for the porosity distribution.
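The forward stepwise selection of attributes in the linear mode can be sketched briefly. This is an illustrative sketch under our own assumptions, not the study's software: candidate "seismic attributes" and the "porosity log" are synthetic, and at each step the attribute that most reduces the least-squares error is added.

```python
import numpy as np

def forward_stepwise(X: np.ndarray, y: np.ndarray, max_attrs: int = 3) -> list:
    """Greedy forward selection: add the attribute minimizing least-squares error."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_attrs):
        errs = []
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([np.ones(len(y)), X[:, cols]])  # intercept + attributes
            w, *_ = np.linalg.lstsq(A, y, rcond=None)
            errs.append(float(((A @ w - y) ** 2).sum()))
        best = remaining[int(np.argmin(errs))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Synthetic example: 6 candidate attributes, target depends on attributes 1 and 4
rng = np.random.default_rng(4)
X = rng.standard_normal((200, 6))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + 0.1 * rng.standard_normal(200)

picked = forward_stepwise(X, y, max_attrs=2)
print("selected attributes:", picked)
```

In the nonlinear mode, the same selected attribute set would feed a probabilistic neural network instead of the linear weights.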

  14. A feasibility study on FP transmutation for Self-Consistent Nuclear Energy System (SCNES)

    International Nuclear Information System (INIS)

    Fujita, Reiko; Kawashima, Masatoshi; Ueda, Hiroaki; Takagi, Ryuzo; Matsuura, Haruaki; Fujii-e, Yoichi

    1997-01-01

A fast reactor core/fuel cycle concept is discussed for the future 'Self-Consistent Nuclear Energy System (SCNES)' concept. The present study mainly discusses the long-lived fission product (LLFP) burning capability and recycle scheme in the framework of a metallic-fuel fast reactor cycle, aiming at the goals of fuel breeding capability and confinement of TRU and radioactive FPs within the system. In the present paper, the burning capability for Cs135 and Zr93 is mainly discussed from neutronic and chemical viewpoints, assuming a metallic fuel cycle system. Recent experimental results indicate that Cs can be separated along with the pyroprocess for the metal fuel recycle system, as previously designed for a candidate fuel cycle system. By combining neutron spectrum shift for target sub-assemblies and isotope separation using tunable lasers, the LLFP burning capability is enhanced. This result indicates that major LLFPs can be treated in the additional recycle schemes to avoid LLFP accumulation along with energy production. In total, the proposed fuel cycle is a candidate for realizing the SCNES concept. (author)

  15. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

Data treatment techniques are increasingly used as computer methods become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple values. A final goal is decision making. The paper describes the models and methods of multivariate analysis.
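A representative multivariate method is principal component analysis, which summarizes objects characterized by multiple correlated values. The sketch below is our own illustration, not from the paper: three correlated measurements on 100 toy objects are reduced via the eigendecomposition of their covariance matrix.

```python
import numpy as np

# Toy multivariate sample: 100 objects, 3 correlated measurements driven by one latent factor
rng = np.random.default_rng(5)
latent = rng.standard_normal((100, 1))
data = np.hstack([latent + 0.1 * rng.standard_normal((100, 1)) for _ in range(3)])

# Principal component analysis via the covariance eigendecomposition
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(data) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # ascending eigenvalues
explained = eigvals[::-1] / eigvals.sum()       # variance fraction, largest first
print("variance explained per component:", np.round(explained, 3))
```

Here one component captures nearly all of the variance, so decisions can be based on a single score per object instead of three raw measurements.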

  16. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  17. Smaller self-inflating bags produce greater guideline consistent ventilation in simulated cardiopulmonary resuscitation

    Directory of Open Access Journals (Sweden)

    Boyle Malcolm J

    2009-02-01

Full Text Available Background: Suboptimal bag ventilation in cardiopulmonary resuscitation (CPR) has demonstrated detrimental physiological outcomes for cardiac arrest patients. In light of recent guideline changes for resuscitation, there is a need to identify the efficacy of bag ventilation by prehospital care providers. The objective of this study was to evaluate bag ventilation in relation to operator ability to achieve guideline-consistent ventilation rate, tidal volume and minute volume when using two different capacity self-inflating bags in an undergraduate paramedic cohort. Methods: An experimental study using a mechanical lung model and a simulated adult cardiac arrest to assess the ventilation ability of third-year Monash University undergraduate paramedic students. Participants were instructed to ventilate using 1600 ml and 1000 ml bags for two minutes at the correct rate and tidal volume for a patient undergoing CPR with an advanced airway. Ventilation rate and tidal volume were recorded using an analogue scale, with mean values calculated. Ethics approval was granted. Results: Suboptimal ventilation with the conventional 1600 ml bag was common, with 77% and 97% of participants unable to achieve guideline-consistent ventilation rates and tidal volumes, respectively. Suboptimal ventilation was reduced with the smaller bag: a 27% reduction in suboptimal tidal volumes (p = 0.015) and a 23% reduction in suboptimal minute volumes (p = 0.045). Conclusion: Smaller self-inflating bags reduce the incidence of suboptimal tidal volumes and minute volumes and produce results more consistent with guidelines for cardiac arrest patients.

  18. Consistent Partial Least Squares Path Modeling via Regularization.

    Science.gov (United States)

    Jung, Sunho; Park, JaeHong

    2018-01-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc has yet no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.

  19. Consistent Partial Least Squares Path Modeling via Regularization

    Directory of Open Access Journals (Sweden)

    Sunho Jung

    2018-02-01

Full Text Available Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc has yet no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
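The ridge-type regularization at the heart of regularized PLSc can be illustrated on the multicollinearity problem it targets. This is a minimal sketch of ridge estimation under our own toy setup, not the PLSc algorithm itself: two nearly collinear predictors stand in for highly correlated latent variable scores.

```python
import numpy as np

def ridge(X: np.ndarray, y: np.ndarray, lam: float) -> np.ndarray:
    """Ridge estimate (X'X + lam I)^-1 X'y; lam = 0 gives ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(6)
z = rng.standard_normal(200)
# Two nearly collinear predictors, as when correlations among latent variables are high
X = np.column_stack([z + 0.01 * rng.standard_normal(200),
                     z + 0.01 * rng.standard_normal(200)])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.standard_normal(200)

ols = ridge(X, y, lam=0.0)
reg = ridge(X, y, lam=1.0)
print("OLS coefficients:  ", np.round(ols, 2))
print("ridge coefficients:", np.round(reg, 2))
```

With near-collinear columns the OLS solution is numerically unstable (individual coefficients can be far from their true values even though their sum is well determined), while the ridge penalty shrinks the estimates toward a stable, accurate solution, which is the behaviour the simulation study exploits.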

  20. Self-consistent electronic-structure calculations for interface geometries

    International Nuclear Information System (INIS)

    Sowa, E.C.; Gonis, A.; MacLaren, J.M.; Zhang, X.G.

    1992-01-01

    This paper describes a technique for computing self-consistent electronic structures and total energies of planar defects, such as interfaces, which are embedded in an otherwise perfect crystal. As in the Layer Korringa-Kohn-Rostoker approach, the solid is treated as a set of coupled layers of atoms, using Bloch's theorem to take advantage of the two-dimensional periodicity of the individual layers. The layers are coupled using the techniques of the Real-Space Multiple-Scattering Theory, avoiding artificial slab or supercell boundary conditions. A total-energy calculation on a Cu crystal, which has been split apart at a (111) plane, is used to illustrate the method

  1. Study of parachute inflation process using fluid–structure interaction method

    Directory of Open Access Journals (Sweden)

    Yu Li

    2014-04-01

    Full Text Available A direct numerical modeling method for parachutes is first proposed, and a model of the star-shaped folded parachute with detailed structures is established. The simplified arbitrary Lagrangian–Eulerian fluid–structure interaction (SALE/FSI) method is used to simulate the inflation process of a folded parachute, with the flow field calculation based mainly on an operator splitting technique. Using this method, the dynamic variations of related parameters such as the flow field and structure are obtained, and the load jump appearing at the end of the initial inflation stage is captured. Numerical results, including opening load, drag characteristics, and swinging angle, agree well with wind tunnel tests. In addition, this coupled method yields more detailed space–time information such as geometry shape, structure, motion, and flow field. Compared with the previous inflation time method, this method is a purely theoretical analysis approach that does not rely on empirical coefficients, and it can provide a reference for material selection and performance optimization during parachute design.

  2. Quasiparticle self-consistent GW study of cuprates: electronic structure, model parameters, and the two-band theory for Tc.

    Science.gov (United States)

    Jang, Seung Woo; Kotani, Takao; Kino, Hiori; Kuroki, Kazuhiko; Han, Myung Joon

    2015-07-24

    Despite decades of progress, an understanding of unconventional superconductivity still remains elusive. An important open question concerns the material dependence of the superconducting properties. Using the quasiparticle self-consistent GW (QSGW) method, we re-examine the electronic structure of copper oxide high-Tc materials. We show that QSGW captures several important features distinct from conventional LDA results. The energy-level splitting between d(x(2)-y(2)) and d(3z(2)-r(2)) is significantly enlarged and the van Hove singularity point is lowered. The calculated results agree better with recent resonant inelastic x-ray scattering and angle-resolved photoemission experiments than LDA does. This agreement with experiment supports the previously suggested two-band theory for the material dependence of the superconducting transition temperature, Tc.

  3. Time Consistent Strategies for Mean-Variance Asset-Liability Management Problems

    Directory of Open Access Journals (Sweden)

    Hui-qiang Ma

    2013-01-01

    Full Text Available This paper studies optimal time-consistent investment strategies in multiperiod asset-liability management problems under the mean-variance criterion. By applying the time-consistent model of Chen et al. (2013) and employing the dynamic programming technique, we derive two time-consistent policies for asset-liability management problems in a market with and without a riskless asset, respectively. We show that the presence of liability does affect the optimal strategy. More specifically, liability leads to a parallel shift of the optimal time-consistent investment policy. Moreover, for an arbitrarily risk-averse investor (under the variance criterion) with liability, time-diversification effects can be ignored in a market with a riskless asset; however, they should be considered in a market without any riskless asset.

  4. DIETFITS Study (Diet Intervention Examining The Factors Interacting with Treatment Success) – Study Design and Methods

    Science.gov (United States)

    Stanton, Michael; Robinson, Jennifer; Kirkpatrick, Susan; Farzinkhou, Sarah; Avery, Erin; Rigdon, Joseph; Offringa, Lisa; Trepanowski, John; Hauser, Michelle; Hartle, Jennifer; Cherin, Rise; King, Abby C.; Ioannidis, John P.A.; Desai, Manisha; Gardner, Christopher D.

    2017-01-01

    Numerous studies have attempted to identify successful dietary strategies for weight loss, and many have focused on Low-Fat vs. Low-Carbohydrate comparisons. Despite relatively small between-group differences in weight loss found in most previous studies, researchers have consistently observed relatively large between-subject differences in weight loss within any given diet group (e.g., ~25 kg weight loss to ~5 kg weight gain). The primary objective of this study was to identify predisposing individual factors at baseline that help explain differential weight loss achieved by individuals assigned to the same diet, particularly a pre-determined multi-locus genotype pattern and insulin resistance status. Secondary objectives included discovery strategies for further identifying potential genetic risk scores. Exploratory objectives included investigation of an extensive set of physiological, psychosocial, dietary, and behavioral variables as moderating and/or mediating variables and/or secondary outcomes. The target population was generally healthy, free-living adults with BMI 28-40 kg/m2 (n=600). The intervention consisted of a 12-month protocol of 22 one-hour evening instructional sessions led by registered dietitians, with ~15-20 participants/class. Key objectives of dietary instruction included focusing on maximizing the dietary quality of both Low-Fat and Low-Carbohydrate diets (i.e., Healthy Low-Fat vs. Healthy Low-Carbohydrate), and maximally differentiating the two diets from one another. Rather than seeking to determine if one dietary approach was better than the other for the general population, this study sought to examine whether greater overall weight loss success could be achieved by matching different people to different diets. Here we present the design and methods of the study. PMID:28027950

  5. Study on phase transformations in superconducting Ti-50%Nb alloy using temperature-dependent internal friction method

    International Nuclear Information System (INIS)

    Shapoval, B.I.; Tikhinskij, G.F.; Somov, A.I.; Chernyj, O.V.; Rudycheva, T.Yu.; Andrievskaya, N.F.

    1980-01-01

    The internal friction method is used, in parallel with other methods, to study phase transformations in the Ti-50%Nb alloy. The effects of annealing temperature and time, of the interstitial impurity content of the alloy, and of its thermomechanical treatment (TMT) are studied. In the 250-300 deg C temperature range, a complex internal friction maximum caused by precipitation of secondary phases is observed; this is confirmed by mechanical property measurements and electron microscopic analysis. The maximum consists of three overlapping peaks, reflecting the stepwise decomposition of the metastable solid solution. Preliminary thermomechanical treatment of the alloy, consisting of equidirectional plastic deformation followed by recrystallization annealing, increases the peaks. This testifies to the stimulating effect of thermomechanical treatment on the degree of solid solution decomposition, and it shows up as an increase in the critical current density of wire made from the ingot. Increasing the interstitial impurity content of the alloy has an analogous effect. The reduction of the internal friction level during isothermal holding at temperatures above that of the third peak proceeds in two stages.

  6. A relativistic self-consistent model for studying enhancement of space charge limited emission due to counter-streaming ions

    Science.gov (United States)

    Lin, M. C.; Verboncoeur, J.

    2016-10-01

    The maximum electron current transmitted through a planar diode gap is limited by the space charge of electrons dwelling in the gap region, the so-called space-charge-limited (SCL) emission. By introducing a counter-streaming ion flow to neutralize the electron charge density, the SCL emission can be dramatically raised, enhancing electron current transmission. In this work, we have developed a relativistic self-consistent model for studying the enhancement of the maximum transmission by a counter-streaming ion current. The maximum enhancement is found when the ion effect is saturated, as shown analytically. The solutions in the non-relativistic, intermediate, and ultra-relativistic regimes are obtained and verified with 1-D particle-in-cell simulations. This self-consistent model is general; it can also serve as a comparison for verification of simulation codes and can be extended to higher dimensions.
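    For orientation, the baseline that this abstract builds on is the classical non-relativistic Child-Langmuir law for a planar diode, J = (4ε0/9)·sqrt(2e/m)·V^(3/2)/d². The sketch below evaluates it with CODATA constants; it is only the un-neutralized, non-relativistic limit, not the paper's relativistic, ion-neutralized model, which raises this bound:

```python
import math

def child_langmuir_j(voltage_V, gap_m):
    """Classical Child-Langmuir space-charge-limited current density
    for a planar diode, in A/m^2 (V in volts, gap in metres)."""
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    e = 1.602176634e-19       # elementary charge, C
    m = 9.1093837015e-31      # electron mass, kg
    return (4 * eps0 / 9) * math.sqrt(2 * e / m) * voltage_V ** 1.5 / gap_m ** 2
```

The characteristic V^(3/2)/d² scaling means quadrupling the gap voltage raises the limiting current density by a factor of eight.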

  7. Monitoring the quality consistency of Fufang Danshen Pills using micellar electrokinetic chromatography fingerprint coupled with prediction of antioxidant activity and chemometrics.

    Science.gov (United States)

    Ji, Zhengchao; Sun, Wanyang; Sun, Guoxiang; Zhang, Jin

    2016-08-01

    A fast micellar electrokinetic chromatography fingerprint method combined with quantification was developed and validated to evaluate the quality of Fufang Danshen Pills, a traditional Chinese medicine used in the treatment of cardiovascular diseases. The tetrahedron optimization method was first used to optimize the background electrolyte solution, and the fingerprint information index I then served as an objective indicator for investigating the experimental conditions. In addition, a systematically quantified fingerprint method was constructed for evaluating the quality consistency of 20 batches of test samples obtained from the same drug manufacturer. The fingerprint analysis, combined with quantitative determination of two components, showed that the quality consistency of the test samples was quite good within the same commercial brand. Furthermore, partial least squares model analysis was used to explore the fingerprint-efficacy relationship between active components and antioxidant activity in vitro, which can be applied to the assessment of the antioxidant activity of Fufang Danshen Pills and provide valuable medicinal information for quality control. The results illustrate that the present study provides a reliable and reasonable method for monitoring the quality consistency of Fufang Danshen Pills. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. The Value of Mixed Methods Research: A Mixed Methods Study

    Science.gov (United States)

    McKim, Courtney A.

    2017-01-01

    The purpose of this explanatory mixed methods study was to examine the perceived value of mixed methods research for graduate students. The quantitative phase was an experiment examining the effect of a passage's methodology on students' perceived value. Results indicated students scored the mixed methods passage as more valuable than those who…

  9. Theoretical studies of potential energy surfaces and computational methods

    Energy Technology Data Exchange (ETDEWEB)

    Shepard, R. [Argonne National Laboratory, IL (United States)

    1993-12-01

    This project involves the development, implementation, and application of theoretical methods for the calculation and characterization of potential energy surfaces involving molecular species that occur in hydrocarbon combustion. These potential energy surfaces require an accurate and balanced treatment of reactants, intermediates, and products. This difficult challenge is met with general multiconfiguration self-consistent-field (MCSCF) and multireference single- and double-excitation configuration interaction (MRSDCI) methods. In contrast to the more common single-reference electronic structure methods, this approach is capable of describing accurately molecular systems that are highly distorted away from their equilibrium geometries, including reactant, fragment, and transition-state geometries, and of describing regions of the potential surface that are associated with electronic wave functions of widely varying nature. The MCSCF reference wave functions are designed to be sufficiently flexible to describe qualitatively the changes in the electronic structure over the broad range of geometries of interest. The necessary mixing of ionic, covalent, and Rydberg contributions, along with the appropriate treatment of the different electron-spin components (e.g. closed shell, high-spin open-shell, low-spin open shell, radical, diradical, etc.) of the wave functions, are treated correctly at this level. Further treatment of electron correlation effects is included using large scale multireference CI wave functions, particularly including the single and double excitations relative to the MCSCF reference space. This leads to the most flexible and accurate large-scale MRSDCI wave functions that have been used to date in global PES studies.

  10. Studies of self-consistent field structure in a quasi-optical gyrotron

    International Nuclear Information System (INIS)

    Antonsen, T.M. Jr.

    1993-04-01

    The presence of an electron beam in a quasi-optical gyrotron cavity alters the structure of the fields from that of the empty cavity. A computer code has been written which calculates this alteration for either an electron beam or a thin dielectric tube placed in the cavity. Experiments measuring the quality factor of such a cavity were performed for the case of a dielectric tube, and the results agree with the predictions of the code. Simulations of the case of an electron beam indicate that self-consistent effects can be made small, in that almost all the power leaves the cavity in a symmetric Gaussian-like mode, provided the resonator parameters are chosen carefully. (author) 6 figs., 1 tab., 13 refs

  11. Studies of cancer risk among Chernobyl liquidators: materials and methods

    International Nuclear Information System (INIS)

    Kesminiene, A.; Cardis, E.; Tenet, V.; Ivanov, V.K.; Kurtinaitis, J.; Malakhova, I.; Stengrevics, A.; Tekkel, M.

    2002-01-01

    The current paper presents the methods and design of two case-control studies among Chernobyl liquidators - one of leukaemia and non-Hodgkin lymphoma, the other of thyroid cancer risk - carried out in Belarus, Estonia, Latvia, Lithuania and Russia. The specific objective of these studies is to estimate the radiation-induced risk of these diseases among liquidators of the Chernobyl accident and, in particular, to study the effect of exposure protraction and radiation type on the risk of radiation-induced cancer in the low-to-medium (0-500 mSv) dose range. The study population consists of the approximately 10,000 Baltic, 40,000 Belarusian and 51,000 Russian liquidators who worked in the 30 km zone in 1986-1987 and who were registered in the Chernobyl registry of these countries. The studies included cases diagnosed in 1993-1998 for all countries but Belarus, where the study period was extended until 2000. Four controls were selected in each country from the national cohort for each case, matched on age, gender and region of residence. Information on study subjects was obtained through face-to-face interviews using a standardised questionnaire covering demographic factors; time, place and conditions of work as a liquidator; and potential risk and confounding factors for the tumours of interest. Overall, after their consent was obtained, 136 cases and 595 controls were included in the studies. A method of analytical dose reconstruction has been developed, validated and applied to the estimation of doses and related uncertainties for all subjects in the study. Dose-response analyses are underway, and the results are likely to have important implications for assessing the adequacy of existing protection standards, which are based on risk estimates derived from analyses of the mortality of atomic bomb survivors and other high-dose studies. (author)

  12. Assessment of mastication in healthy children and children with cerebral palsy: a validity and consistency study.

    Science.gov (United States)

    Remijn, L; Speyer, R; Groen, B E; Holtus, P C M; van Limbeek, J; Nijhuis-van der Sanden, M W G

    2013-05-01

    The aim of this study was to develop the Mastication Observation and Evaluation instrument for observing and assessing the chewing ability of children eating solid and lumpy foods. This study describes the process of item definition and item selection and reports the content validity, reproducibility and consistency of the instrument. In the developmental phase, 15 experienced speech therapists assessed item relevance and descriptions over three Delphi rounds. Potential items were selected based on the results of a literature review. In the initial Delphi round, 17 potential items were included. After three Delphi rounds, 14 items regarded as providing distinctive value in the assessment of mastication (consensus >75%) were included in the Mastication Observation and Evaluation instrument. To test item reproducibility and consistency, two experts and five students evaluated video recordings of 20 children (10 children with cerebral palsy aged 29-65 months and 10 healthy children aged 11-42 months) eating bread and a biscuit. Reproducibility was estimated by means of the intraclass correlation coefficient (ICC). With the exception of one item concerning chewing duration, all items showed good to excellent intra-observer agreement (ICC students: 0.73-1.0). With the exception of chewing duration and number of swallows, inter-observer agreement was fair to excellent for all items (ICC experts: 0.68-1.0; ICC students: 0.42-1.0). The results indicate that this is a feasible instrument that could be used in clinical practice once further research on its reliability is completed. © 2013 Blackwell Publishing Ltd.
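    As background to the agreement figures quoted above, an intraclass correlation coefficient can be computed from a simple ANOVA decomposition of a ratings table. The sketch below implements the one-way random-effects form ICC(1,1); the study may well have used a different ICC variant (e.g. a two-way model), so this is illustrative only:

```python
import numpy as np

def icc_1_1(ratings):
    """One-way random-effects ICC(1,1) from an (n_subjects, k_raters) array:
    (MSB - MSW) / (MSB + (k-1)*MSW), where MSB/MSW are the between- and
    within-subject mean squares."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    ss_between = k * ((row_means - grand) ** 2).sum()
    ss_within = ((ratings - row_means[:, None]) ** 2).sum()
    msb = ss_between / (n - 1)
    msw = ss_within / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Identical ratings across raters give an ICC of 1, while disagreement that swamps the between-subject variance drives the ICC toward (or below) zero, which is why the chewing-duration item with poor agreement stands out.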

  13. Hydrological change: Towards a consistent approach to assess changes on both floods and droughts

    Science.gov (United States)

    Quesada-Montano, Beatriz; Di Baldassarre, Giuliano; Rangecroft, Sally; Van Loon, Anne F.

    2018-01-01

    Several studies have found that the frequency, magnitude and spatio-temporal distribution of droughts and floods have significantly increased in many regions of the world. Yet most of the methods used to detect trends in hydrological extremes 1) focus on either floods or droughts, and/or 2) base their assessment on characteristics that, even though useful for trend identification, cannot be directly used in decision making, e.g. integrated water resources management and disaster risk reduction. In this paper, we first discuss the need for a consistent approach to assess changes in both floods and droughts, and then propose a method based on the theory of runs and threshold levels. Flood and drought changes were assessed in terms of frequency, length and surplus/deficit volumes. This paper also presents an example application using streamflow data from two hydrometric stations along the Po River basin (Italy), Piacenza and Pontelagoscuro, and then discusses opportunities and challenges of the proposed method.
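    The threshold-level/theory-of-runs idea can be sketched directly: contiguous runs above a flood threshold or below a drought threshold define events, and each event's frequency, length, and surplus/deficit volume are exactly the change indicators the abstract names. A minimal sketch (function name, thresholds, and data are hypothetical, with time steps treated as unit intervals):

```python
import numpy as np

def runs_above_below(flow, q_flood, q_drought):
    """Theory-of-runs event extraction from a streamflow series.
    Returns, per event type, (number of events, total length,
    total surplus/deficit volume)."""
    flow = np.asarray(flow, dtype=float)

    def runs(mask, excess):
        events, length, volume = 0, 0, 0.0
        in_run = False
        for flag, x in zip(mask, excess):
            if flag:
                if not in_run:
                    events += 1      # a new contiguous run starts
                    in_run = True
                length += 1
                volume += x          # surplus or deficit at this step
            else:
                in_run = False
        return events, length, volume

    floods = runs(flow > q_flood, flow - q_flood)
    droughts = runs(flow < q_drought, q_drought - flow)
    return {"floods": floods, "droughts": droughts}
```

Comparing these event statistics between two periods of record then gives changes in frequency, length, and volume on the same footing for floods and droughts.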

  14. QUALITATIVE METHODS IN CREATIVITY STUDIES

    DEFF Research Database (Denmark)

    Hertel, Frederik

    2015-01-01

    In this article we will focus on developing a qualitative research design suitable for conducting a case study in creativity. The case is a team of workers (see Hertel, 2015) doing industrial cleaning in the Danish food industry. The hypothesis is that these workers are both participating in......-specific methods, involving a discussion of creativity tests and divergent and convergent thinking, for studying creativity in this specific setting. Besides that, we will develop a research design involving a combination of methods necessary for conducting a case study in the setting mentioned....

  15. Self-consistent calculation of steady-state creep and growth in textured zirconium

    International Nuclear Information System (INIS)

    Tome, C.N.; So, C.B.; Woo, C.H.

    1993-01-01

    Irradiation creep and growth in zirconium alloys result in anisotropic dimensional changes relative to the crystallographic axis in each individual grain. Several methods have been attempted to model such dimensional changes, taking into account the development of intergranular stresses. In this paper, we compare the predictions of several such models, namely the upper-bound, the lower-bound, the isotropic K* self-consistent (analytical) and the fully self-consistent (numerical) models. For given single-crystal creep compliances and growth factors, the polycrystal compliances predicted by the upper- and lower-bound models are unreliable. The predictions of the two self-consistent approaches are usually similar. The analytical isotropic K* approach is simple to implement and can be used to estimate the creep and growth rates of the polycrystal in many cases. The numerical fully self-consistent approach should be used when an accurate prediction of polycrystal creep is required, particularly for the important case of a closed-end internally pressurized tube. In most cases, the variations in grain shape introduce only minor corrections to the behaviour of polycrystalline materials. (author)

  16. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists; 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients’ referrals were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. In the relation between referral diagnosis and electrophysiological diagnosis according to the clinics where the requests were made, there was no statistical difference (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG.

  17. Surfactant modified clays’ consistency limits and contact angles

    Directory of Open Access Journals (Sweden)

    S Akbulut

    2012-07-01

    Full Text Available This study was aimed at preparing a surfactant-modified clay (SMC) and researching the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants to alter its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethyl ammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when the zwitterionic, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from CH (high-plasticity clay) to MH (high-plasticity silt) class soils, according to the Unified Soil Classification System (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
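    The A-line test behind the reported CH-to-MH reclassification is simple arithmetic on the Atterberg limits: a soil plots above the Casagrande A-line, PI = 0.73·(LL − 20), if it is a clay and on or below it if it is a silt. A sketch, simplified to the high-plasticity (LL > 50) branch of the USCS chart only:

```python
def uscs_fine_grained_class(liquid_limit, plastic_limit):
    """Simplified USCS call for high-plasticity fine-grained soil:
    above the Casagrande A-line -> clay (CH), on/below -> silt (MH).
    The full USCS chart has more branches (low plasticity, organic, etc.)."""
    if liquid_limit <= 50:
        raise ValueError("sketch covers only high-plasticity soils (LL > 50)")
    pi = liquid_limit - plastic_limit          # plasticity index
    a_line_pi = 0.73 * (liquid_limit - 20.0)   # Casagrande A-line
    return "CH" if pi > a_line_pi else "MH"
```

In these terms, a cationic surfactant that lowers the plasticity index at a given liquid limit can push a sample from above the A-line (CH) to below it (MH), which is the transformation the abstract describes.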

  18. Factors Influencing the Degree of Intrajudge Consistency during the Standard Setting Process.

    Science.gov (United States)

    Plake, Barbara S.; And Others

    The accuracy of standards obtained from judgmental methods is dependent on the quality of the judgments made by experts throughout the standard setting process. One important dimension of the quality of these judgments is the consistency of the judges' perceptions with item performance of minimally competent candidates. Several interrelated…

  19. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  20. (In)Consistencies in Responses to Sodium Bicarbonate Supplementation: A Randomised, Repeated Measures, Counterbalanced and Double-Blind Study.

    Science.gov (United States)

    Froio de Araujo Dias, Gabriela; da Eira Silva, Vinicius; de Salles Painelli, Vitor; Sale, Craig; Giannini Artioli, Guilherme; Gualano, Bruno; Saunders, Bryan

    2015-01-01

    Intervention studies do not account for high within-individual variation potentially compromising the magnitude of an effect. Repeat administration of a treatment allows quantification of individual responses and determination of the consistency of responses. We determined the consistency of metabolic and exercise responses following repeated administration of sodium bicarbonate (SB). 15 physically active males (age 25±4 y; body mass 76.0±7.3 kg; height 1.77±0.05 m) completed six cycling capacity tests at 110% of maximum power output (CCT110%) following ingestion of either 0.3 g∙kg-1 BM of SB (4 trials) or placebo (PL, 2 trials). Blood pH, bicarbonate, base excess and lactate were determined at baseline, pre-exercise, post-exercise and 5-min post-exercise. Total work done (TWD) was recorded as the exercise outcome. SB supplementation increased blood pH, bicarbonate and base excess prior to every trial (all p ≤ 0.001); absolute changes in pH, bicarbonate and base excess from baseline to pre-exercise were similar in all SB trials (all p > 0.05). Blood lactate was elevated following exercise in all trials (p ≤ 0.001), and was higher in some, but not all, SB trials compared to PL. TWD was not significantly improved with SB vs. PL in any trial (SB1: +3.6%; SB2: +0.3%; SB3: +2.1%; SB4: +6.7%; all p > 0.05), although magnitude-based inferences suggested a 93% likely improvement in SB4. Individual analysis showed ten participants improved in at least one SB trial above the normal variation of the test although five improved in none. The mechanism for improved exercise with SB was consistently in place prior to exercise, although this only resulted in a likely improvement in one trial. SB does not consistently improve high intensity cycling capacity, with results suggesting that caution should be taken when interpreting the results from single trials as to the efficacy of SB supplementation. ClinicalTrials.gov NCT02474628.
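    One way to operationalize "improved above the normal variation of the test" is to compare each SB trial against the placebo mean plus the test's typical error estimated from the two repeated placebo trials. The sketch below is an assumption about how such a flag could be computed, not the paper's exact criterion, and the numbers in the usage note are hypothetical:

```python
import math

def responder_flags(pl_trials, sb_trials):
    """Flag supplement trials whose total work done exceeds the placebo mean
    by more than the typical error of the test, estimated from a pair of
    placebo trials as sd(difference) = |diff| / sqrt(2)."""
    pl1, pl2 = pl_trials
    typical_error = abs(pl1 - pl2) / math.sqrt(2)
    pl_mean = (pl1 + pl2) / 2
    return [tw > pl_mean + typical_error for tw in sb_trials]
```

For example, placebo scores of 100 and 104 kJ give a typical error of about 2.8 kJ, so only SB trials exceeding about 104.8 kJ would count as true improvements rather than test-retest noise.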

  1. (In)Consistencies in Responses to Sodium Bicarbonate Supplementation: A Randomised, Repeated Measures, Counterbalanced and Double-Blind Study.

    Directory of Open Access Journals (Sweden)

    Gabriela Froio de Araujo Dias

    Full Text Available Intervention studies do not account for high within-individual variation potentially compromising the magnitude of an effect. Repeat administration of a treatment allows quantification of individual responses and determination of the consistency of responses. We determined the consistency of metabolic and exercise responses following repeated administration of sodium bicarbonate (SB). 15 physically active males (age 25±4 y; body mass 76.0±7.3 kg; height 1.77±0.05 m) completed six cycling capacity tests at 110% of maximum power output (CCT110%) following ingestion of either 0.3 g∙kg-1 BM of SB (4 trials) or placebo (PL, 2 trials). Blood pH, bicarbonate, base excess and lactate were determined at baseline, pre-exercise, post-exercise and 5-min post-exercise. Total work done (TWD) was recorded as the exercise outcome. SB supplementation increased blood pH, bicarbonate and base excess prior to every trial (all p ≤ 0.001); absolute changes in pH, bicarbonate and base excess from baseline to pre-exercise were similar in all SB trials (all p > 0.05). Blood lactate was elevated following exercise in all trials (p ≤ 0.001), and was higher in some, but not all, SB trials compared to PL. TWD was not significantly improved with SB vs. PL in any trial (SB1: +3.6%; SB2: +0.3%; SB3: +2.1%; SB4: +6.7%; all p > 0.05), although magnitude-based inferences suggested a 93% likely improvement in SB4. Individual analysis showed ten participants improved in at least one SB trial above the normal variation of the test although five improved in none. The mechanism for improved exercise with SB was consistently in place prior to exercise, although this only resulted in a likely improvement in one trial. SB does not consistently improve high intensity cycling capacity, with results suggesting that caution should be taken when interpreting the results from single trials as to the efficacy of SB supplementation. ClinicalTrials.gov NCT02474628.

  2. Comparative Study of Daylighting Calculation Methods

    Directory of Open Access Journals (Sweden)

    Mandala Ariani

    2018-01-01

    Full Text Available The aim of this study is to assess five daylighting calculation methods commonly used in architectural studies. The methods include hand calculation methods (the SNI/DPMB method and BRE Daylighting Protractors), scale models studied in an artificial sky simulator, and computer programs using the Dialux and Velux lighting software. The test room is conditioned by uniform sky conditions and a simple room geometry, with variations of the room reflectance (black, grey, and white). The analyses compared the results (including daylight factor, illumination, and coefficient of uniformity values), examining their similarities and differences. The colour variation trials were used to analyse the contribution of the internal reflection factor to the results.
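    All five methods compared above ultimately estimate the same basic quantity, the daylight factor, defined as the ratio of indoor to simultaneous outdoor horizontal illuminance under an overcast sky. A one-line sketch (the example illuminances are hypothetical):

```python
def daylight_factor(indoor_lux, outdoor_lux):
    """Daylight factor DF = 100 * E_indoor / E_outdoor (%), with both
    illuminances measured under the same unobstructed overcast sky."""
    return 100.0 * indoor_lux / outdoor_lux
```

For instance, 250 lux indoors against 10,000 lux outdoors gives a daylight factor of 2.5%, regardless of whether the estimate comes from a protractor, a scale model, or simulation software.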

  3. Are Self-study Procedural Teaching Methods Effective? A Pilot Study of a Family Medicine Residency Program.

    Science.gov (United States)

    Deffenbacher, Brandy; Langner, Shannon; Khodaee, Morteza

    2017-11-01

    A family medicine residency is a unique training environment where residents are exposed to care in multiple settings, across all ages. Procedures are an integral part of family medicine practice. Family medicine residency (FMR) programs are tasked with teaching these skills at a level of intensity and frequency that allows a resident to achieve competency. In an environment limited by work-hour restrictions, self-study teaching methods are one way to ensure all residents receive the fundamental knowledge of how to perform procedures. We developed and evaluated the efficacy of a self-study procedure teaching method and a procedure evaluation checklist. A self-study procedure teaching intervention was created, consisting of instructional articles and videos on three procedures. To assess the efficacy of the intervention, and the competency of the residents, pre- and postintervention procedure performance sessions were completed. These sessions were reviewed and scored using a standardized procedure performance checklist. All 24 residents participated in the study. Overall, resident procedure knowledge increased for two of the three procedures studied, and the ability to perform the procedures according to the expert-validated checklist improved significantly for all procedures. A self-study intervention is a simple but effective way to improve procedure training in a way that fits the complex scheduling needs of a residency training program. In addition, this study demonstrates that procedure performance checklists are a simple and reliable way to assess resident procedure performance skills in a residency setting.

  4. Using naturalistic driving data to identify variables associated with infrequent, occasional, and consistent seat belt use.

    Science.gov (United States)

    Reagan, Ian J; McClafferty, Julie A; Berlin, Sharon P; Hankey, Jonathan M

    2013-01-01

    Seat belt use is one of the most effective countermeasures to reduce traffic fatalities and injuries. The success of efforts to increase use is measured by roadside observations and self-report questionnaires. These methods have shortcomings, with the former yielding only a binary point estimate and the latter being subjective. The 100-car naturalistic driving study presented a unique opportunity to study seat belt use in that seat belt status was known for every trip each driver made during a 12-month period. Drivers were grouped into infrequent, occasional, or consistent seat belt users based on the frequency of belt use. Analyses were then completed to assess whether these groups differed on several measures, including personality, demographics, and self-reported driving style variables, as well as measures from the 100-car study instrumentation suite (average trip speed, trips per day). In addition, detailed analyses of the occasional belt user group were completed to identify factors that were predictive of occasional belt users wearing their belts. The analyses indicated that consistent seat belt users took fewer trips per day, and that increased average trip speed was associated with increased belt use among occasional belt users. The results of this project may help focus messaging efforts to convert occasional and inconsistent seat belt users to consistent users. Copyright © 2012 Elsevier Ltd. All rights reserved.
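The grouping step described above amounts to thresholding each driver's trip-level belt-use rate. A minimal sketch follows; the cut-off values are hypothetical illustrations, not the thresholds used in the 100-car study.

```python
def belt_use_group(trips_belted: int, trips_total: int) -> str:
    """Classify a driver by the fraction of trips on which the belt was worn.

    The 0.2 and 0.8 cut-offs are illustrative placeholders.
    """
    rate = trips_belted / trips_total
    if rate < 0.2:
        return "infrequent"
    if rate < 0.8:
        return "occasional"
    return "consistent"

# Example: a driver belted on 5 of 10 recorded trips
print(belt_use_group(5, 10))  # → occasional
```

Because belt status was logged for every trip, this classification needs no sampling correction, unlike roadside point estimates.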

  5. Microscopic and self-consistent description of nuclear properties by extended generator-coordinate method

    International Nuclear Information System (INIS)

    Didong, M.

    1976-01-01

    The extended generator-coordinate method is discussed and a procedure is given for the solution of the Hill-Wheeler equation. The HFB theory, the particle-number and angular-momentum projections necessary for symmetry, and the modified surface delta interaction are discussed. The described procedures are used to calculate the properties of 72 Ge, 70 Zn and 74 Ge. (BJ) [de

  6. Consistent estimation of Gibbs energy using component contributions.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    Full Text Available Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.
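The prefer-then-fall-back idea described above can be sketched in a few lines: use the accurate reactant contribution when a compound's formation energy is known, and decompose the compound into groups otherwise. All compound names, group energies, and formation energies below are hypothetical placeholders, not values from the component-contribution dataset.

```python
# Hypothetical toy data (kJ/mol); illustrative values only.
rc_dfg = {"glucose": -915.9, "g6p": -1318.9}        # reactant contributions
group_dfg = {"phosphate": -1059.5, "hydroxyl": -158.0}  # group contributions
groups = {"atp_fragment": {"phosphate": 1}}         # hypothetical decomposition

def reaction_dg(stoich: dict) -> float:
    """Estimate a reaction Gibbs energy from stoichiometric coefficients,
    preferring reactant contributions and falling back to group sums."""
    dg = 0.0
    for cpd, nu in stoich.items():
        if cpd in rc_dfg:
            dg += nu * rc_dfg[cpd]  # accurate reactant contribution
        else:
            dg += nu * sum(n * group_dfg[g] for g, n in groups[cpd].items())
    return dg

# Example: glucose -> glucose-6-phosphate fragment of a reaction
print(reaction_dg({"glucose": -1, "g6p": 1}))
```

In the actual framework the two layers are combined through a single linear estimator so that all estimates remain mutually consistent with the first law; this sketch only illustrates the priority ordering between the two contribution types.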

  7. bNEAT: a Bayesian network method for detecting epistatic interactions in genome-wide association studies

    Directory of Open Access Journals (Sweden)

    Chen Xue-wen

    2011-07-01

    Full Text Available Abstract Background Detecting epistatic interactions plays a significant role in improving the understanding of pathogenesis and the prevention, diagnosis and treatment of complex human diseases. A recent study in automatic detection of epistatic interactions shows that Markov Blanket-based methods are capable of finding genetic variants strongly associated with common diseases and reducing false positives when the number of instances is large. Unfortunately, a typical dataset from genome-wide association studies consists of a very limited number of examples, where current methods, including Markov Blanket-based methods, may perform poorly. Results To address small-sample problems, we propose a Bayesian network-based approach (bNEAT) to detect epistatic interactions. The proposed method also employs a branch-and-bound technique for learning. We apply the proposed method to simulated datasets based on four disease models and to a real dataset. Experimental results show that our method outperforms Markov Blanket-based methods and other commonly used methods, especially when the number of samples is small. Conclusions Our results show that bNEAT achieves strong power regardless of the number of samples and is especially suitable for detecting epistatic interactions with slight or no marginal effects. The merits of the proposed approach lie in two aspects: a suitable score for Bayesian network structure learning that can reflect higher-order epistatic interactions and a heuristic Bayesian network structure learning method.

  8. Density Functional Theory and the Basis Set Truncation Problem with Correlation Consistent Basis Sets: Elephant in the Room or Mouse in the Closet?

    Science.gov (United States)

    Feller, David; Dixon, David A

    2018-03-08

    Two recent papers in this journal called into question the suitability of the correlation consistent basis sets for density functional theory (DFT) calculations, because the sets were designed for correlated methods such as configuration interaction, perturbation theory, and coupled cluster theory. These papers focused on the ability of the correlation consistent and other basis sets to reproduce total energies, atomization energies, and dipole moments obtained from "quasi-exact" multiwavelet results. Undesirably large errors were observed for the correlation consistent basis sets. One of the papers argued that basis sets specifically optimized for DFT methods were "essential" for obtaining high accuracy. In this work we re-examined the performance of the correlation consistent basis sets by resolving problems with the previous calculations and by making more appropriate basis set choices for the alkali and alkaline-earth metals and second-row elements. When this is done, the statistical errors with respect to the benchmark values and with respect to DFT optimized basis sets are greatly reduced, especially in light of the relatively large intrinsic error of the underlying DFT method. When judged with respect to high-quality Feller-Peterson-Dixon coupled cluster theory atomization energies, the PBE0 DFT method used in the previous studies exhibits a mean absolute deviation more than a factor of 50 larger than the quintuple zeta basis set truncation error.

  9. Total energy calculation of perovskite, BaTiO3, by self-consistent

    Indian Academy of Sciences (India)

    Unknown

    The total energy, lattice constant, density of states, band structure, etc., are calculated using the self-consistent tight-binding method. The compounds share the paraelectric simple-cubic perovskite structure. To find the ground state, the variational problem is solved by minimizing Etot with respect to the expansion coefficients.

  10. Self-consistent Bulge/Disk/Halo Galaxy Dynamical Modeling Using Integral Field Kinematics

    Science.gov (United States)

    Taranu, D. S.; Obreschkow, D.; Dubinski, J. J.; Fogarty, L. M. R.; van de Sande, J.; Catinella, B.; Cortese, L.; Moffett, A.; Robotham, A. S. G.; Allen, J. T.; Bland-Hawthorn, J.; Bryant, J. J.; Colless, M.; Croom, S. M.; D'Eugenio, F.; Davies, R. L.; Drinkwater, M. J.; Driver, S. P.; Goodwin, M.; Konstantopoulos, I. S.; Lawrence, J. S.; López-Sánchez, Á. R.; Lorente, N. P. F.; Medling, A. M.; Mould, J. R.; Owers, M. S.; Power, C.; Richards, S. N.; Tonini, C.

    2017-11-01

    We introduce a method for modeling disk galaxies designed to take full advantage of data from integral field spectroscopy (IFS). The method fits equilibrium models to simultaneously reproduce the surface brightness, rotation, and velocity dispersion profiles of a galaxy. The models are fully self-consistent 6D distribution functions for a galaxy with a Sérsic profile stellar bulge, exponential disk, and parametric dark-matter halo, generated by an updated version of GalactICS. By creating realistic flux-weighted maps of the kinematic moments (flux, mean velocity, and dispersion), we simultaneously fit photometric and spectroscopic data using both maximum-likelihood and Bayesian (MCMC) techniques. We apply the method to a GAMA spiral galaxy (G79635) with kinematics from the SAMI Galaxy Survey and deep g- and r-band photometry from the VST-KiDS survey, comparing parameter constraints with those from traditional 2D bulge-disk decomposition. Our method returns broadly consistent results for shared parameters while constraining the mass-to-light ratios of stellar components and reproducing the H I-inferred circular velocity well beyond the limits of the SAMI data. Although the method is tailored for fitting integral field kinematic data, it can use other dynamical constraints like central fiber dispersions and H I circular velocities, and is well-suited for modeling galaxies with a combination of deep imaging and H I and/or optical spectra (resolved or otherwise). Our implementation (MagRite) is computationally efficient and can generate well-resolved models and kinematic maps in under a minute on modern processors.

  11. Occlusion-Aware Fragment-Based Tracking With Spatial-Temporal Consistency.

    Science.gov (United States)

    Sun, Chong; Wang, Dong; Lu, Huchuan

    2016-08-01

    In this paper, we present a robust tracking method by exploiting a fragment-based appearance model with consideration of both temporal continuity and discontinuity information. From the perspective of probability theory, the proposed tracking algorithm can be viewed as a two-stage optimization problem. In the first stage, by adopting the estimated occlusion state as a prior, the optimal state of the tracked object can be obtained by solving an optimization problem, where the objective function is designed based on the classification score, occlusion prior, and temporal continuity information. In the second stage, we propose a discriminative occlusion model, which exploits both foreground and background information to detect the possible occlusion, and also models the consistency of occlusion labels among different frames. In addition, a simple yet effective training strategy is introduced during the model training (and updating) process, with which the effects of spatial-temporal consistency are properly weighted. The proposed tracker is evaluated by using the recent benchmark data set, on which the results demonstrate that our tracker performs favorably against other state-of-the-art tracking algorithms.

  12. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and the resulting tariff rises at the rate of interest. This tariff is found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff is characterized and found to differ markedly from the time-inconsistent open-loop tariff. It is shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  13. Investigation of the thermo-mechanical behavior of neutron-irradiated Fe-Cr alloys by self-consistent plasticity theory

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Xiazi [State Key Laboratory for Turbulence and Complex System, Department of Mechanics and Engineering Science, College of Engineering, Peking University, Beijing 100871 (China); CAPT, HEDPS and IFSA Collaborative Innovation Center of MoE, BIC-ESAT, Peking University, Beijing 100871 (China); Terentyev, Dmitry [Structural Material Group, Institute of Nuclear Materials Science, SCK CEN, Mol (Belgium); Yu, Long [State Key Laboratory for Turbulence and Complex System, Department of Mechanics and Engineering Science, College of Engineering, Peking University, Beijing 100871 (China); Bakaev, A. [Structural Material Group, Institute of Nuclear Materials Science, SCK CEN, Mol (Belgium); Jin, Zhaohui [School of Materials Science and Engineering, Shanghai Jiao Tong University, Shanghai 200240 (China); Duan, Huiling, E-mail: hlduan@pku.edu.cn [State Key Laboratory for Turbulence and Complex System, Department of Mechanics and Engineering Science, College of Engineering, Peking University, Beijing 100871 (China); CAPT, HEDPS and IFSA Collaborative Innovation Center of MoE, BIC-ESAT, Peking University, Beijing 100871 (China)

    2016-08-15

    The thermo-mechanical behavior of non-irradiated (at 223 K, 302 K and 573 K) and neutron irradiated (at 573 K) Fe-2.5Cr, Fe-5Cr and Fe-9Cr alloys is studied by a self-consistent plasticity theory, which consists of constitutive equations describing the contribution of radiation defects at grain level, and the elastic-viscoplastic self-consistent method to obtain polycrystalline behaviors. Attention is paid to two types of radiation-induced defects: interstitial dislocation loops and solute rich clusters, which are believed to be the main sources of hardening in Fe-Cr alloys at medium irradiation doses. Both the hardening mechanism and microstructural evolution are investigated by using available experimental data on microstructures, and implementing hardening rules derived from atomistic data. Good agreement with experimental data is achieved for both the yield stress and strain hardening of non-irradiated and irradiated Fe-Cr alloys by treating dislocation loops as strong thermally activated obstacles and solute rich clusters as weak shearable ones. - Highlights: • A self-consistent plasticity theory is proposed for irradiated Fe-Cr alloys. • Both the irradiation-induced hardening and plastic flow evolution are studied. • Dislocation loops and solute rich clusters are considered as the main defects. • Numerical results of the proposed model match with corresponding experimental data.

  14. Assessing the Consistency and Microbiological Effectiveness of Household Water Treatment Practices by Urban and Rural Populations Claiming to Treat Their Water at Home: A Case Study in Peru

    Science.gov (United States)

    Rosa, Ghislaine; Huaylinos, Maria L.; Gil, Ana; Lanata, Claudio; Clasen, Thomas

    2014-01-01

    Background Household water treatment (HWT) can improve drinking water quality and prevent disease if used correctly and consistently by vulnerable populations. Over 1.1 billion people report treating their water prior to drinking it. These estimates, however, are based on responses to household surveys that may exaggerate the consistency and microbiological performance of the practice—key factors for reducing pathogen exposure and achieving health benefits. The objective of this study was to examine how HWT practices are actually performed by households identified as HWT users, according to international monitoring standards. Methods and Findings We conducted a 6-month case study in urban (n = 117 households) and rural (n = 115 households) Peru, a country in which 82.8% of households report treating their water at home. We used direct observation, in-depth interviews, surveys, spot-checks, and water sampling to assess water treatment practices among households that claimed to treat their drinking water at home. While consistency of reported practices was high in both urban (94.8%) and rural (85.3%) settings, availability of treated water (based on self-report) at time of collection was low, with 67.1% and 23.0% of urban and rural households having treated water at all three sampling visits. Self-reported consumption of untreated water in the home among adults and children water of self-reported users was significantly better than source water in the urban setting and negligible but significantly better in the rural setting. However, only 46.3% and 31.6% of households had drinking water water quality. The lack of consistency and sub-optimal microbiological effectiveness also raises questions about the potential of HWT to prevent waterborne diseases. PMID:25522371

  15. Quasi-Particle Self-Consistent GW for Molecules.

    Science.gov (United States)

    Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J

    2016-06-14

    We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.

  16. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata, and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult-to-analyze fault...

  17. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and as the absence of the ECQ (“E contradictione quodlibet” rule that allows us to conclude any well formed formula from any contradiction. The aim of this paper is to explain the concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  18. Self-consistent clustering analysis: an efficient multiscale scheme for inelastic heterogeneous materials

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Z.; Bessa, M. A.; Liu, W.K.

    2017-10-25

    A predictive computational theory is presented for modeling complex, hierarchical materials ranging from metal alloys to polymer nanocomposites. The theory can capture complex mechanisms such as plasticity and failure that span multiple length scales. This general multiscale material modeling theory relies on sound principles of mathematics and mechanics, and a cutting-edge reduced-order modeling method named self-consistent clustering analysis (SCA) [Zeliang Liu, M.A. Bessa, Wing Kam Liu, “Self-consistent clustering analysis: An efficient multi-scale scheme for inelastic heterogeneous materials,” Comput. Methods Appl. Mech. Engrg. 306 (2016) 319–341]. SCA reduces by several orders of magnitude the computational cost of micromechanical and concurrent multiscale simulations, while retaining the microstructure information. This remarkable increase in efficiency is achieved with a data-driven clustering method. Computationally expensive operations are performed in the so-called offline stage, where degrees of freedom (DOFs) are agglomerated into clusters and the interaction tensor of these clusters is computed. In the online or predictive stage, the Lippmann-Schwinger integral equation is solved cluster-wise using a self-consistent scheme to ensure solution accuracy and avoid path dependence. To construct a concurrent multiscale model, this scheme is applied at each material point in a macroscale structure, replacing a conventional constitutive model with the average response computed from the microscale model using just the SCA online stage. A regularized damage theory is incorporated at the microscale that avoids the mesh and RVE size dependence that commonly plagues microscale damage calculations. The SCA method is illustrated with two cases: a carbon fiber reinforced polymer (CFRP) structure with the concurrent multiscale model and an application to fatigue prediction for additively manufactured metals. For the CFRP problem, a speed up estimated to be about
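The offline clustering stage can be illustrated with a minimal sketch: material points whose strain-concentration features are similar are agglomerated into the same cluster with k-means. The data here are random placeholders; a real SCA implementation clusters elastic strain concentration tensors precomputed from FFT or FEM analyses of the microstructure.

```python
import numpy as np

# Mock strain-concentration features: 200 material points, 6 strain components.
rng = np.random.default_rng(0)
n_points, n_features, k = 200, 6, 4
A = rng.normal(size=(n_points, n_features))

def kmeans(X, k, iters=50):
    """Tiny k-means: assign each point to its nearest center, then update."""
    centers = X[:k].copy()  # naive initialization from the first k points
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # squared distances: (n_points, k)
        d2 = ((X[:, None, :] - centers[None]) ** 2).sum(axis=-1)
        labels = np.argmin(d2, axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return labels, centers

labels, centers = kmeans(A, k)
# Each cluster then acts as one reduced DOF in the online stage, where the
# Lippmann-Schwinger equation is solved cluster-wise.
```

The reduction from 200 point-level DOFs to 4 cluster-level unknowns is what drives the orders-of-magnitude speed-up claimed for the online stage.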

  19. IT&C Impact on the Romanian Business and Organizations. The Enterprise Resource Planning and Business Intelligence Methods Influence on Manager’s Decision: A Case Study

    Directory of Open Access Journals (Sweden)

    Eduard EDELHAUSER

    2011-01-01

    Full Text Available The aim of this paper is to study the use of advanced management methods in Romania in 2010. The research results were obtained with a questionnaire, and our purpose was to test hypotheses concerning the effect that the implementation of ERP and BI applications across all functions of an organization has on the management method and on IT&C-based decision making. The originality of this article consists in its study of the implementation of computer-based advanced management methods. The study is limited to the SIVECO companies portfolio. The purpose of the study was to test hypotheses concerning the relationship between the size of the organization, the management method used, and the role of IT&C in decision making. The practical value of this study consists in the measurement of the impact of contingency factors, including size, and in the assessment of ERP system success. The results demonstrate that the relationship between firm size and ERP success is moderated by IT assets.

  20. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
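As a rough illustration of the Shapley rule in this setting, the sketch below splits a delay cost among activities by averaging each activity's marginal cost contribution over all orderings. The activity delays and the quadratic cost function are hypothetical choices for the example, not the model characterized in the paper.

```python
from itertools import permutations

# Hypothetical activity delays (time units) contributing to a project delay.
delays = {"a": 2, "b": 1, "c": 3}

def cost(coalition):
    """Illustrative convex cost of the combined delay of a set of activities."""
    return sum(delays[i] for i in coalition) ** 2

def shapley(players):
    """Average each player's marginal cost contribution over all orderings."""
    value = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        seen = []
        for p in order:
            value[p] += cost(seen + [p]) - cost(seen)
            seen.append(p)
    return {p: v / len(perms) for p, v in value.items()}

shares = shapley(list(delays))
print(shares)
```

Because the Shapley value is efficient, the allocated shares sum exactly to the cost of the full delay (here 6² = 36), and activities causing longer delays bear larger shares.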