WorldWideScience

Sample records for methods give consistent

  1. Do family physicians, emergency department physicians, and pediatricians give consistent sport-related concussion management advice?

    Science.gov (United States)

    Stoller, Jacqueline; Carson, James D; Garel, Alisha; Libfeld, Paula; Snow, Catherine L; Law, Marcus; Frémont, Pierre

    2014-06-01

    To identify differences and gaps in recommendations to patients for the management of sport-related concussion among family physicians (FPs), emergency department physicians (EDPs), and pediatricians. A self-administered, multiple-choice survey was e-mailed to FPs, EDPs, and pediatricians. The survey had been assessed for content validity. Two community teaching hospitals in the greater Toronto area in Ontario. Two hundred seventy physicians, including FPs, EDPs, and pediatricians, were invited to participate. Identification of sources of concussion management information, usefulness of concussion diagnosis strategies, and whether physicians use common terminology when explaining cognitive rest strategies to patients after sport-related concussions. The response rate was 43.7%. Surveys were completed by 70 FPs, 23 EDPs, and 11 pediatricians. In total, 49% of FP, 52% of EDP, and 27% of pediatrician respondents reported no knowledge of any consensus statements on concussion in sport, and 54% of FPs, 86% of EDPs, and 78% of pediatricians never used the Sport Concussion Assessment Tool, version 2. Only 49% of FPs, 57% of EDPs, and 36% of pediatricians always advised cognitive rest. This study identified large gaps in the knowledge of concussion guidelines and implementation of recommendations for treating patients with sport-related concussions. Although some physicians recommended physical and cognitive rest, a large proportion failed to consistently advise this strategy. Better knowledge transfer efforts should target all 3 groups of physicians. Copyright © the College of Family Physicians of Canada.

  2. Consistent forcing scheme in the cascaded lattice Boltzmann method

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
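
    For orientation, a generic multiple-relaxation-time collision-plus-forcing step can be written as below; the symbols (moment matrix M, relaxation matrix S, shift matrix N, forcing term F_i) are assumed notation for illustration rather than the paper's exact expressions.

```latex
f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t)
  = f_i(\mathbf{x}, t)
  - \left[ M^{-1} N^{-1} S\, N M \right]_{ij} \left( f_j - f_j^{\mathrm{eq}} \right)
  + F_i \,\Delta t
```

    Here the shift matrix N maps raw moments to central moments, so relaxation acts in the cascaded (central-moment) space; with N equal to the identity the update reduces to the standard MRT scheme, matching the degeneracy described in the abstract.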

  4. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

    Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned which may approach this degree of precision. (orig.)

  5. 14 CFR 221.140 - Method of giving concurrence.

    Science.gov (United States)

    2010-01-01

    ...) Conflicting authority to be avoided. Care should be taken to avoid giving authority to two or more carriers... Aviation shall be used by a carrier to give authority to another carrier to issue and file with the... used as authority to file joint fares or charges in which the carrier to whom the concurrence is given...

  6. Quasiparticle self-consistent GW method: a short summary

    International Nuclear Information System (INIS)

    Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios

    2007-01-01

    We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G₀, which is the starting point for GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by GWA. After self-consistency is attained, we have G₀, W (the screened Coulomb interaction) and G self-consistently. This G₀ can be interpreted as the optimum non-interacting propagator for the quasiparticles. We will summarize some theoretical discussions to justify QSGW. Then we will survey results which have been obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic, and can be explained from the neglect of excitonic effects.

  7. Linear augmented plane wave method for self-consistent calculations

    International Nuclear Information System (INIS)

    Takeda, T.; Kuebler, J.

    1979-01-01

    O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)

  8. An algebraic method for constructing stable and consistent autoregressive filters

    International Nuclear Information System (INIS)

    Harlim, John; Hong, Hoon; Robbins, Jacob L.

    2015-01-01

    In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR-models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern
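
    For reference, the classical stability condition invoked above can be stated for a univariate AR(p) model as follows; the consistency constraints that tie the coefficients to second-order Adams–Bashforth are specific to the paper and are not reproduced here.

```latex
x_n = \sum_{j=1}^{p} a_j\, x_{n-j} + \varepsilon_n,
\qquad
\text{stability:}\quad 1 - \sum_{j=1}^{p} a_j z^{j} \neq 0 \ \ \text{for all } |z| \le 1 .
```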

  9. Consistency analysis of subspace identification methods based on a linear regression approach

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2001-01-01

    In the literature results can be found which claim consistency for the subspace method under certain quite weak assumptions. Unfortunately, a new result gives a counterexample showing inconsistency under these assumptions and then gives new, stricter sufficient assumptions which, however, do not include important model structures such as Box-Jenkins. Based on a simple least squares approach, this paper shows the possible inconsistency under the weak assumptions and develops only slightly stricter assumptions sufficient for consistency and which include any model structure...

  10. Bootstrap embedding: An internally consistent fragment-based method

    Energy Technology Data Exchange (ETDEWEB)

    Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy [Department of Chemistry, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States)

    2016-08-21

    Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments “embedded” in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed “Bootstrap Embedding,” a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.

  11. Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.

    Science.gov (United States)

    Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth

    2017-02-01

    Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
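
    As a concrete illustration of the uncorrected baseline discussed above, the sketch below computes k-mer frequency vectors and their squared Euclidean distance; the model-based distance correction derived in the paper is not reproduced, and all function names are illustrative.

```python
from collections import Counter
from itertools import product

def kmer_vector(seq, k=3, alphabet="ACGT"):
    """Frequency vector over all k-mers of the given alphabet."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts.values()), 1)
    return [counts[''.join(km)] / total for km in product(alphabet, repeat=k)]

def squared_euclidean(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

# Example: pairwise distance used (uncorrected) as a proxy for a tree metric
d = squared_euclidean(kmer_vector("ACGTACGTAC"), kmer_vector("ACGTTCGTAC"))
print(d)
```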

  12. Fully consistent CFD methods for incompressible flow computations

    DEFF Research Database (Denmark)

    Kolmogorov, Dmitry; Shen, Wen Zhong; Sørensen, Niels N.

    2014-01-01

    Nowadays collocated grid based CFD methods are one of the most efficient tools for computations of the flows past wind turbines. To ensure the robustness of the methods they require special attention to the well-known problem of pressure-velocity coupling. Many commercial codes to ensure the pressure...

  13. Self-consistent study of nuclei far from stability with the energy density method

    CERN Document Server

    Tondeur, F

    1981-01-01

    The self-consistent energy density method has been shown to give good results with a small number of parameters for the calculation of nuclear masses, radii, deformations, neutron skins, shell and subshell effects. It is here used to study the properties of nuclei far from stability, like densities, shell structure, even-odd mass differences, single-particle potentials and nuclear deformations. A few possible consequences of the results for astrophysical problems are briefly considered. The predictions of the model in the superheavy region are summarised. (34 refs).

  14. Simplified DFT methods for consistent structures and energies of large systems

    Science.gov (United States)

    Caldeweyher, Eike; Gerit Brandenburg, Jan

    2018-05-01

    Kohn–Sham density functional theory (DFT) is routinely used for the fast electronic structure computation of large systems and will most likely continue to be the method of choice for the generation of reliable geometries in the foreseeable future. Here, we present a hierarchy of simplified DFT methods designed for consistent structures and non-covalent interactions of large systems with particular focus on molecular crystals. The covered methods are a minimal basis set Hartree–Fock (HF-3c), a small basis set screened exchange hybrid functional (HSE-3c), and a generalized gradient approximated functional evaluated in a medium-sized basis set (B97-3c), all augmented with semi-classical correction potentials. We give an overview on the methods design, a comprehensive evaluation on established benchmark sets for geometries and lattice energies of molecular crystals, and highlight some realistic applications on large organic crystals with several hundreds of atoms in the primitive unit cell.

  15. An eigenvalue approach to quantum plasmonics based on a self-consistent hydrodynamics method.

    Science.gov (United States)

    Ding, Kun; Chan, C T

    2018-02-28

    Plasmonics has attracted much attention not only because it has useful properties such as strong field enhancement, but also because it reveals the quantum nature of matter. To handle quantum plasmonics effects, ab initio packages or empirical Feibelman d-parameters have been used to explore the quantum correction of plasmonic resonances. However, most of these methods are formulated within the quasi-static framework. The self-consistent hydrodynamics model offers a reliable approach to study quantum plasmonics because it can incorporate the quantum effect of the electron gas into classical electrodynamics in a consistent manner. Instead of the standard scattering method, we formulate the self-consistent hydrodynamics method as an eigenvalue problem to study quantum plasmonics with electrons and photons treated on the same footing. We find that the eigenvalue approach must involve a global operator, which originates from the energy functional of the electron gas. This manifests the intrinsic nonlocality of the response of quantum plasmonic resonances. Our model gives the analytical forms of quantum corrections to plasmonic modes, incorporating quantum electron spill-out effects and electrodynamical retardation. We apply our method to study the quantum surface plasmon polariton for a single flat interface.

  16. 14 CFR 221.150 - Method of giving power of attorney.

    Science.gov (United States)

    2010-01-01

    ... accordance with a form acceptable to the Office of International Aviation shall be used by a carrier to give... the word “Agent”. When such a designee is replaced the Department shall be immediately notified in...

  17. Generalized WKB method through an appropriate canonical transformation giving an exact invariant

    International Nuclear Information System (INIS)

    Guyard, J.; Nadeau, A.

    1976-01-01

    The solution of differential equations of the type d²q/dτ² + ω²(τ)q = 0 is of great interest in Physics. Authors often introduce an auxiliary function w, the solution of a differential equation which can be solved by a perturbation method. In fact this approach is nothing but an extension of the well known WKB method. Lewis has found an exact invariant of the motion, given in closed form in terms of w. This method can now be used as a natural way of introducing the WKB extension in a much easier way.
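
    The exact invariant referred to above is, in its standard Ermakov–Lewis form, the expression below; here the auxiliary function ρ plays the role of w in the abstract.

```latex
I = \tfrac{1}{2}\!\left[\left(\frac{q}{\rho}\right)^{2}
      + \left(\rho\,\dot q - \dot\rho\, q\right)^{2}\right],
\qquad
\ddot\rho + \omega^{2}(\tau)\,\rho = \rho^{-3} .
```

    The quantity I is conserved along every solution of d²q/dτ² + ω²(τ)q = 0, which is what makes the construction useful for the WKB-type extension described above.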

  18. Giving presentations

    CERN Document Server

    Ellis, Mark

    1997-01-01

    This is part of a series of books, which gives training in key business communication skills. Emphasis is placed on building awareness of language appropriateness and fluency in typical business interactions. This new edition is in full colour.

  19. Self-consistent field variational cellular method as applied to the band structure calculation of sodium

    International Nuclear Information System (INIS)

    Lino, A.T.; Takahashi, E.K.; Leite, J.R.; Ferraz, A.C.

    1988-01-01

    The band structure of metallic sodium is calculated, using for the first time the self-consistent field variational cellular method. In order to implement the self-consistency in the variational cellular theory, the crystal electronic charge density was calculated within the muffin-tin approximation. The comparison between our results and those derived from other calculations leads to the conclusion that the proposed self-consistent version of the variational cellular method is fast and accurate. (author) [pt

  20. Gene tree rooting methods give distributions that mimic the coalescent process.

    Science.gov (United States)

    Tian, Yuan; Kubatko, Laura S

    2014-01-01

    Multi-locus phylogenetic inference is commonly carried out via models that incorporate the coalescent process to model the possibility that incomplete lineage sorting leads to incongruence between gene trees and the species tree. An interesting question that arises in this context is whether data "fit" the coalescent model. Previous work (Rosenfeld et al., 2012) has suggested that rooting of gene trees may account for variation in empirical data that has been previously attributed to the coalescent process. We examine this possibility using simulated data. We show that, in the case of four taxa, the distribution of gene trees observed from rooting estimated gene trees with either the molecular clock or with outgroup rooting can be closely matched by the distribution predicted by the coalescent model with specific choices of species tree branch lengths. We apply commonly-used coalescent-based methods of species tree inference to assess their performance in these situations. Copyright © 2013 Elsevier Inc. All rights reserved.
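
    For orientation, the coalescent gene-tree distribution against which such comparisons are made has a simple closed form in the rooted three-taxon case (the four-taxon setting studied in the paper is analogous but not reproduced here):

```latex
P(\text{gene tree matches the species tree}) = 1 - \tfrac{2}{3}\,e^{-t},
\qquad
P(\text{each of the two mismatching topologies}) = \tfrac{1}{3}\,e^{-t},
```

    where t is the internal branch length of the species tree in coalescent units.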

  1. A Benchmark Estimate for the Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2001-01-01

    There are alternative methods to estimate a capital stock for a benchmark year. These methods, however, do not allow for an independent check, which could establish whether the estimated benchmark level is too high or too low. I propose here an optimal consistency method (OCM), which may allow estimating a capital stock level for a benchmark year and/or checking the consistency of alternative estimates of a benchmark capital stock.

  2. Measuring consistency in translation memories: a mixed-methods case study

    OpenAIRE

    Moorkens, Joss

    2012-01-01

    Introduced in the early 1990s, translation memory (TM) tools have since become widely used as an aid to human translation based on commonly‐held assumptions that they save time, reduce cost, and maximise consistency. The purpose of this research is twofold: it aims to develop a method for measuring consistency in TMs; and it aims to use this method to interrogate selected TMs from the localisation industry in order to find out whether the use of TM tools does, in fact, promote consistency in ...

  3. An Economical Approach to Estimate a Benchmark Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2003-01-01

    There are alternative methods of estimating capital stock for a benchmark year. However, these methods are costly and time-consuming, requiring the gathering of much basic information as well as the use of some convenient assumptions and guesses. In addition, a way is needed of checking whether the estimated benchmark is at the correct level. This paper proposes an optimal consistency method (OCM), which enables a capital stock to be estimated for a benchmark year, and which can also be used ...

  4. A self-consistent nodal method in response matrix formalism for the multigroup diffusion equations

    International Nuclear Information System (INIS)

    Malambu, E.M.; Mund, E.H.

    1996-01-01

    We develop a nodal method for the multigroup diffusion equations, based on the transverse integration procedure (TIP). The efficiency of the method rests upon the convergence properties of a high-order multidimensional nodal expansion and upon numerical implementation aspects. The discrete 1D equations are cast in response matrix formalism. The derivation of the transverse leakage moments is self-consistent i.e. does not require additional assumptions. An outstanding feature of the method lies in the linear spatial shape of the local transverse leakage for the first-order scheme. The method is described in the two-dimensional case. The method is validated on some classical benchmark problems. (author)

  5. Method used to test the imaging consistency of binocular camera's left-right optical system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing the overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method used to measure the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and the right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained based on the multiple threshold segmentation result and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, the constraint of gray level based on the corresponding coordinates of left-right images is established and the imaging consistency could be evaluated through the standard deviation σ of the imaging grayscale difference D (x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for carrying out imaging consistency testing for binocular cameras. When the standard deviation 3σ distribution of the imaging gray difference D (x, y) between the left and right optical systems of the binocular camera does not exceed 5%, it is believed that the design requirements have been achieved. This method could be used effectively and paves the way for imaging consistency testing of binocular cameras.
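
    A minimal sketch of the acceptance check described above, assuming the left and right images are already co-registered grayscale arrays; the 5% threshold on the 3σ spread of the gray-level difference follows the abstract, while normalization by the sensor's full scale is an assumption.

```python
import numpy as np

def imaging_consistency_ok(left, right, full_scale=255.0, limit=0.05):
    """Return (ok, spread), where spread is 3*sigma of the normalized
    gray-level difference D(x, y) between the two optical systems."""
    d = (left.astype(float) - right.astype(float)) / full_scale
    spread = 3.0 * d.std()
    return spread <= limit, spread

# Example with synthetic 8-bit images
rng = np.random.default_rng(0)
left = rng.integers(100, 120, size=(480, 640))
right = left + rng.integers(-2, 3, size=(480, 640))
print(imaging_consistency_ok(left, right))
```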

  6. On Consistency Test Method of Expert Opinion in Ecological Security Assessment.

    Science.gov (United States)

    Gong, Zaiwu; Wang, Lihong

    2017-09-04

    Ecological safety assessment is of great value for the proactive design of human security management and safety warning. In the comprehensive evaluation of regional ecological security with the participation of experts, the experts' individual judgment level and ability and the consistency of the experts' overall opinion have a very important influence on the evaluation result. This paper studies consistency and consensus measures based on the multiplicative and additive consistency properties of fuzzy preference relations (FPRs). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure by computing the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure by computing the distance between the original collective judgment and the optimal collective estimation. In the end, we present a case study on ecological security for five cities. The results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgment and the consensus degree of collective judgment.
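
    The additive-consistency property referred to above is commonly stated as follows; the distance-based index shown afterwards is only a generic illustration (its normalization is assumed), not the paper's exact consistency or consensus measure.

```latex
p_{ij} + p_{jk} + p_{ki} = \tfrac{3}{2} \quad \text{for all } i,j,k
\qquad \text{(additive consistency of an FPR } P = (p_{ij})\text{)},
\qquad
\mathrm{CI}(P) = 1 - \frac{2}{n(n-1)} \sum_{i<j} \left| p_{ij} - \hat p_{ij} \right| .
```

    Here \hat P denotes the optimal consistent estimation of P, and a larger CI indicates a more consistent judgment.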

  7. Integrating the Toda Lattice with Self-Consistent Source via Inverse Scattering Method

    International Nuclear Information System (INIS)

    Urazboev, Gayrat

    2012-01-01

    In this work it is shown that the solutions of the Toda lattice with a self-consistent source can be found by the inverse scattering method for the discrete Sturm-Liouville operator. For the problem considered, the one-soliton solution is obtained.

  8. Using network screening methods to determine locations with specific safety issues: A design consistency case study.

    Science.gov (United States)

    Butsick, Andrew J; Wood, Jonathan S; Jovanis, Paul P

    2017-09-01

    The Highway Safety Manual provides multiple methods that can be used to identify sites with promise (SWiPs) for safety improvement. However, most of these methods cannot be used to identify sites with specific problems. Furthermore, given that infrastructure funding is often specified for use related to specific problems/programs, a method for identifying SWiPs related to those programs would be very useful. This research establishes a method for identifying SWiPs with specific issues. This is accomplished using two safety performance functions (SPFs). This method is applied to identifying SWiPs with geometric design consistency issues. Mixed effects negative binomial regression was used to develop two SPFs using 5 years of crash data and over 8754 km of two-lane rural roadway. The first SPF contained typical roadway elements while the second contained additional geometric design consistency parameters. After empirical Bayes adjustments, sites with promise (SWiPs) were identified. The disparity between SWiPs identified by the two SPFs was evident; 40 unique sites were identified by each model out of the top 220 segments. By comparing sites across the two models, candidate road segments can be identified where a lack of design consistency may be contributing to an increase in expected crashes. Practitioners can use this method to more effectively identify roadway segments suffering from reduced safety performance due to geometric design inconsistency, with detailed engineering studies of identified sites required to confirm the initial assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
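
    The empirical Bayes adjustment mentioned above combines each segment's SPF prediction with its observed crash count; the sketch below uses the standard negative binomial weighting, with the function and parameter names being illustrative assumptions rather than the authors' code.

```python
def eb_expected_crashes(spf_predicted, observed, overdispersion_k):
    """Empirical Bayes estimate: a weighted average of the SPF prediction
    and the observed crashes, with weight w = 1 / (1 + k * predicted)."""
    w = 1.0 / (1.0 + overdispersion_k * spf_predicted)
    return w * spf_predicted + (1.0 - w) * observed

# Example: a segment predicted to have 2.4 crashes but with 6 observed
print(eb_expected_crashes(2.4, 6, overdispersion_k=0.8))
```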

  9. Comparison of COD, R6, and J-contour integral methods of defect assessment, modified to give critical flaw sizes

    International Nuclear Information System (INIS)

    Burdekin, F.M.; Turner, C.E.

    1982-01-01

    A comparative study of the application of different elastic-plastic fracture mechanics methods to the calculation of critical defect sizes in pressure vessels showed widely varying results. The present authors have investigated in detail the reasons for the variations resulting from applying the CEGB R6, COD design curve, and J-design curve methods to the particular pressure vessel problems. To obtain reasonable agreement between the three methods for the calculation of critical flaw sizes in high stress gradient situations, the published COD method in PD6493 has to be modified to remove its inherent safety factor and to allow for stress gradients, and a consistent treatment for gross yielding/collapse has to be adopted for all three methods. (author)

  10. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment of the unit-system near two levels is the most important content in the reliability multi-level synthesis of complex systems. Introducing information theory into system reliability assessment, using the addible characteristic of information quantity and the principle of equivalence of information quantity, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived based on the principle of information quantity equivalence. The general models of entropy method synthesis assessment for system reliability approximate lower limits are established according to the fundamental principle of unit reliability assessment. The applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable and the assessment results are very satisfactory.

  11. The method and program system CABEI for adjusting consistency between natural element and its isotopes data

    Energy Technology Data Exchange (ETDEWEB)

    Tingjin, Liu; Zhengjun, Sun [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    To meet the requirements of nuclear engineering, especially of nuclear fusion reactors, the data in the major evaluated libraries are now given not only for the natural element but also for its isotopes. Inconsistency between element and isotope data is one of the main problems in present evaluated neutron libraries. The formulas for adjusting the data to satisfy simultaneously the two kinds of consistency relationships were derived by means of the least squares method, and the program system CABEI was developed. The program was tested by calculating the Fe data in CENDL-2.1. The results show that the adjusted values satisfy the two kinds of consistency relationships.

  12. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    Science.gov (United States)

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model extended hierarchical service-finite state automata (EHS-FSA) is constructed based on finite state automata (FSA), which formally depict overall changing processes of service consistency states. And also the service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that the bad reusability (17.93% on average) is the biggest influential factor, the noncomposition of atomic services (13.12%) is the second biggest one, and the service version's confusion (1.2%) is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  13. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    Directory of Open Access Journals (Sweden)

    Linjun Fan

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties in the usage of traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA), which formally depict overall changing processes of service consistency states. And also the service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that the bad reusability (17.93% on average) is the biggest influential factor, the noncomposition of atomic services (13.12%) is the second biggest one, and the service version's confusion (1.2%) is the smallest one. Compared with previous qualitative analysis, SCEAs present good effectiveness and feasibility. This research can guide the engineers of service consistency technologies toward obtaining a higher level of consistency in SODSA.

  14. Homogenization of Periodic Masonry Using Self-Consistent Scheme and Finite Element Method

    Science.gov (United States)

    Kumar, Nitin; Lambadi, Harish; Pandey, Manoj; Rajagopal, Amirtham

    2016-01-01

    Masonry is a heterogeneous anisotropic continuum, made up of the brick and mortar arranged in a periodic manner. Obtaining the effective elastic stiffness of the masonry structures has been a challenging task. In this study, the homogenization theory for periodic media is implemented in a very generic manner to derive the anisotropic global behavior of the masonry, through rigorous application of the homogenization theory in one step and through a full three-dimensional behavior. We have considered the periodic Eshelby self-consistent method and the finite element method. Two representative unit cells that represent the microstructure of the masonry wall exactly are considered for calibration and numerical application of the theory.

  15. Self-consistent collective coordinate method for large amplitude collective motions

    International Nuclear Information System (INIS)

    Sakata, F.; Hashimoto, Y.; Marumori, T.; Une, T.

    1982-01-01

    A recent development of the self-consistent collective coordinate method is described. The self-consistent collective coordinate method was proposed on the basis of the fundamental principle called the invariance principle of the Schroedinger equation. If this is formulated within the framework of the time-dependent Hartree-Fock (TDHF) theory, a classical version of the theory is obtained. A quantum version of the theory is deduced by formulating it within the framework of the unitary transformation method with auxiliary bosons. In this report, the discussion is concentrated on the relation between the classical theory and the quantum theory, and on the applicability of the classical theory. The aim of the classical theory is to extract a maximally decoupled collective subspace out of the huge-dimensional 1p-1h parameter space introduced by the TDHF theory. An intimate similarity between the classical theory and a full quantum boson expansion method (BEM) was clarified. Discussion was concentrated on a simple Lipkin model. Then the relation between the BEM and the unitary transformation method with auxiliary bosons was discussed. It became clear that the quantum version of the theory had a strong relation to the BEM, and that the BEM was nothing but a quantum analogue of the present classical theory. The present theory was compared with the full TDHF calculation by using a simple model. (Kato, T.)

  16. Physically consistent data assimilation method based on feedback control for patient-specific blood flow analysis.

    Science.gov (United States)

    Ii, Satoshi; Adib, Mohd Azrul Hisham Mohd; Watanabe, Yoshiyuki; Wada, Shigeo

    2018-01-01

    This paper presents a novel data assimilation method for patient-specific blood flow analysis based on feedback control theory called the physically consistent feedback control-based data assimilation (PFC-DA) method. In the PFC-DA method, the signal, which is the residual error term of the velocity when comparing the numerical and reference measurement data, is cast as a source term in a Poisson equation for the scalar potential field that induces flow in a closed system. The pressure values at the inlet and outlet boundaries are recursively calculated by this scalar potential field. Hence, the flow field is physically consistent because it is driven by the calculated inlet and outlet pressures, without any artificial body forces. As compared with existing variational approaches, although this PFC-DA method does not guarantee the optimal solution, only one additional Poisson equation for the scalar potential field is required, providing a remarkable improvement for such a small additional computational cost at every iteration. Through numerical examples for 2D and 3D exact flow fields, with both noise-free and noisy reference data as well as a blood flow analysis on a cerebral aneurysm using actual patient data, the robustness and accuracy of this approach is shown. Moreover, the feasibility of a patient-specific practical blood flow analysis is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.

  17. A fast inverse consistent deformable image registration method based on symmetric optical flow computation

    International Nuclear Information System (INIS)

    Yang Deshan; Li Hua; Low, Daniel A; Deasy, Joseph O; Naqa, Issam El

    2008-01-01

    Deformable image registration is widely used in various radiation therapy applications including daily treatment planning adaptation to map planned tissue or dose to changing anatomy. In this work, a simple and efficient inverse consistency deformable registration method is proposed with aims of higher registration accuracy and faster convergence speed. Instead of registering image I to a second image J, the two images are symmetrically deformed toward one another in multiple passes, until both deformed images are matched and correct registration is therefore achieved. In each pass, a delta motion field is computed by minimizing a symmetric optical flow system cost function using modified optical flow algorithms. The images are then further deformed with the delta motion field in the positive and negative directions respectively, and then used for the next pass. The magnitude of the delta motion field is forced to be less than 0.4 voxel for every pass in order to guarantee smoothness and invertibility for the two overall motion fields that are accumulating the delta motion fields in both positive and negative directions, respectively. The final motion fields to register the original images I and J, in either direction, are calculated by inverting one overall motion field and combining the inversion result with the other overall motion field. The final motion fields are inversely consistent and this is ensured by the symmetric way that registration is carried out. The proposed method is demonstrated with phantom images, artificially deformed patient images and 4D-CT images. Our results suggest that the proposed method is able to improve the overall accuracy (reducing registration error by 30% or more, compared to the original and inversely inconsistent optical flow algorithms), reduce the inverse consistency error (by 95% or more) and increase the convergence rate (by 100% or more). The overall computation speed may slightly decrease, or increase in most cases
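
    A heavily simplified 1D sketch of the symmetric, capped-step registration idea described above, assuming a demons-style delta field and plain additive accumulation of displacements (the paper composes and finally inverts full 3D motion fields); all function names are illustrative.

```python
import numpy as np

def warp(img, disp):
    """Warp a 1D image by a displacement field (linear interpolation)."""
    x = np.arange(img.size, dtype=float)
    return np.interp(x + disp, x, img)

def symmetric_register(I, J, passes=200, cap=0.4):
    phi_pos = np.zeros_like(I, dtype=float)   # deforms I toward J
    phi_neg = np.zeros_like(I, dtype=float)   # deforms J toward I
    for _ in range(passes):
        a, b = warp(I, phi_pos), warp(J, phi_neg)
        grad = np.gradient((a + b) / 2.0)
        diff = b - a
        delta = diff * grad / (grad ** 2 + diff ** 2 + 1e-12)
        delta = np.clip(delta, -cap, cap)     # keep every step below 0.4 voxel
        phi_pos += delta                      # simplified accumulation
        phi_neg -= delta                      # (true composition needs interpolation)
    return phi_pos, phi_neg

# Example: two slightly shifted 1D profiles
x = np.linspace(0.0, 2.0 * np.pi, 128)
phi_pos, phi_neg = symmetric_register(np.sin(x), np.sin(x - 0.05))
```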

  18. Geometry of the self-consistent collective-coordinate method for the large-amplitude collective motion

    International Nuclear Information System (INIS)

    Sakata, Fumihiko; Marumori, Toshio; Hashimoto, Yukio; Une, Tsutomu.

    1983-05-01

    The geometry of the self-consistent collective-coordinate (SCC) method formulated within the framework of the time-dependent Hartree-Fock (TDHF) theory is investigated by associating the variational parameters with a symplectic manifold (a TDHF manifold). With the use of a canonical-variables parametrization, it is shown that the TDHF equation is equivalent to the canonical equations of motion in classical mechanics in the TDHF manifold. This enables us to investigate geometrical structure of the SCC method in the language of the classical mechanics. The SCC method turns out to give a prescription how to dynamically extract a ''maximally-decoupled'' collective submanifold (hypersurface) out of the TDHF manifold, in such a way that a certain kind of trajectories corresponding to the large-amplitude collective motion under consideration can be reproduced on the hypersurface as precisely as possible. The stability of the hypersurface at each point on it is investigated, in order to see whether the hypersurface obtained by the SCC method is really an approximate integral surface in the TDHF manifold or not. (author)

  19. RPA method based on the self-consistent cranking model for 168Er and 158Dy

    International Nuclear Information System (INIS)

    Kvasil, J.; Cwiok, S.; Chariev, M.M.; Choriev, B.

    1983-01-01

    The low-lying nuclear states in 168Er and 158Dy are analysed within the random phase approximation (RPA) method based on the self-consistent cranking model (SCCM). The moment of inertia, the value of the chemical potential, and the strength constant k₁ have been obtained from the symmetry condition. The pairing strength constants Gτ have been determined from the experimental values of neutron and proton pairing energies for nonrotating nuclei. A quite good agreement with experimental energies of states with positive parity was obtained without introducing the two-phonon vibrational states.

  20. Analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field method

    Directory of Open Access Journals (Sweden)

    N.Yoshida

    2007-09-01

    An analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field (MOZ-SCF) method is presented. MOZ-SCF theory is one of the theories for considering solvent effects on the solute electronic structure in solution [Yoshida N. et al., J. Chem. Phys., 2000, 113, 4974]. Molecular geometries of water, formaldehyde, acetonitrile and acetone in water are optimized by the analytical energy gradient formula. The results are compared with those from the polarizable continuum model (PCM), the reference interaction site model (RISM-SCF) and the three-dimensional (3D) RISM-SCF.

  1. The Roche Immunoturbidimetric Albumin Method on Cobas c 501 Gives Higher Values Than the Abbott and Roche BCP Methods When Analyzing Patient Plasma Samples.

    Science.gov (United States)

    Helmersson-Karlqvist, Johanna; Flodin, Mats; Havelka, Aleksandra Mandic; Xu, Xiao Yan; Larsson, Anders

    2016-09-01

    Serum/plasma albumin is an important and widely used laboratory marker and it is important that we measure albumin correctly without bias. We had indications that the immunoturbidimetric method on Cobas c 501 and the bromocresol purple (BCP) method on Architect 16000 differed, so we decided to study these methods more closely. A total of 1,951 patient requests with albumin measured with both the Architect BCP and Cobas immunoturbidimetric methods were extracted from the laboratory system. A comparison with fresh plasma samples was also performed that included immunoturbidimetric and BCP methods on Cobas c 501 and analysis of the international protein calibrator ERM-DA470k/IFCC. The median difference between the Abbott BCP and Roche immunoturbidimetric methods was 3.3 g/l and the Roche method overestimated ERM-DA470k/IFCC by 2.2 g/l. The Roche immunoturbidimetric method gave higher values than the Roche BCP method: y = 1.111x - 0.739, R² = 0.971. The Roche immunoturbidimetric albumin method gives clearly higher values than the Abbott and Roche BCP methods when analyzing fresh patient samples. The differences between the two methods were similar at normal and low albumin levels. © 2016 Wiley Periodicals, Inc.

  2. Quasiparticle self-consistent GW method for the spectral properties of complex materials.

    Science.gov (United States)

    Bruneval, Fabien; Gatti, Matteo

    2014-01-01

    The GW approximation to the formally exact many-body perturbation theory has been applied successfully to materials for several decades. Since the practical calculations are extremely cumbersome, the GW self-energy is most commonly evaluated using a first-order perturbative approach: This is the so-called G₀W₀ scheme. However, the G₀W₀ approximation depends heavily on the mean-field theory that is employed as a basis for the perturbation theory. Recently, a procedure to reach a kind of self-consistency within the GW framework has been proposed. The quasiparticle self-consistent GW (QSGW) approximation retains some positive aspects of a self-consistent approach, but circumvents the intricacies of the complete GW theory, which is inconveniently based on a non-Hermitian and dynamical self-energy. This new scheme allows one to surmount most of the flaws of the usual G₀W₀ at a moderate calculation cost and at a reasonable implementation burden. In particular, the issues of small band gap semiconductors, of large band gap insulators, and of some transition metal oxides are then cured. The QSGW method broadens the range of materials for which the spectral properties can be predicted with confidence.

  3. A Dynamic Linear Hashing Method for Redundancy Management in Train Ethernet Consist Network

    Directory of Open Access Journals (Sweden)

    Xiaobo Nie

    2016-01-01

    Massive transportation systems like trains are considered critical systems because they use the communication network to control essential subsystems on board. A critical system requires zero recovery time when a failure occurs in the communication network. The newly published IEC 62439-3 defines the high-availability seamless redundancy protocol, which fulfills this requirement and ensures no frame loss in the presence of an error. This paper adopts this protocol for the train Ethernet consist network. The challenge is the management of the circulating frames while meeting real-time processing requirements, fast switching times, high throughput, and deterministic behavior. The main contribution of this paper is the in-depth analysis it makes of the network parameters imposed by the application of the protocol to the train control and monitoring system (TCMS), together with a redundant-circulating-frame discarding method based on dynamic linear hashing, chosen as the fastest method in order to resolve the issues dealt with.
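
    The paper's dynamic linear hashing structure is not reproduced here; the sketch below only illustrates the underlying duplicate-discard rule for redundant circulating frames, using an ordinary Python set keyed by an assumed (source MAC, sequence number) pair as a stand-in.

```python
class DuplicateDiscard:
    """Drop the second copy of each (source, sequence-number) pair, as
    required when identical frames arrive over both redundant paths."""

    def __init__(self):
        self.seen = set()   # stand-in for the paper's dynamic linear hash table

    def accept(self, src_mac, seq_no):
        key = (src_mac, seq_no)
        if key in self.seen:
            return False    # duplicate from the other path: discard
        self.seen.add(key)
        return True

dd = DuplicateDiscard()
print(dd.accept("00:80:63:aa:bb:cc", 17))  # True  (first copy is delivered)
print(dd.accept("00:80:63:aa:bb:cc", 17))  # False (redundant copy is dropped)
```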

  4. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    Science.gov (United States)

    Kutepov, A. L.; Oudovenko, V. S.; Kotliar, G.

    2017-10-01

    We present a code implementing the linearized quasiparticle self-consistent GW method (LQSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary time representation in the same way as in the space-time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.
    Program files doi: http://dx.doi.org/10.17632/cpchkfty4w.1
    Licensing provisions: GNU General Public License
    Programming language: Fortran 90
    External routines/libraries: BLAS, LAPACK, MPI (optional)
    Nature of problem: Direct implementation of the GW method scales as N⁴ with the system size, which quickly becomes prohibitively time consuming even on modern computers.
    Solution method: We implemented the GW approach using a method that switches between real-space and momentum-space representations. Some operations are faster in real space, whereas others are more computationally efficient in reciprocal space. This makes our approach scale as N³.
    Restrictions: The limiting factor is usually the memory available in a computer. Using 10 GB/core of memory allows us to study systems of up to 15 atoms per unit cell.

  5. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    International Nuclear Information System (INIS)

    Kutepov, A. L.

    2017-01-01

    We present a code implementing the linearized self-consistent quasiparticle GW method (QSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary time representation in the same way as in the space-time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.

  6. An exact and consistent adjoint method for high-fidelity discretization of the compressible flow equations

    Science.gov (United States)

    Subramanian, Ramanathan Vishnampet Ganapathi

    , and can be tailored to achieve global conservation up to arbitrary orders of accuracy. We again confirm that the sensitivity gradient for turbulent jet noise computed using our dual-consistent method is only limited by computing precision.

  7. Direct and regression methods do not give different estimates of digestible and metabolizable energy of wheat for pigs.

    Science.gov (United States)

    Bolarinwa, O A; Adeola, O

    2012-12-01

    Digestible and metabolizable energy contents of feed ingredients for pigs can be determined by direct or indirect methods. There are situations when only the indirect approach is suitable and the regression method is a robust indirect approach. This study was conducted to compare the direct and regression methods for determining the energy value of wheat for pigs. Twenty-four barrows with an average initial BW of 31 kg were assigned to 4 diets in a randomized complete block design. The 4 diets consisted of 969 g wheat/kg plus minerals and vitamins (sole wheat) for the direct method, corn (Zea mays)-soybean (Glycine max) meal reference diet (RD), RD + 300 g wheat/kg, and RD + 600 g wheat/kg. The 3 corn-soybean meal diets were used for the regression method and wheat replaced the energy-yielding ingredients, corn and soybean meal, so that the same ratio of corn and soybean meal across the experimental diets was maintained. The wheat used was analyzed to contain 883 g DM, 15.2 g N, and 3.94 Mcal GE/kg. Each diet was fed to 6 barrows in individual metabolism crates for a 5-d acclimation followed by a 5-d total but separate collection of feces and urine. The DE and ME for the sole wheat diet were 3.83 and 3.77 Mcal/kg DM, respectively. Because the sole wheat diet contained 969 g wheat/kg, these translate to 3.95 Mcal DE/kg DM and 3.89 Mcal ME/kg DM. The RD used for the regression approach yielded 4.00 Mcal DE and 3.91 Mcal ME/kg DM diet. Increasing levels of wheat in the RD linearly reduced the DE and ME of the diets. The DE and ME of wheat obtained using the direct method (3.95 and 3.89 Mcal/kg DM) did not differ (0.78 < P < 0.89) from those obtained using the regression method (3.96 and 3.88 Mcal/kg DM).
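
    A minimal sketch of the regression approach described above, under the assumption that the test ingredient's contribution to digestible energy (kcal) is regressed on its DM intake (kg) so that the slope estimates DE per kg of DM; the numbers below are invented for illustration.

```python
import numpy as np

# Hypothetical data: wheat DM intake (kg/d) and wheat's contribution to
# digestible energy intake (kcal/d) for pigs fed the RD + wheat diets.
wheat_dmi = np.array([0.00, 0.42, 0.45, 0.83, 0.88, 0.91])
wheat_de_contrib = np.array([0.0, 1660.0, 1790.0, 3270.0, 3490.0, 3610.0])

slope, intercept = np.polyfit(wheat_dmi, wheat_de_contrib, 1)
print(f"Regression-derived DE of wheat: {slope:.0f} kcal/kg DM")
```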

  8. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
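
    The one-dimensional PDE referred to above takes, in the consistent-modelling literature, a general form along the following lines; the constitutive functions and symbols shown are assumed notation for illustration, not the exact equation of the paper.

```latex
\frac{\partial C}{\partial t}
+ \frac{\partial}{\partial z}\Bigl( q(z,t)\,C + \gamma(z)\, f_{\mathrm{bk}}(C) \Bigr)
= \frac{\partial}{\partial z}\Bigl( \gamma(z)\bigl( d_{\mathrm{comp}}(C) + d_{\mathrm{disp}}(z,t)\bigr)
      \frac{\partial C}{\partial z} \Bigr)
+ \frac{Q_{\mathrm{f}}(t)\, C_{\mathrm{f}}(t)}{A}\,\delta(z),
```

    with C the solids concentration, z the depth, f_bk the hindered-settling (batch) flux, γ an indicator function of the tank interior, d_comp and d_disp the compression and dispersion terms, and the δ-term the singular feed source.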

  9. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    Science.gov (United States)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. The recent availability of detailed experimental measurements provides model validation data for a wide range of evaporation rates and combustion regimes, as is well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  10. Consistent calculation of the polarization electric dipole moment by the shell-correction method

    International Nuclear Information System (INIS)

    Denisov, V.Yu.

    1992-01-01

    Macroscopic calculations of the polarization electric dipole moment which arises in nuclei with an octupole deformation are discussed in detail. This dipole moment is shown to depend on the position of the center of gravity. The conditions of consistency of the radii of the proton and neutron potentials and the radii of the proton and neutron surfaces, respectively, are discussed. These conditions must be incorporated in a shell-correction calculation of this dipole moment. A correct calculation of this moment by the shell-correction method is carried out. Dipole transitions between (on the one hand) levels belonging to an octupole vibrational band and (on the other) the ground state in rare-earth nuclei with a large quadrupole deformation are studied. 19 refs., 3 figs

  11. Consistency analysis of Keratograph and traditional methods to evaluate tear film function

    Directory of Open Access Journals (Sweden)

    Pei-Yang Shen

    2015-05-01

    Full Text Available AIM: To investigate the repeatability and accuracy of the latest Keratograph for evaluating tear film stability and to compare its measurements with those of traditional examination methods. METHODS: The noninvasive tear film break-up time (NI-BUT), including the first tear film break-up time (BUT-f) and the average tear film break-up time (BUT-ave), was measured by Keratograph. The repeatability of the measurements was evaluated by the coefficient of variation (CV) and the intraclass correlation coefficient (ICC). The Wilcoxon signed-rank test was used to compare NI-BUT with the fluorescein tear film break-up time (FBUT) and to confirm the correlation between NI-BUT and FBUT and Schirmer I test values. Bland-Altman analysis was used to evaluate consistency. RESULTS: The study recruited 48 subjects (48 eyes; mean age 38.7±15.2 years). The CV and ICC of BUT-f were 12.6% and 0.95, respectively; those of BUT-ave were 9.8% and 0.96. The value of BUT-f was lower than that of FBUT, and the difference was statistically significant (6.16±2.46s vs 7.46±1.92s). CONCLUSION: The Keratograph provides NI-BUT data with better repeatability and reliability, and it has great application prospects in the diagnosis and treatment of dry eye and in refractive corneal surgery.
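
    Two of the statistics named above, the coefficient of variation for repeatability and Bland-Altman limits of agreement for consistency, are simple to compute; the sketch below shows them on made-up numbers (the ICC would additionally require a two-way ANOVA and is not shown).

```python
# Illustrative repeatability and agreement statistics of the kind reported in
# the study (coefficient of variation and Bland-Altman limits of agreement).
# The arrays below are made-up numbers, not the study's data; the ICC would
# additionally require a two-way ANOVA and is not computed here.
import numpy as np

# Three repeated NI-BUT measurements (s) per subject.
repeats = np.array([[5.8, 6.4, 6.1],
                    [9.2, 8.7, 9.5],
                    [4.1, 4.6, 4.3]])
cv_per_subject = repeats.std(axis=1, ddof=1) / repeats.mean(axis=1)
print(f"mean within-subject CV: {100 * cv_per_subject.mean():.1f}%")

# Bland-Altman agreement between NI-BUT (BUT-f) and fluorescein FBUT (s).
but_f = np.array([6.1, 9.1, 4.3, 7.0, 5.5])
fbut  = np.array([7.4, 9.8, 5.6, 8.1, 6.9])
diff = but_f - fbut
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias {bias:.2f} s, 95% limits of agreement {bias - loa:.2f} to {bias + loa:.2f} s")
```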

  12. Regression and direct methods do not give different estimates of digestible and metabolizable energy values of barley, sorghum, and wheat for pigs.

    Science.gov (United States)

    Bolarinwa, O A; Adeola, O

    2016-02-01

    Direct or indirect methods can be used to determine the DE and ME of feed ingredients for pigs. In situations when only the indirect approach is suitable, the regression method presents a robust indirect approach. Three experiments were conducted to compare the direct and regression methods for determining the DE and ME values of barley, sorghum, and wheat for pigs. In each experiment, 24 barrows with an average initial BW of 31, 32, and 33 kg were assigned to 4 diets in a randomized complete block design. The 4 diets consisted of 969 g barley, sorghum, or wheat/kg plus minerals and vitamins for the direct method; a corn-soybean meal reference diet (RD); the RD + 300 g barley, sorghum, or wheat/kg; and the RD + 600 g barley, sorghum, or wheat/kg. The 3 corn-soybean meal diets were used for the regression method. Each diet was fed to 6 barrows in individual metabolism crates for a 5-d acclimation followed by a 5-d period of total but separate collection of feces and urine in each experiment. Graded substitution of barley or wheat, but not sorghum, into the RD linearly reduced the dietary DE and ME. The direct method-derived DE and ME for barley were 3,669 and 3,593 kcal/kg DM, respectively. The regressions of barley contribution to DE and ME in kilocalories against the quantity of barley DMI in kilograms generated 3,746 kcal DE/kg DM and 3,647 kcal ME/kg DM. The DE and ME for sorghum by the direct method were 4,097 and 4,042 kcal/kg DM, respectively; the corresponding regression-derived estimates were 4,145 and 4,066 kcal/kg DM. Using the direct method, energy values for wheat were 3,953 kcal DE/kg DM and 3,889 kcal ME/kg DM. The regressions of wheat contribution to DE and ME in kilocalories against the quantity of wheat DMI in kilograms generated 3,960 kcal DE/kg DM and 3,874 kcal ME/kg DM. The DE and ME of barley using the direct method were not different from the regression-derived estimates, the direct method-derived DE and ME of sorghum were not different from the regression-derived estimates, and the direct method- and regression method-derived DE and ME of wheat did not differ.

  13. A consistent, differential versus integral, method for measuring the delayed neutron yield in fissions

    International Nuclear Information System (INIS)

    Flip, A.; Pang, H.F.; D'Angelo, A.

    1995-01-01

    Due to persistent uncertainties of ∼5% (the uncertainty, here and thereafter, is at 1σ) in the prediction of the 'reactivity scale' (β_eff) for a fast power reactor, an international project was recently initiated in the framework of the OECD/NEA activities for reevaluation, new measurements and integral benchmarking of delayed neutron (DN) data and related kinetic parameters (principally β_eff). Considering that the major part of this uncertainty is due to uncertainties in the DN yields (v_d) and the difficulty of further improving the precision of differential (e.g. Keepin's method) measurements, an international cooperative strategy was adopted aiming at extracting and consistently interpreting information from both differential (nuclear) and integral (in-reactor) measurements. The main problem arises from the integral side; thus the idea was to realize β_eff-like measurements (both deterministic and noise) in 'clean' assemblies. The 'clean' calculational context permitted the authors to develop a theory that explicitly links this integral experimental level with the differential one, via a unified 'Master Model' which relates v_d and measurable quantities (on both levels) linearly. The combined error analysis is consequently largely simplified and the final uncertainty drastically reduced (theoretically, by a factor √3). On the other hand, the same theoretical development leading to the 'Master Model' also resulted in a structured scheme of approximations of the general (stochastic) Boltzmann equation allowing a consistent analysis of the large range of measurements concerned (stochastic, dynamic, static ...). This paper is focused on the main results of this theoretical development and its application to the analysis of the preliminary results of the BERENICE program (β_eff measurements in MASURCA, the first assembly, in Cadarache, France)

  14. Direct numerical simulation of interfacial instabilities: A consistent, conservative, all-speed, sharp-interface method

    Science.gov (United States)

    Chang, Chih-Hao; Deng, Xiaolong; Theofanous, Theo G.

    2013-06-01

    We present a conservative and consistent numerical method for solving the Navier-Stokes equations in flow domains that may be separated by any number of material interfaces, at arbitrarily-high density/viscosity ratios and acoustic-impedance mismatches, subjected to strong shock waves and flow speeds that can range from highly supersonic to near-zero Mach numbers. A principal aim is prediction of interfacial instabilities under superposition of multiple potentially-active modes (Rayleigh-Taylor, Kelvin-Helmholtz, Richtmyer-Meshkov) as found for example with shock-driven, immersed fluid bodies (locally oblique shocks)—accordingly we emphasize fidelity supported by physics-based validation, including experiments. Consistency is achieved by satisfying the jump discontinuities at the interface within a conservative 2nd-order scheme that is coupled, in a conservative manner, to the bulk-fluid motions. The jump conditions are embedded into a Riemann problem, solved exactly to provide the pressures and velocities along the interface, which is tracked by a level set function to accuracy of O(Δx⁵, Δt⁴). Subgrid representation of the interface is achieved by allowing curvature of its constituent interfacial elements to obtain O(Δx³) accuracy in cut-cell volume, with attendant benefits in calculating cell-geometric features and interface curvature (O(Δx³)). Overall the computation converges at near-theoretical O(Δx²). Spurious currents are down to machine error and there is no time-step restriction due to surface tension. Our method is built upon a quadtree-like adaptive mesh refinement infrastructure. When necessary, this is supplemented by body-fitted grids to enhance resolution of the gas dynamics, including flow separation, shear layers, slip lines, and critical layers. Comprehensive comparisons with exact solutions for the linearized Rayleigh-Taylor and Kelvin-Helmholtz problems demonstrate excellent performance. Sample simulations of liquid drops subjected to

  15. Self-Consistency Method to Evaluate a Linear Expansion Thermal Coefficient of Composite with Dispersed Inclusions

    Directory of Open Access Journals (Sweden)

    V. S. Zarubin

    2015-01-01

    Full Text Available The rational use of composites as structural materials that carry thermal and mechanical loads is to a large extent determined by their thermoelastic properties. The review presented here of works devoted to the analysis of the thermoelastic characteristics of composites shows that estimating these characteristics is an important problem. Among the thermoelastic properties of a composite, its temperature coefficient of linear expansion occupies an important place. Along with fiber composites, dispersion-hardened composites are widely used in engineering; in these, the inclusions are particles of high-strength and high-modulus materials, including nanostructured elements. Typically, the dispersed particles have similar dimensions in all directions, which allows the particle shape to be approximated, in a first approximation, by a sphere. In this article, for a composite with isotropic spherical inclusions of a variety of different materials, the self-consistency method is used to derive design formulas relating the temperature coefficient of linear expansion to the volume concentration of inclusions and their thermoelastic characteristics, as well as to the thermoelastic properties of the composite matrix. A feature of the method is that it self-consistently accounts for the thermomechanical interaction of a single inclusion or matrix particle with a homogeneous isotropic medium possessing the sought temperature coefficient of linear expansion. Averaging over the volume of the composite the perturbations of strain and stress arising in the inclusions and the matrix particles from this interaction makes it possible to obtain such calculation formulas. To validate the calculated temperature coefficient of linear expansion of a composite of this type, two-sided estimates are used that are based on the dual variational formulation of the linear thermoelasticity problem in an inhomogeneous solid containing two alternative functionals (such as Lagrange and Castigliano
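
    The defining idea above, that each inclusion or matrix particle is imagined embedded in a homogeneous medium which already has the sought effective property, leads to a fixed-point (self-consistent) equation. The sketch below illustrates that fixed-point structure with a generic Bruggeman-type estimate for a scalar effective property of a medium with spherical inclusions; it is not the paper's thermoelastic formulas or its dual variational bounds.

```python
# Generic illustration of a self-consistent (Bruggeman-type) estimate for a
# scalar effective property of a multi-phase medium with spherical inclusions:
# each phase is imagined embedded in a medium that already has the unknown
# effective value, which leads to a fixed-point equation.  This is NOT the
# paper's thermoelastic formulas or its dual variational bounds; it only
# illustrates the self-consistent fixed-point structure.
import numpy as np

def self_consistent_effective(k_phases, v_phases, tol=1e-10, max_iter=500):
    """Solve sum_i v_i*(k_i - k_eff)/(k_i + 2*k_eff) = 0 by fixed-point iteration."""
    k_phases = np.asarray(k_phases, dtype=float)
    v_phases = np.asarray(v_phases, dtype=float)
    k = float(np.dot(v_phases, k_phases))          # rule-of-mixtures starting guess
    for _ in range(max_iter):
        w = v_phases / (k_phases + 2.0 * k)        # spherical-inclusion weights
        k_new = np.dot(w, k_phases) / w.sum()      # weighted average = fixed-point map
        if abs(k_new - k) < tol:
            return k_new
        k = k_new
    raise RuntimeError("self-consistent iteration did not converge")

# Matrix (phase 0) with 30 vol% of a much stiffer/more conductive inclusion phase.
print(self_consistent_effective(k_phases=[1.0, 50.0], v_phases=[0.7, 0.3]))
```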

  16. A consistent formulation of the finite element method for solving diffusive-convective transport problems

    International Nuclear Information System (INIS)

    Carmo, E.G.D. do; Galeao, A.C.N.R.

    1986-01-01

    A new method specially designed to solve highly convective transport problems is proposed. Using a variational approach it is shown that this weighted residual method belongs to a class of Petrov-Galerkin approximations. Some examples are presented in order to demonstrate the adequacy of this method in predicting internal or external boundary layers. (Author) [pt

  17. A particle method with adjustable transport properties - the generalized consistent Boltzmann algorithm

    International Nuclear Information System (INIS)

    Garcia, A.L.; Alexander, F.J.; Alder, B.J.

    1997-01-01

    The consistent Boltzmann algorithm (CBA) for dense, hard-sphere gases is generalized to obtain the van der Waals equation of state and the corresponding exact viscosity at all densities except at the highest temperatures. A general scheme for adjusting any transport coefficients to higher values is presented

  18. Bosons system with finite repulsive interaction: self-consistent field method

    International Nuclear Information System (INIS)

    Renatino, M.M.B.

    1983-01-01

    Some static properties of a boson system at T = 0 K, under the action of a repulsive potential, are studied. For the repulsive potential, a model was adopted consisting of a region where the potential is constant (r < r_c) and a 1/r decay for r > r_c. The self-consistent field approximation used takes short-range correlations into account through a local field correction, which leads to an effective field. The static structure factor S(q) and the effective potential ψ(q) are obtained through a self-consistent calculation. The pair-correlation function g(r) and the energy of the collective excitations E(q) are also obtained from the structure factor. The density of the system and the parameters of the repulsive potential, that is, its height and the size of the constant region, were used as variables of the problem. The results obtained for S(q), g(r) and E(q) for a fixed ratio r_o/r_c and variable λ indicate the emergence of structure in the system, which becomes more noticeable as the potential becomes more repulsive. (author)

  19. A RTS-based method for direct and consistent calculating intermittent peak cooling loads

    International Nuclear Information System (INIS)

    Chen Tingyao; Cui, Mingxian

    2010-01-01

    The RTS method currently recommended by the ASHRAE Handbook is based on continuous operation. However, most, if not all, air-conditioning systems in commercial buildings are operated intermittently in practice. The application of the current RTS method to intermittent air-conditioning in nonresidential buildings could result in largely underestimated design cooling loads and inconsistently sized air-conditioning systems. Improperly sized systems could seriously deteriorate the performance of system operation and management. Therefore, a new method based on both the current RTS method and the principles of heat transfer has been developed. The first part of the new method is the same as the current RTS method in principle, but its calculation procedure is simplified by the derived equations in closed form. The technical data available in the current RTS method can be utilized to compute zone responses to a change in space air temperature, so that no effort is needed to regenerate new technical data. Both the overall RTS coefficients and the hourly cooling loads computed in the first part are used to estimate the additional peak cooling load due to a change from continuous operation to intermittent operation. Only one more step after the current RTS method is needed to determine the intermittent peak cooling load. The new RTS-based method has been validated by EnergyPlus simulations. The root mean square deviation (RMSD) between the relative additional peak cooling loads (RAPCLs) computed by the two methods is 1.8%. The deviation of the RAPCL varies from -3.0% to 5.0%, and the mean deviation is 1.35%.
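
    The core step that the RTS procedure builds on is a 24-hour circular convolution of hourly radiant heat gains with the radiant time factors. The sketch below shows only that step, with illustrative coefficients and gains; the paper's closed-form correction for intermittent operation is not reproduced.

```python
# Minimal sketch of the core RTS step: converting an hourly radiant heat-gain
# profile into hourly cooling loads by a 24-h circular convolution with the
# radiant time factors.  The coefficients and gains below are illustrative,
# and the paper's closed-form correction for intermittent operation is not
# reproduced here.
import numpy as np

rts = np.array([0.50, 0.18, 0.10, 0.06, 0.04, 0.03, 0.02, 0.02,
                0.01, 0.01, 0.01, 0.01, 0.01, 0.00, 0.00, 0.00,
                0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00])   # sums to 1.0
radiant_gain = np.zeros(24)
radiant_gain[8:18] = 1000.0                                        # W, occupied hours

cooling_load = np.array([
    sum(rts[j] * radiant_gain[(h - j) % 24] for j in range(24))
    for h in range(24)
])
print(cooling_load.round(0))
```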

  20. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis.

    NARCIS (Netherlands)

    Steultjens, M.P.M.; Dekker, J.; Baar, M.E. van; Oostendorp, R.A.B.; Bijlsma, J.W.J.

    1999-01-01

    Objective: To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Methods: Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results

  1. Consistent method of truncating the electron self-energy in nonperturbative QED

    International Nuclear Information System (INIS)

    Rembiesa, P.

    1986-01-01

    A nonperturbative method of solving the Dyson-Schwinger equations for the fermion propagator is considered. The solution satisfies the Ward-Takahashi identity, allows multiplicative regularization, and exhibits a physical-mass pole

  2. Giving Credit where Credit is Due. A Practical Method to Distinguish between Human and Natural Factors in Carbon Accounting

    International Nuclear Information System (INIS)

    Kirschbaum, M.U.F.; Cowie, A.L.

    2004-01-01

    Net carbon emissions from the biosphere differ from fossil-fuel based emissions in that: (1) a large proportion of biospheric carbon exchange is not under direct human control; (2) land-use decisions often have only a small short-term effect on net emissions, but a large long-term effect; and (3) biospheric carbon exchange is potentially reversible. Because of these differences, carbon accounting approaches also need to be different for fossil-fuel and biosphere-based emissions. Recognising that, the international negotiators at COP 7 adopted a range of guiding principles for accounting for biospheric carbon exchange, including: that accounting excludes removals resulting from (a) elevated carbon dioxide concentrations above pre-industrial level; (b) indirect nitrogen deposition; and (c) the dynamic effects of age structure resulting from activities and practices before the reference year. In this paper, we highlight some of the challenges in biospheric carbon accounting for Canada, the U.S.A, New Zealand and Australia, four nations for which biospheric net carbon exchange is large relative to fossil-fuel based emissions. We discuss an accounting scheme that is based on assessing changes in average carbon stocks due to changes in land use. That scheme is tailored to the special needs of biospheric carbon management and is consistent with the accounting principles adopted at COP 7. The paper shows how the accounting scheme would resolve many of the biospheric carbon accounting anomalies identified for the four nations we studied

  3. A novel method for identification and quantification of consistently differentially methylated regions.

    Directory of Open Access Journals (Sweden)

    Ching-Lin Hsiao

    Full Text Available Advances in biotechnology have resulted in large-scale studies of DNA methylation. A differentially methylated region (DMR) is a genomic region with multiple adjacent CpG sites that exhibit different methylation statuses among multiple samples. Many so-called "supervised" methods have been established to identify DMRs between two or more comparison groups. Methods for the identification of DMRs without reference to phenotypic information are, however, less well studied. An alternative "unsupervised" approach was proposed, in which DMRs in the studied samples were identified with consideration of the natural dependence structure of methylation measurements between neighboring probes from tiling arrays. Through a simulation study, we investigated the effects of dependencies between neighboring probes on determining DMRs; many spurious signals would be produced if the methylation data at neighboring probes were analyzed independently. In contrast, our newly proposed method could successfully correct for this effect with a well-controlled false positive rate and a comparable sensitivity. By applying it to two real datasets, we demonstrated that our method could provide a global picture of methylation variation in the studied samples. R source code to implement the proposed method is freely available at http://www.csjfann.ibms.sinica.edu.tw/eag/programlist/ICDMR/ICDMR.html.

  4. Technical Note: Regularization performances with the error consistency method in the case of retrieved atmospheric profiles

    Directory of Open Access Journals (Sweden)

    S. Ceccherini

    2007-01-01

    Full Text Available The retrieval of concentration vertical profiles of atmospheric constituents from spectroscopic measurements is often an ill-conditioned problem, and regularization methods are frequently used to improve its stability. Recently a new method, which provides a good compromise between precision and vertical resolution, was proposed to determine analytically the value of the regularization parameter. This method is applied for the first time to real measurements with its implementation in the operational retrieval code of the satellite limb-emission measurements of the MIPAS instrument, and its performance is quantitatively analyzed. The adopted regularization improves the stability of the retrieval, providing smooth profiles without major degradation of the vertical resolution. In the analyzed measurements the retrieval procedure provides a vertical resolution that, in the troposphere and low stratosphere, is smaller than the vertical field of view of the instrument.

  5. Delta self-consistent field method to obtain potential energy surfaces of excited molecules on surfaces

    DEFF Research Database (Denmark)

    Gavnholt, Jeppe; Olsen, Thomas; Engelund, Mads

    2008-01-01

    The Delta self-consistent field (Delta SCF) method is a density-functional method closely resembling standard density-functional theory (DFT), the only difference being that in Delta SCF one or more electrons are placed in higher lying Kohn-Sham orbitals instead of placing all electrons in the lowest possible orbitals as one does when calculating the ground-state energy within standard DFT. We extend the Delta SCF method by allowing excited electrons to occupy orbitals which are linear combinations of Kohn-Sham orbitals. With this extra freedom it is possible to place charge locally on adsorbed molecules in the calculations, such that resonance energies can be estimated, which is not possible in traditional Delta SCF because of very delocalized Kohn-Sham orbitals. The method is applied to N2, CO, and NO adsorbed on different metallic surfaces and compared to ordinary Delta SCF without our modification, spatially constrained DFT, and inverse

  6. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    Science.gov (United States)

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewing data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions via email or telephone with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded leaving 29 journals in the sample. Of 29 journals, 24 (83%) responded to our initial inquiry of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ with 4 (29%) journals indicating ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or never detected by their journal previously. The majority (n = 10/14, 72%) of journals were unwilling to report or make discrepancies found in manuscripts available to the public. Although the minority, there were some journals (n = 4/14, 29%) which described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB and its estimated prevalence was much lower than that reported in literature suggesting inadequate detection. There exists a potential for overestimation of treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to utilize comprehensive methods for detection of ORB, but overall, the data suggest improvements at the biomedical journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  7. Microscopic and self-consistent description of nuclear properties by extended generator-coordinate method

    International Nuclear Information System (INIS)

    Didong, M.

    1976-01-01

    The extended generator-coordinate method is discussed and a procedure is given for the solution of the Hill-Wheeler equation. The HFB theory, the particle-number and angular-momentum projections necessary for symmetry, and the modified surface delta interaction are discussed. The described procedures are used to calculate properties of ⁷²Ge, ⁷⁰Zn and ⁷⁴Ge. (BJ) [de

  8. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    OpenAIRE

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In o...

  9. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

    The fortune and the risk of a business venture depends on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behaviors rather than debatable assumptions on models and parameters. Simultaneous dep...
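
    The resampling idea can be sketched in a few lines: historical multivariate period returns are drawn with replacement, keeping all series of a given period together so that cross-dependencies are preserved, and then cumulated into scenario paths. The data, horizon, and scenario count below are illustrative assumptions, not the authors' calibration.

```python
# Minimal sketch of a non-parametric bootstrap of economic scenarios: past
# multivariate period returns are resampled with replacement, keeping each
# period's returns for all series together so that cross-dependencies are
# preserved.  Data and horizon below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
# Historical quarterly log-returns: rows = periods, columns = (equity, bond, FX).
hist = rng.normal(loc=[0.015, 0.008, 0.0], scale=[0.08, 0.02, 0.05], size=(80, 3))

def bootstrap_scenarios(history, horizon, n_scenarios, rng):
    idx = rng.integers(0, len(history), size=(n_scenarios, horizon))
    paths = history[idx]                     # resampled period returns
    return paths.cumsum(axis=1)              # cumulative log-return paths

scenarios = bootstrap_scenarios(hist, horizon=12, n_scenarios=1000, rng=rng)
print(scenarios.shape)                       # (1000, 12, 3)
```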

  10. Consistent economic cross-sectoral climate change impact scenario analysis: Method and application to Austria

    Directory of Open Access Journals (Sweden)

    Karl W. Steininger

    2016-03-01

    Full Text Available Climate change triggers manifold impacts at the national to local level, which in turn have various economy-wide implications (e.g. on welfare, employment, or tax revenues). In its response, society needs to prioritize which of these impacts to address and what share of resources to spend on each respective adaptation. A prerequisite to achieving that end is an economic impact analysis that is consistent across sectors and acknowledges intersectoral and economy-wide feedback effects. Traditional Integrated Assessment Models (IAMs) usually operate at a level too aggregated for this end, while bottom-up impact models most often are not fully comprehensive, focusing on only a subset of climate-sensitive sectors and/or a subset of climate change impact chains. Thus, we develop here an approach which applies climate and socioeconomic scenario analysis, harmonized economic costing, and sector-explicit bandwidth analysis in a coupled framework of eleven biophysical impact assessment models and a uniform multi-sectoral computable general equilibrium model. In applying this approach to the alpine country of Austria, we find that macroeconomic feedbacks can magnify sectoral climate damages up to fourfold, and that by mid-century the costs of climate change clearly outweigh the benefits, with net costs rising two- to fourfold above current damage cost levels. The resulting specific impact information – differentiated by climate and economic drivers – can support sector-specific adaptation as well as adaptive capacity building. Keywords: climate impact, local impact, economic evaluation, adaptation

  11. Self-consistent EXAFS PDF Projection Method by Matched Correction of Fourier Filter Signal Distortion

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok

    2007-01-01

    Inverse problem solving computation was performed to obtain the PDF (pair distribution function) from simulated EXAFS data based on FEFF calculations. For a realistic comparison with experimental data, we chose a model of the first sub-shell Mn-O pair showing the Jahn-Teller distortion in crystalline LaMnO3. To restore the Fourier filtering signal distortion involved in the first sub-shell information isolated from higher shell contents, the relevant distortion matching function was computed initially from the proximity model, and iteratively from the prior guess during consecutive regularization computations. Adaptive computation of the EXAFS background correction is an issue for algorithm development, but our preliminary test was performed with a simulated background correction that perfectly excludes the higher shell interference. In our numerical results, efficient convergence of the iterative solution indicates a self-consistent tendency, in that a true PDF solution is confirmed as the counterpart of the genuine chi-data, provided that a background correction function is iteratively solved using an extended algorithm of MEPP (Matched EXAFS PDF Projection) under development

  12. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis

    NARCIS (Netherlands)

    Steultjens, M. P.; Dekker, J.; van Baar, M. E.; Oostendorp, R. A.; Bijlsma, J. W.

    1999-01-01

    To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results of self-report

  13. A design method for two-layer beams consisting of normal and fibered high strength concrete

    International Nuclear Information System (INIS)

    Iskhakov, I.; Ribakov, Y.

    2007-01-01

    Two-layer fibered concrete beams can be analyzed using conventional methods for composite elements. The compressed zone of such a beam section is made of high strength concrete (HSC), and the tensile one of normal strength concrete (NSC). The problems related to this type of beam are revealed and studied. An appropriate depth of each layer is prescribed. Compatibility conditions between the HSC and NSC layers are found. They are based on the equality of shear deformations at the layer interface in the section with the maximal depth of the compression zone. For the first time a rigorous definition of HSC is given, using a comparative analysis of the deformability and strength characteristics of different concrete classes. According to this definition, HSC has no descending branch in the stress-strain diagram, the stress-strain function has a minimum exponent, the ductility parameter is minimal, and the concrete tensile strength remains constant with an increase in concrete compression strength. The application fields of two-layer concrete beams, based on different static schemes and load conditions, are identified. It is known that the main disadvantage of HSCs is their low ductility. In order to overcome this problem, fibers are added to the HSC layer. The influence of different fiber volume ratios on structural ductility is discussed. An upper limit of the required fiber volume ratio is found based on the compatibility equation between the transverse tensile deformations of the concrete and the deformations of the fibers

  14. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses. But gamma irradiation did not show any significant change in seed germination. A great variation in survival of stem cutting was observed in each month of 2012. October and November were found the most suitable months for stem cutting survival (60%). In order to enhance survival, stem cuttings were also dipped in different plant growth regulators (PGRs) solution. Only indole butyric acid (IBA; 1000 ppm) treated cutting showed a higher survival (33%) than control (11.1%). Furthermore, simple and feasible indirect regeneration system was established from leaf explants. Best callus induction (84.6%) was observed on MS-medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l−1). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l−1) and gibberellic acid (GA3; 0.5 mg l−1). Plantlets were successfully acclimatized in plastic pots. The current results preferred micropropagation (85%) over seed germination (25.51–40%) and stem cutting (60%). PMID:25473365

  15. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni).

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-12-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51-40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses. But gamma irradiation did not show any significant change in seed germination. A great variation in survival of stem cutting was observed in each month of 2012. October and November were found the most suitable months for stem cutting survival (60%). In order to enhance survival, stem cuttings were also dipped in different plant growth regulators (PGRs) solution. Only indole butyric acid (IBA; 1000 ppm) treated cutting showed a higher survival (33%) than control (11.1%). Furthermore, simple and feasible indirect regeneration system was established from leaf explants. Best callus induction (84.6%) was observed on MS-medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l(-1)). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l(-1)) and gibberellic acid (GA3; 0.5 mg l(-1)). Plantlets were successfully acclimatized in plastic pots. The current results preferred micropropagation (85%) over seed germination (25.51-40%) and stem cutting (60%).

  16. VLE measurements using a static cell vapor phase manual sampling method accompanied with an empirical data consistency test

    International Nuclear Information System (INIS)

    Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru

    2015-01-01

    Highlights: • We use a new, simple static cell vapor phase manual sampling method (SCVMS) for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid phase forming azeotropic binaries. • The method is validated by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that with the new SCVMS method accurate VLE near ambient temperature can be measured. • Moreover, the consistency test confirms that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to the VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), the asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and the two-liquid phase forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was confirmed by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same lnβ vs. lnP straight line determined from a conventional distillation-still method and a headspace gas chromatography method
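
    The empirical consistency test mentioned above amounts to checking that ln β falls on a single straight line against ln P. A minimal sketch of that check is shown below; the β values are placeholders, since how β is computed from the VLE data is not given in this record.

```python
# Sketch of the empirical consistency check described above: fit ln(beta)
# against ln(P) and verify that the points fall on a single straight line.
# The beta values are placeholders; how beta is computed from the VLE data is
# not given in this record.
import numpy as np

P = np.array([4.0, 8.5, 17.0, 35.0])        # kPa, illustrative system pressures
beta = np.array([2.6, 2.1, 1.7, 1.35])      # illustrative polarity exclusion factors

slope, intercept = np.polyfit(np.log(P), np.log(beta), 1)
resid = np.log(beta) - (slope * np.log(P) + intercept)
print(f"slope {slope:.3f}, intercept {intercept:.3f}, max |residual| {abs(resid).max():.3f}")
```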

  17. Application of the adiabatic self-consistent collective coordinate method to a solvable model of prolate-oblate shape coexistence

    International Nuclear Information System (INIS)

    Kobayasi, Masato; Matsuyanagi, Kenichi; Nakatsukasa, Takashi; Matsuo, Masayuki

    2003-01-01

    The adiabatic self-consistent collective coordinate method is applied to an exactly solvable multi-O(4) model that is designed to describe nuclear shape coexistence phenomena. The collective mass and dynamics of large amplitude collective motion in this model system are analyzed, and it is shown that the method yields a faithful description of tunneling motion through a barrier between the prolate and oblate local minima in the collective potential. The emergence of the doublet pattern is clearly described. (author)

  18. A consistent method for finite volume discretization of body forces on collocated grids applied to flow through an actuator disk

    DEFF Research Database (Denmark)

    Troldborg, Niels; Sørensen, Niels N.; Réthoré, Pierre-Elouan

    2015-01-01

    This paper describes a consistent algorithm for eliminating the numerical wiggles appearing when solving the finite volume discretized Navier-Stokes equations with discrete body forces in a collocated grid arrangement. The proposed method is a modification of the Rhie-Chow algorithm where the for...

  19. A Combined Self-Consistent Method to Estimate the Effective Properties of Polypropylene/Calcium Carbonate Composites

    Directory of Open Access Journals (Sweden)

    Zhongqiang Xiong

    2018-01-01

    Full Text Available In this work, trying to avoid the difficulty of application due to irregular filler shapes in experiments, self-consistent and differential self-consistent methods were combined to obtain a decoupled equation. The combined method suggests a tensor γ, independent of filler content, as an important connection between high and low filler contents. On the one hand, the constant parameter can be calculated by Eshelby's inclusion theory or the Mori–Tanaka method to predict effective properties of composites coinciding with its hypothesis. On the other hand, the parameter can be calculated from several experimental results to estimate the effective properties of prepared composites of other, different contents. In addition, an evaluation index σ_f′ of the interaction strength between matrix and fillers is proposed based on experiments. In the experiments, a hyper-dispersant was synthesized to prepare polypropylene/calcium carbonate (PP/CaCO3) composites with dispersion at filler contents up to 70 wt%, at a dosage of only 5 wt% of the CaCO3 content. Based on several verifications, it is hoped that the combined self-consistent method is valid for other two-phase composites in experiments with the same application process as in this work.

  20. DFTB3: Extension of the self-consistent-charge density-functional tight-binding method (SCC-DFTB).

    Science.gov (United States)

    Gaus, Michael; Cui, Qiang; Elstner, Marcus

    2012-04-10

    The self-consistent-charge density-functional tight-binding method (SCC-DFTB) is an approximate quantum chemical method derived from density functional theory (DFT) based on a second-order expansion of the DFT total energy around a reference density. In the present study we combine earlier extensions and improve them consistently with, first, an improved Coulomb interaction between atomic partial charges, and second, the complete third-order expansion of the DFT total energy. These modifications lead us to the next generation of the DFTB methodology called DFTB3, which substantially improves the description of charged systems containing elements C, H, N, O, and P, especially regarding hydrogen binding energies and proton affinities. As a result, DFTB3 is particularly applicable to biomolecular systems. Remaining challenges and possible solutions are also briefly discussed.

  1. New exact solutions of the (2+1)-dimensional Broer-Kaup equation by the consistent Riccati expansion method

    Directory of Open Access Journals (Sweden)

    Jiang Ying

    2017-01-01

    Full Text Available In this work, we study the (2+1)-dimensional Broer-Kaup equation. The composite periodic breather wave, the exact composite kink breather wave and the solitary wave solutions are obtained by using the coupled degradation technique and the consistent Riccati expansion method. These results may help us to investigate some complex dynamical behaviors and the interaction between composite non-linear waves in high-dimensional models
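
    For readers unfamiliar with the technique, the generic form of a consistent Riccati expansion is sketched below; it shows only the standard ansatz, not the specific Broer-Kaup solutions obtained in the paper.

```latex
% Generic form of the consistent Riccati expansion (CRE) ansatz, shown only to
% illustrate the method; the specific (2+1)-dimensional Broer-Kaup solutions
% of the paper are not reproduced here.
\begin{align}
  u(x,y,t) &= \sum_{i=0}^{n} u_i(x,y,t)\, R\big(w(x,y,t)\big)^{i}, \\
  R'(w) &= b_0 + b_1 R(w) + b_2 R(w)^2 ,
\end{align}
% where n is fixed by leading-order balance analysis, and the expansion is
% called consistent if substituting it into the PDE yields a compatible,
% solvable system for the coefficients u_i and the function w.
```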

  2. Self-consistent DFT+U method for real-space time-dependent density functional theory calculations

    Science.gov (United States)

    Tancogne-Dejean, Nicolas; Oliveira, Micael J. T.; Rubio, Angel

    2017-12-01

    We implemented various DFT+U schemes, including the Agapito, Curtarolo, and Buongiorno Nardelli functional (ACBN0) self-consistent density-functional version of the DFT+U method [Phys. Rev. X 5, 011006 (2015), 10.1103/PhysRevX.5.011006] within the massively parallel real-space time-dependent density functional theory (TDDFT) code octopus. We further extended the method to the case of the calculation of response functions with real-time TDDFT+U and to the description of noncollinear spin systems. The implementation is tested by investigating the ground-state and optical properties of various transition-metal oxides, bulk topological insulators, and molecules. Our results are found to be in good agreement with previously published results for both the electronic band structure and structural properties. The self-consistently calculated values of U and J are also in good agreement with the values commonly used in the literature. We found that the time-dependent extension of the self-consistent DFT+U method yields improved optical properties when compared to the empirical TDDFT+U scheme. This work thus opens a different theoretical framework to address the nonequilibrium properties of correlated systems.

  3. Assessment of rigid multi-modality image registration consistency using the multiple sub-volume registration (MSR) method

    International Nuclear Information System (INIS)

    Ceylan, C; Heide, U A van der; Bol, G H; Lagendijk, J J W; Kotte, A N T J

    2005-01-01

    Registration of different imaging modalities such as CT, MRI, functional MRI (fMRI), positron (PET) and single photon (SPECT) emission tomography is used in many clinical applications. Determining the quality of any automatic registration procedure has been challenging because no gold standard is available against which to evaluate the registration. In this note we present a method, called the 'multiple sub-volume registration' (MSR) method, for assessing the consistency of a rigid registration. This is done by registering sub-images of one data set on the other data set, performing a crude non-rigid registration. By analysing the deviations (local deformations) of the sub-volume registrations from the full registration we get a measure of the consistency of the rigid registration. Registration of 15 data sets which include CT, MR and PET images for brain, head and neck, cervix, prostate and lung was performed utilizing a rigid body registration with normalized mutual information as the similarity measure. The resulting registrations were classified as good or bad by visual inspection. The resulting registrations were also classified using our MSR method. The results of our MSR method agree with the classification obtained from visual inspection for all cases (p < 0.02 based on ANOVA of the good and bad groups). The proposed method is independent of the registration algorithm and similarity measure. It can be used for multi-modality image data sets and different anatomic sites of the patient. (note)
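
    The deviation analysis at the heart of the MSR idea can be sketched independently of any particular registration engine: given the full-volume rigid transform and the per-sub-volume rigid transforms as 4x4 homogeneous matrices, one reports how far each sub-volume registration displaces the sub-volume centre relative to the full registration. The numbers below are illustrative; the registrations themselves (e.g. with a normalized-mutual-information optimizer) are not performed here.

```python
# Sketch of the deviation analysis behind the MSR idea: given the full-volume
# rigid transform and per-sub-volume rigid transforms (4x4 homogeneous
# matrices, produced by any registration tool), report how far each sub-volume
# registration deviates from the full registration at the sub-volume centre.
# The registrations themselves are not performed here.
import numpy as np

def deviation_at_centers(T_full, sub_transforms, centers):
    devs = []
    for T_sub, c in zip(sub_transforms, centers):
        c_h = np.append(c, 1.0)                     # homogeneous coordinates
        devs.append(np.linalg.norm((T_sub @ c_h - T_full @ c_h)[:3]))
    return np.array(devs)                           # one deviation (mm) per sub-volume

# Illustrative numbers: identity full transform, two sub-volumes with small shifts.
T_full = np.eye(4)
T_sub1 = np.eye(4); T_sub1[:3, 3] = [0.4, -0.2, 0.1]
T_sub2 = np.eye(4); T_sub2[:3, 3] = [1.5,  0.3, -0.8]
centers = np.array([[10.0, 20.0, 30.0], [40.0, 25.0, 15.0]])
print(deviation_at_centers(T_full, [T_sub1, T_sub2], centers))
```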

  4. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    Science.gov (United States)

    Banker, J.G.; Anderson, R.C.

    1975-10-21

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure.

  5. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    International Nuclear Information System (INIS)

    Banker, J.G.; Anderson, R.C.

    1975-01-01

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure

  6. Self-consistent ab initio Calculations for Photoionization and Electron-Ion Recombination Using the R-Matrix Method

    Science.gov (United States)

    Nahar, S. N.

    2003-01-01

    Most astrophysical plasmas entail a balance between ionization and recombination. We present new results from a unified method for self-consistent and ab initio calculations for the inverse processes of photoionization and (e + ion) recombination. The treatment for (e + ion) recombination subsumes the non-resonant radiative recombination and the resonant dielectronic recombination processes in a unified scheme (S.N. Nahar and A.K. Pradhan, Phys. Rev. A 49, 1816 (1994); H.L. Zhang, S.N. Nahar, and A.K. Pradhan, J. Phys. B 32, 1459 (1999)). Calculations are carried out using the R-matrix method in the close coupling approximation using an identical wavefunction expansion for both processes to ensure self-consistency. The results for photoionization and recombination cross sections may also be compared with state-of-the-art experiments on synchrotron radiation sources for photoionization, and on heavy ion storage rings for recombination. The new experiments display heretofore unprecedented detail in terms of resonances and background cross sections and thereby calibrate the theoretical data precisely. We find a level of agreement between theory and experiment at about 10% for not only the ground state but also the metastable states. The recent experiments therefore verify the estimated accuracy of the vast amount of photoionization data computed under the OP, IP and related works. The present work also reports photoionization cross sections including relativistic effects in the Breit-Pauli R-matrix (BPRM) approximation. Detailed features in the calculated cross sections exhibit the missing resonances due to fine structure. Self-consistent datasets for photoionization and recombination have so far been computed for approximately 45 atoms and ions. These are being reported in a continuing series of publications in Astrophysical J. Supplements (e.g. references below). These data will also be available from the electronic database TIPTOPBASE (http://heasarc.gsfc.nasa.gov)

  7. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sanjoy Kumar, E-mail: sanjoydasju@gmail.com; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, simplex lattice mixture design was applied for formulation development and optimization of a controlled release dosage form of ketoprofen microspheres consisting of polymers such as ethylcellulose and Eudragit® RL 100, formed by the oil-in-oil emulsion solvent evaporation method. The investigation was carried out to observe the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release in 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Based on the desirability function approach numerical optimization was carried out. Optimized formulation (KTF-O) showed close match between actual and predicted responses with desirability factor 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and Differential scanning calorimetric (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show discreteness of microspheres (149.2 ± 1.25 μm) and their surface conditions during pre and post dissolution operations. The drug release pattern from KTF-O was best explained by Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres was found with maximum entrapment (~ 90%), minimum loss (~ 10%) and prolonged drug release for 8 h (91.25%), which may be considered as favourable criteria for a controlled release dosage form. - Graphical abstract: Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design. - Highlights: • Simplex lattice design was used to optimize ketoprofen-loaded microspheres. • Polymeric blend (Ethylcellulose and Eudragit® RL 100) was used. • Microspheres were prepared by oil-in-oil emulsion solvent evaporation method. • Optimized formulation depicted favourable

  8. Give Me Strength.

    Institute of Scientific and Technical Information of China (English)

    维拉

    1996-01-01

    Mort had an absolutely terrible day at the office. Everything that could go wrong did go wrong. As he walked home he could be heard muttering strange words to himself: "Oh, give me strength, give me strength." Mort isn't asking for the kind of strength that builds strong muscles; he's asking for the courage or ability to

  9. Communication: On the consistency of approximate quantum dynamics simulation methods for vibrational spectra in the condensed phase.

    Science.gov (United States)

    Rossi, Mariana; Liu, Hanchao; Paesani, Francesco; Bowman, Joel; Ceriotti, Michele

    2014-11-14

    Including quantum mechanical effects on the dynamics of nuclei in the condensed phase is challenging, because the complexity of exact methods grows exponentially with the number of quantum degrees of freedom. Efforts to circumvent these limitations can be traced down to two approaches: methods that treat a small subset of the degrees of freedom with rigorous quantum mechanics, considering the rest of the system as a static or classical environment, and methods that treat the whole system quantum mechanically, but using approximate dynamics. Here, we perform a systematic comparison between these two philosophies for the description of quantum effects in vibrational spectroscopy, taking the Embedded Local Monomer model and a mixed quantum-classical model as representatives of the first family of methods, and centroid molecular dynamics and thermostatted ring polymer molecular dynamics as examples of the latter. We use as benchmarks D2O doped with HOD and pure H2O at three distinct thermodynamic state points (ice Ih at 150 K, and the liquid at 300 K and 600 K), modeled with the simple q-TIP4P/F potential energy and dipole moment surfaces. With few exceptions the different techniques yield IR absorption frequencies that are consistent with one another within a few tens of cm(-1). Comparison with classical molecular dynamics demonstrates the importance of nuclear quantum effects up to the highest temperature, and a detailed discussion of the discrepancies between the various methods let us draw some (circumstantial) conclusions about the impact of the very different approximations that underlie them. Such cross validation between radically different approaches could indicate a way forward to further improve the state of the art in simulations of condensed-phase quantum dynamics.

  10. Efficient implementation of three-dimensional reference interaction site model self-consistent-field method: application to solvatochromic shift calculations.

    Science.gov (United States)

    Minezawa, Noriyuki; Kato, Shigeki

    2007-02-07

    The authors present an implementation of the three-dimensional reference interaction site model self-consistent-field (3D-RISM-SCF) method. First, they introduce a robust and efficient algorithm for solving the 3D-RISM equation. The algorithm is a hybrid of the Newton-Raphson and Picard methods. The Jacobian matrix is analytically expressed in a computationally useful form. Second, they discuss the solute-solvent electrostatic interaction. For the solute to solvent route, the electrostatic potential (ESP) map on a 3D grid is constructed directly from the electron density. The charge fitting procedure is not required to determine the ESP. For the solvent to solute route, the ESP acting on the solute molecule is derived from the solvent charge distribution obtained by solving the 3D-RISM equation. Matrix elements of the solute-solvent interaction are evaluated by the direct numerical integration. A remarkable reduction in the computational time is observed in both routes. Finally, the authors implement the first derivatives of the free energy with respect to the solute nuclear coordinates. They apply the present method to "solute" water and formaldehyde in aqueous solvent using the simple point charge model, and the results are compared with those from other methods: the six-dimensional molecular Ornstein-Zernike SCF, the one-dimensional site-site RISM-SCF, and the polarizable continuum model. The authors also calculate the solvatochromic shifts of acetone, benzonitrile, and nitrobenzene using the present method and compare them with the experimental and other theoretical results.
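
    The record describes the solver as a hybrid of Picard and Newton-Raphson iterations with an analytical Jacobian. The sketch below shows that generic hybrid strategy on a small fixed-point problem x = g(x); it is not the 3D-RISM equation, and the Jacobian here is formed by finite differences rather than analytically.

```python
# Generic hybrid Picard/Newton fixed-point solver, illustrating the solution
# strategy described above for x = g(x).  This is not the 3D-RISM equation;
# the Jacobian is formed numerically here, whereas the paper derives it
# analytically.
import numpy as np

def hybrid_solve(g, x0, picard_tol=1e-2, tol=1e-10, damping=0.5, max_iter=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):                      # damped Picard phase
        r = g(x) - x
        if np.linalg.norm(r) < picard_tol:
            break
        x = x + damping * r
    for _ in range(max_iter):                      # Newton phase on F(x) = x - g(x)
        F = x - g(x)
        if np.linalg.norm(F) < tol:
            return x
        n, eps = len(x), 1e-7
        J = np.empty((n, n))
        for j in range(n):                         # finite-difference Jacobian column
            dx = np.zeros(n); dx[j] = eps
            J[:, j] = ((x + dx) - g(x + dx) - F) / eps
        x = x - np.linalg.solve(J, F)
    raise RuntimeError("hybrid solver did not converge")

# Small illustrative fixed-point problem.
g = lambda x: np.array([np.cos(x[1]), 0.5 * np.sin(x[0]) + 0.1])
print(hybrid_solve(g, x0=[0.0, 0.0]))
```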

  11. Solvent effects in time-dependent self-consistent field methods. II. Variational formulations and analytical gradients

    International Nuclear Information System (INIS)

    Bjorgaard, J. A.; Velizhanin, K. A.; Tretiak, S.

    2015-01-01

    This study describes variational energy expressions and analytical excited state energy gradients for time-dependent self-consistent field methods with polarizable solvent effects. Linear response, vertical excitation, and state-specific solvent models are examined. Enforcing a variational ground state energy expression in the state-specific model is found to reduce it to the vertical excitation model. Variational excited state energy expressions are then provided for the linear response and vertical excitation models and analytical gradients are formulated. Using semiempirical model chemistry, the variational expressions are verified by numerical and analytical differentiation with respect to a static external electric field. Lastly, analytical gradients are further tested by performing microcanonical excited state molecular dynamics with p-nitroaniline
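
    The verification strategy mentioned above, checking analytical derivatives against numerical differentiation with respect to a static external field, can be illustrated on a toy energy expression; the quadratic E(F) below and its parameters are assumptions, not the paper's excited-state energy expressions.

```python
# Sketch of the verification strategy described above: compare an analytical
# derivative of an energy expression with central finite differences with
# respect to an external parameter (a toy scalar "field" F acting on a toy
# energy; the actual excited-state energy expressions are not reproduced).
import numpy as np

def energy(F, a=1.3, mu=0.7, alpha=0.2):
    return a - mu * F - 0.5 * alpha * F**2      # toy E(F)

def analytic_dE_dF(F, mu=0.7, alpha=0.2):
    return -mu - alpha * F

F0, h = 0.05, 1e-5
numeric = (energy(F0 + h) - energy(F0 - h)) / (2 * h)
print(f"analytic {analytic_dE_dF(F0):.8f}  numeric {numeric:.8f}")
```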

  12. Optical forces, torques, and force densities calculated at a microscopic level using a self-consistent hydrodynamics method

    Science.gov (United States)

    Ding, Kun; Chan, C. T.

    2018-04-01

    The calculation of optical force density distribution inside a material is challenging at the nanoscale, where quantum and nonlocal effects emerge and macroscopic parameters such as permittivity become ill-defined. We demonstrate that the microscopic optical force density of nanoplasmonic systems can be defined and calculated using the microscopic fields generated using a self-consistent hydrodynamics model that includes quantum, nonlocal, and retardation effects. We demonstrate this technique by calculating the microscopic optical force density distributions and the optical binding force induced by external light on nanoplasmonic dimers. This approach works even in the limit when the nanoparticles are close enough to each other so that electron tunneling occurs, a regime in which classical electromagnetic approach fails completely. We discover that an uneven distribution of optical force density can lead to a light-induced spinning torque acting on individual particles. The hydrodynamics method offers us an accurate and efficient approach to study optomechanical behavior for plasmonic systems at the nanoscale.

  13. An efficient method for transcription factor binding site imputation via simultaneous completion of multiple matrices with positional consistency.

    Science.gov (United States)

    Guo, Wei-Li; Huang, De-Shuang

    2017-08-22

    Transcription factors (TFs) are DNA-binding proteins that have a central role in regulating gene expression. Identification of DNA-binding sites of TFs is a key task in understanding transcriptional regulation, cellular processes and disease. Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) enables genome-wide identification of in vivo TF binding sites. However, it is still difficult to map every TF in every cell line owing to cost and biological material availability, which poses an enormous obstacle for integrated analysis of gene regulation. To address this problem, we propose a novel computational approach, TFBSImpute, for predicting additional TF binding profiles by leveraging information from available ChIP-seq TF binding data. TFBSImpute fuses the dataset into a 3-mode tensor and imputes missing TF binding signals via simultaneous completion of multiple TF binding matrices with positional consistency. By assessing the performance of imputation methods against observed ChIP-seq TF binding profiles, we show that signals predicted by our method achieve overall similarity with experimental data and that TFBSImpute significantly outperforms baseline approaches. Besides, motif analysis shows that TFBSImpute performs better in capturing binding motifs enriched in observed data compared with baselines, indicating that the higher performance of TFBSImpute is not simply due to averaging related samples. We anticipate that our approach will constitute a useful complement to experimental mapping of TF binding, which is beneficial for further study of regulation mechanisms and disease.
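    The underlying imputation idea (completing a partially observed binding matrix under a low-rank assumption) can be sketched with a generic iterative truncated-SVD imputer. This is only an illustrative stand-in: the actual TFBSImpute algorithm completes several matrices jointly and enforces positional consistency, which is not modelled here.

    ```python
    import numpy as np

    def soft_impute(X, mask, rank=5, n_iter=100):
        """Generic masked low-rank matrix completion.

        X    : matrix with arbitrary values at unobserved positions
        mask : boolean array, True where X is observed
        """
        Z = np.where(mask, X, 0.0)
        for _ in range(n_iter):
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            Z_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r reconstruction
            Z = np.where(mask, X, Z_low)                   # keep observed entries fixed
        return Z

    # usage on a toy 100x50 matrix with ~30% missing entries
    rng = np.random.default_rng(0)
    true = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 50))
    mask = rng.random((100, 50)) > 0.3
    imputed = soft_impute(np.where(mask, true, 0.0), mask, rank=3)
    ```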

  14. System for the chemical processing and evaluation of the residual thickness of LR115 type 2 nuclear track detectors

    International Nuclear Information System (INIS)

    Carrazana Gonzalez, J.A.; Tomas Zerquera, J.; Prendes Alonso, M.

    1998-01-01

    This work describes the system built at the CPHR for the homogeneous chemical processing of nuclear track detectors. A newly developed method is presented, based on optical densitometry, for the precise estimation of the residual thickness of LR115 type 2 nuclear track detectors after the chemical etching process.

  15. Giving behavior of millionaires.

    Science.gov (United States)

    Smeets, Paul; Bauer, Rob; Gneezy, Uri

    2015-08-25

    This paper studies conditions influencing the generosity of wealthy people. We conduct incentivized experiments with individuals who have at least €1 million in their bank account. The results show that millionaires are more generous toward low-income individuals in a giving situation when the other participant has no power, than in a strategic setting, where the other participant can punish unfair behavior. Moreover, the level of giving by millionaires is higher than in any other previous study. Our findings have important implications for charities and financial institutions that deal with wealthy individuals.

  16. Giving Back, Moving Forward

    Directory of Open Access Journals (Sweden)

    Louise Fortmann

    2014-07-01

    While reflecting on her own experience with giving back in Zimbabwe, Fortmann considers how the idea of “giving back” sits at the intersection of feminist theory, participatory research, and the democratization of science. From feminist theory arises the question of how to reciprocate to those who have contributed to our research. The participatory research and democratization of science literature push us to recognize and consider the collaborative nature of our research. Fortmann concludes by identifying three categories of reciprocity in research: material, intellectual, and personal. Sharing must occur, regardless of the kind of research taking place.

  17. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications are possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.

  18. Give blood at CERN

    CERN Multimedia

    SC Unit

    2008-01-01

    ACCIDENTS and ILLNESSES don’t take a break! DO SOMETHING AMAZING - GIVE BLOOD! IT’S IN ALL OUR INTERESTS. 30 July 2008 from 9.30 a.m. to 4 p.m. CERN RESTAURANT NOVAE First floor - Salle des Pas Perdus After you have given blood, you are invited to partake of refreshments kindly offered by NOVAE.

  19. A consistency-based feature selection method allied with linear SVMs for HIV-1 protease cleavage site prediction.

    Directory of Open Access Journals (Sweden)

    Orkun Oztürk

    BACKGROUND: Predicting the type-1 Human Immunodeficiency Virus (HIV-1) protease cleavage site in protein molecules and determining its specificity is an important task which has attracted considerable attention in the research community. Achievements in this area are expected to result in effective drug design (especially for HIV-1 protease inhibitors) against this life-threatening virus. However, some drawbacks (like the shortage of the available training data and the high dimensionality of the feature space) turn this task into a difficult classification problem. Thus, various machine learning techniques, and specifically several classification methods, have been proposed in order to increase the accuracy of the classification model. In addition, for several classification problems, which are characterized by having few samples and many features, selecting the most relevant features is a major factor for increasing classification accuracy. RESULTS: We propose for HIV-1 data a consistency-based feature selection approach in conjunction with recursive feature elimination of support vector machines (SVMs). We used various classifiers for evaluating the results obtained from the feature selection process. We further demonstrated the effectiveness of our proposed method by comparing it with a state-of-the-art feature selection method applied on HIV-1 data, and we evaluated the reported results based on attributes which have been selected from different combinations. CONCLUSION: Applying feature selection on training data before realizing the classification task seems to be a reasonable data-mining process when working with types of data similar to HIV-1. On HIV-1 data, some feature selection or extraction operations in conjunction with different classifiers have been tested and noteworthy outcomes have been reported. These facts motivate the work presented in this paper. SOFTWARE AVAILABILITY: The software is available at http
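    A minimal sketch of the SVM-based recursive feature elimination half of the pipeline is given below, using synthetic data in place of the encoded octapeptide dataset; the consistency-based filter that precedes it in the paper is not reproduced.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.feature_selection import RFE
    from sklearn.svm import LinearSVC

    # Synthetic stand-in for the encoded peptide features.
    X, y = make_classification(n_samples=300, n_features=160, n_informative=20,
                               random_state=0)

    # Recursive feature elimination driven by a linear SVM: at each step,
    # 10% of the remaining features with the smallest weights are dropped.
    selector = RFE(LinearSVC(dual=False, max_iter=5000),
                   n_features_to_select=20, step=0.1)
    selector.fit(X, y)
    selected = selector.support_          # boolean mask of retained features
    ```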

  20. The nuclear N-body problem and the effective interaction in self-consistent mean-field methods

    International Nuclear Information System (INIS)

    Duguet, Thomas

    2002-01-01

    This work deals with two aspects of mean-field type methods extensively used in low-energy nuclear structure. The first study is at the mean-field level. The link between the wave function describing an even-even nucleus and its odd-even neighbor is revisited. To get a coherent description as a function of the pairing intensity in the system, the utility of formalizing this link as a two-step process is demonstrated. This two-step process makes it possible to identify the role played by different channels of the force when a nucleon is added to the system. In particular, perturbative formulae evaluating the contribution of time-odd components of the functional to the nucleon separation energy are derived for zero and realistic pairing intensities. Self-consistent calculations validate the developed scheme as well as the derived perturbative formulae. This first study ends with an extended analysis of the odd-even mass staggering in nuclei. The new scheme makes it possible to identify the contributions to this observable coming from different channels of the force. The necessity of a better understanding of time-odd terms, in order to decide which odd-even mass formula extracts the pairing gap most properly, is identified. Since these terms are nowadays more or less uncontrolled, extended studies are needed to refine the fit of a pairing force through the comparison of theoretical and experimental odd-even mass differences. The second study deals with beyond-mean-field methods taking care of the correlations associated with large-amplitude oscillations in nuclei. Their effects are usually incorporated through the GCM or the projected mean-field method. We derive, for the first time, a perturbation theory motivating such variational calculations from a diagrammatic point of view. Resumming two-body correlations in the energy expansion, we obtain an effective interaction removing the hard-core problem in the context of configuration mixing calculations. Proceeding to a

  1. Fully self-consistent multiparticle-multi-hole configuration mixing method - Applications to a few light nuclei

    International Nuclear Information System (INIS)

    Robin, Caroline

    2014-01-01

    This thesis project takes part in the development of the multiparticle-multi-hole configuration mixing method aiming to describe the structure of atomic nuclei. Based on a double variational principle, this approach makes it possible to determine the expansion coefficients of the wave function and the single-particle states at the same time. In this work we apply for the first time the fully self-consistent formalism of the mp-mh method to the description of a few p- and sd-shell nuclei, using the D1S Gogny interaction. A first study of the ¹²C nucleus is performed in order to test the doubly iterative convergence procedure when different types of truncation criteria are applied to select the many-body configurations included in the wave function. A detailed analysis of the effect caused by the orbital optimization is conducted. In particular, its impact on the one-body density and on the fragmentation of the ground state wave function is analyzed. A systematic study of sd-shell nuclei is then performed. A careful analysis of the correlation content of the ground state is first conducted, and observable quantities such as binding and separation energies, as well as charge radii, are calculated and compared to experimental data. Satisfactory results are found. Spectroscopic properties are also studied. Excitation energies of low-lying states are found in very good agreement with experiment, and the study of magnetic dipole features is also satisfactory. Calculation of electric quadrupole properties, and in particular transition probabilities B(E2), however, reveals a clear lack of collectivity of the wave function, due to the reduced valence space used to select the many-body configurations. Although the renormalization of orbitals leads to an important fragmentation of the ground state wave function, only a small effect is observed on B(E2) probabilities. A tentative explanation is given. Finally, the structure description of nuclei provided by the multiparticle

  2. GIVING AND RECEIVING CONSTRUCTIVE FEEDBACK

    Directory of Open Access Journals (Sweden)

    Ірина Олійник

    2015-05-01

    The article scrutinizes the notion of feedback as applicable in classrooms where team teaching is provided. The experience of giving and receiving feedback has been a good practice in cooperation between a U.S. Peace Corps volunteer and a Ukrainian counterpart. Giving and receiving feedback is an effective means of classroom observation that provides better insight into the process of teaching a foreign language. The article discusses the stages of feedback and explicates the notion of sharing experience between two teachers working simultaneously in the same classroom. Guidelines for giving and receiving feedback are provided, and the most commonly used vocabulary items are listed. It has been proved that mutual feedback leads to improving teaching methods and using various teaching styles and techniques.

  3. Consistency in the Reporting of Sensitive Behaviors by Adolescent American Indian Women: A Comparison of Interviewing Methods

    Science.gov (United States)

    Mullany, Britta; Barlow, Allison; Neault, Nicole; Billy, Trudy; Hastings, Ranelda; Coho-Mescal, Valerie; Lorenzo, Sherilyn; Walkup, John T.

    2013-01-01

    Computer-assisted interviewing techniques have increasingly been used in program and research settings to improve data collection quality and efficiency. Little is known, however, regarding the use of such techniques with American Indian (AI) adolescents in collecting sensitive information. This brief compares the consistency of AI adolescent…

  4. Daily Behavior Report Cards: An Investigation of the Consistency of On-Task Data across Raters and Methods

    Science.gov (United States)

    Chafouleas, Sandra M.; Riley-Tillman, T. Chris; Sassu, Kari A.; LaFrance, Mary J.; Patwa, Shamim S.

    2007-01-01

    In this study, the consistency of on-task data collected across raters using either a Daily Behavior Report Card (DBRC) or systematic direct observation was examined to begin to understand the decision reliability of using DBRCs to monitor student behavior. Results suggested very similar conclusions might be drawn when visually examining data…

  5. Modeling of the 3RS tau protein with self-consistent field method and Monte Carlo simulation

    NARCIS (Netherlands)

    Leermakers, F.A.M.; Jho, Y.S.; Zhulina, E.B.

    2010-01-01

    Using a model with amino acid resolution of the 196 aa N-terminus of the 3RS tau protein, we performed both a Monte Carlo study and a complementary self-consistent field (SCF) analysis to obtain detailed information on conformational properties of these moieties near a charged plane (mimicking the

  6. A Theoretically Consistent Method for Minimum Mean-Square Error Estimation of Mel-Frequency Cepstral Features

    DEFF Research Database (Denmark)

    Jensen, Jesper; Tan, Zheng-Hua

    2014-01-01

    We propose a method for minimum mean-square error (MMSE) estimation of mel-frequency cepstral features for noise robust automatic speech recognition (ASR). The method is based on a minimum number of well-established statistical assumptions; no assumptions are made which are inconsistent with others. The strength of the proposed method is that it allows MMSE estimation of mel-frequency cepstral coefficients (MFCC's), cepstral mean-subtracted MFCC's (CMS-MFCC's), velocity, and acceleration coefficients. Furthermore, the method is easily modified to take into account other compressive non-linearities than the logarithmic which is usually used for MFCC computation. The proposed method shows estimation performance which is identical to or better than state-of-the-art methods. It further shows comparable ASR performance, where the advantage of being able to use mel-frequency speech features based on a power non...

  7. A new relative radiometric consistency processing method for change detection based on wavelet transform and a low-pass filter

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The research purpose of this paper is to show the limitations of the existing radiometric normalization approaches and their disadvantages for change detection of artificial objects by comparing the existing approaches, on the basis of which a preprocessing approach to radiometric consistency, based on wavelet transform and a spatial low-pass filter, has been devised. This approach first separates the high-frequency and low-frequency information by wavelet transform. Then, relative radiometric consistency processing based on a low-pass filter is conducted on the low-frequency parts. After processing, an inverse wavelet transform is conducted to obtain the result image. The experimental results show that this approach can substantially reduce the influence of linear or nonlinear radiometric differences in multi-temporal images on change detection.
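    A hedged sketch of the general idea (adjust only the low-frequency band so that high-frequency change information is preserved) is given below. The exact consistency processing applied to the low-frequency parts is not specified in the abstract, so the local-offset transfer, the wavelet, the decomposition level and the window size are all illustrative assumptions.

    ```python
    import numpy as np
    import pywt
    from scipy.ndimage import uniform_filter

    def radiometric_normalize(subject, reference, wavelet="db2", level=2, win=31):
        """Pull the low-frequency (approximation) band of `subject` toward
        `reference` while leaving the detail bands, which carry the edges of
        artificial objects, untouched."""
        cs = pywt.wavedec2(subject, wavelet, level=level)
        cr = pywt.wavedec2(reference, wavelet, level=level)
        # transfer the locally smoothed brightness offset between the two
        # approximation bands (one possible "low-pass" consistency rule)
        offset = uniform_filter(cr[0], size=win) - uniform_filter(cs[0], size=win)
        cs = [cs[0] + offset] + list(cs[1:])
        out = pywt.waverec2(cs, wavelet)
        return out[: subject.shape[0], : subject.shape[1]]   # trim any padding
    ```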

  8. A simple, sufficient, and consistent method to score the status of threats and demography of imperiled species

    Directory of Open Access Journals (Sweden)

    Jacob W. Malcom

    2016-07-01

    Managers of large, complex wildlife conservation programs need information on the conservation status of each of many species to help strategically allocate limited resources. Oversimplifying status data, however, runs the risk of missing information essential to strategic allocation. Conservation status consists of two components, the status of threats a species faces and the species’ demographic status. Neither component alone is sufficient to characterize conservation status. Here we present a simple key for scoring threat and demographic changes for species using detailed information provided in free-form textual descriptions of conservation status. This key is easy to use (simple), captures the two components of conservation status without the cost of more detailed measures (sufficient), and can be applied by different personnel to any taxon (consistent). To evaluate the key’s utility, we performed two analyses. First, we scored the threat and demographic status of 37 species recently recommended for reclassification under the Endangered Species Act (ESA) and 15 control species, then compared our scores to two metrics used for decision-making and reports to Congress. Second, we scored the threat and demographic status of all non-plant ESA-listed species from Florida (54 spp.), and evaluated scoring repeatability for a subset of those. While the metrics reported by the U.S. Fish and Wildlife Service (FWS) are often consistent with our scores in the first analysis, the results highlight two problems with the oversimplified metrics. First, we show that both metrics can mask underlying demographic declines or threat increases; for example, ∼40% of species not recommended for reclassification had changes in threats or demography. Second, we show that neither metric is consistent with either threats or demography alone, but conflates the two. The second analysis illustrates how the scoring key can be applied to a substantial set of species to
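    A toy illustration of keeping the two components separate rather than collapsing them into one metric is sketched below; the three-level categories and their labels are hypothetical, not the published key's wording.

    ```python
    # Hypothetical three-level change scores for each component.
    CHANGE_SCORE = {"improving": 1, "stable": 0, "declining": -1}

    def conservation_status(threat_change, demographic_change):
        """Return the two components as a pair instead of conflating them
        into a single collapsed metric."""
        return (CHANGE_SCORE[threat_change], CHANGE_SCORE[demographic_change])

    status = conservation_status("stable", "declining")   # -> (0, -1)
    ```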

  9. Self-consistent-field method and τ-functional method on group manifold in soliton theory. II. Laurent coefficients of soliton solutions for sl_n and for su_n

    International Nuclear Information System (INIS)

    Nishiyama, Seiya; Providencia, Joao da; Komatsu, Takao

    2007-01-01

    To go beyond the perturbative method in terms of variables of collective motion, using infinite-dimensional fermions, we have aimed to construct a self-consistent-field (SCF) theory, i.e., time-dependent Hartree-Fock theory on associative affine Kac-Moody algebras, along the lines of soliton theory. In this paper, toward such an ultimate goal, we reconstruct a theoretical frame for a υ (external parameter)-dependent SCF method to describe more precisely the dynamics on the infinite-dimensional fermion Fock space. An infinite-dimensional fermion operator is introduced through Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent and a Υ-periodic potential. As an illustration, we derive explicit expressions for the Laurent coefficients of soliton solutions for sl_n and for su_n on the infinite-dimensional Grassmannian. The associative affine Kac-Moody algebras play a crucial role in determining the dynamics on the infinite-dimensional fermion Fock space

  10. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work that has been performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, the verification of the ray tracing indexing scheme that was developed to represent the MCNP geometry in the MOC, and the verification of the prototype 2-D MOC flux solver. (author)

  11. Decentralized Method for Load Sharing and Power Management in a Hybrid Single/Three-Phase-Islanded Microgrid Consisting of Hybrid Source PV/Battery Units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Oraee, Hashem; Guerrero, Josep M.

    2017-01-01

    This paper proposes a new decentralized power management and load sharing method for a photovoltaic based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method is not limited to systems with separate PV... Efficacy of the proposed method in different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units.

  12. Method and equipment for fast transmission of a signal consisting of many data pulses by a normal well-logging cable

    International Nuclear Information System (INIS)

    Pitts, R.W. Jr.; Whatley, H.A. Jr.

    1975-01-01

    Well logging methods and equipment in general and a nuclear well logging process and apparatus in particular are presented. They increase the number of pulses which can be transmitted to the top of the well from a well logging instrument placed at the bottom, during a given time interval, by means of a processing giving two pulses for each pulse corresponding to a detection. The equipment is a double-spectrum well logging system, with near and far detectors [fr

  13. Self-Consistent-Field Method and τ-Functional Method on Group Manifold in Soliton Theory: a Review and New Results

    Directory of Open Access Journals (Sweden)

    Seiya Nishiyama

    2009-01-01

    The maximally-decoupled method has been considered as a theory to apply a basic idea of an integrability condition to certain multiple parametrized symmetries. The method is regarded as a mathematical tool to describe a symmetry of a collective submanifold in which a canonicity condition makes the collective variables an orthogonal coordinate system. For this aim we adopt a concept of curvature unfamiliar in the conventional time-dependent (TD) self-consistent field (SCF) theory. Our basic idea lies in the introduction of a sort of Lagrange manner, familiar in fluid dynamics, to describe a collective coordinate system. This manner enables us to take a one-form which is linearly composed of a TD SCF Hamiltonian and infinitesimal generators induced by collective variable differentials of a canonical transformation on a group. The integrability condition of the system reads: the curvature C = 0. Our method is constructed so as to manifest the structure of the group under consideration. To go beyond the maximally-decoupled method, we have aimed to construct an SCF theory, i.e., a υ (external parameter)-dependent Hartree-Fock (HF) theory. Toward such an ultimate goal, the υ-HF theory has been reconstructed on an affine Kac-Moody algebra along the soliton theory, using infinite-dimensional fermions. An infinite-dimensional fermion operator is introduced through a Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent potential with a Υ-periodicity. A bilinear equation for the υ-HF theory has been transcribed onto the corresponding τ-function using the regular representation for the group and the Schur polynomials. The υ-HF SCF theory on an infinite-dimensional Fock space F∞ leads to a dynamics on an infinite-dimensional Grassmannian Gr∞ and may describe more precisely such a dynamics on the group manifold. A finite-dimensional Grassmannian is identified with a Gr

  14. Demonstration of the potential of two-phase Direct Numerical Simulation (DNS) methods to give information to averaged models: application to bubble columns

    International Nuclear Information System (INIS)

    Magdeleine, S.

    2009-11-01

    This work is part of a long-term project that aims at using two-phase Direct Numerical Simulation (DNS) in order to give information to averaged models. For now, it is limited to isothermal bubbly flows with no phase change. It can be subdivided into two parts. First, theoretical developments are made in order to build an equivalent of Large Eddy Simulation (LES) for two-phase flows, called Interfaces and Sub-grid Scales (ISS). After the implementation of the ISS model in our code Trio_U, a set of various cases is used to validate this model. Then, special tests are made in order to optimize the model for our particular bubbly flows. Thus we showed the capacity of the ISS model to produce a cheap, pertinent solution. Second, we use the ISS model to perform simulations of bubbly flows in a column. Results of these simulations are averaged to obtain quantities that appear in mass, momentum and interfacial area density balances. Thus, we proceeded to an a priori test of a complete one-dimensional averaged model. We showed that this model predicts well the simplest flows (laminar and monodisperse). Moreover, the hypothesis of a single pressure, which is often made in averaged models like CATHARE, NEPTUNE and RELAP5, is satisfied in such flows. By contrast, without a polydisperse model, the drag is over-predicted and the uncorrelated A_i flux needs a closure law. Finally, we showed that in turbulent flows, fluctuations of velocity and pressure in the liquid phase are not represented by the tested averaged model. (author)

  15. The discrete null space method for the energy-consistent integration of constrained mechanical systems. Part III: Flexible multibody dynamics

    International Nuclear Information System (INIS)

    Leyendecker, Sigrid; Betsch, Peter; Steinmann, Paul

    2008-01-01

    In the present work, the unified framework for the computational treatment of rigid bodies and nonlinear beams developed by Betsch and Steinmann (Multibody Syst. Dyn. 8, 367-391, 2002) is extended to the realm of nonlinear shells. In particular, a specific constrained formulation of shells is proposed which leads to the semi-discrete equations of motion characterized by a set of differential-algebraic equations (DAEs). The DAEs provide a uniform description for rigid bodies, semi-discrete beams and shells and, consequently, flexible multibody systems. The constraints may be divided into two classes: (i) internal constraints which are intimately connected with the assumption of rigidity of the bodies, and (ii) external constraints related to the presence of joints in a multibody framework. The present approach thus circumvents the use of rotational variables throughout the whole time discretization, facilitating the design of energy-momentum methods for flexible multibody dynamics. After the discretization has been completed a size-reduction of the discrete system is performed by eliminating the constraint forces. Numerical examples dealing with a spatial slider-crank mechanism and with intersecting shells illustrate the performance of the proposed method

  16. Development of a self-consistent model of dust grain charging at elevated pressures using the method of moments

    International Nuclear Information System (INIS)

    Filippov, A.V.; Dyatko, N.A.; Pal', A.F.; Starostin, A.N.

    2003-01-01

    A model of dust grain charging is constructed using the method of moments. The dust grain charging process in a weakly ionized helium plasma produced by a 100-keV electron beam at atmospheric pressure is studied theoretically. In simulations, the beam current density was varied from 1 to 10⁶ μA/cm². It is shown that, in a He plasma, dust grains of radius 5 μm and larger perturb the electron temperature only slightly, although the reduced electric field near the grain reaches 8 Td, the beam current density being 10⁶ μA/cm². It is found that, at distances from the grain that are up to several tens or hundreds of times larger than its radius, the electron and ion densities are lower than their equilibrium values. Conditions are determined under which the charging process may be described by a model with constant electron transport coefficients. The dust grain charge is shown to be weakly affected by secondary electron emission. In a beam-produced helium plasma, the dust grain potential calculated in the drift-diffusion model is shown to be close to that calculated in the orbit motion limited model. It is found that, in the vicinity of a body perturbing the plasma, there may be no quasineutral plasma presheath with an ambipolar diffusion of charged particles. The conditions for the onset of this presheath in a beam-produced plasma are determined

  17. A Novel Degradation Estimation Method for a Hybrid Energy Storage System Consisting of Battery and Double-Layer Capacitor

    Directory of Open Access Journals (Sweden)

    Yuanbin Yu

    2016-01-01

    This paper presents a new method for battery degradation estimation using a power-energy (PE) function in a battery/ultracapacitor hybrid energy storage system (HESS), and an integrated optimization concerning both parameter matching and control for the HESS has been carried out as well. A semiactive topology of HESS with the double-layer capacitor (EDLC) coupled directly with the DC-link is adopted for a hybrid electric city bus (HECB). For the purpose of presenting the quantitative relationship between system parameters and battery service life, data from a 37-minute driving cycle were first collected and decomposed into discharging/charging fragments, and then an optimal control strategy, intended to maximally use the available EDLC energy, is presented to split the power between battery and EDLC. Furthermore, based on a battery degradation model, the conversion of power demand by the PE function and PE matrix is applied to evaluate the relationship between the available energy stored in the HESS and the service life of the battery pack. Therefore, following this approach, which decouples parameter matching and optimal control of the HESS, the process of battery degradation and its service-life estimation for the HESS has been summed up.

  18. Sensitivity analysis of two models of radionuclide migration and transport in the geosphere

    International Nuclear Information System (INIS)

    Torres Berdeguez, M. B.; Gil Castillo, R.; Peralta Vidal, J.L.

    1998-01-01

    A sensitivity analysis was applied to two models: the first, a comprehensive model for the near field (source term); the second, a simple model of radionuclide migration and transport in the geosphere. The study was carried out by varying the values of each parameter and observing the resulting changes in the output. The aim of the analysis is to determine the parameter that most influences the variation of the concentration. The statistical technique of regression was employed in the study. This statistical method is used to analyze the dependence between a dependent variable and one or more independent variables

  19. Preventing syndemic Zika virus, HIV/STIs and unintended pregnancy: dual method use and consistent condom use among Brazilian women in marital and civil unions.

    Science.gov (United States)

    Tsuyuki, Kiyomi; Gipson, Jessica D; Barbosa, Regina Maria; Urada, Lianne A; Morisky, Donald E

    2017-12-12

    Syndemic Zika virus, HIV and unintended pregnancy call for an urgent understanding of dual method (condoms with another modern non-barrier contraceptive) and consistent condom use. Multinomial and logistic regression analysis using data from the Pesquisa Nacional de Demografia e Saúde da Criança e da Mulher (PNDS), a nationally representative household survey of reproductive-aged women in Brazil, identified the socio-demographic, fertility and relationship context correlates of exclusive non-barrier contraception, dual method use and condom use consistency. Among women in marital and civil unions, half reported dual protection (30% condoms, 20% dual methods). In adjusted models, condom use was associated with older age and living in the northern region of Brazil or in urban areas, whereas dual method use (versus condom use) was associated with younger age, living in the southern region of Brazil, living in non-urban areas and relationship age homogamy. Among condom users, consistent condom use was associated with reporting Afro-religion or other religion, not wanting (more) children and using condoms only (versus dual methods). Findings highlight that integrated STI prevention and family planning services should target young married/in union women, couples not wanting (more) children and heterogamous relationships to increase dual method use and consistent condom use.
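    The regression analysis described above can be sketched generically as follows. The data, variable choices and outcome coding below are synthetic placeholders, not the PNDS codebook; the point is only the shape of a multinomial model for the three contraceptive-use categories.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # hypothetical covariates: age in years and urban residence (0/1)
    X = np.column_stack([rng.integers(18, 50, 500),
                         rng.integers(0, 2, 500)])
    # hypothetical outcome: 0 = non-barrier only, 1 = condoms, 2 = dual methods
    y = rng.integers(0, 3, 500)

    # a three-class outcome is fitted as a multinomial logistic model
    model = LogisticRegression(max_iter=1000).fit(X, y)
    exp_coefs = np.exp(model.coef_)   # exponentiated coefficients, read as in odds-ratio tables
    ```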

  20. How to Safely Give Ibuprofen

    Science.gov (United States)

    ... of ibuprofen are available in similar forms. How to Give: When giving ibuprofen, refer to the following dosage ...

  1. The currently used commercial DNA-extraction methods give different results of clostridial and actinobacterial populations derived from human fecal samples.

    Science.gov (United States)

    Maukonen, Johanna; Simões, Catarina; Saarela, Maria

    2012-03-01

    Recently several human health-related microbiota studies have had partly contradictory results. As some differences may be explained by the methodologies applied, we evaluated how different storage conditions and commonly used DNA-extraction kits affect the bacterial composition, diversity, and numbers of human fecal microbiota. According to our results, the DNA-extraction did not affect the diversity, composition, or quantity of Bacteroides spp., whereas after a week's storage at -20 °C, the numbers of Bacteroides spp. were 1.6-2.5 log units lower (P < 0.05). The numbers of the Eubacterium rectale (Erec) group, Clostridium leptum group, bifidobacteria, and Atopobium group were 0.5-4 log units higher (P < 0.05) after mechanical DNA-extraction as detected with qPCR, regardless of storage. Furthermore, the bacterial composition of the Erec group differed significantly after different DNA-extractions; after enzymatic DNA-extraction, the most prevalent genera detected were Roseburia (39% of clones) and Coprococcus (10%), whereas after mechanical DNA-extraction, the most prevalent genera were Blautia (30%), Coprococcus (13%), and Dorea (10%). According to our results, rigorous mechanical lysis enables detection of higher bacterial numbers and diversity from human fecal samples. As it was shown that the results for clostridial and actinobacterial populations are highly dependent on the DNA-extraction methods applied, the use of different DNA-extraction protocols may explain the contradictory results previously obtained. © 2011 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  2. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with large-scale data management systems is to ensure consistency between the global file catalog and what is physically present on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka dark data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain the internals and give some results.
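    The core consistency check (comparing the catalog's view of a site with a storage dump) reduces to a set difference. The sketch below is a toy illustration only; the real service works on periodic site dumps and applies grace periods and central policies that are not modelled here.

    ```python
    def classify_inconsistencies(catalog_files, storage_files):
        """Split files into 'dark data' (on storage but not catalogued,
        candidates for deletion) and 'lost' files (catalogued but missing
        from storage, candidates for recovery)."""
        catalog, storage = set(catalog_files), set(storage_files)
        dark = storage - catalog
        lost = catalog - storage
        return dark, lost

    dark, lost = classify_inconsistencies(
        catalog_files=["a.root", "b.root", "c.root"],
        storage_files=["b.root", "c.root", "orphan.root"])
    ```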

  3. Methods for regionalization of impacts of non-toxic air pollutants in life-cycle assessments often tell a consistent story

    DEFF Research Database (Denmark)

    Djomo, Sylvestre Njakou; Knudsen, Marie Trydeman; Andersen, Mikael Skou

    2017-01-01

    There is an ongoing debate regarding the influence of the source location of pollution on the fate of pollutants and their subsequent impacts. Several methods have been developed to derive site-dependent characterization factors (CFs) for use in life-cycle assessment (LCA). Consistent, precise, a...

  4. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language. Consequently, verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  5. Decentralized method for load sharing and power management in a hybrid single/three-phase islanded microgrid consisting of hybrid source PV/battery units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Guerrero, Josep M.; Oraee, Hashem

    2016-01-01

    This paper proposes a new decentralized power management and load sharing method for a photovoltaic based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method takes into account the available PV power and battery conditions of the units to share the load among them, and power flow among different phases is performed automatically through the three-phase units. Modified active power-frequency droop functions are used according to the operating state of each unit, and the frequency level is used as a trigger for switching between the states. Efficacy of the proposed method in different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units.
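    A conventional active power-frequency droop, of which the paper's modified state-dependent functions are elaborations, can be sketched as follows; the nominal and minimum frequencies and the purely linear shape are illustrative assumptions, not the paper's parameters.

    ```python
    def droop_frequency(p_out, p_max, f_nom=50.0, f_min=49.0):
        """Conventional P-f droop: the unit lowers its frequency linearly as
        its output approaches the available power, so other units see the
        frequency level and adjust their own share without communication."""
        m = (f_nom - f_min) / p_max        # droop slope [Hz per W]
        return f_nom - m * p_out

    f = droop_frequency(p_out=600.0, p_max=1000.0)   # -> 49.4 Hz
    ```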

  6. Electronic structure of thin films by the self-consistent numerical-basis-set linear combination of atomic orbitals method: Ni(001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    We present the self-consistent numerical-basis-set linear combination of atomic orbitals (LCAO) discrete variational method for treating the electronic structure of thin films. As in the case of bulk solids, this method provides for thin films accurate solutions of the one-particle local density equations with a non-muffin-tin potential. Hamiltonian and overlap matrix elements are evaluated accurately by means of a three-dimensional numerical Diophantine integration scheme. Application of this method is made to the self-consistent solution of one-, three-, and five-layer Ni(001) unsupported films. The LCAO Bloch basis set consists of valence orbitals (3d, 4s, and 4p states for transition metals) orthogonalized to the frozen-core wave functions. The self-consistent potential is obtained iteratively within the superposition of overlapping spherical atomic charge density model with the atomic configurations treated as adjustable parameters. Thus the crystal Coulomb potential is constructed as a superposition of overlapping spherically symmetric atomic potentials and, correspondingly, the local density Kohn-Sham (α = 2/3) potential is determined from a superposition of atomic charge densities. At each iteration in the self-consistency procedure, the crystal charge density is evaluated using a sampling of 15 independent k points in (1/8)th of the irreducible two-dimensional Brillouin zone. The total density of states (DOS) and projected local DOS (by layer plane) are calculated using an analytic linear energy triangle method (presented as an Appendix) generalized from the tetrahedron scheme for bulk systems. Distinct differences are obtained between the surface and central plane local DOS. The central plane DOS is found to converge rapidly to the DOS of bulk paramagnetic Ni obtained by Wang and Callaway. Only a very small surplus charge (0.03 electron/atom) is found on the surface planes, in agreement with jellium model calculations

  7. Evaluation of the quality consistency of powdered poppy capsule extractive by an averagely linear-quantified fingerprint method in combination with antioxidant activities and two compounds analyses.

    Science.gov (United States)

    Zhang, Yujing; Sun, Guoxiang; Hou, Zhifei; Yan, Bo; Zhang, Jing

    2017-12-01

    A novel averagely linear-quantified fingerprint method was proposed and successfully applied to monitor the quality consistency of alkaloids in powdered poppy capsule extractive. The averagely linear-quantified fingerprint method provided accurate qualitative and quantitative similarities for chromatographic fingerprints of Chinese herbal medicines. The stability and operability of the averagely linear-quantified fingerprint method were verified by the parameter r. The average linear qualitative similarity SL (improved from the conventional qualitative "Similarity") was used as a qualitative criterion in the averagely linear-quantified fingerprint method, and the average linear quantitative similarity PL was introduced as a quantitative one. PL was able to identify differences in the content of all the chemical components. In addition, PL was found to be highly correlated to the contents of two alkaloid compounds (morphine and codeine). A simple flow injection analysis was developed for the determination of antioxidant capacity in Chinese herbal medicines, based on the scavenging of the 2,2-diphenyl-1-picrylhydrazyl radical by antioxidants. The fingerprint-efficacy relationship linking chromatographic fingerprints and antioxidant activities was investigated utilizing the orthogonal projection to latent structures method, which provided important pharmacodynamic information for quality control of Chinese herbal medicines. In summary, quantitative fingerprinting based on the averagely linear-quantified fingerprint method can be applied for monitoring the quality consistency of Chinese herbal medicines, and the constructed orthogonal projection to latent structures model is particularly suitable for investigating the fingerprint-efficacy relationship. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
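    The qualitative/quantitative split of the fingerprint assessment can be illustrated with generic proxies, sketched below. The exact SL and PL formulas of the averagely linear-quantified fingerprint method are not reproduced here; a correlation coefficient and a total-content ratio merely stand in for them.

    ```python
    import numpy as np

    def fingerprint_similarities(sample, reference):
        """Return a shape-agreement (qualitative) and an overall-content
        (quantitative, in %) measure for a sample fingerprint against a
        reference fingerprint of equal length."""
        s, r = np.asarray(sample, float), np.asarray(reference, float)
        qualitative = np.corrcoef(s, r)[0, 1]        # proxy for SL
        quantitative = 100.0 * s.sum() / r.sum()     # proxy for PL, %
        return qualitative, quantitative
    ```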

  8. Standard test methods for determining chemical durability of nuclear, hazardous, and mixed waste glasses and multiphase glass ceramics: The product consistency test (PCT)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2002-01-01

    1.1 These product consistency test methods A and B evaluate the chemical durability of homogeneous glasses, phase separated glasses, devitrified glasses, glass ceramics, and/or multiphase glass ceramic waste forms hereafter collectively referred to as “glass waste forms” by measuring the concentrations of the chemical species released to a test solution. 1.1.1 Test Method A is a seven-day chemical durability test performed at 90 ± 2°C in a leachant of ASTM-Type I water. The test method is static and conducted in stainless steel vessels. Test Method A can specifically be used to evaluate whether the chemical durability and elemental release characteristics of nuclear, hazardous, and mixed glass waste forms have been consistently controlled during production. This test method is applicable to radioactive and simulated glass waste forms as defined above. 1.1.2 Test Method B is a durability test that allows testing at various test durations, test temperatures, mesh size, mass of sample, leachant volume, a...

  9. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    Energy Technology Data Exchange (ETDEWEB)

    Clabo, D.A. Jr.

    1987-04-01

    Inclusion of the anharmonicity normal mode vibrations (i.e., the third and fourth (and higher) derivatives of a molecular Born-Oppenheimer potential energy surface) is necessary in order to theoretically reproduce experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent by the inclusion of electron correlation within a large basis set for small molecules, in general, molecular fundamental vibrational frequencies are more often available from high resolution vibration-rotation spectra. Recently developed analytic third derivatives methods for self-consistent-field (SCF) wavefunctions have made it possible to examine with previously unavailable accuracy and computational efficiency the anharmonic force fields of small molecules.

  10. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    International Nuclear Information System (INIS)

    Clabo, D.A. Jr.

    1987-04-01

    Inclusion of the anharmonicity normal mode vibrations [i.e., the third and fourth (and higher) derivatives of a molecular Born-Oppenheimer potential energy surface] is necessary in order to theoretically reproduce experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent by the inclusion of electron correlation within a large basis set for small molecules, in general, molecular fundamental vibrational frequencies are more often available from high resolution vibration-rotation spectra. Recently developed analytic third derivatives methods for self-consistent-field (SCF) wavefunctions have made it possible to examine with previously unavailable accuracy and computational efficiency the anharmonic force fields of small molecules

  11. Optimum collective submanifold in resonant cases by the self-consistent collective-coordinate method for large-amplitude collective motion

    International Nuclear Information System (INIS)

    Hashimoto, Y.; Marumori, T.; Sakata, F.

    1987-01-01

    With the purpose of clarifying characteristic difference of the optimum collective submanifolds in nonresonant and resonant cases, we develop an improved method of solving the basic equations of the self-consistent collective-coordinate (SCC) method for large-amplitude collective motion. It is shown that, in the resonant cases, there inevitably arise essential coupling terms which break the maximal-decoupling property of the collective motion, and we have to extend the optimum collective submanifold so as to properly treat the degrees of freedom which bring about the resonances

  12. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Sanshui; He Sailing

    2002-12-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods with the high-frequency metallic model in air are studied, and a complete band gap for some nonzero off-plane wave number k_z is found.

  13. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    International Nuclear Information System (INIS)

    Xiao Sanshui; He Sailing

    2002-01-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods with the high-frequency metallic model in air are studied, and a complete band gap for some nonzero off-plane wave number k_z is found

  14. Multiscale methods framework: self-consistent coupling of molecular theory of solvation with quantum chemistry, molecular simulations, and dissipative particle dynamics.

    Science.gov (United States)

    Kovalenko, Andriy; Gusarov, Sergey

    2018-01-31

    In this work, we will address different aspects of self-consistent field coupling of computational chemistry methods at different time and length scales in modern materials and biomolecular science. Multiscale methods framework yields dramatically improved accuracy, efficiency, and applicability by coupling models and methods on different scales. This field benefits many areas of research and applications by providing fundamental understanding and predictions. It could also play a particular role in commercialization by guiding new developments and by allowing quick evaluation of prospective research projects. We employ molecular theory of solvation which allows us to accurately introduce the effect of the environment on complex nano-, macro-, and biomolecular systems. The uniqueness of this method is that it can be naturally coupled with the whole range of computational chemistry approaches, including QM, MM, and coarse graining.

  15. Using qualitative methods to inform the trade-off between content validity and consistency in utility assessment: the example of type 2 diabetes and Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Gargon Elizabeth

    2010-02-01

    Background: Key stakeholders regard generic utility instruments as suitable tools to inform health technology assessment decision-making regarding allocation of resources across competing interventions. These instruments require a 'descriptor', a 'valuation' and a 'perspective' of the economic evaluation. There are various approaches that can be taken for each of these, offering a potential lack of consistency between instruments (a basic requirement for comparisons across diseases). The 'reference method' has been proposed as a way to address the limitations of the Quality-Adjusted Life Year (QALY). However, the degree to which generic measures can assess patients' specific experiences with their disease would remain unresolved. This has been neglected in the discussions on methods development, and its impact on the QALY values obtained and the resulting cost per QALY estimate underestimated. This study explored the content of utility instruments relevant to type 2 diabetes and Alzheimer's disease (AD) as examples, and the role of qualitative research in informing the trade-off between content coverage and consistency. Method: A literature review was performed to identify qualitative and quantitative studies regarding patients' experiences with type 2 diabetes or AD, and associated treatments. Conceptual models for each indication were developed. Generic and disease-specific instruments were mapped to the conceptual models. Results: Findings showed that published descriptions of relevant concepts important to patients with type 2 diabetes or AD are available for consideration in deciding on the most comprehensive approach to utility assessment. While the 15-dimensional health-related quality of life measure (15D) seemed the most comprehensive measure for both diseases, the Health Utilities Index 3 (HUI 3) seemed to have the least coverage for type 2 diabetes and the EuroQol-5 Dimensions (EQ-5D) for AD. Furthermore, some of the utility instruments

  16. Study of nuclear reactions with the Skyrme interaction; static properties by the self-consistent method; dynamic properties by the generator-coordinate method

    International Nuclear Information System (INIS)

    Flocard, Hubert.

    1975-01-01

    Using the same effective interaction, depending only on 6 parameters, a large number of nuclear properties are calculated and the results are compared with experiment. Total binding energies of all nuclei of the chart are reproduced within 5 MeV. It is shown that the remaining discrepancy is coherent with the increase of total binding energy that can be expected from the further inclusion of collective motion correlations. The monopole, quadrupole and hexadecupole parts of the charge densities are also reproduced with good accuracy. The deformation energy curves of many nuclei ranging from carbon to superheavy elements are calculated, and the different features of these curves are discussed. It should be noted that the fission barriers of actinide nuclei have been obtained and the results exhibit the well-known two-bump shape. In addition, the fusion energy curve of two ¹⁶O nuclei merging into one ³²S nucleus has been completed. Results concerning monopole, dipole and quadrupole giant resonances of light nuclei obtained within the frame of the generator coordinate method are also presented. The calculated positions of these resonances agree well with presently available data [fr

  17. METHODS FOR SPEECH CONTACT ANALYSIS: THE CASE OF THE ‘UBIQUITY OF RHETORIC’. PROOFING THE CONCEPTUAL CONSISTENCY OF SPEECH AS LINGUISTIC MACRO-SETTING

    Directory of Open Access Journals (Sweden)

    Dr. Fee-Alexandra HAASE

    2012-11-01

    Full Text Available In this article we will apply a method of proof for conceptual consistency in a long historical range taking the example of rhetoric and persuasion. We will analyze the evidentially present linguistic features of this concept within three linguistic areas: The Indo-European languages, the Semitic languages, and the Afro-Asiatic languages. We have chosen the case of the concept ‘rhetoric’ / ’persuasion’ as a paradigm for this study. With the phenomenon of ‘linguistic dispersion’ we can explain the development of language as undirected, but with linguistic consistency across the borders of language families. We will prove that the Semitic and Indo-European languages are related. As a consequence, the strict differentiation between the Semitic and the Indo-European language families is outdated following the research positions of Starostin. In contrast to this, we will propose a theory of cultural exchange between the two language families.

  18. The Limits to Giving Back

    Directory of Open Access Journals (Sweden)

    Jade S. Sasser

    2014-07-01

    In this thematic section, authors consider the limitations on giving back that they faced in field research, or saw others face. For some authors, their attempts at giving back were severely limited by the scope of their projects, or their understandings of local cultures or histories. For others, very specific circumstances and historical interventions of foreigners in certain places can limit how and to what extent a researcher is able to have a reciprocal relationship with the participating community. Some authors, by virtue of their lesser positions of power relative to those that they were studying, simply decided not to give back to those communities. In each article it becomes apparent that how and in what ways people give back is unique (and limited) both to their personal values and the contexts in which they do research.

  19. A self-consistent MoD-QM/MM structural refinement method: characterization of hydrogen bonding in the Oxytricha nova G-quadruplex

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Enrique R [Los Alamos National Laboratory; Newcomer, Michael B [YALE UNIV; Raggin, Christina M [YALE UNIV; Gascon, Jose A [YALE UNIV; Loria, J Patrick [YALE UNIV; Batista, Victor S [YALE UNIV

    2008-01-01

    This paper generalizes the MoD-QM/MM hybrid method, developed for ab initio computations of protein electrostatic potentials [Gascón, J.A.; Leung, S.S.F.; Batista, E.R.; Batista, V.S. J. Chem. Theory Comput. 2006, 2, 175-186], as a practical algorithm for structural refinement of extended systems. The computational protocol involves a space-domain decomposition scheme for the formal fragmentation of extended systems into smaller, partially overlapping, molecular domains and the iterative self-consistent energy minimization of the constituent domains by relaxation of their geometry and electronic structure. The method accounts for mutual polarization of the molecular domains, modeled as Quantum-Mechanical (QM) layers embedded in the otherwise classical Molecular-Mechanics (MM) environment according to QM/MM hybrid methods. The method is applied to the description of benchmark model systems that allow for direct comparisons with full QM calculations, and subsequently applied to the structural characterization of the DNA Oxytricha nova Guanine quadruplex (G4). The resulting MoD-QM/MM structural model of the DNA G4 is compared to recently reported high-resolution X-ray diffraction and NMR models, and partially validated by direct comparisons between {sup 1}H NMR chemical shifts, which are highly sensitive to hydrogen-bonding and stacking interactions, and the corresponding theoretical values obtained at the density functional theory (DFT) QM/MM (BH&H/6-31G*:Amber) level in conjunction with the gauge independent atomic orbital (GIAO) method for the ab initio self-consistent field (SCF) calculation of NMR chemical shifts.
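
    The iterative space-domain decomposition described above can be sketched compactly. The Python fragment below only illustrates the self-consistent loop in which each molecular domain is treated as a QM layer embedded in the charges of all other domains until mutual polarization stops changing; the helper functions (initial_charges, qm_region_charges) are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch of a MoD-QM/MM-style self-consistent polarization loop.
# initial_charges() and qm_region_charges() are hypothetical stand-ins for a
# real QM/MM engine (e.g. ESP-fitted charges of one domain in an external field).

def mod_qmmm_polarization(domains, initial_charges, qm_region_charges,
                          max_cycles=50, tol=1e-4):
    """Relax each domain in the point-charge field of the others until the
    embedding charges are self-consistent (mutual polarization converged)."""
    charges = {d: initial_charges(d) for d in domains}
    for _ in range(max_cycles):
        max_shift = 0.0
        for d in domains:
            # Environment = point charges of every *other* domain.
            env = [q for other in domains if other is not d
                   for q in charges[other]]
            new_q = qm_region_charges(d, env)
            max_shift = max(max_shift,
                            max(abs(a - b) for a, b in zip(new_q, charges[d])))
            charges[d] = new_q
        if max_shift < tol:   # charges stopped changing between cycles
            break
    return charges
```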

  20. Microemulsion Electrokinetic Chromatography in Combination with Chemometric Methods to Evaluate the Holistic Quality Consistency and Predict the Antioxidant Activity of Ixeris sonchifolia (Bunge) Hance Injection.

    Directory of Open Access Journals (Sweden)

    Lanping Yang

    Full Text Available In this paper, microemulsion electrokinetic chromatography (MEEKC) fingerprints combined with quantification were successfully developed to monitor the holistic quality consistency of Ixeris sonchifolia (Bge.) Hance Injection (ISHI). ISHI is a Chinese traditional patent medicine used for its anti-inflammatory and hemostatic effects. The effects of five crucial experimental variables on MEEKC were optimized by the central composite design. Under the optimized conditions, the MEEKC fingerprints of 28 ISHIs were developed. Quantitative determination of seven marker compounds was employed simultaneously, then 28 batches of samples from two manufacturers were clearly divided into two clusters by the principal component analysis. In fingerprint assessments, a systematic quantitative fingerprint method was established for the holistic quality consistency evaluation of ISHI from qualitative and quantitative perspectives, by which the qualities of 28 samples were well differentiated. In addition, the fingerprint-efficacy relationship between the fingerprints and the antioxidant activities was established utilizing orthogonal projection to latent structures, which provided important medicinal efficacy information for quality control. The present study offered a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations.
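
    As an illustration of the chemometric step mentioned above (principal component analysis separating the 28 batches), the sketch below autoscales a hypothetical batches-by-peaks matrix of MEEKC fingerprint areas and projects it onto two principal components. The matrix dimensions and the random stand-in data are assumptions for illustration only; the original study may have used different software and preprocessing.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical fingerprint matrix: 28 batches x 20 integrated peak areas.
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.3, size=(28, 20))   # stand-in data

X_scaled = StandardScaler().fit_transform(X)             # autoscale each peak
scores = PCA(n_components=2).fit_transform(X_scaled)     # PC1/PC2 scores

# Batches from different manufacturers would appear as separate clusters
# in a scatter plot of scores[:, 0] against scores[:, 1].
print(scores[:3])
```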

  1. Modal Bin Hybrid Model: A surface area consistent, triple-moment sectional method for use in process-oriented modeling of atmospheric aerosols

    Science.gov (United States)

    Kajino, Mizuo; Easter, Richard C.; Ghan, Steven J.

    2013-09-01

    A triple-moment sectional (TMS) aerosol dynamics model, the Modal Bin Hybrid Model (MBHM), has been developed. In addition to number and mass (volume), surface area is predicted (and preserved), which is important for aerosol processes and properties such as gas-to-particle mass transfer, heterogeneous reaction, and light extinction cross section. The performance of MBHM was evaluated against double-moment sectional (DMS) models with coarse (BIN4) to very fine (BIN256) size resolutions for simulating evolution of particles under simultaneously occurring nucleation, condensation, and coagulation processes (BINx resolution uses x sections to cover the 1 nm to 1 µm size range). Because MBHM gives a physically consistent form of the intrasectional distributions, errors and biases of MBHM at BIN4-8 resolution were almost equivalent to those of DMS at BIN16-32 resolution for various important variables such as the moments Mk (k: 0, 2, 3), dMk/dt, and the number and volume of particles larger than a certain diameter. Another important feature of MBHM is that only a single bin is adequate to simulate full aerosol dynamics for particles whose size distribution can be approximated by a single lognormal mode. This flexibility is useful for process-oriented (multicategory and/or mixing state) modeling: Primary aerosols whose size parameters would not differ substantially in time and space can be expressed by a single or a small number of modes, whereas secondary aerosols whose size changes drastically from 1 to several hundred nanometers can be expressed by a number of modes. Added dimensions can be applied to MBHM to represent mixing state or photochemical age for aerosol mixing state studies.
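
    For orientation, the moments preserved by a triple-moment scheme can be written down for a single lognormal mode: the k-th diameter moment is M_k = N·D_g^k·exp(k²·ln²σ_g/2), with M0 the number, M2 proportional to the surface area and M3 proportional to the volume. The short sketch below evaluates these three moments for illustrative (made-up) mode parameters; it is not the MBHM code itself.

```python
import numpy as np

def lognormal_moment(k, number, d_g, sigma_g):
    """k-th diameter moment of a lognormal number size distribution:
    M_k = N * D_g**k * exp(0.5 * k**2 * ln(sigma_g)**2)."""
    return number * d_g**k * np.exp(0.5 * k**2 * np.log(sigma_g)**2)

# Illustrative mode parameters (not from the paper)
N, Dg, sg = 1.0e3, 0.1, 1.6        # number per cm3, median diameter in um

M0 = lognormal_moment(0, N, Dg, sg)           # number concentration
M2 = lognormal_moment(2, N, Dg, sg)           # surface area = pi * M2
M3 = lognormal_moment(3, N, Dg, sg)           # volume = (pi / 6) * M3

print(M0, np.pi * M2, np.pi / 6 * M3)
```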

  2. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  3. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods.

    Science.gov (United States)

    Trijsburg, Laura; Geelen, Anouk; Hollman, Peter Ch; Hulshof, Paul Jm; Feskens, Edith Jm; Van't Veer, Pieter; Boshuizen, Hendriek C; de Vries, Jeanne Hm

    2017-03-01

    As misreporting, mostly under-reporting, of dietary intake is a generally known problem in nutritional research, we aimed to analyse the association between selected determinants and the extent of misreporting by the duplicate portion method (DP), 24 h recall (24hR) and FFQ by linear regression analysis using the biomarker values as unbiased estimates. For each individual, two DP, two 24hR, two FFQ and two 24 h urinary biomarkers were collected within 1·5 years. Also, for sixty-nine individuals one or two doubly labelled water measurements were obtained. The associations of basic determinants (BMI, gender, age and level of education) with misreporting of energy, protein and K intake of the DP, 24hR and FFQ were evaluated using linear regression analysis. Additionally, associations between other determinants, such as physical activity and smoking habits, and misreporting were investigated. The Netherlands. One hundred and ninety-seven individuals aged 20-70 years. Higher BMI was associated with under-reporting of dietary intake assessed by the different dietary assessment methods for energy, protein and K, except for K by DP. Men tended to under-report protein by the DP, FFQ and 24hR, and persons of older age under-reported K but only by the 24hR and FFQ. When adjusted for the basic determinants, the other determinants did not show a consistent association with misreporting of energy or nutrients and by the different dietary assessment methods. As BMI was the only consistent determinant of misreporting, we conclude that BMI should always be taken into account when assessing and correcting dietary intake.
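
    The core analysis described above is an ordinary linear regression of misreporting (reported intake minus the biomarker-based estimate) on BMI, gender, age and education. The sketch below shows this with statsmodels on a fabricated data frame; the column names and simulated values are assumptions for illustration, not the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 197
# Fabricated stand-in data: biomarker-based and self-reported protein intake (g/d)
# plus some of the basic determinants considered in the paper.
df = pd.DataFrame({
    "protein_reported":  rng.normal(70, 15, n),
    "protein_biomarker": rng.normal(75, 15, n),
    "bmi":               rng.normal(26, 4, n),
    "male":              rng.integers(0, 2, n),
    "age":               rng.uniform(20, 70, n),
})

# Misreporting: reported minus biomarker-based intake (negative = under-reporting).
df["misreport"] = df["protein_reported"] - df["protein_biomarker"]

X = sm.add_constant(df[["bmi", "male", "age"]])
fit = sm.OLS(df["misreport"], X).fit()
print(fit.params)    # a negative BMI coefficient would indicate more
                     # under-reporting at higher BMI
```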

  4. The New Planned Giving Officer.

    Science.gov (United States)

    Jordan, Ronald R.; Quynn, Katelyn L.

    1994-01-01

    A planned giving officer is seen as an asset to college/university development for technical expertise, credibility, and connections. Attorneys, certified public accountants, bank trust officers, financial planners, investment advisers, life insurance agents, and real estate brokers may be qualified but probably also need training. (MSE)

  5. (Micro)Financing to Give

    DEFF Research Database (Denmark)

    Bajde, Domen

    2013-01-01

    and workings of microfinance. We illustrate how market-like elements are productively and problematically deployed in philanthropic giving and address the need to consider a broader range of socio-material relations involved in the framing of transactions. A complex network of actors and (trans)actions needs...

  6. Vibrational frequency scaling factors for correlation consistent basis sets and the methods CC2 and MP2 and their spin-scaled SCS and SOS variants

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Daniel H., E-mail: daniel.h.friese@uit.no [Centre for Theoretical and Computational Chemistry CTCC, Department of Chemistry, University of Tromsø, N-9037 Tromsø (Norway); Törk, Lisa; Hättig, Christof, E-mail: christof.haettig@rub.de [Lehrstuhl für Theoretische Chemie, Ruhr-Universität Bochum, D-44801 Bochum (Germany)

    2014-11-21

    We present scaling factors for vibrational frequencies calculated within the harmonic approximation and the correlated wave-function methods coupled cluster singles and doubles model (CC2) and Møller-Plesset perturbation theory (MP2), with and without spin-component scaling (SCS) or spin-opposite scaling (SOS). Frequency scaling factors and the remaining deviations from the reference data are evaluated for several non-augmented basis sets of the cc-pVXZ family of generally contracted correlation-consistent basis sets as well as for the segmented contracted TZVPP basis. We find that the SCS and SOS variants of CC2 and MP2 lead to a slightly better accuracy for the scaled vibrational frequencies. The determined frequency scaling factors can also be used for vibrational frequencies calculated for excited states through response theory with CC2 and the algebraic diagrammatic construction through second order and their spin-component scaled variants.
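
    A common way to obtain such scaling factors (and the one sketched below; the paper's exact fitting procedure may differ) is the least-squares factor λ that minimizes Σ_i (λω_i − ν_i)² over all calculated harmonic frequencies ω_i and reference fundamentals ν_i, giving λ = Σω_iν_i / Σω_i². The frequencies used below are invented for illustration.

```python
import numpy as np

def scaling_factor(calc, expt):
    """Least-squares frequency scaling factor minimizing
    sum_i (lam * omega_i - nu_i)**2  ->  lam = sum(omega*nu) / sum(omega**2)."""
    calc = np.asarray(calc, dtype=float)   # calculated harmonic frequencies (cm^-1)
    expt = np.asarray(expt, dtype=float)   # reference fundamentals (cm^-1)
    return np.sum(calc * expt) / np.sum(calc**2)

# Illustrative (made-up) values
calc = [1650.0, 3050.0, 1200.0]
expt = [1600.0, 2950.0, 1170.0]
lam = scaling_factor(calc, expt)
rms = np.sqrt(np.mean((lam * np.asarray(calc) - np.asarray(expt)) ** 2))
print(lam, rms)   # scaling factor and remaining RMS deviation after scaling
```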

  7. A reduced-scaling density matrix-based method for the computation of the vibrational Hessian matrix at the self-consistent field level

    International Nuclear Information System (INIS)

    Kussmann, Jörg; Luenser, Arne; Beer, Matthias; Ochsenfeld, Christian

    2015-01-01

    An analytical method to calculate the molecular vibrational Hessian matrix at the self-consistent field level is presented. By analysis of the multipole expansions of the relevant derivatives of Coulomb-type two-electron integral contractions, we show that the effect of the perturbation on the electronic structure due to the displacement of nuclei decays at least as r⁻² instead of r⁻¹. The perturbation is asymptotically local, and the computation of the Hessian matrix can, in principle, be performed with O(N) complexity. Our implementation exhibits linear scaling in all time-determining steps, with some rapid but quadratic-complexity steps remaining. Sample calculations illustrate linear or near-linear scaling in the construction of the complete nuclear Hessian matrix for sparse systems. For more demanding systems, scaling is still considerably sub-quadratic to quadratic, depending on the density of the underlying electronic structure.

  8. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    Science.gov (United States)

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

    In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na⁺, K⁺, and Ca²⁺ solutions based on nanosecond-scale sampling, a sampling 100 times longer than that of conventional QM-based samplings. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
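
    One of the dynamic properties mentioned, the diffusion coefficient, is conventionally obtained from an MD trajectory through the Einstein relation MSD(t) ≈ 6Dt. The sketch below fits the linear tail of a single-origin mean-square displacement; the random-walk trajectory stands in for real simulation output and the fitting window is an assumption.

```python
import numpy as np

def diffusion_coefficient(positions, dt, fit_start=0.5):
    """Estimate D from the Einstein relation MSD(t) ~ 6 D t.

    positions : (n_frames, 3) array of unwrapped ion coordinates
    dt        : time between stored frames
    fit_start : fraction of the trajectory after which the linear fit begins
    """
    disp = positions - positions[0]             # single time origin, for brevity
    msd = np.sum(disp**2, axis=1)
    t = np.arange(len(msd)) * dt
    i0 = int(fit_start * len(msd))
    slope, _ = np.polyfit(t[i0:], msd[i0:], 1)  # linear tail of the MSD
    return slope / 6.0

# Stand-in trajectory: a random walk instead of real MD output
rng = np.random.default_rng(1)
traj = np.cumsum(rng.normal(scale=0.05, size=(10000, 3)), axis=0)
print(diffusion_coefficient(traj, dt=0.001))
```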

  9. Enhanced microwave absorption properties of MnO{sub 2} hollow microspheres consisted of MnO{sub 2} nanoribbons synthesized by a facile hydrothermal method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yan; Han, Bingqian; Chen, Nan; Deng, Dongyang; Guan, Hongtao [Department of Materials Science and Engineering, Yunnan University, 650091, Kunming (China); Wang, Yude, E-mail: ydwang@ynu.edu.cn [Department of Materials Science and Engineering, Yunnan University, 650091, Kunming (China); Yunnan Province Key Lab of Micro-Nano Materials and Technology, Yunnan University, 650091, Kunming (China)

    2016-08-15

    MnO{sub 2} hollow microspheres consisting of nanoribbons were successfully fabricated via a facile hydrothermal method with SiO{sub 2} sphere templates. The crystal structure, morphology and microwave absorption properties in the X and Ku bands of the as-synthesized samples were characterized by powder X-ray diffraction (XRD), transmission electron microscopy (TEM) and a vector network analyzer. The results show that the three-dimensional (3D) hollow microspheres are assembled from ultrathin and narrow one-dimensional (1D) nanoribbons. A rational process for the formation of the hollow microspheres is proposed. The 3D MnO{sub 2} hollow microspheres possess better dielectric and magnetic properties than the 1D nanoribbons prepared by the same procedure in the absence of the SiO{sub 2} hard templates, which is closely related to their special nanostructure. The MnO{sub 2} microspheres also show much better microwave absorption properties in the X (8–12 GHz) and Ku (12–18 GHz) bands compared with the 1D MnO{sub 2} nanoribbons. A minimum reflection loss of −40 dB for the hollow microspheres can be observed at 14.2 GHz, and the bandwidth with reflection loss below −10 dB is 3.5 GHz at a thickness of only 4 mm. The possible mechanism for the enhanced microwave absorption properties is also discussed. - Graphical abstract: MnO{sub 2} hollow microspheres composed of nanoribbons show excellent microwave absorption properties in the X and Ku bands. - Highlights: • MnO{sub 2} hollow microspheres consisting of MnO{sub 2} nanoribbons were successfully prepared. • MnO{sub 2} hollow microspheres possess good microwave absorption performance. • The excellent microwave absorption properties are in the X and Ku microwave bands. • Electromagnetic impedance matching makes a major contribution to the absorption properties.
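
    Reflection loss values such as those quoted above are conventionally derived from the measured complex permittivity and permeability with the single-layer, metal-backed transmission-line model, RL(dB) = 20·log₁₀|(Z_in − 1)/(Z_in + 1)| with Z_in = √(μ_r/ε_r)·tanh(j·2πfd·√(μ_rε_r)/c). The sketch below evaluates this standard formula with made-up material parameters; it is not the authors' data processing.

```python
import numpy as np

def reflection_loss_db(freq_hz, thickness_m, eps_r, mu_r, c=2.998e8):
    """Reflection loss (dB) of a single metal-backed absorber layer from the
    standard transmission-line model (impedances normalized to free space)."""
    gamma = 1j * 2 * np.pi * freq_hz * thickness_m * np.sqrt(mu_r * eps_r) / c
    z_in = np.sqrt(mu_r / eps_r) * np.tanh(gamma)   # normalized input impedance
    return 20 * np.log10(abs((z_in - 1) / (z_in + 1)))

# Made-up complex permittivity/permeability at 14.2 GHz for a 4 mm thick layer
print(reflection_loss_db(14.2e9, 4e-3, eps_r=6.0 - 2.0j, mu_r=1.1 - 0.05j))
```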

  10. Hybrid method for consistent model of the Pacific absolute plate motion and a test for inter-hotspot motion since 70Ma

    Science.gov (United States)

    Harada, Y.; Wessel, P.; Sterling, A.; Kroenke, L.

    2002-12-01

    Inter-hotspot motion within the Pacific plate is one of the most controversial issues in recent geophysical studies. However, it is a fact that many geophysical and geological data including ages and positions of seamount chains in the Pacific plate can largely be explained by a simple model of absolute motion derived from assumptions of rigid plates and fixed hotspots. Therefore we take the stand that if a model of plate motion can explain the ages and positions of Pacific hotspot tracks, inter-hotspot motion would not be justified. On the other hand, if any discrepancies between the model and observations are found, the inter-hotspot motion may then be estimated from these discrepancies. To make an accurate model of the absolute motion of the Pacific plate, we combined two different approaches: the polygonal finite rotation method (PFRM) by Harada and Hamano (2000) and the hot-spotting technique developed by Wessel and Kroenke (1997). The PFRM can determine accurate positions of finite rotation poles for the Pacific plate if the present positions of hotspots are known. On the other hand, the hot-spotting technique can predict present positions of hotspots if the absolute plate motion is given. Therefore we can undertake iterative calculations using the two methods. This hybrid method enables us to determine accurate finite rotation poles for the Pacific plate solely from geometry of Hawaii, Louisville and Easter(Crough)-Line hotspot tracks from around 70 Ma to present. Information of ages can be independently assigned to the model after the poles and rotation angles are determined. We did not detect any inter-hotspot motion from the geometry of these Pacific hotspot tracks using this method. The Ar-Ar ages of Pacific seamounts including new age data of ODP Leg 197 are used to test the newly determined model of the Pacific plate motion. The ages of Hawaii, Louisville, Easter(Crough)-Line, and Cobb hotspot tracks are quite consistent with each other from 70 Ma to
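
    Structurally, the hybrid method described above is a fixed-point iteration that alternates the two sub-methods until the rotation poles and present-day hotspot positions stop changing. The skeleton below only illustrates that alternation; pfrm_fit_rotations, hotspotting_predict_hotspots and max_position_change are hypothetical placeholders for the polygonal finite rotation method, the hot-spotting technique and a convergence measure.

```python
# Skeleton of the iterative hybrid scheme (placeholders, not the authors' code).

def hybrid_plate_motion(seamount_chains, hotspots_guess,
                        pfrm_fit_rotations, hotspotting_predict_hotspots,
                        max_position_change, n_iter=20, tol_deg=1e-3):
    """Alternate between fitting finite rotations from assumed hotspot
    positions and re-predicting hotspot positions from the fitted motion."""
    hotspots = hotspots_guess
    rotations = None
    for _ in range(n_iter):
        # Step 1: given present hotspot positions, fit finite rotation poles.
        rotations = pfrm_fit_rotations(seamount_chains, hotspots)
        # Step 2: given the plate motion, re-predict present hotspot positions.
        new_hotspots = hotspotting_predict_hotspots(seamount_chains, rotations)
        if max_position_change(new_hotspots, hotspots) < tol_deg:
            break
        hotspots = new_hotspots
    return rotations, hotspots
```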

  11. Descriptive Qualitative Method of Evaluation from the Viewpoint of Math Teachers and Its Comparison with the Quantitative Evaluation (Giving scores) Method (A Case Study on the Primary Schools for Girls in Zone 1 of Tehran City)

    OpenAIRE

    Farnaz Ostad-Ali; Mohammad Hasan Behzadi; Ahmad Shahvarani

    2015-01-01

    In recent years, one of the most important developments which have been taking place in the primary school education system is the development of the qualitative-descriptive method of evaluating the students' achievements. The main goals of the qualitative-descriptive evaluation are improving the quality of learning and promoting the level of mental health in teaching-learning environments. Therefore, based on the raised hypothesis, the purpose of this study is to investigate the teachers...

  12. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Directory of Open Access Journals (Sweden)

    Nederhof Esther

    2012-07-01

    Conclusions: First, extensive recruitment effort at the first assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  13. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Science.gov (United States)

    2012-01-01

    recruitment effort at the first assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods. PMID:22747967

  14. Analysis Method of Transfer Pricing Used by Multinational Companies Related to Tax Avoidance and its Consistencies to the Arm's Length Principle

    Directory of Open Access Journals (Sweden)

    Nuraini Sari

    2015-12-01

    Full Text Available The purpose of this study is to evaluate how Starbucks Corporation uses transfer pricing to minimize its tax bill. In addition, the study also evaluates how Indonesia's domestic rules could handle the case if the Starbucks UK case happened in Indonesia. Three steps were conducted in this study. First, using information provided by UK Her Majesty's Revenue and Customs (HMRC) and other related articles, the methods used by Starbucks UK to minimize its tax bill were identified. Second, the Organisation for Economic Co-operation and Development (OECD) viewpoint regarding the Starbucks Corporation case was examined. Third, it was analyzed how Indonesia's transfer pricing rules would work if Starbucks UK's case happened in Indonesia. The results showed that there were three inter-company transactions that helped Starbucks UK minimize its tax bill: coffee costs, royalties on intangible property, and interest on inter-company loans. Through a study of the OECD's BEPS action plans, it is recommended to improve the OECD Model Tax Convention, including Indonesia's domestic tax rules, in order to produce fair and transparent judgments on transfer pricing. This study concluded that, under current tax rules, although UK HMRC has been disadvantaged by the transfer pricing practices of many multinational companies, it still cannot prove that these practices are inconsistent with the arm's length principle. Therefore, current international tax rules need to be improved.

  15. Towards the harmonization between National Forest Inventory and Forest Condition Monitoring. Consistency of plot allocation and effect of tree selection methods on sample statistics in Italy.

    Science.gov (United States)

    Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco

    2013-07-01

    In the frame of a process aiming at harmonizing National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which has increased by 1,559,200 ha from 1985 to 2005. In the case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less so for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be impacted, while the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameter at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more impacted. Our results suggest that both a revision of the FCM network to account for forest area change and a plot-based approach to permit statistical inference and avoid bias in the tree sample

  16. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study.

    Science.gov (United States)

    Nederhof, Esther; Jörg, Frederike; Raven, Dennis; Veenstra, René; Verhulst, Frank C; Ormel, Johan; Oldehinkel, Albertine J

    2012-07-02

    assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  17. Theoretical modeling of large molecular systems. Advances in the local self consistent field method for mixed quantum mechanics/molecular mechanics calculations.

    Science.gov (United States)

    Monari, Antonio; Rivail, Jean-Louis; Assfeld, Xavier

    2013-02-19

    Molecular mechanics methods can efficiently compute the macroscopic properties of a large molecular system but cannot represent the electronic changes that occur during a chemical reaction or an electronic transition. Quantum mechanical methods can accurately simulate these processes, but they require considerably greater computational resources. Because electronic changes typically occur in a limited part of the system, such as the solute in a molecular solution or the substrate within the active site of enzymatic reactions, researchers can limit the quantum computation to this part of the system. Researchers take into account the influence of the surroundings by embedding this quantum computation into a calculation of the whole system described at the molecular mechanical level, a strategy known as the mixed quantum mechanics/molecular mechanics (QM/MM) approach. The accuracy of this embedding varies according to the types of interactions included, whether they are purely mechanical or classically electrostatic. This embedding can also introduce the induced polarization of the surroundings. The difficulty in QM/MM calculations comes from the splitting of the system into two parts, which requires severing the chemical bonds that link the quantum mechanical subsystem to the classical subsystem. Typically, researchers replace the quantoclassical atoms, those at the boundary between the subsystems, with a monovalent link atom. For example, researchers might add a hydrogen atom when a C-C bond is cut. This Account describes another approach, the Local Self Consistent Field (LSCF), which was developed in our laboratory. LSCF links the quantum mechanical portion of the molecule to the classical portion using a strictly localized bond orbital extracted from a small model molecule for each bond. In this scenario, the quantoclassical atom has an apparent nuclear charge of +1. To achieve correct bond lengths and force constants, we must take into account the inner shell of

  18. Narcissism in women giving birth

    Directory of Open Access Journals (Sweden)

    Mojca Slana

    2010-07-01

    Full Text Available The aim of this study was to determine whether there are any differences, and if so which, in narcissism in the puerperium between first-time mothers, third-time mothers and women who have never been pregnant. There were 170 women participants, divided into three groups. The first group consisted of 71 first-time mothers and the second group of 21 third-time mothers. In the third, comparative, group there were 78 women who have never been pregnant. The participants completed the questionnaire Narzißmusinventar (Deneke and Hilgenstock, 1989). The results showed no significant differences between the two groups of mothers. Significant differences in narcissism were present mainly between first-time mothers and the comparative group. These differences suggest lower narcissism in the group of first-time mothers.

  19. To give or not to give, that's the question: How methodology is destiny in Dutch giving data

    NARCIS (Netherlands)

    Bekkers, R.H.F.P.; Wiepking, P.

    2006-01-01

    In research on giving, methodology is destiny. The volume of donations estimated from sample surveys strongly depends on the length of the questionnaire used to measure giving. By comparing two giving surveys from the Netherlands, the authors show that a short questionnaire on giving not only

  20. To Give or Not to Give, That Is the Question : How Methodology Is Destiny in Dutch Giving Data

    NARCIS (Netherlands)

    Bekkers, René; Wiepking, Pamala

    2006-01-01

    In research on giving, methodology is destiny. The volume of donations estimated from sample surveys strongly depends on the length of the questionnaire used to measure giving. By comparing two giving surveys from the Netherlands, the authors show that a short questionnaire on giving not only

  1. A self-consistent theory of the magnetic polaron

    International Nuclear Information System (INIS)

    Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.

    1984-10-01

    A finite temperature self-consistent theory of the magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on a novel approach to the thermodynamic two-time Green function methods. This approach consists in the introduction of ''irreducible'' Green functions (IGF) and the derivation of the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for calculating the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)

  2. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One-to-three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we are evaluating only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results. There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously-encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
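
    The two tests named above are available in SciPy; the sketch below builds an illustrative recognized-versus-consistent contingency table (the counts are invented, not the study's data) and applies Fisher's exact test, plus a Kruskal-Wallis comparison of made-up per-reader consistency rates.

```python
from scipy.stats import fisher_exact, kruskal

# Invented 2x2 table: rows = image recognized (yes/no),
# columns = response consistent (yes/no), pooled over readers.
table = [[120, 60],
         [140, 60]]
print(fisher_exact(table))          # odds ratio and p-value

# Invented per-reader consistency rates split by how often images were recognized.
low_recognition  = [0.70, 0.75, 0.72, 0.80, 0.78]
high_recognition = [0.68, 0.74, 0.71, 0.77, 0.73]
print(kruskal(low_recognition, high_recognition))
```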

  3. Two-dimensional tracking and TDI are consistent methods for evaluating myocardial longitudinal peak strain in left and right ventricle basal segments in athletes

    OpenAIRE

    Stefani, L.; Toncelli, L.; Gianassi, M.; Manetti, P.; Di Tante, V.; Vono, M.R.; Moretti, A.; Cappelli, B.; Pedrizzetti, G.; Galanti, G.

    2007-01-01

    Abstract Background: Myocardial contractility can be investigated using longitudinal peak strain. It can be calculated using the Doppler-derived TDI method and the non-Doppler method based on tissue tracking on B-mode images. Both are validated and show good reproducibility, but no comparative analysis of their results has yet been conducted. This study analyzes the results obtained from the basal segments of the ventricular chambers in a group of athletes. Methods: 30 regularly-trained athlete...

  4. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  5. Is This Year's Exam as Demanding as Last Year's? Using a Pilot Method to Evaluate the Consistency of Examination Demands over Time

    Science.gov (United States)

    Crisp, Victoria; Novakovic, Nadezda

    2009-01-01

    Maintaining standards over time is a much debated topic in the context of national examinations in the UK. This study used a pilot method to compare the demands, over time, of two examination units testing administration. The method involved 15 experts revising a framework of demand types and making paired comparisons of examinations from…

  6. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  7. Giving the Customer a Voice

    DEFF Research Database (Denmark)

    Van der Hoven, Christopher; Michea, Adela; Varnes, Claus

    , for example there are studies that have strongly criticized focus groups, interviews and surveys (e.g. Ulwick, 2002; Goffin et al, 2010; Sandberg, 2002). In particular, a point is made that, “…traditional market research and development approaches proved to be particularly ill-suited to breakthrough products...... the voice of the customer (VoC) through market research is well documented (Davis, 1993; Mullins and Sutherland, 1998; Cooper et al., 2002; Flint, 2002; Davilla et al., 2006; Cooper and Edgett, 2008; Cooper and Dreher, 2010; Goffin and Mitchell, 2010). However, not all research methods are well received......” (Deszca et al, 2010, p613). Therefore, in situations where traditional techniques - interviews and focus groups - are ineffective, the question is which market research techniques are appropriate, particularly for developing breakthrough products? To investigate this, an attempt was made to access...

  8. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  9. Preparation of CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes via a solvothermal method

    International Nuclear Information System (INIS)

    Liang Xiaojuan; Zhong Jiasong; Yang Fan; Hua Wei; Jin Huaidong; Liu Haitao; Sun Juncai; Xiang Weidong

    2011-01-01

    Highlights: → We report for the first time a small biomolecule-assisted route using L-cysteine as sulfur source and complexing agent to synthesize CuIn0.5Ga0.5S2 crystals. → The possible mechanisms leading to CuIn0.5Ga0.5S2 flowers consisting of nanoflakes are proposed. → In addition, the morphology, structure, and phase composition of the as-prepared CuIn0.5Ga0.5S2 products were investigated in detail by XRD, FESEM, EDS, XPS, TEM (HRTEM) and SAED. - Abstract: CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes were successfully prepared by a biomolecule-assisted solvothermal route at 220 °C for 10 h, employing copper chloride, gallium chloride, indium chloride and L-cysteine as precursors. The biomolecule L-cysteine, acting as sulfur source, was found to play a very important role in the formation of the final product. The diameter of the CuIn0.5Ga0.5S2 flowers was 1-2 μm, and the thickness of the flakes was about 15 nm. The obtained products were characterized by X-ray diffraction (XRD), energy dispersion spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS), field-emission scanning electron microscopy (FESEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), selected area electron diffraction spectroscopy (SAED), and UV-vis absorption spectroscopy. The influences of the reaction temperature, reaction time, sulfur source and the molar ratio of Cu-to-L-cysteine (reactants) on the formation of the target compound were investigated. The formation mechanism of the CuIn0.5Ga0.5S2 flowers consisting of flakes was discussed.

  10. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.

  11. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)

  12. Systematic homogenization and self-consistent flux and pin power reconstruction for nodal diffusion methods. 1: Diffusion equation-based theory

    International Nuclear Information System (INIS)

    Zhang, H.; Rizwan-uddin; Dorning, J.J.

    1995-01-01

    A diffusion equation-based systematic homogenization theory and a self-consistent dehomogenization theory for fuel assemblies have been developed for use with coarse-mesh nodal diffusion calculations of light water reactors. The theoretical development is based on a multiple-scales asymptotic expansion carried out through second order in a small parameter, the ratio of the average diffusion length to the reactor characteristic dimension. By starting from the neutron diffusion equation for a three-dimensional heterogeneous medium and introducing two spatial scales, the development systematically yields an assembly-homogenized global diffusion equation with self-consistent expressions for the assembly-homogenized diffusion tensor elements and cross sections and assembly-surface-flux discontinuity factors. The reactor eigenvalue 1/k_eff is shown to be obtained to second order in the small parameter, and the heterogeneous diffusion theory flux is shown to be obtained to leading order in that parameter. The latter of these two results provides a natural procedure for the reconstruction of the local fluxes and the determination of pin powers, even though homogenized assemblies are used in the global nodal diffusion calculation.

  13. Preparation of CuIn{sub 1-x}Ga{sub x}S{sub 2} (x = 0.5) flowers consisting of nanoflakes via a solvothermal method

    Energy Technology Data Exchange (ETDEWEB)

    Liang Xiaojuan [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Institute of Materials and Technology, Dalian Maritime University, Dalian 116026 (China); Zhong Jiasong; Yang Fan; Hua Wei; Jin Huaidong [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Liu Haitao, E-mail: lht@wzu.edu.cn [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Sun Juncai [Institute of Materials and Technology, Dalian Maritime University, Dalian 116026 (China); Xiang Weidong, E-mail: weidongxiang@yahoo.com.cn [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China)

    2011-05-26

    Highlights: > We report for the first time a small biomolecule-assisted route using L-cysteine as sulfur source and complexing agent to synthesis CuIn{sub 0.5}Ga{sub 0.5}S{sub 2} crystals. > The possible mechanisms leading to CuIn{sub 0.5}Ga{sub 0.5}S{sub 2} flowers consisting of nanoflakes were proposed. > In addition, the morphology, structure, and phase composition of the as-prepared CuIn{sub 0.5}Ga{sub 0.5}S{sub 2} products were investigated in detail by XRD, FESEM, EDS, XPS, TEM (HRTEM) and SAED. - Abstract: CuIn{sub 1-x}Ga{sub x}S{sub 2} (x = 0.5) flowers consisting of nanoflakes were successfully prepared by a biomolecule-assisted solvothermal route at 220 deg. C for 10 h, employing copper chloride, gallium chloride, indium chloride and L-cysteine as precursors. The biomolecule L-cysteine acting as sulfur source was found to play a very important role in the formation of the final product. The diameter of the CuIn{sub 0.5}Ga{sub 0.5}S{sub 2} flowers was 1-2 {mu}m, and the thickness of the flakes was about 15 nm. The obtained products were characterized by X-ray diffraction (XRD), energy dispersion spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS), field-emission scanning electron microscopy (FESEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), selected area electron diffraction spectroscopy (SAED), and UV-vis absorption spectroscopy. The influences of the reaction temperature, reaction time, sulfur source and the molar ratio of Cu-to-L-cysteine (reactants) on the formation of the target compound were investigated. The formation mechanism of the CuIn{sub 0.5}Ga{sub 0.5}S{sub 2} flowers consisting of flakes was discussed.

  14. Self-consistent method for quantifying indium content from X-ray spectra of thick compound semiconductor specimens in a transmission electron microscope.

    Science.gov (United States)

    Walther, T; Wang, X

    2016-05-01

    Based on Monte Carlo simulations of X-ray generation by fast electrons, we calculate curves of effective sensitivity factors for analytical transmission electron microscopy-based energy-dispersive X-ray spectroscopy, including absorption and fluorescence effects, as a function of the Ga K/L ratio for different indium- and gallium-containing compound semiconductors. For the case of InGaN alloy thin films we show that experimental spectra can thus be quantified without the need to measure specimen thickness or density, yielding self-consistent values for quantification with the Ga K and Ga L lines. The effect of uncertainties in the detector efficiency is also shown to be reduced. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
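
    A Cliff-Lorimer style reading of the procedure is sketched below: the measured Ga K/L intensity ratio (which encodes absorption, and hence thickness) selects an effective sensitivity factor from the simulated curves, and the indium fraction is then computed independently from the Ga K and Ga L lines as a self-consistency check. The curve values, intensities and the In_xGa_(1-x)N composition formula used here are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

# Hypothetical effective sensitivity-factor curves versus the Ga K/L ratio
# (values invented for illustration; the real curves come from Monte Carlo).
ga_kl_grid = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
k_inl_gak  = np.array([1.10, 1.05, 1.02, 1.00, 0.99])   # In L relative to Ga K
k_inl_gal  = np.array([0.80, 0.90, 0.96, 1.00, 1.02])   # In L relative to Ga L

def indium_fraction(i_in, i_ga, k_eff):
    """Cliff-Lorimer style ratio for In_x Ga_(1-x) N:
    x / (1 - x) = k_eff * I_In / I_Ga, solved for x."""
    r = k_eff * i_in / i_ga
    return r / (1.0 + r)

# Invented measured intensities
i_in_l, i_ga_k, i_ga_l = 300.0, 900.0, 200.0
kl_ratio = i_ga_k / i_ga_l      # thickness-dependent observable

x_from_k = indium_fraction(i_in_l, i_ga_k, np.interp(kl_ratio, ga_kl_grid, k_inl_gak))
x_from_l = indium_fraction(i_in_l, i_ga_l, np.interp(kl_ratio, ga_kl_grid, k_inl_gal))
print(x_from_k, x_from_l)       # the two estimates should agree if self-consistent
```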

  15. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods

    NARCIS (Netherlands)

    Trijsburg, L.E.; Geelen, M.M.E.E.; Hollman, P.C.H.; Hulshof, P.J.M.; Feskens, E.J.M.; Veer, van 't P.; Boshuizen, H.C.; Vries, de J.H.M.

    2017-01-01


    As misreporting, mostly under-reporting, of dietary intake is a generally known problem in nutritional research, we aimed to analyse the association between selected determinants and the extent of misreporting by the duplicate portion method (DP), 24 h recall (24hR) and FFQ by linear regression

  16. A consistent and efficient graphical analysis method to improve the quantification of reversible tracer binding in radioligand receptor dynamic PET studies

    OpenAIRE

    Zhou, Yun; Ye, Weiguo; Brašić, James R.; Crabb, Andrew H.; Hilton, John; Wong, Dean F.

    2008-01-01

    The widely used Logan plot in radioligand receptor dynamic PET studies produces marked noise-induced negative biases in the estimates of total distribution volume (DVT) and binding potential (BP). To avoid the inconsistencies in the estimates from the Logan plot, a new graphical analysis method was proposed and characterized in this study. The new plot with plasma input and with reference tissue input was first derived to estimate DVT and BP. A condition was provided to ensure that the estima...
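
    For context, the conventional Logan plot that the new method aims to improve regresses ∫₀ᵗC_T dt'/C_T(t) on ∫₀ᵗC_p dt'/C_T(t); once the plot becomes linear, its slope estimates the total distribution volume (and, with noisy data, is biased downward). The sketch below applies that conventional plot to synthetic, noise-free one-tissue-compartment data; all numbers are illustrative and this is not the paper's new plot.

```python
import numpy as np

def logan_dvt(t, c_tissue, c_plasma, t_star=30.0):
    """Slope of the conventional Logan plot,
    int_0^t C_T dt' / C_T  versus  int_0^t C_p dt' / C_T,  for t > t_star."""
    int_ct = np.array([np.trapz(c_tissue[:i + 1], t[:i + 1]) for i in range(len(t))])
    int_cp = np.array([np.trapz(c_plasma[:i + 1], t[:i + 1]) for i in range(len(t))])
    y, x = int_ct / c_tissue, int_cp / c_tissue
    mask = t > t_star
    slope, _ = np.polyfit(x[mask], y[mask], 1)
    return slope                      # estimate of the total distribution volume

# Synthetic noise-free one-tissue-compartment data (illustrative only)
t = np.linspace(0.5, 90.0, 180)                   # minutes
dt = t[1] - t[0]
c_plasma = np.exp(-0.08 * t)                      # made-up input function
K1, k2 = 0.1, 0.05
c_tissue = K1 * np.convolve(c_plasma, np.exp(-k2 * t))[:len(t)] * dt
print(logan_dvt(t, c_tissue, c_plasma))           # should be close to K1 / k2 = 2
```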

  17. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the vivial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI)

  18. Analysis Method of Transfer Pricing Used by Multinational Companies Related to Tax Avoidance and Its Consistencies to the Arm's Length Principle

    OpenAIRE

    Sari, Nuraini; Hunar, Ririn Susanti

    2015-01-01

    The purpose of this study is to evaluate about how Starbucks Corporation uses transfer pricing to minimize the tax bill. In addition, the study also will evaluate how Indonesia’s domestic rules can overcome the case if Starbucks UK case happens in Indonesia. There are three steps conducted in this study. First, using information provided by UK Her Majesty's Revenue and Customs (HMRC) and other related articles, find methods used by Starbucks UK to minimize the tax bill. Second, find Organisat...

  19. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys

  20. Data on consistency among different methods to assess atherosclerotic plaque echogenicity on standard ultrasound and intraplaque neovascularization on contrast-enhanced ultrasound imaging in human carotid artery

    Directory of Open Access Journals (Sweden)

    Mattia Cattaneo

    2016-12-01

    Full Text Available Here we provide the correlation among different carotid ultrasound (US) variables used to assess echogenicity on standard carotid US and to assess intraplaque neovascularization (IPNV) on contrast-enhanced US (CEUS). We recruited 45 consecutive subjects with an asymptomatic ≥50% carotid artery stenosis. Carotid plaque echogenicity at standard US was visually graded according to the Gray–Weale classification (GW) and measured by the greyscale median (GSM), a semi-automated computerized measurement performed with Adobe Photoshop®. On CEUS imaging, IPNV was graded according to the visual appearance of contrast within the plaque using three different methods: CEUS_A (1=absent; 2=present); CEUS_B, a three-point scale (increasing IPNV from 1 to 3); and CEUS_C, a four-point scale (increasing IPNV from 0 to 3). We have also implemented a new simple quantification method derived from the region of interest (ROI) signal intensity ratio as assessed by QLAB software. Further information is available in “Contrast-enhanced ultrasound imaging of intraplaque neovascularization and its correlation to plaque echogenicity in human carotid arteries atherosclerosis” (M. Cattaneo, D. Staub, A.P. Porretta, J.M. Gallino, P. Santini, C. Limoni et al., 2016) [1].

  1. Whether and How Much to Give

    DEFF Research Database (Denmark)

    Petrovski, Erik

    This study evaluates whether factors known to foster charitable giving have a uniform influence on both (1) the decision to give and (2) the decision of how much to give. I establish that these two decisions are independent by dismissing the widely used Tobit model, which assumes a single decision...

  2. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.

  3. A phenomenologic investigation of pediatric residents' experiences being parented and giving parenting advice.

    Science.gov (United States)

    Bax, A C; Shawler, P M; Blackmon, D L; DeGrace, E W; Wolraich, M L

    2016-09-01

    Factors surrounding pediatricians' parenting advice and training on parenting during residency have not been well studied. The primary purpose of this study was to examine pediatric residents' self-reported experiences giving parenting advice and explore the relationship between parenting advice given and types of parenting residents received as children. Thirteen OUHSC pediatric residents were individually interviewed to examine experiences being parented and giving parenting advice. Phenomenological methods were used to explicate themes and secondary analyses explored relationships of findings based upon Baumrind's parenting styles (authoritative, authoritarian, permissive). While childhood experiences were not specifically correlated to the parenting advice style of pediatric residents interviewed, virtually all reported relying upon childhood experiences to generate their advice. Those describing authoritative parents reported giving more authoritative advice while others reported more variable advice. Core interview themes related to residents' parenting advice included anxiety about not being a parent, varying advice based on families' needs, and emphasis of positive interactions and consistency. Themes related to how residents were parented included discipline being a learning process for their parents and recalling that their parents always had expectations, yet always loved them. Pediatric residents interviewed reported giving family centered parenting advice with elements of positive interactions and consistency, but interviews highlighted many areas of apprehension residents have around giving parenting advice. Our study suggests that pediatric residents may benefit from more general educational opportunities to develop the content of their parenting advice, including reflecting on any impact from their own upbringing.

  4. Mapping the imaginary of charitable giving

    DEFF Research Database (Denmark)

    Bajde, Domen

    2012-01-01

    The meaningfulness of charitable giving is largely owed to the imaginary conceptions that underpin this form of giving. Building on Taylor's notion of “social imaginary” and Godelier's work on “gift imaginary,” we theorize the imaginary of charitable giving. Through a combination of qualitative m...... across relatively stable assemblages of conceptions of poverty, donors, end-recipients and charitable giving. These assemblages are suggested to form a multifaceted imaginary that is both cultural (shared) and personal (individually performed).

  5. The Practical Realities of Giving Back

    Directory of Open Access Journals (Sweden)

    Ashton Bree Wesner

    2014-07-01

    Full Text Available In this thematic section, authors consider practical ways of giving back to the communities in which they conduct research. Each author discusses their evolving thoughts on how to give back in these practical ways. Some of these authors discuss giving back by giving money, food, rides, parties, and water bottles. In other cases, authors discuss giving back by creating jobs in the short or long term, grant writing, advocacy, and education. Story-telling is also a theme that many of the authors in this section discuss. For some authors, non-material forms of giving back are critical—simply maintaining social ties to the communities in which they worked, or sharing humor. The authors consider the utility of their attempts at giving back, and in some cases present their personal philosophy or guidelines on the subject.

  6. A self-consistent, multivariate method for the determination of gas-phase rate coefficients, applied to reactions of atmospheric VOCs and the hydroxyl radical

    Science.gov (United States)

    Shaw, Jacob T.; Lidster, Richard T.; Cryer, Danny R.; Ramirez, Noelia; Whiting, Fiona C.; Boustead, Graham A.; Whalley, Lisa K.; Ingham, Trevor; Rickard, Andrew R.; Dunmore, Rachel E.; Heard, Dwayne E.; Lewis, Ally C.; Carpenter, Lucy J.; Hamilton, Jacqui F.; Dillon, Terry J.

    2018-03-01

    Gas-phase rate coefficients are fundamental to understanding atmospheric chemistry, yet experimental data are not available for the oxidation reactions of many of the thousands of volatile organic compounds (VOCs) observed in the troposphere. Here, a new experimental method is reported for the simultaneous study of reactions between multiple different VOCs and OH, the most important daytime atmospheric radical oxidant. This technique is based upon established relative rate concepts but has the advantage of a much higher throughput of target VOCs. By evaluating multiple VOCs in each experiment, and through measurement of the depletion in each VOC after reaction with OH, the OH + VOC reaction rate coefficients can be derived. Results from experiments conducted under controlled laboratory conditions were in good agreement with the available literature for the reaction of 19 VOCs, prepared in synthetic gas mixtures, with OH. This approach was used to determine a rate coefficient for the reaction of OH with 2,3-dimethylpent-1-ene for the first time; k = 5.7 (±0.3) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹. In addition, a further seven VOCs had only two, or fewer, individual OH rate coefficient measurements available in the literature. The results from this work were in good agreement with those measurements. A similar dataset, at an elevated temperature of 323 (±10) K, was used to determine new OH rate coefficients for 12 aromatic, 5 alkane, 5 alkene and 3 monoterpene VOC + OH reactions. In OH relative reactivity experiments that used ambient air at the University of York, a large number of different VOCs were observed, of which 23 were positively identified. Due to difficulties with detection limits and fully resolving peaks, only 19 OH rate coefficients were derived from these ambient air samples, including 10 reactions for which data were previously unavailable at the elevated reaction temperature of T = 323 (±10) K.
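
    The relative-rate concept the method builds on is ln([VOC]₀/[VOC]_t) = (k_VOC/k_ref)·ln([ref]₀/[ref]_t): the depletion of each target VOC is compared with that of a reference compound whose OH rate coefficient is known. The sketch below shows a single-point version of that expression with invented concentrations and an assumed reference value; in practice the ratio is taken from the slope of many such depletion points.

```python
import numpy as np

def relative_rate(voc_0, voc_t, ref_0, ref_t, k_ref):
    """Classical relative-rate expression:
    ln([VOC]_0 / [VOC]_t) = (k_VOC / k_ref) * ln([ref]_0 / [ref]_t)."""
    return k_ref * np.log(voc_0 / voc_t) / np.log(ref_0 / ref_t)

# Invented depletions for one target VOC and a reference compound with an
# assumed known OH rate coefficient (units: cm3 molecule-1 s-1).
k_ref = 2.4e-11
print(relative_rate(voc_0=1.00, voc_t=0.62, ref_0=1.00, ref_t=0.70, k_ref=k_ref))
```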

  7. A self-consistent, multivariate method for the determination of gas-phase rate coefficients, applied to reactions of atmospheric VOCs and the hydroxyl radical

    Directory of Open Access Journals (Sweden)

    J. T. Shaw

    2018-03-01

    Full Text Available Gas-phase rate coefficients are fundamental to understanding atmospheric chemistry, yet experimental data are not available for the oxidation reactions of many of the thousands of volatile organic compounds (VOCs) observed in the troposphere. Here, a new experimental method is reported for the simultaneous study of reactions between multiple different VOCs and OH, the most important daytime atmospheric radical oxidant. This technique is based upon established relative rate concepts but has the advantage of a much higher throughput of target VOCs. By evaluating multiple VOCs in each experiment, and through measurement of the depletion in each VOC after reaction with OH, the OH + VOC reaction rate coefficients can be derived. Results from experiments conducted under controlled laboratory conditions were in good agreement with the available literature for the reaction of 19 VOCs, prepared in synthetic gas mixtures, with OH. This approach was used to determine a rate coefficient for the reaction of OH with 2,3-dimethylpent-1-ene for the first time; k = 5.7 (±0.3) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹. In addition, a further seven VOCs had only two, or fewer, individual OH rate coefficient measurements available in the literature. The results from this work were in good agreement with those measurements. A similar dataset, at an elevated temperature of 323 (±10) K, was used to determine new OH rate coefficients for 12 aromatic, 5 alkane, 5 alkene and 3 monoterpene VOC + OH reactions. In OH relative reactivity experiments that used ambient air at the University of York, a large number of different VOCs were observed, of which 23 were positively identified. Due to difficulties with detection limits and fully resolving peaks, only 19 OH rate coefficients were derived from these ambient air samples, including 10 reactions for which data were previously unavailable at the elevated reaction temperature of T = 323 (±10) K.

  8. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results...... of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted...... in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  9. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  10. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

    Solving physics problems through consistency of argumentation can improve students' thinking skills and is an important element of science. The study aims to assess the consistency of students' argumentation on fluid material. The population of this study consists of college students from PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Cluster random sampling yielded 145 students. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results show average argumentation consistency of 4.85% for correct consistency, 29.93% for incorrect consistency, and 65.23% for inconsistency. These data point to a lack of understanding of the fluid material; ideally, fully consistent argumentation would support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies aimed at obtaining a positive change in the consistency of argumentation.

  11. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an

  12. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models
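
    As an illustration of the kind of internal consistency test described above (not the authors' actual procedure), a compressed data vector with its covariance matrix can be compared against a model prediction with a simple chi-square statistic; the band powers and covariance below are placeholders.

        import numpy as np
        from scipy import stats

        # Hypothetical compressed band powers, model prediction and covariance (placeholders).
        d = np.array([5200.0, 4800.0, 2900.0, 2100.0])          # "observed" band powers
        m = np.array([5000.0, 4900.0, 3000.0, 2000.0])          # model prediction
        C = np.diag([150.0**2, 140.0**2, 120.0**2, 100.0**2])   # covariance matrix

        r = d - m
        chi2 = r @ np.linalg.solve(C, r)      # chi-square of the residuals
        p = stats.chi2.sf(chi2, df=len(d))    # goodness-of-fit probability
        print(f"chi2 = {chi2:.2f}, p = {p:.3f}")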

  13. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account. Written by one of the main figures in the field. Paperback edition of successful work on philosophy of quantum mechanics.

  14. Validation of computer codes and modelling methods for giving proof of nuclear saefty of transport and storage of spent VVER-type nuclear fuels. Part 1. Purposes and goals of the project. Final report

    International Nuclear Information System (INIS)

    Buechse, H.; Langowski, A.; Lein, M.; Nagel, R.; Schmidt, H.; Stammel, M.

    1995-01-01

    The report gives the results of investigations on the validation of computer codes used to prove nuclear safety during transport and storage of spent VVER fuel from the Greifswald and Rheinsberg nuclear power plants. Characteristics of typical spent fuel (nuclide concentration, neutron source strength, gamma spectrum, decay heat) calculated with several codes, and dose rates (e.g. in the vicinity of a loaded spent fuel cask) based on the different source terms, are presented. Differences and their possible reasons are discussed. The results show that, despite the differences in the source terms, all relevant health physics requirements are met for every source term considered. The validation of the criticality code OMEGA was established by calculation of approximately 200 critical experiments with LWR fuel, including VVER fuel rod arrangements. The mean error of the effective multiplication factor k_eff is -0.01 compared with experiment for this area of applicability. Thus, the OMEGA error of 2% assumed in earlier works has turned out to be sufficiently conservative. (orig.) [de
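
    The validation figure quoted above amounts to averaging the deviation of the calculated k_eff from the experimental value (unity for a critical configuration) over the benchmark set. A minimal sketch with invented benchmark values, purely to make the arithmetic explicit:

        import numpy as np

        # Invented calculated k_eff values for a handful of critical benchmarks (k_exp = 1.0).
        k_calc = np.array([0.991, 0.988, 0.995, 0.987, 0.993, 0.990])

        bias = np.mean(k_calc - 1.0)            # mean error of k_eff
        spread = np.std(k_calc - 1.0, ddof=1)   # sample scatter of the error
        print(f"mean bias = {bias:+.3f}, sample std = {spread:.3f}")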

  15. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x......-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  16. How to Give a Good Talk?

    OpenAIRE

    Legout , Arnaud

    2013-01-01

    Why should you give great talks? How to make great slides? How to give a talk? How to make good presentations?; 3rd cycle; Warning: download the powerpoint version to get animations. Animated slides in the PDF version may look cluttered.

  17. Smoothing of Fused Spectral Consistent Satellite Images

    DEFF Research Database (Denmark)

    Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli

    2006-01-01

    on satellite data. Additionally, most conventional methods are loosely connected to the image forming physics of the satellite image, giving these methods an ad hoc feel. Vesteinsson et al. (2005) proposed a method of fusion of satellite images that is based on the properties of imaging physics...

  18. Whether and How Much to Give

    DEFF Research Database (Denmark)

    Petrovski, Erik

    2017-01-01

    Charitable giving involves two seemingly distinct decisions: whether to give and how much to give. However, many researchers methodologically assume that these decisions are one and the same. The present study supports the argument that this is an incorrect assumption that is likely to generate...... misleading conclusions, in part, since the second decision is much more financial in nature than the first. The argument that charitable giving entails two distinct decisions is validated by empirically dismissing the prevailing Tobit model, which assumes a single decision, in favor of less restrictive two......-stage approaches: Cragg’s model and the Heckman model. Most importantly, it is shown that only by adopting a two-stage approach may it be uncovered that common determinants of charitable giving such as income and gender affect the two decisions at hand very differently. Data comes from a high-quality 2012 Danish...

  19. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  20. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
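
    In symbols, taking the natural energy variables so that the total energy is the quadratic form E(x) = ½ xᵀx, the statement summarized above can be written as follows (a paraphrase for orientation, not a quotation of the paper's equations; the factor conventions depend on how the energy is defined):

        \mathbb{E}\big[\tfrac{1}{2}\,x^{\mathsf T}x \,\big|\, y\big]
            \;=\; \tfrac{1}{2}\,\bar{x}^{\mathsf T}\bar{x} \;+\; \tfrac{1}{2}\,\operatorname{tr}(P),
        \qquad \bar{x} = \mathbb{E}[x \mid y], \quad P = \operatorname{Cov}(x \mid y),

    so that, between observations, conservation of the left-hand side ties the energy of the conditional mean to the total variance tr(P): any spurious numerical loss of tr(P) must appear as an unphysical change in the energy budget.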

  1. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
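
    A minimal sketch of the kind of feature-consistency (perceptual) loss described above, assuming the user supplies a frozen, pre-trained convolutional network as an iterable of layers (for example the convolutional part of a VGG-style model); the layer indices and weighting are illustrative choices, not the authors' exact configuration.

        import torch
        import torch.nn.functional as F

        def feature_consistency_loss(x, x_recon, feature_net, layers=(2, 5, 9)):
            """Sum of MSE losses between intermediate activations of a frozen,
            pre-trained convolutional network, evaluated on the VAE input x and
            its reconstruction x_recon. Layer indices are illustrative."""
            loss = 0.0
            h_x, h_r = x, x_recon
            for i, module in enumerate(feature_net):
                h_x = module(h_x)
                h_r = module(h_r)
                if i in layers:
                    # Detach the target features so gradients flow only through the reconstruction.
                    loss = loss + F.mse_loss(h_r, h_x.detach())
            return loss

        # Sketch of a total objective for a "deep feature consistent" VAE:
        # total = feature_consistency_loss(x, x_recon, feature_net) + kl_divergence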

  2. Generating Consistent Program Tutorials

    DEFF Research Database (Denmark)

    Vestdam, Thomas

    2002-01-01

    In this paper we present a tool that supports construction of program tutorials. A program tutorial provides the reader with an understanding of an example program by interleaving fragments of source code and explaining text. An example program can for example illustrate how to use a library or a framework. We present a means for specifying the fragments of a program that are to be in-lined in the tutorial text. These in-line fragments are defined by addressing named syntactical elements, such as classes and methods, but it is also possible to address individual code lines by labeling them... We see potential in using the tool to produce program tutorials to be used for frameworks, libraries, and in educational contexts.

  3. Who gives? Multilevel effects of gender and ethnicity on workplace charitable giving.

    Science.gov (United States)

    Leslie, Lisa M; Snyder, Mark; Glomb, Theresa M

    2013-01-01

    Research on diversity in organizations has largely focused on the implications of gender and ethnic differences for performance, to the exclusion of other outcomes. We propose that gender and ethnic differences also have implications for workplace charitable giving, an important aspect of corporate social responsibility. Drawing from social role theory, we hypothesize and find that gender has consistent effects across levels of analysis; women donate more money to workplace charity than do men, and the percentage of women in a work unit is positively related to workplace charity, at least among men. Alternatively and consistent with social exchange theory, we hypothesize and find that ethnicity has opposing effects across levels of analysis; ethnic minorities donate less money to workplace charity than do Whites, but the percentage of minorities in a work unit is positively related to workplace charity, particularly among minorities. The findings provide a novel perspective on the consequences of gender and ethnic diversity in organizations and highlight synergies between organizational efforts to increase diversity and to build a reputation for corporate social responsibility. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  4. Children are sensitive to norms of giving

    OpenAIRE

    McAuliffe, K.; Raihani, N. J.; Dunham, Y.

    2017-01-01

    People across societies engage in costly sharing, but the extent of such sharing shows striking cultural variation, highlighting the importance of local norms in shaping generosity. Despite this acknowledged role for norms, it is unclear when they begin to exert their influence in development. Here we use a Dictator Game to investigate the extent to which 4- to 9-year-old children are sensitive to selfish (give 20%) and generous (give 80%) norms. Additionally, we varied whether children were ...

  5. Potential application of the consistency approach for vaccine potency testing.

    Science.gov (United States)

    Arciniega, J; Sirota, L A

    2012-01-01

    The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows if the vaccine meets the specification has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
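
    For readers unfamiliar with capability indices, the classical (normality-assuming) index mentioned above is easy to state. A small sketch with invented potency values and hypothetical specification limits:

        import numpy as np

        # Invented log-potency values from consecutive vaccine lots (illustrative only).
        potency = np.array([4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 4.0])
        LSL, USL = 3.5, 4.7          # hypothetical lower and upper specification limits

        mu, sigma = potency.mean(), potency.std(ddof=1)
        cpk = min(USL - mu, mu - LSL) / (3 * sigma)   # classical capability index
        print(f"Cpk = {cpk:.2f}")   # assumes the potency values are normally distributed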

  6. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules creates a large maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables on stand-alone devices or across the network will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and provide a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
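
    A toy illustration of the kind of set-based consistency check such a formalization enables (the rule representation and conflict definition here are simplified assumptions, not the authors' formalism): two rules conflict if the address sets they match overlap but their actions differ.

        # Each rule matches a set of source/destination addresses and carries an action.
        rules = [
            {"name": "r1", "src": {"10.0.0.0/24"}, "dst": {"any"}, "action": "allow"},
            {"name": "r2", "src": {"10.0.0.0/24"}, "dst": {"any"}, "action": "deny"},
            {"name": "r3", "src": {"192.168.1.0/24"}, "dst": {"any"}, "action": "allow"},
        ]

        def overlaps(a, b):
            # Simplified overlap test on the literal address sets (no prefix arithmetic).
            return bool(a["src"] & b["src"]) and bool(a["dst"] & b["dst"])

        conflicts = [
            (a["name"], b["name"])
            for i, a in enumerate(rules)
            for b in rules[i + 1:]
            if overlaps(a, b) and a["action"] != b["action"]
        ]
        print("conflicting rule pairs:", conflicts)   # [('r1', 'r2')]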

  7. The Effect of Media on Charitable Giving and Volunteering: Evidence from the "Give Five" Campaign

    Science.gov (United States)

    Yoruk, Baris K.

    2012-01-01

    Fundraising campaigns advertised via mass media are common. To what extent such campaigns affect charitable behavior is mostly unknown, however. Using giving and volunteering surveys conducted biennially from 1988 to 1996, I investigate the effect of a national fundraising campaign, "Give Five," on charitable giving and volunteering patterns. The…

  8. The giving standard: conditional cooperation in the case of charitable giving

    NARCIS (Netherlands)

    P. Wiepking (Pamala); M. Heijnen (Merijn)

    2011-01-01

    textabstractIn this study, we make a first attempt to investigate the mechanisms of conditional cooperation in giving outside experiments, using retrospective survey data on charitable giving (the Giving the Netherlands Panel Study 2005 (GINPS05, 2005 ; N  = 1474)). Our results show that in the case

  9. Thinkers and feelers: Emotion and giving.

    Science.gov (United States)

    Corcoran, Katie E

    2015-07-01

    Voluntary organizations, such as religious congregations, ask their members to contribute money as a part of membership and rely on these contributions for their survival. Yet often only a small cadre of members provides the majority of the contributions. Past research on congregational giving focuses on cognitive rational processes, generally neglecting the role of emotion. Extending Collins' (2004) interaction ritual theory, I predict that individuals who experience positive emotions during religious services will be more likely to give a higher proportion of their income to their congregation than those who do not. Moreover, I argue that this effect will be amplified in congregational contexts characterized by high aggregate levels of positive emotion, strictness, dense congregational networks, and expressive rituals. Using data from the 2001 U.S. Congregational Life Survey and multilevel modeling, I find support for several of these hypotheses. The findings suggest that both cognitive and emotional processes underlie congregational giving. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Conscientious refusals and reason-giving.

    Science.gov (United States)

    Marsh, Jason

    2014-07-01

    Some philosophers have argued for what I call the reason-giving requirement for conscientious refusal in reproductive healthcare. According to this requirement, healthcare practitioners who conscientiously object to administering standard forms of treatment must have arguments to back up their conscience, arguments that are purely public in character. I argue that such a requirement, though attractive in some ways, faces an overlooked epistemic problem: it is either too easy or too difficult to satisfy in standard cases. I close by briefly considering whether a version of the reason-giving requirement can be salvaged despite this important difficulty. © 2013 John Wiley & Sons Ltd.

  11. Fingerprint analysis and quality consistency evaluation of flavonoid compounds for fermented Guava leaf by combining high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry and chemometric methods.

    Science.gov (United States)

    Wang, Lu; Tian, Xiaofei; Wei, Wenhao; Chen, Gong; Wu, Zhenqiang

    2016-10-01

    Guava leaves are used in traditional herbal teas as antidiabetic therapies. Flavonoids are the main active constituents of Guava leaves and have many physiological functions. However, the flavonoid compositions and activities of Guava leaves could change due to microbial fermentation. A high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry method was applied to identify the varieties of the flavonoids in Guava leaves before and after fermentation. High-performance liquid chromatography, hierarchical cluster analysis and principal component analysis were used to quantitatively determine the changes in flavonoid compositions and evaluate the consistency and quality of Guava leaves. Monascus anka Saccharomyces cerevisiae fermented Guava leaves contained 2.32- and 4.06-fold more total flavonoids and quercetin, respectively, than natural Guava leaves. The flavonoid compounds of the natural Guava leaves had similarities ranging from 0.837 to 0.927. The flavonoid compounds from the Monascus anka S. cerevisiae fermented Guava leaves had similarities higher than 0.993. This indicated that the quality consistency of the fermented Guava leaves was better than that of the natural Guava leaves. High-performance liquid chromatography fingerprinting and chemometric analysis are promising methods for evaluating the degree of fermentation of Guava leaves based on quality consistency, which could be used in assessing flavonoid compounds for the production of fermented Guava leaves. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
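
    The "similarity" figures quoted above are typically correlation-type coefficients between chromatographic fingerprint vectors. A minimal sketch of how such a consistency measure can be computed, using invented peak-area vectors rather than the study's data:

        import numpy as np

        # Invented peak-area fingerprints (one row per leaf sample, one column per flavonoid peak).
        fingerprints = np.array([
            [12.0, 5.1, 3.3, 8.7, 1.2],
            [11.5, 5.0, 3.6, 8.9, 1.1],
            [12.2, 4.8, 3.1, 8.5, 1.3],
        ])

        def cosine_similarity(u, v):
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

        n = len(fingerprints)
        pairwise = [cosine_similarity(fingerprints[i], fingerprints[j])
                    for i in range(n) for j in range(i + 1, n)]
        print("pairwise similarities:", [round(s, 3) for s in pairwise])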

  12. Termination of Commercial Contracts by giving Notice

    DEFF Research Database (Denmark)

    Edlund, Hans Henrik

    2008-01-01

    Some long-term contracts are brought to an end if one of the parties gives notice. Usually, such a step is not considered a breach of contract. It causes the contract to end in accordance with the contract. When no express rules cover the situation, it is often not entirely clear whether or not t...

  13. Children are sensitive to norms of giving.

    Science.gov (United States)

    McAuliffe, Katherine; Raihani, Nichola J; Dunham, Yarrow

    2017-10-01

    People across societies engage in costly sharing, but the extent of such sharing shows striking cultural variation, highlighting the importance of local norms in shaping generosity. Despite this acknowledged role for norms, it is unclear when they begin to exert their influence in development. Here we use a Dictator Game to investigate the extent to which 4- to 9-year-old children are sensitive to selfish (give 20%) and generous (give 80%) norms. Additionally, we varied whether children were told how much other children give (descriptive norm) or what they should give according to an adult (injunctive norm). Results showed that children generally gave more when they were exposed to a generous norm. However, patterns of compliance varied with age. Younger children were more likely to comply with the selfish norm, suggesting a licensing effect. By contrast, older children were more influenced by the generous norm, yet capped their donations at 50%, perhaps adhering to a pre-existing norm of equality. Children were not differentially influenced by descriptive or injunctive norms, suggesting a primacy of norm content over norm format. Together, our findings indicate that while generosity is malleable in children, normative information does not completely override pre-existing biases. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. They Make Space and Give Time

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 3, Issue 3. They Make Space and Give Time: The Engineer as Poet. Book review by Gangan Prathap (National Aerospace Laboratories and the Jawaharlal Nehru Centre for Advanced Scientific Research, Bangalore).

  15. Bidding to give in the field

    NARCIS (Netherlands)

    Onderstal, Sander; Schram, Arthur J. H. C.; Soetevent, Adriaan R.

    In a door-to-door fundraising field experiment, we study the impact of fundraising mechanisms on charitable giving. We approached about 4500 households, each participating in an all-pay auction, a lottery, a non-anonymous voluntary contribution mechanism (VCM), or an anonymous VCM. In contrast to

  16. Bidding to give in the field

    NARCIS (Netherlands)

    Onderstal, S.; Schram, A.J.H.C.; Soetevent, A.R.

    2013-01-01

    In a door-to-door fundraising field experiment, we study the impact of fundraising mechanisms on charitable giving. We approached about 4500 households, each participating in an all-pay auction, a lottery, a non-anonymous voluntary contribution mechanism (VCM), or an anonymous VCM. In contrast to

  17. Why healthcare workers give prelacteal feeds.

    Science.gov (United States)

    Akuse, R M; Obinya, E A

    2002-08-01

    Because prelacteal feeds can adversely affect breastfeeding, UNICEF/WHO discourage their use unless medically indicated. The study was carried out to determine the proportion of healthcare workers who routinely give prelacteal feeds, and their reasons for doing so; further, to determine whether any differences exist between medically and non-medically trained healthcare workers in their administration of prelacteal feeds. Survey. Primary, secondary and tertiary health facilities in Kaduna township Nigeria. Of 1100 healthcare workers sampled, 747 (68%) responded. Of these 80% had received medical training, 20% had not. Use of a pretested validated questionnaire. Large proportions of both medical and non-medically trained healthcare workers stated they routinely give prelacteal feeds (doctors, 68.2%; nurses, 70.2%; and non-medical, 73.6%). However their reasons for doing so differed significantly (P=0.00001). Nurses gave mainly for perceived breast milk insufficiency, doctors for prevention of dehydration, hypoglycaemia and neonatal jaundice and non-medical staff to prepare the gastrointestinal tract for digestion and to quench thirst. Most healthcare workers (medical and non-medical) routinely and unnecessarily give prelacteal feeds. Therefore training and retraining programmes in lactation management are necessary and must include non-medical staff. These programmes, while emphasizing the danger of giving prelacteal feeds, must deal with the misconceptions of each group. Deliberate efforts have to be made to incorporate clinical training in breastfeeding in curricula of Schools of Medicine and Nursing.

  18. Asian American Giving to US Higher Education

    Science.gov (United States)

    Tsunoda, Kozue

    2010-01-01

    Asian Americans have had significant impacts on and within mainstream US society, and their great efforts and gifts in the name of charitable causes are no exception. This study aims to examine perceptions within American university development offices about Asian American giving to US higher education. The article begins with a literature review…

  19. IMPROVING WOMEN'S LIVES Practical support for women gives ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    IMPROVING WOMEN'S LIVES Practical support for women gives communities a better future. October 26 ... Organized into small cooperatives, the women produce and market argan oil using a mix of traditional and modern methods. At the same time ... arts and craft. Technology helps Asian women balance family and work.

  20. Measurements of natural gamma radioactivity and radon in Chile

    International Nuclear Information System (INIS)

    Gamarra, J.; Stuardo, E.

    1998-01-01

    In this work, the measurement and calculation methods are presented and the results for each studied area are discussed within the framework of the respective world averages. None of the evaluated annual effective dose averages surpassed these world-average effective doses or the corresponding intervention level.

  1. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  2. How to give a good talk.

    Science.gov (United States)

    Alon, Uri

    2009-10-23

    We depend on talks to communicate our work, and we spend much of our time as audience members in talks. However, few scientists are taught the well-established principles of giving good talks. Here, I describe how to prepare, present, and answer questions in a scientific talk. We will see how a talk prepared with a single premise and delivered with good eye contact is clear and enjoyable.

  3. Neurocultural evidence that ideal affect match promotes giving.

    Science.gov (United States)

    Park, BoKyung; Blevins, Elizabeth; Knutson, Brian; Tsai, Jeanne L

    2017-07-01

    Why do people give to strangers? We propose that people trust and give more to those whose emotional expressions match how they ideally want to feel ("ideal affect match"). European Americans and Koreans played multiple trials of the Dictator Game with recipients who varied in emotional expression (excited, calm), race (White, Asian) and sex (male, female). Consistent with their culture's valued affect, European Americans trusted and gave more to excited than calm recipients, whereas Koreans trusted and gave more to calm than excited recipients. These findings held regardless of recipient race and sex. We then used fMRI to probe potential affective and mentalizing mechanisms. Increased activity in the nucleus accumbens (associated with reward anticipation) predicted giving, as did decreased activity in the right temporo-parietal junction (rTPJ; associated with reduced belief prediction error). Ideal affect match decreased rTPJ activity, suggesting that people may trust and give more to strangers whom they perceive to share their affective values. © The Author (2017). Published by Oxford University Press.

  4. Combining the Complete Active Space Self-Consistent Field Method and the Full Configuration Interaction Quantum Monte Carlo within a Super-CI Framework, with Application to Challenging Metal-Porphyrins.

    Science.gov (United States)

    Li Manni, Giovanni; Smart, Simon D; Alavi, Ali

    2016-03-08

    A novel stochastic Complete Active Space Self-Consistent Field (CASSCF) method has been developed and implemented in the Molcas software package. A two-step procedure is used, in which the CAS configuration interaction secular equations are solved stochastically with the Full Configuration Interaction Quantum Monte Carlo (FCIQMC) approach, while orbital rotations are performed using an approximated form of the Super-CI method. This new method does not suffer from the strong combinatorial limitations of standard MCSCF implementations using direct schemes and can handle active spaces well in excess of those accessible to traditional CASSCF approaches. The density matrix formulation of the Super-CI method makes this step independent of the size of the CI expansion, depending exclusively on one- and two-body density matrices with indices restricted to the relatively small number of active orbitals. No sigma vectors need to be stored in memory for the FCIQMC eigensolver--a substantial gain in comparison to implementations using the Davidson method, which require three or more vectors of the size of the CI expansion. Further, no orbital Hessian is computed, circumventing limitations on basis set expansions. Like the parent FCIQMC method, the present technique is scalable on massively parallel architectures. We present in this report the method and its application to the free-base porphyrin, Mg(II) porphyrin, and Fe(II) porphyrin. In the present study, active spaces up to 32 electrons and 29 orbitals in orbital expansions containing up to 916 contracted functions are treated with modest computational resources. Results are quite promising even without accounting for the correlation outside the active space. The systems here presented clearly demonstrate that large CASSCF calculations are possible via FCIQMC-CASSCF without limitations on basis set size.

  5. Theoretical prediction of the band offsets at the ZnO/anatase TiO{sub 2} and GaN/ZnO heterojunctions using the self-consistent ab initio DFT/GGA-1/2 method

    Energy Technology Data Exchange (ETDEWEB)

    Fang, D. Q., E-mail: fangdqphy@mail.xjtu.edu.cn; Zhang, S. L. [MOE Key Laboratory for Nonequilibrium Synthesis and Modulation of Condensed Matter, School of Science, Xi’an Jiaotong University, Xi’an 710049 (China)

    2016-01-07

    The band offsets of the ZnO/anatase TiO{sub 2} and GaN/ZnO heterojunctions are calculated using the density functional theory/generalized gradient approximation (DFT/GGA)-1/2 method, which takes into account the self-energy corrections and can give an approximate description of the quasiparticle characteristics of the electronic structure of semiconductors. We present the results of the ionization potential (IP)-based and interfacial offset-based band alignments. In the interfacial offset-based band alignment, to get the natural band offset, we use the surface calculations to estimate the change of reference level due to the interfacial strain. Based on the interface models and GGA-1/2 calculations, we find that the valence band maximum and conduction band minimum of ZnO, respectively, lie 0.64 eV and 0.57 eV above those of anatase TiO{sub 2}, while they lie 0.84 eV and 1.09 eV below those of GaN, which agree well with the experimental data. However, a large discrepancy exists between the IP-based band offset and the calculated natural band offset, the mechanism of which is discussed. Our results clarify the band alignment of the ZnO/anatase TiO{sub 2} heterojunction and show good agreement with the GW calculations for the GaN/ZnO heterojunction.
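
    For context, the "natural band offset" procedure referred to above is usually written as a difference of bulk band edges referenced to a common level that is aligned across the interface; the expressions below are a generic statement of that scheme from general knowledge, not the paper's exact working:

        \Delta E_v(\mathrm{A/B})
            = \bigl(E_v^{\mathrm A} - E_{\mathrm{ref}}^{\mathrm A}\bigr)
            - \bigl(E_v^{\mathrm B} - E_{\mathrm{ref}}^{\mathrm B}\bigr)
            + \Delta E_{\mathrm{ref}}^{\mathrm{A/B}},
        \qquad
        \Delta E_c = \Delta E_v + \bigl(E_g^{\mathrm A} - E_g^{\mathrm B}\bigr),

    where E_ref is a reference level (for example an average electrostatic potential) computed in each bulk, ΔE_ref^{A/B} is its line-up across the interface (here estimated with surface calculations to remove interfacial-strain shifts), and E_g are the band gaps.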

  6. Role-modeling and conversations about giving in the socialization of adolescent charitable giving and volunteering.

    Science.gov (United States)

    Ottoni-Wilhelm, Mark; Estell, David B; Perdue, Neil H

    2014-01-01

    This study investigated the relationship between the monetary giving and volunteering behavior of adolescents and the role-modeling and conversations about giving provided by their parents. The participants are a large nationally-representative sample of 12-18 year-olds from the Panel Study of Income Dynamics' Child Development Supplement (n = 1244). Adolescents reported whether they gave money and whether they volunteered. In a separate interview parents reported whether they talked to their adolescent about giving. In a third interview, parents reported whether they gave money and volunteered. The results show that both role-modeling and conversations about giving are strongly related to adolescents' giving and volunteering. Knowing that both role-modeling and conversation are strongly related to adolescents' giving and volunteering suggests an often over-looked way for practitioners and policy-makers to nurture giving and volunteering among adults: start earlier, during adolescence, by guiding parents in their role-modeling of, and conversations about, charitable giving and volunteering. Copyright © 2013 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  7. Give and Take in Dictator Games

    DEFF Research Database (Denmark)

    Cappelen, Alexander W.; Nielsen, Ulrik Haagen; Sørensen, Erik Ø.

    2014-01-01

    It has been shown that participants in the dictator game are less willing to give money to the other participant when their choice set also includes the option to take money. We examine whether this effect is due to the choice set providing a signal about entitlements in a setting where...... entitlements initially may be considered unclear. We find that the share of positive transfers depends on the choice set even when there is no uncertainty about entitlements, and that this choice-set effect is robust across a heterogenous group of participants recruited from the general adult population...

  8. Modifications of imaging spectroscopy methods for increases spatial and temporal consistency: A case study of change in leafy spurge distribution between 1999 and 2001 in Theodore Roosevelt National Park, North Dakota

    Science.gov (United States)

    Dudek, Kathleen Burke

    The noxious weed leafy spurge (Euphorbia esula L.) has spread throughout the northern Great Plains of North America since it was introduced in the early 1800s, and it is currently a significant management concern. Accurate, rapid location and repeatable measurements are critical for successful temporal monitoring of infestations. Imaging spectroscopy is well suited for identification of spurge; however, the development and dissemination of standardized hyperspectral mapping procedures that produce consistent multi-temporal maps has been absent. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data, collected in 1999 and 2001 over Theodore Roosevelt National Park, North Dakota, were used to locate leafy spurge. Published image-processing methods were tested to determine the most successful for consistent maps. Best results were obtained using: (1) NDVI masking; (2) cross-track illumination correction; (3) image-derived spectral libraries; and (4) mixture-tuned matched filtering algorithm. Application of the algorithm was modified to standardize processing and eliminate threshold decisions; the image-derived library was refined to eliminate additional variability. Primary (spurge dominant), secondary (spurge non-dominant), abundance, and area-wide vegetation maps were produced. Map accuracies were analyzed with point, polygon, and grid reference sets, using confusion matrices and regression between field-measured and image-derived abundances. Accuracies were recalculated after applying a majority filter, and buffers ranging from 1-5 pixels wide around classified pixels, to accommodate poor reference-image alignment. Overall accuracy varied from 39% to 82%, however, regression analyses yielded r2 = 0.725, indicating a strong relationship between field and image-derived densities. Accuracy was sensitive to: (1) registration offsets between field and image locations; (2) modification of analytical methods; and (3) reference data quality. Sensor viewing angle
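
    As an illustration of the first preprocessing step listed above, NDVI masking simply removes pixels whose vegetation index falls below a threshold before any spectral mapping is attempted. The band indices and threshold in this sketch are placeholders, not the values used in the study:

        import numpy as np

        def ndvi_mask(cube, red_band, nir_band, threshold=0.3):
            """Return a boolean vegetation mask for a (rows, cols, bands) reflectance cube.
            Band indices and the NDVI threshold are illustrative placeholders."""
            red = cube[:, :, red_band].astype(float)
            nir = cube[:, :, nir_band].astype(float)
            ndvi = (nir - red) / np.maximum(nir + red, 1e-6)
            return ndvi >= threshold

        # Example on a random cube standing in for an AVIRIS scene:
        cube = np.random.rand(100, 100, 224)
        mask = ndvi_mask(cube, red_band=29, nir_band=51)
        print("vegetated fraction:", mask.mean())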

  9. The experience of the Cuban program with children from territories affected by the Chernobyl accident

    International Nuclear Information System (INIS)

    Garcia, O.; Llanes, R.

    1998-01-01

    Since 1990, a program has been running in Cuba to offer specialized medical attention and to develop a sanatorium rehabilitation plan for children coming from the different areas affected by the radioactive contamination resulting from the Chernobyl accident.

  10. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

    When implementing configuration systems, knowledge about products and processes are documented and replicated in the configuration system. This practice assumes that products are specified consistently i.e. on the same rule base and likewise for processes. However, consistency cannot be taken...... for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation......, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...

  11. The Luxury of Igniting Change by Giving

    DEFF Research Database (Denmark)

    Llamas, Rosa; Uth Thomsen, Thyra

    2016-01-01

    This study investigates the phenomenon of luxury from a consumer perspective, by means of multisited phenomenological inquiry. The findings expand the pervasive view of luxury as accumulation of highly valued goods by offering a transformative perspective of luxury as transforming the life...... the giver with a sense of luxury in terms of pleasure, purpose, and connection with humankind. Thus, the findings not only extend the traditional conceptualization of luxury from having to giving, but also challenge current conceptualizations of sharing out as a non-reciprocal pro-social behavior...... by illustrating how ‘the luxury of giving’ relies on both pro-social and pro-ego consumption rationales, which implicitly include circular reciprocation....

  12. Giving Devices the Ability to Exercise Reason

    Directory of Open Access Journals (Sweden)

    Thomas Keeley

    2008-10-01

    Full Text Available One of the capabilities that separates humans from computers has been the ability to exercise "reason / judgment". Computers and computerized devices have provided excellent platforms for following rules. Computer programs provide the scripts for processing the rules. The exercise of reason, however, is more of an image processing function than a function composed of a series of rules. The exercise of reason is more right brain than left brain. It involves the interpretation of information and balancing inter-related alternatives. This paper will discuss a new way to define and process information that will give devices the ability to exercise human-like reasoning and judgment. The paper will discuss the characteristics of a "dynamic graphical language" in the context of addressing judgment, since judgment is often required to adjust rules when operating in a dynamic environment. The paper will touch on architecture issues and how judgment is integrated with rule processing.

  13. Giving up nuclear energy. Obstacles, conditions, consequences

    International Nuclear Information System (INIS)

    Koegel-Dorfs, H.

    1990-01-01

    Life on this earth is not possible without using energy. The resources of the energies used so far are limited and their utilization carries certain risks which have now become obvious: climatic problems on the one hand, safety problems on the other. Chernobyl, Wackersdorf, tornados and population growth are issues mentioned all the time in the fight for the best solution. Even church synods have spoken up and demanded that nuclear energy be given up. The energy issue, however, has become a question of survival. This study, worked out by a group of scientists (natural science, energy science, lawyers, theologians), analyses the obstacles, conditions and consequences of such a step. The possible solution of rational energy utilization and substitution of energy services and regenerative energies is discussed in depth. The book concludes that problems can only be coped with if there is a feeling of joint responsibility and global social consensus. (orig./HP) [de

  14. A Study on the Consistency of Discretization Equation in Unsteady Heat Transfer Calculations

    Directory of Open Access Journals (Sweden)

    Wenhua Zhang

    2013-01-01

    Full Text Available Previous studies on the consistency of discretization equations have mainly focused on the finite difference method, but several consistency issues remain far from fully resolved in practical numerical computation. For instance, a consistency problem arises when the boundary variables are solved explicitly while the variables away from the boundary are solved implicitly. Likewise, when the coefficients of the discretization equation in a nonlinear problem are functions of the variables, calculating the coefficients explicitly while solving the variables implicitly may also give rise to consistency problems. The present paper therefore investigates the consistency problems involved in the explicit treatment of boundary conditions of the second and third kinds and of a thermal conductivity that is a function of temperature. The numerical results indicate that the consistency problem deserves more attention and should not be neglected in practical computation.
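
    For reference, the consistency notion at issue can be stated compactly; this is the standard textbook definition, restated here rather than quoted from the paper:

        \tau_{i}^{n} \;=\; \mathcal{L}_{\Delta x,\Delta t}\,u(x_i,t_n) \;-\; \mathcal{L}\,u(x_i,t_n)
        \;\longrightarrow\; 0
        \quad\text{as } \Delta x,\ \Delta t \to 0,

    for every sufficiently smooth u. Lagging a boundary condition or a temperature-dependent conductivity by one time level adds extra O(Δt) terms to the truncation error τ, so consistency hinges on whether those terms vanish in this limit.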

  15. Extension of the self-consistent-charge density-functional tight-binding method: third-order expansion of the density functional theory total energy and introduction of a modified effective coulomb interaction.

    Science.gov (United States)

    Yang, Yang; Yu, Haibo; York, Darrin; Cui, Qiang; Elstner, Marcus

    2007-10-25

    The standard self-consistent-charge density-functional-tight-binding (SCC-DFTB) method (Phys. Rev. B 1998, 58, 7260) is derived by a second-order expansion of the density functional theory total energy expression, followed by an approximation of the charge density fluctuations by charge monopoles and an effective damped Coulomb interaction between the atomic net charges. The central assumptions behind this effective charge-charge interaction are the inverse relation of atomic size and chemical hardness and the use of a fixed chemical hardness parameter independent of the atomic charge state. While these approximations seem to be unproblematic for many covalently bound systems, they are quantitatively insufficient for hydrogen-bonding interactions and (anionic) molecules with localized net charges. Here, we present an extension of the SCC-DFTB method to incorporate third-order terms in the charge density fluctuations, leading to chemical hardness parameters that are dependent on the atomic charge state and a modification of the Coulomb scaling to improve the electrostatic treatment within the second-order terms. These modifications lead to a significant improvement in the description of hydrogen-bonding interactions and proton affinities of biologically relevant molecules.
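
    Schematically, the extension adds a cubic charge-fluctuation term to the second-order energy; the expression below is the generic form of such a third-order expansion (with Γ_ab the charge derivative of the damped Coulomb kernel γ_ab), written as an orientation aid from general knowledge of third-order tight-binding schemes rather than copied from the paper:

        E \;\approx\; \sum_{i}^{\mathrm{occ}} \langle \psi_i | \hat H^{0} | \psi_i \rangle
        \;+\; \tfrac{1}{2}\sum_{ab} \gamma_{ab}\,\Delta q_a \Delta q_b
        \;+\; \tfrac{1}{3}\sum_{ab} \Gamma_{ab}\,\Delta q_a^{2}\,\Delta q_b
        \;+\; E_{\mathrm{rep}},

    so that the effective chemical hardness seen by atom a now depends on its own charge state Δq_a, which is what improves hydrogen bonds and charged species.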

  16. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

    Full Text Available Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady-states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
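
    A toy example of the classical scalar form of hull consistency that the poster generalizes: for the constraint x + y = 5 with x ∈ [0, 10] and y ∈ [2, 3], solving the equation for x and intersecting with its current box contracts x to [2, 3]. A minimal sketch with hand-rolled interval arithmetic, purely for illustration:

        # Hull-consistency contraction for the single constraint x + y = 5 (illustrative).
        def contract_x(x_box, y_box, total=5.0):
            """Intersect x's box with the interval total - y."""
            y_lo, y_hi = y_box
            candidate = (total - y_hi, total - y_lo)      # interval enclosure of total - y
            lo = max(x_box[0], candidate[0])
            hi = min(x_box[1], candidate[1])
            if lo > hi:
                raise ValueError("empty intersection: constraint infeasible on these boxes")
            return (lo, hi)

        x_box, y_box = (0.0, 10.0), (2.0, 3.0)
        print(contract_x(x_box, y_box))   # (2.0, 3.0)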

  17. A consistent interpretation of quantum mechanics

    International Nuclear Information System (INIS)

    Omnes, Roland

    1990-01-01

    Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths consistent history. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)

  18. Consistent Estimation of Partition Markov Models

    Directory of Open Access Journals (Sweden)

    Jesús E. García

    2017-04-01

    Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated. In order to answer these questions, we build a consistent strategy for model selection which consists of the following: given a size-n realization of the process, find a model within the Partition Markov class with a minimal number of parts that represents the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, when n goes to infinity, L will be retrieved. We show an application to model internet navigation patterns.
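
    To make the idea concrete, states that share (approximately) the same estimated transition row can be merged into one part. The grouping criterion below (equality of rounded rows on a toy sequence) is a crude stand-in for the consistent model-selection strategy of the paper:

        import numpy as np
        from collections import defaultdict

        sequence = "abcbababcbabcbab"          # toy realization of the process
        alphabet = sorted(set(sequence))
        idx = {s: i for i, s in enumerate(alphabet)}

        # Empirical transition counts and row-normalized probabilities.
        counts = np.zeros((len(alphabet), len(alphabet)))
        for a, b in zip(sequence, sequence[1:]):
            counts[idx[a], idx[b]] += 1
        rows = counts / counts.sum(axis=1, keepdims=True)

        # Group states whose rounded transition rows coincide (toy partition criterion).
        parts = defaultdict(list)
        for s in alphabet:
            parts[tuple(np.round(rows[idx[s]], 1))].append(s)
        print(list(parts.values()))   # e.g. [['a', 'c'], ['b']]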

  19. Maintaining clinical governance when giving telephone advice.

    Science.gov (United States)

    Alazawi, William; Agarwal, Kosh; Suddle, Abid; Aluvihare, Varuna; Heneghan, Michael A

    2013-10-01

    Delivering excellent healthcare depends on accurate communication between professionals who may be in different locations. Frequently, the first point of contact with the liver unit at King's College Hospital (KCH) is through a telephone call to a specialist registrar or liver fellow, for whom no case notes are available in which to record information. The aim of this study was to improve the clinical governance of telephone referrals and to generate contemporaneous records that could be easily retrieved and audited. An electronic database for telephone referrals and advice was designed and made securely available to registrars in our unit. Service development in a tertiary liver centre that receives referrals from across the UK and Europe. Demographic and clinical data were recorded prospectively and analysed retrospectively. Data from 350 calls were entered during 5 months. The information included the nature and origin of the call (200 from 75 different institutions), disease burden and severity of disease among the patients discussed with KCH, and outcome of the call. The majority of cases were discussed with consultants or arrangements were made for formal review at KCH. A telephone referrals and advice database provides clinical governance, serves as a quality indicator and forms a contemporaneous record at the referral centre. Activity data and knowledge of disease burden help to tailor services to the needs of referrers and commissioners. We recommend implementation of similar models in other centres that give extramural verbal advice.

  20. Cultivating gratitude and giving through experiential consumption.

    Science.gov (United States)

    Walker, Jesse; Kumar, Amit; Gilovich, Thomas

    2016-12-01

    Gratitude promotes well-being and prompts prosocial behavior. Here, we examine a novel way to cultivate this beneficial emotion. We demonstrate that 2 different types of consumption-material consumption (buying for the sake of having) and experiential consumption (buying for the sake of doing)-differentially foster gratitude and giving. In 6 studies we show that reflecting on experiential purchases (e.g., travel, meals out, tickets to events) inspires more gratitude than reflecting on material purchases (e.g., clothing, jewelry, furniture), and that thinking about experiences leads to more subsequent altruistic behavior than thinking about possessions. In Studies 1-2b, we use within-subject and between-subjects designs to test our main hypothesis: that people are more grateful for what they've done than what they have. Study 3 finds evidence for this effect in the real-world setting of online customer reviews: Consumers are more likely to spontaneously mention feeling grateful for experiences they have bought than for material goods they have bought. In our final 2 studies, we show that experiential consumption also makes people more likely to be generous to others. Participants who contemplated a significant experiential purchase behaved more generously toward anonymous others in an economic game than those who contemplated a significant material purchase. It thus appears that shifting spending toward experiential consumption can improve people's everyday lives as well as the lives of those around them. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  1. Giving what one should: explanations for the knowledge-behavior gap for altruistic giving.

    Science.gov (United States)

    Blake, Peter R

    2018-04-01

    Several studies have shown that children struggle to give what they believe that they should: the so-called knowledge-behavior gap. Over a dozen recent Dictator Game studies find that, although young children believe that they should give half of a set of resources to a peer, they typically give less and often keep all of the resources for themselves. This article reviews recent evidence for five potential explanations for the gap and how children close it with age: self-regulation, social distance, theory of mind, moral knowledge and social learning. I conclude that self-regulation, social distance, and social learning show the most promising evidence for understanding the mechanisms that can close the gap. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. The fine art of giving encouragement.

    Science.gov (United States)

    Davidhizar, R

    1991-11-01

    1. Support and encouragement can significantly influence emotional well-being and profoundly affect quality of life. Encouragement is a powerful nursing strategy, increasing both nursing effectiveness and feelings of job satisfaction. 2. A variety of encouragement techniques are available, including focusing on the positive, communicating respect, showing appreciation, picking up the phone, avoiding a superior attitude, sharing personal experiences, providing motivation, and cheerleading. 3. To be most meaningful, words of encouragement should relate to a specific behavior. If encouragement is not consistent with an individual's personal wishes, goals, or feelings, encouragement may receive a negative response or be denied.

  3. Unilateral Measures addressing Non-Trade Concerns. A Study on WTO Consistency, Relevance of other International Agreements, Economic Effectiveness and Impact on Developing Countries of Measures concerning Non-Product-Related Processes and Production Methods

    International Nuclear Information System (INIS)

    Van den Bossche, P.; Schrijver, N.; Faber, G.

    2007-01-01

    Over the last two years, the debate in the Netherlands on trade measures addressing non-trade concerns has focused on two important and politically sensitive issues, namely: (1) the sustainability of the large-scale production of biomass as an alternative source of energy; and (2) the production of livestock products in a manner that is consistent with animal welfare requirements. In February 2007 a report was issued on the 'Toetsingskader voor Duurzame Biomassa', the so-called Cramer Report. This report discusses the risks associated with large-scale biomass production and establishes a list of criteria for the sustainable production of biomass. These criteria reflect a broad range of non-trade concerns, including environmental protection, global warming, food security, biodiversity, economic prosperity and social welfare. The report recognizes that the implementation of the criteria (including the establishment of a certification system) will require careful consideration of the obligations of the Netherlands under EU and WTO law. Governments called upon to address non-trade concerns may do so by using different types of measures. Prominent among these are measures concerning processes and production methods of products. In the present study, these issues are examined primarily with regard to existing, proposed or still purely hypothetical measures for implementing the Cramer criteria for the sustainable production of biomass. Several other, non-energy-related issues are discussed in this report

  4. Self-consistent expansion for the molecular beam epitaxy equation.

    Science.gov (United States)

    Katzav, Eytan

    2002-03-01

    Motivated by a controversy over the correct results derived from the dynamic renormalization group (DRG) analysis of the nonlinear molecular beam epitaxy (MBE) equation, a self-consistent expansion for the nonlinear MBE theory is considered. The scaling exponents are obtained for spatially correlated noise of the general form $D(\mathbf{r}-\mathbf{r}', t-t') = 2 D_0\,|\mathbf{r}-\mathbf{r}'|^{2\rho-d}\,\delta(t-t')$. I find a lower critical dimension $d_c(\rho) = 4 + 2\rho$, above which the linear MBE solution appears. Below the lower critical dimension a $\rho$-dependent strong-coupling solution is found. These results help to resolve the controversy over the correct exponents that describe nonlinear MBE, using a reliable method that proved itself in the past by giving reasonable results for the strong-coupling regime of the Kardar-Parisi-Zhang system (for d>1), where DRG failed to do so.
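
    For orientation, a minimal LaTeX sketch of the growth equation that such an expansion is typically applied to; the specific nonlinear MBE (conserved-KPZ-type) form written here is an assumption for illustration, not a quotation from the paper:

      \[
        \partial_t h(\mathbf{r},t) = -K\,\nabla^4 h + \lambda\,\nabla^2\!\left(\nabla h\right)^2 + \eta(\mathbf{r},t),
        \qquad
        \langle \eta(\mathbf{r},t)\,\eta(\mathbf{r}',t') \rangle = 2 D_0\,|\mathbf{r}-\mathbf{r}'|^{2\rho-d}\,\delta(t-t').
      \]

    With this correlator, the quoted lower critical dimension $d_c(\rho)=4+2\rho$ separates the linear (weak-coupling) regime from the $\rho$-dependent strong-coupling one.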

  5. Giving in Europe : The state of research on giving in 20 European countries

    NARCIS (Netherlands)

    Hoolwerf, L.K.; Schuyt, T.N.M.

    2017-01-01

    This study is an initial attempt to map philanthropy in Europe and presents a first overall estimation of the European philanthropic sector. Containing an overview of what we know about research on the philanthropy sector, it provides data, and an assessment of the data, on giving by households,

  6. Smoothing of Fused Spectral Consistent Satellite Images with TV-based Edge Detection

    DEFF Research Database (Denmark)

    Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli

    2007-01-01

    Several widely used methods have been proposed for fusing high resolution panchromatic data and lower resolution multi-channel data. However, many of these methods fail to maintain the spectral consistency of the fused high resolution image, which is of high importance to many of the applications based on satellite data. Additionally, most conventional methods are loosely connected to the image forming physics of the satellite image, giving these methods an ad hoc feel. Vesteinsson et al. [1] proposed a method of fusion of satellite images that is based on the properties of imaging physics in a statistically meaningful way and was called spectral consistent pansharpening (SCP). In this paper we improve this framework for satellite image fusion by introducing a better image prior, via data-dependent image smoothing. The dependency is obtained via a total variation edge detection method.
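
    Below is a minimal sketch of the kind of total-variation-based edge weight that could drive such data-dependent smoothing; the function name, the exponential weighting and the parameters are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def tv_edge_weights(band, eps=1e-3, sigma=0.1):
          # Edge indicator = smoothed TV integrand |grad I|; weights decay with it,
          # so smoothing is strong in flat regions and weak across edges.
          gy, gx = np.gradient(band.astype(float))
          tv = np.sqrt(gx**2 + gy**2 + eps**2)
          return np.exp(-tv / sigma)

      # toy usage on a synthetic band with a vertical edge
      rng = np.random.default_rng(0)
      band = np.zeros((64, 64))
      band[:, 32:] = 1.0
      band += 0.05 * rng.standard_normal(band.shape)
      weights = tv_edge_weights(band)   # ~1 in flat areas, ~0 across the edge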

  7. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  8. Precommitted Investment Strategy versus Time-Consistent Investment Strategy for a General Risk Model with Diffusion

    Directory of Open Access Journals (Sweden)

    Lidong Zhang

    2014-01-01

    Full Text Available We study a general risk model and investigate the precommitted strategy and the time-consistent strategy under the mean-variance criterion, respectively. A Lagrange method is proposed to derive the precommitted investment strategy. Meanwhile, from the game-theoretical perspective, we find the time-consistent investment strategy by solving the extended Hamilton-Jacobi-Bellman equations. By comparing the precommitted strategy with the time-consistent strategy, we find that, under the time-consistent strategy, the company has to give up some current utility in order to keep a consistent satisfaction over the whole time horizon. Furthermore, we analyse, theoretically and numerically, the effect of the parameters on these two optimal strategies and the corresponding value functions.
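
    For reference, a hedged LaTeX sketch of the mean-variance criterion that both strategies target (the paper's specific risk-model dynamics are not reproduced here):

      \[
        J(t,x;u) \;=\; \mathbb{E}_{t,x}\!\left[X^{u}_{T}\right] \;-\; \frac{\gamma}{2}\,\mathrm{Var}_{t,x}\!\left[X^{u}_{T}\right].
      \]

    The precommitted strategy maximises $J(0,x_0;u)$ once and for all, whereas the time-consistent strategy is the equilibrium of the intrapersonal game in which each "time-$t$ self" maximises $J(t,X_t;u)$ taking the strategies of its future selves as given, which is what the extended Hamilton-Jacobi-Bellman system encodes.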

  9. No-call NIPT gives important information

    DEFF Research Database (Denmark)

    Uldbjerg, Niels

    2018-01-01

    Non-invasive prenatal testing (NIPT) based on cell-free fetal DNA fragments in maternal serum samples (cffDNA) is a well-established method for Down syndrome screening in early pregnancy. The sensitivity is above 99%. However, when used for screening of younger women without risk factors, the positive pr...

  10. Giving Reason to Prospective Mathematics Teachers

    Science.gov (United States)

    D'Ambrosio, Beatriz; Kastberg, Signe

    2012-01-01

    This article describes the development of the authors' understanding of the contradictions in their mathematics teacher education practice. This understanding emerged from contrasting analyses of the impact of the authors' practices in mathematics content courses versus mathematics methods courses. Examples of the authors' work with two students,…

  11. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  12. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and as the absence of the ECQ (“E contradictione quodlibet” rule that allows us to conclude any well formed formula from any contradiction. The aim of this paper is to explain the concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  13. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has to long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, not by glass durability.

  14. Cogema gives its communication a new impetus

    International Nuclear Information System (INIS)

    Saulnier, J.-E.

    2001-01-01

    Starting 2 November 1999, COGEMA launched a mass public communication campaign and created an Internet site, equipped with cameras (web-cams), to make everyone familiar with the COGEMA plant at La Hague. This system is designed to serve a communication policy that is resolutely open and attentive to French public concerns: - The COGEMA plant at La Hague is often perceived as a mysterious, occult and dehumanized world. This communication campaign, entitled 'We have nothing to hide', illustrated COGEMA's determination to inform citizens with the greatest possible transparency and its wish to bring the Group's industrial operations, and the people working there, closer to the public. The campaign included TV commercials and press ads. The underlying principle is to work on issues that have made news. The televised system included two films, shot at La Hague. The first, lasting 90 seconds, consists of interviews and testimonies of employees who represent the professional and human diversity of the plant. The second, in 45-second format, presents the questions to which public opinion wants answers. These questions are also repeated in the press ads. - To ensure that everyone obtains all the answers to his questions, the TV spots and press ads refer to the website http://www.cogemalahague.fr and to a toll-free number, 0 800-64-64-64. This campaign was the first stage of a long-term approach. Its positive reception from the public strengthens COGEMA's resolve to anticipate the legitimate information needs expressed by public opinion. As a responsible firm, COGEMA intends to adapt its communication policy in order to make the whole range of the Group's activities widely known. Beyond communication, COGEMA intends to carry on demonstrating its commitment to the nuclear industry and bolstering this sector's interests on the international scene. (authors)

  15. Self-consistent Green’s-function technique for surfaces and interfaces

    DEFF Research Database (Denmark)

    Skriver, Hans Lomholt; Rosengaard, N. M.

    1991-01-01

    We have implemented an efficient self-consistent Green’s-function technique for calculating ground-state properties of surfaces and interfaces, based on the linear-muffin-tin-orbitals method within the tight-binding representation. In this approach the interlayer interaction is extremely short ranged, and only a few layers close to the interface need be treated self-consistently via a Dyson equation. For semi-infinite jellium, the technique gives work functions and surface energies that are in excellent agreement with earlier calculations. For the bcc(110) surface of the alkali metals, we find...

  16. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  17. On giving radwaste management some status

    International Nuclear Information System (INIS)

    Dickson, H.W.; Walker, E.E.; Thiesing, J.W.

    1987-01-01

    Radwaste management is receiving ever increasing attention in the nuclear industry. The reasons for this include limited allocations for burial, increasing costs of handling and disposal, increased regulatory attention, and ALARA requirements. These issues have led to an increasing awareness of the disadvantages of running a "dirty" plant, and a variety of sophisticated systems have been proposed to fix the problem. Instead of these technologically difficult and sometimes very expensive fixes, this paper focuses on several relatively simple, "low tech" and inexpensive solutions. Much can be done with organizational alternatives and assigned responsibilities and authorities to improve the situation. Applying controls on the front end of a radiological task rather than attempting to reduce the magnitude at the back end is the only realistic method for proper radwaste management.

  18. Rethinking the social and cultural dimensions of charitable giving

    DEFF Research Database (Denmark)

    Bajde, Domen

    2009-01-01

    ...gift-giving and focuses on charitable gifts as an emblem of postmodern gift-giving to distant others. Historical evidence and sociological theory on postmodern solidarity are combined to shed light on the fluid duality of contemporary giving and the importance of the imaginary in charitable giving. The outlined socially symbolic dimensions of charitable giving are critically examined in light of postmodern consumer culture and recent corporate social responsibility trends. By openly engaging the proposed complexities of gift-giving, our vocabulary and understanding of postmodern giving can be revised so as to invite...

  19. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  20. Self-consistent modelling of ICRH

    International Nuclear Information System (INIS)

    Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.

    2001-01-01

    The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)

  1. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

    The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  2. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  3. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  4. Application of potential harmonic expansion method to BEC ...

    Indian Academy of Sciences (India)

    We adopt the potential harmonics expansion method for an ab initio solution ... commonly adopted mean-field theories, our method is capable of handling ... potentials in self-consistent mean-field calculation [7] gives wrong results as the ...

  5. An interview study of Gift-giving in China at New Year

    OpenAIRE

    Zhang, Shuo

    2007-01-01

    The purpose of this dissertation is to examine to what extent Chinese culture, including the customs of Chinese New Year, reciprocity, face and Guanxi, influences gift-giving among Chinese people, and how these factors affect the behavior of gift-giving during Chinese New Year, using a qualitative research method with in-depth interviews and limited observation. This dissertation stemmed from observations of gift-giving at the Beijing institute of geological engineering (BIGE) in...

  6. Give first priority to publicity and education.

    Science.gov (United States)

    1991-12-01

    Commentary is provided on the implementation of China's Three Priorities in strengthening family planning (FP) for population control. The Three Priorities issued by the Party Central Committee of China and the State Council refer to the emphasis on 1) "publicity and education rather than economic disincentives," 2) "contraception rather than induced abortion," and 3) "day-to-day management work rather than irregular campaigns." The expectations are that leaders at all levels should be active, steadfast, patient, and down to earth. Improvements in management lead to more constant, scientific, and systematic FP. Family planning should be voluntary. The achievement is not just population control but better relations with the Party and cadres, which leads to social stability and unity. The directives have been well thought out and are to be resolutely carried out. It was stressed in April 1991 by the General-Secretary and the Premier that coercion would not be tolerated in FP work. The confidence of the masses must be relied upon. The success of FP is guaranteed with the practice of these directives. Constancy of education and publicity is the key task. There should be a strong awareness of population issues and of the resources available per capita, and also an understanding and firm command of the principles and methods of better implementation. FP affects both the fundamental interests of the country and immediate personal interests. The task is expected to be difficult because traditional ideas are still strong. The country is just at the beginning stages of socialism. A social security system is not a reality and farmers' educational attainment is not high. Productivity in the rural areas is underdeveloped. There is a contradiction between the childbearing intentions of some farmers and the government requirements of FP. In order for the people to understand government FP policy, painstaking and meticulous education must be carried out to explain why FP is

  7. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

    The purpose of the present article is to call attention to a realistic quasi-particle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systematize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on QCD thermodynamics.
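
    One standard way of writing the thermodynamic consistency condition for a quasiparticle gas with a temperature-dependent effective mass $m(T)$ and a bag-like function $B(T)$ is sketched below; this is a common textbook form, not necessarily the exact representation used in the article:

      \[
        p(T) \;=\; p_{\mathrm{id}}\big(T, m^{2}(T)\big) \;-\; B(T),
        \qquad
        \frac{\partial p_{\mathrm{id}}}{\partial m^{2}}\,\frac{\mathrm{d} m^{2}}{\mathrm{d} T} \;=\; \frac{\mathrm{d} B}{\mathrm{d} T},
      \]

    which guarantees that the entropy density $s=\partial p/\partial T$ keeps its ideal-gas form and that $\varepsilon = Ts - p$ holds.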

  8. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  9. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  10. Consistent estimation of Gibbs energy using component contributions.

    Directory of Open Access Journals (Sweden)

    Elad Noor

    Full Text Available Standard Gibbs energies of reactions are increasingly being used in metabolic modeling for applying thermodynamic constraints on reaction rates, metabolite concentrations and kinetic parameters. The increasing scope and diversity of metabolic models has led scientists to look for genome-scale solutions that can estimate the standard Gibbs energy of all the reactions in metabolism. Group contribution methods greatly increase coverage, albeit at the price of decreased precision. We present here a way to combine the estimations of group contribution with the more accurate reactant contributions by decomposing each reaction into two parts and applying one of the methods on each of them. This method gives priority to the reactant contributions over group contributions while guaranteeing that all estimations will be consistent, i.e. will not violate the first law of thermodynamics. We show that there is a significant increase in the accuracy of our estimations compared to standard group contribution. Specifically, our cross-validation results show an 80% reduction in the median absolute residual for reactions that can be derived by reactant contributions only. We provide the full framework and source code for deriving estimates of standard reaction Gibbs energy, as well as confidence intervals, and believe this will facilitate the wide use of thermodynamic data for a better understanding of metabolism.
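
    The following is a minimal numerical sketch of the decomposition idea (reactions inside the span of the measured data handled by reactant contributions, the orthogonal remainder by group contributions); the matrix names and the plain least-squares estimators are illustrative assumptions, not the published component-contribution code.

      import numpy as np

      def estimate_dG(S, dG_obs, G, x):
          # S: compounds x reactions (measured), dG_obs: measured reaction energies,
          # G: compounds x groups incidence matrix, x: new reaction's stoichiometry.
          S_pinv = np.linalg.pinv(S)
          P_R = S @ S_pinv                        # projector onto span of measured reactions
          P_N = np.eye(S.shape[0]) - P_R          # orthogonal complement
          dGf_rc = S_pinv.T @ dG_obs              # reactant-contribution formation energies
          dg_gc, *_ = np.linalg.lstsq(S.T @ G, dG_obs, rcond=None)
          dGf_gc = G @ dg_gc                      # group-contribution formation energies
          # reactant contributions take priority; groups only fill the unexplained part
          return float(x @ (P_R @ dGf_rc + P_N @ dGf_gc))

    Because both pieces are fitted to the same measured reaction energies, estimates built this way cannot contradict each other across reactions, which is the consistency property emphasised in the abstract.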

  11. Gender inequalities in care-giving in Canada.

    Science.gov (United States)

    Dowler, J M; Jordan-Simpson, D A; Adams, O

    1992-01-01

    In Canada today, as in the past, men and women devote different amounts of time and effort to providing health care to their families and to the broader community. This paper examines "the social distinction between masculinity and femininity" in care-giving. Issues of gender inequality in care-giving have the potential to affect the formal health care system as the burden of caring will increase over the next few decades with the aging of society. This increase in need for care is occurring at a time when the primary care providers, women, have additional demands on their time. Canadian society has been facing a series of social, demographic and economic shifts such as a higher divorce rate, two-income families, women's increasing professional commitments and first pregnancies at later ages. These factors may not affect women's willingness to care for others; however, women may need support to provide that care. Assuming present trends, increasing need for families to care for elderly relatives may be inevitable. Perhaps society will recognize this need and provide support to those providing informal care. This could take the form of allowing individuals more time to provide care through child care leave, leave to care for parents and job-sharing. Support to those providing informal care might also be facilitated through community support services such as respite care, household maintenance, psychological support to care-givers, support groups, informal networks within a community and consideration of unconventional support methods.

  12. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.

  13. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodological procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study provides useful information about the statistical and hydrological processes involved in different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
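
    As an illustration of the purely statistical approach mentioned above, a minimal sketch of fitting a GEV distribution to annual maximum discharges and reading off a low-probability return level; the data are synthetic and this is not the study's code.

      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(42)
      # synthetic annual maximum discharges (m^3/s) standing in for an observed series
      annual_max = genextreme.rvs(c=-0.1, loc=300.0, scale=80.0, size=60, random_state=rng)

      c, loc, scale = genextreme.fit(annual_max)          # maximum-likelihood GEV fit
      T = 1000.0                                          # return period in years
      q_T = genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
      print(f"estimated {T:.0f}-year flood: {q_T:.0f} m^3/s")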

  14. [Gift giving and the ethics of the caregiver].

    Science.gov (United States)

    Grassin, Marc

    2014-12-01

    Modern societies establish relationships on a contract basis, but the caregiver relationship invariably involves the notion of a gift. Caring engages the giving / receiving / giving back circle of reciprocity. The caregiving relationship requires a gift ethic which gives meaning to the nurse/patient contract.

  15. Using the Perceptron Algorithm to Find Consistent Hypotheses

    OpenAIRE

    Anthony, M.; Shawe-Taylor, J.

    1993-01-01

    The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
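
    For concreteness, a minimal sketch of using the perceptron rule to search for a halfspace consistent with a boolean sample; this is the generic textbook algorithm, not the specific construction analysed in the paper.

      import numpy as np

      def perceptron_consistent(X, y, max_epochs=10_000):
          # Returns (w, b) with sign(w.x + b) matching the labels y in {-1, +1}.
          # Terminates whenever the sample is linearly separable (perceptron
          # convergence theorem), though the number of updates can be very large.
          X = np.asarray(X, dtype=float)
          y = np.asarray(y, dtype=float)
          w, b = np.zeros(X.shape[1]), 0.0
          for _ in range(max_epochs):
              mistakes = 0
              for xi, yi in zip(X, y):
                  if yi * (w @ xi + b) <= 0:   # misclassified (or on the boundary)
                      w += yi * xi             # classic perceptron update
                      b += yi
                      mistakes += 1
              if mistakes == 0:
                  return w, b                  # consistent hypothesis found
          raise RuntimeError("no consistent halfspace found within the epoch budget")

      # toy usage: boolean OR on {0,1}^2, labels in {-1, +1}
      w, b = perceptron_consistent([[0, 0], [0, 1], [1, 0], [1, 1]], [-1, 1, 1, 1])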

  16. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
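
    A toy sketch of one ingredient such decentralized schemes rely on, namely switch-local, dependency-ordered installation of new next-hop rules (a switch updates only after its downstream successor on the new path signals it is ready, so packets never hit a missing rule); this illustrates the general idea only and is not the ez-Segway algorithm.

      def schedule_update(new_path):
          # Order in which switches along new_path may safely install their new
          # next-hop rule, walking from the switch nearest the destination back
          # towards the ingress; "ready" mimics the message a downstream switch
          # would send to its upstream neighbour.
          ready = {new_path[-1]}            # the destination needs no rule change
          order = []
          for i in range(len(new_path) - 2, -1, -1):
              switch, downstream = new_path[i], new_path[i + 1]
              if downstream in ready:
                  order.append(switch)
                  ready.add(switch)
          return order

      print(schedule_update(["s1", "s3", "s4", "d"]))   # ['s4', 's3', 's1']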

  17. A Simple Method for Decreasing the Liquid Junction Potential in a Flow-through-Type Differential pH Sensor Probe Consisting of pH-FETs by Exerting Spatiotemporal Control of the Liquid Junction

    Science.gov (United States)

    Yamada, Akira; Mohri, Satoshi; Nakamura, Michihiro; Naruse, Keiji

    2015-01-01

    The liquid junction potential (LJP), the phenomenon that occurs when two electrolyte solutions of different composition come into contact, prevents accurate measurements in potentiometry. The effect of the LJP is usually remarkable in measurements of diluted solutions with low buffering capacities or low ion concentrations. Our group has constructed a simple method to eliminate the LJP by exerting spatiotemporal control of a liquid junction (LJ) formed between two solutions, a sample solution and a baseline solution (BLS), in a flow-through-type differential pH sensor probe. The method was contrived based on microfluidics. The sensor probe is a differential measurement system composed of two ion-sensitive field-effect transistors (ISFETs) and one Ag/AgCl electrode. With our new method, the border region of the sample solution and BLS is vibrated in order to mix solutions and suppress the overshoot after the sample solution is suctioned into the sensor probe. Compared to the conventional method without vibration, our method shortened the settling time from over two min to 15 s and reduced the measurement error by 86% to within 0.060 pH. This new method will be useful for improving the response characteristics and decreasing the measurement error of many apparatuses that use LJs. PMID:25835300

  18. IMPACT OF THE “GIVING CIGARETTES IS GIVING HARM” CAMPAIGN ON KNOWLEDGE AND ATTITUDES OF CHINESE SMOKERS

    Science.gov (United States)

    Huang, Li-Ling; Thrasher, James F.; Jiang, Yuan; Li, Qiang; Fong, Geoffrey T.; Chang, Yvette; Walsemann, Katrina M.; Friedman, Daniela B.

    2015-01-01

    Objective To date there is limited published evidence on the efficacy of tobacco control mass media campaigns in China. This study aimed to evaluate the impact of a mass media campaign “Giving Cigarettes is Giving Harm” (GCGH) on Chinese smokers’ knowledge of smoking-related harms and attitudes toward cigarette gifts. Methods Population-based, representative data were analyzed from a longitudinal cohort of 3,709 adult smokers who participated in the International Tobacco Control China Survey conducted in six Chinese cities before and after the campaign. Logistic regression models were estimated to examine associations between campaign exposure and attitudes about cigarettes as gifts measured post-campaign. Poisson regression models were estimated to assess the effects of campaign exposure on post-campaign knowledge, adjusting for pre-campaign knowledge. Findings Fourteen percent (n=335) of participants recalled the campaign within the cities where the GCGH campaign was implemented. Participants in the intervention cities who recalled the campaign were more likely to disagree that cigarettes are good gifts (71% vs. 58%) and had greater campaign-targeted knowledge than those who did not recall the campaign (mean = 1.97 vs. 1.62). Changes in campaign-targeted knowledge were similar in both cities, perhaps due to a secular trend, low campaign recall, or contamination issues. Conclusions These findings suggest that the GCGH campaign increased knowledge of smoking harms, which could promote downstream cessation. Findings provide evidence to support future campaign development to effectively fight the tobacco epidemic in China. PMID:24813427
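
    A hedged sketch of the kinds of models described in the Methods (logistic regression for a post-campaign attitude on campaign recall, Poisson regression for post-campaign knowledge adjusted for the pre-campaign count); the file and column names are hypothetical and this is not the authors' analysis code.

      import pandas as pd
      import statsmodels.formula.api as smf

      # hypothetical columns: recall (0/1 campaign recall), disagree_gift (0/1
      # post-wave attitude), know_post / know_pre (counts of campaign-targeted
      # harms known at the post- and pre-campaign waves)
      df = pd.read_csv("itc_china_cohort.csv")   # hypothetical file name

      attitude_fit = smf.logit("disagree_gift ~ recall", data=df).fit()
      knowledge_fit = smf.poisson("know_post ~ recall + know_pre", data=df).fit()

      print(attitude_fit.summary())
      print(knowledge_fit.summary())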

  19. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  20. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  1. Ethics of trial drug use: to give or not to give?

    Science.gov (United States)

    Ebunoluwa, Oduwole O; Kareem, Fayemi A

    2016-01-01

    The 2014 outbreak of Ebola viral disease in some West African countries, which later spread to the USA and Spain, has continued to be a subject of global public health debate. While there is not yet any approved vaccine or drug to cure Ebola, moral questions of bioethical significance are emerging even as vaccine studies are at different clinical trial phases. This paper, through a normative and critical approach, focuses on the question of whether or not it is ethical to give experimental drugs to Ebola victims in West Africa. Given the global panic and the deadly, contagious nature of Ebola, this paper argues on three major compassionate grounds that it is ethical to use experimental drugs on dying African victims of Ebola. Besides respecting patient and family consent in the intervention process, this paper argues that the use of Ebola trial drugs on the West African population will be ethical if it promotes the common good and does not violate the fundamental principles of transparency and integrity in human research ethics. It uses the Kantian ethical framework of universality as a basis for a moral defense of allowing access to drugs that are not yet approved. The paper provides arguments to strengthen the provisional, compassionate-grounds recommendation of the WHO's Strategic Advisory Group of Experts on Immunization (SAGE) on Ebola vaccines and vaccination.

  2. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
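
    A small numerical illustration of the self-consistency constraints described above: when the proxy is built from the very assets regressed on it, the weighted betas sum to one and the weighted alphas sum to zero, exactly, in any sample. The data here are synthetic and purely illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      n_assets, n_obs = 5, 5000
      r = 0.01 * rng.standard_normal((n_obs, n_assets))   # asset returns
      w = rng.dirichlet(np.ones(n_assets))                # proxy portfolio weights
      r_p = r @ w                                         # proxy (self-consistent "market")

      X = np.column_stack([np.ones(n_obs), r_p])
      alphas, betas = [], []
      for i in range(n_assets):
          a, b = np.linalg.lstsq(X, r[:, i], rcond=None)[0]   # OLS: r_i = a + b r_p + e
          alphas.append(a)
          betas.append(b)

      print(np.dot(w, betas))    # "normality":     sum_i w_i beta_i  = 1 (to rounding)
      print(np.dot(w, alphas))   # "orthogonality": sum_i w_i alpha_i = 0 (to rounding)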

  3. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)

  4. Development of constraint algorithm for the number of electrons in molecular orbitals consisting mainly 4f atomic orbitals of rare-earth elements and its introduction to tight-binding quantum chemical molecular dynamics method

    International Nuclear Information System (INIS)

    Endou, Akira; Onuma, Hiroaki; Jung, Sun-ho

    2007-01-01

    Our original tight-binding quantum chemical molecular dynamics code, 'Colors', has been successfully applied to the theoretical investigation of complex materials including rare-earth elements, e.g., metal catalysts supported on a CeO2 surface. To expand our code so as to obtain a good convergence for the electronic structure of a calculation system including a rare-earth element, we developed a novel algorithm to provide a constraint condition for the number of electrons occupying the selected molecular orbitals that mainly consist of 4f atomic orbitals of the rare-earth element. This novel algorithm was introduced in Colors. Using Colors, we succeeded in obtaining the classified electronic configurations of the 4f atomic orbitals of Ce4+ and reduced Ce ions in a CeO2 bulk model with one oxygen defect, which makes it difficult to obtain a good convergence using a conventional first-principles quantum chemical calculation code. (author)

  5. The storage capacity of cocoa seeds (Theobroma cacao L.) through giving Polyethylene Glycol (PEG) in the various of storage container

    Science.gov (United States)

    Lahay, R. R.; Misrun, S.; Sipayung, R.

    2018-02-01

    Cocoa is a plant whose seeds are recalcitrant. Giving PEG and using various storage containers was expected to increase the storage capacity of cocoa seeds during the storage period. The research aimed to identify the storage capacity of cocoa seeds through giving PEG in various storage containers. The research took place in Hataram Jawa II, Kabupaten Simalungun, Propinsi Sumatera Utara, Indonesia. The method of this research was a split-split plot design with 3 replications. Storage period was assigned to the main plot (4 levels), PEG concentration to the subplot (4 levels) and storage container to the sub-subplot (3 types). The results showed that, up to 4 days of storage with a 45% PEG concentration in all storage containers, the decrease in the percentage of seed germination could be held to 2.90%, and germination could be maintained for up to 16 days with a 45% PEG concentration in a perforated plastic storage container. The percentage of molded seeds and the seed moisture content increased with longer storage, although the seed moisture content increased only up to 12 days of storage and decreased at 16 days of storage.

  6. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.
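
    For readers less used to pricing-kernel notation, a generic sketch of kernel-based valuation in a single market and of a ratio-type conversion factor between two curves x and y; the notation and the precise across-curve formula of the paper are not reproduced here, so the second expression should be read as an illustrative assumption.

      \[
        P^{x}_{t} \;=\; \frac{1}{\pi^{x}_{t}}\,\mathbb{E}_{t}\!\left[\pi^{x}_{T}\,X_{T}\right],
        \qquad
        Q^{xy}_{tT} \;=\; \frac{\pi^{y}_{T}/\pi^{y}_{t}}{\pi^{x}_{T}/\pi^{x}_{t}},
      \]

    where $\pi^{x}$ and $\pi^{y}$ are the pricing kernels of the two markets and $Q^{xy}$ plays the role of a curve-conversion factor linking them.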

  7. Social support and ambulatory blood pressure: an examination of both receiving and giving.

    Science.gov (United States)

    Piferi, Rachel L; Lawler, Kathleen A

    2006-11-01

    The relationship between the social network and physical health has been studied extensively and it has consistently been shown that individuals live longer, have fewer physical symptoms of illness, and have lower blood pressure when they are a member of a social network than when they are isolated. Much of the research has focused on the benefits of receiving social support from the network and the effects of giving to others within the network have been neglected. The goal of the present research was to systematically investigate the relationship between giving and ambulatory blood pressure. Systolic blood pressure, diastolic blood pressure, mean arterial pressure, and heart rate were recorded every 30 min during the day and every 60 min at night during a 24-h period. Linear mixed models analyses revealed that lower systolic and diastolic blood pressure and mean arterial pressure were related to giving social support. Furthermore, correlational analyses revealed that participants with a higher tendency to give social support reported greater received social support, greater self-efficacy, greater self-esteem, less depression, and less stress than participants with a lower tendency to give social support to others. Structural equation modeling was also used to test a proposed model that giving and receiving social support represent separate pathways predicting blood pressure and health. From this study, it appears that giving social support may represent a unique construct from receiving social support and may exert a unique effect on health.

  8. Effects of Dzyaloshinsky–Moriya interaction on magnetism in nanodisks from a self-consistent approach

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhaosen, E-mail: liuzhsnj@yahoo.com [Nanjing University of Information Science and Technology, Department of Applied Physics (China); Ian, Hou, E-mail: houian@umac.mo [University of Macau, Institute of Applied Physics and Materials Engineering, FST (China)

    2016-01-15

    We present a theoretical study of the magnetic properties of monolayer nanodisks with both Heisenberg exchange and Dzyaloshinsky–Moriya (DM) interactions. In particular, we survey the magnetic effects caused by anisotropy, external magnetic field, and disk size when the DM interaction is present, by means of a new quantum simulation method facilitated by a self-consistent algorithm based on mean-field theory. This computational approach finds that uniaxial anisotropy and a transversal magnetic field enhance the net magnetization and increase the transition temperature of the vortical phase while preserving the chiralities of the swirly magnetic structures. When the DM interaction is sufficiently strong for a given disk size, magnetic domains appear within the circularly bounded region; these vanish and give way to a single vortex when a transversal magnetic field is applied. The latter confirms the field-induced magnetic skyrmions observed in experiments.
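
    As a rough classical analogue of such a self-consistent loop, the sketch below relaxes each spin towards its local effective field (the zero-temperature mean-field fixed point) on a small monolayer disk with Heisenberg exchange, an interfacial-type DM vector D_ij = D (z x r_ij) and a perpendicular field; the lattice, parameters and classical treatment are all assumptions of this toy, not the quantum method of the paper.

      import numpy as np

      R, J, D = 6, 1.0, 0.6
      B = np.array([0.0, 0.0, 0.05])
      z_hat = np.array([0.0, 0.0, 1.0])

      # circular monolayer disk cut from a square lattice
      sites = [(x, y) for x in range(-R, R + 1) for y in range(-R, R + 1)
               if x * x + y * y <= R * R]
      index = {s: i for i, s in enumerate(sites)}
      N = len(sites)

      # nearest-neighbour bonds, each counted once, with its DM vector
      bonds = []
      for (x, y), i in index.items():
          for dx, dy in ((1, 0), (0, 1)):
              j = index.get((x + dx, y + dy))
              if j is not None:
                  bonds.append((i, j, D * np.cross(z_hat, [dx, dy, 0.0])))

      rng = np.random.default_rng(1)
      s = rng.standard_normal((N, 3))
      s /= np.linalg.norm(s, axis=1, keepdims=True)

      for _ in range(2000):
          h = np.tile(B, (N, 1))                       # local effective fields
          for i, j, d_ij in bonds:
              h[i] += J * s[j] + np.cross(d_ij, s[j])
              h[j] += J * s[i] - np.cross(d_ij, s[i])  # D_ji = -D_ij
          s_new = h / np.linalg.norm(h, axis=1, keepdims=True)
          mixed = 0.5 * s + 0.5 * s_new                # damped update for stability
          mixed /= np.linalg.norm(mixed, axis=1, keepdims=True)
          if np.max(np.linalg.norm(mixed - s, axis=1)) < 1e-9:
              s = mixed
              break
          s = mixed

      print("net magnetization per spin:", s.mean(axis=0))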

  9. Why Feminism? How Feminist Methodologies Can Aid Our Efforts to ‘Give Back’ Through Research

    Directory of Open Access Journals (Sweden)

    Hekia Ellen Golden Bodwitch

    2014-07-01

    Full Text Available In this thematic section, the authors take a critical stance to the notion of giving back. They emphasize that giving back should be a model of solidarity and movement building, not charity. They push us to consider the ways in which the framework of giving back may actually reinforce hierarchical relationships between the researcher and the researched. In doing so, they offer new ways of thinking about the relationship between researchers and their communities of subjects. The strategies employed by these authors resonate with work from feminist activists and scholars whose approaches bring us alternative theories and methods through which to address the potentially dangerous effects of speaking for others through research. Examined alongside the giving back pieces in this section, these feminist contributions illuminate ways that we can give back by advancing the anti-oppression agendas of marginalized subjects through our research.

  10. Optimization method development of the core characteristics of a fast reactor in order to explore possible high performance solutions (a solution being a consistent set of fuel, core, system and safety)

    International Nuclear Information System (INIS)

    Ingremeau, J.-J.X.

    2011-01-01

    In the study of any new nuclear reactor, the design of the core is an important step. However, designing and optimising a reactor core is quite complex, as it involves neutronics, thermal-hydraulics and fuel thermomechanics, and the design of such a system is usually achieved through an iterative process involving several different disciplines. In order to solve such a multi-disciplinary system quickly, while observing the appropriate constraints, a new approach has been developed to optimise both the core performance (in-cycle Pu inventory, fuel burn-up, etc.) and the core safety characteristics (safety estimators) of a Fast Neutron Reactor. This new approach, called FARM (Fast Reactor Methodology), uses analytical models and interpolations (meta-models) from CEA reference codes for neutronics, thermal-hydraulics and fuel behaviour, which are coupled to automatically design a core based on several optimization variables. This global core model is then linked to a genetic algorithm and used to explore and optimise new core designs with improved performance. Consideration has also been given to which parameters can best be used to define the core performance and how safety can be taken into account. This new approach has been used to optimize the design of three concepts of Gas-cooled Fast Reactor (GFR). For the first one, using a SiC/SiCf-cladded carbide-fuelled helium-bonded pin, the results demonstrate that the CEA reference core obtained with the traditional iterative method was an optimal core, but one among many other possibilities (that is to say, on the Pareto front). The optimization also found several other cores which exhibit some improved features at the expense of other safety or performance estimators. An evolution of this concept using a 'buffer', a new technology being developed at CEA, has hence been introduced into FARM. The FARM optimisation produced several core designs using this technology, and estimated their performance. The results obtained show that
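
    A minimal sketch of the Pareto (non-dominated) filtering step that sits at the heart of such a meta-model-plus-genetic-algorithm exploration; the random "designs" and the two analytic objectives are placeholders, not FARM's models.

      import numpy as np

      rng = np.random.default_rng(7)
      designs = rng.uniform(size=(200, 3))            # placeholder optimization variables

      # placeholder meta-models; both objectives are to be minimised
      f1 = designs[:, 0] + 0.3 * designs[:, 1]        # e.g. an in-cycle Pu inventory proxy
      f2 = 1.0 - designs[:, 0] + 0.2 * designs[:, 2]  # e.g. a safety-estimator proxy
      F = np.column_stack([f1, f2])

      def pareto_front(F):
          # Keep the designs not dominated by any other (<= in all objectives,
          # strictly < in at least one).
          keep = []
          for i, fi in enumerate(F):
              dominated = np.any(np.all(F <= fi, axis=1) & np.any(F < fi, axis=1))
              if not dominated:
                  keep.append(i)
          return np.array(keep)

      front = pareto_front(F)
      print(f"{front.size} non-dominated designs out of {len(F)}")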

  11. Classical and Quantum Consistency of the DGP Model

    CERN Document Server

    Nicolis, A; Nicolis, Alberto; Rattazzi, Riccardo

    2004-01-01

    We study the Dvali-Gabadadze-Porrati model by the method of the boundary effective action. The truncation of this action to the bending mode π consistently describes physics in a wide range of regimes both at the classical and at the quantum level. The Vainshtein effect, which restores agreement with precise tests of general relativity, follows straightforwardly. We give a simple and general proof of stability, i.e. absence of ghosts in the fluctuations, valid for most of the relevant cases, like for instance the spherical source in asymptotically flat space. However we confirm that around certain interesting self-accelerating cosmological solutions there is a ghost. We consider the issue of quantum corrections. Around flat space π becomes strongly coupled below a macroscopic length of 1000 km, thus impairing the predictivity of the model. Indeed the tower of higher dimensional operators which is expected by a generic UV completion of the model limits predictivity at even larger length scales. We outline ...

  12. Exploring women's personal experiences of giving birth in Gonabad city: a qualitative study.

    Science.gov (United States)

    Askari, Fariba; Atarodi, Alireza; Torabi, Shirin; Moshki, Mahdi

    2014-05-08

    Women's health is an important task in society. The aim of this qualitative study, which used a phenomenological approach, was to explore the personal experiences of women in Gonabad city who had positive experiences of giving birth, in order to establish quality care and identify the factors related to midwifery care for this physiological phenomenon. The participants were 21 primiparous women who had a normal, uncomplicated birth in the hospital of Gonabad University of Medical Sciences. Based on purposeful sampling, in-depth interviews were continued until data saturation was reached. The data were collected through open and semi-structured interactional in-depth interviews with all the participants. All the interviews were taped, transcribed and then analyzed through a qualitative content analysis method to identify the concepts and themes. Several categories emerged. A quiet and safe environment was the most urgent need of most women giving birth. Unnecessary routine interventions that are performed on all women regardless of their needs, and that should be avoided, were identified, such as absolute rest, establishing an intravenous line, frequent vaginal examinations, fasting and early amniotomy. All the women wanted to take an active part in their birth, because they believed it could affect its course. We hope that giving birth will be a pleasant and enjoyable experience for all mothers.

  13. Principle of Care and Giving to Help People in Need.

    Science.gov (United States)

    Bekkers, René; Ottoni-Wilhelm, Mark

    2016-01-01

    Theories of moral development posit that an internalized moral value that one should help those in need-the principle of care-evokes helping behaviour in situations where empathic concern does not. Examples of such situations are helping behaviours that involve cognitive deliberation and planning, that benefit others who are known only in the abstract, and who are out-group members. Charitable giving to help people in need is an important helping behaviour that has these characteristics. Therefore we hypothesized that the principle of care would be positively associated with charitable giving to help people in need, and that the principle of care would mediate the empathic concern-giving relationship. The two hypotheses were tested across four studies. The studies used four different samples, including three nationally representative samples from the American and Dutch populations, and included both self-reports of giving (Studies 1-3), giving observed in a survey experiment (Study 3), and giving observed in a laboratory experiment (Study 4). The evidence from these studies indicated that a moral principle to care for others was associated with charitable giving to help people in need and mediated the empathic concern-giving relationship. © 2016 The Authors. European Journal of Personality published by John Wiley & Sons Ltd on behalf of European Association of Personality Psychology.

  14. Characterization of a multidrug resistant Salmonella enterica give ...

    African Journals Online (AJOL)

    Salmonella enterica Give is one of the serotypes that have been incriminated in Salmonella infections; sometimes associated with hospitalization and mortalities in humans and animals in some parts of the world. In this work, we characterized one Salmonella Give isolated from cloaca swab of an Agama agama lizard ...

  15. Magma chamber interaction giving rise to asymmetric oscillations

    Science.gov (United States)

    Walwer, D.; Ghil, M.; Calais, E.

    2017-12-01

    Geodetic time series at four volcanoes (Okmok, Akutan, Shishaldin, and Réunion) are processed using Multi-channel Singular Spectrum Analysis (M-SSA) and reveal sawtooth-shaped oscillations; the latter are characterized by short intervals of fast inflation followed by longer intervals of slower deflation. At Okmok and Akutan, the oscillations are first damped and then accentuated. At Okmok, the increase in amplitude of the oscillations is followed by an eruption. We first show that the dynamics of these four volcanoes bears similarities to that of a simple nonlinear, dissipative oscillator, indicating that the inflation-deflation episodes are relaxation oscillations. These observations imply that ab initio dynamical models of magma chambers should possess an asymmetric oscillatory regime. Next, based on the work of Whitehead and Helfrich [1991], we show that a model of two magma chambers — connected by a cylindrical conduit in which the magma viscosity depends on temperature — gives rise to asymmetric overpressure oscillations in the magma reservoirs. These oscillations lead to surface deformations that are consistent with those observed at the four volcanoes in this study. This relaxation oscillation regime occurs only when the vertical temperature gradient in the host rock between the two magma chambers is large enough and when the magma flux entering the volcanic system is sufficiently high. Because the magma is supplied by a deeper source region, the input flux depends on the pressure difference between the source and the deepest reservoir. When this difference is not sufficiently high, the magma flux decreases exponentially, leading to damped oscillations as observed at Akutan and Okmok. The combination of observational and modeling results clearly supports the role of relaxation oscillations in the dynamics of volcanic systems.
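
    As a conceptual illustration only, the asymmetric alternation of slow and fast phases described above is the hallmark of any relaxation oscillator. The sketch below integrates a van der Pol oscillator with large damping; it is not the Whitehead and Helfrich two-chamber magma model, just a qualitative analogue, and all parameter values are arbitrary:

```python
# Conceptual sketch: a generic relaxation oscillator (van der Pol, large mu) producing
# cycles with long slow phases interrupted by short fast jumps, analogous in character
# to the sawtooth-shaped inflation-deflation episodes described in the record.
import numpy as np
from scipy.integrate import solve_ivp

MU = 20.0  # large damping parameter -> strongly asymmetric (relaxation) oscillations

def van_der_pol(t, y):
    x, v = y
    return [v, MU * (1.0 - x**2) * v - x]

sol = solve_ivp(van_der_pol, (0.0, 200.0), [0.5, 0.0], max_step=0.05)
x = sol.y[0]

# Crude diagnostic: fast phases occupy only a small fraction of each cycle.
speed = np.abs(np.gradient(x, sol.t))
fast = np.mean(speed > 0.1 * speed.max())
print(f"fraction of time in the fast phase: {fast:.2f}")
```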

  16. The economic impact of giving up nuclear power in France

    International Nuclear Information System (INIS)

    Carnot, N.; Gallon, St.

    2001-01-01

    French nuclear plants will have to be shut down in the 2020s. Electricite de France (EDF) could replace them by either nuclear or gas-fired plants. Choosing the latter would lead to an increase in greenhouse gas (GHG) emissions and to a rise in EDF's generation costs. In 2020, the price of electricity in Europe will be determined by a competitive market. Therefore, a rise in EDF's generation costs will mainly depress its operating profit (and slightly increase the market price). Giving up nuclear power in 2020 would consequently lead to a fall in EDF's value and would penalize its shareholder, the State. On a macro-economic scale, the shock on the production cost of electricity would lead to a 0.5 to 1.0 percentage point drop in GDP (depending on the hypotheses). Structural unemployment would rise by 0.3 to 0.6 percentage points. The model used to obtain these results takes into account neither the risk of nuclear accidents nor the uncertainty on the costs of nuclear waste disposal. On the other hand, the gas price is assumed to be low, and the costs of gas-fired generation do not include the risk premium due to gas-price volatility. In conclusion, the best choice on both micro and macro scales consists of extending the life of current nuclear plants (if such an extension is authorised by safety regulators). These plants would be financially amortized, produce electricity at a very competitive cost and emit no GHG. Furthermore, extending the life of current nuclear plants would defer any irreversible commitment on their replacement. The necessary decision could therefore be taken later on, with more information on the cost of alternative generation technologies and their efficiency. (author)

  17. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.

  18. The notion of gift-giving and organ donation.

    Science.gov (United States)

    Gerrand, Nicole

    1994-04-01

    The analogy between gift-giving and organ donation was first suggested at the beginning of the transplantation era, when policy makers and legislators were promoting voluntary organ donation as the preferred procurement procedure. It was believed that the practice of gift-giving had some features which were also thought to be necessary to ensure that an organ procurement procedure would be morally acceptable, namely voluntarism and altruism. Twenty-five years later, the analogy between gift-giving and organ donation is still being made in the literature and used in organ donation awareness campaigns. In this paper I want to challenge this analogy. By examining a range of circumstances in which gift-giving occurs, I argue that the significant differences between the various types of gift-giving and organ donation make any analogy between the two very general and superficial, and I suggest that a more appropriate analogy can be found elsewhere.

  19. OPINION GIVING SERVICES AS A SOURCE OF CONSUMER INFORMATION

    Directory of Open Access Journals (Sweden)

    Joanna Wyrwisz

    2015-09-01

    Full Text Available The goal of the article is to determine the place and role of opinion giving services in consumer behaviours. The discussion is conducted around the thesis that, in the information society, opinion giving services constitute an important source of information for consumers in the process of selecting and purchasing both products and services. In the article a research approach based on theoretical and empirical examination is presented. The discussion starts by presenting a definition and the types of opinion giving services, which constitute the basis for characterizing the activities and usefulness of web portals collecting consumers' opinions. The use of opinion giving services in the purchase process was evaluated. A substantial interest in other consumers' opinions posted on the Internet was observed, together with a perception of them as credible. A positive assessment of the functionality of opinion giving services was also noted.

  20. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm in the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
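
    For the special case of a full constraint set, satisfiability by an event-labeled gene tree is known to reduce to the orthology graph being a cograph, i.e. free of induced paths on four vertices. The brute-force check below illustrates that special case only; it is not the polynomial-time graph-sandwich or branch-and-bound algorithms described in the record, and the gene names are invented:

```python
# Illustrative sketch: for a FULL set of pairwise constraints, satisfiability by an
# event-labeled gene tree reduces to the orthology graph being a cograph (P4-free).
# Brute-force P4 detection, for illustration only (exponential, not the paper's method).
from itertools import permutations

def is_p4_free(vertices, ortho_edges):
    """Return True if the orthology graph contains no induced path on 4 vertices."""
    adj = {v: set() for v in vertices}
    for u, v in ortho_edges:
        adj[u].add(v)
        adj[v].add(u)
    for a, b, c, d in permutations(vertices, 4):
        path = b in adj[a] and c in adj[b] and d in adj[c]
        chords = c in adj[a] or d in adj[a] or d in adj[b]
        if path and not chords:
            return False  # induced P4 found -> not a cograph -> unsatisfiable
    return True

genes = ["g1", "g2", "g3", "g4"]
orthologs = {("g1", "g2"), ("g2", "g3"), ("g3", "g4")}  # an induced P4
print(is_p4_free(genes, orthologs))  # False: this full relation set is unsatisfiable
```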

  1. Substituting fields within the action: Consistency issues and some applications

    International Nuclear Information System (INIS)

    Pons, Josep M.

    2010-01-01

    In field theory, as well as in mechanics, the substitution of some fields in terms of other fields at the level of the action raises an issue of consistency with respect to the equations of motion. We discuss this issue and give an expression which neatly displays the difference between doing the substitution at the level of the Lagrangian or at the level of the equations of motion. Both operations do not commute in general. A very relevant exception is the case of auxiliary variables, which are discussed in detail together with some of their relevant applications. We discuss the conditions for the preservation of symmetries--Noether as well as non-Noether--under the reduction of degrees of freedom provided by the mechanism of substitution. We also examine how the gauge fixing procedures fit in our framework and give simple examples on the issue of consistency in this case.

  2. Thermodynamically consistent model calibration in chemical kinetics

    Directory of Open Access Journals (Sweden)

    Goutsias John

    2011-05-01

    Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
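
    A minimal sketch of the underlying idea (not the authors' TCMC code or the EGF/ERK model): calibrate rate constants against data subject to a Wegscheider-type cycle constraint, which for a closed reaction cycle requires the product of forward rate constants to equal the product of reverse ones. The three-reaction cycle and the target values below are hypothetical:

```python
# Sketch of thermodynamically constrained calibration: fit rate constants k1..k3 and
# km1..km3 to hypothetical "measured" values while enforcing prod(kf) == prod(kr),
# the Wegscheider condition for a closed three-reaction cycle.
import numpy as np
from scipy.optimize import minimize

target = np.array([1.0, 2.0, 0.5, 1.5, 0.8, 2.5])     # hypothetical estimates of k1..k3, km1..km3

def loss(log_k):
    return np.sum((np.exp(log_k) - target) ** 2)       # plain least squares in k-space

def wegscheider(log_k):
    kf, kr = log_k[:3], log_k[3:]
    return np.sum(kf) - np.sum(kr)                      # == 0  <=>  prod(kf) == prod(kr)

res = minimize(loss, x0=np.log(target), constraints=[{"type": "eq", "fun": wegscheider}])
k_fit = np.exp(res.x)
print("calibrated rate constants:", np.round(k_fit, 3))
print("cycle check prod(kf)/prod(kr):", np.prod(k_fit[:3]) / np.prod(k_fit[3:]))
```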

  3. Weyl consistency conditions in non-relativistic quantum field theory

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)

    2016-12-05

    Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z is discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.

  4. Giving Back: Exploring Service-Learning in an Online Learning Environment

    Science.gov (United States)

    McWhorter, Rochell R.; Delello, Julie A.; Roberts, Paul B.

    2016-01-01

    Service-Learning (SL) as an instructional method is growing in popularity for giving back to the community while connecting the experience to course content. However, little has been published on using SL for online business students. This study highlights an exploratory mixed-methods, multiple case study of an online business leadership and…

  5. The accompanying adult: authority to give consent in the UK.

    Science.gov (United States)

    Lal, Seema Madhur Lata; Parekh, Susan; Mason, Carol; Roberts, Graham

    2007-05-01

    Children may be accompanied by various people when attending for dental treatment. Before treatment is started, there is a legal requirement that the operator obtain informed consent for the proposed procedure. In the case of minors, the person authorized to give consent (someone with parental responsibility) is usually a parent. The aim was to ascertain whether persons accompanying children attending the Department of Paediatric Dentistry at the Eastman Dental Hospital, London were empowered to give consent for the child's dental treatment. A total of 250 persons accompanying children attending the department were selected over a 6-month period. A questionnaire was used to establish whether the accompanying person(s) were authorized to give consent. The study showed that 12% of accompanying persons had no legal authority to give consent for the child's dental treatment. Clinicians need to be aware of the status of persons accompanying children to ensure valid consent is obtained.

  6. When may doctors give nurses telephonic treatment instructions?

    African Journals Online (AJOL)

    When is it legal for doctors to give nurses telephonic treatment instructions? ... telemedicine? Telemedicine is defined as 'the practice of medicine, from a distance, ... [6] Therefore, if in such circumstances the doctors cannot reach the patients in ...

  7. Can False Advertising Give Rise to Antitrust Liability? (2)

    OpenAIRE

    Christopher Cole

    2014-01-01

    With the Retractable Technologies case, is the theory that false advertising can give rise to violations of the Sherman Act, while rarely invoked, gaining traction? Christopher A. Cole (Crowell & Moring)

  8. Can False Advertising Give Rise to Antitrust Liability?

    OpenAIRE

    Christopher Cole

    2014-01-01

    With the Retractable Technologies case, is the theory that false advertising can give rise to violations of the Sherman Act, while rarely invoked, gaining traction? Christopher A. Cole (Crowell & Moring)

  9. Markel senior vice president to give Wachovia Lecture

    OpenAIRE

    Ho, Sookhan

    2008-01-01

    Markel Corporation's senior vice president and chief financial officer Richard R. Whitt will give a talk on Thursday, Nov. 13, as the Wachovia Distinguished Speaker in the Pamplin College of Business.

  10. developing skills of giving and receiving feedbacks between

    African Journals Online (AJOL)

    User

    could be by developing the skill of giving and receiving feedbacks among the individuals involved in the ... education rather than damaging their self esteem against to improve the teaching – process (David ..... A better skill of communication.

  11. Reciprocity revisited: Give and take in Dutch and immigrant families

    NARCIS (Netherlands)

    Komter, A.; Schans, J.M.D.

    2008-01-01

    Classical theory suggests that "generalized reciprocity," giving without clear expectations of returns, is characteristic for exchange within the family. Modern theory assumes differences between Western, "individualistic" cultures, and non-Western, more "collectivistic" cultures, presumably leading

  12. Benefits of Giving (A Book Review Using Islamic Perspective

    Directory of Open Access Journals (Sweden)

    M. Hamdar Arraiyyah

    2016-01-01

    Full Text Available This writing is a book review. It discusses a book entitled Give and Take. The book introduces a new approach to success, dividing people into three categories according to how they interact or communicate: takers, matchers, and givers. The writer of the book, Adam Grant, explains the principles and characteristics of each category. He presents many facts to show that being a giver brings benefits both to other people and to the giver. The objects of giving here comprise different kinds of help, such as wealth, ideas, knowledge, skills and information. He therefore motivates people to become givers. In this connection, the reviewer would like to show that the Islamic religion also motivates its followers to give help to others. However, there are some similarities and differences between the benefits of giving mentioned in the book and those in the verses of the Holy Qur’an and the sayings of Prophet Muhammad, peace be upon him.

  13. Assessing motivation for work environment improvements: internal consistency, reliability and factorial structure.

    Science.gov (United States)

    Hedlund, Ann; Ateg, Mattias; Andersson, Ing-Marie; Rosén, Gunnar

    2010-04-01

    Workers' motivation to actively take part in improvements to the work environment is assumed to be important for the efficiency of investments for that purpose. That gives rise to the need for a tool to measure this motivation. A questionnaire to measure motivation for improvements to the work environment has been designed. Internal consistency and test-retest reliability of the domains of the questionnaire have been measured, and the factorial structure has been explored, from the answers of 113 employees. The internal consistency is high (0.94), as well as the correlation for the total score (0.84). Three factors are identified accounting for 61.6% of the total variance. The questionnaire can be a useful tool in improving intervention methods. The expectation is that the tool can be useful, particularly with the aim of improving efficiency of companies' investments for work environment improvements. Copyright 2010 Elsevier Ltd. All rights reserved.
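
    The internal-consistency figure quoted above is a Cronbach's alpha; a generic sketch of how such a value is computed is given below, on a simulated item-response matrix (the 113 respondents mirror the sample size, but the data and the 10-item structure are invented, not the survey data):

```python
# Sketch of the internal-consistency statistic (Cronbach's alpha) on simulated data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(113, 1))                       # 113 respondents, as in the study
responses = latent + 0.5 * rng.normal(size=(113, 10))    # 10 correlated items (hypothetical)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```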

  14. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
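
    The thresholding and temporal-consistency logic described above can be sketched generically as follows. The abundance array, the grid dimensions and the choice of a mean-plus-one-standard-deviation threshold are illustrative assumptions, not the trawl-survey data or the exact spatial frequency distribution method of the study:

```python
# Illustrative sketch: compute Shannon diversity per grid cell and year, flag cells above
# a mean-based threshold as "hotspots", and measure how often each cell keeps that status.
import numpy as np

rng = np.random.default_rng(2)
n_years, n_cells, n_species = 8, 50, 30
# Hypothetical abundance counts per (year, cell, species).
counts = rng.poisson(lam=rng.uniform(0.1, 3.0, size=(1, 1, n_species)),
                     size=(n_years, n_cells, n_species))

def shannon(c):
    p = c / c.sum(axis=-1, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p), 0.0)
    return -terms.sum(axis=-1)

H = shannon(counts)                          # (n_years, n_cells)
threshold = H.mean() + H.std()               # one possible mean-based hotspot threshold
hotspot = H > threshold                      # boolean hotspot designation per year and cell
consistency = hotspot.mean(axis=0)           # fraction of years each cell is a hotspot
print("cells that are hotspots in >50% of years:", int((consistency > 0.5).sum()))
```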

  15. Sludge characterization: the role of physical consistency

    Energy Technology Data Exchange (ETDEWEB)

    Spinosa, Ludovico; Wichmann, Knut

    2003-07-01

    The physical consistency is an important parameter in sewage sludge characterization as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, and solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviour (flowability) and that between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)

  16. Testing for altruism and social pressure in charitable giving.

    Science.gov (United States)

    DellaVigna, Stefano; List, John A; Malmendier, Ulrike

    2012-01-01

    Every year, 90% of Americans give money to charities. Is such generosity necessarily welfare enhancing for the giver? We present a theoretical framework that distinguishes two types of motivation: individuals like to give, for example, due to altruism or warm glow, and individuals would rather not give but dislike saying no, for example, due to social pressure. We design a door-to-door fund-raiser in which some households are informed about the exact time of solicitation with a flyer on their doorknobs. Thus, they can seek or avoid the fund-raiser. We find that the flyer reduces the share of households opening the door by 9% to 25% and, if the flyer allows checking a Do Not Disturb box, reduces giving by 28% to 42%. The latter decrease is concentrated among donations smaller than $10. These findings suggest that social pressure is an important determinant of door-to-door giving. Combining data from this and a complementary field experiment, we structurally estimate the model. The estimated social pressure cost of saying no to a solicitor is $3.80 for an in-state charity and $1.40 for an out-of-state charity. Our welfare calculations suggest that our door-to-door fund-raising campaigns on average lower the utility of the potential donors.

  17. Social Relations of Fieldwork: Giving Back in a Research Setting

    Directory of Open Access Journals (Sweden)

    Clare Gupta

    2014-07-01

    Full Text Available The project of this special issue emerged from the guest editors' experiences as field researchers in sub-Saharan Africa. During this time both researchers faced the difficult question of "giving back" to the communities in which, and with whom, they worked—communities that were often far less privileged than the researchers were in terms of wealth, mobility, education, and access to health care. Returning from their field sites, both researchers felt a combination of guilt and frustration that they had not done enough or had not done things right. Thus emerged the idea of bringing together a group of researchers, from a range of disciplines, to discuss the topic of giving back in field research. This editorial describes the idea and process that led to the present collection of articles. The guest editors situate the project in the literature on feminist studies and briefly summarize each of the four thematic sections in this special issue. They conclude by emphasizing that their collection is not a guide to giving back. Rather than lay out hard and fast rules about what, how much, and to whom field researchers should give, their collection offers a series of examples and considerations for giving back in fieldwork.

  18. Authorization of the personnel of the Isotope Centre for the performance of functions related to safety and radiological protection

    International Nuclear Information System (INIS)

    Perez Pijuan, S.; Hernandez Alvarez, R.; Peres Reyes, Y.; Venegas Bernal, M.C.

    1998-01-01

    The approach used in a centre producing labelled compounds and radiopharmaceuticals for authorizing its support, operation and supervision personnel is described. The criteria used to define the positions relevant to the safety of the installation are set out. The training programmes are described; they were designed starting from the identification of the specific competencies for each post, with particular emphasis on the development of practical skills. The Automated System for the Administration of Training Programmes (GESAT) is used for the administration and evaluation of the training programmes.

  19. Gift-giving in the medical student--patient relationship.

    Science.gov (United States)

    Alamri, Yassar Abdullah S

    2012-08-01

    There is a paucity of published literature providing ethical guidance on gift-giving within the student-patient relationship. This is perhaps because the dynamics of the medical student-patient relationship have not yet been explored as extensively as those of the doctor-patient relationship. More importantly, however, gift-giving in the doctor-patient relationship has traditionally been from the patient to the doctor and not vice versa. This article examines the literature published in this area, reflecting on an encounter with a patient.

  20. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
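
    The consistency criterion described above (accept only branches recovered independently by different data sets and methods) can be illustrated at its simplest as an intersection of split sets; the splits below are invented placeholders, not the fungal data sets:

```python
# Toy illustration of the consistency criterion: keep only those splits (bipartition
# labels of the taxon set) recovered by every independent analysis. Placeholder data.
splits_dataset_a = {frozenset({"Asco", "Basidio"}), frozenset({"Mucor", "Chytrid"})}
splits_dataset_b = {frozenset({"Asco", "Basidio"}), frozenset({"Basidio", "Mucor"})}
splits_dataset_c = {frozenset({"Asco", "Basidio"}), frozenset({"Mucor", "Chytrid"})}

consistent_splits = splits_dataset_a & splits_dataset_b & splits_dataset_c
print(consistent_splits)   # only the split supported by all analyses survives
```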

  1. A New Bias Corrected Version of Heteroscedasticity Consistent Covariance Estimator

    Directory of Open Access Journals (Sweden)

    Munir Ahmed

    2016-06-01

    Full Text Available In the presence of heteroscedasticity, different available flavours of the heteroscedasticity consistent covariance estimator (HCCME) are used. However, the available literature shows that these estimators can be considerably biased in small samples. Cribari-Neto et al. (2000) introduce a bias adjustment mechanism and give a modified White estimator that becomes almost bias-free even in small samples. Extending these results, Cribari-Neto and Galvão (2003) present a similar bias adjustment mechanism that can be applied to a wide class of HCCMEs. In the present article, we follow the same mechanism as proposed by Cribari-Neto and Galvão to give a bias-corrected version of the HCCME, but we use an adaptive HCCME rather than the conventional HCCME. A Monte Carlo study is used to evaluate the performance of our proposed estimators.
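
    For background, the sketch below shows the uncorrected White estimator (HC0) and one standard small-sample adjustment (HC3) for an OLS regression, to make concrete the object that the bias-correction mechanism operates on. It is generic textbook code on simulated data, not the adaptive or bias-corrected estimators proposed in the article:

```python
# Generic sketch of heteroscedasticity-consistent covariance estimation for OLS:
# HC0 (White, 1980) and the leverage-adjusted HC3 variant, on simulated data.
import numpy as np

rng = np.random.default_rng(3)
n = 60
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + 0.3 * x)   # heteroscedastic errors

XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
resid = y - X @ beta
h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)           # leverage values h_ii

def sandwich(weights):
    meat = (X * weights[:, None]).T @ X               # X' diag(weights) X
    return XtX_inv @ meat @ XtX_inv

hc0 = sandwich(resid**2)                               # uncorrected White estimator
hc3 = sandwich(resid**2 / (1.0 - h)**2)                # small-sample adjustment
print("HC0 std errors:", np.sqrt(np.diag(hc0)))
print("HC3 std errors:", np.sqrt(np.diag(hc3)))
```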

  2. Consistency Anchor Formalization and Correctness Proofs

    OpenAIRE

    Miguel, Correia; Bessani, Alysson

    2014-01-01

    This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...

  3. Better by the Year. The FRI Annual Giving Book.

    Science.gov (United States)

    Williams, M. Jane

    Designed for a nonprofit organization executive, this book suggests how to start and run an increasingly profitable program for attracting the kind of gifts that will be repeated year after year. Preliminary preparations, the launch and administration of a campaign, four ways to reach higher goals, and annual giving ideas from education, health…

  4. Give me a break!: Informal caregiver attitudes towards respite care

    NARCIS (Netherlands)

    van Exel, J..; de Graaf, G.; Brouwer, W.B.F.

    2009-01-01

    Background/objective: Because informal health care is now recognized to be indispensable to health care systems, different forms of respite care have been developed and publicly funded that supposedly alleviate caregivers' perceived burdens and help prolong the care giving task. Nonetheless, the use

  5. Improving Evidence on Private Giving in Emerging Economies ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... gaps on amounts and sources. There is also a lack of research on regulations and policies that support or discourage private giving. This research project will explore philanthropic cooperation in emerging and developing country contexts by quantifying financial flows from emerging economies to developing countries.

  6. Why do firms give away their patents for free?

    NARCIS (Netherlands)

    Ziegler, Nicole; Gassmann, Oliver; Friesike, Sascha

    2013-01-01

    Within the trend of increasing patent commercialisation and open innovation, a recent phenomenon where firms give away their patents free of charge can be observed. This seems contradictory to the original intention of the patent system (enabling firms to create temporary monopolies to appropriate

  7. Giving Voice: A Course on American Indian Women.

    Science.gov (United States)

    Krouse, Susan Applegate

    1997-01-01

    Presents the story of the creation of an undergraduate course on the traditional and contemporary roles of women in North American Indian cultures. Notes that the course was designed around experiential learning precepts and the idea of "giving voice" to American Indian women. Lists texts used and evaluates course strengths. (DSK)

  8. A Conversation Model Enabling Intelligent Agents to Give Emotional Support

    NARCIS (Netherlands)

    Van der Zwaan, J.M.; Dignum, V.; Jonker, C.M.

    2012-01-01

    In everyday life, people frequently talk to others to help them deal with negative emotions. To some extent, everybody is capable of comforting other people, but so far conversational agents are unable to deal with this type of situation. To provide intelligent agents with the capability to give

  9. Giving Back — IDRC photo contest winner shares prize with ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-01-28

    Jan 28, 2011 ... Giving Back — IDRC photo contest winner shares prize with Senegalese colleagues ... South or the developed world are tackling the challenges of urban living. ... Upon his return to Canada, the 26-year-old wrote to IDRC the ...

  10. Advice-giving in the English lingua franca classroom

    African Journals Online (AJOL)

    important pragmatic differences between the ways in which advice is given by native speakers and ... gender differences and that teacher modeling may have an effect on which available form of advice-giving a ... nation wishes to participate in global enterprises such as international finance, multi-national corporations, and ...

  11. Reprint of: Bidding to give in the field

    NARCIS (Netherlands)

    Onderstal, S.; Schram, A.J.H.C.; Soetevent, A.R.

    2014-01-01

    In a door-to-door fundraising field experiment, we study the impact of fundraising mechanisms on charitable giving. We approached about 4500 households, each participating in an all-pay auction, a lottery, a non-anonymous voluntary contribution mechanism (VCM), or an anonymous VCM. In contrast to

  12. Reprint of : Bidding to give in the field

    NARCIS (Netherlands)

    Onderstal, Sander; Schram, Arthur J. H. C.; Soetevent, Adriaan R.

    In a door-to-door fundraising field experiment, we study the impact of fundraising mechanisms on charitable giving. We approached about 4500 households, each participating in an all-pay auction, a lottery, a non-anonymous voluntary contribution mechanism (VCM), or an anonymous VCM. In contrast to

  13. Exploring gender differences in charitable giving : The Dutch Case

    NARCIS (Netherlands)

    De Wit, Arjen; Bekkers, René

    2016-01-01

    Women’s philanthropy has drawn much attention during recent years, mostly in studies from the United States or the United Kingdom. Relevant issues are to what extent gender differences in charitable giving exist in another national context and how these differences can be explained. In this study,

  14. The good news about giving bad news to patients.

    Science.gov (United States)

    Farber, Neil J; Urban, Susan Y; Collier, Virginia U; Weiner, Joan; Polite, Ronald G; Davis, Elizabeth B; Boyer, E Gil

    2002-12-01

    There are few data available on how physicians inform patients about bad news. We surveyed internists about their activities in giving bad news to patients. One set of questions was about activities for the emotional support of the patient (11 items), and the other was about activities for creating a supportive environment for delivering bad news (9 items). The impact of demographic factors on the performance of emotionally supportive items, environmentally supportive items, and on the number of minutes reportedly spent delivering the news was analyzed by analysis of variance and multiple regression analysis. More than half of the internists reported that they always or frequently performed 10 of the 11 emotionally supportive items and 6 of the 9 environmentally supportive items while giving bad news to patients. The average time reportedly spent in giving bad news was 27 minutes. Although training in giving bad news had a significant impact on the number of emotionally supportive items reported, being a woman, being unmarried, and having a history of major illness were also associated with reporting a greater number of emotionally supportive activities. Internists report that they inform patients of bad news appropriately. Some deficiencies exist, specifically in discussing prognosis and referral of patients to support groups. Physician educational efforts should include discussion of prognosis with patients as well as the availability of support groups.

  15. Using "The Giving Tree" To Teach Literary Criticism.

    Science.gov (United States)

    Remler, Nancy Lawson

    2000-01-01

    Argues that introducing students to literary criticism while introducing them to literature boosts their confidence and abilities to analyze literature, and increases their interest in discussing it. Describes how the author, in her college-level introductory literature course, used Shel Silverstein's "The Giving Tree" (a children's…

  16. Reciprocity Revisited : Give and Take in Dutch and Immigrant Families

    NARCIS (Netherlands)

    Komter, Aafke; Schans, Djamila

    2008-01-01

    The idea that reciprocity is the basic principle underlying forms of social organization, among which the family, is as old as classical anthropology and sociology. The essence of the principle is that giving prompts receiving, thereby creating forms of ongoing exchange and durable cooperation.

  17. The genus Architeuthis was erected, without giving any diagnosis ...

    African Journals Online (AJOL)

    spamer

    The genus Architeuthis was erected, without giving any diagnosis, by Steenstrup in 1857 for a specimen stranded on the Danish coast in 1853. In 1880, Verrill gave the first description of the genus. Pfeffer (1912) related this history and also mentioned that traditional narratives and illustrations of the 16th century had.

  18. Informal care giving to more disabled people with multiple sclerosis.

    Science.gov (United States)

    Buchanan, Robert J; Radin, Dagmar; Chakravorty, Bonnie J; Tyry, Tuula

    2009-01-01

    About 30% of the people with multiple sclerosis (MS) require some form of home care assistance and 80% of that assistance is provided by informal or unpaid care givers. This study focuses on the care givers for 530 more disabled people with MS, with the objective of learning more about informal care giving to people with greater dependency and need for assistance. The data presented in this study were collected in a national survey of 530 people who provided informal care to more disabled people with MS. Almost half of these care givers reported that they provided more than 20 h of care per week to the person with MS, with more than 9 in 10 shopping for groceries, doing indoor housework, preparing meals or providing transportation for the person with MS. More than 4 in 10 employed care givers reduced the amount of time worked in the previous 12 months because of their care giving responsibilities. Although more than half of the MS care givers in our study reported that care giving was demanding, time consuming or challenging, about 90% of these MS care givers were happy that they could help. About two in three of these MS care givers found that care giving was rewarding, with more than 8 in 10 proud of the care they provided. More than a quarter of the informal care givers to people with MS thought they would benefit from treatment or counselling provided by mental health professionals. Not only is it necessary to provide access to mental health services for people with MS, but it is also important to ensure that their informal care givers have access to appropriate mental health care, given the scope of their care giving responsibilities.

  19. Give me a break! Informal caregiver attitudes towards respite care.

    Science.gov (United States)

    van Exel, Job; de Graaf, Gjalt; Brouwer, Werner

    2008-10-01

    Because informal health care is now recognized to be indispensable to health care systems, different forms of respite care have been developed and publicly funded that supposedly alleviate caregivers' perceived burdens and help prolong the care giving task. Nonetheless, the use of respite care services is low even among substantially strained caregivers. To throw light on this low usage, this paper explores the associations between attitudes towards respite care, characteristics of the care giving situation, and the need and use of respite care. The survey, administered to a sample of 273 informal caregivers, addressed caregiver, care recipient, and care giving situation characteristics, as well as the familiarity and use of respite care services. It also included a sub-set of 12 statements eliciting attitudes towards respite care from an earlier study [Van Exel NJA, De Graaf G, Brouwer WBF. Care for a break? An investigation of informal caregivers' attitudes toward respite care using Q-methodology. Health Policy 2007;83(2/3):332-42]. Associations between variables were measured using univariate statistics and multinomial logistic regression. We found three caregiver attitudes, distributed fairly equally in the sample, that are apparently associated with caregiver educational level, employment status, health and happiness, as well as care recipient gender, duration and intensity of care giving, relationship, co-residence, need for surveillance, and subjective burden and process utility of care giving. However, the relation between attitude and familiarity with and use of respite care services is ambiguous. Although further exploration is needed of the mix of Q-methodology and survey analysis, the overall results indicate that a considerable portion of the caregiver population needs but does not readily ask for support or respite care. This finding has important policy implications in the context of an ageing population.

  20. Cowpeas in Northern Ghana and the Factors that Predict Caregivers’ Intention to Give Them to Schoolchildren

    NARCIS (Netherlands)

    Abizari, A.R.; Pilime, N.; Armar-Klemesu, M.; Brouwer, I.D.

    2013-01-01

    Background Cowpeas are important staple legumes among the rural poor in northern Ghana. Our objectives were to assess the iron and zinc content of cowpea landraces and identify factors that predict the intention of mothers/caregivers to give cowpeas to their schoolchildren. Methods and Findings We

  1. Consistency and Inconsistency in PhD Thesis Examination

    Science.gov (United States)

    Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy

    2008-01-01

    This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…

  2. Norm, gender, and bribe-giving: Insights from a behavioral game.

    Directory of Open Access Journals (Sweden)

    Tian Lan

    Full Text Available Previous research has suggested that bribery is more normative in some countries than in others. To understand the underlying process, this paper examines the effects of social norm and gender on bribe-giving behavior. We argue that social norms provide information for strategic planning and impression management, and thus would impact participants' bribe amount. Besides, males are more agentic and focus more on impression management than females. We predicted that males would defy the norm in order to win when the amount of their bribe was kept private, but would conform to the norm when it was made public. To test this hypothesis, we conducted two studies using a competitive game. In each game, we asked three participants to compete in five rounds of creative tasks, and the winner was determined by a referee's subjective judgment of the participants' performance on the tasks. Participants were allowed to give bribes to the referee. Bribe-giving norms were manipulated in two domains: norm level (high vs. low) and norm context (private vs. public), in order to investigate the influence of informational and affiliational needs. Studies 1 and 2 consistently showed that individuals conformed to the norm level of bribe-giving while maintaining a relative advantage for economic benefit. Study 2 found that males gave larger bribes in the private context than in the public, whereas females gave smaller bribes in both contexts. We used a latent growth curve model (LGCM) to depict the development of bribe-giving behaviors during five rounds of competition. The results showed that gender, creative performance, and norm level all influence the trajectory of bribe-giving behavior.

  3. Norm, gender, and bribe-giving: Insights from a behavioral game.

    Science.gov (United States)

    Lan, Tian; Hong, Ying-Yi

    2017-01-01

    Previous research has suggested that bribery is more normative in some countries than in others. To understand the underlying process, this paper examines the effects of social norm and gender on bribe-giving behavior. We argue that social norms provide information for strategic planning and impression management, and thus would impact participants' bribe amount. Besides, males are more agentic and focus more on impression management than females. We predicted that males would defy the norm in order to win when the amount of their bribe was kept private, but would conform to the norm when it was made public. To test this hypothesis, we conducted two studies using a competitive game. In each game, we asked three participants to compete in five rounds of creative tasks, and the winner was determined by a referee's subjective judgment of the participants' performance on the tasks. Participants were allowed to give bribes to the referee. Bribe-giving norms were manipulated in two domains: norm level (high vs. low) and norm context (private vs. public), in order to investigate the influence of informational and affiliational needs. Studies 1 and 2 consistently showed that individuals conformed to the norm level of bribe-giving while maintaining a relative advantage for economic benefit. Study 2 found that males gave larger bribes in the private context than in the public, whereas females gave smaller bribes in both contexts. We used a latent growth curve model (LGCM) to depict the development of bribe-giving behaviors during five rounds of competition. The results showed that gender, creative performance, and norm level all influence the trajectory of bribe-giving behavior.

  4. Site selection process as a fundamental approach to the safety of radioactive waste repositories (radioactive installations) in Cuba

    International Nuclear Information System (INIS)

    Peralta Vidal, J.L.; Gil Castillo, R.O.; Chales Suarez, G.; Rodriguez Reyes, A.

    1998-01-01

    Based on best international practice, the requirements set out by the IAEA, specialized national experience, and the technical, economic and social conditions of Cuba, the site selection process for the disposal and long-term storage of spent fuel has been documented in the country.

  5. A Childhood Rich in Culture Gives a Socioeconomic Bonus

    DEFF Research Database (Denmark)

    Austring, Bennye Düranc

    2015-01-01

    The article outlines the latest research in the field of 'art rich learning', that is, aesthetic learning processes of good quality. In the book ”Art and Culture Give Children a Life that Works”, 60 (Danish and non-Danish) experts, practitioners, artists and several ministers from the Danish Government focus on the significance of art and culture for children. The book provides plenty of inspiration for teachers, pedagogues and cultural mediators and contains many examples of specific cultural activities, links and bibliographic references.

  6. Framing charitable donations as exceptional expenses increases giving.

    Science.gov (United States)

    Sussman, Abigail B; Sharma, Eesha; Alter, Adam L

    2015-06-01

    Many articles have examined the psychological drivers of charitable giving, but little is known about how people mentally budget for charitable gifts. The present research aims to address this gap by investigating how perceptions of donations as exceptional (uncommon and infrequent) rather than ordinary (common and frequent) expenses might affect budgeting for and giving to charity. We provide the first demonstration that exceptional framing of an identical item can directly influence mental budgeting processes, and yield societal benefits. In 5 lab and field experiments, exceptional framing increased charitable behavior, and diminished the extent to which people considered the effect of the donation on their budgets. The current work extends our understanding of mental accounting and budgeting for charitable gifts, and demonstrates practical techniques that enable fundraisers to enhance the perceived exceptionality of donations. (c) 2015 APA, all rights reserved.

  7. Self gift Giving: A New Widespread Consumption Culture

    OpenAIRE

    KAR, Yrd.Doç.Dr. Altan

    2013-01-01

    SUMMARY: The meanings derived from consumption goods have an increasing impact on the psychological formation of individuals. Hence, to understand the complex nature of consumer behavior, a multidisciplinary approach is needed. Consumption goods become magical, fetish objects which satisfy individual pleasures. The gift, in this conception, becomes a token that the individual gives her/himself, and gift-giving evolves from being a collective system to an individual form of consumption. The aim of this study...

  8. Laser techniques for radioactive decontamination of metallic surfaces

    International Nuclear Information System (INIS)

    Escobar Alracon, L.; Molina, G.; Vizuet Gonzalez, J.

    1998-01-01

    This work presents a prototype system for the decontamination of diverse components with removable surface contamination, using the laser ablation technique to evaporate the pollutant. The principle on which the system is based is discussed, as well as the different elements that compose it. The results obtained when irradiating a surface without radioactive contamination with the laser, in order to verify the operation of the system, are presented.

  9. Consistent Estimation of Pricing Kernels from Noisy Price Data

    OpenAIRE

    Vladislav Kargin

    2003-01-01

    If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.
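
    A minimal sketch of the constrained least-squares idea mentioned above (not the relative-entropy estimator): recover non-negative pricing-kernel weights m from noisy prices p ≈ A m, where A holds state-contingent payoffs. All dimensions and numbers below are hypothetical:

```python
# Sketch of non-negativity-constrained least squares for pricing-kernel recovery:
# observe noisy prices of assets with known state-contingent payoffs and solve
# min ||A m - p|| subject to m >= 0. Purely illustrative synthetic data.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(4)
n_states, n_assets = 5, 40
A = rng.uniform(0.0, 2.0, size=(n_assets, n_states))          # state-contingent payoffs
m_true = np.array([0.4, 0.3, 0.15, 0.1, 0.05])                # true non-negative kernel weights
prices = A @ m_true + rng.normal(scale=0.02, size=n_assets)   # noisy observed prices

m_hat, _ = nnls(A, prices)        # least squares subject to m >= 0
print("estimated kernel weights:", np.round(m_hat, 3))
```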

  10. Giving an account of one's pain in the anthropological interview.

    Science.gov (United States)

    Buchbinder, Mara

    2010-03-01

    In this paper, I analyze the illness stories narrated by a mother and her 13-year-old son as part of an ethnographic study of child chronic pain sufferers and their families. In examining some of the moral, relational and communicative challenges of giving an account of one's pain, I focus on what is left out of some accounts of illness and suffering and explore some possible reasons for these elisions. Drawing on recent work by Judith Butler (Giving an Account of Oneself, 2005), I investigate how the pragmatic context of interviews can introduce a form of symbolic violence to narrative accounts. Specifically, I use the term "genre of complaint" to highlight how anthropological research interviews in biomedical settings invoke certain typified forms of suffering that call for the rectification of perceived injustices. Interview narratives articulated in the genre of complaint privilege specific types of pain and suffering and cast others into the background. Giving an account of one's pain is thus a strategic and selective process, creating interruptions and silences as much as moments of clarity. Therefore, I argue that medical anthropologists ought to attend more closely to the institutional structures and relations that shape the production of illness narratives in interview encounters.

  11. Development of an analysis method for determining chlorinated hydrocarbons in marine sediments and suspended matter giving particular consideration to supercritical fluid extraction; Entwicklung eines Analysenverfahrens zur Bestimmung von chlorierten Kohlenwasserstoffen in marinen Sedimenten und Schwebstoffen unter besonderer Beruecksichtigung der ueberkritischen Fluidextraktion

    Energy Technology Data Exchange (ETDEWEB)

    Sterzenbach, D.

    1997-11-01

    The purpose of the present study was to develop an analysis method for chlorinated hydrocarbons in marine environments using supercritical fluid extraction (SFE) instead of conventional approaches. In order to apply this extraction method the available SFE device had to be extended and all the individual steps of the analysis method had to be optimised and adapted. As chlorinated hydrocarbons only occur at very low concentrations in marine environments (ppm to ppt range) the analysis method had to be extremely sensitive. High sensitivity, in turn, is generally associated with a high susceptibility of an analysis method to faults through contamination or losses. This meant that the entire method and all its individual steps had to be scrutinised for such weak points and improved where necessary. A method for sampling suspended matter in marine environments had to be developed which permits efficient separation of the smallest possible particles from seawater. The designated purpose of the developed analysis method is to deal with topical aspects of marine chemistry relating to sources, transport, distribution, and the fate of chlorinated hydrocarbons in marine environments. (orig.)

  12. Sociality Mental Modes Modulate the Processing of Advice-Giving: An Event-Related Potentials Study

    Directory of Open Access Journals (Sweden)

    Jin Li

    2018-02-01

    Full Text Available People have different motivations to get along with others in different sociality mental modes (i.e., communal mode and market mode), which might affect social decision-making. The present study examined how these two types of sociality mental modes affect the processing of advice-giving using event-related potentials (ERPs). After being primed with the communal mode or the market mode, participants were instructed to decide whether or not to give advice (profitable or damnous) to a stranger, without any feedback. The behavioral results showed that participants gave the profitable advice to the stranger more slowly than the damnous advice, but this difference was only observed in the market mode condition. The ERP results indicated that participants demonstrated a more negative N1 amplitude for the damnous advice compared with the profitable advice, and a larger P300 was elicited in the market mode relative to both the communal mode and the control group. More importantly, participants in the market mode demonstrated a larger P300 for the profitable advice than for the damnous advice, whereas this difference was not observed in the communal mode or the control group. These findings are consistent with dual-process accounts of decision-making and suggest that the market mode may lead to deliberate calculation of costs and benefits when giving profitable advice to others.

  13. Replica consistency in a Data Grid

    International Nuclear Information System (INIS)

    Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt

    2004-01-01

    A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented

  14. Reduced reciprocal giving in social anxiety - Evidence from the Trust Game.

    Science.gov (United States)

    Anderl, Christine; Steil, Regina; Hahn, Tim; Hitzeroth, Patricia; Reif, Andreas; Windmann, Sabine

    2018-06-01

    Social anxiety is known to impair interpersonal relationships. These impairments are thought to partly arise from difficulties to engage in affiliative interactions with others, such as sharing favors or reciprocating prosocial acts. Here, we examined whether individuals high compared to low in social anxiety differ in giving towards strangers in an economic game paradigm. One hundred and twenty seven non-clinical participants who had been pre-screened to be either particularly high or low in social anxiety played an incentivized Trust Game to assess trustful and reciprocal giving towards strangers in addition to providing information on real life interpersonal functioning (perceived social support and attachment style). We found that reciprocal, but not trustful giving, was significantly decreased among highly socially anxious individuals. Both social anxiety and reciprocal giving furthermore showed significant associations with self-reported real life interpersonal functioning. Participants played the Trust Game with the strategy method; results need replication with a clinical sample. Individuals high in social anxiety showed reduced reciprocal, but intact trustful giving, pointing to a constraint in responsiveness. The research may contribute to the development of new treatment and prevention programs to reduce the interpersonal impairments in socially anxious individuals. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Quantitative verification of ab initio self-consistent laser theory.

    Science.gov (United States)

    Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E

    2008-10-13

    We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly-varying envelope approximation. The theory is infinite order in the non-linear hole-burning interaction; the widely used third order approximation is shown to fail badly.

  16. Student Effort, Consistency, and Online Performance

    Science.gov (United States)

    Patron, Hilde; Lopez, Salvador

    2011-01-01

    This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…

  17. Translationally invariant self-consistent field theories

    International Nuclear Information System (INIS)

    Shakin, C.M.; Weiss, M.S.

    1977-01-01

    We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables

  18. Sticky continuous processes have consistent price systems

    DEFF Research Database (Denmark)

    Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan

    Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...

  19. Consistent-handed individuals are more authoritarian.

    Science.gov (United States)

    Lyle, Keith B; Grillo, Michael C

    2014-01-01

    Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.

  20. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  1. Consistent spectroscopy for an extended gauge model

    International Nuclear Information System (INIS)

    Oliveira Neto, G. de.

    1990-11-01

    A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is understood the determination of the quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)

  2. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

    An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  3. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists; 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients’ referrals were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the relation between referral diagnosis and electrophysiological diagnosis according to the clinics where the requests were made (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG.

  4. Multiphase flows of N immiscible incompressible fluids: A reduction-consistent and thermodynamically-consistent formulation and associated algorithm

    Science.gov (United States)

    Dong, S.

    2018-05-01

    We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.

  5. Consistent Parameter and Transfer Function Estimation using Context Free Grammars

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Schulz, Karsten

    2017-04-01

    This contribution presents a method for the inference of transfer functions for rainfall-runoff models. Here, transfer functions are defined as parametrized (functional) relationships between a set of spatial predictors (e.g. elevation, slope or soil texture) and model parameters. They are ultimately used for estimation of consistent, spatially distributed model parameters from a limited amount of lumped global parameters. Additionally, they provide a straightforward method for parameter extrapolation from one set of basins to another and can even be used to derive parameterizations for multi-scale models [see: Samaniego et al., 2010]. Yet, currently an actual knowledge of the transfer functions is often implicitly assumed. As a matter of fact, for most cases these hypothesized transfer functions can rarely be measured and often remain unknown. Therefore, this contribution presents a general method for the concurrent estimation of the structure of transfer functions and their respective (global) parameters. Note, that by consequence an estimation of the distributed parameters of the rainfall-runoff model is also undertaken. The method combines two steps to achieve this. The first generates different possible transfer functions. The second then estimates the respective global transfer function parameters. The structural estimation of the transfer functions is based on the context free grammar concept. Chomsky first introduced context free grammars in linguistics [Chomsky, 1956]. Since then, they have been widely applied in computer science. But, to the knowledge of the authors, they have so far not been used in hydrology. Therefore, the contribution gives an introduction to context free grammars and shows how they can be constructed and used for the structural inference of transfer functions. This is enabled by new methods from evolutionary computation, such as grammatical evolution [O'Neill, 2001], which make it possible to exploit the constructed grammar as a
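
    As a rough illustration of how a context free grammar can generate candidate transfer function structures, the sketch below defines a toy grammar over a few spatial predictors and randomly derives expressions from it. The grammar rules, the predictor names and the coefficient placeholders c0-c2 are assumptions made for this example; the grammatical-evolution search and the subsequent estimation of the global parameters described in the abstract are not shown.

      # Sketch: generating candidate transfer functions from a tiny context-free
      # grammar, in the spirit of grammar-based structural inference. The grammar,
      # the predictor names and the coefficients are illustrative assumptions only.
      import random

      GRAMMAR = {                      # <expr> is the start symbol
          "<expr>": [["<expr>", "<op>", "<expr>"], ["<coef>", "*", "<pred>"], ["<coef>"]],
          "<op>":   [["+"], ["*"]],
          "<pred>": [["slope"], ["elevation"], ["sand_fraction"]],
          "<coef>": [["c0"], ["c1"], ["c2"]],  # global parameters calibrated later
      }

      def derive(symbol="<expr>", depth=0, max_depth=4, rng=random):
          """Randomly expand a symbol into a flat list of terminal tokens."""
          if symbol not in GRAMMAR:
              return [symbol]
          rules = GRAMMAR[symbol]
          if depth >= max_depth:           # force termination with the shortest rule
              rules = [min(rules, key=len)]
          out = []
          for tok in rng.choice(rules):
              out.extend(derive(tok, depth + 1, max_depth, rng))
          return out

      random.seed(1)
      for _ in range(3):
          print(" ".join(derive()))
      # e.g. "c1 * slope + c0" -- a candidate transfer function whose global
      # coefficients c0, c1 would then be estimated against runoff observations.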

  6. Second-Order Perturbation Theory for Generalized Active Space Self-Consistent-Field Wave Functions.

    Science.gov (United States)

    Ma, Dongxia; Li Manni, Giovanni; Olsen, Jeppe; Gagliardi, Laura

    2016-07-12

    A multireference second-order perturbation theory approach based on the generalized active space self-consistent-field (GASSCF) wave function is presented. Compared with the complete active space (CAS) and restricted active space (RAS) wave functions, GAS wave functions are more flexible and can employ larger active spaces and/or different truncations of the configuration interaction expansion. With GASSCF, one can explore chemical systems that are not affordable with either CASSCF or RASSCF. Perturbation theory to second order on top of GAS wave functions (GASPT2) has been implemented to recover the remaining electron correlation. The method has been benchmarked by computing the chromium dimer ground-state potential energy curve. These calculations show that GASPT2 gives results similar to CASPT2 even with a configuration interaction expansion much smaller than the corresponding CAS expansion.

  7. The influence of relationship beliefs on gift giving

    Directory of Open Access Journals (Sweden)

    Rai Dipankar

    2017-12-01

    Full Text Available People have fundamental beliefs about what constitutes a good relationship, known as implicit theories of relationship, where some people have destiny beliefs whereas others have growth beliefs. People with destiny beliefs believe that potential partners are meant either for each other or not, whereas people with growth beliefs believe that successful relationships are cultivated and developed. This research shows that different implicit theories of relationship influence consumers’ gift choice to their significant others. We demonstrate, through two studies, that consumers with destiny beliefs prefer giving gifts that are more feasible in nature, whereas consumers with growth beliefs prefer giving gifts that are more desirable in nature. We show that this effect is mediated by desirability-feasibility considerations. Specifically, consumers with destiny beliefs focus on feasibility considerations, which leads them to choose a highly feasible gift. Conversely, consumers with growth beliefs focus on desirability considerations, which leads them to choose a highly desirable gift. We also discuss the theoretical and managerial implications of our research.

  8. Do spinors give rise to a frame-dragging effect?

    International Nuclear Information System (INIS)

    Randono, Andrew

    2010-01-01

    We investigate the effect of the intrinsic spin of a fundamental spinor field on the surrounding spacetime geometry. We show that despite the lack of a rotating stress-energy source (and despite claims to the contrary) the intrinsic spin of a spin-half fermion gives rise to a frame-dragging effect analogous to that of orbital angular momentum, even in Einstein-Hilbert gravity where torsion is constrained to be zero. This resolves a paradox regarding the counter-force needed to restore Newton's third law in the well-known spin-orbit interaction. In addition, the frame-dragging effect gives rise to a long-range gravitationally mediated spin-spin dipole interaction coupling the internal spins of two sources. We argue that despite the weakness of the interaction, the spin-spin interaction will dominate over the ordinary inverse square Newtonian interaction in any process of sufficiently high energy for quantum field theoretical effects to be non-negligible.

  9. [The ability of drivers to give first aid--testing by questionnaire].

    Science.gov (United States)

    Goniewicz, M

    1998-01-01

    Road accidents have become a serious social problem. The scale and complexity of this problem show clearly that there is a necessity to improve citizens' ability to give first aid, which is especially essential in the case of drivers. Thus special training in how to give first aid at the accident site seems to be of primary importance. The objectives of this paper are to: 1) identify to what extent the drivers of motor vehicles are prepared to provide first aid for casualties of road accidents, 2) evaluate the training system for teaching motorists how to give first aid before professional help arrives, 3) identify drivers' views on possibilities of decreasing the number of fatal casualties of road accidents. The questionnaire was given to 560 employees of local government institutions in the city of Lublin, both professional and non-professional drivers. The direct method and an anonymous questionnaire were used. The results of the questionnaire revealed clearly that very few drivers are well prepared to give proper first aid at the accident site. Regardless of sex, education or driving experience, the drivers do not have sufficient skills to give first aid, and this effect is enhanced by various psychological barriers. The questioned drivers shared the opinion that first aid training is badly run. The drivers stressed the poor quality of the training and the fact that it is impossible to acquire the practical skills that may be required in the case of an emergency. Drivers' views on possibilities of decreasing the number of fatal casualties of road accidents included, among others, the following propositions: in addition to the driving licence exam, a first aid exam should be compulsory, together with strict enforcement and execution of the law which regulates mandatory first aid giving.

  10. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

    Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  11. Self-consistent areas law in QCD

    International Nuclear Information System (INIS)

    Makeenko, Yu.M.; Migdal, A.A.

    1980-01-01

    The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new, manifestly gauge invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution

  12. Consistency of the MLE under mixture models

    OpenAIRE

    Chen, Jiahua

    2016-01-01

    The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...

  13. Two-particle irreducible effective actions versus resummation: Analytic properties and self-consistency

    Directory of Open Access Journals (Sweden)

    Michael Brown

    2015-11-01

    Full Text Available Approximations based on two-particle irreducible (2PI) effective actions (also known as Φ-derivable, Cornwall–Jackiw–Tomboulis or Luttinger–Ward functionals, depending on context) have been widely used in condensed matter and non-equilibrium quantum/statistical field theory because this formalism gives a robust, self-consistent, non-perturbative and systematically improvable approach which avoids problems with secular time evolution. The strengths of 2PI approximations are often described in terms of a selective resummation of Feynman diagrams to infinite order. However, the Feynman diagram series is asymptotic and summation is at best a dangerous procedure. Here we show that, at least in the context of a toy model where exact results are available, the true strength of 2PI approximations derives from their self-consistency rather than any resummation. This self-consistency allows truncated 2PI approximations to capture the branch points of physical amplitudes where adjustments of coupling constants can trigger an instability of the vacuum. This, in effect, turns Dyson's argument for the failure of perturbation theory on its head. As a result we find that 2PI approximations perform better than Padé approximation and are competitive with Borel–Padé resummation. Finally, we introduce a hybrid 2PI–Padé method.

  14. Exciton spectrum of surface-corrugated quantum wells: the adiabatic self-consistent approach

    International Nuclear Information System (INIS)

    Atenco A, N.; Perez R, F.; Makarov, N.M.

    2005-01-01

    A theory for calculating the relaxation frequency ν and the shift δω of exciton resonances in quantum wells with finite potential barriers and adiabatic surface disorder is developed. The adiabaticity implies that the correlation length R_C of the well width fluctuations is much larger than the exciton radius a_0 (R_C >> a_0). Our theory is based on the self-consistent Green's function method, and therefore takes into account the inherent action of the exciton scattering on itself. The self-consistent approach is shown to describe quantitatively the sharp exciton resonance. It also gives the qualitatively correct resonance picture for the transition to the classical limit, as well as within the domain of the classical limit itself. We present and analyze results for the hh-exciton in a GaAs quantum well with Al_0.3Ga_0.7As barriers. It is established that the self-consistency and the finite height of the potential barriers significantly influence the line-shape of the exciton resonances, and make the values of ν and δω quite realistic. In particular, the relaxation frequency ν for the ground-state resonance has a broad, almost symmetric maximum near the resonance frequency ω_0, while the surface-induced resonance shift δω vanishes near ω_0 and has different signs on the two sides of the exciton resonance. (Author) 43 refs., 4 figs

  15. Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2011-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  16. Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2012-01-01

    Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  17. Who Is Giving Feedback To Whom In Entrepreneurship Education?

    DEFF Research Database (Denmark)

    Trolle Elmholdt, Stine; Warhuus, Jan; Blenker, Per

    evaluate and provide feedback on, with regard to both the teaching and the learning that takes place in these types of courses. We therefore ask: Who is giving feedback to whom in entrepreneurship education - and for what purpose? The intent of the paper is to develop and explore the system of feedback ... The question we care about (objectives): When entrepreneurship is taught through the process of practicing entrepreneurship and based on experiential learning, a need arises for different forms of assessment, evaluation, and feedback procedures than those applied to traditional forms of higher ... is at play that involves both feedback among educators and students and between educators and students; 3. that the complexity is further increased when it is acknowledged that the subject of the feedback may concern the learning, the teaching, the process, the object of the process (the entrepreneurial

  18. Uranium oxide recycling to give more sustainable power generation

    International Nuclear Information System (INIS)

    Hagger, R.; Garner, D.S.J.; Beaumont, D.M.; Hesketh, K.

    2001-01-01

    In broad terms there are two routes for irradiated nuclear fuel, the closed cycle involving recycling and the open cycle culminating in direct disposal. The benefits of following the closed cycle are presented. The environmental burdens associated with open and closed cycles are compared using Life Cycle Assessment (LCA) for non-active burdens and human irradiation. Consideration is given to the extension of the nuclear fuel cycle to include a proportion of MOX fuel elements within a reactor core, and the impact in terms of total activity, waste volumes and Integrated Toxic Potential (ITP) discussed. The potential of moving to a fast reactor cycle is also raised in support of the recycling of spent nuclear fuel giving sustainable power generation. (author)

  19. Conditions needed to give meaning to rad-equivalence principle

    International Nuclear Information System (INIS)

    Latarjet, R.

    1980-01-01

    To legislate on mutagenic chemical pollution the problem to be faced is similar to that tackled about 30 years ago regarding pollution by ionizing radiations. It would be useful to benefit from the work of these 30 years by establishing equivalences, if possible, between chemical mutagens and radiations. Inevitable mutagenic pollutions are considered here, especially those associated with fuel based energy production. As with radiations the legislation must derive from a compromise between the harmful and beneficial effects of the polluting system. When deciding on tolerance doses it is necessary to safeguard the biosphere without inflicting excessive restrictions on industry and on the economy. The present article discusses the conditions needed to give meaning to the notion of rad-equivalence. Some examples of already established equivalences are given, together with the first practical consequences which emerge [fr

  20. Evaluation of the activity inventory of irradiated nuclear fuel and its radioactive waste

    International Nuclear Information System (INIS)

    Rodriguez Gual, Maritza

    1998-01-01

    The objective of the present work is to give a quantitative evaluation of the activity inventory of nuclear fuel with 3.6% enrichment and a burnup of 33 000 MWd/tU, as proposed for the Juragua Nuclear Power Plant. The ORIGEN2 calculation code is used. The results obtained are presented and compared with other calculations carried out for VVER-440 type reactors

  1. Parquet equations for numerical self-consistent-field theory

    International Nuclear Information System (INIS)

    Bickers, N.E.

    1991-01-01

    In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs

  2. Effectiveness of a television advertisement campaign on giving cigarettes in a Chinese population.

    Science.gov (United States)

    Qin, Yu; Su, Jian; Xiang, Quanyong; Hu, Yihe; Xu, Guanqun; Ma, Jiuhua; Shi, Zumin

    2014-01-01

    Anti-tobacco television advertisement campaigns may convey messages on smoking-related health consequences and create norms against giving cigarettes. Altogether, 156 and 112 slots of a television advertisement "Giving cigarettes is giving harm" were aired in Suzhou and Yizheng, respectively, over one month in 2010. Participants were recruited from 15 locations in Suzhou and 8 locations in Yizheng using a street intercept method. Overall 2306 residents aged 18-45 years completed questionnaires, including 1142 before the campaign and 1164 after, with respective response rates of 79.1% and 79.7%. Chi square tests were used to compare the difference between categorical variables. After the campaign, 36.0% of subjects recalled that they had seen the advertisement. Residents of Suzhou had a higher recall rate than those of Yizheng (47.6% vs. 20.6%). Respondents who recalled the advertisement were more likely to state that they would not give cigarettes in the future than those who reported not seeing it (38.7% vs. 27.5%). The advertisements helped change societal norms and improve health behavior. Continuous and adequate funding of anti-tobacco media campaigns targeted at different levels of the general population is needed, in conjunction with a comprehensive tobacco control effort.
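
    The chi-square comparisons mentioned above can be illustrated with a short sketch; the 2x2 counts below are hypothetical values chosen only to give proportions of the same order as those reported, and are not the study data.

      # Sketch: the kind of chi-square comparison of categorical variables described
      # above (advertisement recall vs. intention not to give cigarettes). The 2x2
      # counts are hypothetical, NOT the study's data; they only illustrate the test.
      from scipy.stats import chi2_contingency

      #                 would not give   would give
      table = [[155,              245],        # recalled the advertisement (n = 400)
               [209,              551]]        # did not recall it          (n = 760)

      chi2, p_value, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4g}")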

  3. Case Studies of Community-Academic Partnerships Established Using the Give-Get Grid Model.

    Science.gov (United States)

    Behringer, Bruce; Southerland, Jodi L; Plummer, Robert M

    2017-11-01

    While partnerships for health delivery and improvement are frequently described by their structure, goals, and plans, less attention is paid to the interactive relationships among partners or for larger stakeholder groups' coalition memberships. The Give-Get Grid group process tool can be used to assess each stakeholders' expected benefits ("gets") and contributions ("gives") needed to establish and maintain long-term, mutually advantageous community-academic partnerships. This article describes three case study experiences using the Give-Get Grid in real-world context to understand and generate ideas to address contemporary health promotion opportunities among a variety of stakeholders. The case studies address three distinct community health promotion opportunities: prevention of school-based adolescent obesity disparities, higher education health professions training programs in rural community-based settings, and methods for engaging community coalitions in state Comprehensive Cancer Control Programs. The case studies demonstrate the Give-Get Grid's utility in both planning and evaluating partnerships and documenting key elements for progress in health promotion initiatives built on long-term community-academic relationships. Steps are explained with practical lessons learned in using the Grid.

  4. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  5. Personalized recommendation based on unbiased consistence

    Science.gov (United States)

    Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao

    2015-08-01

    Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks have provided an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects that have been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases a user's interests are stable, and thus the bidirectional mass diffusion abilities, whether originating from objects that have been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
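
    To make the diffusion mechanics concrete, the sketch below implements the standard unidirectional mass-diffusion (probabilistic spreading) score on a toy user-item matrix; the bidirectional, consistence-based variant proposed in the letter would additionally run the diffusion in the reverse direction and combine the two. The toy matrix and function names are assumptions for illustration only.

      # Sketch of the standard (unidirectional) mass-diffusion score on a toy
      # user-item bipartite network. The toy ratings matrix is illustrative only.
      import numpy as np

      A = np.array([[1, 1, 0, 0, 1],     # A[u, i] = 1 if user u collected item i
                    [1, 0, 1, 0, 0],
                    [0, 1, 1, 1, 0],
                    [1, 0, 0, 1, 0]], dtype=float)

      k_user = A.sum(axis=1)             # user degrees
      k_item = A.sum(axis=0)             # item degrees

      def mass_diffusion_scores(user):
          """Spread unit resource from the user's items to users and back to items."""
          resource = A[user].copy()                   # 1 unit on each collected item
          to_users = A @ (resource / k_item)          # item -> user step
          back_to_items = A.T @ (to_users / k_user)   # user -> item step
          back_to_items[A[user] == 1] = -np.inf       # never re-recommend known items
          return back_to_items

      scores = mass_diffusion_scores(user=0)
      print("recommended item for user 0:", int(np.argmax(scores)))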

  6. Financial model calibration using consistency hints.

    Science.gov (United States)

    Abu-Mostafa, Y S

    2001-01-01

    We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to the Japanese Yen swaps market and the US dollar yield market.
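
    A minimal sketch of the hint idea, under strong simplifying assumptions: an ordinary curve-fitting error is augmented with a Kullback-Leibler consistency penalty and minimized over a single parameter. The toy model, the hint distribution and the weight lam are invented for this illustration and do not reproduce the paper's multifactor Vasicek calibration or its EM-type algorithm.

      # Sketch: "learning from hints" style calibration -- a curve-fitting error
      # augmented with a Kullback-Leibler consistency penalty. Toy example only.
      import numpy as np
      from scipy.optimize import minimize_scalar

      observed_mean = 0.3                  # quantity the model should reproduce
      hint = np.array([0.6, 0.4])          # distribution the parameter should imply

      def model_mean(q):                   # toy 2-state model: +1 w.p. q, -1 w.p. 1-q
          return 2.0 * q - 1.0

      def kl(p, r):                        # KL(p || r) for strictly positive vectors
          return float(np.sum(p * np.log(p / r)))

      def objective(q, lam=0.5):
          fit_error = (model_mean(q) - observed_mean) ** 2
          consistency_hint = kl(hint, np.array([q, 1.0 - q]))
          return fit_error + lam * consistency_hint

      res = minimize_scalar(objective, bounds=(1e-6, 1 - 1e-6), method="bounded")
      print(f"calibrated q = {res.x:.3f}, objective = {res.fun:.4f}")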

  7. A Thermodynamically-Consistent Non-Ideal Stochastic Hard-Sphere Fluid

    Energy Technology Data Exchange (ETDEWEB)

    Donev, A; Alder, B J; Garcia, A L

    2009-08-03

    A grid-free variant of the Direct Simulation Monte Carlo (DSMC) method is proposed, named the Isotropic DSMC (I-DSMC) method, that is suitable for simulating collision-dominated dense fluid flows. The I-DSMC algorithm eliminates all grid artifacts from the traditional DSMC algorithm and is Galilean invariant and microscopically isotropic. The stochastic collision rules in I-DSMC are modified to introduce a non-ideal structure factor that gives consistent compressibility, as first proposed in [Phys. Rev. Lett. 101:075902 (2008)]. The resulting Stochastic Hard Sphere Dynamics (SHSD) fluid is empirically shown to be thermodynamically identical to a deterministic Hamiltonian system of penetrable spheres interacting with a linear core pair potential, well-described by the hypernetted chain (HNC) approximation. We develop a kinetic theory for the SHSD fluid to obtain estimates for the transport coefficients that are in excellent agreement with particle simulations over a wide range of densities and collision rates. The fluctuating hydrodynamic behavior of the SHSD fluid is verified by comparing its dynamic structure factor against theory based on the Landau-Lifshitz Navier-Stokes equations. We also study the Brownian motion of a nano-particle suspended in an SHSD fluid and find a long-time power-law tail in its velocity autocorrelation function consistent with hydrodynamic theory and molecular dynamics calculations.

  8. Bootstrap-Based Inference for Cube Root Consistent Estimators

    DEFF Research Database (Denmark)

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed.

  9. INVESTIGATE-I (INVasive Evaluation before Surgical Treatment of Incontinence Gives Added Therapeutic Effect?): a mixed-methods study to assess the feasibility of a future randomised controlled trial of invasive urodynamic testing prior to surgery for stress urinary incontinence in women.

    Science.gov (United States)

    Hilton, Paul; Armstrong, Natalie; Brennand, Catherine; Howel, Denise; Shen, Jing; Bryant, Andrew; Tincello, Douglas G; Lucas, Malcolm G; Buckley, Brian S; Chapple, Christopher R; Homer, Tara; Vale, Luke; McColl, Elaine

    2015-02-01

    The position of invasive urodynamic testing in the diagnostic pathway for urinary incontinence (UI) is unclear. Systematic reviews have called for further trials evaluating clinical utility, although a preliminary feasibility study was considered appropriate. To inform the decision whether or not to proceed to a definitive randomised trial of invasive urodynamic testing compared with clinical assessment with non-invasive tests, prior to surgery in women with stress UI (SUI) or stress predominant mixed UI (MUI). A mixed-methods study comprising a pragmatic multicentre randomised pilot trial; economic evaluation; survey of clinicians' views about invasive urodynamic testing; qualitative interviews with clinicians and trial participants. Urogynaecology, female urology and general gynaecology units in Newcastle, Leicester, Swansea, Sheffield, Northumberland, Gateshead and South Tees. Trial recruits were women with SUI or stress predominant MUI who were considering surgery after unsuccessful conservative treatment. Relevant clinicians completed two online surveys. Subsets of survey respondents and trial participants took part in separate qualitative interview studies. Pilot trial participants were randomised to undergo clinical assessment with non-invasive tests (control arm); or assessment as controls, plus invasive urodynamic testing (intervention arm). Confirmation that units can identify and recruit eligible women; acceptability of investigation strategies and data collection tools; acquisition of outcome data to determine the sample size for a definitive trial. The proposed primary outcome for the definitive trial was International Consultation on Incontinence Modular Questionnaire (ICIQ) Female Lower Urinary Tract Symptoms (ICIQ-FLUTS) (total score) 6 months after surgery or the start of non-surgical treatment; secondary outcomes included: ICIQ-FLUTS (subscales); ICIQ Urinary Incontinence Short Form; ICIQ Lower Urinary Tract Symptoms Quality of Life; Urogenital

  10. Proteolysis and consistency of Meshanger cheese

    NARCIS (Netherlands)

    Jong, de L.

    1978-01-01

    Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of α_s1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of

  11. Developing consistent pronunciation models for phonemic variants

    CSIR Research Space (South Africa)

    Davel, M

    2006-09-01

    Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...

  12. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among images under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose the histogram extreme point matching algorithm, which is robust to image geometrical misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics will be transferred to the others via the paths of graph analysis. Thus, the final results via global adjustment will take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach can achieve as good or even better results compared with the state-of-the-art approaches.
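
    Classic cumulative-histogram matching is one simple way to obtain the kind of intensity correspondence such pipelines rely on; the sketch below matches one overlap crop to another. It is not the paper's histogram extreme point matching algorithm nor its guided global optimization, and the toy images are random data used purely for illustration.

      # Sketch: standard cumulative-histogram matching of one image channel to a
      # reference channel, a common building block for color-consistency correction.
      import numpy as np

      def match_histogram(source, reference, levels=256):
          """Map source intensities so their CDF matches the reference CDF."""
          src_hist, _ = np.histogram(source, bins=levels, range=(0, levels))
          ref_hist, _ = np.histogram(reference, bins=levels, range=(0, levels))
          src_cdf = np.cumsum(src_hist) / source.size
          ref_cdf = np.cumsum(ref_hist) / reference.size
          # for each source level, find the reference level with the closest CDF value
          lut = np.searchsorted(ref_cdf, src_cdf).clip(0, levels - 1).astype(np.uint8)
          return lut[source]

      rng = np.random.default_rng(0)
      overlap_a = rng.integers(40, 200, size=(64, 64), dtype=np.uint8)   # toy overlap crops
      overlap_b = rng.integers(80, 240, size=(64, 64), dtype=np.uint8)
      corrected = match_histogram(overlap_a, overlap_b)
      print("means before/after/reference:", overlap_a.mean(), corrected.mean(), overlap_b.mean())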

  13. Consistency in multi-viewpoint architectural design

    NARCIS (Netherlands)

    Dijkman, R.M.; Dijkman, Remco Matthijs

    2006-01-01

    This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.

  14. Consistent Stochastic Modelling of Meteocean Design Parameters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Sterndorff, M. J.

    2000-01-01

    Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...

  15. On the existence of consistent price systems

    DEFF Research Database (Denmark)

    Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan

    2014-01-01

    We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...

  16. Dynamic phonon exchange requires consistent dressing

    International Nuclear Information System (INIS)

    Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.

    1976-01-01

    It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner

  17. Consistent feeding positions of great tit parents

    NARCIS (Netherlands)

    Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.

    2006-01-01

    When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is

  18. Consistency of the postulates of special relativity

    International Nuclear Information System (INIS)

    Gron, O.; Nicola, M.

    1976-01-01

    In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated

  19. Consistency of Network Traffic Repositories: An Overview

    NARCIS (Netherlands)

    Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko

    2009-01-01

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  20. Consistency analysis of network traffic repositories

    NARCIS (Netherlands)

    Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko

    Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for

  1. A dynamical mechanism for large volumes with consistent couplings

    Energy Technology Data Exchange (ETDEWEB)

    Abel, Steven [IPPP, Durham University,Durham, DH1 3LE (United Kingdom)

    2016-11-14

    A mechanism for addressing the “decompactification problem” is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N=2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is nett Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.

  2. Consistent Kaluza-Klein truncations via exceptional field theory

    Energy Technology Data Exchange (ETDEWEB)

    Hohm, Olaf [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS,École Normale Supérieure de Lyon, 46, allée d’Italie, F-69364 Lyon cedex 07 (France)

    2015-01-26

    We present the generalized Scherk-Schwarz reduction ansatz for the full supersymmetric exceptional field theory in terms of group valued twist matrices subject to consistency equations. With this ansatz the field equations precisely reduce to those of lower-dimensional gauged supergravity parametrized by an embedding tensor. We explicitly construct a family of twist matrices as solutions of the consistency equations. They induce gauged supergravities with gauge groups SO(p,q) and CSO(p,q,r). Geometrically, they describe compactifications on internal spaces given by spheres and (warped) hyperboloids H^{p,q}, thus extending the applicability of generalized Scherk-Schwarz reductions beyond homogeneous spaces. Together with the dictionary that relates exceptional field theory to D=11 and IIB supergravity, respectively, the construction defines an entire new family of consistent truncations of the original theories. These include not only compactifications on spheres of different dimensions (such as AdS_5×S^5), but also various hyperboloid compactifications giving rise to a higher-dimensional embedding of supergravities with non-compact and non-semisimple gauge groups.

  3. Consistency among integral measurements of aggregate decay heat power

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]

    1998-03-01

    Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of ^232Th, ^233U, ^235U, ^238U and ^239Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component and fairly well for the γ component, except for cooling times longer than 4000 s. (author)

  4. From Windfall Sharing to Property Ownership: Prosocial Personality Traits in Giving and Taking Dictator Games

    Directory of Open Access Journals (Sweden)

    Kun Zhao

    2018-05-01

    Full Text Available The dictator game is a well-known task measuring prosocial preferences, in which one person divides a fixed amount of windfall money with a recipient. A key factor in real-world transfers of wealth is the concept of property ownership and consequently the related acts of giving and taking. Using a variation of the traditional dictator game (N = 256), we examined whether individual differences under different game frames corresponded with prosocial personality traits from the Big Five (politeness, compassion) and HEXACO (honesty-humility, agreeableness) models. In the Big Five model, the effects of prosocial personality traits were generally stronger and more consistent for taking than for giving, in line with a “do-no-harm” explanation, whereby prosocial individuals felt less entitled to and less willing to infringe on the endowments of others. In contrast, HEXACO honesty-humility predicted allocations across both frames, consistent with its broad association with fair-mindedness, and providing further evidence of its role in allocations of wealth more generally. These findings highlight the utility of integrating personality psychology with behavioral economics, in which the discriminant validity across prosocial traits can shed light on the distinct motivations underpinning social decisions.

  5. The good engineer: giving virtue its due in engineering ethics.

    Science.gov (United States)

    Harris, Charles E

    2008-06-01

    During the past few decades, engineering ethics has been oriented towards protecting the public from professional misconduct by engineers and from the harmful effects of technology. This "preventive ethics" project has been accomplished primarily by means of the promulgation of negative rules. However, some aspects of engineering professionalism, such as (1) sensitivity to risk, (2) awareness of the social context of technology, (3) respect for nature, and (4) commitment to the public good, cannot be adequately accounted for in terms of rules, certainly not negative rules. Virtue ethics is a more appropriate vehicle for expressing these aspects of engineering professionalism. Some of the unique features of virtue ethics are the greater place it gives for discretion and judgment and also for inner motivation and commitment. Four of the many professional virtues that are important for engineers correspond to the four aspects of engineering professionalism listed above. Finally, the importance of the humanities and social sciences in promoting these virtues suggests that these disciplines are crucial in the professional education of engineers.

  6. Nasal nicotine solution: a potential aid to giving up smoking?

    Science.gov (United States)

    Russell, M A; Jarvis, M J; Feyerabend, C; Fernö, O

    1983-01-01

    A nasal solution was developed containing 2 mg nicotine for use as a kind of liquid snuff. Its absorption was studied in three subjects. An average peak of plasma nicotine concentrations of 86.9 nmol/l (14.1 ng/ml) was reached seven and a half minutes after taking the solution. This compared with an average peak of 158.4 nmol/l (25.7 ng/ml) one and a half minutes after completing (but seven and a half minutes after starting) a middle tar cigarette (1.4 mg nicotine) and an average peak of 52.4 nmol/l (8.5 ng/ml) after chewing nicotine gum (2 mg nicotine) for 30 minutes. The more rapid and efficient absorption of nicotine from the nasal nicotine solution than from nicotine chewing gum suggests that it might prove a useful aid to giving up smoking. Nasal nicotine solution might be particularly useful in smokers for whom the gum is less suitable on account of dentures or peptic ulcers or who experience nausea and dyspeptic symptoms from the gum. PMID:6402202

  7. Does friendship give us non-derivative partial reasons ?

    Directory of Open Access Journals (Sweden)

    Andrew Reisner

    2008-02-01

    Full Text Available One way to approach the question of whether there are non-derivative partial reasons of any kind is to give an account of what partial reasons are, and then to consider whether there are such reasons. If there are, then it is at least possible that there are partial reasons of friendship. It is this approach that will be taken here, and it produces several interesting results. The first is a point about the structure of partial reasons. It is at least a necessary condition of a reason’s being partial that it has an explicit relational component. This component, technically, is a relatum in the reason relation that itself is a relation between the person to whom the reason applies and the person whom the action for which there is a reason concerns. The second conclusion of the paper is that this relational component is also required for a number of types of putatively impartial reasons. In order to avoid trivialising the distinction between partial and impartial reasons, some further sufficient condition must be applied. Finally, there is some prospect for a way of distinguishing between impartial reasons that contain a relational component and partial reasons, but this approach suggests that the question of whether ethics is partial or impartial will be settled at the level of normative ethical discourse, or at least not at the level of discourse about the nature of reasons for action.

  8. galerkin's methods

    African Journals Online (AJOL)

    user

    The assumed deflection shapes used in the approximate methods such as the Galerkin method were normally ... to direct compressive forces Nx, was derived by Navier [3]. ... tend to give higher frequency and stiffness, as well as.

  9. Method

    Directory of Open Access Journals (Sweden)

    Ling Fiona W.M.

    2017-01-01

    Full Text Available Rapid prototyping of microchannels has gained a lot of attention from researchers along with the rapid development of microfluidic technology. The conventional methods carry several disadvantages, such as high cost, long processing time, the need for high operating pressure and temperature, and the requirement for expertise in operating the equipment. In this work, a new method adapting the xurography technique is introduced to replace the conventional methods of fabricating microchannels. The novelty in this study is replacing the adhesive film with a clear plastic film, which was used to cut the microchannel design, as this material is more suitable for fabricating more complex microchannel designs. The microchannel was then molded using polydimethylsiloxane (PDMS) and bonded to a clean glass slide to produce a closed microchannel. The microchannel produced had clean edges, indicating that a good master mold was produced using the cutting plotter, and the bonding between the PDMS and glass was good, with no leakage observed. The materials used in this method are cheap and the total fabrication time is less than 5 hours, making this method suitable for rapid prototyping of microchannels.

  10. Fourier rebinning and consistency equations for time-of-flight PET planograms.

    Science.gov (United States)

    Li, Yusheng; Defrise, Michel; Matej, Samuel; Metzler, Scott D

    2016-01-01

    Due to the unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. The time-of-flight (TOF) information can be a promising solution to reduce these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. The TOF planograms are five-dimensional while the objects are three-dimensional, and there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of the fully 3D TOF planograms. We first derive two consistency equations and John's equation for 3D TOF planograms. By taking the Fourier transforms, we obtain two Fourier consistency equations and the Fourier-John equation, which are the duals of the consistency equations and John's equation, respectively. We then solve the Fourier consistency equations and Fourier-John equation using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As the special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus we obtain a complete scheme to convert among different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF planograms. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. As a byproduct, we show the two consistency equations are necessary and sufficient for 3D TOF planograms.

  11. Trend in land degradation has been the most contended issue in the Sahel. Trends documented have not been consistent across authors and science disciplines, hence little agreement has been gained on the magnitude and direction of land degradation in the Sahel. Differentiated science outputs are related to methods and data used at various scales.

    Science.gov (United States)

    Mbow, C.; Brandt, M.; Fensholt, R.; Ouedraogo, I.; Tagesson, T.

    2015-12-01

    Thematic gaps in land degradation trends in the Sahel. Trend in land degradation has been the most contended issue for arid and semi-arid regions. In the Sahel, depending on the scale of analysis and the methods and data used, the trends documented have not been consistent across authors and science disciplines. The assessment of land degradation and the quantification of its effects on land productivity have been studied for many decades, but little agreement has been gained on the magnitude and direction in the Sahel. This lack of consistency amid science outputs can be related to many methodological underpinnings and data used for various scales of analysis. Assessing biophysical trends on the ground requires long-term ground-based data collection to evaluate and better understand the mechanisms behind land dynamics. The Sahel is seen as greening by many authors; is that greening geographically consistent? These questions raise the importance of the scale of analysis and related drivers. The questions addressed concern not only factors explaining loss of tree cover but also regeneration of degraded land. The picture used is the heuristic cycle model to assess losses and damages vs gains and improvements of various land use practices. The presentation will address the following aspects: - How much do we know from satellite data after 40 years of remote sensing analysis over the Sahel? This section discusses agreements and divergences of evidence and differentiated interpretations of land degradation in the Sahel. - The biophysical factors that are relevant for tracking land degradation in the Sahel. Aspects such as disentangling human from climate factors and the biophysical factors behind land dynamics will be presented. - Some specific cases of drivers of land architecture transition under the combined influence of climate and human factors. - Based on the above, we will conclude with some key recommendations on how to improve land degradation assessment in the arid region of the Sahel.

  12. Self-consistency in Capital Markets

    Science.gov (United States)

    Benbrahim, Hamid

    2013-03-01

    Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.

  13. Student Effort, Consistency and Online Performance

    Directory of Open Access Journals (Sweden)

    Hilde Patron

    2011-07-01

    Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
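
A rough sketch of the kind of regression reported (hypothetical stand-in data and variable names, not the authors' data set): grade regressed on effort (total minutes online), consistency (standard deviation of time online), GPA as a motivation proxy, and the post-test minus pre-test difference as marginal learning.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical stand-in data; the real study used 212 online microeconomics students.
rng = np.random.default_rng(1)
n = 212
gpa         = rng.uniform(2.0, 4.0, n)       # motivation proxy
effort      = rng.normal(600, 150, n)        # total minutes spent online
consistency = rng.normal(60, 20, n)          # std. dev. of time online (lower = more consistent)
learning    = rng.normal(10, 5, n)           # post-test minus pre-test score
grade = 50 + 8 * gpa - 0.10 * consistency + 0.5 * learning + rng.normal(0, 5, n)

# Ordinary least squares with an intercept; the summary reports coefficients and p-values.
X = sm.add_constant(np.column_stack([effort, consistency, gpa, learning]))
model = sm.OLS(grade, X).fit()
print(model.summary(xname=["const", "effort", "consistency", "gpa", "marginal_learning"]))
```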

  14. Consistent thermodynamic properties of lipids systems

    DEFF Research Database (Denmark)

    Cunico, Larissa; Ceriani, Roberta; Sarup, Bent

    Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures ... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data ...
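
The record does not reproduce the specific consistency tests used. As a generic illustration of what a TPx consistency test looks like, the classic Redlich-Kister area test is sketched below with hypothetical activity coefficients; this is an example of the general idea, not the tests developed in the cited work.

```python
import numpy as np

# Redlich-Kister area test (illustrative): for thermodynamically consistent
# isothermal VLE data, the integral of ln(gamma1/gamma2) over x1 from 0 to 1
# should be close to zero.

x1 = np.linspace(0.05, 0.95, 19)          # liquid mole fractions (hypothetical grid)
# Hypothetical activity coefficients from a one-parameter Margules model, which
# satisfies the test exactly; real data would come from experiments.
A = 0.9
ln_gamma1 = A * (1 - x1) ** 2
ln_gamma2 = A * x1 ** 2
integrand = ln_gamma1 - ln_gamma2         # ln(gamma1/gamma2)

# Trapezoidal integration of the net area and of the absolute area.
net_area = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(x1))
abs_area = np.sum(0.5 * (np.abs(integrand)[1:] + np.abs(integrand)[:-1]) * np.diff(x1))
D = abs(net_area) / abs_area * 100        # common acceptance ratio; small D suggests consistency
print(f"net area = {net_area:.4f}, area ratio D = {D:.1f}%")
```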

  15. Consistency relation for cosmic magnetic fields

    DEFF Research Database (Denmark)

    Jain, R. K.; Sloth, M. S.

    2012-01-01

    If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.

  16. Internal Branding and Employee Brand Consistent Behaviours

    DEFF Research Database (Denmark)

    Mazzei, Alessandra; Ravazzani, Silvia

    2017-01-01

    Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non ... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain ...

  17. Evaluating Temporal Consistency in Marine Biodiversity Hotspots

    OpenAIRE

    Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...

  18. Cloud Standardization: Consistent Business Processes and Information

    Directory of Open Access Journals (Sweden)

    Razvan Daniel ZOTA

    2013-01-01

    Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.

  19. Consistency Analysis of Nearest Subspace Classifier

    OpenAIRE

    Wang, Yi

    2015-01-01

    The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
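
A minimal sketch of a nearest subspace classifier of the kind analyzed (an illustrative implementation, not the author's code): fit a low-dimensional subspace per class via the SVD and assign each point to the class with the smallest projection residual.

```python
import numpy as np

class NearestSubspaceClassifier:
    """Fit an r-dimensional subspace per class; classify by smallest projection residual."""

    def __init__(self, r=2):
        self.r = r
        self.bases_ = {}   # class label -> orthonormal basis (d x r)
        self.means_ = {}   # class label -> class mean (d,)

    def fit(self, X, y):
        for label in np.unique(y):
            Xc = X[y == label]
            mean = Xc.mean(axis=0)
            # Left singular vectors of the centered data span the best-fit subspace.
            U, _, _ = np.linalg.svd((Xc - mean).T, full_matrices=False)
            self.bases_[label] = U[:, : self.r]
            self.means_[label] = mean
        return self

    def predict(self, X):
        labels = list(self.bases_)
        residuals = []
        for label in labels:
            B, mean = self.bases_[label], self.means_[label]
            Xc = X - mean
            proj = Xc @ B @ B.T                      # projection onto the class subspace
            residuals.append(np.linalg.norm(Xc - proj, axis=1))
        return np.array(labels)[np.argmin(np.vstack(residuals), axis=0)]

# Toy usage with two classes living near different planes in R^5 (synthetic data).
rng = np.random.default_rng(0)
X0 = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5))
X1 = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5)) + 3.0
X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)
clf = NearestSubspaceClassifier(r=2).fit(X, y)
print("training accuracy:", (clf.predict(X) == y).mean())
```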

  20. Consistency relations in effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)

    2017-06-01

    The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of the velocity, θ-bar. Assuming a ΛCDM background cosmology, we find the correction to the SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of the EFT results relative to the SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.

  1. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  2. A consistent response spectrum analysis including the resonance range

    International Nuclear Information System (INIS)

    Schmitz, D.; Simmchen, A.

    1983-01-01

    The report provides a complete consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA), which, as far as we know, has been formulated in this way for the first time. A consistent RSA is so important because today this method is preferentially applied as an important tool for the earthquake proof of components in nuclear power plants. (orig./HP)
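
The consistent RSA formulated in the report is not reproduced here. For orientation only, the conventional baseline it is compared against, an SRSS combination of modal peak responses read off a response spectrum, can be sketched with hypothetical modal data:

```python
import numpy as np

# Conventional response-spectrum combination (SRSS), shown only for orientation;
# the report's consistent RSA additionally handles multi-support excitation and
# the resonance range, which is not reproduced here.

frequencies   = np.array([4.0, 11.0, 23.0])    # modal frequencies in Hz (hypothetical)
participation = np.array([1.4, -0.6, 0.25])    # modal participation factors (hypothetical)
mode_shapes   = np.array([0.8, 0.5, 0.2])      # modal ordinates at the point of interest

def spectral_acceleration(f_hz):
    """Hypothetical design floor response spectrum (m/s^2) as a function of frequency."""
    return np.where(f_hz < 10.0, 6.0, 6.0 * (10.0 / f_hz))

# Peak modal displacements Gamma_i * phi_i * Sa(f_i) / omega_i^2, then SRSS combination.
omega = 2.0 * np.pi * frequencies
modal_peaks = participation * mode_shapes * spectral_acceleration(frequencies) / omega ** 2
srss_response = np.sqrt(np.sum(modal_peaks ** 2))
print(f"SRSS peak displacement estimate: {srss_response * 1e3:.2f} mm")
```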

  3. Which method for quantifying urinary albumin excretion gives what outcome? A comparison of immunonephelometry with HPLC

    NARCIS (Netherlands)

    Brinkman, JW; Bakker, SJL; Gansevoort, RT; Hillege, HL; Kema, IP; Gans, ROB; De Jong, PE; De Zeeuw, D

    2004-01-01

    Background. Microalbuminuria has recently been identified as an independent risk factor for cardiovascular disease in the general population. Immunochemical urinary albumin assays only detect immunoreactive intact albumin. High performance liquid chromatography (HPLC) is able to detect both

  4. Consistency of color representation in smart phones.

    Science.gov (United States)

    Dain, Stephen J; Kwan, Benjamin; Wong, Leslie

    2016-03-01

    One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers, especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone 5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones have white-LED-backlit LCDs and the Samsungs have OLED displays. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4 and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone 4 and ±0.002 for the others, although the spread of white points between models was u'v'±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirrors the variation in the primaries. The variation in
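
The u'v' values quoted are CIE 1976 UCS chromaticity coordinates; the conversion from measured XYZ tristimulus values is a fixed formula, sketched below with made-up sample measurements.

```python
def xyz_to_uv_prime(X, Y, Z):
    """Convert CIE XYZ tristimulus values to CIE 1976 u', v' chromaticity coordinates."""
    denom = X + 15.0 * Y + 3.0 * Z
    if denom == 0.0:
        return 0.0, 0.0
    return 4.0 * X / denom, 9.0 * Y / denom

# Example with made-up measurements of two display white points; differences between
# phones would then be compared as a delta in the u'v' plane.
u1, v1 = xyz_to_uv_prime(95.0, 100.0, 105.0)
u2, v2 = xyz_to_uv_prime(96.5, 100.0, 102.0)
delta = ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5
print(f"u'v' #1 = ({u1:.4f}, {v1:.4f}), u'v' #2 = ({u2:.4f}, {v2:.4f}), delta u'v' = {delta:.4f}")
```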

  5. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
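
The consistency measure described, the standard deviation of performance across a system's clinics, is straightforward to compute; a sketch with made-up clinic-level rates:

```python
import pandas as pd

# Made-up clinic-level performance rates (share of diabetes patients meeting the
# intermediate outcome measure); the study's actual data are not reproduced here.
clinics = pd.DataFrame({
    "system":      ["A", "A", "A", "B", "B", "B", "B", "C", "C"],
    "performance": [0.42, 0.45, 0.44, 0.30, 0.52, 0.41, 0.47, 0.38, 0.39],
})

# Lower standard deviation = more consistent performance across a system's clinics.
summary = clinics.groupby("system")["performance"].agg(mean_performance="mean",
                                                       consistency_sd="std")
print(summary.sort_values("consistency_sd"))
```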

  6. method

    Directory of Open Access Journals (Sweden)

    L. M. Kimball

    2002-01-01

    Full Text Available This paper presents an interior point algorithm to solve the multiperiod hydrothermal economic dispatch (HTED). The multiperiod HTED is a large scale nonlinear programming problem. Various optimization methods have been applied to the multiperiod HTED, but most neglect important network characteristics or require decomposition into thermal and hydro subproblems. The algorithm described here exploits the special bordered block diagonal structure and sparsity of the Newton system for the first order necessary conditions, resulting in a fast, efficient algorithm that can account for all network aspects. Applying this new algorithm challenges a conventional method for the use of available hydro resources known as the peak shaving heuristic.

  7. Saving reed lands by giving economic value to reed

    Directory of Open Access Journals (Sweden)

    F.W. Croon

    2014-07-01

    Full Text Available Discussions about the need for renewable energy, the need for nature conservation, the need to double the world’s food production to eliminate hunger, the need to reduce carbon dioxide emission, and the wish to reduce dependency on dwindling oil resources, show that these issues are intimately related and sometimes mutually exclusive. The use of food crops for the production of renewable fuels has resulted in the energy vs. food debate; the use of scarce land and fresh water for the dedicated production of biomass conflicts with food production and nature conservation; the collection of harvest residues and forest wastes as biomass to produce renewable fuels is complex and leaves a CO2 footprint. The several species of reed that grow naturally in deltas, river plains etc. can provide large amounts of biomass but are hardly mentioned in the debates. Harvesting reed does not threaten the nature and the natural functions of reed lands, which are carbon neutral or carbon dioxide sinks. Reed production does not need extensive infrastructure or complex cultivation and does not compete with food production for land and fresh water. Reed lands in many places are under threat of reclamation for economic activities and urbanisation. This trend can be countered if reed is seen to have a proven economic value. In this article I argue that giving a sustainable economic value to reed lands can only be realised if the exploitation is recognised as being environmentally acceptable, commercially feasible and a source of economic gains for all stakeholders. Commercial feasibility can be achieved under present economic conditions only if a reliable supply of considerable volumes of reed at a limited price can be guaranteed.

  8. Giving birth with rape in one's past: a qualitative study.

    Science.gov (United States)

    Halvorsen, Lotta; Nerum, Hilde; Oian, Pål; Sørlie, Tore

    2013-09-01

    Rape is one of the most traumatizing violations a woman can be subjected to, and leads to extensive health problems, predominantly psychological ones. A large proportion of women develop a form of posttraumatic stress termed Rape Trauma Syndrome. A previous study by our research group has shown that women with a history of rape far more often had an operative delivery in their first birth and those who gave birth vaginally had second stages twice as long as women with no history of sexual assault. The aim of this study is to examine and illuminate how women previously subjected to rape experience giving birth for the first time and their advice on the kind of birth care they regard as good for women with a history of rape. A semi-structured interview with 10 women, who had been exposed to rape before their first childbirth. Data on the birth experience were analyzed by qualitative content analysis. The main theme was "being back in the rape" with two categories: "reactivation of the rape during labor," with subcategories "struggle," "surrender," and "escape" and "re-traumatization after birth," with the subcategories "objectified," "dirtied," and "alienated body." A rape trauma can be reactivated during the first childbirth regardless of mode of delivery. After birth, the women found themselves re-traumatized with the feeling of being dirtied, alienated, and reduced to just a body that another body is to come out of. Birth attendants should acknowledge that the common measures and procedures used during normal birth or cesarean section can contribute to a reactivation of the rape trauma. © 2013, Copyright the Authors Journal compilation © 2013, Wiley Periodicals, Inc.

  9. Simulated parents: developing paediatric trainees' skills in giving bad news.

    Science.gov (United States)

    Gough, Jenny K; Frydenberg, Alexis R; Donath, Susan K; Marks, Michael M

    2009-03-01

    In curriculum documents for medicine in undergraduate, post-graduate and continuing professional development, there is now a focus on communication skills. The challenges are to place communication skills in the crowded curriculum and then to construct and sustain a programme that uses an evidence-based approach to the teaching and learning of communication skills. For 6 years, we have conducted a programme that involves simulated parents supporting junior medical staff to refine their skills in communication, particularly in giving parents bad news. The aim of our study was to obtain a better understanding of the trainees' experiences of the programme. Nine junior residents individually worked through two scenarios, giving bad news to a simulated parent/actor who then gave feedback. A recording of the simulation was provided for discussion with a designated colleague at an arranged time. The tapes were then separately appraised by two independent raters - another actor and a paediatrician. Brief written reports and semi-structured interviews provided further insights into the trainees' experience of the simulation. Other participating medical/medical education staff were interviewed about the simulation programme. Five themes emerged from the qualitative data: timeliness, emotional safety, the complexity of communication, practical usefulness and the challenge of effecting change. In addition, the ratings of the videos helped to clarify those 'parent-centred' communication skills that trainees may neglect in difficult conversations: 'ask about support', 'encourage the parent to ask questions' and 'repeat key messages'. The evaluation highlighted the value of an early-career experiential programme in underlining the importance of communication skills in post-graduate paediatrics practice.

  10. Self-consistent approximations beyond the CPA: Part II

    International Nuclear Information System (INIS)

    Kaplan, T.; Gray, L.J.

    1982-01-01

    This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented space formalism for a binary alloy is sketched, and the notation to be used is derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function, and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S{sub T} exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described.

  11. Short-Cut Estimators of Criterion-Referenced Test Consistency.

    Science.gov (United States)

    Brown, James Dean

    1990-01-01

    Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…
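
For reference, the threshold-loss agreement approach mentioned can be illustrated by computing the observed agreement and Cohen's kappa from a 2x2 mastery/non-mastery classification table for two test administrations (made-up counts; the article's short-cut estimators themselves are not reproduced here).

```python
# Agreement and kappa for a criterion-referenced (mastery/non-mastery) test,
# computed from a 2x2 classification table of two test administrations.
# Counts below are made up for illustration.

a = 34   # master on both administrations
b = 6    # master on first only
c = 4    # master on second only
d = 16   # non-master on both
n = a + b + c + d

p_o = (a + d) / n                                    # observed agreement
p_master_1, p_master_2 = (a + b) / n, (a + c) / n    # marginal mastery proportions
p_e = p_master_1 * p_master_2 + (1 - p_master_1) * (1 - p_master_2)  # chance agreement
kappa = (p_o - p_e) / (1 - p_e)

print(f"observed agreement p_o = {p_o:.3f}, chance agreement p_e = {p_e:.3f}, kappa = {kappa:.3f}")
```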

  12. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary. Background: Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives: The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods: Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results: Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions: Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
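
A minimal sketch of the regression-forest idea described, using scikit-learn rather than the R packages referenced in the paper: the 0/1 response is treated as a regression target so that the forest's prediction estimates the conditional probability (synthetic data, not the appendicitis or Pima Indians data sets).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic binary-outcome data with a known true probability for evaluation.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
true_prob = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 1])))
y = rng.binomial(1, true_prob)

X_train, X_test, y_train, y_test, p_train, p_test = train_test_split(
    X, y, true_prob, test_size=0.25, random_state=0)

# Regressing the 0/1 response gives direct conditional probability estimates.
forest = RandomForestRegressor(n_estimators=500, min_samples_leaf=10, random_state=0)
forest.fit(X_train, y_train)
prob_hat = np.clip(forest.predict(X_test), 0.0, 1.0)

print("mean absolute error vs. true probabilities:",
      round(float(np.abs(prob_hat - p_test).mean()), 3))
```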

  13. Diagnosing a Strong-Fault Model by Conflict and Consistency

    Directory of Open Access Journals (Sweden)

    Wenfeng Zhang

    2018-03-01

    Full Text Available The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model’s prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidates efficiently based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods are significantly better than best-first and conflict-directed A* search methods.
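
The LTMS itself is not reproduced here. As a self-contained toy illustration of the conflict-based reasoning described, the brute-force sketch below finds minimal conflicts, i.e. sets of components that cannot all be behaving normally given the observations, for a tiny hypothetical adder circuit (not the spacecraft heat control unit).

```python
from itertools import combinations

# Toy illustration (not the paper's LTMS): components are adders; a "conflict" is a
# set of components that cannot all be behaving normally given the observations.

inputs = {"a": 1, "b": 1, "c": 1, "d": 1}
observed = {"p": 2, "r": 5}                    # hypothetical observations; r = 5 is inconsistent

def consistent(normal_set):
    """Propagate values assuming only the components in normal_set behave normally.
    Returns False if a prediction contradicts an observed or previously derived value."""
    values = dict(inputs)
    values.update(observed)
    rules = {
        "A1": ("p", lambda v: v["a"] + v["b"]),
        "A2": ("q", lambda v: v["c"] + v["d"]),
        "A3": ("r", lambda v: v["p"] + v["q"]),
    }
    for _ in range(len(rules)):                # iterate until derived values stabilize
        for name, (out, fn) in rules.items():
            if name in normal_set:
                try:
                    pred = fn(values)
                except KeyError:               # inputs of this component not yet known
                    continue
                if out in values and values[out] != pred:
                    return False
                values[out] = pred
    return True

components = ["A1", "A2", "A3"]
conflicts = [set(s) for k in range(1, 4) for s in combinations(components, k)
             if not consistent(set(s))]
minimal = [c for c in conflicts if not any(other < c for other in conflicts)]
print("minimal conflicts:", minimal)
```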

  14. Consistent dynamical and statistical description of fission and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    The research survey of the consistent dynamical and statistical description of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).

  15. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…

  16. Diagnosing a Strong-Fault Model by Conflict and Consistency.

    Science.gov (United States)

    Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan

    2018-03-29

    The diagnosis method for a weak-fault model with only normal behaviors of each component has evolved over decades. However, many systems now demand strong-fault models, the fault modes of which have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded by Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidates efficiently based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods are significantly better than best-first and conflict-directed A* search methods.

  17. Consistency Checking of Web Service Contracts

    DEFF Research Database (Denmark)

    Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter

    2008-01-01

    Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Work Bench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault

  18. Gentzen's centenary the quest for consistency

    CERN Document Server

    Rathjen, Michael

    2015-01-01

    Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.

  19. Two consistent calculations of the Weinberg angle

    International Nuclear Information System (INIS)

    Fairlie, D.B.

    1979-01-01

    The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea borrowed from monopole theory that the electromagnetic field is in the direction of the Higgs field. (Author)
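
For reference, in the standard electroweak conventions (not specific to this paper's six-dimensional construction) the Weinberg angle is defined by

```latex
\tan\theta_W = \frac{g'}{g}, \qquad
\sin^2\theta_W = \frac{g'^{2}}{g^{2} + g'^{2}}, \qquad
e = g \sin\theta_W ,
```

so a value of 30° corresponds to sin²θ_W = 1/4, i.e. g' = g/√3.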

  20. Interesting association rule mining with consistent and inconsistent rule detection from big sales data in distributed environment

    Directory of Open Access Journals (Sweden)

    Dinesh J. Prajapati

    2017-06-01

    Full Text Available Nowadays, there is an increasing demand for mining interesting patterns from big data. The process of analyzing such a huge amount of data is a computationally complex task when using traditional methods. The overall purpose of this paper is twofold. First, this paper presents a novel approach to identify consistent and inconsistent association rules from sales data located in a distributed environment. Secondly, the paper also overcomes the main-memory bottleneck and computing-time overhead of a single computing system by applying the computations to a multi-node cluster. The proposed method initially extracts frequent itemsets for each zone using existing distributed frequent pattern mining algorithms. The paper also compares the time efficiency of the MapReduce based frequent pattern mining algorithm with the Count Distribution Algorithm (CDA) and Fast Distributed Mining (FDM) algorithms. The association rules generated from frequent itemsets are so numerous that it becomes complex to analyze them. Thus, a MapReduce based consistent and inconsistent rule detection (MR-CIRD) algorithm is proposed to detect the consistent and inconsistent rules from big data and provide useful and actionable knowledge to the domain experts. These pruned interesting rules also give useful knowledge for better marketing strategies. The extracted consistent and inconsistent rules are evaluated and compared based on different interestingness measures, presented together with experimental results that lead to the final conclusions.
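
As a small self-contained illustration of the zone-wise rule comparison described (toy transactions and an arbitrary threshold, not the MR-CIRD implementation), confidence can be computed per zone and a rule flagged consistent when it varies little across zones:

```python
from itertools import combinations

# Tiny per-zone transaction data (hypothetical); in the paper each zone would be a
# distributed partition processed with MapReduce.
zones = {
    "zone1": [{"milk", "bread"}, {"milk", "bread", "butter"}, {"bread", "butter"}, {"milk", "bread"}],
    "zone2": [{"milk", "bread"}, {"milk", "butter"}, {"milk", "bread", "butter"}, {"bread"}],
}

def confidence(transactions, antecedent, consequent):
    both = sum(1 for t in transactions if antecedent <= t and consequent <= t)
    ante = sum(1 for t in transactions if antecedent <= t)
    return both / ante if ante else 0.0

# Candidate 1 -> 1 rules built from items seen in every zone.
items = set.intersection(*[set().union(*txs) for txs in zones.values()])
rules = [(frozenset({a}), frozenset({c})) for a, c in combinations(sorted(items), 2)]
rules += [(c, a) for a, c in rules]

# A rule is "consistent" if its confidence varies little across zones (band chosen arbitrarily).
BAND = 0.25
for antecedent, consequent in rules:
    confs = [confidence(txs, antecedent, consequent) for txs in zones.values()]
    status = "consistent" if max(confs) - min(confs) <= BAND else "inconsistent"
    print(f"{set(antecedent)} -> {set(consequent)}: {[round(c, 2) for c in confs]} ({status})")
```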

  1. Synchrotron radiation gives insight in smaller and smaller crystals

    International Nuclear Information System (INIS)

    Hintsches, E.

    1983-01-01

    Scientists from the ''Max-Planck-Institut fuer Festkoerperforschung'' in Stuttgart have extended the method of X-ray analysis to study the structure of very small crystals. For the first time a crystal with 6 μm linear dimension has been successfully analysed using the synchrotron radiation from the DESY electron synchrotron at Hamburg. Thus this important method of analysis has been demonstrated to be useful for structural studies of crystals that are smaller by a factor of 20 than hitherto possible. (orig.) [de]

  2. GIVE THE PUBLIC SOMETHING, SOMETHING MORE INTERESTING THAN RADIOACTIVE WASTE

    Energy Technology Data Exchange (ETDEWEB)

    Codee, Hans D.K.

    2003-02-27

    In the Netherlands the policy to manage radioactive waste is somewhat different from that in other countries, although the practical outcome is not much different. Long-term, i.e. at least 100 years, storage in above ground engineered structures of all waste types is the first element in the Dutch policy. The second element, but equally important, is that deep geologic disposal is foreseen after the storage period. This policy was brought out in the early eighties and was communicated to the public as a practical, logical and feasible management system for the Dutch situation. Strong opposition existed at that time to deep disposal in salt domes in the Netherlands. Above ground storage in principle was not rejected, because the need to do something was obvious. Volunteers for a long term storage site did not automatically emerge. A site selection procedure was followed and resulted in the present site at Vlissingen-Oost. The waste management organization, COVRA, was not really welcomed here, but was tolerated. In the nineties facilities for low and medium level waste were erected and commissioned. In the design of the facilities much attention was given to emotional factors. The first ten operational years were needed to gain trust from the local population. Impeccable conduct and behavior was necessary as well as honesty and full openness to the public. Now, after some ten years, the COVRA facilities are accepted. And a new phase is entered with the commissioning of the storage facility for high level waste, the HABOG facility. A visit to that facility will not be very spectacular; activities take place only during loading and unloading. Furthermore it is a facility for waste, so unwanted material will be brought into the community. In order to give the public something more interesting, the building itself is transformed into a piece of art and, inside, a special work of art will be displayed. Together with that the attitude of the company will change. We are

  3. GIVE THE PUBLIC SOMETHING, SOMETHING MORE INTERESTING THAN RADIOACTIVE WASTE

    International Nuclear Information System (INIS)

    Codee, Hans D.K.

    2003-01-01

    In the Netherlands the policy to manage radioactive waste is somewhat different from that in other countries, although the practical outcome is not much different. Long-term, i.e. at least 100 years, storage in above ground engineered structures of all waste types is the first element in the Dutch policy. The second element, but equally important, is that deep geologic disposal is foreseen after the storage period. This policy was brought out in the early eighties and was communicated to the public as a practical, logical and feasible management system for the Dutch situation. Strong opposition existed at that time to deep disposal in salt domes in the Netherlands. Above ground storage in principle was not rejected, because the need to do something was obvious. Volunteers for a long term storage site did not automatically emerge. A site selection procedure was followed and resulted in the present site at Vlissingen-Oost. The waste management organization, COVRA, was not really welcomed here, but was tolerated. In the nineties facilities for low and medium level waste were erected and commissioned. In the design of the facilities much attention was given to emotional factors. The first ten operational years were needed to gain trust from the local population. Impeccable conduct and behavior was necessary as well as honesty and full openness to the public. Now, after some ten years, the COVRA facilities are accepted. And a new phase is entered with the commissioning of the storage facility for high level waste, the HABOG facility. A visit to that facility will not be very spectacular; activities take place only during loading and unloading. Furthermore it is a facility for waste, so unwanted material will be brought into the community. In order to give the public something more interesting, the building itself is transformed into a piece of art and, inside, a special work of art will be displayed. Together with that the attitude of the company will change. We are

  4. Fourier rebinning and consistency equations for time-of-flight PET planograms

    International Nuclear Information System (INIS)

    Li, Yusheng; Matej, Samuel; Metzler, Scott D; Defrise, Michel

    2016-01-01

    Due to the unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. The time-of-flight (TOF) information can be a promising solution to reduce these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. The TOF planograms are five-dimensional while the objects are three-dimensional, and there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of the fully 3D TOF planograms. We first derive two consistency equations and John’s equation for 3D TOF planograms. By taking the Fourier transforms, we obtain two Fourier consistency equations (FCEs) and the Fourier–John equation (FJE), which are the duals of the consistency equations and John’s equation, respectively. We then solve the FCEs and FJE using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As the special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus we obtain a complete scheme to convert among different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF planograms. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. As a byproduct, we show the two consistency equations are necessary and sufficient for 3D TOF planograms. Finally, we give

  5. Consistent resolution of some relativistic quantum paradoxes

    International Nuclear Information System (INIS)

    Griffiths, Robert B.

    2002-01-01

    A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separate regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics

  6. Self-consistent model of confinement

    International Nuclear Information System (INIS)

    Swift, A.R.

    1988-01-01

    A model of the large-spatial-distance, zero--three-momentum, limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency

  7. Subgame consistent cooperation a comprehensive treatise

    CERN Document Server

    Yeung, David W K

    2016-01-01

    Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior could lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the cooperation duration. It is due to the lack of such a guarantee that cooperative schemes fail to last until their end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering up-to-date, state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...

  8. Consistent mutational paths predict eukaryotic thermostability

    Directory of Open Access Journals (Sweden)

    van Noort Vera

    2013-01-01

    Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
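
    The lysine-to-arginine signal described above can be screened for with a very simple alignment scan. The sketch below assumes pre-aligned, equal-length sequences and counts positions where a mesophilic sequence has K and its thermophilic counterpart has R; it only illustrates the idea and is not the phylogenetics-guided pipeline used in the paper. The example sequences are made up.

    # Count candidate thermostabilizing K->R substitutions between two aligned sequences.
    def count_k_to_r(mesophile: str, thermophile: str) -> int:
        if len(mesophile) != len(thermophile):
            raise ValueError("sequences must be aligned to equal length")
        return sum(
            1
            for a, b in zip(mesophile, thermophile)
            if a == "K" and b == "R"
        )

    # Invented toy alignment, not real proteome data.
    meso = "MKTAYKLLDKAGEK-"
    thermo = "MRTAYRLLDKAGER-"
    print(count_k_to_r(meso, thermo))  # 3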

  9. Consistent biokinetic models for the actinide elements

    International Nuclear Information System (INIS)

    Leggett, R.W.

    2001-01-01

    The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)

  10. Selection, classification and design requirements of the systems important to safety in a radiopharmaceuticals and labelled compounds production centre

    International Nuclear Information System (INIS)

    Perez Pijuan, S.; Ayra Pardo, F.E.; Ilizastegui Perez, F.

    1998-01-01

    In this work, the safety functions that the systems, subsystems and components must fulfil to guarantee the safety of the workers and the public are identified. The systems involved in the safety of the installation were selected and classified into categories by deterministic methods, considering their role in the detection and mitigation of postulated radiological events and in maintaining normal operating conditions

  11. How to give the gift of hospitality. Great customer service.

    Science.gov (United States)

    Schechter, M

    1994-08-01

    Whether it takes the form of greeting customers with a smile, redressing a diner's grievance or conducting special kitchen tours, providing customer service has become the number-one priority in foodservices coast to coast. Operators share tips & training methods that are helping staffs provide the hospitable services today's customers are demanding.

  12. Combining unmalted barley and pearling gives good quality brewing

    NARCIS (Netherlands)

    Donkelaar, van Laura H.G.; Hageman, Jos A.; Oguz, Serhat; Noordman, Tom R.; Boom, Remko M.; Goot, van der Atze Jan

    2016-01-01

    Brewing with unmalted barley can reduce the use of raw materials, thereby increasing the efficiency of the brewing process. However, unmalted barley contains several undesired components for brewing and has a low enzymatic activity. Pearling, an abrasive milling method, has been proposed as a

  13. Self-consistent studies of magnetic thin film Ni (001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations

  14. Gender Differences In Giving Directions: A Case Study Of English Literature Students At Binus University

    Directory of Open Access Journals (Sweden)

    Tjoo Hong Sing

    2011-05-01

    Many researchers have said that there are differences in the ways males and females give directions, especially in spatial tasks (cardinal directions, topography, mileage, buildings, right/left markers) (e.g., Lawton, 2001; Dabbs et al., 1998). Here, the thesis investigates what differences occur between the two genders in giving directions. The respondents were 25 female and 25 male fifth-semester Binus University students majoring in English Literature. The respondents described a certain route from Binus’s Anggrek Campus to Senayan City. The study was conducted using qualitative and quantitative methods. From the data analysis, the writer found that gender does affect the choice of key words when explaining directions: there were differences in the key words chosen by females and males when giving directions. Specifically, women used more than twice as many spatial references as men did. In terms of verbal abilities, it was confirmed that females gave longer explanations. However, for other aspects such as serial orientation and maintenance words, the results were inconclusive.

  15. Robust Visual Tracking Via Consistent Low-Rank Sparse Learning

    KAUST Repository

    Zhang, Tianzhu

    2014-06-19

    Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since the temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed-form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.

  16. Consistency of canonical formulation of Horava gravity

    International Nuclear Information System (INIS)

    Soo, Chopin

    2011-01-01

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  17. Consistency of canonical formulation of Horava gravity

    Energy Technology Data Exchange (ETDEWEB)

    Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)

    2011-09-22

    Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.

  18. A consistent thermodynamic database for cement minerals

    International Nuclear Information System (INIS)

    Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.

    2010-01-01

    work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - and, finally, the log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H is no longer stable and is replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as the silica concentration depends essentially on pH. Some rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field towards the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, notably with respect to a hydrogarnet/grossular solid solution. Compared with other databases, this work was made consistent with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context

  19. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew

    2017-01-01

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
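
    The degree-correlation diagnostic used here can be written compactly: for each spherical-harmonic degree l, the correlation between two fields is the normalized inner product of their coefficients over all orders m. The sketch below assumes the coefficients are already available as real arrays indexed by (l, m); computing them from gridded evaporation or precipitation fields would require an additional spherical-harmonic transform, which is not shown. The coefficient layout and the random test data are illustrative assumptions.

    import numpy as np

    def degree_correlation(a_lm: np.ndarray, b_lm: np.ndarray) -> np.ndarray:
        """Per-degree correlation r_l = sum_m a_lm*b_lm / sqrt(sum_m a_lm^2 * sum_m b_lm^2).

        a_lm, b_lm: shape (L+1, 2L+1), real coefficients for degrees 0..L and orders
        -l..l (unused entries padded with zeros). The layout is an assumption.
        """
        num = (a_lm * b_lm).sum(axis=1)
        den = np.sqrt((a_lm**2).sum(axis=1) * (b_lm**2).sum(axis=1))
        return np.where(den > 0, num / den, np.nan)

    # Random coefficients standing in for transformed evaporation/precipitation fields.
    rng = np.random.default_rng(1)
    L = 20
    evap = rng.normal(size=(L + 1, 2 * L + 1))
    precip = 0.5 * evap + rng.normal(size=evap.shape)  # partially correlated field
    print(degree_correlation(evap, precip)[:5])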

  20. Non linear self consistency of microtearing modes

    International Nuclear Information System (INIS)

    Garbet, X.; Mourgues, F.; Samain, A.

    1987-01-01

    The self-consistency of a microtearing turbulence is studied in nonlinear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via Ampere's law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible

  1. Consistent evolution in a pedestrian flow

    Science.gov (United States)

    Guan, Junbiao; Wang, Kaihua

    2016-03-01

    In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may be largely attributed to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think rationally and independently, are two necessary factors for a short evacuation time.

  2. Evaluating the hydrological consistency of evaporation products

    KAUST Repository

    Lopez Valencia, Oliver Miguel

    2017-01-18

    Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months

  3. Using Data Communication to Give Ease in Hotel Room Services

    OpenAIRE

    Tjiptadi, Rudi

    2011-01-01

    Gaining extra comfort in a trip is an important factor. Staying in a hotel needs food, laundry, and other activities that can make guests comfortable. The guests’ requests are usually placed with Room Service. Sometimes problems occur in serving the guests’ requests due to human error, such as overdue orders, misunderstandings, etc. Computers are used to prevent those problems by typing requests directly from a computer in the room. The method is done by collecting data from the direct in...

  4. Informed Consent and Capacity to Give Consent in Mental Disorders

    OpenAIRE

    Zeynep Mackali

    2014-01-01

    Among the four basic principles (respect for autonomy, beneficence, non-maleficence, and justice) which determine ethical behavior in healthcare, informed consent is most closely related to 'respect for autonomy'. It also reflects the patient/client's right to decide and the value given to the client and his/her autonomy. Informed consent is an information-sharing process including both rational decision-making about the most appropriate method among many different options and the interacti...

  5. STP: A mathematically and physically consistent library of steam properties

    International Nuclear Information System (INIS)

    Aguilar, F.; Hutter, A.C.; Tuttle, P.G.

    1982-01-01

    A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package

  6. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...

  7. Demotivating incentives and motivation crowding out in charitable giving.

    Science.gov (United States)

    Chao, Matthew

    2017-07-11

    Research has shown that extrinsic incentives can crowd out intrinsic motivation in many contexts. Despite this, many nonprofits offer conditional thank-you gifts, such as mugs or tote bags, in exchange for donations. In collaboration with a nonprofit, this study implements a direct mail field experiment and demonstrates that thank-you gifts reduced donation rates in a fundraising campaign. Attention-based multiattribute choice models suggest that this is because prospective donors shift attention to the salient gift offer, causing them to underweight less salient intrinsic motives. Attention to the gift may also cause individuals to adopt a more cost-benefit mindset, further de-emphasizing intrinsic motives. Consistent with these hypotheses, crowding out was driven by those who donated higher amounts in the previous year (i.e., those who likely had higher intrinsic motivation). In a complementary online experiment, thank-you gifts also reduced donation rates but only when the gift was visually salient. This corroborates the mediating role of attention in crowding out. Taken together, the laboratory and field results demonstrate that this fundraising technique can be demotivating in some contexts and that this may occur through an attention-based mechanism.

  8. Thermodynamically consistent data-driven computational mechanics

    Science.gov (United States)

    González, David; Chinesta, Francisco; Cueto, Elías

    2018-05-01

    In the paradigm of data-intensive science, automated, unsupervised discovery of governing equations for a given physical phenomenon has attracted a lot of attention in several branches of applied sciences. In this work, we propose a method able to avoid the identification of the constitutive equations of complex systems and rather work in a purely numerical manner by employing experimental data. In sharp contrast to most existing techniques, this method does not rely on the assumption of any particular form for the model (other than some fundamental restrictions placed by classical physics such as the second law of thermodynamics, for instance), nor does it force the algorithm to find, among a predefined set of operators, those whose predictions best fit the available data. Instead, the method is able to identify both the Hamiltonian (conservative) and dissipative parts of the dynamics while satisfying fundamental laws such as energy conservation or positive production of entropy, for instance. The proposed method is tested against some examples of discrete as well as continuum mechanics, whose accurate results demonstrate the validity of the proposed approach.
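
    For background, the split into a conservative and a dissipative part with built-in thermodynamic constraints is commonly written in GENERIC form; the equations below are given as context under that assumption and are a paraphrase, not a quotation of the paper's own formulation.

    \[
    \dot{z} \;=\; L(z)\,\frac{\partial E}{\partial z} \;+\; M(z)\,\frac{\partial S}{\partial z},
    \qquad
    L(z)\,\frac{\partial S}{\partial z} = 0,
    \qquad
    M(z)\,\frac{\partial E}{\partial z} = 0,
    \]

    where z collects the state variables, E is the energy, S the entropy, L a skew-symmetric (reversible) operator and M a symmetric, positive semi-definite (dissipative) operator. The two degeneracy conditions are what guarantee energy conservation and non-negative entropy production, i.e. the fundamental laws the abstract refers to.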

  9. Distillation methods

    International Nuclear Information System (INIS)

    Konecny, C.

    1975-01-01

    Two main methods of separation using the distillation method are given and evaluated, namely evaporation and distillation in carrier gas flow. Two basic apparatus are described for illustrating the methods used. The use of the distillation method in radiochemistry is documented by a number of examples of the separation of elements in elemental state, volatile halogenides and oxides. Tables give a survey of distillation methods used for the separation of the individual elements and give conditions under which this separation takes place. The suitability of the use of distillation methods in radiochemistry is discussed with regard to other separation methods. (L.K.)

  10. What are the impacts of giving up the driver license?

    OpenAIRE

    Siren, Anu Kristiina; Haustein, Sonja

    2013-01-01

    Objectives: Driving cessation is a gradual process, where driver’s self-regulation plays an important role. Age-based license renewal procedures may interfere with this process and trigger premature driving cessation. The present study compares drivers aged 69 years at the baseline who either renewed their driver’s license (“renewers”) or did not (“non-renewers”) over a two-year period.Methods: Data were collected by interviewing a sample of older Danish people in 2009 (n = 1,792) and in 2012...

  11. Staying home to give birth: why women in the United States choose home birth.

    Science.gov (United States)

    Boucher, Debora; Bennett, Catherine; McFarlin, Barbara; Freeze, Rixa

    2009-01-01

    Approximately 1% of American women give birth at home and face substantial obstacles when they make this choice. This study describes the reasons that women in the United States choose home birth. A qualitative descriptive secondary analysis was conducted in a previously collected dataset obtained via an online survey. The sample consisted of 160 women who were US residents and planned a home birth at least once. Content analysis was used to study the responses from women to one essay question: "Why did you choose home birth?" Women who participated in the study were mostly married (91%) and white (87%). The majority (62%) had a college education. Our analysis revealed 508 separate statements about why these women chose home birth. Responses were coded and categorized into 26 common themes. The most common reasons given for wanting to birth at home were: 1) safety (n = 38); 2) avoidance of unnecessary medical interventions common in hospital births (n = 38); 3) previous negative hospital experience (n = 37); 4) more control (n = 35); and 5) comfortable, familiar environment (n = 30). Another dominant theme was women's trust in the birth process (n = 25). Women equated medical intervention with reduced safety and trusted their bodies' inherent ability to give birth without interference.

  12. The teacher benefits from giving autonomy support during physical education instruction.

    Science.gov (United States)

    Cheon, Sung Hyeon; Reeve, Johnmarshall; Yu, Tae Ho; Jang, Hue Ryen

    2014-08-01

    Recognizing that students benefit when they receive autonomy-supportive teaching, the current study tested the parallel hypothesis that teachers themselves would benefit from giving autonomy support. Twenty-seven elementary, middle, and high school physical education teachers (20 males, 7 females) were randomly assigned either to participate in an autonomy-supportive intervention program (experimental group) or to teach their physical education course with their existing style (control group) within a three-wave longitudinal research design. Manipulation checks showed that the intervention was successful, as students perceived and raters scored teachers in the experimental group as displaying a more autonomy-supportive and less controlling motivating style. In the main analyses, ANCOVA-based repeated-measures analyses showed large and consistent benefits for teachers in the experimental group, including greater teaching motivation (psychological need satisfaction, autonomous motivation, and intrinsic goals), teaching skill (teaching efficacy), and teaching well-being (vitality, job satisfaction, and lesser emotional and physical exhaustion). These findings show that giving autonomy support benefits teachers in much the same way that receiving it benefits their students.

  13. EVALUATION OF SERVICE QUALITY OF AIRWAY COMPANIES GIVING DOMESTIC SERVICES IN TURKEY WITH FUZZY SET APPROACH

    Directory of Open Access Journals (Sweden)

    H. Handan DEMIR

    2013-01-01

    Today, service quality has become a major phenomenon, as rising competition between companies requires meeting consumer demands in the best possible way. Airway transportation has been preferred more and more in recent years. Many qualitative and quantitative criteria are considered while evaluating service quality in airway transportation. In this context, evaluation of service quality is a decision-making problem with many criteria. The purpose of this study is to evaluate the service quality of domestic airway companies in Turkey. The study uses the fuzzy TOPSIS method, one of the most widely preferred fuzzy MCDM methods, which extends multi-criteria decision-making to fuzzy environments, considers qualitative and quantitative criteria together, and allows group decisions to be made in fuzzy environments. As a result, the most preferred airway companies in Turkey were evaluated against service quality criteria and ranked according to their levels of service quality.
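
    As a rough illustration of the ranking step, the sketch below implements plain (crisp) TOPSIS on a made-up decision matrix; the fuzzy variant used in the study replaces crisp scores with fuzzy numbers and fuzzy distances, but the closeness-coefficient ranking idea is the same. All criteria, weights and scores here are invented for illustration.

    import numpy as np

    def topsis(scores: np.ndarray, weights: np.ndarray, benefit: np.ndarray) -> np.ndarray:
        """Rank alternatives (rows) over criteria (columns) by closeness to the ideal solution."""
        # Vector-normalize each criterion, then apply the criterion weights.
        norm = scores / np.linalg.norm(scores, axis=0)
        v = norm * weights
        # Positive/negative ideal solutions depend on whether a criterion is benefit or cost.
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)  # closeness coefficient, higher is better

    # Invented example: 3 airlines scored on punctuality, comfort, price (price is a cost criterion).
    scores = np.array([[8.0, 7.0, 250.0],
                       [6.5, 8.5, 200.0],
                       [7.5, 6.0, 180.0]])
    weights = np.array([0.5, 0.3, 0.2])
    benefit = np.array([True, True, False])
    print(topsis(scores, weights, benefit))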

  14. Validation of consistency of Mendelian sampling variance.

    Science.gov (United States)

    Tyrisevä, A-M; Fikse, W F; Mäntysaari, E A; Jakobsen, J; Aamand, G P; Dürr, J; Lidauer, M H

    2018-03-01

    Experiences from international sire evaluation indicate that the multiple-trait across-country evaluation method is sensitive to changes in genetic variance over time. Top bulls from birth year classes with inflated genetic variance will benefit, hampering reliable ranking of bulls. However, none of the methods available today enable countries to validate their national evaluation models for heterogeneity of genetic variance. We describe a new validation method to fill this gap comprising the following steps: estimating within-year genetic variances using Mendelian sampling and its prediction error variance, fitting a weighted linear regression between the estimates and the years under study, identifying possible outliers, and defining a 95% empirical confidence interval for a possible trend in the estimates. We tested the specificity and sensitivity of the proposed validation method with simulated data using a real data structure. Moderate (M) and small (S) size populations were simulated under 3 scenarios: a control with homogeneous variance and 2 scenarios with yearly increases in phenotypic variance of 2 and 10%, respectively. Results showed that the new method was able to estimate genetic variance accurately enough to detect bias in genetic variance. Under the control scenario, the trend in genetic variance was practically zero in setting M. Testing cows with an average birth year class size of more than 43,000 in setting M showed that tolerance values are needed for both the trend and the outlier tests to detect only cases with a practical effect in larger data sets. Regardless of the magnitude (yearly increases in phenotypic variance of 2 or 10%) of the generated trend, it deviated statistically significantly from zero in all data replicates for both cows and bulls in setting M. In setting S with a mean of 27 bulls in a year class, the sampling error and thus the probability of a false-positive result clearly increased. Still, overall estimated genetic
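
    The core of the proposed validation test, a weighted linear regression of within-year variance estimates on birth year with an empirical confidence interval for the trend, can be sketched as follows. The toy data, the weights (inverse prediction-error variance as implemented here) and the bootstrap interval are illustrative assumptions, not the exact implementation described in the paper.

    import numpy as np

    def weighted_trend(years, var_est, pev, n_boot=2000, seed=0):
        """Weighted LS slope of variance estimates on year, with a 95% bootstrap interval."""
        years, var_est = np.asarray(years, float), np.asarray(var_est, float)
        w = 1.0 / np.asarray(pev, float)          # weight = inverse prediction-error variance
        X = np.column_stack([np.ones_like(years), years - years.mean()])

        def wls_slope(x, y, wt):
            W = np.diag(wt)
            beta = np.linalg.solve(x.T @ W @ x, x.T @ W @ y)
            return beta[1]

        slope = wls_slope(X, var_est, w)
        rng = np.random.default_rng(seed)
        idx = np.arange(len(years))
        boot = [wls_slope(X[s], var_est[s], w[s])
                for s in (rng.choice(idx, size=len(idx), replace=True) for _ in range(n_boot))]
        lo, hi = np.percentile(boot, [2.5, 97.5])
        return slope, (lo, hi)   # a trend is flagged if the interval excludes zero

    # Invented toy data: ten birth-year classes with a mild upward trend in the estimates.
    years = np.arange(2000, 2010)
    est = 1.0 + 0.01 * (years - 2000) + np.random.default_rng(1).normal(0, 0.02, size=10)
    pev = np.full(10, 0.0004)
    print(weighted_trend(years, est, pev))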

  15. Unique Stellar System Gives Einstein a Thumbs-Up

    Science.gov (United States)

    2008-07-01

    Taking advantage of a unique cosmic coincidence, astronomers have measured an effect predicted by Albert Einstein's theory of General Relativity in the extremely strong gravity of a pair of superdense neutron stars. The new data indicate that the famed physicist's 93-year-old theory has passed yet another test. [Artist's conception of the double pulsar system PSR J0737-3039A/B. Credit: Daniel Cantin, DarwinDimensions, McGill University.] The scientists used the National Science Foundation's Robert C. Byrd Green Bank Telescope (GBT) to make a four-year study of a double-star system unlike any other known in the Universe. The system is a pair of neutron stars, both of which are seen as pulsars that emit lighthouse-like beams of radio waves. "Of about 1700 known pulsars, this is the only case where two pulsars are in orbit around each other," said Rene Breton, a graduate student at McGill University in Montreal, Canada. In addition, the stars' orbital plane is aligned nearly perfectly with their line of sight to the Earth, so that one passes behind a doughnut-shaped region of ionized gas surrounding the other, eclipsing the signal from the pulsar in back. "Those eclipses are the key to making a measurement that could never be done before," Breton said. Einstein's 1915 theory predicted that, in a close system of two very massive objects, such as neutron stars, one object's gravitational tug, along with an effect of its spinning around its axis, should cause the spin axis of the other to wobble, or precess. Studies of other pulsars in binary systems had indicated that such wobbling occurred, but could not produce precise measurements of the amount of wobbling. "Measuring the amount of wobbling is what tests the details of Einstein's theory and gives a benchmark that any alternative gravitational theories must meet," said Scott Ransom of the National Radio Astronomy Observatory. The eclipses allowed the astronomers to pin

  16. Wide baseline stereo matching based on double topological relationship consistency

    Science.gov (United States)

    Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang

    2009-07-01

    Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme is presented, called double topological relationship consistency (DCTR). The combined double topological configuration includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods, which depend on powerful invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been placed in very different orientations. Epipolar geometry can also be recovered using RANSAC, by far the most widely adopted method. With this method, we can obtain correspondences with high precision in wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
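
    For context, the standard RANSAC-based recovery of epipolar geometry mentioned above can be reproduced in a few lines with OpenCV. The sketch below assumes two grayscale images on disk with hypothetical filenames and uses ORB features for simplicity; it is a generic baseline, not the DCTR method itself.

    import cv2
    import numpy as np

    # Hypothetical input filenames; any wide-baseline grayscale image pair would do.
    img1 = cv2.imread("view_left.png", cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread("view_right.png", cv2.IMREAD_GRAYSCALE)

    # Detect and describe features (ORB chosen here for simplicity).
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # Match descriptors with cross-checking to discard obvious mismatches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Recover the epipolar geometry with RANSAC; `mask` flags the surviving inlier matches.
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 3.0, 0.99)
    print("fundamental matrix:\n", F)
    print("inliers:", int(mask.sum()), "of", len(matches))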

  17. Consistent data-driven computational mechanics

    Science.gov (United States)

    González, D.; Chinesta, F.; Cueto, E.

    2018-05-01

    We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulation from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of techniques opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of industry 4.0.

  18. Discrete differential geometry. Consistency as integrability

    OpenAIRE

    Bobenko, Alexander I.; Suris, Yuri B.

    2005-01-01

    A new field of discrete differential geometry is presently emerging on the border between differential and discrete geometry. Whereas classical differential geometry investigates smooth geometric shapes (such as surfaces), and discrete geometry studies geometric shapes with a finite number of elements (such as polyhedra), discrete differential geometry aims at the development of discrete equivalents of the notions and methods of smooth surface theory. Current interest in this field derives not ...

  19. Self-consistent study of localization

    International Nuclear Information System (INIS)

    Brezini, A.; Olivier, G.

    1981-08-01

    The localization models of Abou-Chacra et al. and Kumar et al. are critically re-examined in the limit of weak disorder. By using an improved method of approximation, we have studied the displacement of the band edge and the mobility edge as a function of disorder and compared the results of Abou-Chacra et al. and Kumar et al. in the light of the present approximation. (author)

  20. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges......, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes....
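
    A stripped-down version of such a self-consistent Schrödinger–Poisson iteration (1D effective mass, hard-wall box, linear mixing) is sketched below. All material parameters, the fixed sheet density and the toy occupation model are illustrative assumptions; the real device calculation also includes the applied bias, the accumulation-layer and well charges and a current integral, which are omitted here.

    import numpy as np

    # Illustrative constants (SI); a GaAs-like effective mass and dielectric constant are assumed.
    hbar, q, m_e, eps0 = 1.054571e-34, 1.602177e-19, 9.109384e-31, 8.854188e-12
    m_eff, eps_r = 0.067 * m_e, 12.9
    L, N = 50e-9, 400                      # device length and grid points
    x = np.linspace(0.0, L, N)
    dx = x[1] - x[0]
    n2d = 1e16                             # fictitious sheet density (m^-2), an assumption

    def solve_schrodinger(potential_J):
        """Lowest eigenpairs of the 1D effective-mass Hamiltonian with hard walls."""
        t = hbar**2 / (2.0 * m_eff * dx**2)
        H = (np.diag(2.0 * t + potential_J)
             - t * np.eye(N, k=1) - t * np.eye(N, k=-1))
        E, psi = np.linalg.eigh(H)
        psi /= np.sqrt(dx)                 # normalize so that sum(|psi|^2) * dx = 1
        return E[:5], psi[:, :5]

    def solve_poisson(rho):
        """1D Poisson: d2V/dx2 = -rho/eps, with V = 0 at both contacts."""
        A = (np.eye(N, k=1) + np.eye(N, k=-1) - 2.0 * np.eye(N)) / dx**2
        A[0, :], A[-1, :] = 0.0, 0.0
        A[0, 0] = A[-1, -1] = 1.0
        b = -rho / (eps_r * eps0)
        b[0] = b[-1] = 0.0
        return np.linalg.solve(A, b)       # electrostatic potential in volts

    V_hartree = np.zeros(N)                # start from a flat potential
    for it in range(200):
        E, psi = solve_schrodinger(-q * V_hartree)
        n = n2d * psi[:, 0] ** 2           # toy occupation: whole sheet density in the ground subband
        V_new = solve_poisson(-q * n)      # electrons carry negative charge
        if np.max(np.abs(V_new - V_hartree)) < 1e-6:
            break
        V_hartree = 0.9 * V_hartree + 0.1 * V_new   # linear mixing for stability
    print(f"stopped after {it + 1} iterations, E0 = {E[0] / q * 1e3:.2f} meV")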

  1. Self-consistent field with pseudowavefunctions

    International Nuclear Information System (INIS)

    Szasz, L.

    1976-01-01

    A computational method is given in which the energy of an atom is computed by using pseudowavefunctions only. The method centers on a model energy expression E_M which is similar to the Hartree-Fock energy expression, but contains only pseudowavefunctions. A theorem is proved according to which the Hartree-Fock orbitals can be transformed by a linear transformation into a set of uniquely defined pseudowavefunctions which have the property that, when substituted into E_M, this quantity will closely approximate the Hartree-Fock energy E_F. The new method is then formulated by identifying the total energy of an atom with the minimum of E_M. Application of the energy minimum principle leads to a set of equations for the pseudowavefunctions which are similar to but simpler than the Hartree-Fock equations. These equations contain pseudopotentials for which explicit expressions are derived. The possibility of replacing these pseudopotentials by simpler model potentials is discussed, and the criteria for the selection of the model potential are outlined

  2. Giving USA 1997: The Annual Report on Philanthropy for the Year 1996.

    Science.gov (United States)

    Kaplan, Ann E., Ed.

    This report presents a comprehensive review of private philanthropy in the United States during 1996. After a preliminary section, the first section presents data on giving, using text, graphs, and charts. Sections cover: overall 1996 contributions; changes in giving by source and use; total giving (1966-1996); inflation-adjusted giving in 5-year…

  3. Substitution or Symbiosis? Assessing the Relationship between Religious and Secular Giving

    Science.gov (United States)

    Hill, Jonathan P.; Vaidyanathan, Brandon

    2011-01-01

    Research on philanthropy has not sufficiently examined whether charitable giving to religious causes impinges on giving to secular causes. Examining three waves of national panel data, we find that the relationship between religious and secular giving is generally not of a zero-sum nature; families that increase their religious giving also…

  4. Exploring the Consistent behavior of Information Services

    Directory of Open Access Journals (Sweden)

    Kapidakis Sarantos

    2016-01-01

    Computer services are normally assumed to work well all the time. This usually holds for crucial services like electronic banking, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of the harvesting, which is harder because of transient conditions, the many services involved and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
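
    A minimal probe of the kind used to classify service behaviour can be written against the OAI-PMH Identify verb. The endpoint list, timeout and error categories below are illustrative assumptions; the study's actual harvesting and classification logic is more elaborate.

    import requests
    import xml.etree.ElementTree as ET

    # Hypothetical endpoints; real runs would iterate over a registry of OAI-PMH base URLs.
    ENDPOINTS = [
        "https://example.org/oai",
        "https://another.example.net/oai/request",
    ]

    def probe(base_url: str, timeout: float = 15.0) -> str:
        """Return a coarse status label for one OAI-PMH service."""
        try:
            r = requests.get(base_url, params={"verb": "Identify"}, timeout=timeout)
        except requests.RequestException:
            return "no-response"          # network error or timeout
        if r.status_code != 200:
            return f"http-{r.status_code}"
        try:
            root = ET.fromstring(r.content)
        except ET.ParseError:
            return "malformed-xml"
        # An OAI-PMH error element (e.g. badVerb) still comes back as HTTP 200.
        if root.find("{http://www.openarchives.org/OAI/2.0/}error") is not None:
            return "oai-error"
        return "ok"

    for url in ENDPOINTS:
        print(url, "->", probe(url))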

  5. [Consistent Declarative Memory with Depressive Symptomatology].

    Science.gov (United States)

    Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez

    2012-12-01

    Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The objective was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students, male and female, between 18 and 40 years old, were evaluated and distributed into two groups, with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cut-off point of 20. There were no significant differences in free or voluntary recall between the groups with and without depressive symptomatology, despite the fact that both groups had assigned a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  6. Self consistent field theory of virus assembly

    Science.gov (United States)

    Li, Siyu; Orland, Henri; Zandi, Roya

    2018-04-01

    The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.
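
    The persistence-length diagnostic mentioned at the end can be estimated directly from a discretized chain configuration: compute the tangent-tangent correlation as a function of contour separation and fit an exponential decay, ⟨t(s)·t(s+Δ)⟩ ≈ exp(−Δ/l_p). The sketch below is a generic estimator with an invented, correlated random-walk test chain; it is not the field-theoretic calculation of the paper.

    import numpy as np

    def persistence_length(positions: np.ndarray, max_lag: int = 50) -> float:
        """Estimate l_p (in units of the bond length) from tangent-tangent correlations."""
        bonds = np.diff(positions, axis=0)
        t = bonds / np.linalg.norm(bonds, axis=1, keepdims=True)      # unit tangents
        lags = np.arange(1, max_lag + 1)
        corr = np.array([np.mean(np.einsum("ij,ij->i", t[:-k], t[k:])) for k in lags])
        good = corr > 0                                               # keep the decaying part
        # Linear fit of log-correlation vs lag: slope = -1/l_p.
        slope, _ = np.polyfit(lags[good], np.log(corr[good]), 1)
        return -1.0 / slope

    # Invented test chain: tangents that decorrelate gradually (a crude worm-like-chain stand-in).
    rng = np.random.default_rng(2)
    t = np.zeros((5000, 3))
    t[0] = [1.0, 0.0, 0.0]
    for i in range(1, 5000):
        v = t[i - 1] + 0.2 * rng.normal(size=3)
        t[i] = v / np.linalg.norm(v)
    positions = np.vstack([np.zeros(3), np.cumsum(t, axis=0)])
    print(f"estimated persistence length ~ {persistence_length(positions):.1f} bond lengths")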

  7. Consistency based correlations for tailings consolidation

    Energy Technology Data Exchange (ETDEWEB)

    Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering

    2010-07-01

    The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. Managing the containment facilities for several decades beyond mine closure poses many unique challenges, resulting from the slow settling rates of the fines and the volumes of standing toxic water. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases, causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency, which captures physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas the liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.
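
    The consistency-based correlations referred to here hinge on simple index properties; for instance, the liquidity index that was found to best explain hydraulic conductivity is computed from the Atterberg limits. A minimal sketch with invented example values follows; the normalized void ratio shown uses one common normalization, which may differ from the paper's exact definition.

    def liquidity_index(water_content: float, liquid_limit: float, plastic_limit: float) -> float:
        """LI = (w - PL) / (LL - PL); values > 1 indicate a slurry wetter than its liquid limit."""
        return (water_content - plastic_limit) / (liquid_limit - plastic_limit)

    def normalized_void_ratio(void_ratio: float, e_min: float, e_max: float) -> float:
        """One common normalization, (e - e_min) / (e_max - e_min); an assumed definition."""
        return (void_ratio - e_min) / (e_max - e_min)

    # Invented values for a fine tailings slurry.
    print(liquidity_index(water_content=65.0, liquid_limit=45.0, plastic_limit=22.0))   # ~1.87
    print(normalized_void_ratio(void_ratio=1.8, e_min=0.6, e_max=2.5))                  # ~0.63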

  8. Consistency between GRUAN sondes, LBLRTM and IASI

    Directory of Open Access Journals (Sweden)

    X. Calbet

    2017-06-01

    Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.

  9. Self-consistent nuclear energy systems

    International Nuclear Information System (INIS)

    Shimizu, A.; Fujiie, Y.

    1995-01-01

    A concept of self-consistent nuclear energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as the necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)

  10. Toward a consistent model for glass dissolution

    International Nuclear Information System (INIS)

    Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.

    1994-01-01

    Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long term performance of waste glasses in a geologic repository setting. Each repository program is developing their own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs

  11. Using Data Communication to Give Ease in Hotel Room Services

    Directory of Open Access Journals (Sweden)

    Rudi Tjiptadi

    2011-06-01

    Gaining extra comfort in a trip is an important factor. Staying in a hotel involves food, laundry, and other activities that can make guests comfortable. The guests’ requests are usually placed with Room Service. Sometimes problems occur in serving the guests’ requests due to human error, such as overdue orders, misunderstandings, etc. Computers are used to prevent those problems by letting guests type requests directly from a computer in the room. The method consists of collecting data from direct interviews at a hotel related to guests’ requests, analyzing the current system, doing a literature study, creating a Room Service system draft, and implementing the new system in the form of a prototype. A Room Service system prototype is created with the ability to order food, drinks, laundry and ironing. This prototype design satisfies guests’ expectations of the hotel’s room services.

  12. What are the impacts of giving up the driver license?

    DEFF Research Database (Denmark)

    Siren, Anu Kristiina; Haustein, Sonja

    Objectives: Driving cessation is a gradual process, where driver’s self-regulation plays an important role. Age-based license renewal procedures may interfere with this process and trigger premature driving cessation. The present study compares drivers aged 69 years at the baseline who either...... should try to prevent unwarranted mobility loss. Licensing policies signaling that in old age continuing to drive is an exception rather than the rule may work against this goal....... renewed their driver’s license (“renewers”) or did not (“non-renewers”) over a two-year period. Methods: Data were collected by interviewing a sample of older Danish people in 2009 (n = 1,792) and in 2012 (n = 863). The standardized interviews covered respondents’ background information, health and well...

  13. Process variables consistency at Atucha I NPP

    International Nuclear Information System (INIS)

    Arostegui, E.; Aparicio, M.; Herzovich, P.; Wenzel, J.; Urrutia, G.

    1996-01-01

    A method to evaluate the performance of different systems has been developed and is still under assessment. To perform this job, a process computer upgraded in 1992 was used. Taking into account that the resolution and stability of the instrumentation are higher than its accuracy, process data were corrected by software. In this way, much time otherwise spent on recalibration, as well as human errors, was avoided. Besides, this method allowed a better record of instrumentation performance and an earlier detection of instrument failures. On the other hand, process modelling, mainly heat and material balances, has also been used to check that sensors, transducers, analog-to-digital converters and computer software are working properly. Some of these process equations have been introduced into the computer codes, so in some cases it is possible to have an ''on line'' analysis of process variables and process instrumentation behaviour. Examples of process analysis are: heat exchangers, i.e. the power calculated using shell-side temperatures is compared with the tube-side values; turbine performance is compared with condenser water temperature; power measured on the secondary side (one-minute average measurements, optimized in order to eliminate process noise) is compared with power obtained from primary-side data; the calibration of temperatures has been made by direct measurement of redundant sensors and has been shown to be the best method; pressure and differential-pressure transducers are cross-checked in service when possible. In the present paper, details of the examples mentioned above and of other ones are given and discussed. (author). 2 refs, 1 fig., 1 tab
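
    The heat-exchanger cross-check described above amounts to computing the duty independently on each side, Q = ṁ·cp·ΔT, and flagging disagreement beyond the expected instrument accuracy. A minimal sketch with invented flow and temperature readings (not actual Atucha I data):

    def duty_mw(mass_flow_kg_s: float, cp_kj_kg_k: float, t_in_c: float, t_out_c: float) -> float:
        """Heat duty Q = m_dot * cp * |dT|, returned in MW."""
        return mass_flow_kg_s * cp_kj_kg_k * abs(t_out_c - t_in_c) / 1000.0

    # Invented process readings for one heat exchanger.
    q_shell = duty_mw(mass_flow_kg_s=480.0, cp_kj_kg_k=4.6, t_in_c=255.0, t_out_c=240.0)
    q_tube = duty_mw(mass_flow_kg_s=410.0, cp_kj_kg_k=4.4, t_in_c=215.0, t_out_c=233.0)

    # Consistency check: flag the instrument loop if the two sides disagree beyond a tolerance.
    relative_gap = abs(q_shell - q_tube) / max(q_shell, q_tube)
    print(f"shell side {q_shell:.1f} MW, tube side {q_tube:.1f} MW, gap {relative_gap:.1%}")
    if relative_gap > 0.05:   # 5% tolerance is an arbitrary illustrative threshold
        print("flag: possible sensor drift or calibration error")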

  14. Process variables consistency at Atucha I NPP

    Energy Technology Data Exchange (ETDEWEB)

    Arostegui, E; Aparicio, M; Herzovich, P; Wenzel, J [Central Nuclear Atucha I, Nucleoelectrica S.A., Lima, Buenos Aires (Argentina); Urrutia, G [Comision Nacional de Energia Atomica, Buenos Aires (Argentina)

    1997-12-31

    A method to evaluate the performance of different systems has been developed and is still under assessment. To perform this job, a process computer upgraded in 1992 was used. Taking into account that the resolution and stability of the instrumentation are higher than its accuracy, process data were corrected by software. In this way, much time otherwise spent on recalibration, as well as human errors, was avoided. Besides, this method allowed a better record of instrumentation performance and an earlier detection of instrument failures. On the other hand, process modelling, mainly heat and material balances, has also been used to check that sensors, transducers, analog-to-digital converters and computer software are working properly. Some of these process equations have been introduced into the computer codes, so in some cases it is possible to have an ``on line`` analysis of process variables and process instrumentation behaviour. Examples of process analysis are: heat exchangers, i.e. the power calculated using shell-side temperatures is compared with the tube-side values; turbine performance is compared with condenser water temperature; power measured on the secondary side (one-minute average measurements, optimized in order to eliminate process noise) is compared with power obtained from primary-side data; the calibration of temperatures has been made by direct measurement of redundant sensors and has been shown to be the best method; pressure and differential-pressure transducers are cross-checked in service when possible. In the present paper, details of the examples mentioned above and of other ones are given and discussed. (author). 2 refs, 1 fig., 1 tab.

  15. Orthology and paralogy constraints: satisfiability and consistency

    OpenAIRE

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

    Background: A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a ...
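
    As a rough illustration of what such a satisfiability test can look like, the sketch below checks a full set of pairwise relations using the cograph characterization (an orthology graph is representable by an event-labelled gene tree only if it contains no induced path on four vertices). This is a simplified brute-force check inspired by the problem statement, not the authors' algorithm for partial constraint sets, and the gene names are made up.

```python
# Hedged sketch: brute-force P4-freeness (cograph) test for a *full* orthology graph.
# Vertices are genes; an edge means "orthologous", a non-edge means "paralogous".
from itertools import combinations, permutations

def has_induced_p4(genes, ortholog_pairs):
    """Return True if the orthology graph contains an induced path a-b-c-d."""
    adj = {g: set() for g in genes}
    for a, b in ortholog_pairs:
        adj[a].add(b)
        adj[b].add(a)
    for quad in combinations(genes, 4):
        for a, b, c, d in permutations(quad):
            path_edges = [(a, b), (b, c), (c, d)]   # edges required on the path
            chords = [(a, c), (a, d), (b, d)]        # pairs that must be non-edges
            if all(v in adj[u] for u, v in path_edges) and \
               all(v not in adj[u] for u, v in chords):
                return True
    return False

# Hypothetical gene family with three orthology constraints forming an induced P4.
genes = ["g1", "g2", "g3", "g4"]
orthologs = [("g1", "g2"), ("g2", "g3"), ("g3", "g4")]
print("representable as a full orthology relation:", not has_induced_p4(genes, orthologs))
```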

  16. View from Europe: stability, consistency or pragmatism

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1988-01-01

    The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion

  17. Self-consistent meson mass spectrum

    International Nuclear Information System (INIS)

    Balazs, L.A.P.

    1982-01-01

    A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula $\alpha(t) = \max(S_1+S_2,\, S_3+S_4) - \tfrac{1}{2} + 2\hat{\alpha}'\bigl[s_a + \tfrac{1}{2}\bigl(t - \sum_i m_i^2\bigr)\bigr]$ for any given hadronic process $1+2 \to 3+4$, where $S_i$ and $m_i$ are the spins and masses of particles $i = 1,2,3,4$, and $\sqrt{s_a}$ is the effective mass of the lowest nonvanishing contribution ($a$) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to $a$, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector $q\bar{q}$ states with $q = u,d,s$ and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with $q = u,d,c$ and $q = u,d,b$. Our only arbitrary parameters are $m_\rho$, $m_{K^*}$, $m_\psi$, and $m_\Upsilon$, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small $m_\pi^2/m_\rho^2$ ratio arises quite naturally in the present scheme
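
    As a purely illustrative application of the generic formula (a numerical exercise with assumed inputs, not a result quoted from the paper), consider elastic $\pi\pi$ scattering: all external spins vanish, $\sum_i m_i^2 = 4m_\pi^2$, and the lowest crossed-channel exchange is taken to be the $\rho$, so $s_a = m_\rho^2$. Demanding that the $\rho$ itself (spin 1) lie on the resulting trajectory then fixes the slope parameter:

```latex
% Illustrative worked example; the inputs (s_a = m_rho^2, rounded masses) are assumptions.
\[
  \alpha_\rho(t) = -\tfrac{1}{2} + 2\hat{\alpha}'\!\left[m_\rho^{2}
      + \tfrac{1}{2}\bigl(t - 4m_\pi^{2}\bigr)\right],
  \qquad
  \alpha_\rho(m_\rho^{2}) = 1
  \;\Longrightarrow\;
  \hat{\alpha}' = \frac{3}{6m_\rho^{2} - 8m_\pi^{2}} \approx 0.87~\text{GeV}^{-2},
\]
% using m_rho ~ 0.775 GeV and m_pi ~ 0.14 GeV, which comes out close to the
% canonical Regge slope of about 0.9 GeV^{-2}.
```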

  18. Speed Consistency in the Smart Tachograph.

    Science.gov (United States)

    Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco

    2018-05-16

    In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging misconduct that leads to fatigue and can increase the likelihood of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which could otherwise be manipulated to evade the monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles in light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, to latencies between GNSS and odometry data, and to simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way, which makes fraud more difficult to implement than in the current version of the DT.
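
    A toy version of such a motion-conflict test is sketched below; the tolerance, minimum sample count and data layout are arbitrary illustrative choices and do not reproduce the thresholds of the ST regulation. The two speed sources are compared sample by sample, gaps in either source are skipped rather than treated as zero speed, and a conflict is reported only when the streams disagree persistently.

```python
# Hedged sketch of a GNSS-vs-odometry speed consistency test.
# Thresholds and window sizes are illustrative, not the ST regulation values.
from statistics import median

def speed_conflict(gnss_kmh, odo_kmh, tol_kmh=10.0, min_samples=20):
    """Return True if the two speed streams are persistently inconsistent.

    `gnss_kmh` and `odo_kmh` are time-aligned samples; `None` marks a data
    gap (e.g. a tunnel for GNSS) and is ignored rather than treated as 0.
    """
    diffs = [abs(g - o) for g, o in zip(gnss_kmh, odo_kmh)
             if g is not None and o is not None]
    if len(diffs) < min_samples:
        return False          # not enough overlapping data to decide
    # A median-based metric is robust to isolated outliers and latency spikes.
    return median(diffs) > tol_kmh

# Example: odometry scaled down by 30% (a simplistic manipulation) is detected.
gnss = [50.0] * 15 + [None] * 5 + [90.0] * 20      # None = tunnel section
odo = [s * 0.7 if s is not None else None for s in gnss]
print("motion conflict:", speed_conflict(gnss, odo))
```

    Using a robust statistic over many samples is one simple way to obtain the resilience to gaps, latencies and scaling manipulations that the abstract reports for the regulation's own metrics.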

  19. GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY

    International Nuclear Information System (INIS)

    Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.

    2013-01-01

    We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
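
    The idea of enforcing consistency of halo properties across time steps can be illustrated with a heavily simplified sketch (field names, units and the tolerance are assumptions; the published algorithm, available at the URL above, additionally repairs trees by inserting missing halos and removing spurious ones). Each halo's position at the next snapshot is predicted ballistically from its current position and velocity, and descendant links whose measured positions deviate too much from the prediction are flagged as suspect.

```python
# Hedged sketch of a snapshot-to-snapshot halo consistency check
# (simplified ballistic prediction; the published algorithm does much more).
from math import dist

def flag_inconsistent_links(halos_now, halos_next, dt, max_offset):
    """Flag descendant links whose measured position is far from the
    position predicted by x_next ~ x_now + v_now * dt.

    `halos_now[i]` = {"id", "desc_id", "pos": (x, y, z), "vel": (vx, vy, vz)};
    `halos_next` is a dict keyed by halo id at the next snapshot.
    Units are assumed consistent (e.g. positions in Mpc/h, velocities in Mpc/h per unit time).
    """
    suspect = []
    for h in halos_now:
        desc = halos_next.get(h["desc_id"])
        if desc is None:
            suspect.append((h["id"], "missing descendant"))
            continue
        predicted = tuple(x + v * dt for x, v in zip(h["pos"], h["vel"]))
        offset = dist(predicted, desc["pos"])
        if offset > max_offset:
            suspect.append((h["id"], f"offset {offset:.3f} exceeds tolerance"))
    return suspect

halos_now = [
    {"id": 1, "desc_id": 11, "pos": (10.0, 10.0, 10.0), "vel": (0.1, 0.0, 0.0)},
    {"id": 2, "desc_id": 12, "pos": (40.0, 5.0, 7.0), "vel": (0.0, 0.2, 0.0)},
]
halos_next = {
    11: {"pos": (10.1, 10.0, 10.0)},   # consistent with the prediction
    12: {"pos": (48.0, 5.0, 7.0)},     # likely a misidentified descendant
}
print(flag_inconsistent_links(halos_now, halos_next, dt=1.0, max_offset=0.5))
```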

  20. Attractive target wave patterns in complex networks consisting of excitable nodes

    International Nuclear Information System (INIS)

    Zhang Li-Sheng; Mi Yuan-Yuan; Liao Xu-Hong; Qian Yu; Hu Gang

    2014-01-01

    This review describes investigations of oscillatory complex networks consisting of excitable nodes, focusing on target wave patterns, or target wave attractors. A method of dominant phase-advanced driving (DPAD) is introduced to reveal the dynamic structures in the networks that support oscillations, such as the oscillation sources and the main excitation propagation paths from the sources to the whole network. The target center nodes and their drivers are regarded as the key nodes that completely determine the corresponding target wave patterns. Therefore, the center (say node A) and its driver (say node B) of a target wave can be used as a label, (A,B), of the given target pattern. The label gives a clue for conveniently retrieving, suppressing, and controlling the target waves. Statistical investigations, both theoretical from the label analysis and numerical from direct simulations of the network dynamics, show that there exist huge numbers of target wave attractors in excitable complex networks when the system size is large, and all these attractors can be labeled and easily controlled based on the information given by the labels. The possible applications of these physical ideas and mathematical methods concerning the multiplicity and labelability of attractors to memory problems in neural networks are briefly discussed. (topical review - statistical physics and complex systems)
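
    A minimal sketch of the phase-advanced-driver idea is given below; the network, the phases and the driver-selection rule are simplified assumptions for illustration, not the full DPAD analysis of the review. Each node is assigned, among the neighbours whose phase is ahead of its own, the most advanced one as its driver, and a target-wave pattern centred at node A and driven by node B is then labelled by the pair (A, B).

```python
# Hedged sketch: assign each excitable node a phase-advanced driver and label a
# target wave by its (center, driver) pair.  Phases and topology are illustrative.
import math

def phase_lead(phi_from, phi_to):
    """Phase of `phi_from` relative to `phi_to`, wrapped to (-pi, pi]."""
    d = (phi_from - phi_to) % (2 * math.pi)
    return d - 2 * math.pi if d > math.pi else d

def dominant_drivers(adjacency, phases):
    """For each node, pick the neighbour with the largest positive phase lead."""
    drivers = {}
    for node, neighbours in adjacency.items():
        leads = {n: phase_lead(phases[n], phases[node]) for n in neighbours}
        advanced = {n: lead for n, lead in leads.items() if lead > 0}
        drivers[node] = max(advanced, key=advanced.get) if advanced else None
    return drivers

# Small illustrative network: node "A" is the wave centre, driven by "B".
adjacency = {"A": ["B", "C"], "B": ["A", "C", "D"], "C": ["A", "B"], "D": ["B"]}
phases = {"A": 1.2, "B": 1.5, "C": 0.9, "D": 0.4}   # radians; B leads A

drivers = dominant_drivers(adjacency, phases)
center = "A"
print("drivers:", drivers)
print("target-wave label:", (center, drivers[center]))   # expected ('A', 'B')
```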