WorldWideScience

Sample records for methods study consisted

  1. Using network screening methods to determine locations with specific safety issues: A design consistency case study.

    Science.gov (United States)

    Butsick, Andrew J; Wood, Jonathan S; Jovanis, Paul P

    2017-09-01

The Highway Safety Manual provides multiple methods that can be used to identify sites with promise (SWiPs) for safety improvement. However, most of these methods cannot be used to identify sites with specific problems. Furthermore, given that infrastructure funding is often earmarked for specific problems/programs, a method for identifying SWiPs related to those programs would be very useful. This research establishes a method for identifying SWiPs with specific issues, accomplished using two safety performance functions (SPFs). The method is applied to identifying SWiPs with geometric design consistency issues. Mixed-effects negative binomial regression was used to develop two SPFs using 5 years of crash data and over 8754 km of two-lane rural roadway. The first SPF contained typical roadway elements, while the second contained additional geometric design consistency parameters. After empirical Bayes adjustments, SWiPs were identified. The disparity between SWiPs identified by the two SPFs was evident: 40 unique sites were identified by each model out of the top 220 segments. By comparing sites across the two models, candidate road segments can be identified where a lack of design consistency may be contributing to an increase in expected crashes. Practitioners can use this method to more effectively identify roadway segments suffering from reduced safety performance due to geometric design inconsistency, with detailed engineering studies of identified sites required to confirm the initial assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
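As an illustration of the two-SPF comparison the abstract describes, here is a minimal sketch in Python. All numbers (segment data, the overdispersion parameter) are invented for illustration; the EB weighting shown is the standard Highway Safety Manual form, not the paper's exact implementation:

```python
# Sketch of the two-SPF screening idea with invented numbers: rank segments by
# empirical-Bayes (EB) excess crashes under each SPF and compare the rankings.

def eb_expected_crashes(predicted_per_year, observed, overdispersion_k, years=5):
    """Standard EB weighting of an SPF prediction against observed counts."""
    predicted_total = predicted_per_year * years
    w = 1.0 / (1.0 + overdispersion_k * predicted_total)
    return w * predicted_total + (1.0 - w) * observed

# Hypothetical segments: (id, SPF1 crashes/yr, SPF2 crashes/yr, observed crashes over 5 yr)
segments = [("A", 1.2, 0.8, 12), ("B", 2.0, 2.1, 9), ("C", 0.5, 0.9, 7)]

def rank_by_excess(segments, use_second_spf):
    ranked = []
    for seg_id, p1, p2, observed in segments:
        pred = p2 if use_second_spf else p1
        eb = eb_expected_crashes(pred, observed, overdispersion_k=0.4)
        ranked.append((eb - pred * 5, seg_id))  # EB excess over the SPF prediction
    return [seg_id for _, seg_id in sorted(ranked, reverse=True)]

top_base = rank_by_excess(segments, use_second_spf=False)
top_consistency = rank_by_excess(segments, use_second_spf=True)
# Segments ranked high by the base SPF but not by the consistency-augmented SPF
# would be candidates for design-consistency investigation.
```

Segments flagged by one ranking but not the other are the candidates the abstract refers to.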

  2. Self-consistent study of nuclei far from stability with the energy density method

    CERN Document Server

    Tondeur, F

    1981-01-01

The self-consistent energy density method has been shown to give good results with a small number of parameters for the calculation of nuclear masses, radii, deformations, neutron skins, and shell and subshell effects. It is here used to study the properties of nuclei far from stability, such as densities, shell structure, even-odd mass differences, single-particle potentials and nuclear deformations. A few possible consequences of the results for astrophysical problems are briefly considered. The predictions of the model in the superheavy region are summarised. (34 refs).

  3. Measuring consistency in translation memories: a mixed-methods case study

    OpenAIRE

    Moorkens, Joss

    2012-01-01

    Introduced in the early 1990s, translation memory (TM) tools have since become widely used as an aid to human translation based on commonly‐held assumptions that they save time, reduce cost, and maximise consistency. The purpose of this research is twofold: it aims to develop a method for measuring consistency in TMs; and it aims to use this method to interrogate selected TMs from the localisation industry in order to find out whether the use of TM tools does, in fact, promote consistency in ...
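One simple operationalisation of TM consistency, consonant with (but not taken from) this study, is to check whether identical source segments are stored with divergent target translations. A toy sketch, with invented segment pairs:

```python
# Toy measure of translation-memory (TM) inconsistency: group entries by
# identical source text and report sources with more than one stored target.
from collections import defaultdict

def tm_inconsistencies(tm_entries):
    targets_by_source = defaultdict(set)
    for source, target in tm_entries:
        targets_by_source[source].add(target)
    # Only sources whose targets diverge count as inconsistent
    return {src: sorted(tgts) for src, tgts in targets_by_source.items() if len(tgts) > 1}

tm = [
    ("Click Save.", "Cliquez sur Enregistrer."),
    ("Click Save.", "Cliquez sur Sauvegarder."),  # same source, different target
    ("Open the file.", "Ouvrez le fichier."),
]
inconsistent = tm_inconsistencies(tm)
```

Real TM research would also consider near-identical sources and fuzzy matches, which this sketch ignores.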

  4. A method for consistent precision radiation therapy

    International Nuclear Information System (INIS)

    Leong, J.

    1985-01-01

Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A treatment-to-treatment fluctuation, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to the specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)
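The precision criterion amounts to a sample standard deviation of positioning displacements below 0.10 cm. A sketch with hypothetical measurements (the displacement values are invented, not from the paper):

```python
# Toy check of the reported positioning precision: sample standard deviation of
# hypothetical displacements (cm) of one anatomical point across 11 treatments.
import statistics

displacements_cm = [0.05, -0.03, 0.08, 0.00, -0.06, 0.04, -0.02, 0.07, -0.05, 0.03, -0.01]
sd = statistics.stdev(displacements_cm)   # sample S.D. across treatments
within_tolerance = sd < 0.10              # the < 0.10 cm (S.D.) criterion
```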

  5. Study of nuclear reactions with the Skyrme interaction; static properties by the self-consistent method; dynamic properties by the generator-coordinate method

    International Nuclear Information System (INIS)

    Flocard, Hubert.

    1975-01-01

Using the same effective interaction, depending on only 6 parameters, a large number of nuclear properties are calculated and the results compared with experiment. Total binding energies of all nuclei of the nuclear chart are reproduced within 5 MeV. It is shown that the remaining discrepancy is coherent with the increase of total binding energy that can be expected from the further inclusion of collective motion correlations. The monopole, quadrupole and hexadecapole parts of the charge densities are also reproduced with good accuracy. The deformation energy curves of many nuclei ranging from carbon to superheavy elements are calculated, and the different features of these curves are discussed. It should be noted that the fission barriers of actinide nuclei have been obtained, and the results exhibit the well-known two-bump shape. In addition, the fusion energy curve of two ¹⁶O nuclei merging into one ³²S nucleus has been computed. Results concerning monopole, dipole and quadrupole giant resonances of light nuclei, obtained within the framework of the generator coordinate method, are also presented. The calculated positions of these resonances agree well with presently available data.

  6. Consistent forcing scheme in the cascaded lattice Boltzmann method

    Science.gov (United States)

    Fei, Linlin; Luo, Kai Hong

    2017-11-01

    In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.

  8. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Directory of Open Access Journals (Sweden)

    Nederhof Esther

    2012-07-01

Conclusions: First, extensive recruitment effort at the first assessment wave of a prospective population-based cohort study has long-lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  9. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study

    Science.gov (United States)

    2012-01-01

    recruitment effort at the first assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods. PMID:22747967

  10. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. The TRAILS study.

    Science.gov (United States)

    Nederhof, Esther; Jörg, Frederike; Raven, Dennis; Veenstra, René; Verhulst, Frank C; Ormel, Johan; Oldehinkel, Albertine J

    2012-07-02

    assessment wave of a prospective population based cohort study has long lasting positive effects. Second, characteristics of hard-to-recruit responders are largely consistent across age groups and survey methods.

  11. Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.

    Science.gov (United States)

    Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth

    2017-02-01

    Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
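The uncorrected distance the paper starts from is easy to sketch: count k-mer frequencies per sequence and take the squared Euclidean distance between the frequency vectors (the paper's contribution, the model-based correction making this consistent, is not reproduced here):

```python
# Alignment-free distance between two DNA sequences via k-mer frequency vectors.
from collections import Counter
from itertools import product

def kmer_vector(seq, k, alphabet="ACGT"):
    """Normalized frequency of every possible k-mer, in lexicographic order."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return [counts[''.join(p)] / total for p in product(alphabet, repeat=k)]

def squared_euclidean(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

s1 = "ACGTACGTAC"   # toy sequences, not real orthologs
s2 = "ACGTTCGTAC"
d = squared_euclidean(kmer_vector(s1, 2), kmer_vector(s2, 2))
```

Distances like `d`, computed for all sequence pairs, are what would then feed a distance-based tree method such as neighbor joining.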

  12. Self-consistent study of localization

    International Nuclear Information System (INIS)

    Brezini, A.; Olivier, G.

    1981-08-01

The localization models of Abou-Chacra et al. and Kumar et al. are critically re-examined in the limit of weak disorder. Using an improved method of approximation, we have studied the displacement of the band edge and the mobility edge as a function of disorder, and compared the results of Abou-Chacra et al. and Kumar et al. in the light of the present approximation. (author)

  13. Quasiparticle self-consistent GW method: a short summary

    International Nuclear Information System (INIS)

    Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios

    2007-01-01

We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation (GWA). The method is formulated based on the idea of a self-consistent perturbation: the non-interacting Green function G₀, which is the starting point for the GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by the GWA. After self-consistency is attained, G₀, W (the screened Coulomb interaction) and G are mutually consistent. This G₀ can be interpreted as the optimum non-interacting propagator for the quasiparticles. We summarize some theoretical discussions to justify QSGW and survey results obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic and can be explained by the neglect of excitonic effects.

  14. Linear augmented plane wave method for self-consistent calculations

    International Nuclear Information System (INIS)

    Takeda, T.; Kuebler, J.

    1979-01-01

    O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)

  15. A consistent and efficient graphical analysis method to improve the quantification of reversible tracer binding in radioligand receptor dynamic PET studies

    OpenAIRE

    Zhou, Yun; Ye, Weiguo; Brašić, James R.; Crabb, Andrew H.; Hilton, John; Wong, Dean F.

    2008-01-01

    The widely used Logan plot in radioligand receptor dynamic PET studies produces marked noise-induced negative biases in the estimates of total distribution volume (DVT) and binding potential (BP). To avoid the inconsistencies in the estimates from the Logan plot, a new graphical analysis method was proposed and characterized in this study. The new plot with plasma input and with reference tissue input was first derived to estimate DVT and BP. A condition was provided to ensure that the estima...

  16. An algebraic method for constructing stable and consistent autoregressive filters

    International Nuclear Information System (INIS)

    Harlim, John; Hong, Hoon; Robbins, Jacob L.

    2015-01-01

In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods; it takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
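The stability half of the construction is the classical root condition, which is simple to sketch. The coefficients below are illustrative, not from the paper, and the paper's Adams–Bashforth consistency constraints (additional algebraic conditions on the coefficients) are not reproduced:

```python
# An AR(2) model x_n = a1*x_{n-1} + a2*x_{n-2} + noise is stable iff both roots
# of its characteristic polynomial z^2 - a1*z - a2 lie strictly inside the unit circle.
import cmath

def is_stable_ar2(a1, a2):
    disc = cmath.sqrt(a1 * a1 + 4.0 * a2)  # complex sqrt handles both root types
    r1 = (a1 + disc) / 2.0
    r2 = (a1 - disc) / 2.0
    return abs(r1) < 1.0 and abs(r2) < 1.0

stable = is_stable_ar2(0.5, 0.3)    # illustrative coefficient pairs
unstable = is_stable_ar2(1.2, 0.3)
```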

  17. Bootstrap embedding: An internally consistent fragment-based method

    Energy Technology Data Exchange (ETDEWEB)

    Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy [Department of Chemistry, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States)

    2016-08-21

    Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments “embedded” in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed “Bootstrap Embedding,” a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.

  18. Benefits of extensive recruitment effort persist during follow-ups and are consistent across age group and survey method. the TRAILS study

    NARCIS (Netherlands)

    E. Nederhof (Esther); F. Jörg (Frederike); D. Raven (Dennis); R. Veenstra (René); F.C. Verhulst (Frank); J. Ormel (Johan Hans); A.J. Oldehinkel (Albertine)

    2012-01-01

Background: Extensive recruitment effort at baseline increases representativeness of study populations by decreasing non-response and associated bias. First, it is not known to what extent increased attrition occurs during subsequent measurement waves among subjects who were

  19. Self-consistent studies of magnetic thin film Ni (001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations

  20. Fully consistent CFD methods for incompressible flow computations

    DEFF Research Database (Denmark)

    Kolmogorov, Dmitry; Shen, Wen Zhong; Sørensen, Niels N.

    2014-01-01

Nowadays collocated-grid-based CFD methods are among the most efficient tools for computations of the flows past wind turbines. To ensure the robustness of the methods they require special attention to the well-known problem of pressure-velocity coupling. Many commercial codes to ensure the pressure

  1. The study of consistent properties of gelatinous shampoo with minoxidil

    Directory of Open Access Journals (Sweden)

    I. V. Gnitko

    2016-04-01

The aim of the work is the study of the consistent properties of a gelatinous shampoo with 1% minoxidil for the complex therapy and prevention of alopecia. This shampoo with minoxidil was selected according to complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistent properties of the gelatinous 1% minoxidil shampoo and its «mechanical stability» (1.70) describe the formulation as an exceptionally thixotropic composition with the possibility of restoration after mechanical loads; this also allows one to predict stability of the consistent properties during long storage. Conclusion. Factors of dynamic flowing for the foam detergent gel with minoxidil (Kd1 = 38.9%; Kd2 = 78.06%) quantitatively confirm a sufficient degree of distribution at the moment of spreading the composition on the skin surface of the hairy part of the head or during technological operations of manufacturing. The insignificant difference in «mechanical stability» between the gelatinous 1% minoxidil shampoo and its base indicates the absence of interactions between the active substance and the base.

  2. A Preliminary Study toward Consistent Soil Moisture from AMSR2

    NARCIS (Netherlands)

    Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.

    2015-01-01

A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global

  3. On Consistency Test Method of Expert Opinion in Ecological Security Assessment.

    Science.gov (United States)

    Gong, Zaiwu; Wang, Lihong

    2017-09-04

Ecological safety assessment is of great value for proactive design and early warning in human security management. In the comprehensive evaluation of regional ecological security with the participation of experts, an expert's individual judgment level and ability, and the consistency of the experts' overall opinion, have a very important influence on the evaluation result. This paper studies consistency measures and consensus measures based on the multiplicative and additive consistency properties of the fuzzy preference relation (FPR). We first propose optimization methods to obtain the optimal multiplicatively consistent and additively consistent FPRs of individual and group judgments, respectively. Then, we put forward a consistency measure computed as the distance between the original individual judgment and the optimal individual estimation, along with a consensus measure computed as the distance between the original collective judgment and the optimal collective estimation. Finally, we present a case study on the ecological security of five cities. Results show that the optimal FPRs are helpful in measuring the consistency degree of individual judgment and the consensus degree of collective judgment.
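The additive-consistency side of this idea can be sketched compactly. The weight formula below is a standard row-sum estimate for additively consistent FPRs, not necessarily the optimization the paper uses, and the matrices are invented for illustration:

```python
# Consistency of a fuzzy preference relation (FPR): compare R against the
# additively consistent matrix rebuilt from its estimated priority weights.

def priority_weights(R):
    """Row-sum weights w_i = (2*sum_j r_ij - n + 1)/n, so that sum(w) == 1."""
    n = len(R)
    return [(2 * sum(row) - n + 1) / n for row in R]

def consistent_estimate(R):
    """Additively consistent FPR implied by the weights: c_ij = 0.5*(w_i - w_j) + 0.5."""
    w = priority_weights(R)
    n = len(R)
    return [[0.5 * (w[i] - w[j]) + 0.5 for j in range(n)] for i in range(n)]

def consistency_index(R):
    """Mean absolute deviation from the consistent estimate; 0 = perfectly consistent."""
    C = consistent_estimate(R)
    n = len(R)
    return sum(abs(R[i][j] - C[i][j]) for i in range(n) for j in range(n)) / (n * n)

R_consistent = [[0.5, 0.7, 0.8], [0.3, 0.5, 0.6], [0.2, 0.4, 0.5]]  # toy judgments
R_perturbed  = [[0.5, 0.9, 0.8], [0.1, 0.5, 0.6], [0.2, 0.4, 0.5]]
```

A consensus measure follows the same pattern, with the collective judgment in place of the individual one.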

  4. Unilateral Measures addressing Non-Trade Concerns. A Study on WTO Consistency, Relevance of other International Agreements, Economic Effectiveness and Impact on Developing Countries of Measures concerning Non-Product-Related Processes and Production Methods

    International Nuclear Information System (INIS)

    Van den Bossche, P.; Schrijver, N.; Faber, G.

    2007-01-01

    Over the last two years, the debate in the Netherlands on trade measures addressing non-trade concerns has focused on two important and politically sensitive issues, namely: (1) the sustainability of the large-scale production of biomass as an alternative source of energy; and (2) the production of livestock products in a manner that is consistent with animal welfare requirements. In February 2007 a report was issued on the 'Toetsingskader voor Duurzame Biomassa', the so-called Cramer Report. This report discusses the risks associated with large-scale biomass production and establishes a list of criteria for the sustainable production of biomass. These criteria reflect a broad range of non-trade concerns, including environmental protection, global warming, food security, biodiversity, economic prosperity and social welfare. The report recognizes that the implementation of the criteria (including the establishment of a certification system) will require careful consideration of the obligations of the Netherlands under EU and WTO law. Governments called upon to address non-trade concerns may do so by using different types of measures. Prominent among these are measures concerning processes and production methods of products. In the present study, these issues are examined primarily with regard to existing, proposed or still purely hypothetical measures for implementing the Cramer criteria for the sustainable production of biomass. Several other, non-energy-related issues are discussed in this report

  5. An eigenvalue approach to quantum plasmonics based on a self-consistent hydrodynamics method.

    Science.gov (United States)

    Ding, Kun; Chan, C T

    2018-02-28

    Plasmonics has attracted much attention not only because it has useful properties such as strong field enhancement, but also because it reveals the quantum nature of matter. To handle quantum plasmonics effects, ab initio packages or empirical Feibelman d-parameters have been used to explore the quantum correction of plasmonic resonances. However, most of these methods are formulated within the quasi-static framework. The self-consistent hydrodynamics model offers a reliable approach to study quantum plasmonics because it can incorporate the quantum effect of the electron gas into classical electrodynamics in a consistent manner. Instead of the standard scattering method, we formulate the self-consistent hydrodynamics method as an eigenvalue problem to study quantum plasmonics with electrons and photons treated on the same footing. We find that the eigenvalue approach must involve a global operator, which originates from the energy functional of the electron gas. This manifests the intrinsic nonlocality of the response of quantum plasmonic resonances. Our model gives the analytical forms of quantum corrections to plasmonic modes, incorporating quantum electron spill-out effects and electrodynamical retardation. We apply our method to study the quantum surface plasmon polariton for a single flat interface.

  6. Homogenization of Periodic Masonry Using Self-Consistent Scheme and Finite Element Method

    Science.gov (United States)

    Kumar, Nitin; Lambadi, Harish; Pandey, Manoj; Rajagopal, Amirtham

    2016-01-01

    Masonry is a heterogeneous anisotropic continuum, made up of the brick and mortar arranged in a periodic manner. Obtaining the effective elastic stiffness of the masonry structures has been a challenging task. In this study, the homogenization theory for periodic media is implemented in a very generic manner to derive the anisotropic global behavior of the masonry, through rigorous application of the homogenization theory in one step and through a full three-dimensional behavior. We have considered the periodic Eshelby self-consistent method and the finite element method. Two representative unit cells that represent the microstructure of the masonry wall exactly are considered for calibration and numerical application of the theory.

  7. Self-consistent field variational cellular method as applied to the band structure calculation of sodium

    International Nuclear Information System (INIS)

    Lino, A.T.; Takahashi, E.K.; Leite, J.R.; Ferraz, A.C.

    1988-01-01

    The band structure of metallic sodium is calculated, using for the first time the self-consistent field variational cellular method. In order to implement the self-consistency in the variational cellular theory, the crystal electronic charge density was calculated within the muffin-tin approximation. The comparison between our results and those derived from other calculations leads to the conclusion that the proposed self-consistent version of the variational cellular method is fast and accurate. (author) [pt

  8. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    Energy Technology Data Exchange (ETDEWEB)

    Clabo, D.A. Jr.

    1987-04-01

Inclusion of the anharmonicity of normal-mode vibrations (i.e., the third, fourth, and higher derivatives of a molecular Born-Oppenheimer potential energy surface) is necessary in order to theoretically reproduce the experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent through the inclusion of electron correlation within a large basis set for small molecules, molecular fundamental vibrational frequencies are in general more often available from high-resolution vibration-rotation spectra. Recently developed analytic third derivative methods for self-consistent-field (SCF) wavefunctions have made it possible to examine the anharmonic force fields of small molecules with previously unavailable accuracy and computational efficiency.

  9. Systematic studies of molecular vibrational anharmonicity and vibration-rotation interaction by self-consistent-field higher derivative methods: Applications to asymmetric and symmetric top and linear polyatomic molecules

    International Nuclear Information System (INIS)

    Clabo, D.A. Jr.

    1987-04-01

Inclusion of the anharmonicity of normal-mode vibrations [i.e., the third, fourth, and higher derivatives of a molecular Born-Oppenheimer potential energy surface] is necessary in order to theoretically reproduce the experimental fundamental vibrational frequencies of a molecule. Although ab initio determinations of harmonic vibrational frequencies may give errors of only a few percent through the inclusion of electron correlation within a large basis set for small molecules, molecular fundamental vibrational frequencies are in general more often available from high-resolution vibration-rotation spectra. Recently developed analytic third derivative methods for self-consistent-field (SCF) wavefunctions have made it possible to examine the anharmonic force fields of small molecules with previously unavailable accuracy and computational efficiency.

  10. Modifications of imaging spectroscopy methods for increases spatial and temporal consistency: A case study of change in leafy spurge distribution between 1999 and 2001 in Theodore Roosevelt National Park, North Dakota

    Science.gov (United States)

    Dudek, Kathleen Burke

The noxious weed leafy spurge (Euphorbia esula L.) has spread throughout the northern Great Plains of North America since it was introduced in the early 1800s, and it is currently a significant management concern. Accurate, rapid location and repeatable measurements are critical for successful temporal monitoring of infestations. Imaging spectroscopy is well suited for identification of spurge; however, the development and dissemination of standardized hyperspectral mapping procedures that produce consistent multi-temporal maps has been absent. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data, collected in 1999 and 2001 over Theodore Roosevelt National Park, North Dakota, were used to locate leafy spurge. Published image-processing methods were tested to determine the most successful for consistent maps. Best results were obtained using: (1) NDVI masking; (2) cross-track illumination correction; (3) image-derived spectral libraries; and (4) the mixture-tuned matched filtering algorithm. Application of the algorithm was modified to standardize processing and eliminate threshold decisions; the image-derived library was refined to eliminate additional variability. Primary (spurge dominant), secondary (spurge non-dominant), abundance, and area-wide vegetation maps were produced. Map accuracies were analyzed with point, polygon, and grid reference sets, using confusion matrices and regression between field-measured and image-derived abundances. Accuracies were recalculated after applying a majority filter, and buffers ranging from 1-5 pixels wide around classified pixels, to accommodate poor reference-image alignment. Overall accuracy varied from 39% to 82%; however, regression analyses yielded r² = 0.725, indicating a strong relationship between field and image-derived densities. Accuracy was sensitive to: (1) registration offsets between field and image locations; (2) modification of analytical methods; and (3) reference data quality. Sensor viewing angle
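Step (1) of the processing chain, NDVI masking, is straightforward to sketch. The reflectance values and the threshold below are hypothetical, chosen only to show the mechanics:

```python
# NDVI masking: keep only pixels whose Normalized Difference Vegetation Index
# NDVI = (NIR - red) / (NIR + red) exceeds a vegetation threshold.

def ndvi(nir, red):
    return (nir - red) / (nir + red) if (nir + red) != 0 else 0.0

# Hypothetical (NIR, red) reflectance pairs for three pixels
pixels = [(0.45, 0.05), (0.30, 0.25), (0.10, 0.09)]
threshold = 0.3  # illustrative cutoff; real studies tune this per scene
vegetation_mask = [ndvi(nir, red) >= threshold for nir, red in pixels]
```

Subsequent steps (illumination correction, spectral-library matching, matched filtering) would then operate only on the masked-in pixels.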

  11. A Benchmark Estimate for the Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2001-01-01

    There are alternative methods to estimate a capital stock for a benchmark year. These methods, however, do not allow for an independent check, which could establish whether the estimated benchmark level is too high or too low. I propose here an optimal consistency method (OCM), which may allow estimating a capital stock level for a benchmark year and/or checking the consistency of alternative estimates of a benchmark capital stock.
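
The kind of benchmark consistency the record discusses rests on the perpetual-inventory recursion linking a benchmark stock to subsequent investment. A minimal sketch (the OCM itself is not reproduced here; the depreciation rate and figures are illustrative):

```python
def capital_series(k_benchmark, investments, delta=0.05):
    """Roll a benchmark-year capital stock forward with the
    perpetual-inventory recursion K_t = (1 - delta) * K_{t-1} + I_t.

    `delta` (depreciation rate) is illustrative, not from the paper.
    """
    k = [k_benchmark]
    for inv in investments:
        k.append((1 - delta) * k[-1] + inv)
    return k

series = capital_series(100.0, [10, 10, 10])
# K1 = 0.95 * 100 + 10 = 105.0; K2 = 0.95 * 105 + 10 = 109.75
```

A consistency check compares the implied series against independent information for later years; a benchmark that is too high or too low propagates through every subsequent K_t.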

  12. Self-Consistent Study of Conjugated Aromatic Molecular Transistors

    International Nuclear Information System (INIS)

    Jing, Wang; Yun-Ye, Liang; Hao, Chen; Peng, Wang; Note, R.; Mizuseki, H.; Kawazoe, Y.

    2010-01-01

    We study the current through conjugated aromatic molecular transistors modulated by a transverse field. The self-consistent calculation is realized with density functional theory through the standard quantum chemistry software Gaussian03 and the non-equilibrium Green's function formalism. The calculated I-V curves controlled by the transverse field present the characteristics of different organic molecular transistors, and the transverse-field effect is improved by substitutions of nitrogen atoms or fluorine atoms. On the other hand, asymmetry of the molecular configuration about the axis connecting the two sulfur atoms favors transverse-field modulation. Suitably designed conjugated aromatic molecular transistors possess different I-V characteristics; some of them are similar to those of metal-oxide-semiconductor field-effect transistors (MOSFETs). Some of the calculated molecular devices may work as elements in graphene electronics. Our results demonstrate the richness and flexibility of molecular transistors, pointing to a promising next generation of devices. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  13. Longitudinal tDCS: Consistency across Working Memory Training Studies

    Directory of Open Access Journals (Sweden)

    Marian E. Berryhill

    2017-04-01

    Full Text Available There is great interest in enhancing and maintaining cognitive function. In recent years, advances in noninvasive brain stimulation devices, such as transcranial direct current stimulation (tDCS), have targeted working memory in particular. Despite controversy surrounding the outcomes of single-session studies, a growing field of working memory training studies incorporates multiple sessions of tDCS. It is useful to take stock of these findings because of the diversity of paradigms employed and outcomes observed across research groups. This will be important in assessing cognitive training programs paired with stimulation techniques and in identifying the more useful and less effective approaches. Here, we treat the tDCS + working memory training field as a case example, but also survey training benefits from other neuromodulatory techniques (e.g., tRNS, tACS). There are challenges associated with the broad parameter space, including individual differences, stimulation intensity, duration, montage, session number, session spacing, training task selection, timing of follow-up testing, and near and far transfer tasks. In summary, although the field of assisted cognitive training is young, some design choices are more favorable than others. By way of heuristic, the current evidence supports including more training/tDCS sessions (5+), applying anodal tDCS targeting prefrontal regions, and including follow-up testing on trained and transfer tasks after a period of no contact. What remains unclear, but important for future translational value, is pinpointing optimal values for the tDCS parameters on a per-cognitive-task basis. Importantly, the emerging literature shows notable consistency in the application of tDCS for working memory across various participant populations compared to single-session experimental designs.

  14. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    Science.gov (United States)

    Kutepov, A. L.; Oudovenko, V. S.; Kotliar, G.

    2017-10-01

    We present a code implementing the linearized quasiparticle self-consistent GW method (LQSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary-time representation in the same way as in the space-time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method. Program files doi: http://dx.doi.org/10.17632/cpchkfty4w.1. Licensing provisions: GNU General Public License. Programming language: Fortran 90. External routines/libraries: BLAS, LAPACK, MPI (optional). Nature of problem: Direct implementation of the GW method scales as N⁴ with the system size, which quickly becomes prohibitively time-consuming even on modern computers. Solution method: We implemented the GW approach using a method that switches between real-space and momentum-space representations. Some operations are faster in real space, whereas others are more computationally efficient in reciprocal space. This makes our approach scale as N³. Restrictions: The limiting factor is usually the memory available in a computer. Using 10 GB/core of memory allows us to study systems with up to 15 atoms per unit cell.
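
The linearization step can be illustrated with a one-level toy model: approximating the self-energy as Σ(ω) ≈ Σ(0) + ω Σ′(0) turns the quasiparticle equation E = ε₀ + Σ(E) into a closed form with renormalization factor Z = 1/(1 − Σ′(0)). This is only a schematic of the linearization idea, not the LQSGW code; the numbers are illustrative:

```python
def quasiparticle_energy(eps0, sigma0, dsigma):
    """Solve the one-level quasiparticle equation E = eps0 + Sigma(E)
    with a linearized self-energy Sigma(w) = sigma0 + dsigma * w.
    The solution is E = Z * (eps0 + sigma0), with Z = 1 / (1 - dsigma)
    the quasiparticle renormalization factor.
    """
    z = 1.0 / (1.0 - dsigma)
    return z * (eps0 + sigma0), z

E, Z = quasiparticle_energy(eps0=1.0, sigma0=-0.2, dsigma=-0.25)
# Z = 1 / 1.25 = 0.8, so E = 0.8 * (1.0 - 0.2) = 0.64
```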

  15. Self-consistent Bayesian analysis of space-time symmetry studies

    International Nuclear Information System (INIS)

    Davis, E.D.

    1996-01-01

    We introduce a Bayesian method for the analysis of epithermal neutron transmission data on space-time symmetries in which unique assignment of the prior is achieved by maximisation of the cross entropy and the imposition of a self-consistency criterion. Unlike the maximum likelihood method used in previous analyses of parity-violation data, our method is free of an ad hoc cutoff parameter. Monte Carlo studies indicate that our self-consistent Bayesian analysis is superior to the maximum likelihood method when applied to the small data samples typical of symmetry studies. (orig.)

  16. Directional selection in temporally replicated studies is remarkably consistent.

    Science.gov (United States)

    Morrissey, Michael B; Hadfield, Jarrod D

    2012-02-01

    Temporal variation in selection is a fundamental determinant of evolutionary outcomes. A recent paper presented a synthetic analysis of temporal variation in selection in natural populations. The authors concluded that there is substantial variation in the strength and direction of selection over time, but acknowledged that sampling error would result in estimates of selection that were more variable than the true values. We reanalyze their dataset using techniques that account for the necessary effect of sampling error to inflate apparent levels of variation and show that directional selection is remarkably constant over time, both in magnitude and direction. Thus we cannot claim that the available data support the existence of substantial temporal heterogeneity in selection. Nonetheless, we conjecture that temporal variation in selection could be important, but that there are good reasons why it may not appear in the available data. These new analyses highlight the importance of applying techniques that estimate parameters of the distribution of selection, rather than parameters of the distribution of estimated selection (which will reflect both sampling error and "real" variation in selection); indeed, despite availability of methods for the former, focus on the latter has been common in synthetic reviews of the aspects of selection in nature, and can lead to serious misinterpretations. © 2011 The Author(s). Evolution © 2011 The Society for the Study of Evolution.
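
The correction the reanalysis applies can be illustrated with a method-of-moments sketch: the variance of estimated selection gradients equals the real temporal variance plus the average sampling variance, so the latter is subtracted. The authors' actual analysis is a mixed model; the numbers below are illustrative:

```python
import statistics

def real_variance(estimates, standard_errors):
    """Observed variance among selection-gradient estimates minus the
    mean sampling variance, clamped at zero. A sketch of the idea,
    not the paper's mixed-model method.
    """
    v_obs = statistics.pvariance(estimates)
    v_sampling = statistics.fmean(se ** 2 for se in standard_errors)
    return max(v_obs - v_sampling, 0.0)

# Same point estimates, different precision: large standard errors can
# account for all of the apparent variation, small ones cannot.
noisy = real_variance([0.10, 0.30, 0.20, 0.00], [0.12] * 4)
precise = real_variance([0.10, 0.30, 0.20, 0.00], [0.05] * 4)
```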

  17. Linearized self-consistent quasiparticle GW method: Application to semiconductors and simple metals

    International Nuclear Information System (INIS)

    Kutepov, A. L.

    2017-01-01

    We present a code implementing the linearized self-consistent quasiparticle GW method (QSGW) in the LAPW basis. Our approach is based on the linearization of the self-energy around zero frequency, which distinguishes it from the existing implementations of the QSGW method. The linearization allows us to use Matsubara frequencies instead of working on the real axis. This results in efficiency gains by switching to the imaginary-time representation in the same way as in the space-time method. The all-electron LAPW basis set eliminates the need for pseudopotentials. We discuss the advantages of our approach, such as its N³ scaling with the system size N, as well as its shortcomings. We apply our approach to study the electronic properties of selected semiconductors, insulators, and simple metals and show that our code produces results very close to the previously published QSGW data. Our implementation is a good platform for further many-body diagrammatic resummations such as the vertex-corrected GW approach and the GW+DMFT method.

  18. Consistent calculation of the polarization electric dipole moment by the shell-correction method

    International Nuclear Information System (INIS)

    Denisov, V.Yu.

    1992-01-01

    Macroscopic calculations of the polarization electric dipole moment which arises in nuclei with an octupole deformation are discussed in detail. This dipole moment is shown to depend on the position of the center of gravity. The conditions of consistency between the radii of the proton and neutron potentials and the radii of the proton and neutron surfaces, respectively, are discussed. These conditions must be incorporated in a shell-correction calculation of this dipole moment. A correct calculation of this moment by the shell-correction method is carried out. Dipole transitions between levels belonging to an octupole vibrational band and the ground state in rare-earth nuclei with a large quadrupole deformation are studied. 19 refs., 3 figs

  19. An Economical Approach to Estimate a Benchmark Capital Stock. An Optimal Consistency Method

    OpenAIRE

    Jose Miguel Albala-Bertrand

    2003-01-01

    There are alternative methods of estimating capital stock for a benchmark year. However, these methods are costly and time-consuming, requiring the gathering of much basic information as well as the use of some convenient assumptions and guesses. In addition, a way is needed of checking whether the estimated benchmark is at the correct level. This paper proposes an optimal consistency method (OCM), which enables a capital stock to be estimated for a benchmark year, and which can also be used ...

  20. A self-consistent nodal method in response matrix formalism for the multigroup diffusion equations

    International Nuclear Information System (INIS)

    Malambu, E.M.; Mund, E.H.

    1996-01-01

    We develop a nodal method for the multigroup diffusion equations, based on the transverse integration procedure (TIP). The efficiency of the method rests upon the convergence properties of a high-order multidimensional nodal expansion and upon numerical implementation aspects. The discrete 1D equations are cast in response matrix formalism. The derivation of the transverse leakage moments is self-consistent, i.e., it does not require additional assumptions. An outstanding feature of the method lies in the linear spatial shape of the local transverse leakage for the first-order scheme. The method is described in the two-dimensional case and validated on some classical benchmark problems. (author)

  1. Consistency of external dosimetry in epidemiologic studies of nuclear workers

    International Nuclear Information System (INIS)

    Fix, J.J.; Gilbert, E.S.

    1992-01-01

    Efforts are underway to pool data from epidemiologic studies of nuclear workers to obtain more precise estimates of radiation risk than would be possible from any single study. The International Agency for Research on Cancer (IARC) is coordinating combined analyses of data from studies in the United States, Canada, and the United Kingdom. In the U.S., the Department of Energy (DOE) has established the Comprehensive Epidemiologic Data Resource (CEDR) to provide investigators an opportunity to analyze data from several DOE laboratories. IARC investigators, in collaboration with those conducting the individual studies, have developed a dosimetry protocol for the international combined analyses. (author)

  2. Consistency of external dosimetry in epidemiologic studies of nuclear workers

    International Nuclear Information System (INIS)

    Fix, J.J.; Gilbert, E.S.

    1992-05-01

    Efforts are underway to pool data from epidemiologic studies of nuclear workers to obtain more precise estimates of radiation risk than would be possible from any single study. The International Agency for Research on Cancer (IARC) is coordinating combined analyses of data from studies in the United States, Canada, and the United Kingdom. In the US, the Department of Energy (DOE) has established the Comprehensive Epidemiologic Data Resource (CEDR) to provide investigators an opportunity to analyze data from several DOE laboratories. IARC investigators, in collaboration with those conducting the individual studies, have developed a dosimetry protocol for the international combined analyses

  3. Method used to test the imaging consistency of binocular camera's left-right optical system

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui

    2016-09-01

    For a binocular camera, consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table, and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from the multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and imaging consistency is evaluated through the standard deviation σ of the grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ spread of the imaging gray difference D(x, y) between the left and right optical systems does not exceed 5%, the design requirements are considered to have been achieved. This method can be used effectively and paves the way for imaging consistency testing of binocular cameras.
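
The acceptance criterion described in this record reduces to a few lines: compute the standard deviation of the left-right gray difference and compare 3σ against 5% of full scale. A minimal sketch with toy images (function name and images are illustrative):

```python
import numpy as np

def consistency_ratio(left, right, full_scale=255.0):
    """3-sigma of the gray-level difference D(x, y) between left and
    right images, as a fraction of full scale. The 5% limit on 3*sigma
    follows the text; the toy images below are illustrative.
    """
    d = left.astype(float) - right.astype(float)
    return 3.0 * d.std() / full_scale

left = np.full((4, 4), 120.0)
right = left + np.tile([1.0, -1.0], (4, 2))  # alternating +/-1 gray levels
ratio = consistency_ratio(left, right)
meets_spec = ratio <= 0.05
```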

  4. A Study on the Consistency of Discretization Equation in Unsteady Heat Transfer Calculations

    Directory of Open Access Journals (Sweden)

    Wenhua Zhang

    2013-01-01

    Full Text Available Previous studies on the consistency of discretization equations mainly focused on the finite difference method, but several consistency problems remain far from solved in actual numerical computation. For instance, a consistency problem arises in the numerical case where the boundary variables are solved explicitly while the variables away from the boundary are solved implicitly. And when the coefficients of the discretization equation in a nonlinear case are functions of the variables, computing the coefficients explicitly and the variables implicitly may also give rise to consistency problems. Thus the present paper mainly investigates the consistency problems involved in the explicit treatment of the second- and third-kind boundary conditions and of a thermal conductivity that is a function of temperature. The numerical results indicate that the consistency problem deserves more attention and should not be neglected in practical computation.

  5. Consistency analysis of Keratograph and traditional methods to evaluate tear film function

    Directory of Open Access Journals (Sweden)

    Pei-Yang Shen

    2015-05-01

    Full Text Available AIM: To investigate the repeatability and accuracy of a latest-generation Keratograph for evaluating tear film stability, and to compare its measurements with those of traditional examination methods. METHODS: Noninvasive tear film break-up time (NI-BUT), including the first tear film break-up time (BUT-f) and the average tear film break-up time (BUT-ave), was measured by Keratograph. Repeatability of the measurements was evaluated by the coefficient of variation (CV) and the intraclass correlation coefficient (ICC). The Wilcoxon signed-rank test was used to compare NI-BUT with fluorescein tear film break-up time (FBUT) and to confirm the correlation between NI-BUT and FBUT and Schirmer I test values. Bland-Altman analysis was used to evaluate consistency. RESULTS: The study recruited 48 subjects (48 eyes; mean age 38.7±15.2 years). The CV and ICC of BUT-f were 12.6% and 0.95, respectively; those of BUT-ave were 9.8% and 0.96. The value of BUT-f was lower than that of FBUT, and the difference was statistically significant (6.16±2.46s vs 7.46±1.92s). CONCLUSION: The Keratograph provides NI-BUT data with better repeatability and reliability, which has great application prospects in the diagnosis and treatment of dry eye and in refractive corneal surgery.
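
The two repeatability indices used here, CV and ICC, are standard and can be sketched for an array of repeated measurements. This is a textbook one-way random-effects ICC, not necessarily the exact variant the study used; the data are illustrative:

```python
import numpy as np

def cv_and_icc(x):
    """Repeatability indices for an (n_subjects, k_repeats) array.

    CV: within-subject SD over the grand mean.
    ICC(1,1): one-way random-effects intraclass correlation from the
    between-subject (MSB) and within-subject (MSW) mean squares.
    """
    n, k = x.shape
    grand = x.mean()
    subject_means = x.mean(axis=1)
    msb = k * ((subject_means - grand) ** 2).sum() / (n - 1)
    msw = ((x - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    icc = (msb - msw) / (msb + (k - 1) * msw)
    cv = np.sqrt(msw) / grand
    return cv, icc

# Three subjects, two repeated break-up-time readings each (seconds)
x = np.array([[4.0, 4.2], [6.0, 5.8], [8.0, 8.2]])
cv, icc = cv_and_icc(x)
```

Small within-subject scatter relative to between-subject spread yields a low CV and an ICC near 1, the pattern reported for NI-BUT.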

  6. Integrating the Toda Lattice with Self-Consistent Source via Inverse Scattering Method

    International Nuclear Information System (INIS)

    Urazboev, Gayrat

    2012-01-01

    In this work, it is shown that solutions of the Toda lattice with a self-consistent source can be found by the inverse scattering method for the discrete Sturm-Liouville operator. For the considered problem, the one-soliton solution is obtained.

  7. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    Reliability assessment across the unit and system levels is the most important part of multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive character of information quantity together with the principle of equivalence of information quantity, an entropy method of data information conversion is presented for a system consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information-quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  8. Consistency analysis of subspace identification methods based on a linear regression approach

    DEFF Research Database (Denmark)

    Knudsen, Torben

    2001-01-01

    In the literature, results can be found which claim consistency for the subspace method under certain quite weak assumptions. Unfortunately, a new result gives a counterexample showing inconsistency under these assumptions and then gives new, stricter sufficient assumptions which, however, do not include important model structures such as Box-Jenkins. Based on a simple least squares approach, this paper shows the possible inconsistency under the weak assumptions and develops only slightly stricter assumptions sufficient for consistency and which include any model structure...

  9. The method and program system CABEI for adjusting consistency between natural element and its isotopes data

    Energy Technology Data Exchange (ETDEWEB)

    Tingjin, Liu; Zhengjun, Sun [Chinese Nuclear Data Center, Beijing, BJ (China)

    1996-06-01

    To meet the requirements of nuclear engineering, especially nuclear fusion reactors, the data in the major evaluated libraries are now given not only for the natural element but also for its isotopes. Inconsistency between element and isotope data is one of the main problems in present evaluated neutron libraries. Formulas for adjusting the data to satisfy simultaneously the two kinds of consistency relationships were derived by means of the least squares method, and the program system CABEI was developed. The program was tested by calculating the Fe data in CENDL-2.1. The results show that the adjusted values satisfy both kinds of consistency relationships.
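
The core constraint is that the abundance-weighted sum of isotope data must reproduce the element value. A minimal constrained least-squares adjustment with a single Lagrange multiplier illustrates the idea (this is a textbook sketch of the constraint type, not the CABEI code, and the numbers are illustrative, not CENDL-2.1 Fe data):

```python
import numpy as np

def adjust_isotopes(x0, var, abundance, element_value):
    """Minimally adjust isotope values x0 (with variances var) so the
    abundance-weighted sum matches the evaluated element value:
    minimize sum((x - x0)^2 / var) subject to a . x = element_value.
    """
    x0, var, a = np.asarray(x0), np.asarray(var), np.asarray(abundance)
    lam = (element_value - a @ x0) / (a ** 2 @ var)  # Lagrange multiplier
    return x0 + var * a * lam                        # shift each isotope

adjusted = adjust_isotopes([2.0, 3.0], [0.04, 0.09], [0.5, 0.5], 2.6)
```

Isotopes with larger variances absorb more of the adjustment, and the weighted sum matches the element value exactly after the shift.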

  10. Direct numerical simulation of interfacial instabilities: A consistent, conservative, all-speed, sharp-interface method

    Science.gov (United States)

    Chang, Chih-Hao; Deng, Xiaolong; Theofanous, Theo G.

    2013-06-01

    We present a conservative and consistent numerical method for solving the Navier-Stokes equations in flow domains that may be separated by any number of material interfaces, at arbitrarily-high density/viscosity ratios and acoustic-impedance mismatches, subjected to strong shock waves and flow speeds that can range from highly supersonic to near-zero Mach numbers. A principal aim is prediction of interfacial instabilities under superposition of multiple potentially-active modes (Rayleigh-Taylor, Kelvin-Helmholtz, Richtmyer-Meshkov) as found, for example, with shock-driven, immersed fluid bodies (locally oblique shocks); accordingly, we emphasize fidelity supported by physics-based validation, including experiments. Consistency is achieved by satisfying the jump discontinuities at the interface within a conservative 2nd-order scheme that is coupled, in a conservative manner, to the bulk-fluid motions. The jump conditions are embedded into a Riemann problem, solved exactly to provide the pressures and velocities along the interface, which is tracked by a level set function to accuracy of O(Δx⁵, Δt⁴). Subgrid representation of the interface is achieved by allowing curvature of its constituent interfacial elements to obtain O(Δx³) accuracy in cut-cell volume, with attendant benefits in calculating cell-geometric features and interface curvature (O(Δx³)). Overall the computation converges at near-theoretical O(Δx²). Spurious currents are down to machine error, and there is no time-step restriction due to surface tension. Our method is built upon a quadtree-like adaptive mesh refinement infrastructure. When necessary, this is supplemented by body-fitted grids to enhance resolution of the gas dynamics, including flow separation, shear layers, slip lines, and critical layers. Comprehensive comparisons with exact solutions for the linearized Rayleigh-Taylor and Kelvin-Helmholtz problems demonstrate excellent performance. Sample simulations of liquid drops subjected to
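
The jump-matching idea behind the interface Riemann problem can be shown in its linearized (acoustic) form: matching the p ± Zu invariants from the left and right states yields a single interface pressure and velocity. The paper solves the exact nonlinear Riemann problem; this linear version only illustrates the coupling, with illustrative states:

```python
def acoustic_interface_state(p1, u1, z1, p2, u2, z2):
    """Linearized Riemann problem at a material interface with acoustic
    impedances z1, z2. The invariants p + Z*u (right-going, from the
    left state) and p - Z*u (left-going, from the right state) are
    matched at the interface.
    """
    u_star = (p1 - p2 + z1 * u1 + z2 * u2) / (z1 + z2)
    p_star = (z2 * p1 + z1 * p2 + z1 * z2 * (u1 - u2)) / (z1 + z2)
    return p_star, u_star

# Light fluid (Z=1) moving at u=1 pushes against a heavy fluid (Z=3) at rest
p_star, u_star = acoustic_interface_state(0.0, 1.0, 1.0, 0.0, 0.0, 3.0)
# p_star = 0.75, u_star = 0.25
```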

  11. Dynamic and quantitative method of analyzing service consistency evolution based on extended hierarchical finite state automata.

    Science.gov (United States)

    Fan, Linjun; Tang, Jun; Ling, Yunxiang; Li, Benxian

    2014-01-01

    This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) is the second, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward a higher level of consistency in SODSA.

  12. Dynamic and Quantitative Method of Analyzing Service Consistency Evolution Based on Extended Hierarchical Finite State Automata

    Directory of Open Access Journals (Sweden)

    Linjun Fan

    2014-01-01

    Full Text Available This paper is concerned with the dynamic evolution analysis and quantitative measurement of the primary factors that cause service inconsistency in service-oriented distributed simulation applications (SODSA). Traditional methods are mostly qualitative and empirical, and they do not consider the dynamic disturbances among factors in a service's evolution behaviors such as producing, publishing, calling, and maintenance. Moreover, SODSA are rapidly evolving in terms of large-scale, reusable, compositional, pervasive, and flexible features, which presents difficulties for traditional analysis methods. To resolve these problems, a novel dynamic evolution model, extended hierarchical service-finite state automata (EHS-FSA), is constructed based on finite state automata (FSA); it formally depicts the overall changing processes of service consistency states. In addition, service consistency evolution algorithms (SCEAs) based on EHS-FSA are developed to quantitatively assess these impact factors. Experimental results show that poor reusability (17.93% on average) is the most influential factor, non-composition of atomic services (13.12%) is the second, and service version confusion (1.2%) is the smallest. Compared with previous qualitative analyses, the SCEAs show good effectiveness and feasibility. This research can guide engineers of service consistency technologies toward a higher level of consistency in SODSA.

  13. Self-consistent collective coordinate method for large amplitude collective motions

    International Nuclear Information System (INIS)

    Sakata, F.; Hashimoto, Y.; Marumori, T.; Une, T.

    1982-01-01

    A recent development of the self-consistent collective coordinate method is described. The self-consistent collective coordinate method was proposed on the basis of the fundamental principle called the invariance principle of the Schroedinger equation. If it is formulated within the framework of time-dependent Hartree-Fock (TDHF) theory, a classical version of the theory is obtained. A quantum version is deduced by formulating it within the framework of the unitary transformation method with auxiliary bosons. In this report, the discussion concentrates on the relation between the classical theory and the quantum theory, and on the applicability of the classical theory. The aim of the classical theory is to extract a maximally decoupled collective subspace out of the huge-dimensional 1p-1h parameter space introduced by TDHF theory. An intimate similarity between the classical theory and the full quantum boson expansion method (BEM) was clarified, with the discussion concentrated on a simple Lipkin model. The relation between the BEM and the unitary transformation method with auxiliary bosons was then discussed. It became clear that the quantum version of the theory is closely related to the BEM, and that the BEM is nothing but a quantum analogue of the present classical theory. The present theory was compared with a full TDHF calculation using a simple model. (Kato, T.)

  14. Simplified DFT methods for consistent structures and energies of large systems

    Science.gov (United States)

    Caldeweyher, Eike; Gerit Brandenburg, Jan

    2018-05-01

    Kohn–Sham density functional theory (DFT) is routinely used for the fast electronic structure computation of large systems and will most likely continue to be the method of choice for the generation of reliable geometries in the foreseeable future. Here, we present a hierarchy of simplified DFT methods designed for consistent structures and non-covalent interactions of large systems, with particular focus on molecular crystals. The covered methods are a minimal-basis-set Hartree–Fock method (HF-3c), a small-basis-set screened-exchange hybrid functional (HSE-3c), and a generalized-gradient-approximation functional evaluated in a medium-sized basis set (B97-3c), all augmented with semi-classical correction potentials. We give an overview of the methods' design, a comprehensive evaluation on established benchmark sets for geometries and lattice energies of molecular crystals, and highlight some realistic applications to large organic crystals with several hundred atoms in the primitive unit cell.

  15. Physically consistent data assimilation method based on feedback control for patient-specific blood flow analysis.

    Science.gov (United States)

    Ii, Satoshi; Adib, Mohd Azrul Hisham Mohd; Watanabe, Yoshiyuki; Wada, Shigeo

    2018-01-01

    This paper presents a novel data assimilation method for patient-specific blood flow analysis based on feedback control theory, called the physically consistent feedback control-based data assimilation (PFC-DA) method. In the PFC-DA method, the signal, which is the residual error term of the velocity when comparing the numerical and reference measurement data, is cast as a source term in a Poisson equation for the scalar potential field that induces flow in a closed system. The pressure values at the inlet and outlet boundaries are recursively calculated from this scalar potential field. Hence, the flow field is physically consistent because it is driven by the calculated inlet and outlet pressures, without any artificial body forces. Compared with existing variational approaches, the PFC-DA method does not guarantee an optimal solution, but it requires only one additional Poisson equation for the scalar potential field, a remarkable improvement for such a small additional computational cost at every iteration. Through numerical examples for 2D and 3D exact flow fields, with both noise-free and noisy reference data, as well as a blood flow analysis on a cerebral aneurysm using actual patient data, the robustness and accuracy of this approach are shown. Moreover, the feasibility of patient-specific practical blood flow analysis is demonstrated. Copyright © 2017 John Wiley & Sons, Ltd.
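
The feedback principle, driving the flow through boundary pressures updated from the velocity residual rather than through an artificial body force, can be caricatured in one dimension. This is only a conceptual analogue (toy flow model, gain, and resistance are all invented for illustration); the actual method solves a Poisson equation for a scalar potential:

```python
def feedback_assimilation(u_ref, resistance=1.0, gain=0.5, steps=50):
    """Conceptual 1D analogue of the PFC-DA idea: repeatedly adjust the
    inlet-outlet pressure difference from the velocity residual until
    the modeled velocity matches the reference measurement.
    """
    dp = 0.0
    for _ in range(steps):
        u = dp / resistance        # toy flow model: u = (p_in - p_out) / R
        dp += gain * (u_ref - u)   # feedback acts on the pressure boundary
    return dp, dp / resistance

dp, u = feedback_assimilation(u_ref=2.0)
```

The iteration converges to the pressure difference that reproduces the measured velocity, so the assimilated flow remains pressure-driven throughout.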

  16. Outcomes Definitions and Statistical Tests in Oncology Studies: A Systematic Review of the Reporting Consistency.

    Science.gov (United States)

    Rivoirard, Romain; Duplay, Vianney; Oriol, Mathieu; Tinquaut, Fabien; Chauvin, Franck; Magne, Nicolas; Bourmaud, Aurelie

    2016-01-01

    Quality of reporting for Randomized Clinical Trials (RCTs) in oncology has been analyzed in several systematic reviews, but, in this setting, there is a paucity of data on outcome definitions and on the consistency of reporting of statistical tests in RCTs and Observational Studies (OBS). The objective of this review was to describe those two reporting aspects for OBS and RCTs in oncology. From a list of 19 medical journals, three were retained for analysis after random selection: British Medical Journal (BMJ), Annals of Oncology (AoO) and British Journal of Cancer (BJC). All original articles published between March 2009 and March 2014 were screened. Only studies whose main outcome was accompanied by a corresponding statistical test were included in the analysis. Studies based on censored data were excluded. The primary outcome was to assess the quality of reporting of the primary outcome measure in RCTs and of the variables of interest in OBS. A logistic regression was performed to identify covariates of studies potentially associated with concordance of tests between the Methods and Results sections. 826 studies were included in the review, of which 698 were OBS. Variables were described in the Methods section for all OBS, and the primary endpoint was clearly detailed in the Methods section for 109 RCTs (85.2%). 295 OBS (42.2%) and 43 RCTs (33.6%) had perfect agreement for the reported statistical test between the Methods and Results sections. In multivariable analysis, the variable "number of included patients in study" was associated with test consistency: the adjusted odds ratio (aOR) for the third group compared to the first group was aOR Grp3 = 0.52 [0.31-0.89] (P value = 0.009). Variables in OBS and primary endpoints in RCTs are reported and described with high frequency. However, consistency of statistical tests between the Methods and Results sections of OBS is not always achieved. Therefore, we encourage authors and peer reviewers to verify the consistency of statistical tests in oncology studies.

  17. A fast inverse consistent deformable image registration method based on symmetric optical flow computation

    International Nuclear Information System (INIS)

    Yang Deshan; Li Hua; Low, Daniel A; Deasy, Joseph O; Naqa, Issam El

    2008-01-01

    Deformable image registration is widely used in various radiation therapy applications, including daily treatment planning adaptation to map planned tissue or dose to changing anatomy. In this work, a simple and efficient inverse-consistent deformable registration method is proposed with the aims of higher registration accuracy and faster convergence speed. Instead of registering image I to a second image J, the two images are symmetrically deformed toward one another in multiple passes, until both deformed images are matched and correct registration is therefore achieved. In each pass, a delta motion field is computed by minimizing a symmetric optical flow system cost function using modified optical flow algorithms. The images are then further deformed with the delta motion field in the positive and negative directions respectively, and then used for the next pass. The magnitude of the delta motion field is forced to be less than 0.4 voxel in every pass in order to guarantee smoothness and invertibility of the two overall motion fields, which accumulate the delta motion fields in the positive and negative directions, respectively. The final motion fields to register the original images I and J, in either direction, are calculated by inverting one overall motion field and combining the inversion result with the other overall motion field. The final motion fields are inversely consistent, and this is ensured by the symmetric way in which the registration is carried out. The proposed method is demonstrated with phantom images, artificially deformed patient images and 4D-CT images. Our results suggest that the proposed method is able to improve the overall accuracy (reducing registration error by 30% or more, compared to the original and inversely inconsistent optical flow algorithms), reduce the inverse consistency error (by 95% or more) and increase the convergence rate (by 100% or more). The overall computation speed may slightly decrease, or increase in most cases.
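The two algorithmically distinctive ingredients, capping the per-pass delta field at 0.4 voxel and accumulating it by composition into two opposite overall motion fields, can be illustrated on a 1-D toy problem. The pointwise flow estimate, images, pass count and regularization below are simplifications and assumptions, not the paper's modified optical flow algorithms.

```python
import numpy as np

# Toy 1-D symmetric registration: both images deform toward each other in
# passes; each per-pass delta motion field is capped at 0.4 voxel.
x = np.arange(200, dtype=float)
I = np.exp(-((x - 90.0) / 16.0) ** 2)    # image to register
J = np.exp(-((x - 110.0) / 16.0) ** 2)   # target image

def warp(img, disp):
    """Backward-warp img by a displacement field given in voxels."""
    return np.interp(x + disp, x, img)

fwd = np.zeros_like(x)   # overall motion field, positive direction
bwd = np.zeros_like(x)   # overall motion field, negative direction

mismatch0 = np.sum((I - J) ** 2)
for _ in range(60):
    Iw, Jw = warp(I, fwd), warp(J, bwd)
    grad = 0.5 * (np.gradient(Iw) + np.gradient(Jw))
    delta = (Jw - Iw) * grad / (grad ** 2 + 1e-2)   # crude pointwise flow estimate
    delta = np.clip(delta, -0.4, 0.4)               # cap at 0.4 voxel per pass
    # Accumulate by composing the delta field with each overall field:
    fwd = delta + warp(fwd, delta)
    bwd = -delta + warp(bwd, -delta)

mismatch = np.sum((warp(I, fwd) - warp(J, bwd)) ** 2)
print(mismatch < mismatch0)   # symmetrically deformed images match better
```

Because the two overall fields are built symmetrically from the same capped increments, inverting one and composing it with the other yields the (approximately) inverse-consistent final mappings described in the abstract.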

  18. RPA method based on the self-consistent cranking model for 168Er and 158Dy

    International Nuclear Information System (INIS)

    Kvasil, J.; Cwiok, S.; Chariev, M.M.; Choriev, B.

    1983-01-01

    The low-lying nuclear states in 168Er and 158Dy are analysed within the random phase approximation (RPA) method based on the self-consistent cranking model (SCCM). The moment of inertia, the value of the chemical potential, and the strength constant k1 have been obtained from the symmetry condition. The pairing strength constants G_tau have been determined from the experimental values of the neutron and proton pairing energies for nonrotating nuclei. Quite good agreement with the experimental energies of states with positive parity was obtained without introducing two-phonon vibrational states.

  19. Analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field method

    Directory of Open Access Journals (Sweden)

    N.Yoshida

    2007-09-01

    An analytical free energy gradient for the molecular Ornstein-Zernike self-consistent-field (MOZ-SCF) method is presented. MOZ-SCF theory is one of the theories for treating solvent effects on the solute electronic structure in solution [Yoshida N. et al., J. Chem. Phys., 2000, 113, 4974]. Molecular geometries of water, formaldehyde, acetonitrile and acetone in water are optimized by the analytical energy gradient formula. The results are compared with those from the polarizable continuum model (PCM), the reference interaction site model (RISM-SCF) and the three-dimensional (3D) RISM-SCF.

  20. Quasiparticle self-consistent GW method for the spectral properties of complex materials.

    Science.gov (United States)

    Bruneval, Fabien; Gatti, Matteo

    2014-01-01

    The GW approximation to the formally exact many-body perturbation theory has been applied successfully to materials for several decades. Since the practical calculations are extremely cumbersome, the GW self-energy is most commonly evaluated using a first-order perturbative approach: this is the so-called G0W0 scheme. However, the G0W0 approximation depends heavily on the mean-field theory that is employed as a basis for the perturbation theory. Recently, a procedure to reach a kind of self-consistency within the GW framework has been proposed. The quasiparticle self-consistent GW (QSGW) approximation retains some positive aspects of a self-consistent approach, but circumvents the intricacies of the complete GW theory, which is inconveniently based on a non-Hermitian and dynamical self-energy. This new scheme allows one to surmount most of the flaws of the usual G0W0 at a moderate calculation cost and at a reasonable implementation burden. In particular, the issues of small band gap semiconductors, of large band gap insulators, and of some transition metal oxides are then cured. The QSGW method broadens the range of materials for which the spectral properties can be predicted with confidence.

  1. A Dynamic Linear Hashing Method for Redundancy Management in Train Ethernet Consist Network

    Directory of Open Access Journals (Sweden)

    Xiaobo Nie

    2016-01-01

    Massive transportation systems like trains are considered critical systems because they use the communication network to control essential subsystems on board. A critical system requires zero recovery time when a failure occurs in the communication network. The newly published IEC62439-3 defines the high-availability seamless redundancy protocol, which fulfills this requirement and ensures no frame loss in the presence of an error. This paper adopts the protocol for the train Ethernet consist network. The challenge is the management of circulating frames, which must cope with real-time processing requirements, fast switching times, high throughput, and deterministic behavior. The main contributions of this paper are an in-depth analysis of the network parameters imposed by applying the protocol to the train control and monitoring system (TCMS), and a redundant circulating-frame discarding method based on dynamic linear hashing, chosen as the fastest method to resolve the issues identified.
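The core of any circulating-frame discard scheme is a hash lookup keyed on the frame's origin and sequence number: the first copy is forwarded, the redundant copy is dropped, and old entries are aged out. The sketch below uses an ordinary Python dict as a stand-in for the paper's dynamic linear hash table; the key layout, capacity and aging policy are illustrative assumptions, not TCMS-profile figures.

```python
from collections import OrderedDict

class DuplicateDiscard:
    """Sketch of seamless-redundancy duplicate-frame rejection.

    A plain OrderedDict stands in for the dynamic linear hash table;
    entries are aged out oldest-first once capacity is exceeded.
    """
    def __init__(self, max_entries=1024):
        self.seen = OrderedDict()          # (src, seq) -> arrival time
        self.max_entries = max_entries

    def accept(self, src, seq, now):
        key = (src, seq)
        if key in self.seen:
            return False                   # redundant circulating copy: drop
        self.seen[key] = now               # first copy: remember and forward
        if len(self.seen) > self.max_entries:
            self.seen.popitem(last=False)  # age out the oldest entry
        return True

filt = DuplicateDiscard()
print(filt.accept("node-A", 1, 0.0))   # True  -> first copy forwarded
print(filt.accept("node-A", 1, 0.1))   # False -> duplicate discarded
print(filt.accept("node-A", 2, 0.2))   # True  -> new sequence number
```

In a real consist-network switch this lookup sits on the fast path, which is why the paper's choice of hashing scheme (constant-time growth of the table) matters for deterministic behavior.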

  2. An exact and consistent adjoint method for high-fidelity discretization of the compressible flow equations

    Science.gov (United States)

    Subramanian, Ramanathan Vishnampet Ganapathi

    , and can be tailored to achieve global conservation up to arbitrary orders of accuracy. We again confirm that the sensitivity gradient for turbulent jet noise computed using our dual-consistent method is only limited by computing precision.

  3. Bosons system with finite repulsive interaction: self-consistent field method

    International Nuclear Information System (INIS)

    Renatino, M.M.B.

    1983-01-01

    Some static properties of a boson system (at T = 0 K) under the action of a repulsive potential are studied. For the repulsive potential, a model was adopted consisting of a region where it is constant (r < r_c) and a 1/r decay for r > r_c. The self-consistent field approximation used takes short-range correlations into account through a local field correction, which leads to an effective field. The static structure factor S(q) and the effective potential ψ(q) are obtained through a self-consistent calculation. The pair-correlation function g(r) and the energy of the collective excitations E(q) are also obtained from the structure factor. The density of the system and the parameters of the repulsive potential, that is, its height and the size of the constant region, were used as variables of the problem. The results obtained for S(q), g(r) and E(q) for a fixed ratio r_0/r_c and variable λ indicate the emergence of structure in the system, which becomes more noticeable as the potential becomes more repulsive. (author)
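The structure-factor route from S(q) to the pair-correlation function used in such calculations is the standard three-dimensional Fourier inversion (a textbook relation for an isotropic system of density ρ, stated here for context rather than taken from the paper):

```latex
g(r) = 1 + \frac{1}{2\pi^{2}\rho\, r}\int_{0}^{\infty} q\,\bigl[S(q)-1\bigr]\sin(qr)\,\mathrm{d}q
```

Once g(r) is in hand, the collective excitation energies E(q) follow from S(q) via a Feynman-type relation in this class of approximations.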

  4. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
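A minimal numerical sketch of the convective core of such a model is a first-order finite-volume scheme for 1-D batch settling. Only the nonlinear hindered-settling flux is kept here; the feed source, compression and dispersion terms of the full SST model are omitted, and the flux form and parameters are illustrative assumptions rather than calibrated constitutive relations.

```python
import numpy as np

# 1-D batch settling: du/dt + d f(u)/dx = 0, closed column (zero-flux ends).
v0, n = 1.0, 3.0
f = lambda u: v0 * u * (1.0 - u) ** n                     # Vesilind-type flux
fp = lambda u: v0 * ((1.0 - u) ** n - n * u * (1.0 - u) ** (n - 1))

N, L, T = 200, 1.0, 0.5
dx = L / N
dt = 0.4 * dx / v0                                        # CFL-limited step
u = np.full(N, 0.2)                                       # homogeneous start

for _ in range(int(T / dt)):
    ul, ur = u[:-1], u[1:]
    a = np.maximum(np.abs(fp(ul)), np.abs(fp(ur)))        # local wave speed
    F = 0.5 * (f(ul) + f(ur)) - 0.5 * a * (ur - ul)       # Rusanov interface flux
    F = np.concatenate([[0.0], F, [0.0]])                 # closed top and bottom
    u = u - dt / dx * (F[1:] - F[:-1])

# Sludge accumulates at the bottom while total mass is conserved exactly:
print(u[-1] > u[0], abs(u.sum() * dx - 0.2 * L) < 1e-10)
```

A monotone flux such as Rusanov's handles the discontinuous solutions mentioned in the abstract robustly; the full methodology additionally requires careful treatment of the singular feed term and the degenerate compression term.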

  5. Evaluation of a Consistent LES/PDF Method Using a Series of Experimental Spray Flames

    Science.gov (United States)

    Heye, Colin; Raman, Venkat

    2012-11-01

    A consistent method for the evolution of the joint-scalar probability density function (PDF) transport equation is proposed for application to large eddy simulation (LES) of turbulent reacting flows containing evaporating spray droplets. PDF transport equations provide the benefit of including the chemical source term in closed form; however, additional terms describing LES subfilter mixing must be modeled. Recently available detailed experimental measurements provide model validation data for a wide range of evaporation rates and combustion regimes, as are well known to occur in spray flames. In this work, the experimental data will be used to investigate the impact of droplet mass loading and evaporation rates on the subfilter scalar PDF shape in comparison with conventional flamelet models. In addition, existing model term closures in the PDF transport equations are evaluated with a focus on their validity in the presence of regime changes.

  6. A novel method for identification and quantification of consistently differentially methylated regions.

    Directory of Open Access Journals (Sweden)

    Ching-Lin Hsiao

    Advances in biotechnology have resulted in large-scale studies of DNA methylation. A differentially methylated region (DMR) is a genomic region with multiple adjacent CpG sites that exhibit different methylation statuses among multiple samples. Many so-called "supervised" methods have been established to identify DMRs between two or more comparison groups. Methods for the identification of DMRs without reference to phenotypic information are, however, less well studied. An alternative "unsupervised" approach was proposed, in which DMRs in the studied samples were identified with consideration of the natural dependence structure of methylation measurements between neighboring probes from tiling arrays. Through a simulation study, we investigated the effects of dependencies between neighboring probes on determining DMRs; many spurious signals would be produced if the methylation data were analyzed independently across probes. In contrast, our newly proposed method successfully corrects for this effect, with a well-controlled false positive rate and comparable sensitivity. By applying it to two real datasets, we demonstrated that our method can provide a global picture of methylation variation in the studied samples. R source code implementing the proposed method is freely available at http://www.csjfann.ibms.sinica.edu.tw/eag/programlist/ICDMR/ICDMR.html.

  7. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    OpenAIRE

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In o...

  8. A consistent, differential versus integral, method for measuring the delayed neutron yield in fissions

    International Nuclear Information System (INIS)

    Flip, A.; Pang, H.F.; D'Angelo, A.

    1995-01-01

    Due to the persistent uncertainties of ∼5% (the uncertainty, here and hereafter, is at 1σ) in the prediction of the 'reactivity scale' (β_eff) for a fast power reactor, an international project was recently initiated in the framework of the OECD/NEA activities for re-evaluation, new measurements and integral benchmarking of delayed neutron (DN) data and related kinetic parameters (principally β_eff). Considering that the major part of this uncertainty is due to uncertainties in the DN yields (ν_d) and the difficulty of further improving the precision of differential (e.g. Keepin's method) measurements, an international cooperative strategy was adopted aiming at extracting and consistently interpreting information from both differential (nuclear) and integral (in-reactor) measurements. The main problem arises from the integral side; thus the idea was to realize β_eff-like measurements (both deterministic and noise) in 'clean' assemblies. The 'clean' calculational context permitted the authors to develop a theory allowing them to link explicitly this integral experimental level with the differential one, via a unified 'Master Model' which relates ν_d and measurable quantities (on both levels) linearly. The combined error analysis is consequently largely simplified and the final uncertainty drastically reduced (theoretically, by a factor of √3). On the other hand, the same theoretical development leading to the 'Master Model' also resulted in a structured scheme of approximations of the general (stochastic) Boltzmann equation, allowing a consistent analysis of the large range of measurements concerned (stochastic, dynamic, static, ...). This paper is focused on the main results of this theoretical development and its application to the analysis of the preliminary results of the BERENICE program (β_eff measurements in MASURCA, the first assembly, in CADARACHE, France).

  9. New exact solutions of the (2+1)-dimensional Broer-Kaup equation by the consistent Riccati expansion method

    Directory of Open Access Journals (Sweden)

    Jiang Ying

    2017-01-01

    In this work, we study the (2+1)-dimensional Broer-Kaup equation. The composite periodic breather wave, the exact composite kink breather wave and the solitary wave solutions are obtained by using the coupled degradation technique and the consistent Riccati expansion method. These results may help us to investigate complex dynamical behaviors and the interaction between composite non-linear waves in high-dimensional models.
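The consistent Riccati expansion (CRE) ansatz referred to above can be sketched in its generic form (a standard statement of the CRE method; the paper's specific truncation and coefficient choices are not reproduced here):

```latex
u(x,t) = \sum_{i=0}^{n} u_i(x,t)\, R^{i}\!\bigl(w(x,t)\bigr),
\qquad
R'(w) = b_0 + b_1 R + b_2 R^{2}
```

The expansion is called consistent when substituting it into the PDE yields an overdetermined system for the coefficients u_i and the function w that can be solved compatibly; solitary-wave and breather-type solutions then follow from the explicit solutions R of the Riccati equation.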

  10. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    Science.gov (United States)

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewing data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions via email or telephone with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded, leaving 29 journals in the sample. Of the 29 journals, 24 (83%) responded to our initial inquiry, of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ: 4 (29%) journals indicated ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or had never been detected by their journal previously. The majority (n = 10/14, 72%) of journals were unwilling to report discrepancies found in manuscripts or make them available to the public. Although in the minority, some journals (n = 4/14, 29%) described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB, and its estimated prevalence was much lower than that reported in the literature, suggesting inadequate detection. There exists a potential for overestimation of the treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to utilize comprehensive methods for the detection of ORB, but overall, the data suggest that improvements at the biomedical-journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  11. DFTB3: Extension of the self-consistent-charge density-functional tight-binding method (SCC-DFTB).

    Science.gov (United States)

    Gaus, Michael; Cui, Qiang; Elstner, Marcus

    2012-04-10

    The self-consistent-charge density-functional tight-binding method (SCC-DFTB) is an approximate quantum chemical method derived from density functional theory (DFT) based on a second-order expansion of the DFT total energy around a reference density. In the present study we combine earlier extensions and improve them consistently with, first, an improved Coulomb interaction between atomic partial charges, and second, the complete third-order expansion of the DFT total energy. These modifications lead us to the next generation of the DFTB methodology called DFTB3, which substantially improves the description of charged systems containing elements C, H, N, O, and P, especially regarding hydrogen binding energies and proton affinities. As a result, DFTB3 is particularly applicable to biomolecular systems. Remaining challenges and possible solutions are also briefly discussed.
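Schematically, the DFTB3 total energy adds a complete third-order charge-fluctuation term to the second-order SCC-DFTB energy. In the usual notation (Δq_a are atomic partial-charge fluctuations, γ_ab and Γ_ab the second- and third-order interaction kernels; written here from memory of the standard formulation, not quoted from the abstract):

```latex
E^{\mathrm{DFTB3}}
= \sum_{i}^{\mathrm{occ}} \langle \psi_i \,|\, \hat H^{0} \,|\, \psi_i \rangle
+ \frac{1}{2}\sum_{ab} \Delta q_a\, \Delta q_b\, \gamma_{ab}
+ \frac{1}{3}\sum_{ab} \Delta q_a^{2}\, \Delta q_b\, \Gamma_{ab}
+ E_{\mathrm{rep}}
```

The third-order term makes the atomic Hubbard parameters charge-dependent, which is what improves hydrogen binding energies and proton affinities in charged systems.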

  12. A self consistent study of the phase transition in the scalar electroweak theory at finite temperature

    International Nuclear Information System (INIS)

    Kerres, U.; Mack, G.; Palma, G.

    1994-12-01

    We propose the study of the phase transition in the scalar electroweak theory at finite temperature by a two-step method. It combines i) dimensional reduction to a 3-dimensional lattice theory via perturbative blockspin transformation, and ii) either further real space renormalization group transformations, or solution of gap equations, for the 3d lattice theory. A gap equation can be obtained by using the Peierls inequality to find the best quadratic approximation to the 3d action. This method avoids the lack of self consistency of the usual treatments which do not separate infrared and UV-problems by introduction of a lattice cutoff. The effective 3d lattice action could also be used in computer simulations. (orig.)

  13. A self consistent study of the phase transition in the scalar electroweak theory at finite temperature

    International Nuclear Information System (INIS)

    Kerres, U.

    1995-01-01

    We propose the study of the phase transition in the scalar electroweak theory at finite temperature by a two-step method. It combines i) dimensional reduction to a 3-dimensional lattice theory via perturbative blockspin transformation, and ii) either further real space renormalization group transformations, or solution of gap equations, for the 3d lattice theory. A gap equation can be obtained by using the Peierls inequality to find the best quadratic approximation to the 3d action. This method avoids the lack of self consistency of the usual treatments which do not separate infrared and UV-problems by introduction of a lattice cutoff. The effective 3d lattice action could also be used in computer simulations. (orig.)

  14. A consistent formulation of the finite element method for solving diffusive-convective transport problems

    International Nuclear Information System (INIS)

    Carmo, E.G.D. do; Galeao, A.C.N.R.

    1986-01-01

    A new method specially designed to solve highly convective transport problems is proposed. Using a variational approach, it is shown that this weighted residual method belongs to a class of Petrov-Galerkin approximations. Some examples are presented in order to demonstrate the adequacy of the method in predicting internal or external boundary layers. (Author) [pt
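A representative member of this Petrov-Galerkin class is the streamline-upwind weighting, in which the test function is perturbed along the flow direction (shown here as a generic illustration of the class, not necessarily the specific weighting proposed in the paper):

```latex
\tilde w_h = w_h + \tau\,\mathbf{a}\cdot\nabla w_h,
\qquad
\tau \sim \frac{h}{2\,\lVert \mathbf{a} \rVert}
```

The extra streamline term introduces artificial diffusion only along the characteristics of the advective field a, which is what suppresses the spurious oscillations that standard Galerkin schemes produce near internal and external boundary layers.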

  15. A particle method with adjustable transport properties - the generalized consistent Boltzmann algorithm

    International Nuclear Information System (INIS)

    Garcia, A.L.; Alexander, F.J.; Alder, B.J.

    1997-01-01

    The consistent Boltzmann algorithm (CBA) for dense, hard-sphere gases is generalized to obtain the van der Waals equation of state and the corresponding exact viscosity at all densities except at the highest temperatures. A general scheme for adjusting any transport coefficients to higher values is presented

  16. Dosimetric Consistency of Co-60 Teletherapy Unit- a ten years Study.

    Science.gov (United States)

    Baba, Misba H; Mohib-Ul-Haq, M; Khan, Aijaz A

    2013-01-01

    The goal of radiation standards and dosimetry is to ensure that the output of the Teletherapy unit is within ±2% of the stated value and that the outputs of the treatment dose calculation methods are within ±5%. In the present paper, we studied the dosimetry of the Cobalt-60 (Co-60) Teletherapy unit at the Sher-I-Kashmir Institute of Medical Sciences (SKIMS) over the last 10 years. Radioactivity is the phenomenon of disintegration of unstable nuclides called radionuclides. Among these radionuclides, Cobalt-60, incorporated in the Telecobalt unit, is commonly used in the therapeutic treatment of cancer. Cobalt-60, being unstable, decays continuously into Ni-60 with a half-life of 5.27 years, thereby resulting in a decrease in its activity and hence its dose rate (output). It is, therefore, mandatory to measure the dose rate of the Cobalt-60 source regularly so that the patient receives the same dose every time, as prescribed by the radiation oncologist. Under-dosage may lead to unsatisfactory treatment of the cancer and over-dosage may cause radiation hazards. Our study emphasizes the consistency between the actual output and the output obtained using the decay method. The methodology involved in the present study is the calculation of the actual dose rate of the Co-60 Teletherapy unit by two techniques, Source to Surface Distance (SSD) and Source to Axis Distance (SAD), used for external beam radiotherapy of various cancers, using the standard methods. Thereby, a year-wise comparison has been made between the average actual dosimetric output (dose rate) and the average expected output values (obtained by using the decay method for Co-60). The present study shows that there is consistency between the average output (dose rate) obtained by actual dosimetry and the expected output values obtained using the decay method. The values obtained by actual dosimetry are within ±2% of the expected values. The results thus obtained in a year-wise comparison of average output by actual dosimetry done regularly as a part of
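The "decay method" expectation against which the measured output is checked is just the radioactive decay law with the Co-60 half-life of 5.27 years. In the sketch below, only the half-life and the ±2% criterion come from the text; the initial dose rate and elapsed time are made-up example values.

```python
# Decay-method output check for a Co-60 teletherapy unit (illustrative values).
T_HALF_YEARS = 5.27   # Co-60 half-life, from the text

def expected_dose_rate(d0, elapsed_years):
    """Expected output after elapsed_years, from the decay law."""
    return d0 * 0.5 ** (elapsed_years / T_HALF_YEARS)

def within_tolerance(measured, expected, tol=0.02):
    """+/-2% consistency criterion used for the year-wise comparison."""
    return abs(measured - expected) / expected <= tol

d0 = 150.0                              # cGy/min at calibration (assumed)
d1 = expected_dose_rate(d0, 5.27)       # one half-life later
print(round(d1, 1))                     # 75.0 -> half the initial output
print(within_tolerance(74.0, d1))       # True: inside the +/-2% window
```

A measured output falling outside the ±2% window of this expected value is what triggers recalibration in practice.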

  17. A design method for two-layer beams consisting of normal and fibered high strength concrete

    International Nuclear Information System (INIS)

    Iskhakov, I.; Ribakov, Y.

    2007-01-01

    Two-layer fibered concrete beams can be analyzed using conventional methods for composite elements. The compressed zone of such a beam section is made of high strength concrete (HSC), and the tensile one of normal strength concrete (NSC). The problems related to this type of beam are revealed and studied, and an appropriate depth for each layer is prescribed. Compatibility conditions between the HSC and NSC layers are found, based on the equality of shear deformations at the layer border in the section with the maximal depth of the compression zone. For the first time, a rigorous definition of HSC is given using a comparative analysis of the deformability and strength characteristics of different concrete classes. According to this definition, HSC has no descending branch in the stress-strain diagram, the stress-strain function has a minimum exponent, the ductility parameter is minimal, and the concrete tensile strength remains constant with an increase in concrete compression strength. The application fields of two-layer concrete beams based on different static schemes and load conditions are described. It is known that the main disadvantage of HSCs is their low ductility. In order to overcome this problem, fibers are added to the HSC layer. The influence of different fiber volume ratios on structural ductility is discussed. An upper limit on the required fiber volume ratio is found, based on a compatibility equation between the transverse tensile deformations of the concrete and the deformations of the fibers.

  18. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni)

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-01-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51–40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses. But gamma irradiation did not show any significant change in seed germination. A great variation in survival of stem cutting was observed in each month of 2012. October and November were found the most suitable months for stem cutting survival (60%). In order to enhance survival, stem cuttings were also dipped in different plant growth regulators (PGRs) solution. Only indole butyric acid (IBA; 1000 ppm) treated cutting showed a higher survival (33%) than control (11.1%). Furthermore, simple and feasible indirect regeneration system was established from leaf explants. Best callus induction (84.6%) was observed on MS-medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l−1). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l−1) and gibberellic acid (GA3; 0.5 mg l−1). Plantlets were successfully acclimatized in plastic pots. The current results preferred micropropagation (85%) over seed germination (25.51–40%) and stem cutting (60%). PMID:25473365

  19. Selection of suitable propagation method for consistent plantlets production in Stevia rebaudiana (Bertoni).

    Science.gov (United States)

    Khalil, Shahid Akbar; Zamir, Roshan; Ahmad, Nisar

    2014-12-01

    Stevia rebaudiana (Bert.) is an emerging sugar alternative and anti-diabetic plant in Pakistan. That is why people did not know the exact time of propagation. The main objective of the present study was to establish feasible propagation methods for healthy biomass production. In the present study, seed germination, stem cuttings and micropropagation were investigated for higher productivity. Fresh seeds showed better germination (25.51-40%) but lost viability after a few days of storage. In order to improve the germination percentage, seeds were irradiated with 2.5, 5.0, 7.5 and 10 Gy gamma doses. But gamma irradiation did not show any significant change in seed germination. A great variation in survival of stem cutting was observed in each month of 2012. October and November were found the most suitable months for stem cutting survival (60%). In order to enhance survival, stem cuttings were also dipped in different plant growth regulators (PGRs) solution. Only indole butyric acid (IBA; 1000 ppm) treated cutting showed a higher survival (33%) than control (11.1%). Furthermore, simple and feasible indirect regeneration system was established from leaf explants. Best callus induction (84.6%) was observed on MS-medium augmented with 6-benzyladenine (BA) and 2,4-dichlorophenoxyacetic acid (2,4-D; 2.0 mg l(-1)). For the first time, we obtained the highest number of shoots (106) on a medium containing BA (1.5 mg l(-1)) and gibberellic acid (GA3; 0.5 mg l(-1)). Plantlets were successfully acclimatized in plastic pots. The current results preferred micropropagation (85%) over seed germination (25.51-40%) and stem cutting (60%).

  20. A RTS-based method for direct and consistent calculating intermittent peak cooling loads

    International Nuclear Information System (INIS)

    Chen Tingyao; Cui, Mingxian

    2010-01-01

    The RTS method currently recommended by the ASHRAE Handbook is based on continuous operation. However, most air-conditioning systems in commercial buildings, if not all, are operated intermittently in practice. Applying the current RTS method to intermittent air-conditioning in nonresidential buildings could result in largely underestimated design cooling loads and inconsistently sized air-conditioning systems. Improperly sized systems could seriously deteriorate the performance of system operation and management. Therefore, a new method based on both the current RTS method and the principles of heat transfer has been developed. The first part of the new method is the same as the current RTS method in principle, but its calculation procedure is simplified by the derived equations in closed form. The technical data available in the current RTS method can be used to compute zone responses to a change in space air temperature, so that no effort is needed to regenerate new technical data. Both the overall RTS coefficients and the hourly cooling loads computed in the first part are used to estimate the additional peak cooling load due to a change from continuous operation to intermittent operation. Only one more step after the current RTS method is needed to determine the intermittent peak cooling load. The new RTS-based method has been validated by EnergyPlus simulations. The root mean square deviation (RMSD) between the relative additional peak cooling loads (RAPCLs) computed by the two methods is 1.8%. The deviation of the RAPCL varies from -3.0% to 5.0%, and the mean deviation is 1.35%.
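    The RMSD and deviation statistics quoted above can be reproduced for any pair of load series. The sketch below uses hypothetical relative additional peak cooling loads, not the paper's data:

    ```python
    import math

    def rmsd(a, b):
        """Root mean square deviation between two equal-length series."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

    # Hypothetical RAPCL values (%), one per test case; in the paper these
    # come from the RTS-based method and from EnergyPlus simulations.
    rts_based  = [12.0, 18.5, 25.1, 9.4]
    energyplus = [13.2, 17.9, 24.0, 10.1]

    deviations = [x - y for x, y in zip(rts_based, energyplus)]
    print(round(rmsd(rts_based, energyplus), 2))
    print(round(min(deviations), 1), round(max(deviations), 1))
    ```
    
    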

  1. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis.

    NARCIS (Netherlands)

    Steultjens, M.P.M.; Dekker, J.; Baar, M.E. van; Oostendorp, R.A.B.; Bijlsma, J.W.J.

    1999-01-01

    Objective: To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Methods: Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results

  2. Consistent method of truncating the electron self-energy in nonperturbative QED

    International Nuclear Information System (INIS)

    Rembiesa, P.

    1986-01-01

    A nonperturbative method of solving the Dyson-Schwinger equations for the fermion propagator is considered. The solution satisfies the Ward-Takahashi identity, allows multiplicative regularization, and exhibits a physical-mass pole.

  3. Technical Note: Regularization performances with the error consistency method in the case of retrieved atmospheric profiles

    Directory of Open Access Journals (Sweden)

    S. Ceccherini

    2007-01-01

    The retrieval of concentration vertical profiles of atmospheric constituents from spectroscopic measurements is often an ill-conditioned problem, and regularization methods are frequently used to improve its stability. Recently a new method that provides a good compromise between precision and vertical resolution was proposed to determine the value of the regularization parameter analytically. This method is applied for the first time to real measurements through its implementation in the operational retrieval code for the satellite limb-emission measurements of the MIPAS instrument, and its performance is quantitatively analyzed. The adopted regularization improves the stability of the retrieval, providing smooth profiles without major degradation of the vertical resolution. In the analyzed measurements the retrieval procedure provides a vertical resolution that, in the troposphere and low stratosphere, is smaller than the vertical field of view of the instrument.

  4. Delta self-consistent field method to obtain potential energy surfaces of excited molecules on surfaces

    DEFF Research Database (Denmark)

    Gavnholt, Jeppe; Olsen, Thomas; Engelund, Mads

    2008-01-01

    Delta self-consistent field (Delta SCF) is a density-functional method closely resembling standard density-functional theory (DFT), the only difference being that in Delta SCF one or more electrons are placed in higher-lying Kohn-Sham orbitals instead of placing all electrons in the lowest possible orbitals, as one does when calculating the ground-state energy within standard DFT. We extend the Delta SCF method by allowing excited electrons to occupy orbitals which are linear combinations of Kohn-Sham orbitals. With this extra freedom it is possible to place charge locally on adsorbed molecules in the calculations, such that resonance energies can be estimated, which is not possible in traditional Delta SCF because of very delocalized Kohn-Sham orbitals. The method is applied to N2, CO, and NO adsorbed on different metallic surfaces and compared to ordinary Delta SCF without our modification, spatially constrained DFT, and inverse...

  5. Self-consistent study of space-charge-dominated beams in a misaligned transport system

    International Nuclear Information System (INIS)

    Sing Babu, P.; Goswami, A.; Pandit, V.S.

    2013-01-01

    A self-consistent particle-in-cell (PIC) simulation method is developed to investigate the dynamics of space-charge-dominated beams through a misaligned solenoid-based transport system. Evolution of the beam centroid, beam envelope and emittance is studied as a function of the misalignment parameters for various types of beam distributions. Simulation results for proton beams of up to 40 mA indicate that the centroid oscillations induced by displacement and rotational misalignments of the solenoids do not depend on the beam distribution. It is shown that the beam envelope around the centroid is independent of the centroid motion for small centroid oscillations. In addition, we have estimated the beam loss during transport caused by the misalignment for various beam distributions

  6. Microscopic and self-consistent description of nuclear properties by extended generator-coordinate method

    International Nuclear Information System (INIS)

    Didong, M.

    1976-01-01

    The extended generator-coordinate method is discussed and a procedure is given for the solution of the Hill-Wheeler equation. The HFB theory, the particle-number and angular-momentum projections necessary to restore symmetry, and the modified surface delta interaction are discussed. The described procedures are used to calculate properties of 72Ge, 70Zn and 74Ge. (BJ) [de]

  7. From virtual clustering analysis to self-consistent clustering analysis: a mathematical study

    Science.gov (United States)

    Tang, Shaoqiang; Zhang, Lei; Liu, Wing Kam

    2018-03-01

    In this paper, we propose a new homogenization algorithm, virtual clustering analysis (VCA), as well as provide a mathematical framework for the recently proposed self-consistent clustering analysis (SCA) (Liu et al. in Comput Methods Appl Mech Eng 306:319-341, 2016). In the mathematical theory, we clarify the key assumptions and ideas of VCA and SCA, and derive the continuous and discrete Lippmann-Schwinger equations. Based on a key postulation of "once response similarly, always response similarly", clustering is performed in an offline stage by machine learning techniques (k-means and SOM), and facilitates substantial reduction of computational complexity in an online predictive stage. The clear mathematical setup allows for the first time a convergence study of clustering refinement in one space dimension. Convergence is proved rigorously, and found to be of second order from numerical investigations. Furthermore, we propose to suitably enlarge the domain in VCA, such that the boundary terms may be neglected in the Lippmann-Schwinger equation by virtue of Saint-Venant's principle. In contrast, these terms were not obtained in the original SCA paper, and we discover that they may well be responsible for the numerical dependency on the choice of reference material property. Since VCA enhances the accuracy by overcoming the modeling error, and reduces the numerical cost by avoiding an outer loop iteration for attaining material property consistency in SCA, its efficiency is expected to be even higher than that of the recently proposed SCA algorithm.
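    The offline clustering stage described above groups material points whose responses are similar. A minimal k-means sketch on hypothetical 1-D response data (in the actual methods the clustered quantities are strain-concentration responses, and SOM is an alternative technique):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 1-D "response" values for 150 material points; in the
    # SCA/VCA offline stage these would come from sampled load cases.
    responses = np.concatenate([rng.normal(0.8, 0.02, 50),
                                rng.normal(1.0, 0.02, 50),
                                rng.normal(1.3, 0.02, 50)]).reshape(-1, 1)

    def kmeans_1d(x, k, iters=50):
        """Minimal k-means: points that respond similarly share one cluster."""
        centers = np.quantile(x, np.linspace(0.05, 0.95, k)).reshape(-1, 1)
        for _ in range(iters):
            labels = np.argmin(np.abs(x - centers.T), axis=1)   # nearest center
            centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
        return labels, centers

    labels, centers = kmeans_1d(responses, k=3)
    print(np.sort(np.round(centers.ravel(), 1)))
    ```
    
    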

  8. Bootstrapping the economy -- a non-parametric method of generating consistent future scenarios

    OpenAIRE

    Müller, Ulrich A; Bürgi, Roland; Dacorogna, Michel M

    2004-01-01

    The fortune and the risk of a business venture depend on the future course of the economy. There is a strong demand for economic forecasts and scenarios that can be applied to planning and modeling. While there is an ongoing debate on modeling economic scenarios, the bootstrapping (or resampling) approach presented here has several advantages. As a non-parametric method, it directly relies on past market behavior rather than debatable assumptions on models and parameters. Simultaneous dep...
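    The resampling idea this abstract describes can be sketched in a few lines. The return history below is hypothetical; a real application would resample observed multivariate market data so that simultaneous dependencies across risk factors are preserved:

    ```python
    import random

    # Hypothetical history of quarterly returns (fractions), standing in for
    # observed market data.
    history = [0.042, -0.013, 0.027, 0.008, -0.051, 0.034, 0.019, -0.006]

    def bootstrap_scenarios(returns, horizon, n_scenarios, seed=1):
        """Draw past returns with replacement to build cumulative future paths."""
        rng = random.Random(seed)
        out = []
        for _ in range(n_scenarios):
            growth = 1.0
            for _ in range(horizon):
                growth *= 1.0 + rng.choice(returns)   # resample one past period
            out.append(growth - 1.0)                  # cumulative return
        return out

    scenarios = bootstrap_scenarios(history, horizon=4, n_scenarios=1000)
    print(len(scenarios), round(min(scenarios), 3), round(max(scenarios), 3))
    ```
    
    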

  9. Consistent economic cross-sectoral climate change impact scenario analysis: Method and application to Austria

    Directory of Open Access Journals (Sweden)

    Karl W. Steininger

    2016-03-01

    Climate change triggers manifold impacts at the national to local level, which in turn have various economy-wide implications (e.g. on welfare, employment, or tax revenues). In its response, society needs to prioritize which of these impacts to address and what share of resources to spend on each respective adaptation. A prerequisite to achieving that end is an economic impact analysis that is consistent across sectors and acknowledges intersectoral and economy-wide feedback effects. Traditional Integrated Assessment Models (IAMs) usually operate at a level too aggregated for this end, while bottom-up impact models most often are not fully comprehensive, focusing on only a subset of climate-sensitive sectors and/or a subset of climate change impact chains. Thus, we develop here an approach which applies climate and socioeconomic scenario analysis, harmonized economic costing, and sector-explicit bandwidth analysis in a coupled framework of eleven (bio)physical impact assessment models and a uniform multi-sectoral computable general equilibrium model. In applying this approach to the alpine country of Austria, we find that macroeconomic feedbacks can magnify sectoral climate damages up to fourfold, and that by mid-century the costs of climate change clearly outweigh the benefits, with net costs rising two- to fourfold above current damage cost levels. The resulting specific impact information – differentiated by climate and economic drivers – can support sector-specific adaptation as well as adaptive capacity building. Keywords: climate impact, local impact, economic evaluation, adaptation

  10. Self-consistent EXAFS PDF Projection Method by Matched Correction of Fourier Filter Signal Distortion

    International Nuclear Information System (INIS)

    Lee, Jay Min; Yang, Dong-Seok

    2007-01-01

    An inverse-problem computation was performed to obtain the PDF (pair distribution function) from EXAFS data simulated with FEFF. For a realistic comparison with experimental data, we chose a model of the first sub-shell Mn-O pair showing the Jahn-Teller distortion in crystalline LaMnO3. To restore the Fourier-filtering signal distortion involved in the first sub-shell information isolated from higher-shell contents, the relevant distortion-matching function was computed initially from the proximity model, and iteratively from the prior guess during consecutive regularization computations. Adaptive computation of the EXAFS background correction is an open issue of algorithm development, but our preliminary test was performed under a simulated background correction perfectly excluding the higher-shell interference. In our numerical results, the efficient convergence of the iterative solution indicates a self-consistent tendency: a true PDF solution is confirmed as a counterpart of the genuine chi-data, provided that a background correction function is iteratively solved using an extended algorithm of MEPP (Matched EXAFS PDF Projection) under development.

  11. Internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis

    NARCIS (Netherlands)

    Steultjens, M. P.; Dekker, J.; van Baar, M. E.; Oostendorp, R. A.; Bijlsma, J. W.

    1999-01-01

    To establish the internal consistency and validity of an observational method for assessing disability in mobility in patients with osteoarthritis (OA). Data were obtained from 198 patients with OA of the hip or knee. Results of the observational method were compared with results of self-report

  12. How consistent are beliefs about the causes and solutions to illness? An experimental study.

    OpenAIRE

    Ogden, J; Jubb, A

    2008-01-01

    Objectives: Research illustrates that people hold beliefs about the causes and solutions to illness. This study aimed to assess the consistency in these beliefs in terms of their variation according to type of problem and whether they are consistent with each other. Further, the study aimed to assess whether they are open to change and whether changing beliefs about cause resulted in a subsequent shift in beliefs about solutions. Design: Experimental factorial 3 (problem) × 2 (manipulated cau...

  13. Solvent effects in time-dependent self-consistent field methods. II. Variational formulations and analytical gradients

    International Nuclear Information System (INIS)

    Bjorgaard, J. A.; Velizhanin, K. A.; Tretiak, S.

    2015-01-01

    This study describes variational energy expressions and analytical excited state energy gradients for time-dependent self-consistent field methods with polarizable solvent effects. Linear response, vertical excitation, and state-specific solvent models are examined. Enforcing a variational ground state energy expression in the state-specific model is found to reduce it to the vertical excitation model. Variational excited state energy expressions are then provided for the linear response and vertical excitation models and analytical gradients are formulated. Using semiempirical model chemistry, the variational expressions are verified by numerical and analytical differentiation with respect to a static external electric field. Lastly, analytical gradients are further tested by performing microcanonical excited state molecular dynamics with p-nitroaniline.
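    Verifying analytical gradients by numerical differentiation with respect to an external field, as described above, follows a standard central-difference pattern. The toy energy function below is an assumption for illustration, not the paper's model:

    ```python
    def energy(f):
        # Toy quadratic energy standing in for an excited-state energy E(F)
        # in a static external field F (hypothetical coefficients).
        return 0.5 * 3.0 * f * f - 1.2 * f

    def analytic_gradient(f):
        # Hand-derived dE/dF for the toy energy above.
        return 3.0 * f - 1.2

    def numeric_gradient(func, f, h=1e-5):
        # Central finite difference, accurate to O(h^2).
        return (func(f + h) - func(f - h)) / (2 * h)

    f0 = 0.7
    print(abs(analytic_gradient(f0) - numeric_gradient(energy, f0)) < 1e-8)
    ```
    
    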

  14. Optical forces, torques, and force densities calculated at a microscopic level using a self-consistent hydrodynamics method

    Science.gov (United States)

    Ding, Kun; Chan, C. T.

    2018-04-01

    The calculation of optical force density distribution inside a material is challenging at the nanoscale, where quantum and nonlocal effects emerge and macroscopic parameters such as permittivity become ill-defined. We demonstrate that the microscopic optical force density of nanoplasmonic systems can be defined and calculated using the microscopic fields generated using a self-consistent hydrodynamics model that includes quantum, nonlocal, and retardation effects. We demonstrate this technique by calculating the microscopic optical force density distributions and the optical binding force induced by external light on nanoplasmonic dimers. This approach works even in the limit when the nanoparticles are close enough to each other so that electron tunneling occurs, a regime in which classical electromagnetic approach fails completely. We discover that an uneven distribution of optical force density can lead to a light-induced spinning torque acting on individual particles. The hydrodynamics method offers us an accurate and efficient approach to study optomechanical behavior for plasmonic systems at the nanoscale.

  15. Self-Consistency Method to Evaluate a Linear Expansion Thermal Coefficient of Composite with Dispersed Inclusions

    Directory of Open Access Journals (Sweden)

    V. S. Zarubin

    2015-01-01

    The rational use of composites as structural materials under thermal and mechanical loads is to a large extent determined by their thermoelastic properties. From the presented review of works devoted to the analysis of thermoelastic characteristics of composites, it follows that the problem of estimating these characteristics is important. Among the thermoelastic properties of a composite, its temperature coefficient of linear expansion occupies an important place. Along with fiber composites, dispersion-hardened composites are widely used in engineering, in which the inclusions are particles of high-strength and high-modulus materials, including nanostructured elements. Typically, the dispersed particles have similar dimensions in all directions, which allows the particle shape to be approximated, to a first approximation, as a sphere. In this article, design formulas are derived by the self-consistency method for a composite with isotropic spherical inclusions of several different materials, relating the temperature coefficient of linear expansion to the volume concentration of the inclusions and their thermoelastic characteristics, as well as to the thermoelastic properties of the matrix of the composite. A feature of the method is that it self-consistently accounts for the thermomechanical interaction of a single inclusion or matrix particle with a homogeneous isotropic medium having the desired temperature coefficient of linear expansion. Averaging over the volume of the composite the perturbations of strain and stress arising from such interaction in the inclusions and the matrix particles makes it possible to obtain these calculation formulas. For validation of the calculated temperature coefficient of linear expansion of a composite of this type, two-sided estimates are used that are based on the dual variational formulation of the linear thermoelasticity problem in an inhomogeneous solid, containing two alternative functionals (such as Lagrange and Castigliano

  16. VLE measurements using a static cell vapor phase manual sampling method accompanied with an empirical data consistency test

    International Nuclear Information System (INIS)

    Freitag, Joerg; Kosuge, Hitoshi; Schmelzer, Juergen P.; Kato, Satoru

    2015-01-01

    Highlights: • We use a new, simple static cell vapor phase manual sampling method (SCVMS) for VLE (x, y, T) measurement. • The method is applied to non-azeotropic, asymmetric and two-liquid-phase-forming azeotropic binaries. • The method is validated by a data consistency test, i.e., a plot of the polarity exclusion factor vs. pressure. • The consistency test reveals that the new SCVMS method can measure accurate VLE near ambient temperature. • Moreover, the consistency test confirms that the effect of air in the SCVMS system is negligible. - Abstract: A new static cell vapor phase manual sampling (SCVMS) method is used for the simple measurement of constant-temperature x, y (vapor + liquid) equilibria (VLE). The method was applied to VLE measurements of the (methanol + water) binary at T/K = (283.2, 298.2, 308.2 and 322.9), the asymmetric (acetone + 1-butanol) binary at T/K = (283.2, 295.2, 308.2 and 324.2) and the two-liquid-phase-forming azeotropic (water + 1-butanol) binary at T/K = (283.2 and 298.2). The accuracy of the experimental data was confirmed by a data consistency test, that is, an empirical plot of the polarity exclusion factor, β, vs. the system pressure, P. The SCVMS data are accurate, because the VLE data converge to the same lnβ vs. lnP straight line determined from a conventional distillation-still method and a headspace gas chromatography method.
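    The consistency test described above reduces to checking that ln β vs. ln P data fall on one straight line. A least-squares sketch on hypothetical (P, β) pairs (the actual definition of β is in the paper, not reproduced here):

    ```python
    import math

    # Hypothetical (system pressure P, polarity exclusion factor beta) pairs;
    # the test plots ln(beta) vs ln(P) and checks for a single straight line.
    data = [(5.0, 0.92), (10.0, 0.84), (20.0, 0.77), (40.0, 0.70)]

    ln_p = [math.log(p) for p, _ in data]
    ln_b = [math.log(b) for _, b in data]
    n = len(data)
    mx, my = sum(ln_p) / n, sum(ln_b) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(ln_p, ln_b))
             / sum((x - mx) ** 2 for x in ln_p))
    intercept = my - slope * mx
    residual = max(abs(y - (slope * x + intercept)) for x, y in zip(ln_p, ln_b))
    print(slope < 0 and residual < 0.02)  # points lie on one straight line
    ```
    
    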

  17. Application of the adiabatic self-consistent collective coordinate method to a solvable model of prolate-oblate shape coexistence

    International Nuclear Information System (INIS)

    Kobayasi, Masato; Matsuyanagi, Kenichi; Nakatsukasa, Takashi; Matsuo, Masayuki

    2003-01-01

    The adiabatic self-consistent collective coordinate method is applied to an exactly solvable multi-O(4) model that is designed to describe nuclear shape coexistence phenomena. The collective mass and dynamics of large amplitude collective motion in this model system are analyzed, and it is shown that the method yields a faithful description of tunneling motion through a barrier between the prolate and oblate local minima in the collective potential. The emergence of the doublet pattern is clearly described. (author)

  18. An efficient method to transcription factor binding sites imputation via simultaneous completion of multiple matrices with positional consistency.

    Science.gov (United States)

    Guo, Wei-Li; Huang, De-Shuang

    2017-08-22

    Transcription factors (TFs) are DNA-binding proteins that have a central role in regulating gene expression. Identification of DNA-binding sites of TFs is a key task in understanding transcriptional regulation, cellular processes and disease. Chromatin immunoprecipitation followed by high-throughput sequencing (ChIP-seq) enables genome-wide identification of in vivo TF binding sites. However, it is still difficult to map every TF in every cell line owing to cost and biological material availability, which poses an enormous obstacle for integrated analysis of gene regulation. To address this problem, we propose a novel computational approach, TFBSImpute, for predicting additional TF binding profiles by leveraging information from available ChIP-seq TF binding data. TFBSImpute fuses the dataset into a 3-mode tensor and imputes missing TF binding signals via simultaneous completion of multiple TF binding matrices with positional consistency. We show that signals predicted by our method achieve overall similarity with experimental data and that TFBSImpute significantly outperforms baseline approaches, by assessing the performance of imputation methods against observed ChIP-seq TF binding profiles. Besides, motif analysis shows that TFBSImpute performs better in capturing binding motifs enriched in observed data compared with baselines, indicating that the higher performance of TFBSImpute is not simply due to averaging related samples. We anticipate that our approach will constitute a useful complement to experimental mapping of TF binding, which is beneficial for further study of regulation mechanisms and disease.
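    TFBSImpute's completion of binding matrices belongs to the family of low-rank completion schemes. As a generic illustration only (not the paper's algorithm, which works on a 3-mode tensor with positional consistency), a low-rank matrix with missing entries can be completed by iterating a rank-1 SVD fit while holding observed entries fixed:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy matrix with exact rank-1 structure, standing in for a binding-signal
    # matrix; entries are shifted away from zero for good conditioning.
    truth = (0.5 + rng.random((6, 1))) @ (0.5 + rng.random((1, 8)))
    mask = np.ones(truth.shape, dtype=bool)
    mask[1, 2] = mask[3, 5] = mask[0, 7] = mask[4, 4] = False  # "unmapped" cells

    x = np.where(mask, truth, 0.0)
    for _ in range(200):
        u, s, vt = np.linalg.svd(x, full_matrices=False)  # best rank-1 fit
        x = s[0] * np.outer(u[:, 0], vt[0])
        x[mask] = truth[mask]                             # keep observed fixed

    print(bool(np.max(np.abs(x - truth)) < 1e-6))
    ```
    
    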

  19. A consistent method for finite volume discretization of body forces on collocated grids applied to flow through an actuator disk

    DEFF Research Database (Denmark)

    Troldborg, Niels; Sørensen, Niels N.; Réthoré, Pierre-Elouan

    2015-01-01

    This paper describes a consistent algorithm for eliminating the numerical wiggles appearing when solving the finite volume discretized Navier-Stokes equations with discrete body forces in a collocated grid arrangement. The proposed method is a modification of the Rhie-Chow algorithm where the for...

  20. Fully self-consistent multiparticle-multi-hole configuration mixing method - Applications to a few light nuclei

    International Nuclear Information System (INIS)

    Robin, Caroline

    2014-01-01

    This thesis project contributes to the development of the multiparticle-multi-hole configuration mixing method aiming to describe the structure of atomic nuclei. Based on a double variational principle, this approach allows one to determine the expansion coefficients of the wave function and the single-particle states at the same time. In this work we apply for the first time the fully self-consistent formalism of the mp-mh method to the description of a few p- and sd-shell nuclei, using the D1S Gogny interaction. A first study of the 12C nucleus is performed in order to test the doubly iterative convergence procedure when different types of truncation criteria are applied to select the many-body configurations included in the wave function. A detailed analysis of the effect caused by the orbital optimization is conducted. In particular, its impact on the one-body density and on the fragmentation of the ground state wave function is analyzed. A systematic study of sd-shell nuclei is then performed. A careful analysis of the correlation content of the ground state is first conducted, and observable quantities such as binding and separation energies, as well as charge radii, are calculated and compared to experimental data. Satisfactory results are found. Spectroscopic properties are also studied. Excitation energies of low-lying states are found in very good agreement with experiment, and the study of magnetic dipole features is also satisfactory. Calculation of electric quadrupole properties, and in particular transition probabilities B(E2), however reveals a clear lack of collectivity of the wave function, due to the reduced valence space used to select the many-body configurations. Although the renormalization of orbitals leads to an important fragmentation of the ground state wave function, only little effect is observed on B(E2) probabilities. A tentative explanation is given. Finally, the structure description of nuclei provided by the multiparticle

  1. Self-consistent study of local and nonlocal magnetoresistance in a YIG/Pt bilayer

    Science.gov (United States)

    Wang, Xi-guang; Zhou, Zhen-wei; Nie, Yao-zhuang; Xia, Qing-lin; Guo, Guang-hua

    2018-03-01

    We present a self-consistent study of the local spin Hall magnetoresistance (SMR) and nonlocal magnon-mediated magnetoresistance (MMR) in a heavy-metal/magnetic-insulator heterostructure at finite temperature. We find that the thermal fluctuation of magnetization significantly affects the SMR. It appears unidirectional with respect to the direction of electrical current (or magnetization). The unidirectionality of SMR originates from the asymmetry of creation or annihilation of thermal magnons induced by the spin Hall torque. Also, a self-consistent model can well describe the features of MMR.

  2. The nuclear N-body problem and the effective interaction in self-consistent mean-field methods

    International Nuclear Information System (INIS)

    Duguet, Thomas

    2002-01-01

    This work deals with two aspects of mean-field-type methods extensively used in low-energy nuclear structure. The first study is at the mean-field level. The link between the wave function describing an even-even nucleus and its odd-even neighbor is revisited. To get a coherent description as a function of the pairing intensity in the system, the utility of formalizing this link through a two-step process is demonstrated. This two-step process allows one to identify the role played by different channels of the force when a nucleon is added to the system. In particular, perturbative formulas evaluating the contribution of time-odd components of the functional to the nucleon separation energy are derived for zero and realistic pairing intensities. Self-consistent calculations validate the developed scheme as well as the derived perturbative formulas. This first study ends with an extended analysis of the odd-even mass staggering in nuclei. The new scheme allows one to identify the contributions to this observable coming from different channels of the force. A better understanding of time-odd terms is needed in order to decide which odd-even mass formula extracts the pairing gap most properly. These terms being nowadays more or less out of control, extended studies are needed to make the fit of a pairing force through the comparison of theoretical and experimental odd-even mass differences more precise. The second study deals with beyond-mean-field methods taking care of the correlations associated with large-amplitude oscillations in nuclei. Their effects are usually incorporated through the GCM or the projected mean-field method. We derive, for the first time, a perturbation theory motivating such variational calculations from a diagrammatic point of view. Resumming two-body correlations in the energy expansion, we obtain an effective interaction removing the hard-core problem in the context of configuration mixing calculations. Proceeding to a

  3. A Combined Self-Consistent Method to Estimate the Effective Properties of Polypropylene/Calcium Carbonate Composites

    Directory of Open Access Journals (Sweden)

    Zhongqiang Xiong

    2018-01-01

    In this work, trying to avoid the difficulty of application due to irregular filler shapes in experiments, the self-consistent and differential self-consistent methods were combined to obtain a decoupled equation. The combined method suggests a tensor γ, independent of filler content, as an important connection between high and low filler contents. On one hand, this constant parameter can be calculated by Eshelby's inclusion theory or the Mori-Tanaka method to predict effective properties of composites consistent with its hypothesis. On the other hand, the parameter can be calculated from several experimental results to estimate the effective properties of prepared composites at other filler contents. In addition, an evaluation index σf′ of the interaction strength between matrix and fillers is proposed based on experiments. In the experiments, a hyper-dispersant was synthesized to prepare well-dispersed polypropylene/calcium carbonate (PP/CaCO3) composites up to 70 wt % filler content, with a dispersant dosage of only 5 wt % of the CaCO3 content. Based on several verifications, it is hoped that the combined self-consistent method is valid for other two-phase composites in experiments, following the same application procedure as in this work.

  4. SU-F-T-26: A Study of the Consistency of Brachytherapy Treatments for Vaginal Cuff

    Energy Technology Data Exchange (ETDEWEB)

    Shojaei, M; Pella, S; Dumitru, N [21st Century Oncology, Boca Raton, FL (United States)

    2016-06-15

    Purpose: To evaluate the treatment consistency over the total number of fractions for HDR brachytherapy treatments using ML cylinders, while monitoring the dosimetric impact on the critical organs over the total number of fractions. Methods: A retrospective analysis of 10 patients treated with cylinder applicators from 2015 to 2016 was considered for this study. The CT scans of these patients, taken before each treatment, were separately imported into the treatment planning system and paired with the initial CT scan after the contouring was completed. The two sets of CT images were fused together with respect to the applicator, using landmark registration. The doses of each plan were imported as well, and a cumulative dosimetric analysis was made for bladder, bowels, rectum and PTV. Results: No contour of any of the OARs was exactly the same when CT images were fused onto each other. The PTV volumes varied from fraction to fraction. There was always a difference between the doses received by the OARs between treatments. The maximum dose varied between 5% and 30% in rectum and bladder. The minimum dose varied between 5% and 8% in rectum and bladder. The average dose varied between 15% and 20% in rectum and bladder. Deviations in placement were noticed between fractions. Conclusion: The variation in the volumes of OARs and in the isodoses near the OARs indicates that the doses to OARs estimated by the planning system may not be the same doses delivered to the patient in all fractions. There are no major differences between the prescribed dose and the delivered dose over the total number of fractions. In some cases the critical organs would benefit if consecutive plans were made after registering each CT scan with the initial scan and then replanning.

  5. A study of the consistent and the lumped source approximations in finite element neutron diffusion calculations

    International Nuclear Information System (INIS)

    Ozgener, B.; Ozgener, H.A.

    1991-01-01

    In finite element formulations for the solution of the within-group neutron diffusion equation, two different treatments are possible for the group source term: the consistent source approximation (CSA) and the lumped source approximation (LSA). CSA results in intra-group scattering and fission matrices which have the same nondiagonal structure as the global coefficient matrix. This situation might be regarded as a disadvantage compared to the conventional (i.e. finite difference) methods, where the intra-group scattering and fission matrices are diagonal. To overcome this disadvantage, LSA can be used to diagonalize these matrices. LSA is akin to the lumped mass approximation of continuum mechanics. We concentrate on two different aspects of the source approximations. Although it has been reported that LSA does not modify the asymptotic h² convergence behaviour for linear elements, the effect of LSA on the convergence of higher-degree elements has not been investigated. Thus, we are interested in determining p, the asymptotic order of convergence, in: Δk ≡ |k_eff(analytical) − k_eff(finite element)| = Ch^p (1) for finite element approximations of varying degree (N) with both of the source approximations. Since (1) is valid in the asymptotic limit, we must use ultra-fine meshes and quadruple precision arithmetic. For our order-of-convergence study, we used infinite cylindrical geometry with azimuthal symmetry; hence, the effects of singularities remain uninvestigated. The second aspect we dwell on is the performance of LSA in bilinear 3-D finite element calculations, compared to CSA. LSA has been used quite extensively in 1- and 2-D even-parity transport and diffusion calculations. In this work, we will try to assess the relative merits of LSA and CSA in 3-D problems. (author)
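The asymptotic order p in Eq. (1) is typically estimated from the errors on successively refined meshes by taking log-ratios; the sketch below is illustrative only (hypothetical helper names, not the authors' code) and assumes the error follows Δk = Ch^p:

```python
import math

def convergence_order(h_values, errors):
    """Estimate p in error = C * h**p from errors on successively refined meshes."""
    return [
        math.log(errors[i - 1] / errors[i]) / math.log(h_values[i - 1] / h_values[i])
        for i in range(1, len(h_values))
    ]

# Synthetic check: errors generated from C*h^2 must recover p = 2.
hs = [0.1, 0.05, 0.025]
es = [3.0 * h ** 2 for h in hs]
print(convergence_order(hs, es))
```

In practice the estimate stabilizes only on very fine meshes, which is why the abstract notes the need for ultra-fine meshes and quadruple-precision arithmetic.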

  6. Consistent interactive segmentation of pulmonary ground glass nodules identified in CT studies

    Science.gov (United States)

    Zhang, Li; Fang, Ming; Naidich, David P.; Novak, Carol L.

    2004-05-01

    Ground glass nodules (GGNs) have proved especially problematic in lung cancer diagnosis, as despite frequently being malignant they characteristically have extremely slow rates of growth. This problem is further magnified by the small size of many of these lesions now being routinely detected following the introduction of multislice CT scanners capable of acquiring contiguous high resolution 1 to 1.25 mm sections throughout the thorax in a single breathhold period. Although segmentation of solid nodules can be used clinically to determine volume doubling times quantitatively, reliable methods for segmentation of pure ground glass nodules have yet to be introduced. Our purpose is to evaluate a newly developed computer-based segmentation method for rapid and reproducible measurements of pure ground glass nodules. 23 pure or mixed ground glass nodules were identified in a total of 8 patients by a radiologist and subsequently segmented by our computer-based method using Markov random field and shape analysis. The computer-based segmentation was initialized by a click point. Methodological consistency was assessed using the overlap ratio between 3 segmentations initialized by 3 different click points for each nodule. The 95% confidence interval on the mean of the overlap ratios proved to be [0.984, 0.998]. The computer-based method failed on two nodules that were difficult to segment even manually either due to especially low contrast or markedly irregular margins. While achieving consistent manual segmentation of ground glass nodules has proven problematic most often due to indistinct boundaries and interobserver variability, our proposed method introduces a powerful new tool for obtaining reproducible quantitative measurements of these lesions. It is our intention to further document the value of this approach with a still larger set of ground glass nodules.
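The abstract does not state which overlap-ratio definition was used; a common choice is the Jaccard ratio over voxel sets. A minimal sketch of the consistency measure across click-point initializations (hypothetical helper names, not the authors' implementation):

```python
from itertools import combinations

def overlap_ratio(seg_a, seg_b):
    """Jaccard overlap between two segmentations given as sets of voxel indices."""
    union = seg_a | seg_b
    return len(seg_a & seg_b) / len(union) if union else 1.0

def mean_pairwise_overlap(segmentations):
    """Mean overlap over all pairs of segmentations (e.g. three click-point runs)."""
    pairs = list(combinations(segmentations, 2))
    return sum(overlap_ratio(a, b) for a, b in pairs) / len(pairs)

# Two toy segmentation masks sharing 3 of 4 voxels: overlap = 3/4.
run1 = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 1, 1)}
run2 = {(0, 0, 0), (0, 0, 1), (0, 1, 0)}
print(overlap_ratio(run1, run2))
```

Averaging such ratios over the three initializations per nodule yields the per-nodule consistency score whose 95% confidence interval is reported above.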

  7. Self-consistent DFT+U method for real-space time-dependent density functional theory calculations

    Science.gov (United States)

    Tancogne-Dejean, Nicolas; Oliveira, Micael J. T.; Rubio, Angel

    2017-12-01

    We implemented various DFT+U schemes, including the Agapito, Curtarolo, and Buongiorno Nardelli functional (ACBN0) self-consistent density-functional version of the DFT+U method [Phys. Rev. X 5, 011006 (2015), 10.1103/PhysRevX.5.011006] within the massively parallel real-space time-dependent density functional theory (TDDFT) code octopus. We further extended the method to the case of the calculation of response functions with real-time TDDFT+U and to the description of noncollinear spin systems. The implementation is tested by investigating the ground-state and optical properties of various transition-metal oxides, bulk topological insulators, and molecules. Our results are found to be in good agreement with previously published results for both the electronic band structure and structural properties. The self-consistent calculated values of U and J are also in good agreement with the values commonly used in the literature. We found that the time-dependent extension of the self-consistent DFT+U method yields improved optical properties when compared to the empirical TDDFT+U scheme. This work thus opens a different theoretical framework to address the nonequilibrium properties of correlated systems.

  8. Assessment of rigid multi-modality image registration consistency using the multiple sub-volume registration (MSR) method

    International Nuclear Information System (INIS)

    Ceylan, C; Heide, U A van der; Bol, G H; Lagendijk, J J W; Kotte, A N T J

    2005-01-01

    Registration of different imaging modalities such as CT, MRI, functional MRI (fMRI), positron (PET) and single photon (SPECT) emission tomography is used in many clinical applications. Determining the quality of any automatic registration procedure has been challenging because no gold standard is available to evaluate the registration. In this note we present a method, called the 'multiple sub-volume registration' (MSR) method, for assessing the consistency of a rigid registration. This is done by registering sub-images of one data set on the other data set, performing a crude non-rigid registration. By analysing the deviations (local deformations) of the sub-volume registrations from the full registration we get a measure of the consistency of the rigid registration. Registration of 15 data sets, which include CT, MR and PET images for brain, head and neck, cervix, prostate and lung, was performed utilizing a rigid body registration with normalized mutual information as the similarity measure. The resulting registrations were classified as good or bad by visual inspection. The resulting registrations were also classified using our MSR method. The results of our MSR method agree with the classification obtained from visual inspection for all cases (p < 0.02 based on ANOVA of the good and bad groups). The proposed method is independent of the registration algorithm and similarity measure. It can be used for multi-modality image data sets and different anatomic sites of the patient. (note)
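The deviation analysis described above can be sketched as follows; this translation-only toy (hypothetical function names) ignores the rotational part of the rigid transforms that the MSR method actually compares:

```python
import math

def sub_volume_deviations(full_shift, sub_shifts):
    """Deviation of each sub-volume registration result from the full registration.
    Translation-only sketch: each shift is a 3-vector of the recovered translation."""
    return [math.dist(full_shift, s) for s in sub_shifts]

def mean_deviation(full_shift, sub_shifts):
    """Average local deformation; larger values suggest a less consistent rigid fit."""
    devs = sub_volume_deviations(full_shift, sub_shifts)
    return sum(devs) / len(devs)

# Full-volume translation vs. three sub-volume translations (toy numbers).
full = (1.0, 0.0, 2.0)
subs = [(1.0, 0.0, 2.0), (1.5, 0.0, 2.0), (1.0, 0.5, 2.0)]
print(mean_deviation(full, subs))
```

A threshold on this statistic (or an ANOVA across cases, as in the note) then separates consistent from inconsistent registrations.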

  9. Consistency of parametric registration in serial MRI studies of brain tumor progression

    International Nuclear Information System (INIS)

    Mang, Andreas; Buzug, Thorsten M.; Schnabel, Julia A.; Crum, William R.; Modat, Marc; Ourselin, Sebastien; Hawkes, David J.; Camara-Rey, Oscar; Palm, Christoph; Caseiras, Gisele Brasil; Jaeger, H.R.

    2008-01-01

    The consistency of parametric registration in multi-temporal magnetic resonance (MR) imaging studies was evaluated. Serial MRI scans of adult patients with a brain tumor (glioma) were aligned by parametric registration. The performance of low-order spatial alignment (6/9/12 degrees of freedom) of different 3D serial MR-weighted images is evaluated. A registration protocol for the alignment of all images to one reference coordinate system at baseline is presented. Registration results were evaluated for both multimodal intra-timepoint and mono-modal multi-temporal registration. The latter case might present a challenge to automatic intensity-based registration algorithms due to ill-defined correspondences. The performance of our algorithm was assessed by testing the inverse registration consistency. Four different similarity measures were evaluated to assess consistency. Careful visual inspection suggests that images are well aligned, but their consistency may be imperfect. Sub-voxel inconsistency within the brain was found for all similarity measures used for parametric multi-temporal registration. T1-weighted images were most reliable for establishing spatial correspondence between different timepoints. The parametric registration algorithm is feasible for use in this application. The sub-voxel resolution mean displacement error of registration transformations demonstrates that the algorithm converges to an almost identical solution for forward and reverse registration. (orig.)
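Inverse registration consistency composes the forward and reverse transforms and measures the residual displacement; a translation-only sketch under that simplifying assumption (the study used full parametric transforms):

```python
import math

def inverse_consistency_error(forward_t, reverse_t):
    """Residual displacement after composing forward and reverse registrations.
    Translation-only sketch: a perfectly consistent pair satisfies reverse = -forward."""
    residual = [f + r for f, r in zip(forward_t, reverse_t)]
    return math.hypot(*residual)

# Toy forward/reverse translations leaving a small residual along z.
print(inverse_consistency_error((1.2, -0.4, 0.3), (-1.2, 0.4, -0.2)))
```

Averaging such residuals over many voxels gives the mean displacement error; a sub-voxel value, as reported above, indicates the forward and reverse solutions nearly coincide.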

  10. Self-consistent simulation studies of periodically focused intense charged-particle beams

    International Nuclear Information System (INIS)

    Chen, C.; Jameson, R.A.

    1995-01-01

    A self-consistent two-dimensional model is used to investigate intense charged-particle beam propagation through a periodic solenoidal focusing channel, particularly in the regime in which there is a mismatch between the beam and the focusing channel. The present self-consistent studies confirm that mismatched beams exhibit nonlinear resonances and chaotic behavior in the envelope evolution, as predicted by an earlier envelope analysis [C. Chen and R. C. Davidson, Phys. Rev. Lett. 72, 2195 (1994)]. Transient effects due to emittance growth are studied, and halo formation is investigated. The halo size is estimated. The halo characteristics for a periodic focusing channel are found to be qualitatively the same as those for a uniform focusing channel. A threshold condition is obtained numerically for halo formation in mismatched beams in a uniform focusing channel, which indicates that relative envelope mismatch must be kept well below 20% to prevent space-charge-dominated beams from developing halos

  11. Daily Behavior Report Cards: An Investigation of the Consistency of On-Task Data across Raters and Methods

    Science.gov (United States)

    Chafouleas, Sandra M.; Riley-Tillman, T. Chris; Sassu, Kari A.; LaFrance, Mary J.; Patwa, Shamim S.

    2007-01-01

    In this study, the consistency of on-task data collected across raters using either a Daily Behavior Report Card (DBRC) or systematic direct observation was examined to begin to understand the decision reliability of using DBRCs to monitor student behavior. Results suggested very similar conclusions might be drawn when visually examining data…

  12. Modeling of the 3RS tau protein with self-consistent field method and Monte Carlo simulation

    NARCIS (Netherlands)

    Leermakers, F.A.M.; Jho, Y.S.; Zhulina, E.B.

    2010-01-01

    Using a model with amino acid resolution of the 196 aa N-terminus of the 3RS tau protein, we performed both a Monte Carlo study and a complementary self-consistent field (SCF) analysis to obtain detailed information on conformational properties of these moieties near a charged plane (mimicking the

  13. Self-consistent beam halo studies & halo diagnostic development in a continuous linear focusing channel

    International Nuclear Information System (INIS)

    Jameson, R.A.

    1994-01-01

    Beam halos are formed via self-consistent motion of the beam particles. Interactions of single particles with time-varying density distributions of other particles are a major source of halo. Aspects of these interactions are studied for an initially equilibrium distribution in a radial, linear, continuous focusing system. When there is a mismatch, it is shown that in the self-consistent system, there is a threshold in space-charge and mismatch, above which a halo is formed that extends to ∼1.5 times the initial maximum mismatch radius. Tools are sought for characterizing the halo dynamics. Testing the particles against the width of the mismatch driving resonance is useful for finding a conservative estimate of the threshold. The exit, entering and transition times, and the time evolution of the halo, are also explored using this technique. Extension to higher dimensions is briefly discussed

  14. Study of Experiment on Rock-like Material Consisting of Fly-ash, Cement and Mortar

    Science.gov (United States)

    Nan, Qin; Hongwei, Wang; Yongyan, Wang

    2018-03-01

    We studied the uniaxial compression behavior of rock-like material consisting of fly ash, cement and mortar by varying the sand-cement ratio, fly-ash replacement ratio, grain diameter, water-binder ratio and height-diameter ratio, and obtained the law relating these factors to the material's uniaxial compression characteristics, together with the quantitative relations. The effects can be summarized as follows: the sample's uniaxial compressive strength and elasticity modulus tend to decrease with increasing sand-cement ratio, fly-ash replacement ratio and water-binder ratio, following a power-function relation. As the height-diameter ratio increases, the uniaxial compressive strength and elastic modulus decrease, following an inverse-function curve; the specimen's tensile strength decreases gradually with increasing fly-ash content. The uniaxial compression failure phenomenon is consistent with the common failure patterns of real rock.

  15. Gate-controlled current and inelastic electron tunneling spectrum of benzene: a self-consistent study.

    Science.gov (United States)

    Liang, Y Y; Chen, H; Mizuseki, H; Kawazoe, Y

    2011-04-14

    We use a density functional theory based nonequilibrium Green's function method to self-consistently study the current through 1,4-benzenedithiol (BDT). The elastic and inelastic tunneling properties through this Au-BDT-Au molecular junction are simulated. For the elastic tunneling case, it is found that the current through the tilted molecule can be modulated effectively by the external gate field, which is perpendicular to the phenyl ring. The gate voltage amplification comes from the modulation of the interaction between the electrodes and the molecules in the junctions. For the inelastic case, the electron tunneling scattered by the molecular vibrational modes is considered within the self-consistent Born approximation scheme, and the inelastic electron tunneling spectrum is calculated.

  16. The Consistency of Performance Management System Based on Attributes of the Performance Indicator: An Empirical Study

    Directory of Open Access Journals (Sweden)

    Jan Zavadsky

    2014-07-01

    Full Text Available Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational level. Effectiveness of the various management systems depends on many factors. One of them is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. The consistency in this case is based on the homogeneous definition of attributes relating to the performance indicator as a basic element of PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and to group various attributes of performance indicators. The main research results were achieved through an empirical study, carried out in a sample of Slovak companies. The criterion for selection was the existence of certified management systems according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared test (χ² test) with respect to the above standards. Findings: Coming from the review of various literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of PMS is based not on a maximum or minimum number of attributes, but on the same type of attributes for each performance indicator used in PMS at both the operational and strategic level. The main findings are: companies use various financial and non-financial indicators at the strategic or operational level; companies determine various attributes of the performance indicator, but most of the performance indicators are determined differently; we identified the common attributes for the whole sample of companies. Practical implications: The research results have an implication for

  17. The Internal Consistency and Validity of the Vaccination Attitudes Examination Scale: A Replication Study.

    Science.gov (United States)

    Wood, Louise; Smith, Michael; Miller, Christopher B; O'Carroll, Ronan E

    2018-06-19

    Vaccinations are important preventative health behaviors. The recently developed Vaccination Attitudes Examination (VAX) Scale aims to measure the reasons behind refusal/hesitancy regarding vaccinations. The aim of this replication study is to conduct an independent test of the newly developed VAX Scale in the UK. We tested (a) internal consistency (Cronbach's α); (b) convergent validity by assessing its relationships with beliefs about medication, medical mistrust, and perceived sensitivity to medicines; and (c) construct validity by testing how well the VAX Scale discriminated between vaccinators and nonvaccinators. A sample of 243 UK adults completed the VAX Scale, the Beliefs About Medicines Questionnaire, the Perceived Sensitivity to Medicines Scale, and the Medical Mistrust Index, in addition to demographics of age, gender, education levels, and social deprivation. Participants were asked (a) whether they received an influenza vaccination in the past year and (b) if they had a young child, whether they had vaccinated the young child against influenza in the past year. The VAX (a) demonstrated high internal consistency (α = .92); (b) was positively correlated with medical mistrust and beliefs about medicines, and less strongly correlated with perceived sensitivity to medicines; and (c) successfully differentiated parental influenza vaccinators from nonvaccinators. The VAX demonstrated good internal consistency, convergent validity, and construct validity in an independent UK sample. It appears to be a useful measure to help us understand the health beliefs that promote or deter vaccination behavior.
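Cronbach's α, the internal-consistency statistic reported above (α = .92), can be computed from item-level scores with the standard formula; a small sketch (not the authors' analysis code):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items` is one list of scores per scale item, each containing
    one score per respondent; uses population variances throughout."""
    k = len(items)
    total_scores = [sum(scores) for scores in zip(*items)]
    item_var_sum = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(total_scores))
```

For perfectly redundant items the statistic reaches its ceiling of 1.0; values above roughly .9, as in the VAX sample, indicate very high internal consistency.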

  18. Parental depressive and anxiety symptoms during pregnancy and attention problems in children: A cross-cohort consistency study

    OpenAIRE

    Batenburg-Eddes, Tamara; Brion, Maria; Henrichs, Jens; Jaddoe, Vincent; Hofman, Albert; Verhulst, Frank; Lawlor, Debbie; Davey-Smith, George; Tiemeier, Henning

    2013-01-01

    Background: Maternal depression and anxiety during pregnancy have been associated with offspring-attention deficit problems. Aim: We explored possible intrauterine effects by comparing maternal and paternal symptoms during pregnancy, by investigating cross-cohort consistency, and by investigating whether parental symptoms in early childhood may explain any observed intrauterine effect. Methods: This study was conducted in two cohorts (Generation R, n = 2,280 and ALSPAC, n = 3,442). Pregnant w...

  19. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    Science.gov (United States)

    Banker, J.G.; Anderson, R.C.

    1975-10-21

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure.

  20. Method and apparatus for fabricating a composite structure consisting of a filamentary material in a metal matrix

    International Nuclear Information System (INIS)

    Banker, J.G.; Anderson, R.C.

    1975-01-01

    A method and apparatus are provided for preparing a composite structure consisting of filamentary material within a metal matrix. The method is practiced by the steps of confining the metal for forming the matrix in a first chamber, heating the confined metal to a temperature adequate to effect melting thereof, introducing a stream of inert gas into the chamber for pressurizing the atmosphere in the chamber to a pressure greater than atmospheric pressure, confining the filamentary material in a second chamber, heating the confined filamentary material to a temperature less than the melting temperature of the metal, evacuating the second chamber to provide an atmosphere therein at a pressure, placing the second chamber in registry with the first chamber to provide for the forced flow of the molten metal into the second chamber to effect infiltration of the filamentary material with the molten metal, and thereafter cooling the metal infiltrated-filamentary material to form said composite structure

  1. Self-consistent ab initio Calculations for Photoionization and Electron-Ion Recombination Using the R-Matrix Method

    Science.gov (United States)

    Nahar, S. N.

    2003-01-01

    Most astrophysical plasmas entail a balance between ionization and recombination. We present new results from a unified method for self-consistent and ab initio calculations for the inverse processes of photoionization and (e + ion) recombination. The treatment for (e + ion) recombination subsumes the non-resonant radiative recombination and the resonant dielectronic recombination processes in a unified scheme (S.N. Nahar and A.K. Pradhan, Phys. Rev. A 49, 1816 (1994); H.L. Zhang, S.N. Nahar, and A.K. Pradhan, J. Phys. B 32, 1459 (1999)). Calculations are carried out using the R-matrix method in the close coupling approximation, using an identical wavefunction expansion for both processes to ensure self-consistency. The results for photoionization and recombination cross sections may also be compared with state-of-the-art experiments on synchrotron radiation sources for photoionization, and on heavy ion storage rings for recombination. The new experiments display heretofore unprecedented detail in terms of resonances and background cross sections and thereby calibrate the theoretical data precisely. We find a level of agreement between theory and experiment at about 10% for not only the ground state but also the metastable states. The recent experiments therefore verify the estimated accuracy of the vast amount of photoionization data computed under the OP, IP and related works. Present work also reports photoionization cross sections including relativistic effects in the Breit-Pauli R-matrix (BPRM) approximation. Detailed features in the calculated cross sections exhibit the missing resonances due to fine structure. Self-consistent datasets for photoionization and recombination have so far been computed for approximately 45 atoms and ions. These are being reported in a continuing series of publications in Astrophysical J. Supplements (e.g. references below).
These data will also be available from the electronic database TIPTOPBASE (http://heasarc.gsfc.nasa.gov)

  2. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Science.gov (United States)

    Bernabe-Ortiz, Antonio; Carcamo, Cesar P; Scott, John D; Hughes, James P; Garcia, Patricia J; Holmes, King K

    2011-01-01

    Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using a multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination), especially in the jungle.
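The odds ratios above come from multivariable, survey-weighted regression; for illustration only, the unadjusted analogue for a single exposure can be sketched as an odds ratio with a Wald confidence interval from a 2×2 table (hypothetical counts):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio with Wald CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Toy counts: 10/90 anti-HBc+ among consistent condom users vs 20/80 among others.
print(odds_ratio_ci(10, 90, 20, 80))
```

An OR below 1 with a CI excluding 1, as for consistent condom use above (OR = 0.34; 95%CI 0.15-0.79), indicates a protective association.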

  3. HBV infection in relation to consistent condom use: a population-based study in Peru.

    Directory of Open Access Journals (Sweden)

    Antonio Bernabe-Ortiz

    Full Text Available Data on hepatitis B virus (HBV) prevalence are limited in developing countries. There is also limited information on the efficacy of consistent condom use for reducing HBV transmission at the population level. The study goal was to evaluate the prevalence and factors associated with HBV infection in Peru, and the relationship between anti-HBc positivity and consistent condom use. Data from two different surveys performed in 28 mid-sized Peruvian cities were analyzed. Participants aged 18-29 years were selected using a multistage cluster sampling. Information was collected through a validated two-part questionnaire. The first part (face-to-face) concerned demographic data, while the second part (self-administered using handheld computers) concerned sexual behavior. Hepatitis B core antibody (anti-HBc) was tested in 7,000 blood samples. Prevalences and associations were adjusted for sample strata, primary sampling units and population weights. Anti-HBc prevalence was 5.0% (95%CI 4.1%-5.9%), with the highest prevalence among jungle cities: 16.3% (95%CI 13.8%-19.1%). In the multivariable analysis, anti-HBc positivity was directly associated with geographic region (highlands OR = 2.05; 95%CI 1.28-3.27, and jungle OR = 4.86; 95%CI 3.05-7.74; compared to coastal region); and inversely associated with age at sexual debut (OR = 0.90; 95%CI 0.85-0.97). Consistent condom use, evaluated in about 40% of participants, was associated with reduced prevalence (OR = 0.34; 95%CI 0.15-0.79) after adjusting for gender, geographic region, education level, lifetime number of sex partners, age at sexual debut and year of survey. Residence in highlands or jungle cities is associated with higher anti-HBc prevalences, whereas increasing age at sexual debut was associated with lower prevalences. Consistent condom use was associated with decreased risk of anti-HBc. Findings from this study emphasize the need for primary prevention programs (vaccination) especially in the

  4. Geometry of the self-consistent collective-coordinate method for the large-amplitude collective motion

    International Nuclear Information System (INIS)

    Sakata, Fumihiko; Marumori, Toshio; Hashimoto, Yukio; Une, Tsutomu.

    1983-05-01

    The geometry of the self-consistent collective-coordinate (SCC) method formulated within the framework of the time-dependent Hartree-Fock (TDHF) theory is investigated by associating the variational parameters with a symplectic manifold (a TDHF manifold). With the use of a canonical-variables parametrization, it is shown that the TDHF equation is equivalent to the canonical equations of motion of classical mechanics in the TDHF manifold. This enables us to investigate the geometrical structure of the SCC method in the language of classical mechanics. The SCC method turns out to give a prescription for how to dynamically extract a 'maximally decoupled' collective submanifold (hypersurface) out of the TDHF manifold, in such a way that a certain kind of trajectories corresponding to the large-amplitude collective motion under consideration can be reproduced on the hypersurface as precisely as possible. The stability of the hypersurface at each point on it is investigated, in order to see whether the hypersurface obtained by the SCC method is really an approximate integral surface in the TDHF manifold or not. (author)

  5. Neuroimaging studies of word and pseudoword reading: consistencies, inconsistencies, and limitations.

    Science.gov (United States)

    Mechelli, Andrea; Gorno-Tempini, Maria Luisa; Price, Cathy J

    2003-02-15

    Several functional neuroimaging studies have compared words and pseudowords to test different cognitive models of reading. There are difficulties with this approach, however, because cognitive models do not make clear-cut predictions at the neural level. Therefore, results can only be interpreted on the basis of prior knowledge of cognitive anatomy. Furthermore, studies comparing words and pseudowords have produced inconsistent results. The inconsistencies could reflect false-positive results due to the low statistical thresholds applied or confounds from nonlexical aspects of the stimuli. Alternatively, they may reflect true effects that are inconsistent across subjects; dependent on experimental parameters such as stimulus rate or duration; or not replicated across studies because of insufficient statistical power. In this fMRI study, we investigate consistent and inconsistent differences between word and pseudoword reading in 20 subjects, and distinguish between effects associated with increases and decreases in activity relative to fixation. In addition, the interaction of word type with stimulus duration is explored. We find that words and pseudowords activate the same set of regions relative to fixation, and within this system, there is greater activation for pseudowords than words in the left frontal operculum, left posterior inferior temporal gyrus, and the right cerebellum. The only effects of words relative to pseudowords consistent over subjects are due to decreases in activity for pseudowords relative to fixation; and there are no significant interactions between word type and stimulus duration. Finally, we observe inconsistent but highly significant effects of word type at the individual subject level. These results (i) illustrate that pseudowords place increased demands on areas that have previously been linked to lexical retrieval, and (ii) highlight the importance of including one or more baselines to qualify word type effects. Furthermore, (iii

  6. MRI Study on the Functional and Spatial Consistency of Resting State-Related Independent Components of the Brain Network

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Bum Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)]; Choi, Jee Wook [Daejeon St. Mary's Hospital, The Catholic University of Korea College of Medicine, Daejeon (Korea, Republic of)]; Kim, Ji Woong [College of Medical Science, Konyang University, Daejeon (Korea, Republic of)]

    2012-06-15

    Resting-state networks (RSNs), including the default mode network (DMN), have been considered as markers of brain status such as consciousness, developmental change, and treatment effects. The consistency of functional connectivity among RSNs has not been fully explored, especially among resting-state-related independent components (RSICs). This resting-state fMRI study addressed the consistency of functional connectivity among RSICs, as well as their spatial consistency, between day 1 and 4 weeks later in 13 healthy volunteers. We found that most RSICs, especially the DMN, are reproducible across time, whereas some RSICs were variable in either their spatial characteristics or their functional connectivity. Relatively low spatial consistency was found in the basal ganglia, a parietal region of the left frontoparietal network, and the supplementary motor area. The functional connectivity between two independent components, the bilateral angular/supramarginal gyri/intraparietal lobule and the bilateral middle temporal/occipital gyri, decreased across time regardless of the correlation analysis method employed (Pearson's or partial correlation). The RSICs showing variable consistency differed between spatial characteristics and functional connectivity. To understand the brain as a dynamic network, we recommend further investigation of both changes in the activation of specific regions and the modulation of functional connectivity in the brain network.
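The comparison of Pearson's and partial correlation between component time courses can be illustrated with a small sketch. This is a toy on synthetic signals, not the study's analysis pipeline; the regression-based partial correlation and the simulated "components" are assumptions for demonstration only.

```python
import numpy as np

def pearson_corr(x, y):
    """Plain Pearson correlation between two time courses."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

def partial_corr(x, y, z):
    """Correlation between x and y after regressing out the
    confounding time course(s) z (here: a third component)."""
    design = np.column_stack([np.ones(len(x)), z])  # intercept + confound
    rx = x - design @ np.linalg.lstsq(design, x, rcond=None)[0]
    ry = y - design @ np.linalg.lstsq(design, y, rcond=None)[0]
    return pearson_corr(rx, ry)

rng = np.random.default_rng(0)
driver = rng.normal(size=200)                # shared third component
x = driver + 0.1 * rng.normal(size=200)      # component time course 1
y = driver + 0.1 * rng.normal(size=200)      # component time course 2

r_full = pearson_corr(x, y)          # high: both track the shared driver
r_part = partial_corr(x, y, driver)  # near zero once the driver is removed
```

With a shared driver present, the plain Pearson correlation between the two components is high, while the partial correlation, which removes the driver's contribution, falls to near zero — which is why the two methods can disagree about whether connectivity changed.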

  7. MRI Study on the Functional and Spatial Consistency of Resting State-Related Independent Components of the Brain Network

    International Nuclear Information System (INIS)

    Jeong, Bum Seok; Choi, Jee Wook; Kim, Ji Woong

    2012-01-01

    Resting-state networks (RSNs), including the default mode network (DMN), have been considered as markers of brain status such as consciousness, developmental change, and treatment effects. The consistency of functional connectivity among RSNs has not been fully explored, especially among resting-state-related independent components (RSICs). This resting-state fMRI study addressed the consistency of functional connectivity among RSICs, as well as their spatial consistency, between day 1 and 4 weeks later in 13 healthy volunteers. We found that most RSICs, especially the DMN, are reproducible across time, whereas some RSICs were variable in either their spatial characteristics or their functional connectivity. Relatively low spatial consistency was found in the basal ganglia, a parietal region of the left frontoparietal network, and the supplementary motor area. The functional connectivity between two independent components, the bilateral angular/supramarginal gyri/intraparietal lobule and the bilateral middle temporal/occipital gyri, decreased across time regardless of the correlation analysis method employed (Pearson's or partial correlation). The RSICs showing variable consistency differed between spatial characteristics and functional connectivity. To understand the brain as a dynamic network, we recommend further investigation of both changes in the activation of specific regions and the modulation of functional connectivity in the brain network.

  8. Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design

    Energy Technology Data Exchange (ETDEWEB)

    Das, Sanjoy Kumar, E-mail: sanjoydasju@gmail.com; Khanam, Jasmina; Nanda, Arunabha

    2016-12-01

    In the present investigation, a simplex lattice mixture design was applied for the formulation development and optimization of a controlled-release dosage form of ketoprofen microspheres consisting of polymeric blends of ethylcellulose and Eudragit® RL 100, formed by an oil-in-oil emulsion solvent evaporation method. The investigation examined the effects of polymer amount, stirring speed and emulsifier concentration (% w/w) on percentage yield, average particle size, drug entrapment efficiency and in vitro drug release in 8 h from the microspheres. Analysis of variance (ANOVA) was used to estimate the significance of the models. Numerical optimization was carried out based on the desirability function approach. The optimized formulation (KTF-O) showed a close match between actual and predicted responses, with desirability factor 0.811. No adverse reaction between drug and polymers was observed on the basis of Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC) analysis. Scanning electron microscopy (SEM) was carried out to show the discreteness of the microspheres (149.2 ± 1.25 μm) and their surface conditions during pre- and post-dissolution operations. The drug release pattern from KTF-O was best explained by the Korsmeyer-Peppas and Higuchi models. The batch of optimized microspheres was found to have maximum entrapment (~ 90%), minimum loss (~ 10%) and prolonged drug release for 8 h (91.25%), which may be considered favourable criteria for a controlled-release dosage form. - Graphical abstract: Optimization of preparation method for ketoprofen-loaded microspheres consisting polymeric blends using simplex lattice mixture design. - Highlights: • Simplex lattice design was used to optimize ketoprofen-loaded microspheres. • Polymeric blend (ethylcellulose and Eudragit® RL 100) was used. • Microspheres were prepared by oil-in-oil emulsion solvent evaporation method. • Optimized formulation depicted favourable

  9. Providing Consistent (A)ATSR Solar Channel Radiometry for Climate Studies

    Science.gov (United States)

    Smith, D.; Latter, B. G.; Poulsen, C.

    2012-04-01

    Data from the solar reflectance channels of the Along Track Scanning Radiometer (ATSR) series of instruments are being used in applications for monitoring trends in clouds and aerosols. In order to provide quantitative information, the radiometric calibrations of the sensors must be consistent, stable and ideally traced to international standards. This paper describes the methods used to monitor the calibrations of the ATSR instruments to provide consistent Level 1b radiometric data sets. Comparisons of the in-orbit calibrations are made by reference to data from quasi-stable sites such as DOME-C in Antarctica or Saharan Desert sites. Comparisons are performed either by time-coincident match-ups of the radiometric data for sensors having similar spectral bands, view/solar geometry and overpass times, as for AATSR and MERIS; or via a reference BRDF model derived from averages of measurements over the site from a reference sensor where there is limited or no temporal overlap (e.g. MODIS-Aqua, ATSR-1 and ATSR-2). The results of the intercomparisons provide values of the long-term calibration drift and systematic biases between the sensors. Look-up tables based on smoothed averages of the drift measurements are used to provide the corrections to Level 1b data. The uncertainty budgets for the comparisons are provided. It is also possible to perform comparisons of measurements against high spectral resolution instruments that are co-located on the same platform, i.e. AATSR/SCIA on ENVISAT and ATSR-2/GOME on ERS-2. The comparisons are performed by averaging the spectrometer data to the spectral response of the filter radiometer, and averaging the radiometer data to the spatial resolution of the spectrometer. In this paper, the authors present the results of the inter-comparisons to achieve a consistent calibration for the solar channels of the complete ATSR dataset. An assessment of the uncertainties associated with the techniques will be discussed.
The impacts of the

  10. Self-Consistent Monte Carlo Study of the Coulomb Interaction under Nano-Scale Device Structures

    Science.gov (United States)

    Sano, Nobuyuki

    2011-03-01

    It has been pointed out that the Coulomb interaction between electrons is expected to be of crucial importance for predicting reliable device characteristics. In particular, device performance is greatly degraded due to plasmon excitation, represented by dynamical potential fluctuations in the highly doped source and drain regions, induced by the channel electrons. We employ self-consistent 3D Monte Carlo (MC) simulations, which reproduce both the correct mobility under various electron concentrations and the collective plasma waves, to study the physical impact of dynamical potential fluctuations on device performance in double-gate MOSFETs. The average force experienced by an electron due to the Coulomb interaction inside the device is evaluated by comparing the self-consistent MC simulations with fixed-potential MC simulations without the Coulomb interaction. Also, the band-tailing associated with the local potential fluctuations in the highly doped source region is quantitatively evaluated, and it is found that the band-tailing becomes strongly dependent on position in real space even inside the uniform source region. This work was partially supported by Grants-in-Aid for Scientific Research B (No. 2160160) from the Ministry of Education, Culture, Sports, Science and Technology in Japan.

  11. Flexibility of Gender Stereotypes: Italian Study on Comparative Gender-consistent and Gender-inconsistent Information

    Directory of Open Access Journals (Sweden)

    Elisabetta Sagone

    2018-05-01

    The topic of this study is flexibility in gender stereotyping linked to the attribution of toys, socio-cognitive traits, and occupations in 160 Italian children aged 6 to 12 years. We used the Gender Toys Choice, the Gender Traits Choice, and the Gender Jobs Choice: selected sets of colored cards containing masculine and feminine stimuli to assign to a male, female, or both male and female silhouette (the flexible-choice technique). In order to verify the change in flexibility of gender stereotyping, we made use of four cartoon stories with male and female characters with typical or atypical traits, performing gender-consistent or gender-inconsistent activities. Results indicated that exposure to cartoon stories with gender-inconsistent information, rather than gender-consistent information, increased flexibility in gender stereotyping, showing age differences in favor of children aged 11-12. Implications in relation to the developmental-constructivist approach are noted.

  12. A study of Consistency in the Selection of Search Terms and Search Concepts: A Case Study in National Taiwan University

    Directory of Open Access Journals (Sweden)

    Mu-hsuan Huang

    2001-12-01

    This article analyzes the consistency in the selection of search terms and search concepts among college and graduate students at National Taiwan University using the PsycLIT CD-ROM database. 31 students conducted pre-assigned searches, performing 59 searches and generating 609 search terms. The study finds that consistency in the selection of search terms is 22.14% at the first level and 35% at the second level; these results are similar to those of other studies. Regarding consistency in search concepts, both the overlap of retrieved articles and the overlap of articles judged relevant are lower than in other studies. [Article content in Chinese]
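Inter-searcher consistency of this kind is commonly quantified as the number of shared terms divided by the number of distinct terms either searcher used. A minimal sketch — the exact formula applied in the study may differ, and the example terms are invented:

```python
def term_consistency(a, b):
    """Consistency of two searchers' term sets: shared terms
    divided by all distinct terms either searcher used."""
    a, b = set(a), set(b)
    if not (a | b):
        return 0.0
    return len(a & b) / len(a | b)

# hypothetical term selections by two searchers for the same topic
s1 = {"memory", "children", "recall"}
s2 = {"memory", "recall", "learning", "kids"}
c = term_consistency(s1, s2)  # 2 shared / 5 distinct = 0.4
```

Averaging this pairwise measure over all pairs of searchers gives a group-level consistency figure comparable to the percentages reported above.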

  13. Communication: On the consistency of approximate quantum dynamics simulation methods for vibrational spectra in the condensed phase.

    Science.gov (United States)

    Rossi, Mariana; Liu, Hanchao; Paesani, Francesco; Bowman, Joel; Ceriotti, Michele

    2014-11-14

    Including quantum mechanical effects on the dynamics of nuclei in the condensed phase is challenging, because the complexity of exact methods grows exponentially with the number of quantum degrees of freedom. Efforts to circumvent these limitations can be traced back to two approaches: methods that treat a small subset of the degrees of freedom with rigorous quantum mechanics, considering the rest of the system as a static or classical environment, and methods that treat the whole system quantum mechanically, but using approximate dynamics. Here, we perform a systematic comparison between these two philosophies for the description of quantum effects in vibrational spectroscopy, taking the Embedded Local Monomer model and a mixed quantum-classical model as representatives of the first family of methods, and centroid molecular dynamics and thermostatted ring polymer molecular dynamics as examples of the latter. We use as benchmarks D2O doped with HOD and pure H2O at three distinct thermodynamic state points (ice Ih at 150 K, and the liquid at 300 K and 600 K), modeled with the simple q-TIP4P/F potential energy and dipole moment surfaces. With few exceptions the different techniques yield IR absorption frequencies that are consistent with one another within a few tens of cm^-1. Comparison with classical molecular dynamics demonstrates the importance of nuclear quantum effects up to the highest temperature, and a detailed discussion of the discrepancies between the various methods lets us draw some (circumstantial) conclusions about the impact of the very different approximations that underlie them. Such cross validation between radically different approaches could indicate a way forward to further improve the state of the art in simulations of condensed-phase quantum dynamics.

  14. Efficient implementation of three-dimensional reference interaction site model self-consistent-field method: application to solvatochromic shift calculations.

    Science.gov (United States)

    Minezawa, Noriyuki; Kato, Shigeki

    2007-02-07

    The authors present an implementation of the three-dimensional reference interaction site model self-consistent-field (3D-RISM-SCF) method. First, they introduce a robust and efficient algorithm for solving the 3D-RISM equation. The algorithm is a hybrid of the Newton-Raphson and Picard methods. The Jacobian matrix is analytically expressed in a computationally useful form. Second, they discuss the solute-solvent electrostatic interaction. For the solute to solvent route, the electrostatic potential (ESP) map on a 3D grid is constructed directly from the electron density. The charge fitting procedure is not required to determine the ESP. For the solvent to solute route, the ESP acting on the solute molecule is derived from the solvent charge distribution obtained by solving the 3D-RISM equation. Matrix elements of the solute-solvent interaction are evaluated by the direct numerical integration. A remarkable reduction in the computational time is observed in both routes. Finally, the authors implement the first derivatives of the free energy with respect to the solute nuclear coordinates. They apply the present method to "solute" water and formaldehyde in aqueous solvent using the simple point charge model, and the results are compared with those from other methods: the six-dimensional molecular Ornstein-Zernike SCF, the one-dimensional site-site RISM-SCF, and the polarizable continuum model. The authors also calculate the solvatochromic shifts of acetone, benzonitrile, and nitrobenzene using the present method and compare them with the experimental and other theoretical results.
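The hybrid Newton-Raphson/Picard strategy described above can be illustrated on a one-dimensional fixed-point problem. This is a toy analogue under stated assumptions, not the 3D-RISM solver itself: a few damped Picard sweeps provide robustness far from the solution, after which Newton-Raphson supplies fast local convergence.

```python
import math

def solve_hybrid(g, dg, x0, picard_steps=5, tol=1e-12):
    """Hybrid fixed-point solver for g(x) = x: damped Picard sweeps
    for robustness, then Newton-Raphson on f(x) = g(x) - x."""
    x = x0
    for _ in range(picard_steps):        # slow but stable phase
        x = 0.5 * x + 0.5 * g(x)         # damped Picard update
    for _ in range(50):                  # fast, locally quadratic phase
        f = g(x) - x
        if abs(f) < tol:
            break
        x -= f / (dg(x) - 1.0)           # Newton step on f(x)
    return x

# fixed point of g(x) = cos(x): the Dottie number, ~0.7390851
root = solve_hybrid(math.cos, lambda x: -math.sin(x), x0=1.0)
```

In the 3D-RISM setting the unknown is a whole correlation function rather than a scalar, and the Jacobian is the analytically expressed matrix mentioned in the abstract, but the division of labor between the two phases is the same.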

  15. Assessment of mastication in healthy children and children with cerebral palsy: a validity and consistency study.

    Science.gov (United States)

    Remijn, L; Speyer, R; Groen, B E; Holtus, P C M; van Limbeek, J; Nijhuis-van der Sanden, M W G

    2013-05-01

    The aim of this study was to develop the Mastication Observation and Evaluation instrument for observing and assessing the chewing ability of children eating solid and lumpy foods. This study describes the process of item definition and item selection and reports the content validity, reproducibility and consistency of the instrument. In the developmental phase, 15 experienced speech therapists assessed item relevance and descriptions over three Delphi rounds. Potential items were selected based on the results of a literature review. In the initial Delphi round, 17 potential items were included. After three Delphi rounds, 14 items regarded as providing distinctive value in the assessment of mastication (consensus >75%) were included in the Mastication Observation and Evaluation instrument. To test item reproducibility and consistency, two experts and five students evaluated video recordings of 20 children (10 children with cerebral palsy aged 29-65 months and 10 healthy children aged 11-42 months) eating bread and a biscuit. Reproducibility was estimated by means of the intraclass correlation coefficient (ICC). With the exception of one item concerning chewing duration, all items showed good to excellent intra-observer agreement (ICC students: 0.73-1.0). With the exception of chewing duration and number of swallows, inter-observer agreement was fair to excellent for all items (ICC experts: 0.68-1.0; ICC students: 0.42-1.0). Results indicate that this is a feasible instrument that could be used in clinical practice after further research on its reliability is completed. © 2013 Blackwell Publishing Ltd.
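For reference, an intraclass correlation of the kind used for rater agreement can be computed directly from the two-way ANOVA mean squares. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single rater) for a complete subjects-by-raters table; whether this exact ICC form matches the one used in the study is an assumption.

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    rater. `ratings` is an (n subjects x k raters) table of scores."""
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    rows = r.mean(axis=1)                # per-subject means
    cols = r.mean(axis=0)                # per-rater means
    msr = k * np.sum((rows - grand) ** 2) / (n - 1)   # subjects MS
    msc = n * np.sum((cols - grand) ** 2) / (k - 1)   # raters MS
    sse = np.sum((r - rows[:, None] - cols[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                   # residual MS
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

perfect = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])  # raters agree exactly
shifted = perfect + np.array([0, 1])  # rater 2 systematically scores +1
icc_perfect = icc2_1(perfect)  # 1.0
icc_shifted = icc2_1(shifted)  # < 1: absolute agreement penalizes the offset
```

The second example shows why an "absolute agreement" ICC drops when one rater is systematically stricter, even though the two raters rank the children identically.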

  16. Consistency Study About Critical Thinking Skill of PGSD Students (Teacher Candidate of Elementary School) on Energy Material

    Science.gov (United States)

    Wijayanti, M. D.; Raharjo, S. B.; Saputro, S.; Mulyani, S.

    2017-09-01

    This study examines the consistency of the critical thinking ability of PGSD students on the topic of energy. The study population comprised PGSD students at UNS Surakarta; cluster random sampling yielded 101 students. The consistency of students' responses can serve as a benchmark of their understanding, showing whether they recognize equivalent science problems, especially on energy, when these are presented through various phenomena. This research uses a descriptive method, with data obtained through questionnaires and interviews. The results show that critical thinking ability in this study falls into three levels: level 1 (54.85%), level 2 (19.93%), and level 3 (25.23%). These results point to students' weak understanding of energy material; in addition, scores on the indicators of identifying assumptions and analysing arguments are also still low. Ideally, consistent critical thinking ability as a whole supports the expansion of students' conceptual understanding. The results may serve as a reference for subsequent research aimed at producing positive changes in students' critical thinking ability, which in turn improves their understanding of concepts, especially energy, in various real problems.

  17. Self-consistent Study of Fast Particle Redistribution by Alfven Eigenmodes During Ion Cyclotron Resonance Heating

    International Nuclear Information System (INIS)

    Bergkvist, T.; Hellsten, T.; Johnson, T.

    2006-01-01

    Alfvén eigenmodes (AEs) excited by fusion-born α particles can degrade the heating efficiency of a burning plasma and expel the αs. To study experimentally the effects of AE excitation and the redistribution of fast ions, ion cyclotron resonance heating (ICRH) is often used. The distribution function of thermonuclear αs in a reactor is expected to be isotropic and constantly renewed through DT reactions. The distribution function of cyclotron-heated ions is strongly anisotropic, and ICRH not only renews the distribution function but also provides a strong decorrelation mechanism between the fast ions and the AE. Because of the sensitivity of the AE dynamics to the details of the distribution function, the location of the resonance surfaces in phase space and the extent of the overlapping resonant regions for different AEs, a self-consistent treatment of the AE excitation and the ICRH is necessary. The interaction of fast ions with AEs during ICRH has been implemented in the SELFO code. Simulations are in good agreement with the experimentally observed pitchfork splitting and rapid damping of the AE as ICRH is turned off. The redistribution of fast ions has been studied in the presence of several driven AEs. (author)

  18. Study of impurity effects on CFETR steady-state scenario by self-consistent integrated modeling

    Science.gov (United States)

    Shi, Nan; Chan, Vincent S.; Jian, Xiang; Li, Guoqiang; Chen, Jiale; Gao, Xiang; Shi, Shengyu; Kong, Defeng; Liu, Xiaoju; Mao, Shifeng; Xu, Guoliang

    2017-12-01

    Impurity effects on the fusion performance of the China Fusion Engineering Test Reactor (CFETR) due to extrinsic seeding are investigated. An integrated 1.5D modeling workflow evolves the plasma equilibrium and all transport channels to steady state. The One Modeling Framework for Integrated Tasks (OMFIT) is used to couple the transport solver, MHD equilibrium solver, and source and sink calculations. A self-consistent impurity profile constructed using a steady-state background plasma, which satisfies quasi-neutrality and true steady state, is presented for the first time. Studies are performed based on an optimized fully non-inductive scenario with varying concentrations of argon (Ar) seeding. It is found that fusion performance improves before dropping off with increasing Z_eff, while confinement remains at a high level. Further analysis of transport for these plasmas shows that low-k ion temperature gradient modes dominate the turbulence. The decrease in linear growth rate and the resultant fluxes of all channels with increasing Z_eff can be traced to the impurity profile change by transport. The improvement in confinement levels off at higher Z_eff. Over the regime of study there is a competition between suppressed transport and increasing radiation that leads to a peak in fusion performance at Z_eff (~2.78 for CFETR). Extrinsic impurity seeding to control the divertor heat load will need to be optimized around this value for best fusion performance.

  19. Self-Consistent-Field Method and τ-Functional Method on Group Manifold in Soliton Theory: a Review and New Results

    Directory of Open Access Journals (Sweden)

    Seiya Nishiyama

    2009-01-01

    The maximally-decoupled method has been considered as a theory applying a basic idea of an integrability condition to certain multiply parametrized symmetries. The method is regarded as a mathematical tool to describe a symmetry of a collective submanifold in which a canonicity condition makes the collective variables an orthogonal coordinate system. For this aim we adopt a concept of curvature unfamiliar in the conventional time-dependent (TD) self-consistent field (SCF) theory. Our basic idea lies in the introduction of a sort of Lagrange manner, familiar from fluid dynamics, to describe a collective coordinate system. This manner enables us to take a one-form which is linearly composed of a TD SCF Hamiltonian and infinitesimal generators induced by collective variable differentials of a canonical transformation on a group. The integrability condition of the system reads: curvature C = 0. Our method is constructed so as to manifest the structure of the group under consideration. To go beyond the maximally-decoupled method, we have aimed to construct an SCF theory, i.e., a υ (external parameter)-dependent Hartree-Fock (HF) theory. Toward such an ultimate goal, the υ-HF theory has been reconstructed on an affine Kac-Moody algebra along the lines of soliton theory, using infinite-dimensional fermions. An infinite-dimensional fermion operator is introduced through a Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent potential with a Υ-periodicity. A bilinear equation for the υ-HF theory has been transcribed onto the corresponding τ-function using the regular representation for the group and the Schur polynomials. The υ-HF SCF theory on an infinite-dimensional Fock space F∞ leads to a dynamics on an infinite-dimensional Grassmannian Gr∞ and may describe more precisely such a dynamics on the group manifold. A finite-dimensional Grassmannian is identified with a Gr

  20. ANTHEM: a two-dimensional multicomponent self-consistent hydro-electron transport code for laser-matter interaction studies

    International Nuclear Information System (INIS)

    Mason, R.J.

    1982-01-01

    The ANTHEM code for the study of CO2-laser-generated transport is outlined. ANTHEM treats the background plasma as coupled Eulerian thermal and ion fluids, and the suprathermal electrons as either a third fluid or a body of evolving collisional PIC particles. The electrons scatter off the ions; the suprathermals drag against the thermal background. Self-consistent E- and B-fields are computed by the Implicit Moment Method. The current status of the code is described. Typical output from ANTHEM is discussed, with special application to Augmented-Return-Current CO2-laser-driven targets.

  1. Studies on the consistency of internally taken contrast medium for pancreas CT

    Energy Technology Data Exchange (ETDEWEB)

    Matsushima, Kishio; Mimura, Seiichi; Tahara, Seiji; Kitayama, Takuichi; Inamura, Keiji; Mikami, Yasutaka; Hashimoto, Keiji; Hiraki, Yoshio; Aono, Kaname

    1985-02-01

    A problem in pancreatic CT scanning is the discrimination between the pancreas and the adjacent gastrointestinal tract. Generally we administer a dilution of gastrografin internally to make this discrimination possible. The degree of dilution has been decided by experience at each hospital. When the concentration of the contrast medium is too low, no enhancement effect can be expected, but when it is too high, artifacts appear. We experimented with the degree of dilution and the resulting CT numbers to decide the optimum concentration of gastrografin for the diagnosis of pancreatic disease. Statistical analysis of the results shows the optimum dilution of gastrografin to be 1.5%.

  2. A feasibility study on FP transmutation for Self-Consistent Nuclear Energy System (SCNES)

    International Nuclear Information System (INIS)

    Fujita, Reiko; Kawashima, Masatoshi; Ueda, Hiroaki; Takagi, Ryuzo; Matsuura, Haruaki; Fujii-e, Yoichi

    1997-01-01

    A fast reactor core/fuel cycle concept is discussed for the future 'Self-Consistent Nuclear Energy System (SCNES)' concept. The present study mainly discusses the long-lived fission product (LLFP) burning capability and recycle scheme in the framework of a metallic-fuel fast reactor cycle, aiming at the goals of fuel breeding capability and confinement of TRU and radioactive FPs within the system. In the present paper, the burning capability for Cs-135 and Zr-93 is mainly discussed from neutronic and chemical viewpoints, assuming a metallic fuel cycle system. Recent experimental results indicate that Cs can be separated along with the pyroprocess for the metal fuel recycle system, as previously designed for a candidate fuel cycle system. Combining neutron spectrum-shift for target sub-assemblies and isotope separation using tunable lasers, the LLFP burning capability is enhanced. This result indicates that major LLFPs can be treated in the additional recycle schemes to avoid LLFP accumulation along with energy production. In total, the proposed fuel cycle is a candidate for realizing the SCNES concept. (author)

  3. Interface Consistency

    DEFF Research Database (Denmark)

    Staunstrup, Jørgen

    1998-01-01

    This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactic check of parameter types. However, today it is possible to use more sophisticated forms involving semantic checks.

  4. A consistency-based feature selection method allied with linear SVMs for HIV-1 protease cleavage site prediction.

    Directory of Open Access Journals (Sweden)

    Orkun Oztürk

    BACKGROUND: Predicting type-1 Human Immunodeficiency Virus (HIV-1) protease cleavage sites in protein molecules and determining their specificity is an important task which has attracted considerable attention in the research community. Achievements in this area are expected to result in effective drug design (especially for HIV-1 protease inhibitors) against this life-threatening virus. However, some drawbacks (like the shortage of available training data and the high dimensionality of the feature space) turn this task into a difficult classification problem. Thus, various machine learning techniques, and specifically several classification methods, have been proposed in order to increase the accuracy of the classification model. In addition, for several classification problems characterized by few samples and many features, selecting the most relevant features is a major factor for increasing classification accuracy. RESULTS: We propose for HIV-1 data a consistency-based feature selection approach in conjunction with recursive feature elimination of support vector machines (SVMs). We used various classifiers for evaluating the results obtained from the feature selection process. We further demonstrated the effectiveness of our proposed method by comparing it with a state-of-the-art feature selection method applied on HIV-1 data, and we evaluated the reported results based on attributes selected from different combinations. CONCLUSION: Applying feature selection on training data before carrying out the classification task seems to be a reasonable data-mining process when working with types of data similar to HIV-1. On HIV-1 data, some feature selection or extraction operations in conjunction with different classifiers have been tested and noteworthy outcomes have been reported. These facts motivate the work presented in this paper. SOFTWARE AVAILABILITY: The software is available at http
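The two ingredients named in the abstract — a consistency-based filter and recursive feature elimination around a linear model — can be sketched as follows. This is illustrative only: a least-squares linear model stands in for the paper's linear SVM, and the scoring function is a simplified consistency measure, not the authors' exact criterion.

```python
import numpy as np

def consistency_score(x, y):
    """Simplified consistency measure: fraction of samples whose
    discrete feature value maps to its majority class
    (1.0 = the feature is fully consistent with the labels)."""
    total = 0
    for v in np.unique(x):
        total += np.bincount(y[x == v]).max()  # majority class count
    return total / len(y)

def rfe_linear(X, y, n_keep):
    """Recursive feature elimination: repeatedly fit a least-squares
    linear model and drop the feature with the smallest |weight|."""
    keep = list(range(X.shape[1]))
    target = np.where(y == 1, 1.0, -1.0)
    while len(keep) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, keep], target, rcond=None)
        keep.pop(int(np.argmin(np.abs(w))))    # weakest feature goes
    return keep

rng = np.random.default_rng(1)
y = np.repeat([0, 1], 20)                      # 40 labeled samples
signal = np.where(y == 1, 1.0, -1.0)
X = np.column_stack([signal + 0.01 * rng.normal(size=40),  # informative
                     rng.normal(size=40),                  # noise
                     rng.normal(size=40)])                 # noise
best = rfe_linear(X, y, n_keep=1)   # the informative column survives
```

In the paper's setting the filter would run first, pruning inconsistent features, and the elimination loop would use the trained SVM's weight vector instead of the least-squares fit.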

  5. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    Energy Technology Data Exchange (ETDEWEB)

    Xiao Sanshui; He Sailing

    2002-12-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z, although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods in air, with the high-frequency metallic model, are studied, and a complete band gap for some nonzero off-plane wave number k_z is found.

  6. FDTD method for computing the off-plane band structure in a two-dimensional photonic crystal consisting of nearly free-electron metals

    International Nuclear Information System (INIS)

    Xiao Sanshui; He Sailing

    2002-01-01

    An FDTD numerical method for computing the off-plane band structure of a two-dimensional photonic crystal consisting of nearly free-electron metals is presented. The method requires only a two-dimensional discretization mesh for a given off-plane wave number k_z, although the off-plane propagation is a three-dimensional problem. The off-plane band structures of a square lattice of metallic rods in air, with the high-frequency metallic model, are studied, and a complete band gap for some nonzero off-plane wave number k_z is found.

  7. Consistent dietary patterns identified from childhood to adulthood: the cardiovascular risk in Young Finns Study.

    Science.gov (United States)

    Mikkilä, V; Räsänen, L; Raitakari, O T; Pietinen, P; Viikari, J

    2005-06-01

    Dietary patterns are useful in nutritional epidemiology, providing a comprehensive alternative to the traditional approach based on single nutrients. The Cardiovascular Risk in Young Finns Study is a prospective cohort study with a 21-year follow-up. At baseline, detailed quantitative information on subjects' food consumption was obtained using a 48 h dietary recall method (n 1768, aged 3-18 years). The interviews were repeated after 6 and 21 years (n 1200 and n 1037, respectively). We conducted a principal component analysis to identify major dietary patterns at each study point. A set of two similar patterns was recognised throughout the study. Pattern 1 was positively correlated with consumption of traditional Finnish foods, such as rye, potatoes, milk, butter, sausages and coffee, and negatively correlated with fruit, berries and dairy products other than milk. Pattern 1 type of diet was more common among male subjects, smokers and those living in rural areas. Pattern 2, predominant among female subjects, non-smokers and in urban areas, was characterised by more health-conscious food choices such as vegetables, legumes and nuts, tea, rye, cheese and other dairy products, and also by consumption of alcoholic beverages. Tracking of the pattern scores was observed, particularly among subjects who were adolescents at baseline. Of those originally belonging to the uppermost quintile of pattern 1 and 2 scores, 41 and 38 % respectively, persisted in the same quintile 21 years later. Our results suggest that food behaviour and concrete food choices are established already in childhood or adolescence and may significantly track into adulthood.
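
    The principal component approach described above can be sketched in a few lines: rows are subjects, columns are food-group intakes, and components with high loadings define "dietary patterns", each subject receiving a score per pattern. The data below are synthetic with two latent diet factors, not the Young Finns records.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
traditional = rng.normal(size=n)              # latent "traditional diet" factor
health = rng.normal(size=n)                   # latent "health-conscious" factor
foods = np.column_stack([
    traditional + 0.3 * rng.normal(size=n),   # e.g. rye
    traditional + 0.3 * rng.normal(size=n),   # e.g. potatoes
    health + 0.3 * rng.normal(size=n),        # e.g. vegetables
    health + 0.3 * rng.normal(size=n),        # e.g. legumes
])

Z = (foods - foods.mean(0)) / foods.std(0)    # standardize intakes
U, s, Vt = np.linalg.svd(Z, full_matrices=False)  # PCA via SVD
scores = Z @ Vt.T                             # pattern scores per subject
explained = s**2 / np.sum(s**2)               # variance explained per pattern
print(explained[:2])                          # first two patterns dominate
```

Tracking over follow-ups would then be assessed by, e.g., cross-tabulating quintiles of `scores` computed at different study points.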

  8. Pressure variation of the valence band width in Ge: A self-consistent GW study

    DEFF Research Database (Denmark)

    Modak, Paritosh; Svane, Axel; Christensen, Niels Egede

    2009-01-01

    In the present work we report results of quasiparticle self-consistent GW (QSGW) band calculations for diamond- as well as β-tin-type Ge under pressure. For both phases we find that the band width increases with pressure. For β-tin Ge this agrees with experiment and density-functional theory, but for diamond Ge...

  9. [Consistency study of PowerPlex 21 kit and Goldeneye 20A kit and forensic application].

    Science.gov (United States)

    Ren, He; Liu, Ying; Zhang, Qing-Xia; Jiao, Zhang-Ping

    2014-06-01

    To assess the consistency of genotyping results between the PowerPlex 21 kit and the Goldeneye 20A kit, STR loci were amplified in DNA samples from 205 unrelated individuals of the Beijing Han population, and the typing concordance of the 19 overlapping STR loci was examined. The genetic polymorphism of the D1S1656 locus was also assessed. Typing results at all 19 overlapping loci were concordant, and the heterozygote peak-height ratios of the two kits showed no statistically significant difference (P > 0.05). For D1S1656, the observed heterozygosity was 0.878, the discrimination power was 0.949, the probability of paternity exclusion was 0.751 for trios and 0.506 for duos, and the polymorphism information content was 0.810. The PowerPlex 21 and Goldeneye 20A kits show good consistency, the primer design is reasonable, and D1S1656 is highly polymorphic. Both kits can be used for human genetic analysis, paternity testing, and individual identification in forensic practice.
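
    The two checks described, genotype concordance at overlapping loci and comparison of heterozygote peak-height ratios, can be sketched as below. The genotype calls and peak-height ratios are invented; the locus names are common STR loci used purely for illustration.

```python
import numpy as np
from scipy import stats

# Genotype calls (allele pairs) from the two kits at overlapping loci
calls_kit_a = {"D3S1358": (15, 16), "vWA": (17, 18), "FGA": (21, 24)}
calls_kit_b = {"D3S1358": (15, 16), "vWA": (17, 18), "FGA": (21, 24)}

concordant = sum(calls_kit_a[locus] == calls_kit_b[locus]
                 for locus in calls_kit_a)
print(f"{concordant}/{len(calls_kit_a)} loci concordant")

# Heterozygote peak-height ratio (smaller peak / larger peak) per kit;
# a two-sample t-test checks whether the kits differ systematically.
rng = np.random.default_rng(2)
phr_a = rng.normal(0.85, 0.05, size=50).clip(0, 1)
phr_b = rng.normal(0.85, 0.05, size=50).clip(0, 1)
t, p = stats.ttest_ind(phr_a, phr_b)
print(f"p = {p:.3f}")   # a p above 0.05 would indicate no significant difference
```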

  10. Consistency of Higher Education Institutions’ Strategies: A Study Based on the Stakeholders’ Perception using the Balanced Scorecard

    Directory of Open Access Journals (Sweden)

    Alexsandra Barcelos Dias

    2016-10-01

    Full Text Available The Balanced Scorecard (BSC) is a management tool conceived to translate a company's strategic orientation into measurable terms, aiming to measure and monitor the strategy in action. The objective of this study was to verify strategic consistency, as perceived by the stakeholders of private Higher Education Institutions (HEIs), through the perspectives of the Balanced Scorecard. The method used was descriptive research with a quantitative approach. Data were collected through a questionnaire applied at four HEIs in the State of Minas Gerais to directors/coordinators, teachers and students (the stakeholder groups) in order to identify, based on a Balanced Scorecard model with four indicators in each perspective (financial, clients, learning and growth, and internal processes), the consistency of the strategies as perceived by these groups. The main results pointed to a difference in the managers' perception of the perspectives, with a greater degree of importance given to the “Learning and Growth” and “Internal Processes” perspectives, while the group of teachers attributed less importance to the “Customers” perspective. The main inconsistencies were found in the “Internal Processes” perspective. The “Financial” perspective presented fewer gaps when compared between the groups, which reveals a strategic inconsistency at the HEIs as perceived by the stakeholders. It is concluded that strategic consistency can contribute to organizational competitiveness by identifying whether the actions developed are aligned, resulting in greater efficiency in a competitive scenario according to the stakeholders.

  11. Chemical Composition Analysis and Product Consistency Tests of the ORP Phase 5 Nepheline Study Glasses

    Energy Technology Data Exchange (ETDEWEB)

    Fox, K. M. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. B. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Caldwell, M. E. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Riley, W. T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2018-02-01

    In this report, the Savannah River National Laboratory (SRNL) provides chemical analyses and Product Consistency Test (PCT) results for a series of simulated high-level waste glass compositions fabricated by the Pacific Northwest National Laboratory (PNNL). These data will be used in the development of improved models for the prediction of nepheline crystallization in support of the Hanford Tank Waste Treatment and Immobilization Plant (WTP).

  12. Using qualitative methods to inform the trade-off between content validity and consistency in utility assessment: the example of type 2 diabetes and Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Gargon Elizabeth

    2010-02-01

    Full Text Available Abstract Background Key stakeholders regard generic utility instruments as suitable tools to inform health technology assessment decision-making regarding allocation of resources across competing interventions. These instruments require a 'descriptor', a 'valuation' and a 'perspective' of the economic evaluation. There are various approaches that can be taken for each of these, offering a potential lack of consistency between instruments (a basic requirement for comparisons across diseases). The 'reference method' has been proposed as a way to address the limitations of the Quality-Adjusted Life Year (QALY). However, the degree to which generic measures can assess patients' specific experiences with their disease would remain unresolved. This has been neglected in the discussions on methods development and its impact on the QALY values obtained and resulting cost per QALY estimate underestimated. This study explored the content of utility instruments relevant to type 2 diabetes and Alzheimer's disease (AD) as examples, and the role of qualitative research in informing the trade-off between content coverage and consistency. Method A literature review was performed to identify qualitative and quantitative studies regarding patients' experiences with type 2 diabetes or AD, and associated treatments. Conceptual models for each indication were developed. Generic- and disease-specific instruments were mapped to the conceptual models. Results Findings showed that published descriptions of relevant concepts important to patients with type 2 diabetes or AD are available for consideration in deciding on the most comprehensive approach to utility assessment. While the 15-dimensional health related quality of life measure (15D) seemed the most comprehensive measure for both diseases, the Health Utilities Index 3 (HUI 3) seemed to have the least coverage for type 2 diabetes and the EuroQol-5 Dimensions (EQ-5D) for AD. Furthermore, some of the utility instruments

  13. METHODS FOR SPEECH CONTACT ANALYSIS: THE CASE OF THE ‘UBIQUITY OF RHETORIC’. PROOFING THE CONCEPTUAL CONSISTENCY OF SPEECH AS LINGUISTIC MACRO-SETTING

    Directory of Open Access Journals (Sweden)

    Dr. Fee-Alexandra HAASE

    2012-11-01

    Full Text Available In this article we will apply a method of proof for conceptual consistency in a long historical range taking the example of rhetoric and persuasion. We will analyze the evidentially present linguistic features of this concept within three linguistic areas: The Indo-European languages, the Semitic languages, and the Afro-Asiatic languages. We have chosen the case of the concept ‘rhetoric’ / ’persuasion’ as a paradigm for this study. With the phenomenon of ‘linguistic dispersion’ we can explain the development of language as undirected, but with linguistic consistency across the borders of language families. We will prove that the Semitic and Indo-European languages are related. As a consequence, the strict differentiation between the Semitic and the Indo-European language families is outdated following the research positions of Starostin. In contrast to this, we will propose a theory of cultural exchange between the two language families.

  14. Consistency-dependent optical properties of lubricating grease studied by terahertz spectroscopy

    International Nuclear Information System (INIS)

    Tian Lu; Zhao Kun; Zhou Qing-Li; Shi Yu-Lei; Zhao Dong-Mei; Zhang Cun-Lin; Zhao Song-Qing; Zhao Hui; Bao Ri-Ma; Zhu Shou-Ming; Miao Qing

    2011-01-01

    The optical properties of four kinds of lubricating greases (urea, lithium, extreme pressure lithium, and molybdenum disulfide lithium greases) with different NLGI (National Lubricating Grease Institute) numbers were investigated using terahertz time-domain spectroscopy. Greases with different NLGI grades have unique spectral features in the terahertz range. Comparison of the experimental data with predictions based on Lorentz-Lorenz theory showed that the refractive indices of each kind of lubricating grease were dependent on their consistency. In addition, molybdenum disulfide (MoS2) as a lubricant additive shows strong absorption from 0.2 to 1.4 THz, leading to higher absorption of MoS2-lithium grease than that of lithium grease. (general)
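
    The Lorentz-Lorenz relation invoked above links refractive index to the density of polarizable material, which is how a denser (higher-consistency) grease acquires a larger refractive index. The following worked sketch uses the standard molar form of the relation; the numerical values are illustrative, not measurements from the paper.

```python
import math

def lorentz_lorenz_n(molar_refractivity_cm3, density_g_cm3, molar_mass_g):
    """Refractive index from the Lorentz-Lorenz relation:
    (n^2 - 1) / (n^2 + 2) = R * rho / M, solved for n."""
    lhs = molar_refractivity_cm3 * density_g_cm3 / molar_mass_g
    return math.sqrt((1 + 2 * lhs) / (1 - lhs))

# Same material constants, two illustrative densities: the denser
# (higher-consistency) sample gives the larger refractive index.
n_low = lorentz_lorenz_n(130.0, 0.88, 400.0)
n_high = lorentz_lorenz_n(130.0, 0.95, 400.0)
print(n_low, n_high)
```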

  15. Studies of self-consistent field structure in a quasi-optical gyrotron

    International Nuclear Information System (INIS)

    Antonsen, T.M. Jr.

    1993-04-01

    The presence of an electron beam in a quasi-optical gyrotron cavity alters the structure of the fields from that of the empty cavity. A computer code has been written which calculates this alteration for either an electron beam or a thin dielectric tube placed in the cavity. Experiments measuring the quality factor of such a cavity were performed for the case of a dielectric tube, and the results agree with the predictions of the code. Simulations of the case of an electron beam indicate that self-consistent effects can be made small, in that almost all the power leaves the cavity in a symmetric Gaussian-like mode, provided the resonator parameters are chosen carefully. (author) 6 figs., 1 tab., 13 refs

  16. Consistency in the Reporting of Sensitive Behaviors by Adolescent American Indian Women: A Comparison of Interviewing Methods

    Science.gov (United States)

    Mullany, Britta; Barlow, Allison; Neault, Nicole; Billy, Trudy; Hastings, Ranelda; Coho-Mescal, Valerie; Lorenzo, Sherilyn; Walkup, John T.

    2013-01-01

    Computer-assisted interviewing techniques have increasingly been used in program and research settings to improve data collection quality and efficiency. Little is known, however, regarding the use of such techniques with American Indian (AI) adolescents in collecting sensitive information. This brief compares the consistency of AI adolescent…

  17. Computational study of depth completion consistent with human bi-stable perception for ambiguous figures.

    Science.gov (United States)

    Mitsukura, Eiichi; Satoh, Shunji

    2018-03-01

    We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because they provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, mathematical solutions related to binocular disparities are not unique or are indefinite. We examine a mathematical description of depth completion that is consistent with human perception of depth for ambiguous regions. Using computer simulation, we demonstrate that the resultant depth maps qualitatively reproduce two kinds of human depth perception. The resultant depth maps produced using our model depend on the initial depth in the ambiguous region. Considering this dependence from psychological viewpoints, we conjecture that humans perceive completed surfaces that are affected by prior stimuli corresponding to the initial condition of depth. We conducted psychological experiments to verify the model prediction. An ambiguous stimulus was presented after a prior stimulus that removed the ambiguity. An inter-stimulus interval (ISI) was inserted between the prior stimulus and the post-stimulus. Results show that the correlation of perception between the prior stimulus and the post-stimulus depends on the ISI duration: it is positive, negative, and nearly zero for short (0-200 ms), medium (200-400 ms), and long (>400 ms) ISIs, respectively. Furthermore, based on our model, we propose a computational account that explains this dependence. Copyright © 2017 Elsevier Ltd. All rights reserved.
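
    The core idea, that depth in an ambiguous region is completed by propagation from its borders and that the result after finitely many steps depends on the initial depth, can be illustrated with a toy 1-D diffusion sketch. This is not the paper's model; the region size, iteration count, and depth values are arbitrary assumptions.

```python
import numpy as np

def complete_depth(initial_inner, n_iter=30):
    """Fill a 1-D 'ambiguous region' by iterative local averaging
    (a diffusion step), with known depth fixed at the region borders."""
    depth = np.full(50, initial_inner, dtype=float)
    depth[0], depth[-1] = 1.0, 1.0           # disparity-defined borders
    for _ in range(n_iter):
        inner = 0.5 * (depth[:-2] + depth[2:])  # Jacobi averaging step
        depth[1:-1] = inner
        depth[0], depth[-1] = 1.0, 1.0
    return depth

near = complete_depth(initial_inner=0.0)     # prior stimulus biased "near"
far = complete_depth(initial_inner=2.0)      # prior stimulus biased "far"
print(near[25], far[25])                     # interiors still differ
```

After a finite number of iterations the interior retains a memory of its initial condition, mirroring the model's dependence on the prior stimulus.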

  18. A Theoretically Consistent Method for Minimum Mean-Square Error Estimation of Mel-Frequency Cepstral Features

    DEFF Research Database (Denmark)

    Jensen, Jesper; Tan, Zheng-Hua

    2014-01-01

    We propose a method for minimum mean-square error (MMSE) estimation of mel-frequency cepstral features for noise robust automatic speech recognition (ASR). The method is based on a minimum number of well-established statistical assumptions; no assumptions are made which are inconsistent with others. The strength of the proposed method is that it allows MMSE estimation of mel-frequency cepstral coefficients (MFCC's), cepstral mean-subtracted MFCC's (CMS-MFCC's), velocity, and acceleration coefficients. Furthermore, the method is easily modified to take into account other compressive non-linearities than the logarithmic which is usually used for MFCC computation. The proposed method shows estimation performance which is identical to or better than state-of-the-art methods. It further shows comparable ASR performance, where the advantage of being able to use mel-frequency speech features based on a power non...
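
    For context, the feature chain the estimator operates on is: mel filterbank energies, a compressive non-linearity (usually the logarithm), then a DCT. The sketch below shows that chain and the power-law alternative the abstract mentions; the filterbank energies are random stand-ins and the MMSE estimator itself is not reproduced.

```python
import numpy as np

def dct2(x):
    """Orthonormal DCT-II, the transform taking compressed mel-band
    energies to cepstral coefficients."""
    N = x.size
    n = np.arange(N)
    basis = np.cos(np.pi * np.outer(np.arange(N), 2 * n + 1) / (2 * N))
    basis[0] *= 1 / np.sqrt(2)
    return np.sqrt(2.0 / N) * basis @ x

rng = np.random.default_rng(3)
mel_energies = rng.uniform(0.1, 2.0, size=26)   # one frame, 26 mel bands

mfcc = dct2(np.log(mel_energies))[:13]          # usual log compression
mfcc_power = dct2(mel_energies ** 0.1)[:13]     # a power non-linearity instead
print(mfcc.shape, mfcc_power.shape)
```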

  19. A new relative radiometric consistency processing method for change detection based on wavelet transform and a low-pass filter

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The purpose of this paper is to show the limitations of the existing radiometric normalization approaches, and their disadvantages for change detection of artificial objects, by comparing the existing approaches; on this basis, a preprocessing approach to radiometric consistency, based on wavelet transform and a spatial low-pass filter, has been devised. This approach first separates the high-frequency and low-frequency information by wavelet transform. Then, relative radiometric consistency processing based on a low-pass filter is conducted on the low-frequency parts. After processing, an inverse wavelet transform is conducted to obtain the result image. The experimental results show that this approach can substantially reduce the influence of linear or nonlinear radiometric differences in multi-temporal images on change detection.
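
    The decompose / filter-low-frequency / reconstruct pipeline can be sketched in 1-D with a hand-rolled single-level Haar transform: only the approximation (low-frequency) coefficients are smoothed, so the detail (edge) information is preserved. The signal and filter width are illustrative, and a real implementation would use a 2-D multi-level transform.

```python
import numpy as np

def haar_forward(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)     # approximation (low frequency)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)     # detail (high frequency)
    return a, d

def haar_inverse(a, d):
    x = np.empty(2 * a.size)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def radiometric_normalize(row, width=9):
    a, d = haar_forward(row)
    kernel = np.ones(width) / width          # spatial low-pass filter
    a_smooth = np.convolve(a, kernel, mode='same')
    return haar_inverse(a_smooth, d)         # details (edges) untouched

row = np.linspace(0, 1, 64) + 0.05 * np.sin(np.arange(64))
out = radiometric_normalize(row)
print(out.shape)
```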

  20. Development of a self-consistent model of dust grain charging at elevated pressures using the method of moments

    International Nuclear Information System (INIS)

    Filippov, A.V.; Dyatko, N.A.; Pal', A.F.; Starostin, A.N.

    2003-01-01

    A model of dust grain charging is constructed using the method of moments. The dust grain charging process in a weakly ionized helium plasma produced by a 100-keV electron beam at atmospheric pressure is studied theoretically. In simulations, the beam current density was varied from 1 to 10⁶ μA/cm². It is shown that, in a He plasma, dust grains of radius 5 μm and larger perturb the electron temperature only slightly, although the reduced electric field near the grain reaches 8 Td, the beam current density being 10⁶ μA/cm². It is found that, at distances from the grain that are up to several tens or hundreds of times larger than its radius, the electron and ion densities are lower than their equilibrium values. Conditions are determined under which the charging process may be described by a model with constant electron transport coefficients. The dust grain charge is shown to be weakly affected by secondary electron emission. In a beam-produced helium plasma, the dust grain potential calculated in the drift-diffusion model is shown to be close to that calculated in the orbit motion limited model. It is found that, in the vicinity of a body perturbing the plasma, there may be no quasineutral plasma presheath with an ambipolar diffusion of charged particles. The conditions for the onset of this presheath in a beam-produced plasma are determined

  1. A CONSISTENT STUDY OF METALLICITY EVOLUTION AT 0.8 < z < 2.6

    Energy Technology Data Exchange (ETDEWEB)

    Wuyts, Eva; Kurk, Jaron; Förster Schreiber, Natascha M.; Genzel, Reinhard; Wisnioski, Emily; Bandara, Kaushala; Wuyts, Stijn; Beifiori, Alessandra; Bender, Ralf; Buschkamp, Peter; Chan, Jeffrey; Davies, Ric; Eisenhauer, Frank; Fossati, Matteo; Kulkarni, Sandesh K.; Lang, Philipp [Max-Planck-Institut für extraterrestrische Physik, Giessenbachstr. 1, D-85741 Garching (Germany); Brammer, Gabriel B. [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Burkert, Andreas [Universitäts-Sternwarte München, Scheinerstr. 1, D-81679 München (Germany); Carollo, C. Marcella; Lilly, Simon J., E-mail: evawuyts@mpe.mpg.de [Institute of Astronomy, Department of Physics, Eidgensösische Technische Hochschule, ETH Zürich, CH-8093 (Switzerland); and others

    2014-07-10

    We present the correlations between stellar mass, star formation rate (SFR), and the [N II]/Hα flux ratio as an indicator of gas-phase metallicity for a sample of 222 galaxies at 0.8 < z < 2.6 and log (M {sub *}/M {sub ☉}) = 9.0-11.5 from the LUCI, SINS/zC-SINF, and KMOS{sup 3D} surveys. This sample provides a unique analysis of the mass-metallicity relation (MZR) over an extended redshift range using consistent data analysis techniques and a uniform strong-line metallicity indicator. We find a constant slope at the low-mass end of the relation and can fully describe its redshift evolution through the evolution of the characteristic turnover mass where the relation begins to flatten at the asymptotic metallicity. At a fixed mass and redshift, our data do not show a correlation between the [N II]/Hα ratio and SFR, which disagrees with the 0.2-0.3 dex offset in [N II]/Hα predicted by the ''fundamental relation'' between stellar mass, SFR, and metallicity discussed in recent literature. However, the overall evolution toward lower [N II]/Hα at earlier times does broadly agree with these predictions.
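
    The parameterization described, a constant low-mass slope that flattens to an asymptotic metallicity above a characteristic turnover mass, can be sketched as a curve fit. The functional form, constants, and synthetic data below are illustrative assumptions, not the paper's actual fit or measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def mzr(logM, Z0, logM0, gamma):
    """Mass-metallicity relation: asymptotes to Z0 for M >> M0 and has
    low-mass slope gamma below the turnover mass M0."""
    return Z0 - np.log10(1.0 + 10.0 ** (-gamma * (logM - logM0)))

# Synthetic "observations" drawn from the model plus scatter
rng = np.random.default_rng(4)
logM = rng.uniform(9.0, 11.5, size=150)
Z_obs = mzr(logM, 8.7, 10.2, 0.6) + rng.normal(0, 0.03, size=150)

popt, _ = curve_fit(mzr, logM, Z_obs, p0=[8.5, 10.0, 0.5])
print(popt)   # recovered (Z0, logM0, gamma)
```

Redshift evolution would then be captured by fitting the same form per redshift bin and tracking how the turnover mass `logM0` shifts.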

  2. Study of mango endogenous pectinases as a tool to engineer mango purée consistency.

    Science.gov (United States)

    Jamsazzadeh Kermani, Zahra; Shpigelman, Avi; Houben, Ken; ten Geuzendam, Belinda; Van Loey, Ann M; Hendrickx, Marc E

    2015-04-01

    The objective of this work was to evaluate the possibility of using mango endogenous pectinases to change the viscosity of mango purée. To this end, the structure of the pectic polysaccharides and the presence of sufficiently active endogenous enzymes in ripe mango were determined. Pectin of mango flesh had a high molecular weight and was highly methoxylated. Pectin methylesterase showed negligible activity, which is related to the confirmed presence of a pectin methylesterase inhibitor. The pectin contained relatively high amounts of galactose, and considerable β-galactosidase (β-Gal) activity was observed. The possibility of stimulating β-Gal activity during processing (temperature/pressure, time) was investigated. Mango β-Gal was rather temperature-labile but pressure-stable relative to the temperature and pressure levels used to inactivate destructive enzymes in industry. Creating processing conditions allowing endogenous β-Gal activity did not substantially change the consistency of mango purée. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Self-consistent simulation study on magnetized inductively coupled plasma for 450 mm semiconductor wafer processing

    International Nuclear Information System (INIS)

    Lee, Ho-Jun; Kim, Yun-Gi

    2012-01-01

    The characteristics of weakly magnetized inductively coupled plasma (MICP) are investigated using a self-consistent simulation based on the drift–diffusion approximation with anisotropic transport coefficients. MICP is a plasma source utilizing the cavity mode of the low-frequency branch of the right-hand circularly polarized wave. The model system is 700 mm in diameter and has a 250 mm gap between the radio-frequency window and wafer holder. The model chamber size is chosen to verify the applicability of this type of plasma source to the 450 mm wafer process. The effects of electron density distribution and external axial magnetic field on the propagation properties of the plasma wave, including the wavelength modulation and refraction toward the high-density region, are demonstrated. The restricted electron transport and thermal conductivity in the radial direction due to the magnetic field result in small temperature gradient along the field lines and off-axis peak density profile. The calculated impedance seen from the antenna terminal shows that MICP has a resistance component that is two to threefold higher than that of ICP. This property is practically important for large-size, low-pressure plasma sources because high resistance corresponds to high power-transfer efficiency and stable impedance matching characteristics. For the 0.665 Pa argon plasma, MICP shows a radial density uniformity of 6% within 450 mm diameter, which is much better than that of nonmagnetized ICP.

  4. A CONSISTENT STUDY OF METALLICITY EVOLUTION AT 0.8 < z < 2.6

    International Nuclear Information System (INIS)

    Wuyts, Eva; Kurk, Jaron; Förster Schreiber, Natascha M.; Genzel, Reinhard; Wisnioski, Emily; Bandara, Kaushala; Wuyts, Stijn; Beifiori, Alessandra; Bender, Ralf; Buschkamp, Peter; Chan, Jeffrey; Davies, Ric; Eisenhauer, Frank; Fossati, Matteo; Kulkarni, Sandesh K.; Lang, Philipp; Brammer, Gabriel B.; Burkert, Andreas; Carollo, C. Marcella; Lilly, Simon J.

    2014-01-01

    We present the correlations between stellar mass, star formation rate (SFR), and the [N II]/Hα flux ratio as an indicator of gas-phase metallicity for a sample of 222 galaxies at 0.8 < z < 2.6 and log (M * /M ☉ ) = 9.0-11.5 from the LUCI, SINS/zC-SINF, and KMOS 3D surveys. This sample provides a unique analysis of the mass-metallicity relation (MZR) over an extended redshift range using consistent data analysis techniques and a uniform strong-line metallicity indicator. We find a constant slope at the low-mass end of the relation and can fully describe its redshift evolution through the evolution of the characteristic turnover mass where the relation begins to flatten at the asymptotic metallicity. At a fixed mass and redshift, our data do not show a correlation between the [N II]/Hα ratio and SFR, which disagrees with the 0.2-0.3 dex offset in [N II]/Hα predicted by the ''fundamental relation'' between stellar mass, SFR, and metallicity discussed in recent literature. However, the overall evolution toward lower [N II]/Hα at earlier times does broadly agree with these predictions

  5. Tuning the electronic properties of gated multilayer phosphorene: A self-consistent tight-binding study

    Science.gov (United States)

    Li, L. L.; Partoens, B.; Peeters, F. M.

    2018-04-01

    By taking account of the electric-field-induced charge screening, a self-consistent calculation within the framework of the tight-binding approach is employed to obtain the electronic band structure of gated multilayer phosphorene and the charge densities on the different phosphorene layers. We find charge density and screening anomalies in single-gated multilayer phosphorene and electron-hole bilayers in dual-gated multilayer phosphorene. Due to the unique puckered lattice structure, both intralayer and interlayer charge screenings are important in gated multilayer phosphorene. We find that the electric-field tuning of the band structure of multilayer phosphorene is distinctively different in the presence and absence of charge screening. For instance, it is shown that the unscreened band gap of multilayer phosphorene decreases dramatically with increasing electric-field strength. However, in the presence of charge screening, the magnitude of this band-gap decrease is significantly reduced and the reduction depends strongly on the number of phosphorene layers. Our theoretical results of the band-gap tuning are compared with recent experiments and good agreement is found.

  6. A simple, sufficient, and consistent method to score the status of threats and demography of imperiled species

    Directory of Open Access Journals (Sweden)

    Jacob W. Malcom

    2016-07-01

    Full Text Available Managers of large, complex wildlife conservation programs need information on the conservation status of each of many species to help strategically allocate limited resources. Oversimplifying status data, however, runs the risk of missing information essential to strategic allocation. Conservation status consists of two components, the status of threats a species faces and the species’ demographic status. Neither component alone is sufficient to characterize conservation status. Here we present a simple key for scoring threat and demographic changes for species using detailed information provided in free-form textual descriptions of conservation status. This key is easy to use (simple, captures the two components of conservation status without the cost of more detailed measures (sufficient, and can be applied by different personnel to any taxon (consistent. To evaluate the key’s utility, we performed two analyses. First, we scored the threat and demographic status of 37 species recently recommended for reclassification under the Endangered Species Act (ESA) and 15 control species, then compared our scores to two metrics used for decision-making and reports to Congress. Second, we scored the threat and demographic status of all non-plant ESA-listed species from Florida (54 spp.), and evaluated scoring repeatability for a subset of those. While the metrics reported by the U.S. Fish and Wildlife Service (FWS) are often consistent with our scores in the first analysis, the results highlight two problems with the oversimplified metrics. First, we show that both metrics can mask underlying demographic declines or threat increases; for example, ∼40% of species not recommended for reclassification had changes in threats or demography. Second, we show that neither metric is consistent with either threats or demography alone, but conflates the two. The second analysis illustrates how the scoring key can be applied to a substantial set of species to
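
    The two-component idea can be sketched as a tiny scoring routine: each species gets separate ordinal scores for change in threats and in demography, so one component never masks the other. The categories, score values, and species records below are invented for illustration and are not the paper's actual key.

```python
# Ordinal scores per status-change category (hypothetical values)
CHANGE_SCORES = {"improved": 1, "stable": 0, "declined": -1, "unknown": None}

def score_species(threat_change, demographic_change):
    """Return (threat score, demography score) as separate components,
    rather than one conflated status metric."""
    return (CHANGE_SCORES[threat_change], CHANGE_SCORES[demographic_change])

records = {
    "Species A": ("improved", "declined"),  # threats eased, demography declined
    "Species B": ("stable", "stable"),
}
scores = {name: score_species(*rec) for name, rec in records.items()}
print(scores)
```

Species A illustrates the masking problem: a single combined metric could average the two components to "no change" and hide the demographic decline.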

  7. Self-consistent-field method and τ-functional method on group manifold in soliton theory. II. Laurent coefficients of soliton solutions for sln and for sun

    International Nuclear Information System (INIS)

    Nishiyama, Seiya; Providencia, Joao da; Komatsu, Takao

    2007-01-01

    To go beyond the perturbative method in terms of variables of collective motion, using infinite-dimensional fermions, we have aimed to construct the self-consistent-field (SCF) theory, i.e., time-dependent Hartree-Fock theory on associative affine Kac-Moody algebras along the lines of soliton theory. In this paper, toward such an ultimate goal, we reconstruct a theoretical frame for a υ (external parameter)-dependent SCF method to describe more precisely the dynamics on the infinite-dimensional fermion Fock space. An infinite-dimensional fermion operator is introduced through a Laurent expansion of finite-dimensional fermion operators with respect to degrees of freedom of the fermions related to a υ-dependent and Υ-periodic potential. As an illustration, we derive explicit expressions for the Laurent coefficients of soliton solutions for sl_n and for su_n on the infinite-dimensional Grassmannian. The associative affine Kac-Moody algebras play a crucial role in determining the dynamics on the infinite-dimensional fermion Fock space

  8. Ethical dilemmas of a large national multi-centre study in Australia: time for some consistency.

    Science.gov (United States)

    Driscoll, Andrea; Currey, Judy; Worrall-Carter, Linda; Stewart, Simon

    2008-08-01

    To examine the impact and obstacles that individual Institutional Research Ethics Committees (IRECs) had on a large-scale national multi-centre clinical audit, the National Benchmarks and Evidence-based National Clinical guidelines for Heart failure management programmes Study. Multi-centre research is commonplace in the health care system; however, IRECs continue to fail to differentiate between research and quality audit projects. The National Benchmarks and Evidence-based National Clinical guidelines for Heart failure management programmes Study used an investigator-developed questionnaire concerning a clinical audit of heart failure programmes throughout Australia. Ethical guidelines developed by the national governing body of health and medical research in Australia classified the study as a low-risk clinical audit not requiring ethical approval by an IREC. Nevertheless, 15 of 27 IRECs stipulated that the research proposal undergo full ethical review. None of the IRECs acknowledged national quality assurance guidelines and recommendations, nor ethics approval from other IRECs. Twelve of the 15 IRECs used different ethics application forms, and variability in the type of amendments was prolific. The lack of uniformity in ethical review processes resulted in a six- to eight-month delay in commencing the national study. Development of a national ethics application form, with full ethical review by the first IREC and compulsory expedited review by subsequent IRECs, would resolve the issues raised in this paper. IRECs must change their ethics approval processes to one that facilitates multi-centre research, which is now a normative process for health services. The findings of this study highlight inconsistent ethical requirements between different IRECs, as well as the obstacles and delays that IRECs create when undertaking multi-centre clinical audits

  9. Studying the Consistency between and within the Student Mental Models for Atomic Structure

    Science.gov (United States)

    Zarkadis, Nikolaos; Papageorgiou, George; Stamovlasis, Dimitrios

    2017-01-01

    Science education research has revealed a number of student mental models for atomic structure, among which, the one based on Bohr's model seems to be the most dominant. The aim of the current study is to investigate the coherence of these models when students apply them for the explanation of a variety of situations. For this purpose, a set of…

  10. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

This paper presents work performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, verification of the ray tracing indexing scheme that was developed to represent the MCNP geometry in the MOC, and verification of the prototype 2-D MOC flux solver. (author)
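The core geometric task this record describes, chopping characteristic rays into per-cell segment lengths, can be sketched in a few lines. The sketch below uses a uniform square grid as a stand-in geometry; it is a generic illustration, not the paper's MCNP-based tracer or its indexing scheme:

```python
import math

def trace_ray(x, y, angle, cell, nx, ny):
    """Chop a characteristic ray into (cell_index, segment_length) pieces
    across a uniform nx-by-ny grid of square cells with side 'cell' --
    the basic geometry kernel an MOC flux solver needs from ray tracing.
    (Illustrative sketch only; not the paper's MCNP-based tracer.)"""
    dx, dy = math.cos(angle), math.sin(angle)
    segments = []
    while 0.0 <= x < nx * cell and 0.0 <= y < ny * cell:
        i, j = int(x // cell), int(y // cell)
        # distance along the ray to the next vertical/horizontal cell face
        tx = ((i + 1) * cell - x) / dx if dx > 0 else (i * cell - x) / dx if dx < 0 else math.inf
        ty = ((j + 1) * cell - y) / dy if dy > 0 else (j * cell - y) / dy if dy < 0 else math.inf
        step = min(tx, ty) + 1e-12            # nudge across the face
        segments.append(((i, j), step))
        x, y = x + step * dx, y + step * dy
    return segments

# a horizontal ray crosses three unit cells with unit path length in each
segs = trace_ray(0.0, 0.5, 0.0, 1.0, 3, 1)
```

An MOC solver would then sweep the angular flux along each segment, attenuating by the cell's cross section over the stored length.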

  11. Decentralized Method for Load Sharing and Power Management in a Hybrid Single/Three-Phase-Islanded Microgrid Consisting of Hybrid Source PV/Battery Units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Oraee, Hashem; Guerrero, Josep M.

    2017-01-01

This paper proposes a new decentralized power management and load sharing method for a photovoltaic based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method is not limited to the systems with separate PV...... in different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units....

Microemulsion Electrokinetic Chromatography in Combination with Chemometric Methods to Evaluate the Holistic Quality Consistency and Predict the Antioxidant Activity of Ixeris sonchifolia (Bunge) Hance Injection.

    Directory of Open Access Journals (Sweden)

    Lanping Yang

Full Text Available In this paper, microemulsion electrokinetic chromatography (MEEKC) fingerprints combined with quantification were successfully developed to monitor the holistic quality consistency of Ixeris sonchifolia (Bge.) Hance Injection (ISHI). ISHI is a Chinese traditional patent medicine used for its anti-inflammatory and hemostatic effects. The effects of five crucial experimental variables on MEEKC were optimized by central composite design. Under the optimized conditions, the MEEKC fingerprints of 28 ISHIs were developed. Quantitative determination of seven marker compounds was employed simultaneously, then the 28 batches of samples from two manufacturers were clearly divided into two clusters by principal component analysis. In fingerprint assessments, a systematic quantitative fingerprint method was established for the holistic quality consistency evaluation of ISHI from qualitative and quantitative perspectives, by which the qualities of the 28 samples were well differentiated. In addition, the fingerprint-efficacy relationship between the fingerprints and the antioxidant activities was established utilizing orthogonal projection to latent structures, which provided important medicinal efficacy information for quality control. The present study offered a powerful and holistic approach to evaluating the quality consistency of herbal medicines and their preparations.
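The batch-clustering step described above (principal component analysis dividing 28 samples into two manufacturer clusters) can be illustrated with a minimal SVD-based projection. The "peak area" data below are invented for the sketch and are not the study's fingerprints:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal-component scores via SVD of the mean-centered data --
    the kind of projection used to separate batches into clusters.
    (Toy data below is hypothetical, not the study's fingerprints.)"""
    Xc = X - X.mean(axis=0)                    # center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T            # project onto leading PCs

rng = np.random.default_rng(0)
maker_a = rng.normal(10.0, 0.2, size=(5, 7))   # peak areas, "manufacturer A"
maker_b = rng.normal(12.0, 0.2, size=(5, 7))   # shifted areas, "manufacturer B"
scores = pca_scores(np.vstack([maker_a, maker_b]))
```

With a systematic shift between the two groups, the first principal component alone separates them, which is the behavior the abstract reports for the 28 batches.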

  13. Methods of Studying Persons.

    Science.gov (United States)

    Heinemann, Allen W.; Shontz, Franklin C.

    Conventional research strategies typically emphasize behavior-determining tendencies so strongly that the person as a whole is ignored. Research strategies for studying whole persons focus on symbolic structures, formulate specific questions in advance, study persons one at a time, use individualized measures, and regard participants as expert…

  14. Experimental and theoretical studies of perceptible color fading of decorative paints consisting of mixed pigments

    International Nuclear Information System (INIS)

    Auger, Jean-Claude; McLoughlin, Daragh

    2017-01-01

We study the color fading of paint films composed of mixtures of white rutile titanium dioxide and yellow arylide pigments dispersed in two polymer binders at different volume concentrations. The samples were exposed to ultraviolet radiation in an accelerated weathering tester for three weeks. The measured patterns of color variation appeared to be independent of the chemistry of the binders. We then developed a theoretical framework, based on the radiative transfer equation of light and the one-particle T-matrix formalism, to simulate the color fading process. The loss of color is correlated with the progressive decrease of the original colored pigment volume-filling fraction as the destructive UV radiation penetrates deeper into the films. The calculated patterns of color variation of paint films composed of mixtures of white pigments with yellow cadmium sulfide (CdS) and red cerium sulfide (Ce_2S_3) pigments showed the same trend as that seen experimentally. - Highlights: • Theoretical framework to simulate the color-fading process of paints. • Good agreement between simulation and experimental data. • Color fading depends on the total amount of perceptible pigments.
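A common way to quantify the kind of color variation measured in such weathering tests is a CIELAB color difference. The abstract does not state which metric the authors used, so the CIE76 formula below, with made-up L*a*b* values, is only an illustration:

```python
def delta_e(lab1, lab2):
    """CIE76 color difference between two CIELAB triples -- a standard
    way to quantify color fading between exposures. (Illustrative only;
    the paper does not state which difference metric it used.)"""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

fresh = (78.0, 5.0, 60.0)    # hypothetical L*, a*, b* of an unexposed film
faded = (80.0, 4.0, 52.0)    # hypothetical values after weathering
shift = delta_e(fresh, faded)
```

A shift above roughly 1-2 units is generally taken as perceptible, which is the threshold that makes fading of the yellow component visible against the white base.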

  15. A Study of Clay-Epoxy Nanocomposites Consisting of Unmodified Clay and Organo Clay

    Directory of Open Access Journals (Sweden)

    Graham Edward

    2006-04-01

Full Text Available Clay-epoxy nanocomposites were synthesized from DGEBA resin and montmorillonite clay by in-situ polymerization. One type of untreated clay and two types of organo clay were used to produce the nanocomposites. The aims of this study were to examine the nanocomposite structure using different tools and to compare the results between the unmodified clay and modified clays as nanofillers. Although diffractograms in reflection mode did not show any apparent peaks for either type of material, the transmitted XRD (X-Ray Diffraction) graphs, DSC (Differential Scanning Calorimetry) analysis and TEM (Transmission Electron Microscopy) images revealed that the modified clay-epoxy and unmodified clay-epoxy nanocomposites give different results. Interestingly, the micrographs showed that some of the modified clay layers remained non-exfoliated in the modified clay-epoxy nanocomposites. Clay aggregates and a hackle pattern were found in E-SEM images for both types of nanocomposite materials. It is shown that different tools should be used together to determine the nanocomposite structure.

  16. High resolution transmission electron microscopic study of nanoporous carbon consisting of curved single graphite sheets

    International Nuclear Information System (INIS)

    Bourgeois, L.N.; Bursill, L.A.

    1997-01-01

A high resolution transmission electron microscopic study of a nanoporous carbon rich in curved graphite monolayers is presented. Observations of very thin regions, including the effect of tilting the specimen with respect to the electron beam, are reported. The initiation of single sheet material on an oriented graphite substrate is also observed. When combined with image simulations and independent measurements of the density (1.37 g cm⁻³) and sp³/(sp²+sp³) bonding fraction (0.16), these observations suggest that this material is a two-phase mixture containing a relatively low density aggregation of essentially capped single shells, like squat nanotubes and polyhedra, plus a relatively dense 'amorphous' carbon structure which may be described using a random-Schwarzite model. Some negatively-curved sheets were also identified in the low density phase. Finally, some discussion is offered regarding the growth mechanisms responsible for this nanoporous carbon and its relationship with the structures of amorphous carbons across a broad range of densities, porosities and sp³/(sp²+sp³) bonding fractions

  17. The discrete null space method for the energy-consistent integration of constrained mechanical systems. Part III: Flexible multibody dynamics

    International Nuclear Information System (INIS)

    Leyendecker, Sigrid; Betsch, Peter; Steinmann, Paul

    2008-01-01

    In the present work, the unified framework for the computational treatment of rigid bodies and nonlinear beams developed by Betsch and Steinmann (Multibody Syst. Dyn. 8, 367-391, 2002) is extended to the realm of nonlinear shells. In particular, a specific constrained formulation of shells is proposed which leads to the semi-discrete equations of motion characterized by a set of differential-algebraic equations (DAEs). The DAEs provide a uniform description for rigid bodies, semi-discrete beams and shells and, consequently, flexible multibody systems. The constraints may be divided into two classes: (i) internal constraints which are intimately connected with the assumption of rigidity of the bodies, and (ii) external constraints related to the presence of joints in a multibody framework. The present approach thus circumvents the use of rotational variables throughout the whole time discretization, facilitating the design of energy-momentum methods for flexible multibody dynamics. After the discretization has been completed a size-reduction of the discrete system is performed by eliminating the constraint forces. Numerical examples dealing with a spatial slider-crank mechanism and with intersecting shells illustrate the performance of the proposed method

  18. Association between consistent purchase of anticonvulsants or lithium and suicide risk: a longitudinal cohort study from Denmark, 1995-2001.

    Science.gov (United States)

    Smith, Eric G; Søndergård, Lars; Lopez, Ana Garcia; Andersen, Per Kragh; Kessing, Lars Vedel

    2009-10-01

Prior studies suggest anticonvulsant purchasers may be at greater risk of suicide than lithium purchasers. Longitudinal, retrospective cohort study of all individuals in Denmark purchasing anticonvulsants (valproic acid, carbamazepine, oxcarbazepine or lamotrigine) (n=9952) or lithium (n=6693) from 1995-2001 who also purchased antipsychotics at least once (to select out nonpsychiatric anticonvulsant use). Poisson regression of suicides by medication purchased (anticonvulsants or lithium) was conducted, controlling for age, sex, and calendar year. Confounding by indication was addressed by restricting the comparison to individuals prescribed the same medication: individuals with minimal medication exposure (e.g., who purchased only a single prescription of anticonvulsants) were compared to those individuals with more consistent medication exposure (i.e., purchasing ≥6 prescriptions of anticonvulsants). Demographics and frequency of anticonvulsant, lithium, or antipsychotic use were similar between lithium and anticonvulsant purchasers. Among patients who also purchased an antipsychotic at least once during the study period, purchasing anticonvulsants more consistently (≥6 prescriptions) was associated with a substantial reduction in the risk of suicide (RR=0.22, 95% CI=0.11-0.42). Suicide risks of consistent anticonvulsant and consistent lithium purchasers were similar. Lack of information about diagnoses and potential confounders, as well as other covariates that may differ between minimal and consistent medication purchasers, are limitations of this study. In this longitudinal study of anticonvulsant purchasers likely to have psychiatric disorders, consistent anticonvulsant treatment was associated with decreased risk of completed suicide.
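The headline quantity of such a Poisson analysis is an incidence rate ratio with its confidence interval. A minimal sketch follows; the event counts and person-years are invented, chosen only so the ratio comes out near the reported 0.22, and are not the Danish registry data:

```python
import math

def rate_ratio(events_a, py_a, events_b, py_b):
    """Incidence rate ratio with a 95% Wald CI on the log scale -- the
    quantity a Poisson regression of suicides per person-year estimates.
    (Sketch with made-up counts, not the Danish registry data.)"""
    rr = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)   # SE of log(RR)
    half = 1.96 * se
    return rr, rr * math.exp(-half), rr * math.exp(half)

# hypothetical: consistent purchasers vs single-prescription purchasers
rr, lo, hi = rate_ratio(10, 20000, 25, 11000)
```

A full analysis would additionally adjust for age, sex, and calendar year, as the study did, rather than compare crude rates.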

  19. A Novel Degradation Estimation Method for a Hybrid Energy Storage System Consisting of Battery and Double-Layer Capacitor

    Directory of Open Access Journals (Sweden)

    Yuanbin Yu

    2016-01-01

Full Text Available This paper presents a new method for battery degradation estimation using a power-energy (PE) function in a battery/ultracapacitor hybrid energy storage system (HESS); an integrated optimization concerning both parameter matching and control of the HESS has been performed as well. A semiactive topology of HESS, with the electric double-layer capacitor (EDLC) coupled directly to the DC-link, is adopted for a hybrid electric city bus (HECB). To present the quantitative relationship between system parameters and battery service life, data from a 37-minute driving cycle were first collected and decomposed into discharging/charging fragments, and then an optimal control strategy, intended to make maximal use of the available EDLC energy, is presented to split the power between battery and EDLC. Furthermore, based on a battery degradation model, conversion of the power demand by the PE function and PE matrix is applied to evaluate the relationship between the energy available in the HESS and the service life of the battery pack. The approach thus decouples parameter matching from optimal control of the HESS, and the process of battery degradation and service-life estimation for the HESS is summarized.
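The power-decomposition idea, letting the EDLC absorb as much of the demand as it can while the battery covers the remainder, can be sketched as a greedy per-time-step rule. This is a deliberate simplification with hypothetical ratings, not the paper's optimal control strategy:

```python
def split_power(demand, edlc_energy, edlc_max_power, edlc_capacity):
    """Greedy load split per time step (power and energy in per-step
    units): the double-layer capacitor serves as much of the demand as
    its rating and stored energy allow; the battery covers the rest.
    A deliberate simplification of the paper's control strategy."""
    if demand >= 0:    # discharging: drain the EDLC first
        p_edlc = min(demand, edlc_max_power, edlc_energy)
    else:              # charging: fill the EDLC headroom first
        p_edlc = max(demand, -edlc_max_power, edlc_energy - edlc_capacity)
    return p_edlc, demand - p_edlc

# 50 kW demand with 20 units of EDLC energy left: battery supplies 30
edlc_p, batt_p = split_power(50.0, 20.0, 30.0, 100.0)
```

Running such a rule over the decomposed discharging/charging fragments yields the battery's share of each fragment, which is what a degradation model then consumes.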

  20. Assessment of consistency of the whole tumor and single section perfusion imaging with 256-slice spiral CT: a preliminary study

    International Nuclear Information System (INIS)

    Sun Hongliang; Xu Yanyan; Hu Yingying; Tian Yuanjiang; Wang Wu

    2014-01-01

Objective: To determine the consistency between quantitative CT perfusion measurements of colorectal cancer obtained from the single section with maximal tumor dimension and from the average of the whole tumor, and to compare intra- and inter-observer consistency of the two analysis methods. Methods: Twenty-two patients with histologically proven colorectal cancer were examined prospectively with 256-slice CT and whole tumor perfusion images were obtained. Perfusion parameters were obtained from a region of interest (ROI) inserted in the single section showing maximal tumor dimension, then from ROIs inserted in all tumor-containing sections, by two radiologists. Consistency between values of blood flow (BF), blood volume (BV) and time to peak (TTP) calculated by the two methods was assessed. Intra-observer consistency was evaluated by comparing repeated measurements done by the same radiologist using both methods after 3 months. Perfusion measurements were done by another radiologist independently to assess inter-observer consistency of both methods. The results from the two methods were compared using the paired t test and Bland-Altman plots. Results: Twenty-two patients were examined successfully. The perfusion parameters BF, BV and TTP obtained by whole tumor perfusion and single-section analysis were (35.59 ± 14.59) ml·min⁻¹·100 g⁻¹, (17.55 ± 4.21) ml·100 g⁻¹, (21.30 ± 7.57) s and (34.64 ± 13.29) ml·min⁻¹·100 g⁻¹, (17.61 ± 6.39) ml·100 g⁻¹, (19.82 ± 9.01) s, respectively. No significant differences were observed between the means of the perfusion parameters (BF, BV, TTP) calculated by the two methods (t=0.218, -0.033, -0.668, P>0.05, respectively). The intra-observer 95% limits of consistency of the perfusion parameters were BF -5.3% to 10.0%, BV -13.8% to 10.8%, TTP -15.0% to 12.6% with whole tumor analysis, and BF -14.3% to 16.5%, BV -24.2% to 22.2%, TTP -19.0% to 16.1% with single section analysis. The inter-observer 95% limits of
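The "95% limits of consistency" quoted above are Bland-Altman limits of agreement. A minimal version of that computation, with invented BF readings rather than the study's measurements:

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bias and 95% limits of agreement between two measurement methods
    (Bland-Altman), the analysis such studies pair with the t test.
    The perfusion values below are made-up illustrations."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()                 # systematic difference
    sd = diff.std(ddof=1)              # SD of the paired differences
    return bias - 1.96 * sd, bias + 1.96 * sd

whole  = [35.1, 34.0, 36.8, 33.5, 35.9]   # hypothetical whole-tumor BF
single = [34.6, 34.8, 36.0, 33.9, 35.2]   # hypothetical single-section BF
lo, hi = bland_altman_limits(whole, single)
```

Narrower limits for the whole-tumor method, as the abstract reports, indicate better repeatability than single-section analysis.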

  1. Preventing syndemic Zika virus, HIV/STIs and unintended pregnancy: dual method use and consistent condom use among Brazilian women in marital and civil unions.

    Science.gov (United States)

    Tsuyuki, Kiyomi; Gipson, Jessica D; Barbosa, Regina Maria; Urada, Lianne A; Morisky, Donald E

    2017-12-12

    Syndemic Zika virus, HIV and unintended pregnancy call for an urgent understanding of dual method (condoms with another modern non-barrier contraceptive) and consistent condom use. Multinomial and logistic regression analysis using data from the Pesquisa Nacional de Demografia e Saúde da Criança e da Mulher (PNDS), a nationally representative household survey of reproductive-aged women in Brazil, identified the socio-demographic, fertility and relationship context correlates of exclusive non-barrier contraception, dual method use and condom use consistency. Among women in marital and civil unions, half reported dual protection (30% condoms, 20% dual methods). In adjusted models, condom use was associated with older age and living in the northern region of Brazil or in urban areas, whereas dual method use (versus condom use) was associated with younger age, living in the southern region of Brazil, living in non-urban areas and relationship age homogamy. Among condom users, consistent condom use was associated with reporting Afro-religion or other religion, not wanting (more) children and using condoms only (versus dual methods). Findings highlight that integrated STI prevention and family planning services should target young married/in union women, couples not wanting (more) children and heterogamous relationships to increase dual method use and consistent condom use.

  2. Investigating the consistency between proxy-based reconstructions and climate models using data assimilation: a mid-Holocene case study

    NARCIS (Netherlands)

    A. Mairesse; H. Goosse; P. Mathiot; H. Wanner; S. Dubinkina (Svetlana)

    2013-01-01

The mid-Holocene (6 kyr BP; thousand years before present) is a key period to study the consistency between model results and proxy-based reconstruction data, as it corresponds to a standard test for models and a reasonable number of proxy-based records is available. Taking advantage of

  3. Consistent lithological units and its influence on geomechanical stratification in shale reservoir: case study from Baltic Basin, Poland.

    Science.gov (United States)

    Pachytel, Radomir; Jarosiński, Marek; Bobek, Kinga

    2017-04-01

Geomechanical investigations in shale reservoirs are crucial to understand rock behavior during hydraulic fracturing treatment and to solve borehole wall stability problems. Anisotropy should be considered a key mechanical parameter when characterizing shale properties across a variety of scales. We are developing a concept of a step-by-step approach to characterize and upscale the Consistent Lithological Units (CLU) at several scales of analysis. We decided that the most regional scale model, comparable to lithostratigraphic formations, is too general for hydraulic fracture propagation studies, thus a more detailed description is needed. The CLU hierarchic model aims to upscale elastic properties, with their anisotropy, based on available data from vertical boreholes. For the purpose of our study we have access to a continuous borehole core profile and a full set of geophysical logs from several wells in the Pomeranian part of the Ordovician and Silurian shale complex belonging to the Baltic Basin. We are focused on shale properties that might be crucial for the mechanical response to hydraulic fracturing: mineral components, porosity, density, elastic parameters and natural fracture pattern. To prepare the precise CLU model we compare several methods of determining and upscaling every single parameter used for consistent unit selection. Mineralogical data taken from ULTRA logs, GEM logs, X-ray diffraction and X-ray fluorescence were compared with Young's modulus from sonic logs and triaxial compressive strength tests. The results showed the impact of clay content and porosity increase on Young's modulus reduction, while carbonates (both calcite and dolomite) have a stronger impact on elastic modulus growth than quartz, represented here mostly by detrital particles.
Comparing shales of similar composition in a few wells of different depths, we concluded that differences in diagenesis and compaction due to variation in formation depth in a range of 1 km have negligible

  4. Association between consistent purchase of anticonvulsants or lithium and suicide risk: a longitudinal cohort study from Denmark, 1995-2001

    DEFF Research Database (Denmark)

    Smith, Eric G; Søndergård, Lars; Lopez, Ana Garcia

    2009-01-01

    BACKGROUND: Prior studies suggest anticonvulsants purchasers may be at greater risk of suicide than lithium purchasers. METHODS: Longitudinal, retrospective cohort study of all individuals in Denmark purchasing anticonvulsants (valproic acid, carbamazepine, oxcarbazepine or lamotrigine) (n=9952...

  5. Cation solvation with quantum chemical effects modeled by a size-consistent multi-partitioning quantum mechanics/molecular mechanics method.

    Science.gov (United States)

    Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi

    2017-07-21

In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adopting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations by applying the QM potential to a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na⁺, K⁺, and Ca²⁺ solutions based on nanosecond-scale sampling, a sampling 100 times longer than that of conventional QM-based samplings. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
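Extracting a diffusion coefficient from nanosecond-scale sampling typically uses the Einstein relation between mean-square displacement and time. The generic post-processing sketch below runs on synthetic Brownian trajectories, not the SCMP data:

```python
import numpy as np

def diffusion_coefficient(traj, dt, dim=3):
    """Estimate D from the Einstein relation MSD(t) = 2*dim*D*t.
    traj has shape (n_steps, n_particles, dim); the slope comes from a
    zero-intercept least-squares fit of MSD against time.
    (Generic MD post-processing sketch, not the paper's SCMP code.)"""
    disp = traj - traj[0]                          # displacement from t = 0
    msd = (disp ** 2).sum(axis=2).mean(axis=1)     # average over particles
    t = np.arange(len(msd)) * dt
    return (t @ msd) / (t @ t) / (2 * dim)

# synthetic Brownian trajectories with known step variance per dimension
rng = np.random.default_rng(1)
dt, var = 1.0, 0.01                                # expect D = var/(2*dt)
steps = rng.normal(0.0, np.sqrt(var), size=(2000, 500, 3))
traj = np.cumsum(steps, axis=0)
D = diffusion_coefficient(traj, dt)
```

The statistical certainty the abstract emphasizes comes from exactly this kind of averaging: longer sampling and more particles shrink the scatter of the MSD slope.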

  6. Methods for regionalization of impacts of non-toxic air pollutants in life-cycle assessments often tell a consistent story

    DEFF Research Database (Denmark)

    Djomo, Sylvestre Njakou; Knudsen, Marie Trydeman; Andersen, Mikael Skou

    2017-01-01

    There is an ongoing debate regarding the influence of the source location of pollution on the fate of pollutants and their subsequent impacts. Several methods have been developed to derive site-dependent characterization factors (CFs) for use in life-cycle assessment (LCA). Consistent, precise, a...

  7. Consistent model driven architecture

    Science.gov (United States)

    Niepostyn, Stanisław J.

    2015-09-01

The goal of MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
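One consistency rule of the kind such an approach relies on can be reduced to a set comparison: every element a diagram references must be declared in the abstract model. The diagram and element names below are hypothetical, not the paper's rule set:

```python
def check_consistency(declared, used_by_diagram):
    """Toy consistency rule in the spirit of the approach above: every
    element a diagram references must exist in the abstract model.
    Returns undeclared references per diagram (empty dict = consistent).
    All names below are hypothetical examples."""
    return {name: sorted(set(refs) - set(declared))
            for name, refs in used_by_diagram.items()
            if set(refs) - set(declared)}

model = {"Customer", "Order", "Invoice"}          # abstract model elements
diagrams = {"activity": ["Customer", "Order"],
            "sequence": ["Order", "Payment"]}     # 'Payment' never declared
issues = check_consistency(model, diagrams)
```

Catching such dangling references before generation is what makes it safe to emit a complete workflow application from the models.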

  8. Decentralized method for load sharing and power management in a hybrid single/three-phase islanded microgrid consisting of hybrid source PV/battery units

    DEFF Research Database (Denmark)

    Karimi, Yaser; Guerrero, Josep M.; Oraee, Hashem

    2016-01-01

This paper proposes a new decentralized power management and load sharing method for a photovoltaic based, hybrid single/three-phase islanded microgrid consisting of various PV units, battery units and hybrid PV/battery units. The proposed method takes into account the available PV power...... and battery conditions of the units to share the load among them and power flow among different phases is performed automatically through three-phase units. Modified active power-frequency droop functions are used according to operating states of each unit and the frequency level is used as trigger...... for switching between the states. Efficacy of the proposed method in different load, PV generation and battery conditions is validated experimentally in a microgrid lab prototype consisting of one three-phase unit and two single-phase units....

  9. Electronic structure of thin films by the self-consistent numerical-basis-set linear combination of atomic orbitals method: Ni(001)

    International Nuclear Information System (INIS)

    Wang, C.S.; Freeman, A.J.

    1979-01-01

    We present the self-consistent numerical-basis-set linear combination of atomic orbitals (LCAO) discrete variational method for treating the electronic structure of thin films. As in the case of bulk solids, this method provides for thin films accurate solutions of the one-particle local density equations with a non-muffin-tin potential. Hamiltonian and overlap matrix elements are evaluated accurately by means of a three-dimensional numerical Diophantine integration scheme. Application of this method is made to the self-consistent solution of one-, three-, and five-layer Ni(001) unsupported films. The LCAO Bloch basis set consists of valence orbitals (3d, 4s, and 4p states for transition metals) orthogonalized to the frozen-core wave functions. The self-consistent potential is obtained iteratively within the superposition of overlapping spherical atomic charge density model with the atomic configurations treated as adjustable parameters. Thus the crystal Coulomb potential is constructed as a superposition of overlapping spherically symmetric atomic potentials and, correspondingly, the local density Kohn-Sham (α = 2/3) potential is determined from a superposition of atomic charge densities. At each iteration in the self-consistency procedure, the crystal charge density is evaluated using a sampling of 15 independent k points in (1/8)th of the irreducible two-dimensional Brillouin zone. The total density of states (DOS) and projected local DOS (by layer plane) are calculated using an analytic linear energy triangle method (presented as an Appendix) generalized from the tetrahedron scheme for bulk systems. Distinct differences are obtained between the surface and central plane local DOS. The central plane DOS is found to converge rapidly to the DOS of bulk paramagnetic Ni obtained by Wang and Callaway. Only a very small surplus charge (0.03 electron/atom) is found on the surface planes, in agreement with jellium model calculations

  10. Consistency argued students of fluid

    Science.gov (United States)

    Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma

    2017-01-01

Problem solving for physics concepts through consistent argumentation can improve students' thinking skills and is important in science. This study aims to assess the consistency of students' argumentation on fluid material. The population of this study comprises college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Sampling used cluster random sampling, with 145 students obtained as the sample. The study used a descriptive survey method. Data were obtained through a multiple-choice test and reasoned interviews. The fluid problems were modified from [9] and [1]. The results of the study gave average argumentation consistency of 4.85% for correct consistency, 29.93% for incorrect consistency, and 65.23% for inconsistency. These data point to a lack of understanding of the fluid material, which ideally, under fully consistent argumentation, would support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies to obtain a positive change in the consistency of argumentation.

  11. Analysis Method of Transfer Pricing Used by Multinational Companies Related to Tax Avoidance and its Consistencies to the Arm's Length Principle

    Directory of Open Access Journals (Sweden)

    Nuraini Sari

    2015-12-01

Full Text Available The purpose of this study is to evaluate how Starbucks Corporation uses transfer pricing to minimize its tax bill. In addition, the study evaluates how Indonesia's domestic rules could handle a case like Starbucks UK's if it happened in Indonesia. Three steps were conducted in this study. First, using information provided by UK Her Majesty's Revenue and Customs (HMRC) and other related articles, find the methods used by Starbucks UK to minimize its tax bill. Second, find the Organisation for Economic Co-operation and Development (OECD) viewpoint regarding the Starbucks Corporation case. Third, analyze how Indonesia's transfer pricing rules would work if Starbucks UK's case happened in Indonesia. The results showed that there were three inter-company transactions that helped Starbucks UK minimize its tax bill: coffee costs, royalties on intangible property, and interest on inter-company loans. Through a study of the OECD's BEPS action plans, it is recommended to improve the OECD Model Tax Convention, including Indonesia's domestic tax rules, in order to produce fair and transparent judgments on transfer pricing. This study concluded that under current tax rules, although UK HMRC has been disadvantaged by the transfer pricing practices of many multinational companies, it still cannot prove that those practices are inconsistent with the arm's length principle. Therefore, current international tax rules need to be improved.

  12. Hybrid method for consistent model of the Pacific absolute plate motion and a test for inter-hotspot motion since 70Ma

    Science.gov (United States)

    Harada, Y.; Wessel, P.; Sterling, A.; Kroenke, L.

    2002-12-01

    Inter-hotspot motion within the Pacific plate is one of the most controversial issues in recent geophysical studies. However, it is a fact that many geophysical and geological data including ages and positions of seamount chains in the Pacific plate can largely be explained by a simple model of absolute motion derived from assumptions of rigid plates and fixed hotspots. Therefore we take the stand that if a model of plate motion can explain the ages and positions of Pacific hotspot tracks, inter-hotspot motion would not be justified. On the other hand, if any discrepancies between the model and observations are found, the inter-hotspot motion may then be estimated from these discrepancies. To make an accurate model of the absolute motion of the Pacific plate, we combined two different approaches: the polygonal finite rotation method (PFRM) by Harada and Hamano (2000) and the hot-spotting technique developed by Wessel and Kroenke (1997). The PFRM can determine accurate positions of finite rotation poles for the Pacific plate if the present positions of hotspots are known. On the other hand, the hot-spotting technique can predict present positions of hotspots if the absolute plate motion is given. Therefore we can undertake iterative calculations using the two methods. This hybrid method enables us to determine accurate finite rotation poles for the Pacific plate solely from geometry of Hawaii, Louisville and Easter(Crough)-Line hotspot tracks from around 70 Ma to present. Information of ages can be independently assigned to the model after the poles and rotation angles are determined. We did not detect any inter-hotspot motion from the geometry of these Pacific hotspot tracks using this method. The Ar-Ar ages of Pacific seamounts including new age data of ODP Leg 197 are used to test the newly determined model of the Pacific plate motion. 
    The ages of Hawaii, Louisville, Easter(Crough)-Line, and Cobb hotspot tracks are quite consistent with each other from 70 Ma to ...
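The iterative PFRM/hot-spotting coupling described above is, structurally, a fixed-point alternation: assume hotspot positions, fit finite rotations, re-predict the positions, and repeat until nothing changes. A minimal sketch of that control flow (the `update` callable and the toy contraction mapping are placeholders, not the authors' geophysical computation):

```python
import math

def fixed_point_alternation(update, state, tol=1e-9, max_iters=1000):
    """Skeleton of the hybrid method's loop: apply one combined round of
    'fit finite rotations from assumed hotspot positions, then re-predict
    the positions from the fitted motion' until the state stops changing."""
    for _ in range(max_iters):
        new_state = update(state)
        if abs(new_state - state) < tol:
            return new_state
        state = new_state
    return state

# Toy stand-in for the PFRM + hot-spotting round trip: any contraction
# mapping converges the same way (x -> cos(x) has fixed point ~0.7391)
result = fixed_point_alternation(math.cos, 1.0)
```

The real method iterates over rotation poles and hotspot positions rather than a scalar, but the convergence test and loop shape are the same.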

  13. The Plumber’s Nightmare Phase in Diblock Copolymer/Homopolymer Blends. A Self-Consistent Field Theory Study.

    KAUST Repository

    Martinez-Veracoechea, Francisco J.

    2009-11-24

    Using self-consistent field theory, the Plumber's Nightmare and the double diamond phases are predicted to be stable in a finite region of phase diagrams for blends of AB diblock copolymer (DBC) and A-component homopolymer. To the best of our knowledge, this is the first time that the P phase has been predicted to be stable using self-consistent field theory. The stabilization is achieved by tuning the composition or conformational asymmetry of the DBC chain, and the architecture or length of the homopolymer. The basic features of the phase diagrams are the same in all cases studied, suggesting a general type of behavior for these systems. Finally, it is noted that the homopolymer length should be a convenient variable to stabilize bicontinuous phases in experiments. © 2009 American Chemical Society.

  14. A relativistic self-consistent model for studying enhancement of space charge limited emission due to counter-streaming ions

    Science.gov (United States)

    Lin, M. C.; Verboncoeur, J.

    2016-10-01

    A maximum electron current transmitted through a planar diode gap is limited by space charge of electrons dwelling across the gap region, the so called space charge limited (SCL) emission. By introducing a counter-streaming ion flow to neutralize the electron charge density, the SCL emission can be dramatically raised, so electron current transmission gets enhanced. In this work, we have developed a relativistic self-consistent model for studying the enhancement of maximum transmission by a counter-streaming ion current. The maximum enhancement is found when the ion effect is saturated, as shown analytically. The solutions in non-relativistic, intermediate, and ultra-relativistic regimes are obtained and verified with 1-D particle-in-cell simulations. This self-consistent model is general and can also serve as a comparison for verification of simulation codes, as well as extension to higher dimensions.
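For context, the non-relativistic, ion-free limit of the model above is the classical Child-Langmuir law, J = (4ε0/9)·√(2e/m)·V^(3/2)/d²; a short sketch of that baseline (function name is ours; the paper's relativistic, ion-neutralized solution generalizes it):

```python
import math

# Classical (non-relativistic) Child-Langmuir space-charge-limited current
# density for a planar vacuum diode -- the limit the relativistic,
# ion-neutralized model reduces to when ions are absent and eV << m_e c^2.
EPS0 = 8.8541878128e-12     # vacuum permittivity, F/m
E_CHARGE = 1.602176634e-19  # elementary charge, C
M_E = 9.1093837015e-31      # electron mass, kg

def child_langmuir_J(voltage_V, gap_m):
    """Space-charge-limited current density (A/m^2) across a planar gap."""
    return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * E_CHARGE / M_E) \
        * voltage_V ** 1.5 / gap_m ** 2

# Example: 100 kV across a 1 cm gap gives roughly 7.4e5 A/m^2
J = child_langmuir_J(1.0e5, 1.0e-2)
```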

  15. Evaluation of the quality consistency of powdered poppy capsule extractive by an averagely linear-quantified fingerprint method in combination with antioxidant activities and two compounds analyses.

    Science.gov (United States)

    Zhang, Yujing; Sun, Guoxiang; Hou, Zhifei; Yan, Bo; Zhang, Jing

    2017-12-01

    A novel averagely linear-quantified fingerprint method was proposed and successfully applied to monitor the quality consistency of alkaloids in powdered poppy capsule extractive. Averagely linear-quantified fingerprint method provided accurate qualitative and quantitative similarities for chromatographic fingerprints of Chinese herbal medicines. The stability and operability of the averagely linear-quantified fingerprint method were verified by the parameter r. The average linear qualitative similarity SL (improved based on conventional qualitative "Similarity") was used as a qualitative criterion in the averagely linear-quantified fingerprint method, and the average linear quantitative similarity PL was introduced as a quantitative one. PL was able to identify the difference in the content of all the chemical components. In addition, PL was found to be highly correlated to the contents of two alkaloid compounds (morphine and codeine). A simple flow injection analysis was developed for the determination of antioxidant capacity in Chinese Herbal Medicines, which was based on the scavenging of 2,2-diphenyl-1-picrylhydrazyl radical by antioxidants. The fingerprint-efficacy relationship linking chromatographic fingerprints and antioxidant activities was investigated utilizing orthogonal projection to latent structures method, which provided important pharmacodynamic information for Chinese herbal medicines quality control. In summary, quantitative fingerprinting based on averagely linear-quantified fingerprint method can be applied for monitoring the quality consistency of Chinese herbal medicines, and the constructed orthogonal projection to latent structures model is particularly suitable for investigating the fingerprint-efficacy relationship. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
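The paper defines its own SL and PL indices; as a rough illustration of the underlying distinction between qualitative (pattern) and quantitative (content) fingerprint comparison, here is a hedged sketch using the conventional cosine similarity and a simple overall content ratio (both functions are simplifications, not the authors' formulas):

```python
import math

def qualitative_similarity(sample, reference):
    # Cosine of the angle between peak-area vectors: sensitive to the
    # *pattern* of peaks, insensitive to an overall scale factor.
    dot = sum(s * r for s, r in zip(sample, reference))
    norm_s = math.sqrt(sum(s * s for s in sample))
    norm_r = math.sqrt(sum(r * r for r in reference))
    return dot / (norm_s * norm_r)

def quantitative_similarity(sample, reference):
    # Ratio of total peak area to the reference: sensitive to overall
    # component *content*, which the cosine measure ignores.
    return sum(sample) / sum(reference)

ref = [10.0, 5.0, 2.0, 1.0]
half_strength = [5.0, 2.5, 1.0, 0.5]  # same pattern, half the content
```

A half-strength extract scores 1.0 on the qualitative index but only 0.5 on the quantitative one, which is exactly why a purely qualitative "Similarity" cannot monitor content consistency.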

  16. Standard test methods for determining chemical durability of nuclear, hazardous, and mixed waste glasses and multiphase glass ceramics: The product consistency test (PCT)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2002-01-01

    1.1 These product consistency test methods A and B evaluate the chemical durability of homogeneous glasses, phase separated glasses, devitrified glasses, glass ceramics, and/or multiphase glass ceramic waste forms hereafter collectively referred to as “glass waste forms” by measuring the concentrations of the chemical species released to a test solution. 1.1.1 Test Method A is a seven-day chemical durability test performed at 90 ± 2°C in a leachant of ASTM-Type I water. The test method is static and conducted in stainless steel vessels. Test Method A can specifically be used to evaluate whether the chemical durability and elemental release characteristics of nuclear, hazardous, and mixed glass waste forms have been consistently controlled during production. This test method is applicable to radioactive and simulated glass waste forms as defined above. 1.1.2 Test Method B is a durability test that allows testing at various test durations, test temperatures, mesh size, mass of sample, leachant volume, a...
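PCT results are conventionally reported as normalized elemental releases, NL_i = c_i / (f_i · SA/V); a minimal sketch of that arithmetic (variable names and example numbers are illustrative, not taken from the standard):

```python
def normalized_release(conc_g_per_L, mass_fraction, sa_over_v_m2_per_L):
    """Normalized elemental mass loss NL_i (g/m^2):
    leachate concentration of element i, divided by the element's mass
    fraction in the unleached glass times the surface-area-to-volume ratio."""
    return conc_g_per_L / (mass_fraction * sa_over_v_m2_per_L)

# Example: boron at 0.02 g/L in the leachate, glass 3% B by mass,
# SA/V = 2 m^2/L (i.e., the nominal 2000 m^-1)
nl_B = normalized_release(0.02, 0.03, 2.0)
```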

  17. Personality and behavior prediction and consistency across cultures: A multimethod study of Blacks and Whites in South Africa.

    Science.gov (United States)

    Fetvadjiev, Velichko H; Meiring, Deon; van de Vijver, Fons J R; Nel, J Alewyn; Sekaja, Lusanda; Laher, Sumaya

    2018-03-01

    The cross-cultural universality of behavior's consistency and predictability from personality, assumed in trait models though challenged in cultural psychological models, has usually been operationalized in terms of beliefs and perceptions, and assessed using single-instance self-reports. In a multimethod study of actual behavior across a range of situations, we examined predictability and consistency in participants from the more collectivistic Black ethnic group and the more individualistic White group in South Africa. Participants completed personality questionnaires before the behavior measurements. In Study 1, 107 Black and 241 White students kept diaries for 21 days, recording their behaviors and the situations in which they had occurred. In Study 2, 57 Black and 52 White students were video-recorded in 12 situations in laboratory settings, and external observers scored their behaviors. Across both studies, behavior was predicted by personality on average equally well in the 2 groups, and equally well when using trait-adjective- and behavior-based personality measures. The few cultural differences in situational variability were not in line with individualism-collectivism; however, subjective perceptions of variability, operationalized as dialectical beliefs, were more in line with individualism-collectivism: Blacks viewed their behavior as more variable than Whites. We propose drawing a distinction between subjective beliefs and objective behavior in the study of personality and culture. Larger cultural differences can be expected in beliefs and perceptions than in the links between personality and actual behavior. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  18. Diagnostic consistency and interchangeability of schizophrenic disorders and bipolar disorders: A 7-year follow-up study.

    Science.gov (United States)

    Hung, Yen-Ni; Yang, Shu-Yu; Kuo, Chian-Jue; Lin, Shih-Ku

    2018-03-01

    The change in psychiatric diagnoses in clinical practice is not an unusual phenomenon. The interchange between the diagnoses of schizophrenic disorders and bipolar disorders is a major clinical issue because of the differences in treatment regimens and long-term prognoses. In this study, we used a nationwide population-based sample to compare the diagnostic consistency and interchange rate between schizophrenic disorders and bipolar disorders. In total, 25 711 and 11 261 patients newly diagnosed as having schizophrenic disorder and bipolar disorder, respectively, were retrospectively enrolled from the Psychiatric Inpatient Medical Claims database between 2001 and 2005. We followed these two cohorts for 7 years to determine whether their diagnoses were consistent throughout subsequent hospitalizations. The interchange between the two diagnoses was analyzed. In the schizophrenic disorder cohort, the overall diagnostic consistency rate was 87.3% and the rate of change to bipolar disorder was 3.0% during the 7-year follow-up. Additional analyses of subtypes revealed that the change rate from schizoaffective disorder to bipolar disorder was 12.0%. In the bipolar disorder cohort, the overall diagnostic consistency rate was 71.9% and the rate of change to schizophrenic disorder was 8.3%. Changes in the diagnosis of a major psychosis are not uncommon. The interchange between the diagnoses of schizophrenic disorders and bipolar disorders might be attributed to the evolution of clinical symptoms and the observation of preserved social functions that contradict the original diagnosis. While making a psychotic diagnosis, clinicians should be aware of the possibility of the change in diagnosis in the future. © 2017 The Authors. Psychiatry and Clinical Neurosciences © 2017 Japanese Society of Psychiatry and Neurology.
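The reported rates reduce to simple proportions over each patient's follow-up diagnoses; a toy sketch (hypothetical mini-cohort, not the study's data):

```python
def consistency_and_change(followup_diagnoses, index_dx, other_dx):
    """Fraction of patients whose subsequent hospitalizations all keep the
    index diagnosis, and fraction who ever switch to the other diagnosis."""
    n = len(followup_diagnoses)
    consistent = sum(1 for dxs in followup_diagnoses
                     if all(d == index_dx for d in dxs))
    switched = sum(1 for dxs in followup_diagnoses if other_dx in dxs)
    return consistent / n, switched / n

# Each inner list is one patient's diagnoses over follow-up admissions
cohort = [["SCZ", "SCZ"], ["SCZ"], ["SCZ", "BD"], ["SCZ", "SCZ", "SCZ"]]
rates = consistency_and_change(cohort, "SCZ", "BD")  # (0.75, 0.25)
```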

  19. Optimum collective submanifold in resonant cases by the self-consistent collective-coordinate method for large-amplitude collective motion

    International Nuclear Information System (INIS)

    Hashimoto, Y.; Marumori, T.; Sakata, F.

    1987-01-01

    With the purpose of clarifying characteristic difference of the optimum collective submanifolds in nonresonant and resonant cases, we develop an improved method of solving the basic equations of the self-consistent collective-coordinate (SCC) method for large-amplitude collective motion. It is shown that, in the resonant cases, there inevitably arise essential coupling terms which break the maximal-decoupling property of the collective motion, and we have to extend the optimum collective submanifold so as to properly treat the degrees of freedom which bring about the resonances

  20. Quasiparticle self-consistent GW study of cuprates: electronic structure, model parameters, and the two-band theory for Tc.

    Science.gov (United States)

    Jang, Seung Woo; Kotani, Takao; Kino, Hiori; Kuroki, Kazuhiko; Han, Myung Joon

    2015-07-24

    Despite decades of progress, an understanding of unconventional superconductivity still remains elusive. An important open question is about the material dependence of the superconducting properties. Using the quasiparticle self-consistent GW (QSGW) method, we re-examine the electronic structure of copper oxide high-Tc materials. We show that QSGW captures several important features, distinctive from the conventional LDA results. The energy level splitting between d(x²-y²) and d(3z²-r²) is significantly enlarged and the van Hove singularity point is lowered. The calculated results compare better than LDA with recent experimental results from resonant inelastic X-ray scattering and angle-resolved photoemission experiments. This agreement with the experiments supports the previously suggested two-band theory for the material dependence of the superconducting transition temperature, Tc.

  1. Multiscale methods framework: self-consistent coupling of molecular theory of solvation with quantum chemistry, molecular simulations, and dissipative particle dynamics.

    Science.gov (United States)

    Kovalenko, Andriy; Gusarov, Sergey

    2018-01-31

    In this work, we will address different aspects of self-consistent field coupling of computational chemistry methods at different time and length scales in modern materials and biomolecular science. Multiscale methods framework yields dramatically improved accuracy, efficiency, and applicability by coupling models and methods on different scales. This field benefits many areas of research and applications by providing fundamental understanding and predictions. It could also play a particular role in commercialization by guiding new developments and by allowing quick evaluation of prospective research projects. We employ molecular theory of solvation which allows us to accurately introduce the effect of the environment on complex nano-, macro-, and biomolecular systems. The uniqueness of this method is that it can be naturally coupled with the whole range of computational chemistry approaches, including QM, MM, and coarse graining.

  2. QUALITATIVE METHODS IN CREATIVITY STUDIES

    DEFF Research Database (Denmark)

    Hertel, Frederik

    2015-01-01

    In this article we will focus on developing a qualitative research design suitable for conducting case study in creativity. The case is a team of workers (See Hertel, 2015) doing industrial cleaning in the Danish food industry. The hypothesis is that these workers are both participating in......-specific methods, involving a discussion of creativity test, divergent and convergent thinking, for studying creativity in this specific setting. Beside from that we will develop a research design involving a combination of methods necessary for conducting a case study in the setting mentioned....

  3. (In)Consistencies in Responses to Sodium Bicarbonate Supplementation: A Randomised, Repeated Measures, Counterbalanced and Double-Blind Study.

    Science.gov (United States)

    Froio de Araujo Dias, Gabriela; da Eira Silva, Vinicius; de Salles Painelli, Vitor; Sale, Craig; Giannini Artioli, Guilherme; Gualano, Bruno; Saunders, Bryan

    2015-01-01

    Intervention studies do not account for high within-individual variation potentially compromising the magnitude of an effect. Repeat administration of a treatment allows quantification of individual responses and determination of the consistency of responses. We determined the consistency of metabolic and exercise responses following repeated administration of sodium bicarbonate (SB). 15 physically active males (age 25±4 y; body mass 76.0±7.3 kg; height 1.77±0.05 m) completed six cycling capacity tests at 110% of maximum power output (CCT110%) following ingestion of either 0.3 g∙kg-1BM of SB (4 trials) or placebo (PL, 2 trials). Blood pH, bicarbonate, base excess and lactate were determined at baseline, pre-exercise, post-exercise and 5-min post-exercise. Total work done (TWD) was recorded as the exercise outcome. SB supplementation increased blood pH, bicarbonate and base excess prior to every trial (all p ≤ 0.001); absolute changes in pH, bicarbonate and base excess from baseline to pre-exercise were similar in all SB trials (all p > 0.05). Blood lactate was elevated following exercise in all trials (p ≤ 0.001), and was higher in some, but not all, SB trials compared to PL. TWD was not significantly improved with SB vs. PL in any trial (SB1: +3.6%; SB2 +0.3%; SB3: +2.1%; SB4: +6.7%; all p > 0.05), although magnitude-based inferences suggested a 93% likely improvement in SB4. Individual analysis showed ten participants improved in at least one SB trial above the normal variation of the test although five improved in none. The mechanism for improved exercise with SB was consistently in place prior to exercise, although this only resulted in a likely improvement in one trial. SB does not consistently improve high intensity cycling capacity, with results suggesting that caution should be taken when interpreting the results from single trials as to the efficacy of SB supplementation. ClinicalTrials.gov NCT02474628.
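One standard way to decide whether an individual "improved above the normal variation of the test" is to derive the typical error of measurement from the two placebo trials and require an SB trial to beat it; a hedged sketch (the threshold convention is an assumption, not necessarily the authors' exact criterion, and the TWD values are hypothetical):

```python
import math
import statistics

def typical_error(test1, test2):
    """Within-subject typical error of measurement: SD of the
    test-retest differences divided by sqrt(2)."""
    diffs = [b - a for a, b in zip(test1, test2)]
    return statistics.stdev(diffs) / math.sqrt(2)

def improved(sb_twd, best_placebo_twd, te):
    # Count an SB trial as a real improvement only if it beats the better
    # placebo trial by more than the test's normal variation.
    return sb_twd > best_placebo_twd + te

# Hypothetical total-work-done values (kJ) for 4 participants, two PL trials
pl1 = [42.0, 39.5, 45.1, 40.2]
pl2 = [43.1, 40.0, 44.0, 41.5]
te = typical_error(pl1, pl2)
```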

  4. (In)Consistencies in Responses to Sodium Bicarbonate Supplementation: A Randomised, Repeated Measures, Counterbalanced and Double-Blind Study.

    Directory of Open Access Journals (Sweden)

    Gabriela Froio de Araujo Dias

    Full Text Available Intervention studies do not account for high within-individual variation potentially compromising the magnitude of an effect. Repeat administration of a treatment allows quantification of individual responses and determination of the consistency of responses. We determined the consistency of metabolic and exercise responses following repeated administration of sodium bicarbonate (SB). 15 physically active males (age 25±4 y; body mass 76.0±7.3 kg; height 1.77±0.05 m) completed six cycling capacity tests at 110% of maximum power output (CCT110%) following ingestion of either 0.3 g∙kg-1BM of SB (4 trials) or placebo (PL, 2 trials). Blood pH, bicarbonate, base excess and lactate were determined at baseline, pre-exercise, post-exercise and 5-min post-exercise. Total work done (TWD) was recorded as the exercise outcome. SB supplementation increased blood pH, bicarbonate and base excess prior to every trial (all p ≤ 0.001); absolute changes in pH, bicarbonate and base excess from baseline to pre-exercise were similar in all SB trials (all p > 0.05). Blood lactate was elevated following exercise in all trials (p ≤ 0.001), and was higher in some, but not all, SB trials compared to PL. TWD was not significantly improved with SB vs. PL in any trial (SB1: +3.6%; SB2: +0.3%; SB3: +2.1%; SB4: +6.7%; all p > 0.05), although magnitude-based inferences suggested a 93% likely improvement in SB4. Individual analysis showed ten participants improved in at least one SB trial above the normal variation of the test although five improved in none. The mechanism for improved exercise with SB was consistently in place prior to exercise, although this only resulted in a likely improvement in one trial. SB does not consistently improve high intensity cycling capacity, with results suggesting that caution should be taken when interpreting the results from single trials as to the efficacy of SB supplementation. ClinicalTrials.gov NCT02474628.

  5. A self-consistent MoD-QM/MM structural refinement method: characterization of hydrogen bonding in the Oxytricha nova G-quadruplex

    Energy Technology Data Exchange (ETDEWEB)

    Batista, Enrique R [Los Alamos National Laboratory]; Newcomer, Michael B [YALE UNIV]; Raggin, Christina M [YALE UNIV]; Gascon, Jose A [YALE UNIV]; Loria, J Patrick [YALE UNIV]; Batista, Victor S [YALE UNIV]

    2008-01-01

    This paper generalizes the MoD-QM/MM hybrid method, developed for ab initio computations of protein electrostatic potentials [Gascón, J.A.; Leung, S.S.F.; Batista, E.R.; Batista, V.S. J. Chem. Theory Comput. 2006, 2, 175-186], as a practical algorithm for structural refinement of extended systems. The computational protocol involves a space-domain decomposition scheme for the formal fragmentation of extended systems into smaller, partially overlapping, molecular domains and the iterative self-consistent energy minimization of the constituent domains by relaxation of their geometry and electronic structure. The method accounts for mutual polarization of the molecular domains, modeled as quantum-mechanical (QM) layers embedded in the otherwise classical molecular-mechanics (MM) environment according to QM/MM hybrid methods. The method is applied to the description of benchmark model systems that allow for direct comparisons with full QM calculations, and subsequently applied to the structural characterization of the DNA Oxytricha nova Guanine quadruplex (G4). The resulting MoD-QM/MM structural model of the DNA G4 is compared to recently reported high-resolution X-ray diffraction and NMR models, and partially validated by direct comparisons between {sup 1}H NMR chemical shifts that are highly sensitive to hydrogen-bonding and stacking interactions and the corresponding theoretical values obtained at the density functional theory (DFT) QM/MM (BH&H/6-31G*:Amber) level in conjunction with the gauge-independent atomic orbital (GIAO) method for the ab initio self-consistent field (SCF) calculation of NMR chemical shifts.

  6. RNA-seq reveals more consistent reference genes for gene expression studies in human non-melanoma skin cancers

    Directory of Open Access Journals (Sweden)

    Van L.T. Hoang

    2017-08-01

    Full Text Available Identification of appropriate reference genes (RGs is critical to accurate data interpretation in quantitative real-time PCR (qPCR experiments. In this study, we have utilised next generation RNA sequencing (RNA-seq to analyse the transcriptome of a panel of non-melanoma skin cancer lesions, identifying genes that are consistently expressed across all samples. Genes encoding ribosomal proteins were amongst the most stable in this dataset. Validation of this RNA-seq data was examined using qPCR to confirm the suitability of a set of highly stable genes for use as qPCR RGs. These genes will provide a valuable resource for the normalisation of qPCR data for the analysis of non-melanoma skin cancer.
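A common first pass at "consistently expressed across all samples" is to rank candidate genes by the coefficient of variation of their normalized expression; a sketch of that idea (the authors' exact stability metric may differ, and the expression values here are invented):

```python
import statistics

def rank_by_stability(expression):
    """expression: dict mapping gene -> list of normalized counts across
    samples. Returns genes sorted from most to least stable (lowest
    coefficient of variation first)."""
    def cv(values):
        return statistics.stdev(values) / statistics.mean(values)
    return sorted(expression, key=lambda g: cv(expression[g]))

expr = {
    "RPL13A": [101, 98, 103, 100],  # ribosomal protein: nearly flat
    "MKI67":  [10, 80, 35, 150],    # proliferation marker: highly variable
}
order = rank_by_stability(expr)  # RPL13A ranks as the more stable RG
```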

  7. Assessment of effects of atomoxetine in adult patients with ADHD: consistency among three geographic regions in a response maintenance study.

    Science.gov (United States)

    Tanaka, Yoko; Escobar, Rodrigo; Upadhyaya, Himanshu P

    2017-06-01

    A previous study (Upadhyaya et al. in Eur J Psychiatry 2013b; 27:185-205) reported that adults with attention-deficit/hyperactivity disorder (ADHD) demonstrated maintenance of response for up to 25 weeks after initially responding to atomoxetine treatment. In the present report, the consistency of treatment effect across three geographic regions (Europe, United States/Canada [US/Can], and Latin America [Latin Am]) was explored. Data were analyzed from a phase 3, multicenter, randomized, double-blind, maintenance-of-response (randomized withdrawal) trial of atomoxetine versus placebo in adults with ADHD. Patients were randomized to atomoxetine (N = 266) or placebo (N = 258) for 25 weeks. Consistency assessments included the interaction test, pairwise t tests, noninferiority, and the criteria from Basic Principles on Global Clinical Trials (Ministry of Health, Labour and Welfare of Japan 2007). Atomoxetine-treated patients maintained the improved ADHD symptoms relative to placebo-treated patients on the Conners' Adult ADHD Rating Scale Investigator-Rated: Screening Version 18-Item (CAARS-Inv:SV) total score in all three regions (atomoxetine-placebo mean difference = -4.55, -3.18, and -0.07 for Europe, US/Can, and Latin Am, respectively). For the Latin Am region, the mean change in total score (0.41) was notably smaller for the placebo group than for Europe (5.87) and US/Can (4.39). Similar results were observed for the CAARS-Inv:SV hyperactivity/impulsivity and inattention subscale scores. Overall, patients maintained the response with atomoxetine treatment compared to placebo; however, the magnitude of treatment effect differed among the regions studied, being numerically higher in the EU and US/Can than Latin Am. Trial registration: http://www.clinicaltrials.gov/ (NCT00700427).

  8. The reflexive case study method

    DEFF Research Database (Denmark)

    Rittenhofer, Iris

    2015-01-01

    This paper extends the international business research on small to medium-sized enterprises (SME) at the nexus of globalization. Based on a conceptual synthesis across disciplines and theoretical perspectives, it offers management research a reflexive method for case study research of postnational...

  9. Two-dimensional tracking and TDI are consistent methods for evaluating myocardial longitudinal peak strain in left and right ventricle basal segments in athletes

    OpenAIRE

    Stefani, L.; Toncelli, L.; Gianassi, M.; Manetti, P.; Di Tante, V.; Vono, M.R.; Moretti, A.; Cappelli, B.; Pedrizzetti, G.; Galanti, G.

    2007-01-01

    Abstract Background Myocardial contractility can be investigated using longitudinal peak strain. It can be calculated using the Doppler-derived TDI method and the non-Doppler method based on tissue tracking on B-mode images. Both are validated and show good reproducibility, but no comparative analysis of their results has yet been conducted. This study analyzes the results obtained from the basal segments of the ventricular chambers in a group of athletes. Methods 30 regularly-trained athlete...

  10. Is This Year's Exam as Demanding as Last Year's? Using a Pilot Method to Evaluate the Consistency of Examination Demands over Time

    Science.gov (United States)

    Crisp, Victoria; Novakovic, Nadezda

    2009-01-01

    Maintaining standards over time is a much debated topic in the context of national examinations in the UK. This study used a pilot method to compare the demands, over time, of two examination units testing administration. The method involved 15 experts revising a framework of demand types and making paired comparisons of examinations from…
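Paired-comparison judgments of this kind are commonly scaled with a Bradley-Terry model, named here purely as an illustration (the study's own analysis may use a different scaling). A minimal fit via the standard iterative maximum-likelihood update, on invented judgment counts:

```python
def bradley_terry(items, wins, iters=200):
    """wins[(a, b)] = number of times item a was judged more demanding
    than item b. Returns a strength score per item (higher = more
    demanding), fitted by the standard MM/iterative-scaling update."""
    p = {i: 1.0 for i in items}
    for _ in range(iters):
        new = {}
        for i in items:
            w_i = sum(w for (a, b), w in wins.items() if a == i)
            denom = 0.0
            for j in items:
                if j == i:
                    continue
                n_ij = wins.get((i, j), 0) + wins.get((j, i), 0)
                if n_ij:
                    denom += n_ij / (p[i] + p[j])
            new[i] = w_i / denom if denom else p[i]
        total = sum(new.values())
        p = {i: v * len(items) / total for i, v in new.items()}
    return p

# Hypothetical judgments: the 2008 exam was rated more demanding than
# the 2007 exam in 10 of 15 expert comparisons
scores = bradley_terry(["2007", "2008"],
                       {("2008", "2007"): 10, ("2007", "2008"): 5})
```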

  11. Study of consistency properties of instillation liniment-gel for therapy of pyoinflammatory diseases of maxillofacial region

    Directory of Open Access Journals (Sweden)

    A. V. Kurinnoy

    2012-12-01

    Full Text Available Using the rotary viscometer «Reotest 2», studies of the consistency properties of an instillation gel-liniment for antimicrobial therapy of pyoinflammatory diseases of the maxillofacial area were conducted. It was found that the consistency properties of the gel-liniment lie within the limits of the rheological optimum of consistency for ointments, and the value of «mechanical stability» (1.33) characterizes the system as exceptionally thixotropic, providing recoverability of the system after loading, and allows forecasting the stability of the consistency properties of the gel-liniment during prolonged storage.

  12. Dimension of ring polymers in bulk studied by Monte-Carlo simulation and self-consistent theory.

    Science.gov (United States)

    Suzuki, Jiro; Takano, Atsushi; Deguchi, Tetsuo; Matsushita, Yushu

    2009-10-14

    We studied equilibrium conformations of ring polymers in melt over a wide range of segment numbers N of up to 4096 with Monte-Carlo simulation and obtained the N dependence of the radius of gyration of chains, R(g). The simulation model used is the bond fluctuation model (BFM), where polymer segments bear excluded volume; however, the excluded volume effect vanishes as N → ∞, and a linear polymer can be regarded as an ideal chain. Simulation for ring polymers in melt was performed, and the ν value in the relationship R(g) ∝ N^ν decreases gradually with increasing N, finally reaching the limiting value 1/3 in the range N ≥ 1536, i.e., R(g) ∝ N^(1/3). We confirmed that the simulation result is consistent with that of the self-consistent theory including the topological effect and the osmotic pressure of ring polymers. Moreover, the averaged chain conformation of ring polymers in the equilibrium state was given in the BFM. In the small-N region, the segment density of each molecule near its center of mass decreases with increasing N. In the large-N region the decrease is suppressed, and the density is found to be kept constant without showing N dependence. This means that ring polymer molecules do not segregate from the other molecules even if ring polymers in melt obey ν = 1/3. Considerably smaller dimensions of ring polymers at high molecular weight are due to their inherent nature of having no chain ends, and hence they have less-entangled conformations.
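The scaling exponent discussed above is read off as the slope of log R(g) versus log N; a sketch of such a fit on synthetic data generated with ν = 1/3 (the prefactor 0.5 is arbitrary):

```python
import math

def fit_exponent(Ns, Rgs):
    """Least-squares slope of log(Rg) against log(N): the scaling
    exponent nu in Rg ~ N**nu."""
    xs = [math.log(n) for n in Ns]
    ys = [math.log(r) for r in Rgs]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic data obeying Rg = 0.5 * N**(1/3), the large-N ring-melt regime
Ns = [256, 512, 1024, 2048, 4096]
Rgs = [0.5 * n ** (1.0 / 3.0) for n in Ns]
nu = fit_exponent(Ns, Rgs)  # recovers 1/3
```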

  13. Structural Consistency, Consistency, and Sequential Rationality.

    OpenAIRE

    Kreps, David M; Ramey, Garey

    1987-01-01

    Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...

  14. A study of self-consistent Hartree-Fock plus Bardeen-Cooper-Schrieffer calculations with finite-range interactions

    Science.gov (United States)

    Anguiano, M.; Lallena, A. M.; Co', G.; De Donno, V.

    2014-02-01

    In this work we test the validity of a Hartree-Fock plus Bardeen-Cooper-Schrieffer model in which a finite-range interaction is used in the two steps of the calculation by comparing the results obtained to those found in fully self-consistent Hartree-Fock-Bogoliubov calculations using the same interaction. Specifically, we consider the Gogny-type D1S and D1M forces. We study a wide range of spherical nuclei, far from the stability line, in various regions of the nuclear chart, from oxygen to tin isotopes. We calculate various quantities related to the ground state properties of these nuclei, such as binding energies, radii, charge and density distributions, and elastic electron scattering cross sections. The pairing effects are studied by direct comparison with the Hartree-Fock results. Despite its relative simplicity, in most cases, our model provides results very close to those of the Hartree-Fock-Bogoliubov calculations, and it reproduces the empirical evidence of pairing effects rather well in the nuclei investigated.

  15. Training Support Staff to Modify Fluids to Appropriate Safe Consistencies for Adults with Intellectual Disabilities and Dysphagia: An Efficacy Study

    Science.gov (United States)

    Chadwick, D. D.; Stubbs, J.; Fovargue, S.; Anderson, D.; Stacey, G.; Tye, S.

    2014-01-01

    Background: Modifying the consistency of food and drink is a strategy commonly used in the management of dysphagia for people with intellectual disabilities (ID). People with ID often depend on others for the preparation of food and drink and therefore depend on those caregivers achieving the correct consistency to keep them safe and avoid…

  16. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods.

    Science.gov (United States)

    Trijsburg, Laura; Geelen, Anouk; Hollman, Peter Ch; Hulshof, Paul Jm; Feskens, Edith Jm; Van't Veer, Pieter; Boshuizen, Hendriek C; de Vries, Jeanne Hm

    2017-03-01

    As misreporting, mostly under-reporting, of dietary intake is a generally known problem in nutritional research, we aimed to analyse the association between selected determinants and the extent of misreporting by the duplicate portion method (DP), 24 h recall (24hR) and FFQ by linear regression analysis using the biomarker values as unbiased estimates. For each individual, two DP, two 24hR, two FFQ and two 24 h urinary biomarkers were collected within 1·5 years. Also, for sixty-nine individuals one or two doubly labelled water measurements were obtained. The associations of basic determinants (BMI, gender, age and level of education) with misreporting of energy, protein and K intake of the DP, 24hR and FFQ were evaluated using linear regression analysis. Additionally, associations between other determinants, such as physical activity and smoking habits, and misreporting were investigated. The Netherlands. One hundred and ninety-seven individuals aged 20-70 years. Higher BMI was associated with under-reporting of dietary intake assessed by the different dietary assessment methods for energy, protein and K, except for K by DP. Men tended to under-report protein by the DP, FFQ and 24hR, and persons of older age under-reported K but only by the 24hR and FFQ. When adjusted for the basic determinants, the other determinants did not show a consistent association with misreporting of energy or nutrients and by the different dietary assessment methods. As BMI was the only consistent determinant of misreporting, we conclude that BMI should always be taken into account when assessing and correcting dietary intake.
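The core analysis regresses misreporting (reported intake minus the biomarker-based estimate) on determinants such as BMI; a toy sketch with closed-form simple OLS (the numbers are hypothetical, chosen only to show the direction of the reported association):

```python
def ols_slope_intercept(x, y):
    """Simple least-squares fit y = a + b*x, returned as (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical data: misreporting = reported - biomarker-based intake (MJ/d);
# increasingly negative values mean more under-reporting at higher BMI
bmi       = [20.0, 23.0, 26.0, 29.0, 32.0]
misreport = [-0.1, -0.5, -0.9, -1.4, -1.7]
a, b = ols_slope_intercept(bmi, misreport)  # b < 0: under-reporting grows
```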

  17. Methods to study postprandial lipemia

    DEFF Research Database (Denmark)

    Ooi, Teik Chye; Nordestgaard, Børge G

    2011-01-01

    to the liver. In general, PPL occurs over 4-6 h in normal individuals, depending on the amount and type of fats consumed. The complexity of PPL changes is compounded by ingestion of food before the previous meal is fully processed. PPL testing is done to determine the impact of (a) exogenous factors...... such as the amount and type of food consumed, and (b) endogenous factors such as the metabolic/genetic status of the subjects, on PPL. To study PPL appropriately, different methods are used to suit the study goal. This paper provides an overview of the methodological aspects of PPL testing. It deals with markers...

  18. Study of the behavior of the consistency rates of a clay with the incorporation of waste of burned ceramic blocks

    International Nuclear Information System (INIS)

    Oliveira, Orley Magalhaes de; Crivelari, Rubem Mateus; Munhoz Junior, Antonio Hortencio; Silva-Valenzuela, Maria das Gracas da; Valenzuela-Diaz, Francisco Rolando

    2016-01-01

    One of the important steps in the manufacture of a structural ceramic product is its forming. The clay that is the basis for these products needs to have an appropriate plasticity. In industries that produce ceramic blocks and tiles, the plasticity of the clay is a key property for production. These industries generate many lots of pieces that do not pass quality control because they lack a uniform visual appearance or have small cracks; such lots are usually discarded, which wastes material and produces a large amount of waste. The objective of this work is to study the behaviour of the consistency indexes, namely the plastic limit (PL), the liquid limit (LL) and the plasticity index (PI), of a clay from Vitoria da Conquista, Bahia, with the addition of several percentages of waste from burnt and ground ceramic blocks. Our results demonstrate that the addition of the reject only affects the plasticity of the clay for additions above 100%, which makes possible its incorporation in the ceramic paste. (author)

  19. Consistency of kinematic and kinetic patterns during a prolonged spell of cricket fast bowling: an exploratory laboratory study.

    Science.gov (United States)

    Schaefer, Andrew; O'dwyer, Nicholas; Ferdinands, René E D; Edwards, Suzi

    2018-03-01

    Due to the high incidence of lumbar spine injury in fast bowlers, international cricket organisations advocate limits on workload for bowlers under 19 years of age in training/matches. The purpose of this study was to determine whether significant changes in either fast bowling technique or movement variability could be detected throughout a 10-over bowling spell that exceeded the recommended limit. Twenty-five junior male fast bowlers bowled at competition pace while three-dimensional kinematic and kinetic data were collected for the leading leg, trunk and bowling arm. Separate analyses for the mean and within-participant standard deviation of each variable were performed using repeated measures factorial analyses of variance and computation of effect sizes. No substantial changes were observed in mean values or variability of any kinematic, kinetic or performance variables, which instead revealed a high degree of consistency in kinematic and kinetic patterns. Therefore, the suggestion that exceeding the workload limit per spell causes technique- and loading-related changes associated with lumbar injury risk is not valid and cannot be used to justify the restriction of bowling workload. For injury prevention, the focus instead should be on the long-term effect of repeated spells and on the fast bowling technique itself.
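    The summary statistics described above (per-bowler means, within-participant standard deviations, and an effect size between overs) can be sketched as follows. The data are simulated under an assumed structure of 25 bowlers by 10 overs for a single kinematic variable; the variable name and values are invented, not the study's measurements.

```python
# Minimal sketch of per-bowler mean, within-participant SD, and a between-over
# effect size (simulated data: 25 bowlers x 10 overs of a hypothetical variable,
# e.g. trunk flexion in degrees).
import random
from statistics import mean, stdev

random.seed(7)
n_bowlers, n_overs = 25, 10
data = [[random.gauss(35.0, 2.0) for _ in range(n_overs)] for _ in range(n_bowlers)]

per_bowler_mean = [mean(rows) for rows in data]
per_bowler_sd = [stdev(rows) for rows in data]  # within-participant variability

# Cohen's d between first and last over (pooled SD), a common effect-size choice
first = [rows[0] for rows in data]
last = [rows[-1] for rows in data]
pooled_sd = ((stdev(first) ** 2 + stdev(last) ** 2) / 2) ** 0.5
d = (mean(last) - mean(first)) / pooled_sd
print(f"mean within-bowler SD = {mean(per_bowler_sd):.2f} deg, effect size d = {d:.2f}")
```

    With no true change built into the simulation, the effect size stays small, mirroring the paper's finding of consistent patterns across the spell.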

  20. Parental depressive and anxiety symptoms during pregnancy and attention problems in children: a cross-cohort consistency study.

    NARCIS (Netherlands)

    van Batenburg, T.; Brion, M.J.; Henrichs, J.; Jaddoe, V.W.V.; Hofman, A.; Verhulst, F.C.; Lawlor, D.A.; Davey Smith, G.; Tiemeier, H.

    2013-01-01

    Background: Maternal depression and anxiety during pregnancy have been associated with offspring attention-deficit problems. Aim: We explored possible intrauterine effects by comparing maternal and paternal symptoms during pregnancy, by investigating cross-cohort consistency, and by investigating

  1. Parental depressive and anxiety symptoms during pregnancy and attention problems in children: A cross-cohort consistency study

    NARCIS (Netherlands)

    T. van Batenburg-Eddes (Tamara); M.-J. Brion (Maria); J. Henrichs (Jens); V.W.V. Jaddoe (Vincent); A. Hofman (Albert); F.C. Verhulst (Frank); D.A. Lawlor (Debbie); G. Davey-Smith (George); H.W. Tiemeier (Henning)

    2013-01-01

    Background: Maternal depression and anxiety during pregnancy have been associated with offspring attention-deficit problems. Aim: We explored possible intrauterine effects by comparing maternal and paternal symptoms during pregnancy, by investigating cross-cohort consistency, and by

  2. Vibrational frequency scaling factors for correlation consistent basis sets and the methods CC2 and MP2 and their spin-scaled SCS and SOS variants

    Energy Technology Data Exchange (ETDEWEB)

    Friese, Daniel H., E-mail: daniel.h.friese@uit.no [Centre for Theoretical and Computational Chemistry CTCC, Department of Chemistry, University of Tromsø, N-9037 Tromsø (Norway); Törk, Lisa; Hättig, Christof, E-mail: christof.haettig@rub.de [Lehrstuhl für Theoretische Chemie, Ruhr-Universität Bochum, D-44801 Bochum (Germany)

    2014-11-21

    We present scaling factors for vibrational frequencies calculated within the harmonic approximation and the correlated wave-function methods CC2 (the approximate coupled cluster singles and doubles model) and Møller-Plesset perturbation theory (MP2), with and without spin-component scaling (SCS) or spin-opposite scaling (SOS). Frequency scaling factors and the remaining deviations from the reference data are evaluated for several non-augmented basis sets of the cc-pVXZ family of generally contracted correlation-consistent basis sets as well as for the segmented contracted TZVPP basis. We find that the SCS and SOS variants of CC2 and MP2 lead to a slightly better accuracy for the scaled vibrational frequencies. The determined frequency scaling factors can also be used for vibrational frequencies calculated for excited states through response theory with CC2 and the algebraic diagrammatic construction through second order, as well as their spin-component scaled variants.
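    Scaling factors of this kind are conventionally determined by a least-squares fit of scaled harmonic frequencies to reference fundamentals, which has the closed form c = Σ ωᵢνᵢ / Σ ωᵢ². A minimal sketch with made-up frequencies (not values from this record):

```python
# Least-squares vibrational frequency scaling factor: choose c minimizing
# sum_i (c*omega_i - nu_i)^2, which gives c = sum(omega*nu) / sum(omega^2).
# The frequencies below are invented stand-ins, not data from the paper.
harmonic = [1650.0, 3120.0, 1210.0, 2980.0]   # computed harmonic frequencies, cm^-1
reference = [1600.0, 2995.0, 1180.0, 2870.0]  # experimental/reference fundamentals

c = sum(w * v for w, v in zip(harmonic, reference)) / sum(w * w for w in harmonic)
rms = (sum((c * w - v) ** 2
           for w, v in zip(harmonic, reference)) / len(harmonic)) ** 0.5
print(f"scaling factor c = {c:.4f}, RMS residual = {rms:.1f} cm^-1")
```

    The fitted factor lands a few percent below 1, as harmonic frequencies typically overestimate observed fundamentals.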

  3. A reduced-scaling density matrix-based method for the computation of the vibrational Hessian matrix at the self-consistent field level

    International Nuclear Information System (INIS)

    Kussmann, Jörg; Luenser, Arne; Beer, Matthias; Ochsenfeld, Christian

    2015-01-01

    An analytical method to calculate the molecular vibrational Hessian matrix at the self-consistent field level is presented. By analysis of the multipole expansions of the relevant derivatives of Coulomb-type two-electron integral contractions, we show that the effect of the perturbation on the electronic structure due to the displacement of nuclei decays at least as r⁻² instead of r⁻¹. The perturbation is asymptotically local, and the computation of the Hessian matrix can, in principle, be performed with O(N) complexity. Our implementation exhibits linear scaling in all time-determining steps, with some rapid but quadratic-complexity steps remaining. Sample calculations illustrate linear or near-linear scaling in the construction of the complete nuclear Hessian matrix for sparse systems. For more demanding systems, scaling is still considerably sub-quadratic to quadratic, depending on the density of the underlying electronic structure.

  4. Improving Students' Language Performance Through Consistent Use of E-Learning: An Empirical Study in Japanese, Korean, Hindi and Sanskrit

    Directory of Open Access Journals (Sweden)

    Sara LIBRENJAK

    2016-12-01

    Full Text Available This paper describes the underlying theories, methodology and results of a two-semester case study of the application of technology in teaching four Asian languages (Japanese, Korean, Hindi and Sanskrit) to Croatian students. We developed e-learning materials to follow the curriculum in Croatia and deployed them in Asian language classrooms. Students who agreed to participate in the study were tested before using the materials and after each semester, and their progress was surveyed. In the case of the Japanese students (N=53), we thoroughly monitored usage and compared the progress of students who diligently studied vocabulary and grammar with our materials on Memrise against those who neglected their studies; this was measured through their activity scores on Memrise. Progress was measured using standardized tests designed to resemble the Japanese Language Proficiency Test. We found that frequent users progressed on average by 20.3% after each semester, while infrequent users progressed by only 11.6%, linking this progress to stable and constant use of the e-materials.

  5. Internal consistency and content validity of a questionnaire aimed to assess the stages of behavioral lifestyle changes in Colombian schoolchildren: The Fuprecol study

    Directory of Open Access Journals (Sweden)

    Yasmira CARRILLO-BERNATE

    Full Text Available ABSTRACT Objective To assess the internal consistency and content validity of a questionnaire aimed at assessing the stages of behavioural lifestyle changes in a sample of school-aged children and adolescents aged 9 to 17 years. Methods This validation study involved 675 schoolchildren from three official schools in the city of Bogota, Colombia. A self-administered questionnaire called Behavioural Lifestyle Changes was designed to explore stages of change regarding physical activity/exercise, fruit and vegetable consumption, alcohol abuse, tobacco use, and drug abuse. Cronbach's α, the Kappa index and exploratory factor analysis were used to evaluate internal consistency and content validity, respectively. Results The study population consisted of 51.1% males, and the participants' average age was 12.7±2.4 years. Behavioural Lifestyle Changes scored 0.720 (range 0.691 to 0.730) on Cronbach's α, and intra-observer reproducibility was good (Kappa=0.71). Exploratory factor analysis determined two factors (factor 1: physical activity/exercise and fruit and vegetable consumption; factor 2: alcohol abuse, tobacco use and drug abuse), explaining 67.78% of the variance of the items and six interactions (χ²/gl=11649.833; p<0.001). Conclusion The Behavioural Lifestyle Changes questionnaire was seen to have suitable internal consistency and validity. This instrument can be recommended, mainly within the context of primary care, for studying the stages involved in the lifestyle behavioural change model in a school-based population.
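    The Cronbach α figure quoted above follows the standard formula α = k/(k-1) · (1 - Σ item variances / variance of total scores). A small sketch with invented item scores (not Fuprecol data):

```python
# Cronbach's alpha from an items-by-respondents score matrix (hypothetical
# 5-point answers, 6 respondents x 5 items; not data from the study).
from statistics import variance  # sample variance (ddof=1)

scores = [            # rows = respondents, columns = questionnaire items
    [3, 4, 3, 5, 4],
    [2, 2, 3, 2, 3],
    [4, 5, 4, 5, 5],
    [1, 2, 2, 1, 2],
    [3, 3, 4, 4, 3],
    [5, 4, 5, 5, 4],
]
k = len(scores[0])
items = list(zip(*scores))                       # transpose: one tuple per item
item_var = sum(variance(col) for col in items)   # sum of per-item variances
total_var = variance([sum(row) for row in scores])  # variance of respondents' totals
alpha = k / (k - 1) * (1 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```

    Values around 0.7 or higher, like the 0.720 reported above, are usually read as acceptable internal consistency.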

  6. Assessing the Consistency and Microbiological Effectiveness of Household Water Treatment Practices by Urban and Rural Populations Claiming to Treat Their Water at Home: A Case Study in Peru

    Science.gov (United States)

    Rosa, Ghislaine; Huaylinos, Maria L.; Gil, Ana; Lanata, Claudio; Clasen, Thomas

    2014-01-01

    Background Household water treatment (HWT) can improve drinking water quality and prevent disease if used correctly and consistently by vulnerable populations. Over 1.1 billion people report treating their water prior to drinking it. These estimates, however, are based on responses to household surveys that may exaggerate the consistency and microbiological performance of the practice—key factors for reducing pathogen exposure and achieving health benefits. The objective of this study was to examine how HWT practices are actually performed by households identified as HWT users, according to international monitoring standards. Methods and Findings We conducted a 6-month case study in urban (n = 117 households) and rural (n = 115 households) Peru, a country in which 82.8% of households report treating their water at home. We used direct observation, in-depth interviews, surveys, spot-checks, and water sampling to assess water treatment practices among households that claimed to treat their drinking water at home. While consistency of reported practices was high in both urban (94.8%) and rural (85.3%) settings, availability of treated water (based on self-report) at time of collection was low, with 67.1% and 23.0% of urban and rural households having treated water at all three sampling visits. Self-reported consumption of untreated water in the home was common among adults and children. Drinking water of self-reported users was significantly better than source water in the urban setting and negligibly, but significantly, better in the rural setting. However, only 46.3% and 31.6% of urban and rural households, respectively, had drinking water meeting the microbiological water quality criterion. Conclusions The lack of consistency and sub-optimal microbiological effectiveness raises questions about the potential of HWT to prevent waterborne diseases. PMID:25522371

  7. Outrunning major weight gain: a prospective study of 8,340 consistent runners during 7 years of follow-up

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Paul T.

    2006-01-06

    Background: Body weight increases with aging. Short-term, longitudinal exercise training studies suggest that increasing exercise produces acute weight loss, but it is not clear if the maintenance of long-term, vigorous exercise attenuates age-related weight gain in proportion to the exercise dose. Methods: Prospective study of 6,119 male and 2,221 female runners whose running distance changed less than 5 km/wk between their baseline and follow-up survey 7 years later. Results: On average, men who ran modest (0-24 km/wk), intermediate (24-48 km/wk) or prolonged distances (≥48 km/wk) all gained weight through age 64; however, those who ran ≥48 km/wk had one-half the average annual weight gain of those who ran <24 km/wk. Age-related weight gain, and its reduction by running, were both greater in younger than older men. In contrast, men's gain in waist circumference with age, and its reduction by running, were the same in older and younger men. Women increased their body weight and waist and hip circumferences over time, regardless of age, and these increases were also reduced in proportion to running distance. In both sexes, running did not attenuate weight gain uniformly, but rather disproportionately prevented more extreme increases. Conclusion: Men and women who remain vigorously active gain less weight as they age, and the reduction is in proportion to the exercise dose.

  8. Enhanced microwave absorption properties of MnO{sub 2} hollow microspheres consisted of MnO{sub 2} nanoribbons synthesized by a facile hydrothermal method

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yan; Han, Bingqian; Chen, Nan; Deng, Dongyang; Guan, Hongtao [Department of Materials Science and Engineering, Yunnan University, 650091, Kunming (China); Wang, Yude, E-mail: ydwang@ynu.edu.cn [Department of Materials Science and Engineering, Yunnan University, 650091, Kunming (China); Yunnan Province Key Lab of Micro-Nano Materials and Technology, Yunnan University, 650091, Kunming (China)

    2016-08-15

    MnO{sub 2} hollow microspheres consisting of nanoribbons were successfully fabricated via a facile hydrothermal method with SiO{sub 2} sphere templates. The crystal structure, morphology and microwave absorption properties in the X and Ku bands of the as-synthesized samples were characterized by powder X-ray diffraction (XRD), transmission electron microscopy (TEM) and a vector network analyzer. The results show that the three-dimensional (3D) hollow microspheres are assembled from ultrathin and narrow one-dimensional (1D) nanoribbons. A rational process for the formation of the hollow microspheres is proposed. The 3D MnO{sub 2} hollow microspheres possess improved dielectric and magnetic properties compared with the 1D nanoribbons prepared by the same procedure in the absence of SiO{sub 2} hard templates, which is closely related to their special nanostructure. The MnO{sub 2} microspheres also show much better microwave absorption properties in the X (8-12 GHz) and Ku (12-18 GHz) bands than the 1D MnO{sub 2} nanoribbons. A minimum reflection loss of -40 dB for the hollow microspheres is observed at 14.2 GHz, and the bandwidth with reflection loss below -10 dB is 3.5 GHz at a thickness of only 4 mm. A possible mechanism for the enhanced microwave absorption properties is also discussed. - Graphical abstract: MnO{sub 2} hollow microspheres composed of nanoribbons show excellent microwave absorption properties in the X and Ku bands. - Highlights: • MnO{sub 2} hollow microspheres consisting of MnO{sub 2} nanoribbons were successfully prepared. • MnO{sub 2} hollow microspheres possess good microwave absorption performance. • The excellent microwave absorption properties lie in the X and Ku microwave bands. • Electromagnetic impedance matching contributes greatly to the absorption properties.
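    Reflection-loss values such as the -40 dB minimum quoted above are conventionally computed from the metal-backed single-layer transmission-line model, RL(dB) = 20 log10 |(Z_in - 1)/(Z_in + 1)| with normalized input impedance Z_in = sqrt(mu_r/eps_r) tanh(j 2 pi f d sqrt(mu_r eps_r)/c). The sketch below uses that textbook formula with invented complex permittivity and permeability (the sign convention of the imaginary parts is also an assumption), so the printed RL value is illustrative only.

```python
# Standard metal-backed single-layer reflection-loss model used in absorber papers.
# The material parameters below are invented placeholders, not the measured MnO2 data.
import cmath
import math

c0 = 3e8  # speed of light, m/s

def reflection_loss(f_hz, d_m, eps_r, mu_r):
    """RL in dB for a single absorber layer of thickness d on a metal backing."""
    root = cmath.sqrt(mu_r * eps_r)
    z_in = cmath.sqrt(mu_r / eps_r) * cmath.tanh(
        1j * 2 * math.pi * f_hz * d_m / c0 * root)
    return 20 * math.log10(abs((z_in - 1) / (z_in + 1)))

# hypothetical complex parameters, evaluated at the abstract's operating point
# (14.2 GHz, 4 mm); with these made-up values the RL does not reach -40 dB
rl = reflection_loss(14.2e9, 4e-3, complex(6.0, -2.0), complex(1.1, -0.05))
print(f"RL = {rl:.1f} dB")
```

    More negative RL means less reflected power; the -10 dB bandwidth cited above corresponds to over 90% absorption.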

  9. Parental depressive and anxiety symptoms during pregnancy and attention problems in children : A cross-cohort consistency study

    NARCIS (Netherlands)

    van Batenburg-Eddes, T.; Brion, M.J.; Henrichs, J.; Jaddoe, V.W.; Hofman, A.; Verhulst, F.C.; Lawlor, D.A.; Davey Smith, G.; Tiemeier, H.W.

    2013-01-01

    Background:  Maternal depression and anxiety during pregnancy have been associated with offspring-attention deficit problems. Aim:  We explored possible intrauterine effects by comparing maternal and paternal symptoms during pregnancy, by investigating cross-cohort consistency, and by investigating

  10. Examining Word Factors and Child Factors for Acquisition of Conditional Sound-Spelling Consistencies: A Longitudinal Study

    Science.gov (United States)

    Kim, Young-Suk Grace; Petscher, Yaacov; Park, Younghee

    2016-01-01

    It has been suggested that children acquire spelling by picking up conditional sound-spelling consistencies. To examine this hypothesis, we investigated how variation in word characteristics (words that vary systematically in terms of phoneme-grapheme correspondences) and child factors (individual differences in the ability to extract…

  11. The Plumber’s Nightmare Phase in Diblock Copolymer/Homopolymer Blends. A Self-Consistent Field Theory Study.

    KAUST Repository

    Martinez-Veracoechea, Francisco J.; Escobedo, Fernando A.

    2009-01-01

    Using self-consistent field theory, the Plumber's Nightmare and the double diamond phases are predicted to be stable in a finite region of phase diagrams for blends of AB diblock copolymer (DBC) and A-component homopolymer. To the best of our

  12. Factor structure, internal consistency and reliability of the Posttraumatic Stress Disorder Checklist (PCL): an exploratory study

    Directory of Open Access Journals (Sweden)

    Eduardo de Paula Lima

    2012-01-01

    Full Text Available INTRODUCTION: Posttraumatic stress disorder (PTSD) is an anxiety disorder resulting from exposure to traumatic events. The Posttraumatic Stress Disorder Checklist (PCL) is a self-report measure widely used to evaluate the presence of PTSD. OBJECTIVE: To investigate the internal consistency, temporal reliability and factor validity of the Portuguese-language version of the PCL used in Brazil. METHODS: A total of 186 participants were recruited. The sample was heterogeneous with regard to occupation, sociodemographic data, mental health history, and exposure to traumatic events. Subjects answered the PCL on two occasions within 15 days (range: 5-15 days). RESULTS: Cronbach's alpha coefficients indicated high internal consistency for the total scale (0.91) and for the theoretical dimensions of the Diagnostic and Statistical Manual of Mental Disorders, 4th edition (DSM-IV) (0.83, 0.81, and 0.80). Temporal reliability (test-retest) was high and consistent for different cutoffs. Maximum likelihood exploratory factor analysis (EFA) was conducted with oblique rotation (Promax). The Kaiser-Meyer-Olkin (KMO) index (0.911) and Bartlett's test of sphericity (χ² = 1,381.34, p < 0.001) supported the factorability of the data.

  13. Analysis Method of Transfer Pricing Used by Multinational Companies Related to Tax Avoidance and Its Consistencies to the Arm's Length Principle

    OpenAIRE

    Sari, Nuraini; Hunar, Ririn Susanti

    2015-01-01

    The purpose of this study is to evaluate how Starbucks Corporation uses transfer pricing to minimize its tax bill. In addition, the study evaluates how Indonesia's domestic rules could address the Starbucks UK case if it happened in Indonesia. Three steps were conducted in this study. First, using information provided by UK Her Majesty's Revenue and Customs (HMRC) and other related articles, identify the methods used by Starbucks UK to minimize its tax bill. Second, find Organisat...

  14. Consistency of the Single Calculus Chain for climatological studies using long-term measurements from the Thessaloniki lidar station

    Science.gov (United States)

    Siomos, Nikolaos; Voudouri, Kalliopi A.; Filioglou, Maria; Giannakaki, Eleni; Amiridis, Vasilis; D'Amico, Giuseppe; Balis, Dimitris S.

    2018-04-01

    A long-term analysis of 15 years of data from a Raman lidar at Thessaloniki is presented here. All measurements have been processed with the latest version (version 4) of the EARLINET Single Calculus Chain (SCC) algorithm and are compared with the results from the current operational retrieval algorithm. In this paper we investigate the consistency between the EARLINET database and the SCC for the case of Thessaloniki, and we identify the issues to be considered when switching from current operations to the SCC.

  15. Methods in studying ECM degradation

    NARCIS (Netherlands)

    Everts, V.; Buttle, D.J.

    2008-01-01

    Almost all tissues in our body contain specific cells associated with the tissue itself, and an extracellular matrix (ECM) that consists of a variety of proteins, the bulk of which is formed by different types of collagens, glycoproteins and proteoglycans. The ECM plays a pivotal role in numerous

  16. Reporting consistently on CSR

    DEFF Research Database (Denmark)

    Thomsen, Christa; Nielsen, Anne Ellerup

    2006-01-01

    This chapter first outlines theory and literature on CSR and Stakeholder Relations focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results...... of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted...... in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...

  17. Electronic symmetry breaking in polyatomic molecules. Multiconfiguration self-consistent field study of the cyclopropenyl radical C3H3

    International Nuclear Information System (INIS)

    Hoffmann, M.R.; Laidig, W.D.; Kim, K.S.; Fox, D.J.; Schaefer, H.F. III

    1984-01-01

    For equilateral triangle geometries (point group D₃ₕ), the C₃H₃ radical has a degenerate ²E'' electronic ground state. Although the ²A₂ and ²B₁ components separate in energy for C₂ᵥ geometries, these two components should have identical energies for equilateral triangle structures. In fact, when approximate wave functions are used and the orbitals are not required to transform according to the D₃ₕ irreducible representations, an energy separation between the ²A₂ and ²B₁ components is observed. At the single configuration self-consistent field (SCF) level of theory this separation is 2.8 kcal with a double-zeta basis set and 2.4 kcal with double-zeta plus polarization. It has been demonstrated that this spurious separation may be greatly reduced using multiconfiguration self-consistent field (up to 7,474 variationally optimum configurations) and configuration interaction (up to 60,685 space- and spin-adapted configurations) techniques. Configurations differing by three and four electrons from the Hartree-Fock reference function are found necessary to reduce the ²A₂-²B₁ separation to below 0.5 kcal.

  18. Towards the harmonization between National Forest Inventory and Forest Condition Monitoring. Consistency of plot allocation and effect of tree selection methods on sample statistics in Italy.

    Science.gov (United States)

    Gasparini, Patrizia; Di Cosmo, Lucio; Cenni, Enrico; Pompei, Enrico; Ferretti, Marco

    2013-07-01

    In the frame of a process aiming at harmonizing the National Forest Inventory (NFI) and ICP Forests Level I Forest Condition Monitoring (FCM) in Italy, we investigated (a) the long-term consistency between FCM sample points (a subsample of the first NFI, 1985, NFI_1) and recent forest area estimates (after the second NFI, 2005, NFI_2) and (b) the effect of the tree selection method (tree-based or plot-based) on sample composition and defoliation statistics. The two investigations were carried out on 261 and 252 FCM sites, respectively. Results show that some individual forest categories (larch and stone pine, Norway spruce, other coniferous, beech, temperate oaks and cork oak forests) are over-represented and others (hornbeam and hophornbeam, other deciduous broadleaved and holm oak forests) are under-represented in the FCM sample. This is probably due to a change in forest cover, which increased by 1,559,200 ha from 1985 to 2005. In the case of a shift from a tree-based to a plot-based selection method, 3,130 (46.7%) of the original 6,703 sample trees will be abandoned, and 1,473 new trees will be selected. The balance between exclusion of former sample trees and inclusion of new ones will be particularly unfavourable for conifers (with only 16.4% of excluded trees replaced by new ones) and less so for deciduous broadleaves (with 63.5% of excluded trees replaced). The total number of tree species surveyed will not be impacted, while the number of trees per species will, and the resulting (plot-based) sample composition will have a much larger frequency of deciduous broadleaved trees. The newly selected trees have, in general, smaller diameter at breast height (DBH) and defoliation scores. Given the larger rate of turnover, the deciduous broadleaved part of the sample will be more impacted. Our results suggest that both a revision of the FCM network to account for forest area change and a plot-based approach to permit statistical inference and avoid bias in the tree sample are needed.
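    The turnover figures above are straightforward proportions; a toy check of the bookkeeping, using the counts quoted in the abstract:

```python
# Turnover bookkeeping from the abstract: of 6,703 tree-based sample trees,
# 3,130 would be dropped under plot-based selection and 1,473 new trees added.
original = 6703
excluded = 3130
added = 1473

share_excluded = excluded / original         # matches the abstract's 46.7%
retained = original - excluded
new_sample = retained + added                # size of the plot-based sample
replacement_rate = added / excluded          # overall share of dropped trees replaced
print(f"excluded: {share_excluded:.1%}, new sample size: {new_sample}, "
      f"replaced overall: {replacement_rate:.1%}")
```

    The overall replacement rate sits between the 16.4% quoted for conifers and the 63.5% quoted for deciduous broadleaves.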

  19. Self-consistent mean field theory studies of the thermodynamics and quantum spin dynamics of magnetic Skyrmions.

    Science.gov (United States)

    Wieser, R

    2017-05-04

    A self-consistent mean field theory is introduced and used to investigate the thermodynamics and spin dynamics of an S = 1 quantum spin system with a magnetic Skyrmion. The temperature dependence of the Skyrmion profile, as well as the phase diagram, is calculated. In addition, the spin dynamics of a magnetic Skyrmion are described by solving the time-dependent Schrödinger equation with an additional damping term. The Skyrmion annihilation process driven by an electric field is used to compare the trajectories of the quantum mechanical simulation with a semi-classical description of the spin expectation values using a differential equation similar to the classical Landau-Lifshitz-Gilbert equation.

  20. A self-consistent field study of diblock copolymer/charged particle system morphologies for nanofiltration membranes

    International Nuclear Information System (INIS)

    Zhang, Bo; Ye, Xianggui; Edwards, Brian J.

    2013-01-01

    A combination of self-consistent field theory and density functional theory was used to examine the stable, 3-dimensional equilibrium morphologies formed by diblock copolymers with a tethered nanoparticle attached either between the two blocks or at the end of one of the blocks. Both neutral and interacting particles were examined, with and without favorable/unfavorable energetic potentials between the particles and the block segments. The phase diagrams of the various systems were constructed, allowing the identification of three types of ordered mesophases composed of lamellae, hexagonally packed cylinders, and spheroids. In particular, we examined the conditions under which the mesophases could be generated wherein the tethered particles were primarily located within the interface between the two blocks of the copolymer. Key factors influencing these properties were determined to be the particle position along the diblock chain, the interaction potentials of the blocks and particles, the block copolymer composition, and molecular weight of the copolymer

  1. Theoretical study of ion bunching for pellet fusion in self-consistent time dependent space charge fields

    International Nuclear Information System (INIS)

    Lu, P.C.

    1977-01-01

    The use of intense ion beams as a heating source for the fusion reaction in pellets of D-T appears to have several potential advantages over the use of electron beams. If ion bunching can be accomplished, then existing technology can be used to achieve the required power, energy and time scales for pellet fusion. A scheme to be considered is that of a pre-formed nonuniform plasma adjacent to a partially transparent anode through which a space charge limited electron beam is injected from the terminals of a convergent spherical geometry with a finite (or zero) rise-time. At the instant of beam injection, the virtual cathode is formed. Due to the space charge fields set up by the beam, the plasma ions are accelerated towards the region beyond the virtual cathode. A self-consistent transient analysis of the interactions between the electron beam and the background plasma is performed. The numerical calculations show that by specifying the target plasma for perfect bunching the ions can be made to bunch nearly perfectly. Also, by considering the depletion of initial plasma and accounting for the fact that the virtual anode-virtual cathode gap region is moving opposite to the direction of the ions, one can considerably enhance the instantaneous power delivered to the target over that which is injected at the terminals of the device, even with a finite rise-time on the current pulse

   2. The consistency effect depends on markedness in less successful but not successful problem solvers: An eye movement study in primary school children

    NARCIS (Netherlands)

    van der Schoot, M.; Bakker Arkema, A.H.; Horsley, T.M.; van Lieshout, E.C.D.M.

    2009-01-01

    This study examined the effects of consistency (relational term consistent vs. inconsistent with the required arithmetic operation) and markedness (relational term unmarked ['more than'] vs. marked ['less than']) on word problem solving in 10- to 12-year-old children differing in problem-solving skill. The

  3. Is Consumer Response to Plain/Standardised Tobacco Packaging Consistent with Framework Convention on Tobacco Control Guidelines? A Systematic Review of Quantitative Studies

    Science.gov (United States)

    Stead, Martine; Moodie, Crawford; Angus, Kathryn; Bauld, Linda; McNeill, Ann; Thomas, James; Hastings, Gerard; Hinds, Kate; O'Mara-Eves, Alison; Kwan, Irene; Purves, Richard I.; Bryce, Stuart L.

    2013-01-01

    Background and Objectives Standardised or ‘plain’ tobacco packaging was introduced in Australia in December 2012 and is currently being considered in other countries. The primary objective of this systematic review was to locate, assess and synthesise published and grey literature relating to the potential impacts of standardised tobacco packaging as proposed by the guidelines for the international Framework Convention on Tobacco Control: reduced appeal, increased salience and effectiveness of health warnings, and more accurate perceptions of product strength and harm. Methods Electronic databases were searched and researchers in the field were contacted to identify studies. Eligible studies were published or unpublished primary research of any design, issued since 1980 and concerning tobacco packaging. Twenty-five quantitative studies reported relevant outcomes and met the inclusion criteria. A narrative synthesis was conducted. Results Studies that explored the impact of package design on appeal consistently found that standardised packaging reduced the appeal of cigarettes and smoking, and was associated with perceived lower quality, poorer taste and less desirable smoker identities. Although findings were mixed, standardised packs tended to increase the salience and effectiveness of health warnings in terms of recall, attention, believability and seriousness, with effects being mediated by the warning size, type and position on pack. Pack colour was found to influence perceptions of product harm and strength, with darker coloured standardised packs generally perceived as containing stronger tasting and more harmful cigarettes than fully branded packs; lighter coloured standardised packs suggested weaker and less harmful cigarettes. Findings were largely consistent, irrespective of location and sample. Conclusions The evidence strongly suggests that standardised packaging will reduce the appeal of packaging and of smoking in general; that it will go some way

  4. Quality control activities in support of the plutonium workers study. Assessment of coding consistency for data collected at Rocky Flats

    International Nuclear Information System (INIS)

    Reyes, M.; Wilkinson, G.S.; Acquavella, J.F.

    1984-03-01

    The Plutonium Workers Study is a multifaceted epidemiologic investigation of workers at six Department of Energy (DOE) facilities: Los Alamos, Rocky Flats, Mound, Savannah River, Oak Ridge, and Hanford. Information from a variety of record sources has been collected and abstracted for these studies. This report considers the accuracy of the demographic, occupational, and radiation exposure data collected for studies at Rocky Flats. The majority of the information was accurately abstracted, and analyses based on these data may be conducted

  5. Theoretical modeling of large molecular systems. Advances in the local self consistent field method for mixed quantum mechanics/molecular mechanics calculations.

    Science.gov (United States)

    Monari, Antonio; Rivail, Jean-Louis; Assfeld, Xavier

    2013-02-19

    Molecular mechanics methods can efficiently compute the macroscopic properties of a large molecular system but cannot represent the electronic changes that occur during a chemical reaction or an electronic transition. Quantum mechanical methods can accurately simulate these processes, but they require considerably greater computational resources. Because electronic changes typically occur in a limited part of the system, such as the solute in a molecular solution or the substrate within the active site of enzymatic reactions, researchers can limit the quantum computation to this part of the system. Researchers take into account the influence of the surroundings by embedding this quantum computation into a calculation of the whole system described at the molecular mechanical level, a strategy known as the mixed quantum mechanics/molecular mechanics (QM/MM) approach. The accuracy of this embedding varies according to the types of interactions included, whether they are purely mechanical or classically electrostatic. This embedding can also introduce the induced polarization of the surroundings. The difficulty in QM/MM calculations comes from the splitting of the system into two parts, which requires severing the chemical bonds that link the quantum mechanical subsystem to the classical subsystem. Typically, researchers replace the quantoclassical atoms, those at the boundary between the subsystems, with a monovalent link atom. For example, researchers might add a hydrogen atom when a C-C bond is cut. This Account describes another approach, the Local Self Consistent Field (LSCF), which was developed in our laboratory. LSCF links the quantum mechanical portion of the molecule to the classical portion using a strictly localized bond orbital extracted from a small model molecule for each bond. In this scenario, the quantoclassical atom has an apparent nuclear charge of +1. 
To achieve correct bond lengths and force constants, we must take into account the inner shell of
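The QM/MM partition described in this Account can be sketched as a simple additive energy decomposition. This is an illustration of the general QM/MM scheme, not the LSCF implementation, and the energy values are hypothetical placeholders:

```python
# Illustrative additive QM/MM energy decomposition: total energy is the sum
# of the quantum subsystem, the classical subsystem, and a coupling term for
# their interaction. Values are hypothetical placeholders (hartree).
def qmmm_energy(e_qm, e_mm, e_coupling):
    """E(total) = E(QM subsystem) + E(MM subsystem) + E(QM-MM coupling)."""
    return e_qm + e_mm + e_coupling

# Toy example: a small quantum solute embedded in a classical solvent bath.
e_total = qmmm_energy(e_qm=-76.42, e_mm=-0.85, e_coupling=-0.03)
print(round(e_total, 2))  # -77.3
```

The accuracy differences discussed above (mechanical vs. electrostatic embedding, induced polarization) amount to different definitions of the coupling term.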

  6. Family adaptability and cohesion in families consisting of Asian immigrant women living in South Korea: A 3-year longitudinal study.

    Science.gov (United States)

    Kim, Yeon-Pyo; Kim, Sun; Joh, Ju-Youn

    2015-06-01

    South Korea's low birth rate, aging society, and female migration to urban areas due to industrialization have caused an accelerated inflow of Asian female immigrants into Korea to marry Korean men, especially in rural areas. This study was performed to determine how family function of multicultural families changes over time and what factors affect the changes in family function of multicultural families. The study subjects were 62 Asian immigrant women married to South Korean men living in South Korea. In a 1st wave study in August 2008, the socioeconomic factors and Family Adaptability and Cohesion Scale III (FACES III) scores were measured. A 3-year follow-up study was then conducted in August 2011, and the results were compared with the 1st wave study results. The mean family adaptability score was 24.6 in the 1st wave study and 26.1 at the 3-year follow-up. The average family cohesion score was 31.0 in the 1st wave study and 36.7 at the 3-year follow-up. There was a statistically significant increase in family cohesion after 3 years (P < 0.05). Family adaptability did not change over time; family cohesion, conversely, increased. The age difference between husband and wife and the subjective SES had a positive association with the changes in family cohesion. Copyright © 2013 Wiley Publishing Asia Pty Ltd.
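The wave-1 versus follow-up comparison behind these FACES III results is a paired before/after test. A minimal sketch of the paired t-statistic, using made-up toy scores rather than the study's data:

```python
import math

# Paired t-statistic for a before/after comparison such as cohesion scores
# at wave 1 vs the 3-year follow-up. The score lists are toy data.
def paired_t(before, after):
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

t = paired_t(before=[30, 32, 29, 33, 31], after=[36, 38, 35, 39, 36])
print(round(t, 1))  # 29.0
```

A large positive t indicates a consistent increase across subjects, the pattern the study reports for cohesion.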

  7. The Value of Mixed Methods Research: A Mixed Methods Study

    Science.gov (United States)

    McKim, Courtney A.

    2017-01-01

    The purpose of this explanatory mixed methods study was to examine the perceived value of mixed methods research for graduate students. The quantitative phase was an experiment examining the effect of a passage's methodology on students' perceived value. Results indicated students scored the mixed methods passage as more valuable than those who…

  8. Modal Bin Hybrid Model: A surface area consistent, triple-moment sectional method for use in process-oriented modeling of atmospheric aerosols

    Science.gov (United States)

    Kajino, Mizuo; Easter, Richard C.; Ghan, Steven J.

    2013-09-01

    A triple-moment sectional (TMS) aerosol dynamics model, the Modal Bin Hybrid Model (MBHM), has been developed. In addition to number and mass (volume), surface area is predicted (and preserved), which is important for aerosol processes and properties such as gas-to-particle mass transfer, heterogeneous reaction, and light extinction cross section. The performance of MBHM was evaluated against double-moment sectional (DMS) models with coarse (BIN4) to very fine (BIN256) size resolutions for simulating evolution of particles under simultaneously occurring nucleation, condensation, and coagulation processes (BINx resolution uses x sections to cover the 1 nm to 1 µm size range). Because MBHM gives a physically consistent form of the intrasectional distributions, errors and biases of MBHM at BIN4-8 resolution were almost equivalent to those of DMS at BIN16-32 resolution for various important variables such as the moments Mk (k: 0, 2, 3), dMk/dt, and the number and volume of particles larger than a certain diameter. Another important feature of MBHM is that only a single bin is adequate to simulate full aerosol dynamics for particles whose size distribution can be approximated by a single lognormal mode. This flexibility is useful for process-oriented (multicategory and/or mixing state) modeling: Primary aerosols whose size parameters would not differ substantially in time and space can be expressed by a single or a small number of modes, whereas secondary aerosols whose size changes drastically from 1 to several hundred nanometers can be expressed by a number of modes. Added dimensions can be applied to MBHM to represent mixing state or photochemical age for aerosol mixing state studies.
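The three moments carried per section (number, surface area, volume) follow the standard moment formula for a lognormal mode. A short sketch with illustrative mode parameters (not values from the paper):

```python
import math

# k-th diameter moment of a lognormal number distribution:
#   M_k = N * Dg**k * exp(k**2 * ln(sigma_g)**2 / 2)
# A triple-moment scheme tracks number (k=0), a surface-area moment (k=2),
# and a volume moment (k=3). Mode parameters below are illustrative.
def lognormal_moment(k, n_tot, d_g, sigma_g):
    return n_tot * d_g ** k * math.exp(0.5 * k ** 2 * math.log(sigma_g) ** 2)

n_tot, d_g, sigma_g = 1000.0, 0.1, 1.6   # e.g. cm^-3, micrometres, unitless
m0 = lognormal_moment(0, n_tot, d_g, sigma_g)  # number concentration
m2 = lognormal_moment(2, n_tot, d_g, sigma_g)  # surface area moment (/ pi)
m3 = lognormal_moment(3, n_tot, d_g, sigma_g)  # volume moment (* 6 / pi)
```

Preserving all three moments, rather than just number and volume, is what keeps surface-area-dependent processes such as heterogeneous reaction consistent.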

  9. Prevalence of high frequency hearing loss consistent with noise exposure among people working with sound systems and general population in Brazil: A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Trevisani Virgínia FM

    2008-05-01

    Background Music is ever present in our daily lives, establishing a link between humans and the arts through the senses and pleasure. Sound technicians are the link between musicians and audiences or consumers. Recently, general concern has arisen regarding occurrences of hearing loss induced by noise from excessively amplified sound-producing activities within leisure and professional environments. Sound technicians' activities expose them to the risk of hearing loss, and consequently put at risk their quality of life, the quality of the musical product and consumers' hearing. The aim of this study was to measure the prevalence of high frequency hearing loss consistent with noise exposure among sound technicians in Brazil and compare this with a control group without occupational noise exposure. Methods This was a cross-sectional study comparing 177 participants in two groups: 82 sound technicians and 95 controls (non-sound technicians). A questionnaire on music listening habits and associated complaints was applied, and data were gathered regarding the professionals' numbers of working hours per day and both groups' hearing complaints and presence of tinnitus. The participants' ear canals were visually inspected using an otoscope. Hearing assessments were performed (tonal and speech audiometry) using a portable digital AD 229 E audiometer funded by FAPESP. Results There was no statistically significant difference between the sound technicians and controls regarding age and gender. Thus, the study sample was homogenous and would be unlikely to lead to bias in the results. A statistically significant difference in hearing loss was observed between the groups: 50% among the sound technicians and 10.5% among the controls. The difference could be attributed to high sound levels.
Conclusion The sound technicians presented a higher prevalence of high frequency hearing loss consistent with noise exposure than did the general population, although
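The reported prevalence gap (50% vs 10.5%) is the kind of difference a two-proportion z-test quantifies. The counts below are reconstructed from the rounded percentages, so this is an illustration only, not the paper's analysis:

```python
import math

# Two-proportion z-test for a prevalence difference between two groups.
# Counts reconstructed from the abstract's rounded percentages:
# 41/82 technicians (50.0%) vs 10/95 controls (~10.5%).
def two_proportion_z(x1, n1, x2, n2):
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(41, 82, 10, 95)
print(z > 1.96)  # True: well beyond the 5% significance threshold
```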

  10. Self-consistent relativistic QRPA studies of soft modes and spin-isospin resonances in unstable nuclei

    International Nuclear Information System (INIS)

    Paar, N.; Niksic, T.; Marketin, T.; Vretenar, D.; Ring, P.

    2005-01-01

    The excitation phenomena in unstable nuclei are investigated in the framework of the relativistic quasiparticle random-phase approximation (RQRPA) in the relativistic Hartree-Bogolyubov model (RHB) which is extended to include effective interactions with explicit density-dependent meson-nucleon couplings. The properties of the pygmy dipole resonance (PDR) are examined in 132 Sn and within isotopic chains, showing that already at moderate proton-neutron asymmetry the PDR peak energy is located above the neutron emission threshold. A method is suggested for determining the size of the neutron skin within an isotopic chain, based on the measurement of the excitation energies of the Gamow-Teller resonance relative to the isobaric analog state. In addition, for the first time the relativistic RHB+RQRPA model, with tensor ω meson-nucleon couplings, is employed in calculations of β-decay half-lives of nuclei of the relevance for the r-process. (orig.)

  11. Self-consistent relativistic QRPA studies of soft modes and spin-isospin resonances in unstable nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Paar, N. [Technische Universitaet Darmstadt, Institut fuer Kernphysik, Darmstadt (Germany); University of Zagreb, Physics Department, Faculty of Science (Croatia); University of Washington, Institute for Nuclear Theory, Seattle (United States); Niksic, T. [University of Zagreb, Physics Department, Faculty of Science (Croatia); University of Washington, Institute for Nuclear Theory, Seattle (United States); Marketin, T.; Vretenar, D. [University of Zagreb, Physics Department, Faculty of Science (Croatia); Ring, P. [Physik-Department der Technischen Universitaet Muenchen, Garching (Germany)

    2005-09-01

    The excitation phenomena in unstable nuclei are investigated in the framework of the relativistic quasiparticle random-phase approximation (RQRPA) in the relativistic Hartree-Bogolyubov model (RHB) which is extended to include effective interactions with explicit density-dependent meson-nucleon couplings. The properties of the pygmy dipole resonance (PDR) are examined in {sup 132}Sn and within isotopic chains, showing that already at moderate proton-neutron asymmetry the PDR peak energy is located above the neutron emission threshold. A method is suggested for determining the size of the neutron skin within an isotopic chain, based on the measurement of the excitation energies of the Gamow-Teller resonance relative to the isobaric analog state. In addition, for the first time the relativistic RHB+RQRPA model, with tensor {omega} meson-nucleon couplings, is employed in calculations of {beta}-decay half-lives of nuclei of the relevance for the r-process. (orig.)

  12. Studies on Erythropoietin Bioassay Method

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Kyoung Sam; Ro, Heung Kyu; Lee, Mun Ho [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1975-09-15

    It is the purpose of this paper to design the most suitable method of erythropoietin bioassay for Korea. Bioassays utilizing polycythemic mice are currently in general use for the indirect determination of erythropoietin. Assay animals are usually prepared either by transfusion or by exposure to reduced oxygen tension in a specially constructed chamber. We prepared polycythemic mice using a specially constructed hypobaric chamber. We observed the weights and hematocrits of the mice in the hypobaric chamber, then the hematocrits and 72-hour {sup 59}Fe red cell uptake ratios of the hypoxia-induced polycythemic mice after removal from the hypobaric chamber. We designed the erythropoietin bioassay method according to the results of these experiments. We then measured the 72-hour {sup 59}Fe red cell uptake ratio of polycythemic mice given normal saline, normal plasma and anemic plasma according to the method we designed. The results are as follows: 1) The hematocrits of the mice in the hypobaric chamber increased to 74% in 11 days. It is preferable to maintain the chamber pressure at 400 mmHg for the first 4 days and then at 300 mmHg for the last 10 days, to reduce the death rate and the time spent in the hypobaric chamber. 2) After removal from the hypobaric chamber, the 72-hour {sup 59}Fe red cell uptake ratio decreased rapidly and remained at its lowest level from the fourth day to the tenth day. 3) We designed the erythropoietin bioassay method according to the results of the above experiments and the half-life of erythropoietin. 4) The Korean-produced {sup 59}Fe is a mixture of {sup 55}Fe and {sup 59}Fe, and the {sup 59}Fe red cell uptake ratio in normal mice was far lower with the Korean product than with pure {sup 59}Fe of foreign production, so it is desirable to use pure {sup 59}Fe in this method of erythropoietin bioassay. 5) Considering the cost, the technique, the time required and the sensitivity, this is the most suitable method of erythropoietin bioassay in Korea

  13. Early identification of technical issues: a sensitivity study to check LISTRA1A internal consistency and structure

    International Nuclear Information System (INIS)

    Harvey, T.F.; Maninger, R.C.; Rabsatt, S.

    1979-01-01

    This report describes a sensitivity study using LISTRA1A, a model for use in the development of a long-range, time-dependent plan for licensing nuclear waste repositories. The objectives of the model are: (1) to provide information concerning the impact of various licensing strategies on the ability to dispose of nuclear waste effectively; and (2) to provide long-range budget forecasts for differing strategies of the Nuclear Regulatory Commission (NRC) and the Department of Energy (DOE). The model is designed to analyze the interaction between NRC regulatory policy and DOE technical programs. A sensitivity study is reported for a single parameter in a hypothetical review process

  14. Self-consistent meta-generalized gradient approximation study of adsorption of aromatic molecules on noble metal surfaces

    DEFF Research Database (Denmark)

    Ferrighi, Lara; Madsen, Georg Kent Hellerup; Hammer, Bjørk

    2011-01-01

    aromatic molecules considered. The adsorption of pentacene is studied on Au, Ag, and Cu surfaces. In agreement with experiment, the adsorption energies are found to increase with decreasing nobleness, but the dependency is underestimated. We point out how the kinetic energy density can discriminate between...

  15. Study of Water-Oil Emulsion Breaking by Stabilized Solution Consisting of Anionic Surface Acting Agent - Soda Ash - Polymer (ASP)

    Science.gov (United States)

    Kulichkov, S. V.; Avtomonov, E. G.; Andreeva, L. V.; Solomennik, S. F.; Nikitina, A. V.

    2018-01-01

    The paper presents laboratory research on the breaking of natural water-oil emulsions by non-stabilized ASP, by stabilized ASP, and by mixtures of stabilized and non-stabilized ASP in different proportions, and on the production of refinery water of the required quality using IronGuard 2495 as flocculant. The oil-in-water emulsion is stable, and classic methods (gravity sedimentation, filtration, centrifugation) are not suitable for residual water treatment. The microemulsion formed after ASP application has low interfacial tension and high pH, which promotes transfer of the oil phase into the water phase, forming an oil-in-water emulsion. Alkaline conditions have an adverse effect on the demulsifying ability of agents, on flocculation and on interfacial tension. To break the water-oil emulsion at the EBU, water or water-oil emulsion from wells that were not ASP-treated should be delivered before the interchanger in a ratio of 1:9. Residual water after the EBU must be prepared in water tanks by dilution in a large volume.

  16. Theoretical study of hydrogen storage in a truncated triangular pyramid molecule consisting of pyridine and benzene rings bridged by vinylene groups

    Science.gov (United States)

    Ishikawa, Shigeru; Nemoto, Tetsushi; Yamabe, Tokio

    2018-06-01

    Hydrogen storage in a truncated triangular pyramid molecule C33H21N3, which consists of three pyridine rings and one benzene ring bridged by six vinylene groups, is studied by quantum chemical methods. The molecule is derived by substituting three benzene rings in a truncated tetrahedron hydrocarbon C36H24 with pyridine rings. The optimized molecular structure under C3v symmetry shows no imaginary vibrational modes at the B3LYP/cc-pVTZ level of theory. The hydrogen storage process is investigated based on the MP2/cc-pVTZ method. Like the structure before substitution, the C33H21N3 molecule has a cavity that stores a hydrogen molecule with a binding energy of -140 meV. The Langmuir isotherm shows that this cavity can store hydrogen at higher temperatures and lower pressures than usual physisorption materials. The C33H21N3 molecule has a kinetic advantage over the C36H24 molecule because the former molecule has a lower barrier (+560 meV) for the hydrogen molecule entering the cavity compared with the latter molecule (+730 meV) owing to the lack of hydrogen atoms narrowing the opening.

  17. Report of special study meeting on 'Atomic energy research aiming at consistent nuclear fuel cycle', fiscal year 1992

    International Nuclear Information System (INIS)

    Nishina, Kojiro; Nishihara, Hideaki; Mishima, Kaichiro

    1994-12-01

    This meeting was held on March 4, 1993. Thirty years have elapsed since the first power generation with the JPDR and the initial criticality of the KUR, and twenty years since the initial criticality of the KUCA. University researchers have contributed greatly to atomic energy research and education, but the prospect of leading the world in this field hereafter is very uncertain. This study meeting was held to seek a way to make a proper contribution. In the meeting, lectures were given on Japanese policy on the nuclear fuel cycle, the present state of upstream and downstream research in Japan, the experimental plan in NUCEF, the present state of research on TRU decay heat data and TRU nuclear data, the present state of experimental research at the KUCA and the FCA, the present state of research on heat removal from high conversion LWRs and the KUR, the present state of research on radioactive waste treatment, and the present state of TRU chemical research. A record of the meeting is appended. (K.I.)

  18. Consistent codling moth population decline by two years of mating disruption in apple: a Flemish case study.

    Science.gov (United States)

    Bangels, E; Beliën, T

    2012-01-01

    Codling moth (Cydia pomonella) is one of the most important pests in apple and pear. In 2010 mating disruption became a key pest management tactic in Flemish pip fruit orchards, largely due to a government subsidy and demonstration projects aiming to make the pheromone-treated area as large as possible. As a consequence, the mating disruption strategy was applied on approximately 7,500 ha, or half of the pip fruit area, in 2010 and 2011. The sudden large-scale implementation of this technique changed the codling moth management landscape. Here we present a case study of a commercially managed orchard that suffered from high codling moth pressures for many years, as did the surrounding area. The RAK3 mating disruption system was introduced at this location in 2010, and was continued in 2011. Systematic detailed codling moth flight data for this location are available for many years. In addition, comprehensive data on damage levels of chemically untreated windows spread all over the test orchard in a randomized block design were obtained in successive years, enabling us to thoroughly evaluate the effect of the changed codling moth management strategy. Data from 2011 included damage levels in chemically treated windows when the entire orchard was treated once at the flight peak of Cydia pomonella. In 2009, before the introduction of mating disruption, a mean of 8.25 +/- 5.54% of the fruits were infested at harvest when assessed in completely untreated windows. After two years of mating disruption, supported by a full chemical programme in 2010 (except for the untreated assessment windows) and only one application at the flight peak in 2011, damage was reduced to less than 0.03% at harvest. This is a valuable case study demonstrating the benefits of the mating disruption approach.

  19. Protocol for a multicentre, multistage, prospective study in China using system-based approaches for consistent improvement in surgical safety.

    Science.gov (United States)

    Yu, Xiaochu; Jiang, Jingmei; Liu, Changwei; Shen, Keng; Wang, Zixing; Han, Wei; Liu, Xingrong; Lin, Guole; Zhang, Ye; Zhang, Ying; Ma, Yufen; Bo, Haixin; Zhao, Yupei

    2017-06-15

    Surgical safety has emerged as a crucial global health issue in the past two decades. Although several safety-enhancing tools are available, the pace of large-scale improvement remains slow, especially in developing countries such as China. The present project (Modern Surgery and Anesthesia Safety Management System Construction and Promotion) aims to develop and validate system-based integrated approaches for reducing perioperative deaths and complications using a multicentre, multistage design. The project involves collection of clinical and outcome information for 120,000 surgical inpatients at four regionally representative academic/teaching general hospitals in China during three sequential stages: preparation and development, effectiveness validation and improvement of implementation for promotion. These big data will provide the evidence base for the formulation, validation and improvement processes of a system-based stratified safety intervention package covering the entire surgical pathway. Attention will be directed to managing inherent patient risks and regulating medical safety behaviour. Information technology will facilitate data collection and intervention implementation, provide supervision mechanisms and guarantee transfer of key patient safety messages between departments and personnel. Changes in rates of deaths, surgical complications during hospitalisation, length of stay, system adoption and implementation rates will be analysed to evaluate effectiveness and efficiency. This study was approved by the institutional review boards of Peking Union Medical College Hospital, First Hospital of China Medical University, Qinghai Provincial People's Hospital, Xiangya Hospital Central South University and the Institute of Basic Medical Sciences, Chinese Academy of Medical Sciences. Study findings will be disseminated via peer-reviewed journals, conference presentations and patent papers. © Article author(s) (or their employer(s) unless otherwise

  20. Adaptation of rat jaw muscle fibers in postnatal development with a different food consistency: an immunohistochemical and electromyographic study.

    Science.gov (United States)

    Kawai, Nobuhiko; Sano, Ryota; Korfage, Joannes A M; Nakamura, Saika; Kinouchi, Nao; Kawakami, Emi; Tanne, Kazuo; Langenbach, Geerling E J; Tanaka, Eiji

    2010-06-01

    The development of the craniofacial system occurs, among other reasons, as a response to functional needs. In particular, the deficiency of the proper masticatory stimulus affects the growth. The purpose of this study was to relate alterations of muscle activity during postnatal development to adaptational changes in the muscle fibers. Fourteen 21-day-old Wistar strain male rats were randomly divided into two groups and fed on either a solid (hard-diet group) or a powder (soft-diet group) diet for 63 days. A radio-telemetric device was implanted to record muscle activity continuously from the superficial masseter, anterior belly of digastric and anterior temporalis muscles. The degree of daily muscle use was quantified by the total duration of muscle activity per day (duty time), the total burst number and their average length exceeding specified levels of the peak activity (5, 20 and 50%). The fiber type composition of the muscles was examined by the myosin heavy chain content of fibers by means of immunohistochemical staining and their cross-sectional area was measured. All muscle fibers were identified as slow type I and fast type IIA, IIX or IIB (respectively, with increasing twitch contraction speed and fatigability). At lower activity levels (exceeding 5% of the peak activity), the duty time of the anterior belly of the digastric muscle was significantly higher in the soft-diet group than in the hard-diet group (P < 0.05). A slow-to-fast transition of muscle fiber was shown in only the superficial masseter muscle. Therefore, the reduction in the amount of powerful muscle contractions could be important for the slow-to-fast transition of the myosin heavy chain isoform in muscle fibers.
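The duty-time measure defined in this abstract (total duration of activity exceeding a given percentage of peak activity) can be sketched as follows; the activity trace is toy data, not a telemetry recording:

```python
# Duty time as the fraction of recorded samples in which muscle activity
# exceeds a given percentage of its peak. The trace below is toy data.
def duty_time(signal, threshold_pct):
    peak = max(signal)
    threshold = peak * threshold_pct / 100.0
    return sum(1 for v in signal if v > threshold) / len(signal)

activity = [0.1, 0.5, 2.0, 0.3, 1.5, 0.05, 0.8, 2.0]
print(duty_time(activity, 20))  # 0.625: five of eight samples exceed 20% of peak
print(duty_time(activity, 50))  # 0.375: three samples exceed 50% of peak
```

Evaluating the same trace at several thresholds (5, 20 and 50% of peak, as in the study) distinguishes low-level tonic activity from powerful bursts.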

  1. Bitcoin Meets Strong Consistency

    OpenAIRE

    Decker, Christian; Seidel, Jochen; Wattenhofer, Roger

    2014-01-01

    The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...

  2. Does the low hole transport mass in and Si nanowires lead to mobility enhancements at high field and stress: A self-consistent tight-binding study

    Science.gov (United States)

    Kotlyar, R.; Linton, T. D.; Rios, R.; Giles, M. D.; Cea, S. M.; Kuhn, K. J.; Povolotskyi, Michael; Kubis, Tillmann; Klimeck, Gerhard

    2012-06-01

    The hole surface roughness and phonon limited mobility in the silicon , , and square nanowires under the technologically important conditions of applied gate bias and stress are studied with the self-consistent Poisson-sp3d5s*-SO tight-binding bandstructure method. Under an applied gate field, the hole carriers in a wire undergo a volume to surface inversion transition diminishing the positive effects of the high and valence band nonparabolicities, which are known to lead to the large gains of the phonon limited mobility at a zero field in narrow wires. Nonetheless, the hole mobility in the unstressed wires down to the 5 nm size remains competitive or shows an enhancement at high gate field over the large wire limit. Down to the studied 3 nm sizes, the hole mobility is degraded by strong surface roughness scattering in and wires. The channels are shown to experience less surface scattering degradation. The physics of the surface roughness scattering dependence on wafer and channel orientations in a wire is discussed. The calculated uniaxial compressive channel stress gains of the hole mobility are found to reduce in the narrow wires and at the high field. This exacerbates the stressed mobility degradation with size. Nonetheless, stress gains of a factor of 2 are obtained for wires down to 3 nm size at a 5×10¹² cm⁻² hole inversion density per gate area.

  3. Consistent classical supergravity theories

    International Nuclear Information System (INIS)

    Muller, M.

    1989-01-01

    This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included

  4. A self-consistent, multivariate method for the determination of gas-phase rate coefficients, applied to reactions of atmospheric VOCs and the hydroxyl radical

    Science.gov (United States)

    Shaw, Jacob T.; Lidster, Richard T.; Cryer, Danny R.; Ramirez, Noelia; Whiting, Fiona C.; Boustead, Graham A.; Whalley, Lisa K.; Ingham, Trevor; Rickard, Andrew R.; Dunmore, Rachel E.; Heard, Dwayne E.; Lewis, Ally C.; Carpenter, Lucy J.; Hamilton, Jacqui F.; Dillon, Terry J.

    2018-03-01

    Gas-phase rate coefficients are fundamental to understanding atmospheric chemistry, yet experimental data are not available for the oxidation reactions of many of the thousands of volatile organic compounds (VOCs) observed in the troposphere. Here, a new experimental method is reported for the simultaneous study of reactions between multiple different VOCs and OH, the most important daytime atmospheric radical oxidant. This technique is based upon established relative rate concepts but has the advantage of a much higher throughput of target VOCs. By evaluating multiple VOCs in each experiment, and through measurement of the depletion in each VOC after reaction with OH, the OH + VOC reaction rate coefficients can be derived. Results from experiments conducted under controlled laboratory conditions were in good agreement with the available literature for the reaction of 19 VOCs, prepared in synthetic gas mixtures, with OH. This approach was used to determine a rate coefficient for the reaction of OH with 2,3-dimethylpent-1-ene for the first time; k = 5.7 (±0.3) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹. In addition, a further seven VOCs had only two, or fewer, individual OH rate coefficient measurements available in the literature. The results from this work were in good agreement with those measurements. A similar dataset, at an elevated temperature of 323 (±10) K, was used to determine new OH rate coefficients for 12 aromatic, 5 alkane, 5 alkene and 3 monoterpene VOC + OH reactions. In OH relative reactivity experiments that used ambient air at the University of York, a large number of different VOCs were observed, of which 23 were positively identified. Due to difficulties with detection limits and fully resolving peaks, only 19 OH rate coefficients were derived from these ambient air samples, including 10 reactions for which data were previously unavailable at the elevated reaction temperature of T = 323 (±10) K.
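The relative rate concept underlying this method can be sketched numerically: the log-depletion of a target VOC is regressed against that of a reference compound whose OH rate coefficient is known, and the slope scales the reference value. The numbers below are invented for illustration, not data from the study.

```python
import numpy as np

# Relative rate method (illustrative numbers, not data from the study):
# ln([VOC]0/[VOC]t) = (k_voc / k_ref) * ln([ref]0/[ref]t)
k_ref = 2.44e-11  # assumed known OH rate coefficient of a reference VOC, cm3 molecule-1 s-1

ln_dep_ref = np.array([0.10, 0.25, 0.40, 0.60])  # reference log-depletions
ln_dep_voc = np.array([0.23, 0.58, 0.93, 1.40])  # target VOC log-depletions

slope = np.polyfit(ln_dep_ref, ln_dep_voc, 1)[0]  # least-squares slope
k_voc = slope * k_ref                             # derived OH + VOC rate coefficient
```

Because only depletion ratios enter, absolute concentrations and OH exposure need not be known, which is what allows many VOCs to be evaluated in a single experiment.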

  5. Time-consistent and market-consistent evaluations

    NARCIS (Netherlands)

    Pelsser, A.; Stadje, M.A.

    2014-01-01

    We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from

  6. Preparation of CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes via a solvothermal method

    International Nuclear Information System (INIS)

    Liang Xiaojuan; Zhong Jiasong; Yang Fan; Hua Wei; Jin Huaidong; Liu Haitao; Sun Juncai; Xiang Weidong

    2011-01-01

    Highlights: → We report for the first time a small-biomolecule-assisted route using L-cysteine as sulfur source and complexing agent to synthesize CuIn0.5Ga0.5S2 crystals. → The possible mechanisms leading to CuIn0.5Ga0.5S2 flowers consisting of nanoflakes are proposed. → In addition, the morphology, structure, and phase composition of the as-prepared CuIn0.5Ga0.5S2 products were investigated in detail by XRD, FESEM, EDS, XPS, TEM (HRTEM) and SAED. - Abstract: CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes were successfully prepared by a biomolecule-assisted solvothermal route at 220 °C for 10 h, employing copper chloride, gallium chloride, indium chloride and L-cysteine as precursors. The biomolecule L-cysteine, acting as the sulfur source, was found to play a very important role in the formation of the final product. The diameter of the CuIn0.5Ga0.5S2 flowers was 1-2 μm, and the thickness of the flakes was about 15 nm. The obtained products were characterized by X-ray diffraction (XRD), energy-dispersive spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS), field-emission scanning electron microscopy (FESEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), selected area electron diffraction (SAED), and UV-vis absorption spectroscopy. The influences of the reaction temperature, reaction time, sulfur source and the molar ratio of Cu to L-cysteine (reactants) on the formation of the target compound were investigated. The formation mechanism of the CuIn0.5Ga0.5S2 flowers consisting of flakes is discussed.

  7. Consistency of orthodox gravity

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)

    1997-01-01

    A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion for unifying couplings is suggested by applying the argument to more complex systems.

  8. Quasiparticles and thermodynamical consistency

    International Nuclear Information System (INIS)

    Shanenko, A.A.; Biro, T.S.; Toneev, V.D.

    2003-01-01

    A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
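The relations referred to here derive from the finite-temperature Hellmann-Feynman theorem; the following is a compact statement of that theorem in standard thermodynamic notation, not the paper's specific quasiparticle constraints.

```latex
% Finite-temperature Hellmann--Feynman theorem for F(\lambda) = -T \ln Z(\lambda):
F(\lambda) = -T \ln \operatorname{Tr} e^{-H(\lambda)/T},
\qquad
\frac{\partial F}{\partial \lambda}
  = \Bigl\langle \frac{\partial H(\lambda)}{\partial \lambda} \Bigr\rangle_{T} .
```

When quasiparticle parameters such as an effective mass $m^{*}(T,\mu)$ enter the Hamiltonian, applying the theorem with $\lambda = m^{*}$ constrains the model so that the pressure, entropy and density obtained from the partition function remain mutually consistent (e.g. $\varepsilon = Ts - p + \mu n$).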

  9. Systematic homogenization and self-consistent flux and pin power reconstruction for nodal diffusion methods. 1: Diffusion equation-based theory

    International Nuclear Information System (INIS)

    Zhang, H.; Rizwan-uddin; Dorning, J.J.

    1995-01-01

    A diffusion equation-based systematic homogenization theory and a self-consistent dehomogenization theory for fuel assemblies have been developed for use with coarse-mesh nodal diffusion calculations of light water reactors. The theoretical development is based on a multiple-scales asymptotic expansion carried out through second order in a small parameter, the ratio of the average diffusion length to the reactor characteristic dimension. By starting from the neutron diffusion equation for a three-dimensional heterogeneous medium and introducing two spatial scales, the development systematically yields an assembly-homogenized global diffusion equation with self-consistent expressions for the assembly-homogenized diffusion tensor elements and cross sections and assembly-surface-flux discontinuity factors. The reactor eigenvalue 1/k_eff is shown to be obtained to second order in the small parameter, and the heterogeneous diffusion theory flux is shown to be obtained to leading order in that parameter. The latter of these two results provides a natural procedure for the reconstruction of the local fluxes and the determination of pin powers, even though homogenized assemblies are used in the global nodal diffusion calculation.

  10. Studies on HEPA filter test methods

    International Nuclear Information System (INIS)

    Lee, S.H.; Jon, K.S.; Park, W.J.; Ryoo, R.

    1981-01-01

    The purpose of this study is to compare testing methods of the HEPA filter adopted in other countries with each other, and to design and construct a test duct system to establish testing methods. The American D.O.P. test method, the British NaCl test method and several other independently developed methods are compared. It is considered that the D.O.P. method is most suitable for in-plant and leak tests

  11. A Dietary Feedback System for the Delivery of Consistent Personalized Dietary Advice in the Web-Based Multicenter Food4Me Study.

    Science.gov (United States)

    Forster, Hannah; Walsh, Marianne C; O'Donovan, Clare B; Woolhead, Clara; McGirr, Caroline; Daly, E J; O'Riordan, Richard; Celis-Morales, Carlos; Fallaize, Rosalind; Macready, Anna L; Marsaux, Cyril F M; Navas-Carretero, Santiago; San-Cristobal, Rodrigo; Kolossa, Silvia; Hartwig, Kai; Mavrogianni, Christina; Tsirigoti, Lydia; Lambrinou, Christina P; Godlewska, Magdalena; Surwiłło, Agnieszka; Gjelstad, Ingrid Merethe Fange; Drevon, Christian A; Manios, Yannis; Traczyk, Iwona; Martinez, J Alfredo; Saris, Wim H M; Daniel, Hannelore; Lovegrove, Julie A; Mathers, John C; Gibney, Michael J; Gibney, Eileen R; Brennan, Lorraine

    2016-06-30

    Despite numerous healthy eating campaigns, the prevalence of diets high in saturated fatty acids, sugar, and salt and low in fiber, fruit, and vegetables remains high. With more people than ever accessing the Internet, Web-based dietary assessment instruments have the potential to promote healthier dietary behaviors via personalized dietary advice. The objectives of this study were to develop a dietary feedback system for the delivery of consistent personalized dietary advice in a multicenter study and to examine the impact of automating the advice system. The development of the dietary feedback system included 4 components: (1) designing a system for categorizing nutritional intakes; (2) creating a method for prioritizing 3 nutrient-related goals for subsequent targeted dietary advice; (3) constructing decision tree algorithms linking data on nutritional intake to feedback messages; and (4) developing personal feedback reports. The system was used manually by researchers to provide personalized nutrition advice based on dietary assessment to 369 participants during the Food4Me randomized controlled trial, with an automated version developed on completion of the study. Saturated fatty acid, salt, and dietary fiber were most frequently selected as nutrient-related goals across the 7 centers. Average agreement between the manual and automated systems, in selecting 3 nutrient-related goals for personalized dietary advice across the centers, was highest for nutrient-related goals 1 and 2 and lower for goal 3, averaging at 92%, 87%, and 63%, respectively. Complete agreement between the 2 systems for feedback advice message selection averaged at 87% across the centers. The dietary feedback system was used to deliver personalized dietary advice within a multi-country study. Overall, there was good agreement between the manual and automated feedback systems, giving promise to the use of automated systems for personalizing dietary advice. Clinicaltrials.gov NCT01530139
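The manual-versus-automated comparison reported above reduces to a per-rank percent agreement over participants. A minimal sketch, with invented goal selections rather than Food4Me data:

```python
# Per-goal agreement between a manual and an automated advice system.
# The participant goal selections below are invented for illustration.
manual = [("sat_fat", "salt", "fibre"),
          ("salt", "sugar", "fibre"),
          ("sat_fat", "fibre", "salt"),
          ("fibre", "salt", "sugar")]
automated = [("sat_fat", "salt", "sugar"),
             ("salt", "sugar", "fibre"),
             ("sat_fat", "fibre", "salt"),
             ("fibre", "sugar", "salt")]

def percent_agreement(position):
    """% of participants for whom both systems picked the same goal at this rank."""
    hits = sum(m[position] == a[position] for m, a in zip(manual, automated))
    return 100.0 * hits / len(manual)

goal1, goal2, goal3 = (percent_agreement(i) for i in range(3))
```

The pattern in the study (high agreement for goals 1 and 2, lower for goal 3) would show up here as a declining sequence of per-rank percentages.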

  12. A self-consistent, multivariate method for the determination of gas-phase rate coefficients, applied to reactions of atmospheric VOCs and the hydroxyl radical

    Directory of Open Access Journals (Sweden)

    J. T. Shaw

    2018-03-01

    Gas-phase rate coefficients are fundamental to understanding atmospheric chemistry, yet experimental data are not available for the oxidation reactions of many of the thousands of volatile organic compounds (VOCs) observed in the troposphere. Here, a new experimental method is reported for the simultaneous study of reactions between multiple different VOCs and OH, the most important daytime atmospheric radical oxidant. This technique is based upon established relative rate concepts but has the advantage of a much higher throughput of target VOCs. By evaluating multiple VOCs in each experiment, and through measurement of the depletion in each VOC after reaction with OH, the OH + VOC reaction rate coefficients can be derived. Results from experiments conducted under controlled laboratory conditions were in good agreement with the available literature for the reaction of 19 VOCs, prepared in synthetic gas mixtures, with OH. This approach was used to determine a rate coefficient for the reaction of OH with 2,3-dimethylpent-1-ene for the first time; k = 5.7 (±0.3) × 10⁻¹¹ cm³ molecule⁻¹ s⁻¹. In addition, a further seven VOCs had only two, or fewer, individual OH rate coefficient measurements available in the literature. The results from this work were in good agreement with those measurements. A similar dataset, at an elevated temperature of 323 (±10) K, was used to determine new OH rate coefficients for 12 aromatic, 5 alkane, 5 alkene and 3 monoterpene VOC + OH reactions. In OH relative reactivity experiments that used ambient air at the University of York, a large number of different VOCs were observed, of which 23 were positively identified. Due to difficulties with detection limits and fully resolving peaks, only 19 OH rate coefficients were derived from these ambient air samples, including 10 reactions for which data were previously unavailable at the elevated reaction temperature of T = 323 (±10) K.

  13. Preparation of CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes via a solvothermal method

    Energy Technology Data Exchange (ETDEWEB)

    Liang Xiaojuan [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Institute of Materials and Technology, Dalian Maritime University, Dalian 116026 (China); Zhong Jiasong; Yang Fan; Hua Wei; Jin Huaidong [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Liu Haitao, E-mail: lht@wzu.edu.cn [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China); Sun Juncai [Institute of Materials and Technology, Dalian Maritime University, Dalian 116026 (China); Xiang Weidong, E-mail: weidongxiang@yahoo.com.cn [College of Chemistry and Materials Engineering, Wenzhou University, Wenzhou, Zhejiang Province 325035 (China)

    2011-05-26

    Highlights: > We report for the first time a small-biomolecule-assisted route using L-cysteine as sulfur source and complexing agent to synthesize CuIn0.5Ga0.5S2 crystals. > The possible mechanisms leading to CuIn0.5Ga0.5S2 flowers consisting of nanoflakes are proposed. > In addition, the morphology, structure, and phase composition of the as-prepared CuIn0.5Ga0.5S2 products were investigated in detail by XRD, FESEM, EDS, XPS, TEM (HRTEM) and SAED. - Abstract: CuIn1-xGaxS2 (x = 0.5) flowers consisting of nanoflakes were successfully prepared by a biomolecule-assisted solvothermal route at 220 °C for 10 h, employing copper chloride, gallium chloride, indium chloride and L-cysteine as precursors. The biomolecule L-cysteine, acting as the sulfur source, was found to play a very important role in the formation of the final product. The diameter of the CuIn0.5Ga0.5S2 flowers was 1-2 μm, and the thickness of the flakes was about 15 nm. The obtained products were characterized by X-ray diffraction (XRD), energy-dispersive spectroscopy (EDS), X-ray photoelectron spectroscopy (XPS), field-emission scanning electron microscopy (FESEM), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), selected area electron diffraction (SAED), and UV-vis absorption spectroscopy. The influences of the reaction temperature, reaction time, sulfur source and the molar ratio of Cu to L-cysteine (reactants) on the formation of the target compound were investigated. The formation mechanism of the CuIn0.5Ga0.5S2 flowers consisting of flakes is discussed.

  14. Self-consistent method for quantifying indium content from X-ray spectra of thick compound semiconductor specimens in a transmission electron microscope.

    Science.gov (United States)

    Walther, T; Wang, X

    2016-05-01

    Based on Monte Carlo simulations of X-ray generation by fast electrons, we calculate curves of effective sensitivity factors for analytical transmission electron microscopy-based energy-dispersive X-ray spectroscopy, including absorption and fluorescence effects, as a function of the Ga K/L ratio for different indium- and gallium-containing compound semiconductors. For the case of InGaN alloy thin films we show that experimental spectra can thus be quantified without the need to measure specimen thickness or density, yielding self-consistent values for quantification with the Ga K and Ga L lines. The effect of uncertainties in the detector efficiency is also shown to be reduced. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.
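The quantification step rests on a Cliff-Lorimer-style intensity ratio. A toy sketch of the idea (the intensities and k-factors below are placeholders, not the effective sensitivity factors computed in the paper): the indium cation fraction follows from a line-intensity ratio, and evaluating it against both the Ga K and Ga L lines gives the self-consistency check the abstract describes.

```python
def indium_fraction(i_in, i_ga, k_factor):
    """Cation fraction x in In(x)Ga(1-x)N from a Cliff-Lorimer ratio:
    C_In / C_Ga = k * (I_In / I_Ga), with C_In + C_Ga = 1."""
    ratio = k_factor * i_in / i_ga
    return ratio / (1.0 + ratio)

# Placeholder intensities and k-factors for the Ga K and Ga L lines:
x_from_K = indium_fraction(i_in=350.0, i_ga=1050.0, k_factor=1.5)  # vs Ga K
x_from_L = indium_fraction(i_in=350.0, i_ga=700.0, k_factor=1.0)   # vs Ga L

# Self-consistency: the two estimates of x should agree within tolerance.
consistent = abs(x_from_K - x_from_L) < 0.02
```

In the paper the sensitivity factors themselves vary with the measured Ga K/L ratio (a proxy for thickness), which is what removes the need to measure thickness or density directly.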

  15. BMI was found to be a consistent determinant related to misreporting of energy, protein and potassium intake using self-report and duplicate portion methods

    NARCIS (Netherlands)

    Trijsburg, L.E.; Geelen, M.M.E.E.; Hollman, P.C.H.; Hulshof, P.J.M.; Feskens, E.J.M.; Veer, van 't P.; Boshuizen, H.C.; Vries, de J.H.M.

    2017-01-01


    As misreporting, mostly under-reporting, of dietary intake is a generally known problem in nutritional research, we aimed to analyse the association between selected determinants and the extent of misreporting by the duplicate portion method (DP), 24 h recall (24hR) and FFQ by linear regression

  16. Second-order perturbation theory with a density matrix renormalization group self-consistent field reference function: theory and application to the study of chromium dimer.

    Science.gov (United States)

    Kurashige, Yuki; Yanai, Takeshi

    2011-09-07

    We present a second-order perturbation theory based on a density matrix renormalization group self-consistent field (DMRG-SCF) reference function. The method reproduces the solution of the complete active space with second-order perturbation theory (CASPT2) when the DMRG reference function is represented by a sufficiently large number of renormalized many-body basis states, and is thereby named the DMRG-CASPT2 method. The DMRG-SCF is able to describe non-dynamical correlation with large active spaces that are insurmountable to the conventional CASSCF method, while the second-order perturbation theory provides an efficient description of dynamical correlation effects. The capability of our implementation is demonstrated for an application to the potential energy curve of the chromium dimer, which is one of the most demanding multireference systems, requiring the best electronic structure treatment of non-dynamical and dynamical correlation as well as large basis sets. The DMRG-CASPT2/cc-pwCV5Z calculations were performed with a large (3d double-shell) active space consisting of 28 orbitals. Our approach using a large-size DMRG reference addressed the problems of why the dissociation energy is largely overestimated by CASPT2 with the small active space consisting of 12 orbitals (3d4s), and why it is oversensitive to the choice of the zeroth-order Hamiltonian. © 2011 American Institute of Physics.

  17. The association of neighbourhood and individual social capital with consistent self-rated health: a longitudinal study in Brazilian pregnant and postpartum women

    Directory of Open Access Journals (Sweden)

    Lamarca Gabriela A

    2013-01-01

    Abstract Background: Social conditions, social relationships and neighbourhood environment, the components of social capital, are important determinants of health. The objective of this study was to investigate the association of neighbourhood and individual social capital with consistent self-rated health in women between the first trimester of pregnancy and six months postpartum. Methods: A multilevel cohort study in 34 neighbourhoods was performed on 685 Brazilian women recruited at antenatal units in two cities in the State of Rio de Janeiro, Brazil. Self-rated health (SRH) was assessed in the 1st trimester of pregnancy (baseline) and six months after childbirth (follow-up). The participants were divided into two groups: 1. Good SRH – good SRH at baseline and follow-up, and 2. Poor SRH – poor SRH at baseline and follow-up. Exploratory variables collected at baseline included neighbourhood social capital (a neighbourhood-level variable), individual social capital (social support and social networks), demographic and socioeconomic characteristics, health-related behaviours and self-reported diseases. A hierarchical binomial multilevel analysis was performed to test the association between neighbourhood and individual social capital and SRH, adjusted for covariates. Results: The Good SRH group reported higher scores of social support and social networks than the Poor SRH group. Although low neighbourhood social capital was associated with poor SRH in the crude analysis, the association was not significant when individual socio-demographic variables were included in the model. In the final model, women reporting poor SRH both at baseline and follow-up had lower levels of social support (positive social interaction) [OR 0.82 (95% CI: 0.73-0.90)] and a lower likelihood of friendship social networks [OR 0.61 (95% CI: 0.37-0.99)] than the Good SRH group. The characteristics that remained associated with poor SRH were low level of schooling, Black and Brown

  18. Method and equipment for fast transmission of a signal consisting of many data pulses by a normal well-logging cable

    International Nuclear Information System (INIS)

    Pitts, R.W. Jr.; Whatley, H.A. Jr.

    1975-01-01

    Well logging methods and equipment in general, and a nuclear well logging process and apparatus in particular, are presented. They increase the number of pulses which can be transmitted to the top of the well from a well logging instrument placed at the bottom, during a given time interval, by means of a processing step that yields two pulses for each pulse corresponding to a detection. The equipment is a double-spectrum well logging system with near and far detectors. [fr]

  19. Active teaching methods, studying responses and learning

    DEFF Research Database (Denmark)

    Christensen, Hans Peter; Vigild, Martin Etchells; Thomsen, Erik Vilain

    2010-01-01

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching. The resulting learning outcome is discussed.

  20. Active teaching methods, studying responses and learning

    DEFF Research Database (Denmark)

    Christensen, Hans Peter; Vigild, Martin Etchells; Thomsen, Erik Vilain

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching.

  1. Combining the Complete Active Space Self-Consistent Field Method and the Full Configuration Interaction Quantum Monte Carlo within a Super-CI Framework, with Application to Challenging Metal-Porphyrins.

    Science.gov (United States)

    Li Manni, Giovanni; Smart, Simon D; Alavi, Ali

    2016-03-08

    A novel stochastic Complete Active Space Self-Consistent Field (CASSCF) method has been developed and implemented in the Molcas software package. A two-step procedure is used, in which the CAS configuration interaction secular equations are solved stochastically with the Full Configuration Interaction Quantum Monte Carlo (FCIQMC) approach, while orbital rotations are performed using an approximated form of the Super-CI method. This new method does not suffer from the strong combinatorial limitations of standard MCSCF implementations using direct schemes and can handle active spaces well in excess of those accessible to traditional CASSCF approaches. The density matrix formulation of the Super-CI method makes this step independent of the size of the CI expansion, depending exclusively on one- and two-body density matrices with indices restricted to the relatively small number of active orbitals. No sigma vectors need to be stored in memory for the FCIQMC eigensolver, a substantial gain in comparison to implementations using the Davidson method, which requires three or more vectors of the size of the CI expansion. Further, no orbital Hessian is computed, circumventing limitations on basis set expansions. Like the parent FCIQMC method, the present technique is scalable on massively parallel architectures. We present in this report the method and its application to the free-base porphyrin, Mg(II) porphyrin, and Fe(II) porphyrin. In the present study, active spaces up to 32 electrons and 29 orbitals in orbital expansions containing up to 916 contracted functions are treated with modest computational resources. Results are quite promising even without accounting for the correlation outside the active space. The systems presented here clearly demonstrate that large CASSCF calculations are possible via FCIQMC-CASSCF without limitations on basis set size.

  2. Radionuclide methods application in cardiac studies

    International Nuclear Information System (INIS)

    Kotina, E.D.; Ploskikh, V.A.; Babin, A.V.

    2013-01-01

    Radionuclide methods are among the most modern methods for functional diagnostics of cardiovascular diseases; they require the use of mathematical methods for processing and analysing the data obtained during the investigation. The study is carried out by means of single-photon emission computed tomography (SPECT). Mathematical methods and software for SPECT data processing are developed. This software allows physiologically meaningful indicators to be defined for cardiac studies.

  3. Self-consistent quark bags

    International Nuclear Information System (INIS)

    Rafelski, J.

    1979-01-01

    After an introductory overview of the bag model, the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the variational approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de]

  4. Coordinating user interfaces for consistency

    CERN Document Server

    Nielsen, Jakob

    2001-01-01

    In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses

  5. Data on consistency among different methods to assess atherosclerotic plaque echogenicity on standard ultrasound and intraplaque neovascularization on contrast-enhanced ultrasound imaging in human carotid artery

    Directory of Open Access Journals (Sweden)

    Mattia Cattaneo

    2016-12-01

    Here we provide the correlation among different carotid ultrasound (US) variables used to assess echogenicity on standard carotid US and to assess intraplaque neovascularization (IPNV) on contrast-enhanced US (CEUS). We recruited 45 consecutive subjects with an asymptomatic ≥50% carotid artery stenosis. Carotid plaque echogenicity at standard US was visually graded according to the Gray-Weale classification (GW) and measured by the greyscale median (GSM), a semi-automated computerized measurement performed with Adobe Photoshop®. On CEUS imaging, IPNV was graded according to the visual appearance of contrast within the plaque by three different methods: CEUS_A (1 = absent; 2 = present); CEUS_B, a three-point scale (increasing IPNV from 1 to 3); and CEUS_C, a four-point scale (increasing IPNV from 0 to 3). We have also implemented a new simple quantification method derived from the region of interest (ROI) signal intensity ratio as assessed by QLAB software. Further information is available in "Contrast-enhanced ultrasound imaging of intraplaque neovascularization and its correlation to plaque echogenicity in human carotid arteries atherosclerosis" (M. Cattaneo, D. Staub, A.P. Porretta, J.M. Gallino, P. Santini, C. Limoni et al., 2016) [1].
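The GSM measure mentioned above is simply the median grey level inside the plaque region of interest. A minimal sketch, with a fabricated ROI rather than real image data:

```python
import numpy as np

# Greyscale median (GSM) of a plaque region of interest.
# The 8-bit pixel values below are fabricated for illustration.
roi = np.array([[12, 40, 55],
                [33, 47, 60],
                [28, 51, 90]], dtype=np.uint8)

gsm = float(np.median(roi))  # lower GSM = more echolucent plaque
```

In practice the ROI is outlined on the B-mode image and the greyscale is first normalized (e.g. to blood and adventitia reference values) before the median is taken.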

  6. Improving consistency in findings from pharmacoepidemiological studies: The IMI-protect (Pharmacoepidemiological research on outcomes of therapeutics by a European consortium) project

    NARCIS (Netherlands)

    De Groot, Mark C.H.; Schlienger, Raymond; Reynolds, Robert; Gardarsdottir, Helga; Juhaeri, Juhaeri; Hesse, Ulrik; Gasse, Christiane; Rottenkolber, Marietta; Schuerch, Markus; Kurz, Xavier; Klungel, Olaf H.

    2013-01-01

    Background: Pharmacoepidemiological (PE) research should provide consistent, reliable and reproducible results to contribute to the benefit-risk assessment of medicines. IMI-PROTECT aims to identify sources of methodological variations in PE studies using a common protocol and analysis plan across

  7. Consistency and Word-Frequency Effects on Spelling among First- To Fifth-Grade French Children: A Regression-Based Study

    Science.gov (United States)

    Lete, Bernard; Peereman, Ronald; Fayol, Michel

    2008-01-01

    We describe a large-scale regression study that examines the influence of lexical (word frequency, lexical neighborhood) and sublexical (feedforward and feedback consistency) variables on spelling accuracy among first, second, and third- to fifth-graders. The wordset analyzed contained 3430 French words. Predictors in the stepwise regression…

  8. College Students; Justification for Digital Piracy: A Mixed Methods Study

    Science.gov (United States)

    Yu, Szde

    2012-01-01

    A mixed methods project was devoted to understanding college students' justification for digital piracy. The project consisted of two studies, a qualitative one and a quantitative one. Qualitative interviews were conducted to identify main themes in students' justification for digital piracy, and then the findings were tested in a quantitative…

  9. Consistency in PERT problems

    OpenAIRE

    Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan

    2016-01-01

    The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
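The Shapley rule characterized in the paper is an instance of the general Shapley value, which averages each player's marginal cost over all arrival orders. A hedged sketch with a toy cost function, not the PERT delay game defined in the paper:

```python
from itertools import permutations
from math import factorial

def shapley(players, cost):
    """Shapley value of a cost game: average marginal contribution of each
    player over all orderings. Exponential-time; fine for toy examples."""
    value = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            with_p = coalition | {p}
            value[p] += cost(with_p) - cost(coalition)
            coalition = with_p
    n_fact = factorial(len(players))
    return {p: v / n_fact for p, v in value.items()}

# Toy delay-cost function: cost grows with the square of coalition size.
alloc = shapley(["a", "b", "c"], cost=lambda s: len(s) ** 2)
```

By construction the allocation is efficient (the shares sum to the grand-coalition cost) and symmetric players receive equal shares, two of the properties such characterizations typically rely on.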

  10. Structural dynamics of phenylisothiocyanate in the light-absorbing excited states: Resonance Raman and complete active space self-consistent field calculation study

    International Nuclear Information System (INIS)

    Ouyang, Bing; Xue, Jia-Dan; Zheng, Xuming; Fang, Wei-Hai

    2014-01-01

    The excited state structural dynamics of phenyl isothiocyanate (PITC) after excitation to the light absorbing S 2 (A′), S 6 (A′), and S 7 (A′) excited states were studied by using the resonance Raman spectroscopy and complete active space self-consistent field method calculations. The UV absorption bands of PITC were assigned. The vibrational assignments were done on the basis of the Fourier transform (FT)-Raman and FT-infrared measurements, the density-functional theory computations, and the normal mode analysis. The A-, B-, and C-bands resonance Raman spectra in cyclohexane, acetonitrile, and methanol solvents were, respectively, obtained at 299.1, 282.4, 266.0, 252.7, 228.7, 217.8, and 208.8 nm excitation wavelengths to probe the corresponding structural dynamics of PITC. The results indicated that the structural dynamics in the S 2 (A′), S 6 (A′), and S 7 (A′) excited states were very different. The conical intersection point CI(S 2 /S 1 ) were predicted to play important role in the low-lying excited state decay dynamics. Two major decay channels were predicted for PITC upon excitation to the S 2 (A′) state: the radiative S 2,min → S 0 transition and the nonradiative S 2 → S 1 internal conversion via CI(S 2 /S 1 ). The differences in the decay dynamics between methyl isothiocyanate and PITC in the first light absorbing excited state were discussed. The role of the intersystem crossing point ISC(S 1 /T 1 ) in the excited state decay dynamics of PITC is evaluated

  11. A systematic approach to evaluate parameter consistency in the inlet stream of source separated biowaste composting facilities: A case study in Colombia.

    Science.gov (United States)

    Oviedo-Ocaña, E R; Torres-Lozada, P; Marmolejo-Rebellon, L F; Torres-López, W A; Dominguez, I; Komilis, D; Sánchez, A

    2017-04-01

    Biowaste is commonly the largest fraction of municipal solid waste (MSW) in developing countries. Although composting is an effective method to treat source separated biowaste (SSB), there are certain limitations in terms of operation, partly due to insufficient control of the variability of SSB quality, which affects process kinetics and product quality. This study assesses the variability of the SSB physicochemical quality in a composting facility located in a small town in Colombia, in which SSB collection was performed twice a week. Likewise, the influence of SSB physicochemical variability on the variability of compost parameters was assessed. Parametric and non-parametric tests (i.e. Student's t-test and the Mann-Whitney test) showed no significant differences in the quality parameters of SSB among collection days, and therefore it was unnecessary to establish specific operation and maintenance regulations for each collection day. Significant variability was found in eight of the twelve quality parameters analyzed in the inlet stream, with corresponding coefficients of variation (CV) higher than 23%. The CVs for the eight parameters analyzed in the final compost (i.e. pH, moisture, total organic carbon, total nitrogen, C/N ratio, total phosphorus, total potassium and ash) ranged from 9.6% to 49.4%, with significant variations in five of those parameters (CV>20%). These results indicate that variability in the inlet stream can affect the variability of the end-product, and suggest the need to consider inlet-stream variability in the performance of composting facilities to achieve a compost of consistent quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
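
The variability screening described above rests on the coefficient of variation. A minimal sketch of that check, with hypothetical moisture readings and the study's CV > 20% compost-parameter threshold, might look like:

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = 100 * sample standard deviation / mean."""
    mean = statistics.mean(values)
    return 100.0 * statistics.stdev(values) / mean

# Hypothetical weekly moisture readings (%) for an inlet stream
moisture = [62.0, 55.0, 71.0, 48.0, 66.0, 58.0]
cv = coefficient_of_variation(moisture)
flagged = cv > 20.0  # the study treats CV > 20% as significant variability
```

The same function applied per quality parameter (pH, C/N ratio, etc.) reproduces the kind of screening table the abstract summarizes.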

  12. Social supports and mental health: a cross-sectional study on the correlation of self-consistency and congruence in China.

    Science.gov (United States)

    Gu, YanMei; Hu, Jie; Hu, YaPing; Wang, JianRong

    2016-06-28

    Psychosocial job characteristics require nursing staff with high self-consistency and good mental health. However, attention to and effort in such studies have remained very limited in China. A self-administered questionnaire was distributed to the bedside nurses in an affiliated hospital of Hebei Medical University, China. Of 218 registered bedside nurses eligible to participate in the anonymous survey, the data-producing sample of 172 subjects yielded an effective response rate of 79%. The Social Support Rating Scale was used to measure social support, and the Self-Consistency and Congruence Scale was used to measure mental health. Compared with a normal reference group of college students, the sample group showed higher self-flexibility scores and lower self-conflict and self-stethoscope scores, with statistical significance for the self-conflict scores. Close correlations were observed between participants' social support and Self-Consistency and Congruence Scale scores. Differences in Social Support Rating Scale scores were significant across demographic features including years of work, marital status, only-child family status, and level of cooperation with other health workers. Bedside nurses in this study show good inner harmony, and their self-consistency and congruence correlate closely with their level of social support. Thus, it is important to improve inner perception of support and external factors, such as workplace support, and to offer a beneficial social environment to improve bedside nurses' sub-health symptoms and decrease the high turnover rate.

  13. Consistent Visual Analyses of Intrasubject Data

    Science.gov (United States)

    Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli

    2010-01-01

    Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…

  14. Geometrically Consistent Mesh Modification

    KAUST Repository

    Bonito, A.

    2010-01-01

    A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.

  15. Consistency of land surface reflectance data: presentation of a new tool and case study with Formosat-2, SPOT-4 and Landsat-5/7/8 data

    Science.gov (United States)

    Claverie, M.; Vermote, E.; Franch, B.; Huc, M.; Hagolle, O.; Masek, J.

    2013-12-01

    Maintaining a consistent dataset of Surface Reflectance (SR) data derived from the large panel of in-orbit sensors is an important challenge for ensuring long-term analysis of earth observation data. Continuous validation of such SR products through comparison with a reference dataset is thus essential. Validating with in situ or airborne SR data is not easy, since those sensors rarely match the spectral, spatial, and directional characteristics of the satellite measurement. Inter-comparison between satellite sensor data therefore appears to be a valuable tool for maintaining long-term consistency of the data. However, satellite data are acquired at various times of day (i.e., under varying atmospheric content) and within a relatively large range of geometries (view and sun angles). Also, even if the band-to-band spectral characteristics of optical sensors are close, they rarely have identical spectral responses. As a result, direct comparisons that do not account for these differences are poorly suited. In this study, we suggest a new systematic method to assess land optical SR data from high- to medium-resolution sensors. We used MODIS SR products (MO/YD09CMG), which benefit from a long-term calibration/validation process, to assess SR from three sensors: Formosat-2 (280 scenes, 24x24 km, 5 sites), SPOT-4 (62 scenes, 120x60 km, 1 site) and Landsat-5/7 (104 scenes, 180x180 km, 50 sites). The main issue concerns the difference in acquisition geometry between MODIS and the compared sensor data. We used the VJB model (Vermote et al. 2009, TGRS) to correct MODIS SR for BRDF effects and to simulate SR at the corresponding geometry (view and sun angles) of each pixel of the compared sensor data. The comparison is done at the CMG spatial resolution (0.05°), which ensures a constant field-of-view and negligible geometrical errors. Figure 1 displays a summary of the NIR results through APU graphs, where the metrics A, P and U stand for Accuracy, Precision and

  16. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients

    Directory of Open Access Journals (Sweden)

    Benditz A

    2016-12-01

    Full Text Available Achim Benditz,1 Felix Greimel,1 Patrick Auer,2 Florian Zeman,3 Antje Göttermann,4 Joachim Grifka,1 Winfried Meissner,4 Frederik von Kunow1 1Department of Orthopedics, University Medical Center Regensburg, 2Clinic for Anesthesia, Asklepios Klinikum Bad Abbach, Bad Abbach, 3Centre for Clinical Studies, University Medical Center Regensburg, Regensburg, 4Department of Anesthesiology and Intensive Care, Jena University Hospital, Jena, Germany Background: The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. Methods: All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the Germany-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of all results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. Results: From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction

  17. Activating teaching methods, studying responses and learning

    OpenAIRE

    Christensen, Hans Peter; Vigild, Martin E.; Thomsen, Erik; Szabo, Peter; Horsewell, Andy

    2009-01-01

    Students’ study strategies when exposed to activating teaching methods are measured, analysed and compared to study strategies in more traditional lecture-based teaching. The resulting learning outcome is discussed. Peer Reviewed

  18. Recent Studies on Trojan Horse Method

    International Nuclear Information System (INIS)

    Cherubini, S.; Spitaleri, C.; Gulino, M.

    2011-01-01

    The study of nuclear reactions that are important for the understanding of astrophysical problems has received increasing attention over the last decades. The Trojan Horse Method was proposed as a tool to overcome some of the problems connected with the measurement of cross-sections between charged particles at astrophysical energies. Here we present some recent studies on this method. (authors)

  19. Consistently high sports/exercise activity is associated with better sleep quality, continuity and depth in midlife women: the SWAN sleep study.

    Science.gov (United States)

    Kline, Christopher E; Irish, Leah A; Krafty, Robert T; Sternfeld, Barbara; Kravitz, Howard M; Buysse, Daniel J; Bromberger, Joyce T; Dugan, Sheila A; Hall, Martica H

    2013-09-01

    To examine relationships between different physical activity (PA) domains and sleep, and the influence of consistent PA on sleep, in midlife women. Cross-sectional. Community-based. 339 women in the Study of Women's Health Across the Nation Sleep Study (52.1 ± 2.1 y). None. Sleep was examined using questionnaires, diaries and in-home polysomnography (PSG). PA was assessed in three domains (Active Living, Household/Caregiving, Sports/Exercise) using the Kaiser Physical Activity Survey (KPAS) up to 4 times over 6 years preceding the sleep assessments. The association between recent PA and sleep was evaluated using KPAS scores immediately preceding the sleep assessments. The association between the historical PA pattern and sleep was examined by categorizing PA in each KPAS domain according to its pattern over the 6 years preceding sleep assessments (consistently low, inconsistent/consistently moderate, or consistently high). Greater recent Sports/Exercise activity was associated with better sleep quality (diary "restedness" [P sleep continuity (diary sleep efficiency [SE; P = 0.02]) and depth (higher NREM delta electroencephalographic [EEG] power [P = 0.04], lower NREM beta EEG power [P Sports/Exercise activity was also associated with better Pittsburgh Sleep Quality Index scores (P = 0.02) and higher PSG-assessed SE (P sleep and Active Living or Household/Caregiving activity (either recent or historical pattern) were noted. Consistently high levels of recreational physical activity, but not lifestyle- or household-related activity, are associated with better sleep in midlife women. Increasing recreational physical activity early in midlife may protect against sleep disturbance in this population.

  20. Frequency and determinants of consistent STI/HIV testing among men who have sex with men testing at STI outpatient clinics in the Netherlands: a longitudinal study.

    Science.gov (United States)

    Visser, Maartje; Heijne, Janneke C M; Hogewoning, Arjan A; van Aar, Fleur

    2017-09-01

    Men who have sex with men (MSM) are at the highest risk for STIs and HIV infections in the Netherlands. However, official guidelines on STI testing among MSM are lacking. They are advised to test for STIs at least every six months, but their testing behaviour is not well known. This study aimed to gain insight into the proportion and determinants of consistent 6-monthly STI testing among MSM testing at STI outpatient clinics in the Netherlands. This study included longitudinal surveillance data of STI consultations among MSM from all 26 STI outpatient clinics in the Netherlands between 1 June 2014 and 31 December 2015. Multinomial logistic regression analysis was used to identify determinants of consistent 6-monthly testing compared with single testing and inconsistent testing. Determinants of time between consultations among men with multiple consultations were analysed using a Cox Prentice-Williams-Peterson gap-time model. A total of 34 605 STI consultations of 18 634 MSM were included. 8966 (48.1%) men had more than one consultation, and 3516 (18.9%) tested consistently 6-monthly. Indicators of high sexual risk behaviour, including a history of STI, being HIV positive and having more than 10 sex partners, were positively associated both with being a consistent tester and with returning to the STI clinic sooner. Men who were notified by a partner or who reported STI symptoms were also more likely to return to the STI clinic sooner, but were less likely to be consistent testers, identifying a group of event-driven testers. The proportion of consistent 6-monthly testers among MSM visiting Dutch STI outpatient clinics was low. Testing behaviour was associated with sexual risk behaviour, but the exact motives for testing consistently remain unclear. Evidence-based testing guidelines are needed to achieve optimal reductions in STI transmission. Published by the BMJ Publishing Group Limited.

  1. Time-consistent actuarial valuations

    NARCIS (Netherlands)

    Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.

    2016-01-01

    Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an

  2. The Rucio Consistency Service

    CERN Document Server

    Serfon, Cedric; The ATLAS collaboration

    2016-01-01

    One of the biggest challenges with a large-scale data management system is ensuring consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we present this system, explain its internals and give some results.
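
At its core, the detection step the abstract describes is a reconciliation of two file listings. A simplified sketch, with hypothetical file names and not the actual Rucio API, could look like:

```python
def classify_inconsistencies(catalog_files, storage_files):
    """Compare a file-catalog dump against a storage-element listing.

    Returns (lost, dark): files registered in the catalog but missing on
    storage, and files present on storage but unknown to the catalog.
    """
    catalog = set(catalog_files)
    storage = set(storage_files)
    lost = catalog - storage   # candidates for recovery from other replicas
    dark = storage - catalog   # unregistered "dark data", candidates for deletion
    return lost, dark

lost, dark = classify_inconsistencies(
    ["a.root", "b.root", "c.root"],
    ["b.root", "c.root", "tmp_old.root"],
)
# lost == {"a.root"}, dark == {"tmp_old.root"}
```

The production system adds safeguards (grace periods, replica checks) before acting, but the set comparison above is the essential idea.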

  3. Experimental study on rapid embankment construction methods

    International Nuclear Information System (INIS)

    Hirano, Hideaki; Egawa, Kikuji; Hyodo, Kazuya; Kannoto, Yasuo; Sekimoto, Tsuyoshi; Kobayashi, Kokichi.

    1982-01-01

    In the construction of a thermal or nuclear power plant in a coastal area, shorter embankment construction periods have recently come to be called for. This tendency is most pronounced where the construction period is limited by meteorological or sea conditions. To meet this requirement, the authors have been conducting basic experimental studies on two methods for rapid embankment construction: the Steel Plate Cellular Bulkhead Embedding Method and the Ship Hull Caisson Method. This paper presents an outline of the results of the experimental study on these two methods. (author)

  4. Is cosmology consistent?

    International Nuclear Information System (INIS)

    Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias

    2002-01-01

    We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best-fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest-neutrino mass range of 0.04-4.2 eV and the sharpest constraints to date on gravity waves, which (together with a preference for a slight red-tilt) favor 'small-field' inflation models.

  5. Consistent Quantum Theory

    Science.gov (United States)

    Griffiths, Robert B.

    2001-11-01

    Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. A comprehensive account written by one of the main figures in the field; this is the paperback edition of a successful work on the philosophy of quantum mechanics.

  6. Cross- cultural validation of the Brazilian Portuguese version of the Social Phobia Inventory (SPIN): study of the items and internal consistency.

    Science.gov (United States)

    Osório, Flávia de Lima; Crippa, José Alexandre S; Loureiro, Sonia Regina

    2009-03-01

    The objective of the present study was to carry out the cross-cultural validation for Brazilian Portuguese of the Social Phobia Inventory, an instrument for the evaluation of fear, avoidance and physiological symptoms associated with social anxiety disorder. The process of translation and adaptation involved four bilingual professionals, appreciation and approval of the back-translation by the authors of the original scale, a pilot study with 30 Brazilian university students, and appreciation by raters who confirmed the face validity of the Portuguese version, which was named 'Inventário de Fobia Social'. As part of the psychometric study of the Social Phobia Inventory, analysis of the items and evaluation of the internal consistency of the instrument were performed in a study conducted on 2314 university students. The results demonstrated that item 11, related to the fear of public speaking, was the most frequently scored item. The correlation of the items with the total score was quite adequate, ranging from 0.44 to 0.71, as was the internal consistency, which ranged from 0.71 to 0.90. The authors conclude that the Brazilian Portuguese version of the Social Phobia Inventory proved to be adequate regarding the psychometric properties initially studied, with qualities quite close to those of the original study. Studies evaluating the remaining indicators of validity of the Social Phobia Inventory in clinical and non-clinical samples are considered opportune and necessary.

  7. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time.

  8. A relativistic self-consistent model for studying enhancement of space charge limited field emission due to counter-streaming ions

    International Nuclear Information System (INIS)

    Lin, M. C.; Lu, P. S.; Chang, P. C.; Ragan-Kelley, B.; Verboncoeur, J. P.

    2014-01-01

    Recently, field emission has attracted increasing attention despite the practical limitation that field emitters operate below the Child-Langmuir space charge limit. By introducing counter-streaming ion flow to neutralize the electron charge density, the space charge limited field emission (SCLFE) current can be dramatically enhanced. In this work, we have developed a relativistic self-consistent model for studying the enhancement of SCLFE by a counter-streaming ion current. The maximum enhancement is found when the ion effect is saturated, as shown analytically. The solutions in non-relativistic, intermediate, and ultra-relativistic regimes are obtained and verified with 1-D particle-in-cell simulations. This self-consistent model is general and can also serve as a benchmark or comparison for verification of simulation codes, as well as extension to higher dimensions.

  9. Decay correction methods in dynamic PET studies

    International Nuclear Information System (INIS)

    Chen, K.; Reiman, E.; Lawson, M.

    1995-01-01

    In order to reconstruct positron emission tomography (PET) images in quantitative dynamic studies, the data must be corrected for radioactive decay. One of the two commonly used methods ignores physiological processes, including blood flow, that occur at the same time as radioactive decay; the other makes incorrect use of time-accumulated PET counts. In simulated dynamic PET studies using ¹¹C-acetate and ¹⁸F-fluorodeoxyglucose (FDG), these methods are shown to result in biased estimates of the time-activity curve (TAC) and model parameters. New methods described in this article provide significantly improved parameter estimates in dynamic PET studies.
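
For context, the standard frame-wise decay-correction factor, which accounts for decay both before a frame starts and during the frame itself, can be sketched as follows. This is a generic illustration of the kind of correction under discussion, not the article's new method; the function name and the ¹⁸F example are illustrative:

```python
import math

def frame_decay_correction(t_start, duration, half_life):
    """Multiplicative factor correcting a frame's counts back to t = 0.

    Combines decay before the frame (exp(lambda * t_start)) with the
    average-activity correction for decay during the frame.
    """
    lam = math.log(2) / half_life
    return lam * duration * math.exp(lam * t_start) / (1.0 - math.exp(-lam * duration))

# 18F (half-life ~109.77 min): a 5-min frame starting 30 min post-injection
f = frame_decay_correction(30.0, 5.0, 109.77)
```

For a vanishingly short frame at t = 0 the factor tends to 1, as expected.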

  10. Consistent force fields for saccharides

    DEFF Research Database (Denmark)

    Rasmussen, Kjeld

    1999-01-01

    Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x… …-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...

  11. Critical Appraisal of Mixed Methods Studies

    Science.gov (United States)

    Heyvaert, Mieke; Hannes, Karin; Maes, Bea; Onghena, Patrick

    2013-01-01

    In several subdomains of the social, behavioral, health, and human sciences, research questions are increasingly answered through mixed methods studies, combining qualitative and quantitative evidence and research elements. Accordingly, the importance of including those primary mixed methods research articles in systematic reviews grows. It is…

  12. A method optimization study for atomic absorption ...

    African Journals Online (AJOL)

    A sensitive, reliable and relatively fast method has been developed for the determination of total zinc in insulin by atomic absorption spectrophotometry. This study was designed to optimize the procedures of the existing methods. Spectrograms of both standard and sample solutions of zinc were recorded by measuring ...

  13. Optimizing Usability Studies by Complementary Evaluation Methods

    NARCIS (Netherlands)

    Schmettow, Martin; Bach, Cedric; Scapin, Dominique

    2014-01-01

    This paper examines combinations of complementary evaluation methods as a strategy for efficient usability problem discovery. A data set from an earlier study is re-analyzed, involving three evaluation methods applied to two virtual environment applications. Results of a mixed-effects logistic

  14. A New Method to Study Analytic Inequalities

    Directory of Open Access Journals (Sweden)

    Xiao-Ming Zhang

    2010-01-01

    Full Text Available We present a new method to study analytic inequalities involving n variables. As applications, we prove some well-known inequalities and improve Carleman's inequality.

  15. Market-consistent actuarial valuation

    CERN Document Server

    Wüthrich, Mario V

    2016-01-01

    This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.

  16. Stool consistency and stool frequency are excellent clinical markers for adequate colon preparation after polyethylene glycol 3350 cleansing protocol: a prospective clinical study in children.

    Science.gov (United States)

    Safder, Shaista; Demintieva, Yulia; Rewalt, Mary; Elitsur, Yoram

    2008-12-01

    Colon preparation for a colonoscopy in children is a difficult task because of the unpalatable taste and large volume of cleansing solution that must be consumed to ensure a clean colon. Consequently, an unprepared colon frequently occurs in routine practice, causing early termination and repeat procedures. The aims were (1) to assess the effectiveness of polyethylene glycol solution (PEG 3350) in preparing the colon of children scheduled for a colonoscopy and (2) to investigate clinical markers associated with adequate colon preparation before a colonoscopy. A total of 167 children scheduled for a colonoscopy were enrolled. In a prospective study, children scheduled for a colonoscopy were given PEG 3350 solution (1.5 g/kg per day, up to 100 g/d) over a 4-day preparation period. Each day, parents completed a simple questionnaire documenting the amount of liquid consumed, adverse effects, and the number and consistency of stools. After the colonoscopy, the colon preparation was assigned a number grade. The data were then assessed to determine the association between the grade of cleansing and the frequency and/or consistency of stools during preparation. Colon preparation was completed in 149 children, 133 of whom were adequately prepared. Inadequate preparation was found in 16 children; the procedure was terminated prematurely in 2 of these patients because of unacceptable conditions. No significant adverse effects were noted. Passing ≥5 stools/d and liquid stool consistency in the last 2 days of preparation were associated with adequate colon preparation. PEG 3350 solution is safe, efficacious, and tolerable for children. Stool frequency and consistency in the last 2 days of preparation were excellent markers (positive predictive value 91%-95%) predicting an adequately clean colon before a colonoscopy in children.
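
The positive predictive value quoted for these markers is a simple ratio of true positives to all marker-positive cases; a minimal sketch with hypothetical counts:

```python
def positive_predictive_value(true_pos, false_pos):
    """PPV = TP / (TP + FP): fraction of marker-positive cases that are truly positive."""
    return true_pos / (true_pos + false_pos)

# Hypothetical counts: of 100 children meeting the stool-frequency criterion,
# 92 turned out to have adequately prepared colons
ppv = positive_predictive_value(92, 8)  # 0.92, within the study's 91%-95% range
```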

  17. The Principle of Energetic Consistency

    Science.gov (United States)

    Cohn, Stephen E.

    2009-01-01

    A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
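
The invariant stated above, the energy of the conditional mean plus the trace of the conditional covariance, can be checked numerically for a toy energy-conserving linear system (a plane rotation, with E(x) = |x|²). This sketch is illustrative and not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Energy-conserving linear dynamics: a rotation Q (orthogonal), energy E(x) = |x|^2
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Ensemble drawn from some prior, standing in for the conditional distribution
ensemble = rng.normal(size=(500, 2)) + np.array([2.0, -1.0])

def total_quantity(ens):
    """Energy of the ensemble mean plus the total variance (trace of covariance)."""
    m = ens.mean(axis=0)
    P = np.cov(ens, rowvar=False)
    return m @ m + np.trace(P)

before = total_quantity(ensemble)
after = total_quantity(ensemble @ Q.T)  # propagate each member through Q
# 'before' and 'after' agree to machine precision: |Qm|^2 + tr(QPQ^T) = |m|^2 + tr(P)
```

The equality holds because an orthogonal Q preserves both the norm of the mean and the trace under the similarity transform, which is exactly the principle applied to linear energy-conserving dynamics.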

  18. Comparative Study of Daylighting Calculation Methods

    Directory of Open Access Journals (Sweden)

    Mandala Ariani

    2018-01-01

    Full Text Available The aim of this study is to assess five daylighting calculation methods commonly used in architectural study. The methods include hand-calculation methods (the SNI/DPMB method and the BRE Daylighting Protractors), scale models studied in an artificial sky simulator, and computer simulations using the Dialux and Velux lighting software. The test room is conditioned by uniform sky conditions and simple room geometry, with variations of room reflectance (black, grey, and white colour). The analyses compare the results (including daylight factor, illumination, and coefficient of uniformity values) and examine the similarities and differences between the methods. The colour-variation trials are used to analyse the contribution of the internally reflected component to the result.
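The quantities compared in the study above have simple standard definitions. A minimal sketch (the function names and the min/average uniformity convention are assumptions of this illustration, not taken from the paper):

```python
def daylight_factor(indoor_lux, outdoor_lux):
    """Daylight factor (%): indoor horizontal illuminance under an overcast
    sky, as a percentage of the simultaneous unobstructed outdoor illuminance."""
    return 100.0 * indoor_lux / outdoor_lux

def coefficient_of_uniformity(illuminances):
    """One common uniformity measure: minimum over average illuminance
    across the measurement points of a room."""
    return min(illuminances) / (sum(illuminances) / len(illuminances))

# 250 lx indoors under a 10,000 lx overcast sky gives DF = 2.5%
assert daylight_factor(250.0, 10000.0) == 2.5
```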

  19. Deep Feature Consistent Variational Autoencoder

    OpenAIRE

    Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping

    2016-01-01

    We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...

  20. Consistency of GPS and strong-motion records: case study of the Mw9.0 Tohoku-Oki 2011 earthquake

    Science.gov (United States)

    Psimoulis, Panos; Houlié, Nicolas; Michel, Clotaire; Meindl, Michael; Rothacher, Markus

    2014-05-01

    High-rate GPS data are today commonly used to supplement seismic data for Earth surface motions, focusing on earthquake characterisation and rupture modelling. Processing of GPS records using Precise Point Positioning (PPP) can provide real-time information for seismic wave propagation, tsunami early warning and seismic rupture. Most studies have shown differences between the GPS and seismic systems at very long periods (e.g. >100 s) and for static displacements. The aim of this study is the assessment of the consistency of GPS and strong-motion records by comparing their respective displacement waveforms for several frequency bands. For this purpose, the records of the GPS (GEONET) and the strong-motion (KiK-net and K-NET) networks corresponding to the Mw9.0 Tohoku 2011 earthquake were analysed. The comparison of the displacement waveforms of collocated (distance <100 m) GPS and strong-motion sites shows that the consistency between the two datasets depends on the frequency of the excitation. Differences are mainly due to GPS noise at relatively short periods (<3-4 s) and the saturation of the strong-motion sensors at relatively long periods (40-80 s). Furthermore, the agreement between the GPS and strong-motion records also depends on the direction of the excitation signal and the distance from the epicentre. In conclusion, velocities and displacements recovered from GPS and strong-motion records are consistent at long periods (3-100 s), proving that GPS networks can contribute to the real-time estimation of the long-period ground-motion map of an earthquake.

  1. Comparative studies on different molecular methods for ...

    African Journals Online (AJOL)

    The present study aims to evaluate two molecular methods for epidemiological typing of multi drug resistant Klebsiella pneumoniae isolated from Mansoura Hospitals. In this study, a total of 300 clinical isolates were collected from different patients distributed among Mansoura Hospitals, Dakahlia governorate, Egypt.

  2. Comparative study of environmental impact assessment methods ...

    African Journals Online (AJOL)

    This study aims to introduce and systematically investigate the environmental issues during important decision-making stages. Meanwhile, impacts of development on the environmental components will be also analyzed. This research studies various methods of predicting the environmental changes and determining the ...

  3. Consistency of extreme flood estimation approaches

    Science.gov (United States)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in different methods. In this study, the following three different approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
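As a concrete instance of the first approach above (ordinary extreme value statistics), a T-year return level can be estimated by fitting a Gumbel distribution to annual maximum discharges. This is a minimal method-of-moments sketch, not the SCHADEX or PMF procedures from the study; operational work would typically use GEV fits with L-moments or maximum likelihood:

```python
import math

def gumbel_return_level(annual_maxima, return_period_years):
    """Fit a Gumbel distribution to annual maxima by the method of moments
    and return the T-year flood estimate."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in annual_maxima) / (n - 1))
    beta = std * math.sqrt(6.0) / math.pi        # scale parameter
    mu = mean - 0.5772156649 * beta              # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period_years          # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

maxima = [100, 120, 90, 150, 110, 130, 95, 140, 105, 125]  # hypothetical m^3/s
q100 = gumbel_return_level(maxima, 100)
```

Extrapolating far beyond the record length (e.g. a 1000-year flood from a 10-year series) is exactly where the consistency questions examined in the study arise.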

  4. SS-HORSE method for studying resonances

    Energy Technology Data Exchange (ETDEWEB)

    Blokhintsev, L. D. [Moscow State University, Skobeltsyn Institute of Nuclear Physics (Russian Federation); Mazur, A. I.; Mazur, I. A., E-mail: 008043@pnu.edu.ru [Pacific National University (Russian Federation); Savin, D. A.; Shirokov, A. M. [Moscow State University, Skobeltsyn Institute of Nuclear Physics (Russian Federation)

    2017-03-15

    A new method for analyzing resonance states based on the Harmonic-Oscillator Representation of Scattering Equations (HORSE) formalism and analytic properties of partial-wave scattering amplitudes is proposed. The method is tested by applying it to the model problem of neutral-particle scattering and can be used to study resonance states on the basis of microscopic calculations performed within various versions of the shell model.

  5. Generating Consistent Program Tutorials

    DEFF Research Database (Denmark)

    Vestdam, Thomas

    2002-01-01

    In this paper we present a tool that supports construction of program tutorials. A program tutorial provides the reader with an understanding of an example program by interleaving fragments of source code and explaining text. An example program can for example illustrate how to use a library or a framework. We present a means for specifying the fragments of a program that are to be in-lined in the tutorial text. These in-line fragments are defined by addressing named syntactical elements, such as classes and methods, but it is also possible to address individual code lines by labeling them. We see potential in using the tool to produce program tutorials to be used for frameworks, libraries, and in educational contexts.

  6. All SNPs are not created equal: Genome-wide association studies reveal a consistent pattern of enrichment among functionally annotated SNPs

    NARCIS (Netherlands)

    Schork, A.J.; Thompson, W.K.; Pham, P.; Torkamani, A.; Roddey, J.C.; Sullivan, P.F.; Kelsoe, J.; O'Donovan, M.C.; Furberg, H.; Absher, D.; Agudo, A.; Almgren, P.; Ardissino, D.; Assimes, T.L.; Bandinelli, S.; Barzan, L.; Bencko, V.; Benhamou, S.; Benjamin, E.J.; Bernardinelli, L.; Bis, J.; Boehnke, M.; Boerwinkle, E.; Boomsma, D.I.; Brennan, P.; Canova, C.; Castellsagué, X.; Chanock, S.; Chasman, D.I.; Conway, D.I.; Dackor, J.; de Geus, E.J.C.; Duan, J.; Elosua, R.; Everett, B.; Fabianova, E.; Ferrucci, L.; Foretova, L.; Fortmann, S.P.; Franceschini, N.; Frayling, T.M.; Furberg, C.; Gejman, P.V.; Groop, L.; Gu, F.; Guralnik, J.; Hankinson, S.E.; Haritunians, T.; Healy, C.; Hofman, A.; Holcátová, I.; Hunter, D.J.; Hwang, S.J.; Ioannidis, J.P.A.; Iribarren, C.; Jackson, A.U.; Janout, V.; Kaprio, J.; Kim, Y.; Kjaerheim, K.; Knowles, J.W.; Kraft, P.; Ladenvall, C.; Lagiou, P.; Lanthrop, M.; Lerman, C.; Levinson, D.F.; Levy, D.; Li, M.D.; Lin, D.Y.; Lips, E.H.; Lissowska, J.; Lowry, R.B.; Lucas, G.; Macfarlane, T.V.; Maes, H.H.M.; Mannucci, P.M.; Mates, D.; Mauri, F.; McGovern, J.A.; McKay, J.D.; McKnight, B.; Melander, O.; Merlini, P.A.; Milaneschi, Y.; Mohlke, K.L.; O'Donnell, C.J.; Pare, G.; Penninx, B.W.J.H.; Perry, J.R.B.; Posthuma, D.; Preis, S.R.; Psaty, B.; Quertermous, T.; Ramachandran, V.S.; Richiardi, L.; Ridker, P.M.; Rose, J.; Rudnai, P.; Salomaa, V.; Sanders, A.R.; Schwartz, S.M.; Shi, J.; Smit, J.H.; Stringham, H.M.; Szeszenia-Dabrowska, N.; Tanaka, T.; Taylor, K.; Thacker, E.E.; Thornton, L.; Tiemeier, H.; Tuomilehto, J.; Uitterlinden, A.G.; van Duijn, C.M.; Vink, J.M.; Vogelzangs, N.; Voight, B.F.; Walter, S.; Willemsen, G.; Zaridze, D.; Znaor, A.; Akil, H.; Anjorin, A.; Backlund, L.; Badner, J.A.; Barchas, J.D.; Barrett, T.; Bass, N.; Bauer, M.; Bellivier, F.; Bergen, S.E.; Berrettini, W.; Blackwood, D.; Bloss, C.S.; Breen, G.; Breuer, R.; Bunner, W.E.; Burmeister, M.; Byerley, W. 
F.; Caesar, S.; Chambert, K.; Cichon, S.; St Clair, D.; Collier, D.A.; Corvin, A.; Coryell, W.H.; Craddock, N.; Craig, D.W.; Daly, M.; Day, R.; Degenhardt, F.; Djurovic, S.; Dudbridge, F.; Edenberg, H.J.; Elkin, A.; Etain, B.; Farmer, A.E.; Ferreira, M.A.; Ferrier, I.; Flickinger, M.; Foroud, T.; Frank, J.; Fraser, C.; Frisén, L.; Gershon, E.S.; Gill, M.; Gordon-Smith, K.; Green, E.K.; Greenwood, T.A.; Grozeva, D.; Guan, W.; Gurling, H.; Gustafsson, O.; Hamshere, M.L.; Hautzinger, M.; Herms, S.; Hipolito, M.; Holmans, P.A.; Hultman, C. M.; Jamain, S.; Jones, E.G.; Jones, I.; Jones, L.; Kandaswamy, R.; Kennedy, J.L.; Kirov, G. K.; Koller, D.L.; Kwan, P.; Landén, M.; Langstrom, N.; Lathrop, M.; Lawrence, J.; Lawson, W.B.; Leboyer, M.; Lee, P.H.; Li, J.; Lichtenstein, P.; Lin, D.; Liu, C.; Lohoff, F.W.; Lucae, S.; Mahon, P.B.; Maier, W.; Martin, N.G.; Mattheisen, M.; Matthews, K.; Mattingsdal, M.; McGhee, K.A.; McGuffin, P.; McInnis, M.G.; McIntosh, A.; McKinney, R.; McLean, A.W.; McMahon, F.J.; McQuillin, A.; Meier, S.; Melle, I.; Meng, F.; Mitchell, P.B.; Montgomery, G.W.; Moran, J.; Morken, G.; Morris, D.W.; Moskvina, V.; Muglia, P.; Mühleisen, T.W.; Muir, W.J.; Müller-Myhsok, B.; Myers, R.M.; Nievergelt, C.M.; Nikolov, I.; Nimgaonkar, V.L.; Nöthen, M.M.; Nurnberger, J.I.; Nwulia, E.A.; O'Dushlaine, C.; Osby, U.; Óskarsson, H.; Owen, M.J.; Petursson, H.; Pickard, B.S.; Porgeirsson, P.; Potash, J.B.; Propping, P.; Purcell, S.M.; Quinn, E.; Raychaudhuri, S.; Rice, J.; Rietschel, M.; Ruderfer, D.; Schalling, M.; Schatzberg, A.F.; Scheftner, W.A.; Schofield, P.R.; Schulze, T.G.; Schumacher, J.; Schwarz, M.M.; Scolnick, E.; Scott, L.J.; Shilling, P.D.; Sigurdsson, E.; Sklar, P.; Smith, E.N.; Stefansson, H.; Stefansson, K.; Steffens, M; Steinberg, S.; Strauss, J.; Strohmaier, J.; Szelinger, S.; Thompson, R.C.; Tozzi, F.; Treutlein, J.; Vincent, J.B.; Watson, S.J.; Wienker, T.F.; Williamson, R.; Witt, S.H.; Wright, A.; Xu, W.; Young, A.H.; Zandi, P.P.; Zhang, P.; Zöllner, 
S.; Agartz, I.; Albus, M.; Alexander, M.; Amdur, R. L.; Amin, F.; Bitter, I.; Black, D.W.; Børglum, A.D.; Brown, M.A.; Bruggeman, R.; Buccola, N.G.; Cahn, W.; Cantor, R.M.; Carr, V.J.; Catts, S. V.; Choudhury, K.; Cloninger, C. R.; Cormican, P.; Danoy, P. A.; Datta, S.; DeHert, M.; Demontis, D.; Dikeos, D.; Donnelly, P.; Donohoe, G.; Duong, L.; Dwyer, S.; Fanous, A.; Fink-Jensen, A.; Freedman, R.; Freimer, N.B.; Friedl, M.; Georgieva, L.; Giegling, I.; Glenthoj, B.; Godard, S.; Golimbet, V.; de Haan, L.; Hansen, M.; Hansen, T.; Hartmann, A.M.; Henskens, F. A.; Hougaard, D. M.; Ingason, A.; Jablensky, A. V.; Jakobsen, K.D.; Jay, M.; Jönsson, E.G.; Jürgens, G.; Kahn, R.S.; Keller, M.C.; Kendler, K.S.; Kenis, G.; Kenny, E.; Konnerth, H.; Konte, B.; Krabbendam, L.; Krasucki, R.; Lasseter, V. K.; Laurent, C.; Lencz, T.; Lerer, F. B.; Liang, K. Y.; Lieberman, J. A.; Linszen, D.H.; Lönnqvist, J.; Loughland, C. M.; Maclean, A. W.; Maher, B.S.; Malhotra, A.K.; Mallet, J.; Malloy, P.; McGrath, J. J.; McLean, D. E.; Michie, P. T.; Milanova, V.; Mors, O.; Mortensen, P.B.; Mowry, B. J.; Myin-Germeys, I.; Neale, B.; Nertney, D. A.; Nestadt, G.; Nielsen, J.; Nordentoft, M.; Norton, N.; O'Neill, F.; Olincy, A.; Olsen, L.; Ophoff, R.A.; Orntoft, T. F.; van Os, J.; Pantelis, C.; Papadimitriou, G.; Pato, C.N.; Peltonen, L.; Pickard, B.; Pietilainen, O.P.; Pimm, J.; Pulver, A. E.; Puri, V.; Quested, D.; Rasmussen, H.B.; Rethelyi, J.M.; Ribble, R.; Riley, B.P.; Rossin, L.; Ruggeri, M.; Rujescu, D.; Schall, U.; Schwab, S. G.; Scott, R.J.; Silverman, J.M.; Spencer, C. C.; Strange, A.; Strengman, E.; Stroup, T.S.; Suvisaari, J.; Terenius, L.; Thirumalai, S.; Timm, S.; Toncheva, D.; Tosato, S.; van den Oord, E.J.; Veldink, J.; Visscher, P.M.; Walsh, D.; Wang, A. G.; Werge, T.; Wiersma, D.; Wildenauer, D. B.; Williams, H.J.; Williams, N.M.; van Winkel, R.; Wormley, B.; Zammit, S.; Schork, N.J.; Andreassen, O.A.; Dale, A.M.

    2013-01-01

    Recent results indicate that genome-wide association studies (GWAS) have the potential to explain much of the heritability of common complex phenotypes, but methods are lacking to reliably identify the remaining associated single nucleotide polymorphisms (SNPs). We applied stratified False Discovery

  7. All SNPs Are Not Created Equal: Genome-Wide Association Studies Reveal a Consistent Pattern of Enrichment among Functionally Annotated SNPs

    NARCIS (Netherlands)

    Schork, Andrew J.; Thompson, Wesley K.; Pham, Phillip; Torkamani, Ali; Roddey, J. Cooper; Sullivan, Patrick F.; Kelsoe, John R.; O'Donovan, Michael C.; Furberg, Helena; Schork, Nicholas J.; Andreassen, Ole A.; Dale, Anders M.; Absher, Devin; Agudo, Antonio; Almgren, Peter; Ardissino, Diego; Assimes, Themistocles L.; Bandinelli, Stephania; Barzan, Luigi; Bencko, Vladimir; Benhamou, Simone; Benjamin, Emelia J.; Bernardinelli, Luisa; Bis, Joshua; Boehnke, Michael; Boerwinkle, Eric; Boomsma, Dorret I.; Brennan, Paul; Canova, Cristina; Castellsagué, Xavier; Chanock, Stephen; Chasman, Daniel; Conway, David I.; Dackor, Jennifer; de Geus, Eco J. C.; Duan, Jubao; Elosua, Roberto; Everett, Brendan; Fabianova, Eleonora; Ferrucci, Luigi; Foretova, Lenka; Fortmann, Stephen P.; Franceschini, Nora; Frayling, Timothy; Furberg, Curt; Gejman, Pablo V.; Groop, Leif; Gu, Fangyi; de Haan, Lieuwe; Linszen, Don H.

    2013-01-01

    Recent results indicate that genome-wide association studies (GWAS) have the potential to explain much of the heritability of common complex phenotypes, but methods are lacking to reliably identify the remaining associated single nucleotide polymorphisms (SNPs). We applied stratified False Discovery

  8. Is consumer response to plain/standardised tobacco packaging consistent with framework convention on tobacco control guidelines? A systematic review of quantitative studies.

    Science.gov (United States)

    Stead, Martine; Moodie, Crawford; Angus, Kathryn; Bauld, Linda; McNeill, Ann; Thomas, James; Hastings, Gerard; Hinds, Kate; O'Mara-Eves, Alison; Kwan, Irene; Purves, Richard I; Bryce, Stuart L

    2013-01-01

    Standardised or 'plain' tobacco packaging was introduced in Australia in December 2012 and is currently being considered in other countries. The primary objective of this systematic review was to locate, assess and synthesise published and grey literature relating to the potential impacts of standardised tobacco packaging as proposed by the guidelines for the international Framework Convention on Tobacco Control: reduced appeal, increased salience and effectiveness of health warnings, and more accurate perceptions of product strength and harm. Electronic databases were searched and researchers in the field were contacted to identify studies. Eligible studies were published or unpublished primary research of any design, issued since 1980 and concerning tobacco packaging. Twenty-five quantitative studies reported relevant outcomes and met the inclusion criteria. A narrative synthesis was conducted. Studies that explored the impact of package design on appeal consistently found that standardised packaging reduced the appeal of cigarettes and smoking, and was associated with perceived lower quality, poorer taste and less desirable smoker identities. Although findings were mixed, standardised packs tended to increase the salience and effectiveness of health warnings in terms of recall, attention, believability and seriousness, with effects being mediated by the warning size, type and position on pack. Pack colour was found to influence perceptions of product harm and strength, with darker coloured standardised packs generally perceived as containing stronger tasting and more harmful cigarettes than fully branded packs; lighter coloured standardised packs suggested weaker and less harmful cigarettes. Findings were largely consistent, irrespective of location and sample. The evidence strongly suggests that standardised packaging will reduce the appeal of packaging and of smoking in general; that it will go some way to reduce consumer misperceptions regarding product harm

  9. Is consumer response to plain/standardised tobacco packaging consistent with framework convention on tobacco control guidelines? A systematic review of quantitative studies.

    Directory of Open Access Journals (Sweden)

    Martine Stead

    Full Text Available Standardised or 'plain' tobacco packaging was introduced in Australia in December 2012 and is currently being considered in other countries. The primary objective of this systematic review was to locate, assess and synthesise published and grey literature relating to the potential impacts of standardised tobacco packaging as proposed by the guidelines for the international Framework Convention on Tobacco Control: reduced appeal, increased salience and effectiveness of health warnings, and more accurate perceptions of product strength and harm. Electronic databases were searched and researchers in the field were contacted to identify studies. Eligible studies were published or unpublished primary research of any design, issued since 1980 and concerning tobacco packaging. Twenty-five quantitative studies reported relevant outcomes and met the inclusion criteria. A narrative synthesis was conducted. Studies that explored the impact of package design on appeal consistently found that standardised packaging reduced the appeal of cigarettes and smoking, and was associated with perceived lower quality, poorer taste and less desirable smoker identities. Although findings were mixed, standardised packs tended to increase the salience and effectiveness of health warnings in terms of recall, attention, believability and seriousness, with effects being mediated by the warning size, type and position on pack. Pack colour was found to influence perceptions of product harm and strength, with darker coloured standardised packs generally perceived as containing stronger tasting and more harmful cigarettes than fully branded packs; lighter coloured standardised packs suggested weaker and less harmful cigarettes. Findings were largely consistent, irrespective of location and sample. The evidence strongly suggests that standardised packaging will reduce the appeal of packaging and of smoking in general; that it will go some way to reduce consumer misperceptions

  10. Apparently-Different Clearance Rates from Cohort Studies of Mycoplasma genitalium Are Consistent after Accounting for Incidence of Infection, Recurrent Infection, and Study Design.

    Directory of Open Access Journals (Sweden)

    Timo Smieszek

    Full Text Available Mycoplasma genitalium is a potentially major cause of urethritis, cervicitis, pelvic inflammatory disease, infertility, and increased HIV risk. A better understanding of its natural history is crucial to informing control policy. Two extensive cohort studies (students in London, UK; sex workers in Uganda) suggest very different clearance rates; we aimed to understand the reasons and obtain improved estimates by making maximal use of the data from the studies. As M. genitalium is a sexually-transmitted infectious disease, we developed a model for time-to-event analysis that incorporates the processes of (re)infection and clearance, and fitted it to data from the two cohort studies to estimate incidence and clearance rates under different scenarios of sexual partnership dynamics and study design (including sample handling and associated test sensitivity). In the London students, the estimated clearance rate is 0.80 p.a. (mean duration 15 months), with incidence 1.31%-3.93% p.a. Without adjusting for study design, the corresponding estimates from the Ugandan data are 3.44 p.a. (mean duration 3.5 months) and 58% p.a. Apparent differences in clearance rates are probably mostly due to lower testing sensitivity in the Uganda study due to differences in sample handling, with 'true' clearance rates being similar, and adjusted incidence in Uganda being 28% p.a. Some differences are perhaps due to the sex workers having more-frequent antibiotic treatment, whilst reinfection within ongoing sexual partnerships might have caused some of the apparently-persistent infection in the London students. More information on partnership dynamics would inform more accurate estimates of natural-history parameters. Detailed studies in men are also required.
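The mean durations quoted above follow from the standard assumption of exponentially distributed clearance times, under which the mean infection duration is the reciprocal of the clearance rate. A minimal check (the helper name is an illustration, not from the paper):

```python
def mean_duration_months(clearance_rate_per_year):
    """Under exponential clearance, mean infection duration = 1 / rate;
    convert from years to months."""
    return 12.0 / clearance_rate_per_year

# Reproduces the figures quoted in the abstract:
assert round(mean_duration_months(0.80)) == 15     # London students
assert round(mean_duration_months(3.44), 1) == 3.5  # Ugandan sex workers
```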

  11. Validation of temporal and spatial consistency of facility- and speed-specific vehicle-specific power distributions for emission estimation: A case study in Beijing, China.

    Science.gov (United States)

    Zhai, Zhiqiang; Song, Guohua; Lu, Hongyu; He, Weinan; Yu, Lei

    2017-09-01

    Vehicle-specific power (VSP) has been found to be highly correlated with vehicle emissions. It is used in many studies on emission modeling, such as the MOVES (Motor Vehicle Emissions Simulator) model. The existing studies develop specific VSP distributions (or OpMode distributions in MOVES) for different road types and various average speeds to represent the vehicle operating modes on road. However, it is still not clear whether the facility- and speed-specific VSP distributions are consistent temporally and spatially. For instance, is it necessary to update the database of VSP distributions in the emission model periodically? Are the VSP distributions developed in the city central business district (CBD) area applicable to its suburb area? In this context, this study examined the temporal and spatial consistency of the facility- and speed-specific VSP distributions in Beijing. The VSP distributions in different years and in different areas are developed, based on real-world vehicle activity data. The root mean square error (RMSE) is employed to quantify the difference between the VSP distributions. The maximum differences of the VSP distributions between different years and between different areas are approximately 20% of that between different road types. The analysis of the carbon dioxide (CO2) emission factor indicates that the temporal and spatial differences of the VSP distributions have no significant impact on vehicle emission estimation, with relative error of less than 3%. The temporal and spatial differences have no significant impact on the development of the facility- and speed-specific VSP distributions for vehicle emission estimation. The database of the specific VSP distributions in VSP-based emission models can therefore be maintained over time. Thus, it is unnecessary to update the database regularly, and it is reliable to use historical vehicle activity data to forecast emissions in the future. In one city, the areas with less data can still
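The two quantities central to the study above are straightforward to compute. A sketch using the widely cited light-duty VSP coefficient set (the coefficients are the generic published values, not necessarily those calibrated in the Beijing study), plus the RMSE comparison between binned distributions:

```python
import math

def vsp_kw_per_tonne(speed_ms, accel_ms2, grade=0.0):
    """Vehicle-specific power (kW/tonne) for a typical light-duty vehicle,
    from speed (m/s), acceleration (m/s^2) and road grade (fraction)."""
    return speed_ms * (1.1 * accel_ms2 + 9.81 * grade + 0.132) + 0.000302 * speed_ms ** 3

def rmse(dist_a, dist_b):
    """Root mean square error between two binned VSP distributions,
    each given as a list of bin fractions of equal length."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(dist_a, dist_b)) / len(dist_a))

# Two hypothetical 2-bin distributions differing by 10 percentage points:
assert abs(rmse([0.5, 0.5], [0.4, 0.6]) - 0.1) < 1e-12
```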

  12. Development of methods for body composition studies

    International Nuclear Information System (INIS)

    Mattsson, Soeren; Thomas, Brian J

    2006-01-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)
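One of the oldest techniques the review mentions, fat determination from body density, classically uses the two-compartment Siri equation. A one-line sketch (the Siri formula is standard densitometry, cited here as background rather than taken from the review itself):

```python
def siri_percent_fat(body_density_g_per_ml):
    """Siri two-compartment equation: percent body fat from
    whole-body density (g/ml), as used in densitometric body-composition work."""
    return 495.0 / body_density_g_per_ml - 450.0

# A body density of 1.10 g/ml corresponds to 0% fat in this model:
assert round(siri_percent_fat(1.10), 1) == 0.0
```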

  13. Development of methods for body composition studies

    Energy Technology Data Exchange (ETDEWEB)

    Mattsson, Soeren [Department of Radiation Physics, Lund University, Malmoe University Hospital, SE-205 02 Malmoe (Sweden); Thomas, Brian J [School of Physical and Chemical Sciences, Queensland University of Technology, Brisbane, QLD 4001 (Australia)

    2006-07-07

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease. (review)

  14. Environmental reference materials methods and case studies

    DEFF Research Database (Denmark)

    Schramm-Nielsen, Karina Edith

    1998-01-01

    The methods have been evaluated with regard to their robustness towards variations in the chemical analytical method and with regard to the number of times a significant out-of-control situation is indicated. The second study regards the stability of NH4-N and total phosphorous in autoclaved seawater samples with wastewater. This study lasted 22 months as well. The samples were produced and stored according to a 2³ factorial design. The influences of storage temperature, UV radiation and ultra-filtration on the stability of NH4-N and total phosphorous have been investigated. A Youden plot method is suggested for the graphical... The purpose was to improve ortho-phosphate (and total phosphorous) homogeneity. A procedure is suggested which includes freeze-drying and redissolving. All calculations have been performed in SAS®, primarily by means of elementary procedures, analyses of variance procedures, SAS Insight and SAS...

  15. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host and network rules for automatic rule validation based on set theory is proposed, and a rule-validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
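One of the simplest inconsistencies a set-based formalization can catch is rule shadowing: a rule that can never fire because an earlier rule matches a superset of its traffic. A toy sketch in that spirit (the rule representation as literal string sets, and the function name, are illustrative assumptions; a real validator would reason over address ranges, ports and actions):

```python
def find_shadowed_rules(rules):
    """Return the names of rules shadowed by an earlier rule, i.e. rules whose
    (source, destination) sets are subsets of an earlier rule's sets.
    Each rule: (name, src_set, dst_set, action)."""
    shadowed = []
    for i, (name, src, dst, _act) in enumerate(rules):
        for _pname, psrc, pdst, _pact in rules[:i]:
            if src <= psrc and dst <= pdst:   # set containment check
                shadowed.append(name)
                break
    return shadowed

rules = [
    ("r1", {"10.0.0.0/8"}, {"any"}, "deny"),
    ("r2", {"10.0.0.0/8"}, {"any"}, "allow"),    # unreachable: shadowed by r1
    ("r3", {"192.168.1.0/24"}, {"any"}, "allow"),
]
assert find_shadowed_rules(rules) == ["r2"]
```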

  16. Impact of coil design on the contrast-to-noise ratio, precision, and consistency of quantitative cartilage morphometry at 3 Tesla: a pilot study for the osteoarthritis initiative.

    Science.gov (United States)

    Eckstein, Felix; Kunz, Manuela; Hudelmaier, Martin; Jackson, Rebecca; Yu, Joseph; Eaton, Charles B; Schneider, Erika

    2007-02-01

    Phased-array (PA) coils generally provide higher signal-to-noise ratios (SNRs) than quadrature knee coils. In this pilot study for the Osteoarthritis Initiative (OAI) we compared these two types of coils in terms of contrast-to-noise ratio (CNR), precision, and consistency of quantitative femorotibial cartilage measurements. Test-retest measurements were acquired using coronal fast low-angle shot with water excitation (FLASHwe) and coronal multiplanar reconstruction (MPR) of sagittal double-echo steady state with water excitation (DESSwe) at 3T. The precision errors for cartilage volume and thickness depended on both the coil and the pulse sequence. The PA coil measurements did not always fully agree with the quadrature coil measurements, and some differences were significant. The higher CNR of the PA coil did not translate directly into improved precision of cartilage measurement; however, summing up cartilage plates within the medial and lateral compartments reduced precision errors. Copyright (c) 2007 Wiley-Liss, Inc.

  17. Personality, Study Methods and Academic Performance

    Science.gov (United States)

    Entwistle, N. J.; Wilson, J. D.

    1970-01-01

    A questionnaire measuring four student personality types--stable introvert, unstable introvert, stable extrovert and unstable extrovert--along with the Eysenck Personality Inventory (Form A) was given to 72 graduate students at Aberdeen University, and the results showed recognizable interaction between study methods, motivation and personality…

  18. Combined Teaching Method: An Experimental Study

    Science.gov (United States)

    Kolesnikova, Iryna V.

    2016-01-01

    The search for the best approach to business education has led educators and researchers to seek many different teaching strategies, ranging from the traditional teaching methods to various experimental approaches such as active learning techniques. The aim of this experimental study was to compare the effects of the traditional and combined…

  19. Condensed matter studies by nuclear methods

    International Nuclear Information System (INIS)

    Krolas, K.; Tomala, K.

    1988-01-01

    The separate abstract was prepared for 1 of the papers in this volume. The remaining 13 papers dealing with the use but not with advances in the use of nuclear methods in studies of condensed matter, were considered outside the subject scope of INIS. (M.F.W.)

  20. Study of the orbital correction method

    International Nuclear Information System (INIS)

    Meserve, R.A.

    1976-01-01

    Two approximations of interest in atomic, molecular, and solid state physics are explored. First, a procedure for calculating an approximate Green's function for use in perturbation theory is derived. In lowest order it is shown to be equivalent to treating the contribution of the bound states of the unperturbed Hamiltonian exactly and representing the continuum contribution by plane waves orthogonalized to the bound states (OPW's). If the OPW approximation were inadequate, the procedure allows for systematic improvement of the approximation. For comparison purposes an exact but more limited procedure for performing second-order perturbation theory, one that involves solving an inhomogeneous differential equation, is also derived. Second, the Kohn-Sham many-electron formalism is discussed and formulae are derived and discussed for implementing perturbation theory within the formalism so as to find corrections to the total energy of a system through second order in the perturbation. Both approximations were used in the calculation of the polarizability of helium, neon, and argon. The calculation included direct and exchange effects by the Kohn-Sham method and full self-consistency was demanded. The results using the differential equation method yielded excellent agreement with the coupled Hartree-Fock results of others and with experiment. Moreover, the OPW approximation yielded satisfactory comparison with the results of calculation by the exact differential equation method. Finally, both approximations were used in the calculation of properties of hydrogen fluoride and methane. The appendix formulates a procedure using group theory and the internal coordinates of a molecular system to simplify the calculation of vibrational frequencies

  1. Study on Processing Method of Image Shadow

    Directory of Open Access Journals (Sweden)

    Wang Bo

    2014-07-01

    Full Text Available In order to effectively remove shadow disturbance and enhance the robustness of computer-vision image processing, this paper studies the detection and removal of image shadow. It examines shadow-removal algorithms based on integration, on the illumination surface, and on texture, introduces their working principles and implementation methods, and shows through tests that they can process shadows effectively.

  2. [Obsessive-compulsive symptoms, tics, stereotypic movements or need for absolute consistency? The occurrence of repetitive activities in patients with pervasive developmental disorders--case studies].

    Science.gov (United States)

    Bryńska, Anita; Lipińska, Elzbieta; Matelska, Monika

    2011-01-01

    Repetitive and stereotyped behaviours in the form of stereotyped interests or specific routine activities are one of the diagnostic criteria in pervasive developmental disorders. The occurrence of repetitive behaviours in patients with pervasive developmental disorders is a starting point for questions about the type and classification criteria of such behaviours. The aim of the article is to present case studies of patients with pervasive developmental disorders and co-morbid symptoms in the form of routine activities, tics, obsessive-compulsive symptoms or stereotyped behaviours. The first case study describes a patient with Asperger's syndrome and obsessive-compulsive symptoms. The diagnostic problems regarding complex motor tics are discussed in the second case study, which describes a patient with Asperger's syndrome and Gilles de la Tourette syndrome. The third and fourth case studies describe monozygotic twins with so-called High Functioning Autism whose repetitive activities point to either obsessive-compulsive symptoms, stereotypic movements, a need for absolute consistency, or echopraxia. The possible comorbidity of pervasive developmental disorders and symptoms in the form of repetitive behaviours, possible interactions, as well as diagnostic challenges are discussed in the article.

  3. Designing a mixed methods study in primary care.

    Science.gov (United States)

    Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V

    2004-01-01

    Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.

  4. Methods for Analyzing Multivariate Phenotypes in Genetic Association Studies

    Directory of Open Access Journals (Sweden)

    Qiong Yang

    2012-01-01

    Full Text Available Multivariate phenotypes are frequently encountered in genetic association studies. The purpose of analyzing multivariate phenotypes usually includes the discovery of novel genetic variants with pleiotropic effects, that is, variants affecting multiple phenotypes, and the ultimate goal of uncovering the underlying genetic mechanism. In recent years, there has been new method development as well as application of existing statistical methods to such phenotypes. In this paper, we provide a review of the available methods for analyzing association between a single marker and a multivariate phenotype consisting of the same type of components (e.g., all continuous or all categorical) or different types of components (e.g., some continuous and others categorical). We also review causal inference methods designed to test whether a detected association with the multivariate phenotype is truly pleiotropic or the genetic marker exerts its effects on some phenotypes through affecting the others.
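    The simplest of the single-marker multivariate tests surveyed above combine per-phenotype test statistics. A minimal sketch of an O'Brien-style combined z-test, assuming independent phenotypes and made-up z-scores (correlated phenotypes require the covariance-weighted version, and the review covers many more elaborate methods):

    ```python
    import math

    def combined_z(z_scores):
        """O'Brien-style combined test for a single marker against k phenotypes.

        Assumes the per-phenotype z-statistics are independent (an assumption;
        correlated phenotypes need the covariance-weighted variant)."""
        k = len(z_scores)
        z = sum(z_scores) / math.sqrt(k)
        # two-sided p-value from the standard normal survival function
        p = math.erfc(abs(z) / math.sqrt(2))
        return z, p

    # Hypothetical marker with a consistent (pleiotropic) effect on three traits
    z, p = combined_z([2.0, 1.8, 2.2])
    ```

    A consistent direction of effect across traits yields a combined statistic larger than any single-trait statistic, which is what makes such tests attractive for detecting pleiotropy.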

  5. German precursor study: methods and results

    International Nuclear Information System (INIS)

    Hoertner, H.; Frey, W.; von Linden, J.; Reichart, G.

    1985-01-01

    This study has been prepared by the GRS under contract to the Federal Minister of the Interior. The purpose of the study is to show how the application of system-analytic tools, and especially of probabilistic methods, to the Licensee Event Reports (LERs) and to other operating experience can support a deeper understanding of the safety-related importance of the events reported in reactor operation, the identification of possible weak points, and further conclusions to be drawn from the events. Additionally, the study aimed at a comparison of its results for the severe core damage frequency with those of the German Risk Study, as far as this is possible and useful. The German Precursor Study is a plant-specific study. The reference plant is the Biblis NPP with its very similar Units A and B, whereby the latter was also the reference plant for the German Risk Study.

  6. Fingerprint analysis and quality consistency evaluation of flavonoid compounds for fermented Guava leaf by combining high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry and chemometric methods.

    Science.gov (United States)

    Wang, Lu; Tian, Xiaofei; Wei, Wenhao; Chen, Gong; Wu, Zhenqiang

    2016-10-01

    Guava leaves are used in traditional herbal teas as antidiabetic therapies. Flavonoids are the main active compounds of Guava leaves and have many physiological functions. However, the flavonoid compositions and activities of Guava leaves can change due to microbial fermentation. A high-performance liquid chromatography time-of-flight electrospray ionization mass spectrometry method was applied to identify the varieties of flavonoids in Guava leaves before and after fermentation. High-performance liquid chromatography, hierarchical cluster analysis and principal component analysis were used to quantitatively determine the changes in flavonoid compositions and to evaluate the consistency and quality of Guava leaves. Guava leaves fermented with Monascus anka and Saccharomyces cerevisiae contained 2.32- and 4.06-fold more total flavonoids and quercetin, respectively, than natural Guava leaves. The flavonoid compounds of the natural Guava leaves had similarities ranging from 0.837 to 0.927, whereas those of the fermented Guava leaves had similarities higher than 0.993. This indicates that the quality consistency of the fermented Guava leaves was better than that of the natural Guava leaves. High-performance liquid chromatography fingerprinting and chemometric analysis are promising methods for evaluating the degree of fermentation of Guava leaves based on quality consistency, and could be used in assessing flavonoid compounds for the production of fermented Guava leaves. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
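    The similarity values quoted above (0.837-0.927 for natural versus >0.993 for fermented leaves) come from pairwise fingerprint comparison. A minimal sketch, assuming a cosine similarity on aligned peak-area vectors and entirely hypothetical peak data (the abstract does not state the exact similarity metric used):

    ```python
    import math

    def fingerprint_similarity(a, b):
        """Cosine similarity between two chromatographic fingerprints,
        given as vectors of peak areas aligned by retention time."""
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    # Hypothetical peak-area vectors for two fermented-leaf batches
    batch1 = [12.1, 5.3, 8.7, 2.2, 30.5]
    batch2 = [11.8, 5.6, 8.5, 2.4, 29.9]
    s = fingerprint_similarity(batch1, batch2)
    ```

    Batches with near-identical composition give similarities close to 1, mirroring the >0.993 values reported for the fermented leaves.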

  7. Hair MDMA samples are consistent with reported ecstasy use: findings from a study investigating effects of ecstasy on mood and memory.

    Science.gov (United States)

    Scholey, A B; Owen, L; Gates, J; Rodgers, J; Buchanan, T; Ling, J; Heffernan, T; Swan, P; Stough, C; Parrott, A C

    2011-01-01

    Our group has conducted several Internet investigations into the biobehavioural effects of self-reported recreational use of MDMA (3,4-methylenedioxymethamphetamine or Ecstasy) and other psychosocial drugs. Here we report a new study examining the relationship between self-reported Ecstasy use and traces of MDMA found in hair samples. In a laboratory setting, 49 undergraduate volunteers performed an Internet-based assessment which included mood scales and the University of East London Drug Use Questionnaire, which asks for history and current drug use. They also provided a hair sample for determination of exposure to MDMA over the previous month. Self-report of Ecstasy use and its presence in hair samples were consistent. MDMA presence in hair was associated with lower self-reported happiness and higher self-reported stress. Self-reported Ecstasy use, but not presence in hair, was also associated with decreased tension. Different psychoactive drugs can influence long-term mood and cognition in complex and dynamically interactive ways. Here we have shown a good correspondence between self-report and objective assessment of exposure to MDMA. These data suggest that the Internet has potentially high utility as a useful medium to complement traditional laboratory studies into the sequelae of recreational drug use. Copyright © 2010 S. Karger AG, Basel.

  8. All SNPs are not created equal: genome-wide association studies reveal a consistent pattern of enrichment among functionally annotated SNPs

    DEFF Research Database (Denmark)

    Schork, Andrew J; Thompson, Wesley K; Pham, Phillip

    2013-01-01

    Recent results indicate that genome-wide association studies (GWAS) have the potential to explain much of the heritability of common complex phenotypes, but methods are lacking to reliably identify the remaining associated single nucleotide polymorphisms (SNPs). We applied stratified False Discovery Rate (sFDR) methods to leverage genic enrichment in GWAS summary statistics data to uncover new loci likely to replicate in independent samples. Specifically, we use linkage disequilibrium-weighted annotations for each SNP in combination with nominal p-values to estimate the True Discovery Rate … in introns, and negative enrichment for intergenic SNPs. Stratified enrichment directly leads to increased TDR for a given p-value, mirrored by increased replication rates in independent samples. We show this in independent Crohn's disease GWAS, where we find a hundredfold variation in replication rate…
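    The stratified TDR idea can be sketched with a crude, conservative BH-style FDR estimate computed separately per annotation stratum; the p-values and stratum names below are hypothetical, and the study's actual estimator is LD-weighted and more refined:

    ```python
    def stratum_tdr(pvalues, threshold):
        """Estimate the True Discovery Rate within one annotation stratum.

        Uses the conservative estimate FDR(t) ~= t * m / #{p <= t}
        (assuming most SNPs in the stratum are null), so TDR = 1 - FDR."""
        m = len(pvalues)
        r = sum(1 for p in pvalues if p <= threshold)
        if r == 0:
            return 0.0
        fdr = min(1.0, threshold * m / r)
        return 1.0 - fdr

    # Hypothetical strata: an enriched genic stratum has an excess of small
    # p-values, while an intergenic stratum looks roughly uniform
    genic = [0.0001, 0.0005, 0.001, 0.02, 0.3, 0.5, 0.7, 0.9]
    intergenic = [0.05, 0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.95]
    tdr_genic = stratum_tdr(genic, 0.001)
    tdr_inter = stratum_tdr(intergenic, 0.001)
    ```

    At the same nominal p-value threshold, the enriched stratum yields a far higher TDR, which is exactly the effect the abstract reports being mirrored by replication rates.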

  9. Evaluating the antiemetic administration consistency to prevent chemotherapy-induced nausea and vomiting with the standard guidelines: a prospective observational study.

    Science.gov (United States)

    Vazin, Afsaneh; Eslami, Davood; Sahebi, Ebrahim

    2017-01-01

    Nausea and vomiting (NV) are the most prevalent adverse effects of chemotherapy (CT). This study was conducted to evaluate the adherence of the health care team to standard guidelines for antiemetic usage to prevent acute chemotherapy-induced nausea and vomiting (CINV) in a large CT center. A prospective study was performed during an 11-month period on patients receiving CT. A form was designed to collect patients' demographic information and their chemotherapeutic and antiemetic regimen data. The Likert scale was used to measure the effectiveness of the antiemetics in patients. In this study, the effect of patient-related risk factors on the incidence rate of CINV was examined. Based on the results, CINV events were reported by 74.4% of patients. The antiemetic regimen of 71.2% of the patients complied with the guidelines. The complete response, complete protection, and complete control end points did not differ significantly between patients undergoing guidelines-consistent prophylaxis and those undergoing guidelines-inconsistent prophylaxis. Females clearly showed a higher incidence rate of CINV (P=0.001), as did patients during the first course of CT (P=0.006). A history of motion sickness did not affect the incidence of NV. The maximum compliance error occurred for the use of aprepitant, as 16.16% of the patients who were receiving aprepitant did not comply with its instructions. The results of this study highlight how CINV was controlled in this center, where control was significantly lower than the global standard. Factors such as noncompliance of antiemetic regimens with standard guidelines and failure to adhere to the administration instructions of the antiemetics may have contributed to the incomplete control of CINV.

  10. Consistency and diversity of spike dynamics in the neurons of bed nucleus of stria terminalis of the rat: a dynamic clamp study.

    Directory of Open Access Journals (Sweden)

    Attila Szücs

    Full Text Available Neurons display a high degree of variability and diversity in the expression and regulation of their voltage-dependent ionic channels. Under low levels of synaptic background activity, a number of physiologically distinct cell types can be identified in most brain areas that display different responses to standard forms of intracellular current stimulation. Nevertheless, it is not well understood how biophysically different neurons process synaptic inputs in natural conditions, i.e., when experiencing intense synaptic bombardment in vivo. While distinct cell types might process synaptic inputs into different patterns of action potentials representing specific "motifs" of network activity, standard methods of electrophysiology are not well suited to resolve such questions. In the current paper we performed dynamic clamp experiments with simulated synaptic inputs that were presented to three types of neurons in the juxtacapsular bed nucleus of stria terminalis (jcBNST) of the rat. Our analysis of the temporal structure of firing showed that the three types of jcBNST neurons did not produce qualitatively different spike responses under identical patterns of input. However, we observed consistent, cell type dependent variations in the fine structure of firing, at the level of single spikes. At the millisecond-resolution structure of firing we found a high degree of diversity across the entire spectrum of neurons irrespective of their type. Additionally, we identified a new cell type with intrinsic oscillatory properties that produced a rhythmic and regular firing under synaptic stimulation that distinguishes it from the previously described jcBNST cell types. Our findings suggest a sophisticated, cell type dependent regulation of spike dynamics of neurons when experiencing a complex synaptic background. The high degree of their dynamical diversity has implications for their cooperative dynamics and synchronization.

  11. Case studies: Soil mapping using multiple methods

    Science.gov (United States)

    Petersen, Hauke; Wunderlich, Tina; Hagrey, Said A. Al; Rabbel, Wolfgang; Stümpel, Harald

    2010-05-01

    Soil is a non-renewable resource with fundamental functions like filtering (e.g. water), storing (e.g. carbon), transforming (e.g. nutrients) and buffering (e.g. contamination). Degradation of soils is by now a well-known fact not only to scientists; decision makers in politics have also accepted it as a serious problem for several environmental aspects. National and international authorities have already worked out preservation and restoration strategies for soil degradation, though how to put these strategies into real practice is still a subject of active research. Common to all strategies, however, a description of soil state and dynamics is required as a base step. This includes collecting information from soils with methods ranging from direct soil sampling to remote applications. At an intermediate scale, mobile geophysical methods are applied with the advantage of fast working progress but the disadvantage of site-specific calibration and interpretation issues. In the framework of the iSOIL project we present here some case studies of soil mapping performed using multiple geophysical methods. We present examples of combined field measurements with EMI, GPR, magnetic and gamma-spectrometric techniques carried out with the mobile multi-sensor system of Kiel University (GER). Depending on soil type and actual environmental conditions, different methods show a different quality of information. By applying diverse methods we want to figure out which methods, or combination of methods, will give the most reliable information concerning soil state and properties. To investigate the influence of varying material we performed mapping campaigns on field sites with sandy, loamy and loessy soils. Classification of measured or derived attributes shows not only the lateral variability but also gives hints to a variation in the vertical distribution of soil material. For all soils, of course, soil water content can be a critical factor concerning a successful

  12. Frank Gilbreth and health care delivery method study driven learning.

    Science.gov (United States)

    Towill, Denis R

    2009-01-01

    The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods, as applied to healthcare delivery improvement programmes. The article traces the origin of what is now currently and variously called "business process re-engineering", "business process improvement" and "lean healthcare" etc., by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method (saving time plus saving motion) study is best practised as co-joint action learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored into a team approach involving all "players" in the system especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes. 
The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised

  13. Cross-cultural validation of the Brazilian Portuguese version of the Social Phobia Inventory (SPIN): study of the items and internal consistency Validação transcultural da versão para o português do Brasil do Social Phobia Inventory (SPIN): estudo dos itens e da consistência interna

    Directory of Open Access Journals (Sweden)

    Flávia de Lima Osório

    2009-03-01

    Full Text Available OBJECTIVE: The objective of the present study was to carry out the cross-cultural validation for Brazilian Portuguese of the Social Phobia Inventory, an instrument for the evaluation of fear, avoidance and physiological symptoms associated with social anxiety disorder. METHOD: The process of translation and adaptation involved four bilingual professionals, appreciation and approval of the back-translation by the authors of the original scale, a pilot study with 30 Brazilian university students, and appreciation by raters who confirmed the face validity of the Portuguese version, which was named "Inventário de Fobia Social". As part of the psychometric study of the Social Phobia Inventory, analysis of the items and evaluation of the internal consistency of the instrument were performed in a study conducted on 2314 university students. RESULTS: The results demonstrated that item 11, related to the fear of public speaking, was the most frequently scored item. The correlation of the items with the total score was quite adequate, ranging from 0.44 to 0.71, as was the internal consistency, which ranged from 0.71 to 0.90. DISCUSSION/CONCLUSION: The authors conclude that the Brazilian Portuguese version of the Social Phobia Inventory proved to be adequate regarding the psychometric properties initially studied, with qualities quite close to those of the original study. Studies evaluating the remaining indicators of validity of the Social Phobia Inventory in clinical and non-clinical samples are considered opportune and necessary.
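    The item-total correlations (0.44-0.71) and internal consistency values (0.71-0.90) reported above are standard psychometric statistics: Pearson correlation of each item with the remaining items' total, and Cronbach's alpha. A minimal sketch with hypothetical item scores (statistics.correlation requires Python 3.10 or later):

    ```python
    import statistics as st

    def cronbach_alpha(items):
        """Cronbach's alpha for a list of item-score columns (one list per item)."""
        k = len(items)
        totals = [sum(vals) for vals in zip(*items)]
        item_var = sum(st.pvariance(col) for col in items)
        return k / (k - 1) * (1 - item_var / st.pvariance(totals))

    def item_total_corr(items, idx):
        """Corrected item-total correlation: item vs. sum of the remaining items."""
        rest = [sum(v for j, v in enumerate(row) if j != idx) for row in zip(*items)]
        return st.correlation(items[idx], rest)  # Pearson r

    # Hypothetical scores of 6 respondents on 3 items
    items = [[1, 2, 3, 4, 4, 5],
             [2, 2, 3, 3, 4, 5],
             [1, 3, 3, 4, 5, 5]]
    alpha = cronbach_alpha(items)
    r0 = item_total_corr(items, 0)
    ```

    Items that track the total score well give both high item-total correlations and a high alpha, as in the SPIN results above.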

  14. Image recognition and consistency of response

    Science.gov (United States)

    Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.

    2012-02-01

    Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked, for each image, whether it had been included in the first set. For this study, we evaluated only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response for images they did not recognize than for those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
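    The Fisher's exact test named above can be computed directly from the hypergeometric distribution. A minimal sketch for a 2x2 table, using a hypothetical recognized-by-consistent table since the study's actual counts are not given in the abstract:

    ```python
    from math import comb

    def fisher_exact_two_sided(a, b, c, d):
        """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]]:
        sum the probabilities of all tables with the same margins that are
        no more likely than the observed one."""
        r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d

        def prob(x):
            # hypergeometric probability of x successes in the first row
            return comb(r1, x) * comb(r2, c1 - x) / comb(n, c1)

        p_obs = prob(a)
        lo, hi = max(0, c1 - r2), min(r1, c1)
        return sum(prob(x) for x in range(lo, hi + 1)
                   if prob(x) <= p_obs * (1 + 1e-9))

    # Hypothetical counts: consistent vs. inconsistent responses, split by
    # whether the image was recognized
    p = fisher_exact_two_sided(1, 9, 11, 3)
    ```

    The tolerance factor guards against floating-point ties when comparing table probabilities; for large tables a library routine such as scipy.stats.fisher_exact is the practical choice.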

  15. Can consistent benchmarking within a standardized pain management concept decrease postoperative pain after total hip arthroplasty? A prospective cohort study including 367 patients.

    Science.gov (United States)

    Benditz, Achim; Greimel, Felix; Auer, Patrick; Zeman, Florian; Göttermann, Antje; Grifka, Joachim; Meissner, Winfried; von Kunow, Frederik

    2016-01-01

    The number of total hip replacement surgeries has steadily increased over recent years. Reduction in postoperative pain increases patient satisfaction and enables better mobilization. Thus, pain management needs to be continuously improved. Problems are often caused not only by medical issues but also by organization and hospital structure. The present study shows how the quality of pain management can be increased by implementing a standardized pain concept and simple, consistent benchmarking. All patients included in the study had undergone total hip arthroplasty (THA). Outcome parameters were analyzed 24 hours after surgery by means of the questionnaires from the German-wide project "Quality Improvement in Postoperative Pain Management" (QUIPS). A pain nurse interviewed patients and continuously assessed outcome quality parameters. A multidisciplinary team of anesthetists, orthopedic surgeons, and nurses implemented a regular procedure of data analysis and internal benchmarking. The health care team was informed of any results and suggested improvements. Every staff member involved in pain management participated in educational lessons, and a special pain nurse was trained in each ward. From 2014 to 2015, 367 patients were included. The mean maximal pain score 24 hours after surgery was 4.0 (±3.0) on an 11-point numeric rating scale, and patient satisfaction was 9.0 (±1.2). Over time, the maximum pain score decreased (mean 3.0, ±2.0), whereas patient satisfaction significantly increased (mean 9.8, ±0.4). Quality of pain management was increased by benchmarking a standardized pain management concept, and regular benchmarking, implementation of feedback mechanisms, and staff education made the pain management concept even more successful. Multidisciplinary teamwork and flexibility in adapting processes seem to be highly important for successful pain management.

  16. The Impact of Couple HIV Testing and Counseling on Consistent Condom Use Among Pregnant Women and Their Male Partners: An Observational Study.

    Science.gov (United States)

    Rosenberg, Nora E; Graybill, Lauren A; Wesevich, Austin; McGrath, Nuala; Golin, Carol E; Maman, Suzanne; Bhushan, Nivedita; Tsidya, Mercy; Chimndozi, Limbikani; Hoffman, Irving F; Hosseinipour, Mina C; Miller, William C

    2017-08-01

    In sub-Saharan Africa couple HIV testing and counseling (CHTC) has been associated with substantial increases in safe sex, especially when at least one partner is HIV infected. However, this relationship has not been characterized in an Option B+ context. The study was conducted at the antenatal clinic at Bwaila District Hospital in Lilongwe, Malawi in 2016 under an Option B+ program. Ninety heterosexual couples with an HIV-infected pregnant woman (female-positive couples) and 47 couples with an HIV-uninfected pregnant woman (female-negative couples) were enrolled in an observational study. Each couple member was assessed immediately before and 1 month after CHTC for safe sex (abstinence or consistent condom use in the last month). Generalized estimating equations were used to model change in safe sex before and after CHTC and to compare safe sex between female-positive and female-negative couples. Mean age was 26 years among women and 32 years among men. Before CHTC, safe sex was comparable among female-positive couples (8%) and female-negative couples (2%) [risk ratio (RR): 3.7, 95% confidence interval (CI): 0.5 to 29.8]. One month after CHTC, safe sex was higher among female-positive couples (75%) than among female-negative couples (3%) (RR: 30.0, 95% CI: 4.3 to 207.7). Safe sex increased substantially after CTHC for female-positive couples (RR 9.6, 95% CI: 4.6 to 20.0), but not for female-negative couples (RR: 1.2, 95% CI: 0.1 to 18.7). Engaging pregnant couples in CHTC can have prevention benefits for couples with an HIV-infected pregnant woman, but additional prevention approaches may be needed for couples with an HIV-uninfected pregnant woman.
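    The risk ratios with 95% confidence intervals reported above can be approximated, outside the generalized estimating equations framework the authors actually used, by a crude Wald interval on the log risk ratio. The counts below are hypothetical, not the study's data:

    ```python
    import math

    def risk_ratio_ci(a, n1, c, n2, z=1.96):
        """Risk ratio of exposed (a/n1) vs. unexposed (c/n2) with a Wald
        95% confidence interval on the log scale (a crude sketch; it ignores
        the within-couple correlation that GEE accounts for)."""
        rr = (a / n1) / (c / n2)
        se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
        lo = math.exp(math.log(rr) - z * se)
        hi = math.exp(math.log(rr) + z * se)
        return rr, lo, hi

    # Hypothetical counts: 9/12 report safe sex in one group vs. 3/12 in the other
    rr, lo, hi = risk_ratio_ci(9, 12, 3, 12)
    ```

    Note how small cell counts (as in the female-negative group above) inflate the standard error and produce very wide intervals such as the reported 4.3 to 207.7.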

  17. COMPARATIVE STUDIES OF THREE METHODS FOR MEASURING PEPSIN ACTIVITY

    Science.gov (United States)

    Loken, Merle K.; Terrill, Kathleen D.; Marvin, James F.; Mosser, Donn G.

    1958-01-01

    Comparison has been made of a simple method, originated by Absolon and modified in our laboratories, for assay of proteolytic activity using RISA (radioactive iodinated serum albumin; Abbott Laboratories) with the commonly used photometric methods of Anson and Kunitz. In this method, pepsin was incubated with an albumin substrate containing RISA, followed by precipitation of the undigested substrate with trichloroacetic acid and measurement of radioactive digestion products in the supernatant fluid. The I-131-albumin bond was shown in the present studies to be altered only by proteolytic activity, and not by the incubation procedures at various values of pH. Any free iodine present originally in the RISA was removed by a single passage through a resin column (Amberlite IRA-400-Cl). Pepsin was shown to be most stable in solution at a pH of 5.5. Activity of pepsin was shown to be maximal when it was incubated with albumin at a pH of 2.5. Pepsin activity was shown to be altered in the presence of various electrolytes. Pepsin activity measured by the RISA and Anson methods as a function of concentration or of time of incubation indicated that these two methods are in good agreement and are equally sensitive. Consistently smaller standard errors were obtained by the RISA method of pepsin assay than with either of the other methods. PMID:13587910

  18. Are radiosensitivity data derived from natural field conditions consistent with data from controlled exposures? A case study of Chernobyl wildlife chronically exposed to low dose rates

    International Nuclear Information System (INIS)

    Garnier-Laplace, J.; Geras’kin, S.; Della-Vedova, C.; Beaugelin-Seiller, K.; Hinton, T.G.; Real, A.; Oudalova, A.

    2013-01-01

    The discrepancy between laboratory or controlled-conditions ecotoxicity tests and field data on wildlife chronically exposed to ionising radiation is presented for the first time. We reviewed the available chronic radiotoxicity data acquired in contaminated fields and used a statistical methodology to support the comparison with knowledge on inter-species variation of sensitivity to controlled external γ irradiation. We focus on the Chernobyl Exclusion Zone and effects data on terrestrial wildlife reported in the literature corresponding to chronic dose rate exposure situations (from background ∼100 nGy/h up to ∼10 mGy/h). When needed, we reconstructed the dose rate to organisms and obtained consistent unbiased data sets necessary to establish the dose rate–effect relationship for a number of different species and endpoints. Then, we compared the range of variation of radiosensitivity of species from the Chernobyl Exclusion Zone with the statistical distribution established for terrestrial species chronically exposed to purely gamma external irradiation (or chronic Species radioSensitivity Distribution – SSD). We found that the best estimate of the median value (HDR50) of the distribution established for field conditions at Chernobyl (about 100 μGy/h) was eight times lower than the one from controlled experiments (about 850 μGy/h), suggesting that organisms in their natural environment were more sensitive to radiation. This first comparison highlights the lack of mechanistic understanding and the potential confusion coming from sampling strategies in the field. To confirm the apparently higher sensitivity of wildlife in the Chernobyl Exclusion Zone, we call for a more robust strategy in the field, with adequate design to deal with confounding factors. -- Highlights: ► Discrepancy between controlled tests and Chernobyl effects data on wildlife was examined. ► We proposed a method to correct the dosimetry used for Chernobyl wildlife. ► Wildlife from the
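    The HDR50 comparison above (about 100 vs. about 850 μGy/h) is the median of a log-normal species sensitivity distribution. A minimal sketch using the geometric mean as the median estimator, with entirely hypothetical per-species dose rates chosen only to mimic an eightfold lab-to-field gap:

    ```python
    import math

    def hdr50(dose_rates):
        """Median of a log-normal species sensitivity distribution, estimated
        as the geometric mean of per-species critical dose rates."""
        logs = [math.log(x) for x in dose_rates]
        return math.exp(sum(logs) / len(logs))

    # Hypothetical per-species critical dose rates (uGy/h), lab vs. field
    lab = [400.0, 700.0, 900.0, 1200.0, 2000.0]
    field = [50.0, 90.0, 110.0, 150.0, 240.0]
    ratio = hdr50(lab) / hdr50(field)
    ```

    Working on the log scale is what makes the median robust to the orders-of-magnitude spread typical of inter-species radiosensitivity data.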

  19. Forced Ignition Study Based On Wavelet Method

    Science.gov (United States)

    Martelli, E.; Valorani, M.; Paolucci, S.; Zikoski, Z.

    2011-05-01

The control of ignition in a rocket engine is a critical problem for combustion chamber design. Therefore it is essential to fully understand the mechanism of ignition during its earliest stages. In this paper the characteristics of flame kernel formation and initial propagation in a hydrogen-argon-oxygen mixing layer are studied using 2D direct numerical simulations with detailed chemistry and transport properties. The flame kernel is initiated by adding an energy deposition source term in the energy equation. The effect of unsteady strain rate is studied by imposing a 2D turbulent velocity field, which is initialized by means of a synthetic field. An adaptive wavelet method, based on interpolating wavelets, is used in this study to solve the compressible reactive Navier-Stokes equations. This method provides an alternative means to refine the computational grid points according to local demands of the physical solution. The present simulations show that in the very early instants the kernel perturbed by the turbulent field is characterized by an increased burning area and a slightly increased radical formation. In addition, the calculations show that the wavelet technique yields a significant reduction in the number of degrees of freedom necessary to achieve a prescribed solution accuracy.
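The grid-adaptation idea behind interpolating-wavelet methods can be illustrated with a minimal 1D sketch (this is not the authors' solver, just the underlying principle): a point on the fine dyadic grid is retained only where its wavelet detail coefficient, the difference between the sampled value and its interpolation from the coarser grid, exceeds a threshold, so degrees of freedom concentrate where the solution has sharp features.

```python
import math

# Minimal sketch of interpolating-wavelet grid adaptation (illustrative):
# on a dyadic 1D grid, the detail coefficient at each odd-indexed point is
# the difference between the sampled value and its linear interpolation
# from the two neighbouring coarse points; small-detail points are dropped.
def adapt_grid(f, n_fine=257, eps=1e-3):
    xs = [i / (n_fine - 1) for i in range(n_fine)]
    keep = set(range(0, n_fine, 2))          # coarse (even) points always kept
    for i in range(1, n_fine - 1, 2):        # odd points: test detail size
        detail = abs(f(xs[i]) - 0.5 * (f(xs[i - 1]) + f(xs[i + 1])))
        if detail > eps:
            keep.add(i)
    return [xs[i] for i in sorted(keep)]

# A steep "flame front"-like profile needs fine points only near x = 0.5.
grid = adapt_grid(lambda x: math.tanh(50 * (x - 0.5)))
print(len(grid), "of 257 points kept")
```

The retained fine points cluster in the thin layer around x = 0.5, mirroring how the authors' method concentrates resolution on the flame kernel.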

  20. Evaluation of the Consistency of MODIS Land Cover Product (MCD12Q1) Based on Chinese 30 m GlobeLand30 Datasets: A Case Study in Anhui Province, China

    Directory of Open Access Journals (Sweden)

    Dong Liang

    2015-11-01

Full Text Available Land cover plays an important role in the climate and biogeochemistry of the Earth system. It is of great significance to produce and evaluate global land cover (GLC) data when applying the data in practice at a specific spatial scale. The objective of this study is to evaluate and validate the consistency of the Moderate Resolution Imaging Spectroradiometer (MODIS) land cover product (MCD12Q1) at a provincial scale (Anhui Province, China) based on the Chinese 30 m GLC product (GlobeLand30). A harmonization method is firstly used to reclassify the land cover types between the five classification schemes of MCD12Q1 (International Geosphere Biosphere Programme (IGBP) global vegetation classification, University of Maryland (UMD), MODIS-derived Leaf Area Index and Fractional Photosynthetically Active Radiation (LAI/FPAR), MODIS-derived Net Primary Production (NPP), and Plant Functional Type (PFT)) and the ten classes of GlobeLand30, based on a knowledge rule (KR) and the C4.5 decision tree (DT) classification algorithm. A total of five harmonized land cover types are derived, including woodland, grassland, cropland, wetland and artificial surfaces, and four evaluation indicators are selected, including area consistency, spatial consistency, classification accuracy and landscape diversity in the three sub-regions of Wanbei, Wanzhong and Wannan. The results indicate that the consistency of IGBP is the best among the five schemes of MCD12Q1 according to the correlation coefficient (R). The “woodland” consistency of LAI/FPAR is the worst, with a spatial similarity (O) of 58.17% due to the misclassification between “woodland” and “others”. The consistency of NPP is the worst among the five schemes, as the agreement varied from 1.61% to 56.23% in the three sub-regions. Furthermore, with the biggest difference of diversity indices between LAI/FPAR and GlobeLand30, the consistency of LAI/FPAR is the weakest. This study provides a methodological reference for evaluating the
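Two of the evaluation indicators named above can be sketched in a few lines. The sketch below is illustrative, not the authors' code: it treats area consistency as the Pearson correlation R between per-class areas in the two products, and spatial consistency O for a class as the share of its pixels in one product that carry the same harmonized label in the other. The toy 1D "maps" and class names are invented.

```python
from collections import Counter

def area_correlation(map_a, map_b, classes):
    """Pearson R between per-class areas (pixel counts) of two label maps."""
    ca, cb = Counter(map_a), Counter(map_b)
    xs = [ca[c] for c in classes]
    ys = [cb[c] for c in classes]
    n = len(classes)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def spatial_agreement(map_a, map_b, cls):
    """Share of pixels labelled `cls` in A that carry the same label in B."""
    hits = sum(1 for a, b in zip(map_a, map_b) if a == cls and b == cls)
    total = sum(1 for a in map_a if a == cls)
    return hits / total if total else 0.0

# Toy pixel sequences using three of the five harmonized classes.
A = ["crop"] * 6 + ["wood"] * 3 + ["grass"] * 1
B = ["crop"] * 5 + ["wood"] * 4 + ["grass"] * 1
print(f"R = {area_correlation(A, B, ['crop', 'wood', 'grass']):.2f}")
print(f"O(wood) = {spatial_agreement(A, B, 'wood'):.2f}")
```

On real products the same two functions would run over co-registered raster grids after the KR/DT harmonization step.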

  1. Designing A Mixed Methods Study In Primary Care

    Science.gov (United States)

    Creswell, John W.; Fetters, Michael D.; Ivankova, Nataliya V.

    2004-01-01

    BACKGROUND Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. METHODS We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. RESULTS Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. DISCUSSION We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research. PMID:15053277

  2. Research Methods in European Union Studies

    DEFF Research Database (Denmark)

    Lynggaard, Kennet; Manners, Ian; Löfgren, Karl

    Research on the European Union over the past few years has been strongly implicated in the crises that currently grip Europe with a failure to ask the pertinent questions as well as a perceived weakness in the methods and evidence used by researchers providing the basis for these allegations....... This volume moves the study of EU research strategies beyond the dichotomies of the past towards a new agenda for research on Europe through a rich diversity of problem-solving based research. This new agenda acknowledges the weaknesses of the past and moves beyond them towards greater openness and awareness...

  3. Study of superconducting magnetic bearing applicable to the flywheel energy storage system that consist of HTS-bulks and superconducting-coils

    International Nuclear Information System (INIS)

    Seino, Hiroshi; Nagashima, Ken; Tanaka, Yoshichika; Nakauchi, Masahiko

    2010-01-01

The Railway Technical Research Institute conducted a study to develop a superconducting magnetic bearing applicable to a flywheel energy-storage system for railways. In the first step of the study, a thrust rolling bearing arrangement was selected, with a liquid-nitrogen-cooled HTS bulk adopted as the rotor and a superconducting coil as the stator of the superconducting magnetic bearing. The load capacity of the superconducting magnetic bearing was verified up to 10 kN in a static load test. A rotation test with a thrust load of approximately 5 kN was then performed at speeds of up to 3000 rpm. The results of the bearing rotation test confirmed that the levitation position is maintained stably during rotation. To establish a rotor cooling method, heat transfer by radiation in vacuum and by conduction through tenuous gas was studied experimentally. The experimental results demonstrate that an optimal gas pressure can be obtained without generating windage drag. In the second stage of the development, the thrust load capacity of the bearing will be improved, aiming at an energy capacity of practical scale. In a static load test of the new superconducting magnetic bearing, a stable levitation force of 20 kN was obtained.
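The radiative part of the rotor heat load mentioned in the abstract can be estimated with the standard Stefan-Boltzmann form. The emissivity, surface area and temperatures below are round illustrative values, not figures from the study.

```python
# Rough estimate of the radiative heat load on a liquid-nitrogen-cooled
# rotor in vacuum (standard Stefan-Boltzmann form; illustrative values).
sigma = 5.67e-8                  # Stefan-Boltzmann constant [W/(m^2 K^4)]
eps, area = 0.1, 0.5             # assumed effective emissivity, surface [m^2]
T_wall, T_rotor = 300.0, 77.0    # chamber wall and LN2-cooled rotor [K]

Q = sigma * eps * area * (T_wall**4 - T_rotor**4)
print(f"radiative load ≈ {Q:.1f} W")
```

Because this load scales with T⁴, the room-temperature wall dominates; the trade-off the abstract describes is that admitting a little gas adds conductive cooling without yet producing windage drag.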

  4. [Gastric magnetic resonance study (methods, semiotics)].

    Science.gov (United States)

    Stashuk, G A

    2003-01-01

The paper shows the potentialities of gastric examination by magnetic resonance imaging (MRI). The methodological aspects of gastric examination have been worked out. The MRI semiotics of the unchanged and tumor-affected gastric wall and the techniques for examining patients with gastric cancer of various sites are described. Using the developed procedure, MRI was performed in 199 patients, including 154 patients with gastric pathology and 45 control individuals without gastric wall alterations. Great emphasis is placed on the role of MRI in the diagnosis of endophytic (diffuse) gastric cancer, which is of priority value in its morphological structure. MRI was found to play a role in diagnosing the spread of the tumorous process both along the walls of the stomach and to adjacent anatomic structures.

  5. CSM research: Methods and application studies

    Science.gov (United States)

    Knight, Norman F., Jr.

    1989-01-01

    Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.

  6. Physical Model Method for Seismic Study of Concrete Dams

    Directory of Open Access Journals (Sweden)

    Bogdan Roşca

    2008-01-01

Full Text Available The study of the dynamic behaviour of concrete dams by means of the physical model method is very useful for understanding the failure mechanism of these structures under strong earthquakes. The physical model method consists of two main processes. Firstly, a study model must be designed through a physical modelling process using dynamic modelling theory; the result is a system of equations for dimensioning the physical model. After construction and instrumentation of the scale physical model, a structural analysis based on experimental means is performed, and the experimental results are gathered for analysis. Depending on the aim of the research, either an elastic or a failure physical model may be designed. The requirements for constructing an elastic model are easier to satisfy than those for a failure model, but the results obtained provide only limited information. In order to study the behaviour of concrete dams under strong seismic action, failure physical models are required that can accurately simulate the possible opening of joints, sliding between concrete blocks and cracking of the concrete. The design relations for both elastic and failure physical models are based on dimensional analysis and consist of similitude relations among the physical quantities involved in the phenomenon. The use of physical models of large or medium dimensions, as well as their instrumentation, offers great advantages, but involves a large amount of financial, logistic and time resources.
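The similitude relations from dimensional analysis can be sketched for one common case. The sketch below assumes a gravity-dominated (Froude-type) scale model built from material of the same density as the prototype; the scale values and the choice of scaling law are illustrative, not taken from the paper.

```python
# Hedged sketch of dimensional-analysis similitude factors for a
# gravity-type (Froude) scale model; illustrative assumptions only.
lam_L = 1 / 50          # model is 1:50 of the prototype (length scale)
lam_rho = 1.0           # same material density as the prototype
lam_a = 1.0             # gravity cannot be scaled, so accelerations match

lam_t = lam_L ** 0.5            # time scale follows from a = L / t^2
lam_E = lam_rho * lam_L         # required elastic-modulus (stiffness) scale

print(f"acceleration scale 1:{1 / lam_a:.0f}, "
      f"time scale 1:{1 / lam_t:.2f}, modulus scale 1:{1 / lam_E:.0f}")
```

The modulus relation is why such models need special low-stiffness materials: at 1:50 with equal densities, the model material must be fifty times softer than the prototype concrete.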

  7. A study on manufacturing and construction method of buffer

    International Nuclear Information System (INIS)

    Chijimatsu, Masakazu; Sugita, Yutaka; Amemiya, Kiyoshi

    1999-09-01

As an engineered barrier system in the geological disposal of high-level waste, a multibarrier system is considered, consisting of the vitrified waste, the overpack and the buffer. Bentonite is one of the potential buffer materials because of its low water permeability, self-sealing properties, radionuclide adsorption and retardation properties, thermal conductivity, chemical buffering properties, overpack-supporting properties, stress-buffering properties, etc. In order to evaluate the functions of the buffer, many experiments have been conducted. The evaluation of these functions is based on the assumption that the buffer is emplaced or constructed properly in the disposal tunnel (or disposal pit). Therefore, it is necessary to study the manufacturing / construction method of the buffer. As manufacturing / construction technologies for the buffer, the block installation method and the in-situ compaction method, among others, are being investigated. The block installation method emplaces buffer blocks manufactured in advance at the ground facility, and its construction processes underground will be simpler than those of the in-situ compaction method. The in-situ compaction method, on the other hand, introduces buffer material with a specified water content into the disposal tunnel and produces a high-density buffer on site using a compaction machine. For the in-situ compaction method, it is necessary to investigate the optimum finished thickness of one layer, because the buffer cannot be constructed in a single lift. This report describes the results of a compaction property test and summarizes past investigation results concerning the manufacturing / construction method. It then presents the construction method that will be feasible at an actual disposal site. (J.P.N.)

  8. Analytical method of waste allocation in waste management systems: Concept, method and case study

    International Nuclear Information System (INIS)

    Bergeron, Francis C.

    2017-01-01

Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to exploit, which influences waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process”, defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP comprises a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality stems from the interdisciplinary analysis of the WAP and from the development of the conceptual framework. AMWAP is applied in an illustrative case study of the household WM system of Geneva (Switzerland), which demonstrates that the method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste

  9. Analytical method of waste allocation in waste management systems: Concept, method and case study

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Francis C., E-mail: francis.b.c@videotron.ca

    2017-01-15

Waste is no longer a rejected item to be disposed of but increasingly a secondary resource to exploit, which influences waste allocation among treatment operations in a waste management (WM) system. The aim of this methodological paper is to present a new method for the assessment of the WM system, the “analytical method of the waste allocation process” (AMWAP), based on the concept of the “waste allocation process”, defined as the aggregation of all processes of apportioning waste among alternative waste treatment operations inside or outside the spatial borders of a WM system. AMWAP comprises a conceptual framework and an analytical approach. The conceptual framework includes, firstly, a descriptive model that focuses on the description and classification of the WM system. It includes, secondly, an explanatory model that serves to explain and to predict the operation of the WM system. The analytical approach consists of a step-by-step analysis for the empirical implementation of the conceptual framework. With its multiple purposes, AMWAP provides an innovative and objective modular method to analyse a WM system, which may be integrated in the framework of impact assessment methods and environmental systems analysis tools. Its originality stems from the interdisciplinary analysis of the WAP and from the development of the conceptual framework. AMWAP is applied in an illustrative case study of the household WM system of Geneva (Switzerland), which demonstrates that the method provides in-depth and contextual knowledge of WM. - Highlights: • The study presents a new analytical method based on the waste allocation process. • The method provides an in-depth and contextual knowledge of the waste management system. • The paper provides a reproducible procedure for professionals, experts and academics. • It may be integrated into impact assessment or environmental system analysis tools. • An illustrative case study is provided based on household waste

  10. Study on the scope of fault tree method applicability

    International Nuclear Information System (INIS)

    Ito, Taiju

    1980-03-01

In fault tree analysis of the reliability of nuclear safety systems, including reliability analysis of nuclear protection systems, there appear to be some documents in which the fault tree method is applied unreasonably. In the fault tree method, the addition rule and the multiplication rule are usually used, and both must hold exactly, or at least practically. The addition rule poses no problem, but the multiplication rule occasionally does. Whether or not the multiplication rule holds has been studied comprehensively for the unreliability, mean unavailability and instantaneous unavailability of the elements. Between the unreliabilities of elements without maintenance, the multiplication rule holds. Between the instantaneous unavailabilities of elements, with or without maintenance, the multiplication rule also holds. Between the unreliabilities of subsystems with maintenance, however, the multiplication rule does not hold, because the product is larger than the unreliability of a parallel system consisting of the two maintained subsystems. Between the mean unavailabilities of elements without maintenance, the multiplication rule also does not hold, because the product is smaller than the mean unavailability of a parallel system consisting of the two elements without maintenance. In these cases, therefore, the fault tree method may not be applied by rote in reliability analysis of the system. (author)
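The failure of the multiplication rule for mean unavailability can be checked numerically. The sketch below uses textbook expressions with illustrative failure rates (not figures from the report): for an unrepaired element the instantaneous unavailability is 1 − e^(−λt), and because both elements' unavailabilities grow together over time, the time average of their product exceeds the product of their time averages.

```python
import math

# Numerical check of the abstract's claim (illustrative rates, lambda in 1/h):
# for two unrepaired elements, the product of MEAN unavailabilities over
# [0, T] underestimates the mean unavailability of their parallel system.
lam1, lam2, T, n = 1e-3, 2e-3, 1000.0, 10_000

def u(lam, t):
    """Instantaneous unavailability of an unrepaired element."""
    return 1.0 - math.exp(-lam * t)

ts = [(i + 0.5) * T / n for i in range(n)]               # midpoint grid
mean_u1 = sum(u(lam1, t) for t in ts) / n
mean_u2 = sum(u(lam2, t) for t in ts) / n
mean_sys = sum(u(lam1, t) * u(lam2, t) for t in ts) / n  # exact parallel system

print(f"product of means : {mean_u1 * mean_u2:.4f}")
print(f"true system mean : {mean_sys:.4f}")
```

The gap is exactly the time covariance of the two unavailabilities, which is positive for any two monotonically increasing unavailability curves; this is why rote multiplication of mean unavailabilities is non-conservative here.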

  11. Factor structure and internal consistency of the 12-item General Health Questionnaire (GHQ-12) and the Subjective Vitality Scale (VS), and the relationship between them: a study from France

    Directory of Open Access Journals (Sweden)

    Ismaïl Amany

    2009-03-01

Full Text Available Abstract Background The objectives of this study were to test the factor structure and internal consistency of the 12-item General Health Questionnaire (GHQ-12) and the Subjective Vitality Scale (VS) in elderly French people, and to test the relationship between these two questionnaires. Methods Using a standard 'forward-backward' translation procedure, the English language versions of the two instruments (i.e. the 12-item General Health Questionnaire and the Subjective Vitality Scale) were translated into French. A sample of adults aged 58–72 years then completed both questionnaires. Internal consistency was assessed by Cronbach's alpha coefficient. The factor structures of the two instruments were extracted by confirmatory factor analysis (CFA). Finally, the relationship between the two instruments was assessed by correlation analysis. Results In all, 217 elderly adults participated in the study. The mean age of the respondents was 61.7 (SD = 6.2) years. The mean GHQ-12 score was 17.4 (SD = 8.0), and analysis showed satisfactory internal consistency (Cronbach's alpha coefficient = 0.78). The mean VS score was 22.4 (SD = 7.4) and its internal consistency was found to be good (Cronbach's alpha coefficient = 0.83). While CFA showed that the VS was uni-dimensional, analysis for the GHQ-12 demonstrated a good fit not only to the two-factor model (positive vs. negative items) but also to a three-factor model. As expected, there was a strong and significant negative correlation between the GHQ-12 and the VS (r = -0.71). Conclusion The results showed that the French versions of the 12-item General Health Questionnaire (GHQ-12) and the Subjective Vitality Scale (VS) are reliable measures of psychological distress and vitality. They also confirm a significant negative correlation between these two instruments, lending support to their convergent validity in an elderly French population. The findings indicate that both measures have good structural
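The internal-consistency statistic reported above is Cronbach's alpha, which compares the sum of item variances with the variance of the total score. The sketch below implements the standard formula on invented toy scores (not the study's data) to show the computation.

```python
# Minimal Cronbach's alpha computation (standard formula; toy data):
# alpha = k/(k-1) * (1 - sum of item variances / variance of total scores).
def cronbach_alpha(items):
    """`items` is a list of per-item score lists, one list per questionnaire item."""
    k = len(items)               # number of items
    n = len(items[0])            # number of respondents

    def var(xs):                 # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three hypothetical items scored by five respondents.
scores = [[2, 3, 3, 4, 4],
          [1, 3, 2, 4, 5],
          [2, 2, 3, 4, 5]]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Items that rise and fall together across respondents inflate the total-score variance relative to the item variances, which is what pushes alpha toward the 0.78 and 0.83 values the abstract reports for the GHQ-12 and VS.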

  12. Studies on shock phenomena in two-phase flow, (4). Characteristics in channel flow consisting of bubbly mixture and liquid in series

    Energy Technology Data Exchange (ETDEWEB)

    Akagawa, Koji; Fujii, Terushige; Ito, Yutaka; Hiraki, Sei

    1982-04-01

The research carried out so far concerned cases in which the mean void ratio was nearly uniform along the pipe axis. In actual piping systems, however, the void ratio distribution sometimes varies in the axial direction, as in evaporator tubes. In this study, in order to clarify the basic characteristics of shock phenomena in a piping system in which the density of the two-phase flow changes in the axial direction, experiments were carried out on an air-water two-component bubbly flow with single-phase flow upstream and two-phase flow of constant axial void ratio downstream, and a theoretical study of the phenomena was performed. The experimental setup and method, the measured pressure-response waveforms, the behavior of pressure waves at the interface between the two-phase and single-phase flows, the qualitative analysis of the pressure-response waveform, and the analysis of the pressure rise are reported. On sudden closure of a valve, the pressure in the two-phase flow rose by the initial potential surge, after which a stepped pressure rise was observed. This phenomenon can be explained by the reflection of pressure waves at the interface between the two-phase and single-phase flows.
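The strong reflection at the interface follows from textbook acoustics rather than anything specific to this experiment: the acoustic impedance Z = ρc of a bubbly mixture is orders of magnitude below that of pure water, because gas compressibility collapses the mixture sound speed. The sketch below uses the homogeneous (Wood-model) sound speed with illustrative values for pressure and void fraction.

```python
import math

# Hedged sketch (textbook acoustics, illustrative values) of why the
# single-phase/two-phase interface reflects pressure waves so strongly.
p, rho_l, alpha = 1.0e5, 1000.0, 0.2        # pressure [Pa], water density, void fraction

c_mix = math.sqrt(p / (alpha * (1 - alpha) * rho_l))  # Wood-model sound speed [m/s]
Z_mix = (1 - alpha) * rho_l * c_mix                   # mixture impedance (gas mass neglected)
Z_liq = rho_l * 1480.0                                # water impedance (c ≈ 1480 m/s)

# Normal-incidence pressure reflection coefficient at the interface.
R = (Z_liq - Z_mix) / (Z_liq + Z_mix)
print(f"c_mix ≈ {c_mix:.0f} m/s, reflection coefficient R ≈ {R:.2f}")
```

With a mixture sound speed of only a few tens of m/s, R is close to unity, so a wave in the bubbly region reflects almost completely at the liquid interface; successive reflections of this kind are consistent with the stepped pressure rise described in the abstract.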

  13. Social media methods for studying rare diseases.

    Science.gov (United States)

    Schumacher, Kurt R; Stringer, Kathleen A; Donohue, Janet E; Yu, Sunkyung; Shaver, Ashley; Caruthers, Regine L; Zikmund-Fisher, Brian J; Fifer, Carlen; Goldberg, Caren; Russell, Mark W

    2014-05-01

    For pediatric rare diseases, the number of patients available to support traditional research methods is often inadequate. However, patients who have similar diseases cluster "virtually" online via social media. This study aimed to (1) determine whether patients who have the rare diseases Fontan-associated protein losing enteropathy (PLE) and plastic bronchitis (PB) would participate in online research, and (2) explore response patterns to examine social media's role in participation compared with other referral modalities. A novel, internet-based survey querying details of potential pathogenesis, course, and treatment of PLE and PB was created. The study was available online via web and Facebook portals for 1 year. Apart from 2 study-initiated posts on patient-run Facebook pages at the study initiation, all recruitment was driven by study respondents only. Response patterns and referral sources were tracked. A total of 671 respondents with a Fontan palliation completed a valid survey, including 76 who had PLE and 46 who had PB. Responses over time demonstrated periodic, marked increases as new online populations of Fontan patients were reached. Of the responses, 574 (86%) were from the United States and 97 (14%) were international. The leading referral sources were Facebook, internet forums, and traditional websites. Overall, social media outlets referred 84% of all responses, making it the dominant modality for recruiting the largest reported contemporary cohort of Fontan patients and patients who have PLE and PB. The methodology and response patterns from this study can be used to design research applications for other rare diseases. Copyright © 2014 by the American Academy of Pediatrics.

  14. Methods and findings of the SNR study

    International Nuclear Information System (INIS)

    Koeberlein, K.; Schaefer, H.; Spindler, H.

    1983-01-01

A fact-finding committee of the German Federal Parliament in July 1980 recommended performing a ''risk-oriented study'' of the SNR-300, the German 300 MW fast breeder prototype reactor under construction in Kalkar. The main aim of this study was to allow a comparative safety evaluation between the SNR-300 and a modern PWR, and thus to prepare a basis for a political decision on the SNR-300. Methods and main results of the study are presented in this paper. In the first step of the risk analysis, six groups of accidents were identified which may initiate core destruction. These groups comprise all conceivable courses potentially leading to core destruction. By reliability analyses, the expected frequency of each group has been calculated. In the accident analysis, potential failure modes of the reactor tank have been investigated. Core destruction may be accompanied by the release of significant amounts of mechanical energy. The primary coolant system of the SNR-300 is designed to withstand mechanical energy releases up to 370 MJ. Design features make it possible to cool the molten core inside the reactor tank. (orig./RW) [de

  15. Application to ion exchange study of an interferometry method

    International Nuclear Information System (INIS)

    Platzer, R.

    1960-01-01

The numerous experiments carried out on ion exchange between clay suspensions and solutions have so far been done by studying the equilibrium between the two phases; by this method it is very difficult to obtain the kinetic properties of the exchange reactions. A method consisting of observation with an interferential microscope using polarised white light shows up the variations in concentration which take place during the ion exchange between an ionic solution and a montmorillonite slab, as well as between an ionic solution and a grain of organic ion exchanger. By analysing the results it will be possible to compare the exchange constants of organic ion exchangers with those of mineral ion exchangers. (author) [fr

  16. Methods for environmental change; an exploratory study

    Directory of Open Access Journals (Sweden)

    Kok Gerjo

    2012-11-01

Full Text Available Abstract Background While the interest of health promotion researchers in change methods directed at the target population has a long tradition, interest in change methods directed at the environment is still developing. In this survey, the focus is on methods for environmental change; especially about how these are composed of methods for individual change (‘Bundling’) and how within one environmental level, organizations, methods differ when directed at the management (‘At’) or applied by the management (‘From’). Methods The first part of this online survey dealt with examining the ‘bundling’ of individual level methods to methods at the environmental level. The question asked was to what extent the use of an environmental level method would involve the use of certain individual level methods. In the second part of the survey the question was whether there are differences between applying methods directed ‘at’ an organization (for instance, by a health promoter) versus ‘from’ within an organization itself. All of the 20 respondents are experts in the field of health promotion. Results Methods at the individual level are frequently bundled together as part of a method at a higher ecological level. A number of individual level methods are popular as part of most of the environmental level methods, while others are not chosen very often. Interventions directed at environmental agents often have a strong focus on the motivational part of behavior change. There are different approaches targeting a level or being targeted from a level. The health promoter will use combinations of motivation and facilitation. The manager will use individual level change methods focusing on self-efficacy and skills. Respondents think that any method may be used under the right circumstances, although few endorsed coercive methods. Conclusions Taxonomies of theoretical change methods for environmental change should include combinations of individual

  17. Taxonomic Dimensions for Studying Situational Method Development

    NARCIS (Netherlands)

    Aydin, Mehmet N.; Harmsen, Frank; van Hillegersberg, Jos; Ralyté, Jolita; Brinkkemper, Sjaak; Henderson-Sellers, Brian

    2007-01-01

This paper is concerned with the fragmented literature on situational method development, which is one of the fundamental topics related to information systems development (ISD) methods. As the topic has attracted many scholars from various and possibly complementary schools of thought, different

  18. Methods for environmental change; an exploratory study.

    Science.gov (United States)

    Kok, Gerjo; Gottlieb, Nell H; Panne, Robert; Smerecnik, Chris

    2012-11-28

While the interest of health promotion researchers in change methods directed at the target population has a long tradition, interest in change methods directed at the environment is still developing. In this survey, the focus is on methods for environmental change; especially about how these are composed of methods for individual change ('Bundling') and how within one environmental level, organizations, methods differ when directed at the management ('At') or applied by the management ('From'). The first part of this online survey dealt with examining the 'bundling' of individual level methods to methods at the environmental level. The question asked was to what extent the use of an environmental level method would involve the use of certain individual level methods. In the second part of the survey the question was whether there are differences between applying methods directed 'at' an organization (for instance, by a health promoter) versus 'from' within an organization itself. All of the 20 respondents are experts in the field of health promotion. Methods at the individual level are frequently bundled together as part of a method at a higher ecological level. A number of individual level methods are popular as part of most of the environmental level methods, while others are not chosen very often. Interventions directed at environmental agents often have a strong focus on the motivational part of behavior change. There are different approaches targeting a level or being targeted from a level. The health promoter will use combinations of motivation and facilitation. The manager will use individual level change methods focusing on self-efficacy and skills. Respondents think that any method may be used under the right circumstances, although few endorsed coercive methods. 
Taxonomies of theoretical change methods for environmental change should include combinations of individual level methods that may be bundled and separate suggestions for methods targeting a level

  19. Methods for environmental change; an exploratory study

    NARCIS (Netherlands)

    Nell Gottlieb; Robert Panne; Chris Smerecnik; Gerjo Kok

    2012-01-01

    Background: While the interest of health promotion researchers in change methods directed at the target population has a long tradition, interest in change methods directed at the environment is still developing. In this survey, the focus is on methods for environmental change; especially about how

  20. 3D analysis methods - Study and seminar

    International Nuclear Information System (INIS)

    Daaviittila, A.

    2003-10-01

The first part of the report results from a study that was performed as a Nordic co-operation activity with active participation from Studsvik Scandpower and Westinghouse Atom in Sweden, and VTT in Finland. The purpose of the study was to identify and investigate the effects arising from using 3D transient computer codes in BWR safety analysis, and their influence on the transient analysis methodology. One of the main questions involves the critical power ratio (CPR) calculation methodology. The present approach, where the CPR calculation is performed with a separate hot-channel calculation, can be artificially conservative. In the investigated cases, no dramatic minimum-CPR effect coming from the 3D calculation is apparent. Some cases show some decrease in the transient change of minimum CPR with the 3D calculation, which confirms the general thinking that the 1D calculation is conservative. On the other hand, the observed effect on neutron flux behaviour is quite large. In a slower transient the 3D effect might be stronger. The second part of the report is a summary of a related seminar that was held on the 3D analysis methods. The seminar was sponsored by the Reactor Safety part (NKS-R) of the Nordic Nuclear Safety Research Programme (NKS). (au)

  1. Facilitation Standards: A Mixed Methods Study

    Science.gov (United States)

    Hunter, Jennifer

    2017-01-01

    Online education is increasing as a solution to manage ever increasing enrollment numbers at higher education institutions. Intentionally and thoughtfully constructed courses allow students to improve performance through practice and self-assessment and instructors benefit from improving consistency in providing content and assessing process,…

  2. Measuring process and knowledge consistency

    DEFF Research Database (Denmark)

    Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders

    2007-01-01

When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because the stakes are high, in the sense that the rules and processes chosen may reflect only one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire…

  3. Extension of the self-consistent-charge density-functional tight-binding method: third-order expansion of the density functional theory total energy and introduction of a modified effective coulomb interaction.

    Science.gov (United States)

    Yang, Yang; Yu, Haibo; York, Darrin; Cui, Qiang; Elstner, Marcus

    2007-10-25

    The standard self-consistent-charge density-functional-tight-binding (SCC-DFTB) method (Phys. Rev. B 1998, 58, 7260) is derived by a second-order expansion of the density functional theory total energy expression, followed by an approximation of the charge density fluctuations by charge monopoles and an effective damped Coulomb interaction between the atomic net charges. The central assumptions behind this effective charge-charge interaction are the inverse relation of atomic size and chemical hardness and the use of a fixed chemical hardness parameter independent of the atomic charge state. While these approximations seem to be unproblematic for many covalently bound systems, they are quantitatively insufficient for hydrogen-bonding interactions and (anionic) molecules with localized net charges. Here, we present an extension of the SCC-DFTB method to incorporate third-order terms in the charge density fluctuations, leading to chemical hardness parameters that are dependent on the atomic charge state and a modification of the Coulomb scaling to improve the electrostatic treatment within the second-order terms. These modifications lead to a significant improvement in the description of hydrogen-bonding interactions and proton affinities of biologically relevant molecules.
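Schematically, the extension described here augments the second-order SCC-DFTB energy with a third-order charge-fluctuation term (the notation below is illustrative rather than taken from the paper: Δq are the atomic net-charge fluctuations, γ_AB the damped second-order Coulomb kernel, and Γ_AB its derivative with respect to atomic charge):

```latex
E \;\approx\; E^{(0)}
  \;+\; \frac{1}{2}\sum_{A,B}\gamma_{AB}\,\Delta q_A \Delta q_B
  \;+\; \frac{1}{3}\sum_{A,B}\Gamma_{AB}\,\Delta q_A^{2}\,\Delta q_B
```

The third term is what makes the effective chemical hardness depend on the atomic charge state, as the abstract describes.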

  4. Computational Studies of Protein Hydration Methods

    Science.gov (United States)

    Morozenko, Aleksandr

It is widely appreciated that water plays a vital role in protein function. Long-range proton transfer inside proteins is usually carried out by the Grotthuss mechanism and requires a chain of hydrogen bonds that is composed of internal water molecules and amino acid residues of the protein. In other cases, water molecules can facilitate enzymes' catalytic reactions by becoming a temporary proton donor/acceptor. Yet a reliable way of predicting water in the protein interior is still not available to the biophysics community. This thesis presents computational studies that have been performed to gain insights into the problem of fast and accurate prediction of potential water sites inside the internal cavities of a protein. Specifically, we focus on the task of attaining correspondence between results obtained from computational experiments and experimental data available from X-ray structures. An overview of existing methods of predicting water molecules in the interior of a protein, along with a discussion of the trustworthiness of these predictions, is a second major subject of this thesis. A description of the differences of water molecules in various media, particularly gas, liquid and protein interior, and theoretical aspects of designing an adequate model of water for the protein environment are discussed in chapters 3 and 4. In chapter 5, we discuss recently developed methods of placing water molecules into internal cavities of a protein. We propose a new methodology, based on the principle of docking water molecules to a protein body, which achieves a higher degree of agreement with experimental data reported in protein crystal structures than other techniques available in biophysical software. The new methodology is tested on a set of high-resolution crystal structures of oligopeptide-binding protein (OppA) containing a large number of resolved internal water molecules and applied to bovine heart cytochrome c oxidase in the fully

  5. Adopting a blended learning approach to teaching evidence based medicine: a mixed methods study

    OpenAIRE

    Ilic, Dragan; Hart, William; Fiddes, Patrick; Misso, Marie; Villanueva, Elmer

    2013-01-01

Background: Evidence Based Medicine (EBM) is a core unit delivered across many medical schools. Few studies have investigated the most effective method of teaching a course in EBM to medical students. The objective of this study was to identify whether a blended-learning approach to teaching EBM is more effective than a didactic-based approach at increasing medical student competency in EBM. Methods: A mixed-methods study was conducted consisting of a controlled trial and focus groups with second ye...

  6. Orthology and paralogy constraints: satisfiability and consistency.

    Science.gov (United States)

    Lafond, Manuel; El-Mabrouk, Nadia

    2014-01-01

A variety of methods, based on sequence similarity, reconciliation, synteny or functional characteristics, can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided in two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm for the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence-similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
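For the special case of full constraint sets, satisfiability has a well-known graph characterization: the relation is realizable by an event-labeled gene tree iff the orthology graph is a cograph, i.e. contains no induced path on four vertices (P4). A brute-force sketch of that check (illustrative names; this is not the paper's algorithm, which also handles partial constraint sets):

```python
from itertools import combinations

def induces_p4(edges, quad):
    """True if the four genes in `quad` induce a path P4 in the orthology graph."""
    sub = [e for e in combinations(quad, 2) if frozenset(e) in edges]
    if len(sub) != 3:
        return False
    deg = {v: 0 for v in quad}
    for a, b in sub:
        deg[a] += 1
        deg[b] += 1
    # On 4 vertices, 3 edges with degree sequence [1, 1, 2, 2] is exactly a P4
    # (the only alternatives, a triangle or a star, have other degree sequences).
    return sorted(deg.values()) == [1, 1, 2, 2]

def full_relation_satisfiable(genes, orthology_pairs):
    """A full set of pairwise relations is satisfiable iff no 4 genes induce a P4."""
    edges = {frozenset(p) for p in orthology_pairs}
    return not any(induces_p4(edges, q) for q in combinations(genes, 4))
```

For example, four genes whose orthology relation forms a path a-b-c-d cannot co-exist in any evolutionary history, while a complete (all-orthologs) relation can.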

  7. From SOPs to Reports to Evaluations: Learning and Memory as a Case Study of how Missing Data and Methods Impact Interpretation

    Science.gov (United States)

    In an era of global trade and regulatory cooperation, consistent and scientifically based interpretation of developmental neurotoxicity (DNT) studies is essential. Because there is flexibility in the selection of test method(s), consistency can be especially challenging for lea...

  8. COMPARATIVE STUDY ON MILK CASEIN ASSAY METHODS

    Directory of Open Access Journals (Sweden)

RODICA CĂPRIŢĂ

    2008-05-01

Casein, the main milk protein, was determined by different assay methods: the gravimetric method, the method based on the neutralization of the NaOH excess used for dissolving the casein precipitate, and the method based on the titration of the acetic acid used for the casein precipitation. The last method is the simplest one, with the fewest steps and the lowest error degree. The results of the experiment revealed that casein as a percentage of the whole milk protein represents between 72.6–81.3% in experiment 1, between 73.6–81.3% in experiment 2 and between 74.3–81% in experiment 3.

  9. A new approach to hull consistency

    Directory of Open Access Journals (Sweden)

    Kolev Lubomir

    2016-06-01

Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating simultaneously several equations with respect to the same number of variables.
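In the scalar manner described above, hull consistency narrows one variable's interval at a time by solving its equation for that variable alone. A minimal sketch for the single constraint x + y ∈ c, with closed intervals as (lo, hi) pairs (illustrative only; a real implementation would use outward rounding):

```python
def intersect(a, b):
    """Intersection of two closed intervals; raises if they are disjoint."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: constraint is infeasible")
    return (lo, hi)

def isub(a, b):
    """Interval subtraction a - b."""
    return (a[0] - b[1], a[1] - b[0])

def hull_contract_sum(x, y, c):
    """Hull-consistency contraction for the constraint x + y in c:
    solve for each variable in turn and intersect with its current domain."""
    x = intersect(x, isub(c, y))  # x must lie in c - y
    y = intersect(y, isub(c, x))  # y must lie in c - x (using the narrowed x)
    return x, y
```

For x, y ∈ [0, 10] and x + y = 4, one pass contracts both domains to [0, 4]; the generalized approach in the poster would treat several such equations and variables simultaneously.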

  10. Expert Consensus on Characteristics of Wisdom: A Delphi Method Study

    Science.gov (United States)

    Jeste, Dilip V.; Ardelt, Monika; Blazer, Dan; Kraemer, Helena C.; Vaillant, George; Meeks, Thomas W.

    2010-01-01

    Purpose: Wisdom has received increasing attention in empirical research in recent years, especially in gerontology and psychology, but consistent definitions of wisdom remain elusive. We sought to better characterize this concept via an expert consensus panel using a 2-phase Delphi method. Design and Methods: A survey questionnaire comprised 53…

  11. A Critical Study of Agglomerated Multigrid Methods for Diffusion

    Science.gov (United States)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2011-01-01

    Agglomerated multigrid techniques used in unstructured-grid methods are studied critically for a model problem representative of laminar diffusion in the incompressible limit. The studied target-grid discretizations and discretizations used on agglomerated grids are typical of current node-centered formulations. Agglomerated multigrid convergence rates are presented using a range of two- and three-dimensional randomly perturbed unstructured grids for simple geometries with isotropic and stretched grids. Two agglomeration techniques are used within an overall topology-preserving agglomeration framework. The results show that multigrid with an inconsistent coarse-grid scheme using only the edge terms (also referred to in the literature as a thin-layer formulation) provides considerable speedup over single-grid methods but its convergence deteriorates on finer grids. Multigrid with a Galerkin coarse-grid discretization using piecewise-constant prolongation and a heuristic correction factor is slower and also grid-dependent. In contrast, grid-independent convergence rates are demonstrated for multigrid with consistent coarse-grid discretizations. Convergence rates of multigrid cycles are verified with quantitative analysis methods in which parts of the two-grid cycle are replaced by their idealized counterparts.
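The contrast the abstract draws between consistent and inconsistent coarse-grid discretizations can be illustrated on the simplest model problem. The sketch below is a textbook geometric two-grid cycle for 1D Poisson with damped Jacobi smoothing (not the paper's agglomerated unstructured-grid setting; all names are illustrative), in which the coarse problem uses the same consistent discretization as the fine grid:

```python
import numpy as np

def smooth(u, f, h, iters=3, omega=2.0 / 3.0):
    """Damped Jacobi smoothing for -u'' = f, central differences, zero Dirichlet BCs."""
    for _ in range(iters):
        u[1:-1] = (1 - omega) * u[1:-1] + omega * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def restrict(r):
    """Full-weighting restriction onto a grid with half as many intervals."""
    return np.concatenate(([0.0], 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2], [0.0]))

def prolong(e):
    """Linear-interpolation prolongation back to the fine grid."""
    u = np.zeros(2 * (len(e) - 1) + 1)
    u[::2] = e
    u[1::2] = 0.5 * (e[:-1] + e[1:])
    return u

def coarse_solve(rc, H):
    """Exact solve of the consistent coarse-grid discretization."""
    m = len(rc) - 2  # number of coarse interior points
    A = (2.0 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / H**2
    e = np.zeros_like(rc)
    e[1:-1] = np.linalg.solve(A, rc[1:-1])
    return e

def two_grid_cycle(u, f, h):
    """One two-grid cycle: pre-smooth, coarse-grid correction, post-smooth."""
    u = smooth(u, f, h)
    u = u + prolong(coarse_solve(restrict(residual(u, f, h)), 2 * h))
    return smooth(u, f, h)
```

Replacing `coarse_solve` with an inconsistent coarse operator is exactly the kind of change the study shows degrades convergence on finer grids.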

  12. Studies of cancer risk among Chernobyl liquidators: materials and methods

    International Nuclear Information System (INIS)

    Kesminiene, A.; Cardis, E.; Tenet, V.; Ivanov, V.K.; Kurtinaitis, J.; Malakhova, I.; Stengrevics, A.; Tekkel, M.

    2002-01-01

The current paper presents the methods and design of two case-control studies among Chernobyl liquidators - one of leukaemia and non-Hodgkin lymphoma, the other of thyroid cancer risk - carried out in Belarus, Estonia, Latvia, Lithuania and Russia. The specific objective of these studies is to estimate the radiation-induced risk of these diseases among liquidators of the Chernobyl accident and, in particular, to study the effect of exposure protraction and radiation type on the risk of radiation-induced cancer in the low-to-medium (0-500 mSv) radiation dose range. The study population consists of the approximately 10,000 Baltic, 40,000 Belarusian and 51,000 Russian liquidators who worked in the 30 km zone in 1986-1987 and who were registered in the Chernobyl registry of these countries. The studies included cases diagnosed in 1993-1998 for all countries but Belarus, where the study period was extended until 2000. Four controls were selected in each country from the national cohort for each case, matched on age, gender and region of residence. Information on study subjects was obtained through face-to-face interviews using a standardised questionnaire with questions on demographic factors; time, place and conditions of work as a liquidator; and potential risk and confounding factors for the tumours of interest. Overall, 136 cases and 595 controls who gave their consent were included in the studies. A method of analytical dose reconstruction has been developed, validated and applied to the estimation of doses and related uncertainties for all the subjects in the study. Dose-response analyses are underway, and the results are likely to have important implications for assessing the adequacy of existing protection standards, which are based on risk estimates derived from analyses of the mortality of atomic bomb survivors and other high-dose studies. (author)

  13. Quantitative methods for studying design protocols

    CERN Document Server

    Kan, Jeff WT

    2017-01-01

    This book is aimed at researchers and students who would like to engage in and deepen their understanding of design cognition research. The book presents new approaches for analyzing design thinking and proposes methods of measuring design processes. These methods seek to quantify design issues and design processes that are defined based on notions from the Function-Behavior-Structure (FBS) design ontology and from linkography. A linkograph is a network of linked design moves or segments. FBS ontology concepts have been used in both design theory and design thinking research and have yielded numerous results. Linkography is one of the most influential and elegant design cognition research methods. In this book Kan and Gero provide novel and state-of-the-art methods of analyzing design protocols that offer insights into design cognition by integrating segmentation with linkography by assigning FBS-based codes to design moves or segments and treating links as FBS transformation processes. They propose and test ...

  14. Food studies: an introduction to research methods

    National Research Council Canada - National Science Library

    Miller, Jeff; Deutsch, Jonathan

    2009-01-01

    .... Designed for the classroom as well as for the independent scholar, the book details the predominant research methods in the field, provides a series of interactive questions and templates to help...

  15. Triangulation of Methods in Labour Studies in Nigeria: Reflections ...

    African Journals Online (AJOL)

    One of the distinctive aspects of social science research in Nigeria as in other ... method in their investigations while relegating qualitative methods to the background. In labour studies, adopting only quantitative method to studying workers ...

  16. Pilot study of alternating radiotherapy and three-drug combined chemotherapy consisting of ifosfamide, cisplatin and vindesine in localized inoperable non-small cell lung cancer

    International Nuclear Information System (INIS)

    Rikimaru, Toru; Tanaka, Yasuyuki; Ichikawa, Yoichiro; Oizumi, Kotaro; Fukurono, Kazuyoshi; Hayabuchi, Naofumi

    1993-01-01

During the period from February 1991 through October 1992, we conducted a pilot phase II trial of 'Alternating Radiotherapy and Chemotherapy' for 15 patients with localized inoperable non-small cell lung cancer. The combined regimen, consisting of ifosfamide 1.5 g/m² on days 1 through 3, and cisplatin 80 mg/m² and vindesine 3 mg/m² on day 1, was given repeatedly every 4 weeks. Patients were treated in a split-course fashion with combination chemotherapy sandwiched between radiation therapy (total dose 60 Gy). Of 15 evaluable patients, complete remission, partial remission and no change were obtained in 1, 13 and 1 patients, respectively, with an overall response rate of 93.3%. The median survival for all patients was 62 weeks. Hematologic toxicity was severe and was judged to be dose-limiting. It was, however, clinically manageable with colony stimulating factor. These results indicate that this alternating radiotherapy and chemotherapy is feasible for localized non-small cell lung cancer and warrants further clinical trials. (author)

  17. Self-assembly behavior of pH- and thermosensitive amphiphilic triblock copolymers in solution: experimental studies and self-consistent field theory simulations.

    Science.gov (United States)

    Cai, Chunhua; Zhang, Liangshun; Lin, Jiaping; Wang, Liquan

    2008-10-09

We investigated, both experimentally and theoretically, the self-assembly behaviors of pH- and thermosensitive poly(L-glutamic acid)-b-poly(propylene oxide)-b-poly(L-glutamic acid) (PLGA-b-PPO-b-PLGA) triblock copolymers in aqueous solution by means of transmission electron microscopy (TEM), scanning electron microscopy (SEM), dynamic light scattering (DLS), circular dichroism (CD), and self-consistent field theory (SCFT) simulations. Vesicles were observed when the hydrophilic PLGA block is shorter or the pH value of the solution is lower. The vesicles were found to transform into spherical micelles when the PLGA block length increases or its conformation changes from helix to coil with increasing pH value. In addition, increasing the temperature gives rise to a decrease in the size of the aggregates, which is related to the dehydration of the PPO segments at higher temperatures. The SCFT simulation results show that the vesicles transform into spherical micelles with an increasing fraction or statistical length of the A block in a model ABA triblock copolymer, which corresponds to the increase in the PLGA length or its conformation change from helix to coil in the experiments, respectively. The SCFT calculations also provide chain distribution information in the aggregates. On the basis of both the experimental and SCFT results, a mechanism for the structure change of the PLGA-b-PPO-b-PLGA aggregates was proposed.

  18. Clinical experimental stress studies: methods and assessment.

    Science.gov (United States)

    Bali, Anjana; Jaggi, Amteshwar Singh

    2015-01-01

Stress is a state of threatened homeostasis during which a variety of adaptive processes are activated to produce physiological and behavioral changes. Stress induction methods are pivotal for understanding these physiological or pathophysiological changes in the body in response to stress. Furthermore, these methods are also important for the development of novel pharmacological agents for stress management. The well-described methods to induce stress in humans include the cold pressor test, Trier Social Stress Test, Montreal Imaging Stress Task, Maastricht Acute Stress Test, CO2 challenge test, Stroop test, Paced Auditory Serial Addition Task, noise stress, and Mannheim Multicomponent Stress Test. Stress assessment in humans is done by measuring biochemical markers such as cortisol, the cortisol awakening response, the dexamethasone suppression test, salivary α-amylase, plasma/urinary norepinephrine, norepinephrine spillover rate, and interleukins. Physiological and behavioral changes such as galvanic skin response, heart rate variability, pupil size, and muscle and/or skin sympathetic nerve activity (microneurography), cardiovascular parameters such as heart rate and blood pressure, and self-reported anxiety are also monitored to assess the stress response. This review describes these commonly employed methods to induce stress in humans, along with stress assessment methods.

  19. Does Corporate Governance Affect Sustainability Disclosure? A Mixed Methods Study

    Directory of Open Access Journals (Sweden)

    Zeeshan Mahmood

    2018-01-01

This research paper aims to understand the impact of corporate governance (CG) on economic, social, and environmental sustainability disclosures. This paper adopted an explanatory sequential mixed-methods approach. The data regarding corporate governance and sustainability disclosure were collected from the top 100 companies listed on the Pakistan Stock Exchange (PSE) for the period from 2012 to 2015. In addition to the quantitative data, we collected qualitative data through interviews with five board members of different companies. Overall, our results indicate that CG elements enhance sustainability disclosures. This study concludes that a large board consisting of a female director and a CSR committee (CSRC) is better able to check and control management decisions regarding sustainability issues (be they economic, environmental, or social) and results in better sustainability disclosure. This paper, through quantitative and qualitative analysis, provides a methodological and empirical contribution to the literature on corporate governance and sustainability reporting in emerging and developing countries.

  20. Are radiosensitivity data derived from natural field conditions consistent with data from controlled exposures? A case study of Chernobyl wildlife chronically exposed to low dose rates.

    Science.gov (United States)

    Garnier-Laplace, J; Geras'kin, S; Della-Vedova, C; Beaugelin-Seiller, K; Hinton, T G; Real, A; Oudalova, A

    2013-07-01

The discrepancy between laboratory or controlled-conditions ecotoxicity tests and field data on wildlife chronically exposed to ionising radiation is presented for the first time. We reviewed the available chronic radiotoxicity data acquired in contaminated fields and used a statistical methodology to support the comparison with knowledge on inter-species variation of sensitivity to controlled external γ irradiation. We focus on the Chernobyl Exclusion Zone and effects data on terrestrial wildlife reported in the literature corresponding to chronic dose-rate exposure situations (from background ~100 nGy/h up to ~10 mGy/h). When needed, we reconstructed the dose rate to organisms and obtained consistent, unbiased data sets necessary to establish the dose rate-effect relationship for a number of different species and endpoints. Then, we compared the range of variation of radiosensitivity of species from the Chernobyl Exclusion Zone with the statistical distribution established for terrestrial species chronically exposed to purely gamma external irradiation (the chronic Species radioSensitivity Distribution - SSD). We found that the best estimate of the median value (HDR50) of the distribution established for field conditions at Chernobyl (about 100 μGy/h) was eight times lower than the one from controlled experiments (about 850 μGy/h), suggesting that organisms in their natural environment were more sensitive to radiation. This first comparison highlights the lack of mechanistic understanding and the potential confusion coming from sampling strategies in the field. To confirm the apparently higher sensitivity of wildlife in the Chernobyl Exclusion Zone, we call for a more robust strategy in the field, with an adequate design to deal with confounding factors. Copyright © 2012 Elsevier Ltd. All rights reserved.
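For a log-normal SSD, the median benchmark dose rate (the HDR50-style quantity compared above) is simply the geometric mean of the per-species values. A minimal sketch with hypothetical dose rates in μGy/h (not the paper's data):

```python
import math

def ssd_median(dose_rates):
    """Median of a log-normal species sensitivity distribution fitted via
    the mean of the log-transformed per-species benchmark dose rates;
    for a log-normal fit this equals the geometric mean of the data."""
    logs = [math.log(v) for v in dose_rates]
    return math.exp(sum(logs) / len(logs))

# Hypothetical per-species benchmark dose rates (µGy/h):
hdr50 = ssd_median([120.0, 45.0, 300.0, 80.0])
```

Comparing two such medians (field-derived vs. laboratory-derived) is the core of the eight-fold-difference finding reported in the abstract.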

  1. Microfluidic methods to study emulsion formation

    NARCIS (Netherlands)

    Muijlwijk, Kelly

    2017-01-01

    Emulsions are dispersions of one liquid in another that are commonly used in various products, and methods such as high-pressure homogenisers and colloid mills are used to form emulsions. The size and size distribution of emulsion droplets are important for the final product properties and thus

  2. Theoretical studies of potential energy surfaces and computational methods

    Energy Technology Data Exchange (ETDEWEB)

Shepard, R. [Argonne National Laboratory, IL (United States)]

    1993-12-01

    This project involves the development, implementation, and application of theoretical methods for the calculation and characterization of potential energy surfaces involving molecular species that occur in hydrocarbon combustion. These potential energy surfaces require an accurate and balanced treatment of reactants, intermediates, and products. This difficult challenge is met with general multiconfiguration self-consistent-field (MCSCF) and multireference single- and double-excitation configuration interaction (MRSDCI) methods. In contrast to the more common single-reference electronic structure methods, this approach is capable of describing accurately molecular systems that are highly distorted away from their equilibrium geometries, including reactant, fragment, and transition-state geometries, and of describing regions of the potential surface that are associated with electronic wave functions of widely varying nature. The MCSCF reference wave functions are designed to be sufficiently flexible to describe qualitatively the changes in the electronic structure over the broad range of geometries of interest. The necessary mixing of ionic, covalent, and Rydberg contributions, along with the appropriate treatment of the different electron-spin components (e.g. closed shell, high-spin open-shell, low-spin open shell, radical, diradical, etc.) of the wave functions, are treated correctly at this level. Further treatment of electron correlation effects is included using large scale multireference CI wave functions, particularly including the single and double excitations relative to the MCSCF reference space. This leads to the most flexible and accurate large-scale MRSDCI wave functions that have been used to date in global PES studies.

  3. Parquet equations for numerical self-consistent-field theory

    International Nuclear Information System (INIS)

    Bickers, N.E.

    1991-01-01

    In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs
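The central parquet relation can be written compactly: the full two-particle vertex is the fully irreducible vertex plus the parts reducible in the particle-particle and the two particle-hole channels (a standard, schematic form; channel labels vary between references):

```latex
\Gamma \;=\; \Lambda_{\text{fully irr.}} \;+\; \Phi_{pp} \;+\; \Phi_{ph} \;+\; \Phi_{\overline{ph}}
```

Each reducible piece Φ is obtained self-consistently from a Bethe-Salpeter equation in its own channel, which is what ties the parquet approach to the simpler self-consistent-field schemes discussed in the notes.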

  4. Choice, internal consistency, and rationality

    OpenAIRE

    Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu

    2010-01-01

    The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...

  5. Microfluidic methods to study emulsion formation

    OpenAIRE

    Muijlwijk, Kelly

    2017-01-01

    Emulsions are dispersions of one liquid in another that are commonly used in various products, and methods such as high-pressure homogenisers and colloid mills are used to form emulsions. The size and size distribution of emulsion droplets are important for the final product properties and thus need to be controlled. Rapid coalescence of droplets during emulsification increases droplet size and widens the size distribution, and therefore needs to be prevented. To increase stability of emulsio...

  6. ABCD Matrix Method a Case Study

    CERN Document Server

    Seidov, Zakir F; Yahalom, Asher

    2004-01-01

    In the Israeli Electrostatic Accelerator FEL, the distance between the accelerator's end and the wiggler's entrance is about 2.1 m, and 1.4 MeV electron beam is transported through this space using four similar quadrupoles (FODO-channel). The transfer matrix method (ABCD matrix method) was used for simulating the beam transport, a set of programs is written in the several programming languages (MATHEMATICA, MATLAB, MATCAD, MAPLE) and reasonable agreement is demonstrated between experimental results and simulations. Comparison of ABCD matrix method with the direct "numerical experiments" using EGUN, ELOP, and GPT programs with and without taking into account the space-charge effects showed the agreement to be good enough as well. Also the inverse problem of finding emittance of the electron beam at the S1 screen position (before FODO-channel), by using the spot image at S2 screen position (after FODO-channel) as function of quad currents, is considered. Spot and beam at both screens are described as tilted eel...
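The ABCD-matrix simulation the abstract describes reduces to multiplying 2×2 transfer matrices. A minimal sketch with a drift and thin-lens quadrupoles composed into a FODO-like cell (textbook matrices; the focal length and drift lengths below are illustrative, not the EA-FEL's actual transport-line parameters):

```python
import numpy as np

def drift(L):
    """Transfer matrix of a field-free drift of length L."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    """Thin-lens quadrupole: focusing for f > 0, defocusing for f < 0."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

def fodo_cell(f, L):
    """Half-drift, defocusing quad, drift, focusing quad, half-drift."""
    return drift(L / 2) @ thin_quad(-f) @ drift(L) @ thin_quad(f) @ drift(L / 2)

# Propagate a ray (position x, angle x') through the cell:
ray_out = fodo_cell(1.0, 2.0) @ np.array([1e-3, 0.0])
```

Since every element matrix has unit determinant, the cell matrix does too; checking det = 1 is a quick sanity test on any such beam-transport composition.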

  7. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

8. Studying Landslide Displacements in Megamendung (Indonesia) Using GPS Survey Method

    Directory of Open Access Journals (Sweden)

    Hasanuddin Z. Abidin

    2004-11-01

    Full Text Available Landslide is one of the prominent geohazards that frequently affect Indonesia, especially in the rainy season. It not only destroys environment and property, but usually also causes deaths. Landslide monitoring is therefore crucial and should be done continuously. One of the methods that can contribute to the study of landslide phenomena is the repeated GPS survey method. This paper presents and discusses the operational performance, constraints and results of GPS surveys conducted in a well-known landslide-prone area in West Java (Indonesia), namely Megamendung, the hilly region close to Bogor. Three GPS surveys involving 8 GPS points have been conducted, in April 2002, May 2003 and May 2004, respectively. The estimated landslide displacements in the area are relatively large, on the level of a few dm to a few m. Displacements up to about 2-3 m were detected in the April 2002 to May 2003 period, and up to about 3-4 dm in the May 2003 to May 2004 period. In both periods, the landslides in general show a northwest direction of displacement. Displacements vary both spatially and temporally. This study also suggests that in order to conclude the existence of real and significant displacements of GPS points, the GPS-estimated displacements should be subjected to three types of testing, namely: the congruency test on spatial displacements, testing of the agreement between the horizontal distance changes and the predicted direction of landslide displacement, and testing of the consistency of displacement directions over two consecutive periods.
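A minimal sketch of the displacement estimation described above, with hypothetical coordinates and a simplified two-sigma criterion standing in for the formal congruency test:

```python
import math

def displacement(e1, n1, e2, n2):
    """Horizontal displacement (m) and azimuth (deg) between two survey epochs."""
    de, dn = e2 - e1, n2 - n1
    d = math.hypot(de, dn)
    az = math.degrees(math.atan2(de, dn)) % 360.0  # clockwise from north
    return d, az

def is_significant(d, sigma1, sigma2, k=2.0):
    """Crude significance test: the displacement must exceed k times the
    combined standard error of the two epoch positions."""
    return d > k * math.sqrt(sigma1 ** 2 + sigma2 ** 2)

# Hypothetical point observed in two GPS campaigns (easting/northing, metres).
d, az = displacement(1000.00, 2000.00, 998.60, 2001.90)
print(round(d, 2), round(az, 1), is_significant(d, 0.02, 0.02))
```

With these invented numbers the point moves about 2.4 m toward the northwest, well beyond the combined positional uncertainty, so the movement would be flagged as significant.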

  9. Isotope angiocardiography. Method and preliminary own studies

    Energy Technology Data Exchange (ETDEWEB)

    Stepinska, J; Ruzyllo, W; Konieczny, W [Centrum Medyczne Ksztalcenia Podyplomowego, Warsaw (Poland)

    1979-01-01

    A method of recording the passage of technetium-99m through the heart with the aid of a radioisotope scanner connected to a seriograph and a computer is presented. Preliminary tests were carried out in 26 patients with coronary disease with or without previous myocardial infarction, cardiomyopathy, ventricular septal defect, and in patients with artificial mitral and aortic valves. The obtained scans were evaluated qualitatively and compared with contrast X-rays of the heart performed later. The size of the right ventricle, the volume and rate of left atrial evacuation, and the size and contractility of the left ventricle were evaluated. The similarity of direct and isotope angiocardiographs, together with the non-invasive character and repeatability of isotope angiocardiography, supports its usefulness.

  10. Consistency and reproducibility of next-generation sequencing and other multigene mutational assays: A worldwide ring trial study on quantitative cytological molecular reference specimens.

    Science.gov (United States)

    Malapelle, Umberto; Mayo-de-Las-Casas, Clara; Molina-Vila, Miguel A; Rosell, Rafael; Savic, Spasenija; Bihl, Michel; Bubendorf, Lukas; Salto-Tellez, Manuel; de Biase, Dario; Tallini, Giovanni; Hwang, David H; Sholl, Lynette M; Luthra, Rajyalakshmi; Weynand, Birgit; Vander Borght, Sara; Missiaglia, Edoardo; Bongiovanni, Massimo; Stieber, Daniel; Vielh, Philippe; Schmitt, Fernando; Rappa, Alessandra; Barberis, Massimo; Pepe, Francesco; Pisapia, Pasquale; Serra, Nicola; Vigliar, Elena; Bellevicine, Claudio; Fassan, Matteo; Rugge, Massimo; de Andrea, Carlos E; Lozano, Maria D; Basolo, Fulvio; Fontanini, Gabriella; Nikiforov, Yuri E; Kamel-Reid, Suzanne; da Cunha Santos, Gilda; Nikiforova, Marina N; Roy-Chowdhuri, Sinchita; Troncone, Giancarlo

    2017-08-01

    Molecular testing of cytological lung cancer specimens includes, beyond epidermal growth factor receptor (EGFR), emerging predictive/prognostic genomic biomarkers such as Kirsten rat sarcoma viral oncogene homolog (KRAS), neuroblastoma RAS viral [v-ras] oncogene homolog (NRAS), B-Raf proto-oncogene, serine/threonine kinase (BRAF), and phosphatidylinositol-4,5-bisphosphate 3-kinase catalytic subunit α (PIK3CA). Next-generation sequencing (NGS) and other multigene mutational assays are suitable for cytological specimens, including smears. However, the current literature reflects single-institution studies rather than multicenter experiences. Quantitative cytological molecular reference slides were produced with cell lines designed to harbor concurrent mutations in the EGFR, KRAS, NRAS, BRAF, and PIK3CA genes at various allelic ratios, including low allele frequencies (AFs; 1%). This interlaboratory ring trial study included 14 institutions across the world that performed multigene mutational assays, from tissue extraction to data analysis, on these reference slides, with each laboratory using its own mutation analysis platform and methodology. All laboratories using NGS (n = 11) successfully detected the study's set of mutations with minimal variations in the means and standard errors of variant fractions at dilution points of 10% (P = .171) and 5% (P = .063) despite the use of different sequencing platforms (Illumina, Ion Torrent/Proton, and Roche). However, when mutations at a low AF of 1% were analyzed, the concordance of the NGS results was low, and this reflected the use of different thresholds for variant calling among the institutions. In contrast, laboratories using matrix-assisted laser desorption/ionization-time of flight (n = 2) showed lower concordance in terms of mutation detection and mutant AF quantification. Quantitative molecular reference slides are a useful tool for monitoring the performance of different multigene mutational

  11. Innovative Interpretive Qualitative Case Study Research Method ...

    African Journals Online (AJOL)

    lc2o

    The combined use of case study and systems theory is rarely discussed in the ... Scott, 2002), the main benefit of doing qualitative research is the patience ..... Teaching ICT to teacher candidates ... English Language Teachers. London: Arnold.

  12. Travel Efficiency Assessment Method: Three Case Studies

    Science.gov (United States)

    This slide presentation summarizes three case studies EPA conducted in partnership with Boston, Kansas City, and Tucson, to assess the potential benefits of employing travel efficiency strategies in these areas.

  13. Lexical processing and distributional knowledge in sound-spelling mapping in a consistent orthography: A longitudinal study of reading and spelling in dyslexic and typically developing children.

    Science.gov (United States)

    Marinelli, Chiara Valeria; Cellini, Pamela; Zoccolotti, Pierluigi; Angelelli, Paola

    This study examined the ability to master lexical processing and use knowledge of the relative frequency of sound-spelling mappings in both reading and spelling. Twenty-four dyslexic and dysgraphic children and 86 typically developing readers were followed longitudinally in 3rd and 5th grades. Effects of word regularity, word frequency, and probability of sound-spelling mappings were examined in two experimental tasks: (a) spelling to dictation; and (b) orthographic judgment. Dyslexic children showed larger regularity and frequency effects than controls in both tasks. Sensitivity to distributional information of sound-spelling mappings was already detected by third grade, indicating early acquisition even in children with dyslexia. Although with notable differences, knowledge of the relative frequencies of sound-spelling mapping influenced both reading and spelling. Results are discussed in terms of their theoretical and empirical implications.

  14. Consistent guiding center drift theories

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-04-01

    Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)

  15. Weak consistency and strong paraconsistency

    Directory of Open Access Journals (Sweden)

    Gemma Robles

    2009-11-01

    Full Text Available In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and as the absence of the ECQ (“E contradictione quodlibet” rule that allows us to conclude any well formed formula from any contradiction. The aim of this paper is to explain the concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.

  16. Glass consistency and glass performance

    International Nuclear Information System (INIS)

    Plodinec, M.J.; Ramsey, W.G.

    1994-01-01

    Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has on long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not glass durability

  17. Evaluating Temporal Consistency in Marine Biodiversity Hotspots.

    Science.gov (United States)

    Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S

    2015-01-01

    With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
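The hotspot-designation routine sketched in the abstract can be illustrated with a toy example. The counts below are invented, and a mean-based threshold stands in for the study's percentile thresholds; the script computes Shannon's index per grid cell per year and then the fraction of years each cell qualifies as a hotspot.

```python
import math
from collections import defaultdict

def shannon(counts):
    """Shannon's diversity index H' = -sum(p_i * ln p_i) over species counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical species counts per grid cell per survey year.
surveys = {
    2008: {"A": [30, 25, 20, 15], "B": [80, 5, 3], "C": [10, 10, 10, 10, 10]},
    2009: {"A": [5, 60, 2], "B": [20, 20, 20, 20], "C": [50, 1, 1]},
}

# A cell is a hotspot in a year if its diversity exceeds that year's mean
# across cells (a stand-in for an upper-percentile threshold).
hotspot_years = defaultdict(int)
for year, cells in surveys.items():
    H = {cell: shannon(c) for cell, c in cells.items()}
    threshold = sum(H.values()) / len(H)
    for cell, h in H.items():
        if h > threshold:
            hotspot_years[cell] += 1

# Temporal consistency: fraction of years each cell qualified as a hotspot.
consistency = {cell: hotspot_years[cell] / len(surveys) for cell in ["A", "B", "C"]}
print(consistency)
```

In this contrived example no cell is a hotspot in both years, echoing (in miniature) the study's finding that no grid cell stayed a hotspot for more than half the time series.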

  18. Testing the visual consistency of web sites

    NARCIS (Netherlands)

    van der Geest, Thea; Loorbach, N.R.

    2005-01-01

    Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to

  19. Study on Laser Welding Process Monitoring Method

    OpenAIRE

    Knag, Heeshin

    2017-01-01

    International audience; In this paper, a study of quality monitoring technology for laser welding was conducted. Laser welding and industrial robotic systems were combined into robot-based laser welding systems. The laser used in this study was a 1.6 kW fiber laser, while the robot was an industrial robot (payload: 130 kg). The robot-based laser welding system was equipped with a laser scanner system for remote laser welding. The welding joints of steel plate and steel plat...

  20. Study on Laser Welding Process Monitoring Method

    OpenAIRE

    Heeshin Knag

    2016-01-01

    In this paper, a study of quality monitoring technology for laser welding was conducted. Laser welding and industrial robotic systems were combined into robot-based laser welding systems. The laser used in this study was a 1.6 kW fiber laser, while the robot was an industrial robot (payload: 130 kg). The robot-based laser welding system was equipped with a laser scanner system for remote laser welding. The welding joints of steel plate and steel plate coated with zinc were ...

  1. Chronic exposure of mutant DISC1 mice to lead produces sex-dependent abnormalities consistent with schizophrenia and related mental disorders: a gene-environment interaction study.

    Science.gov (United States)

    Abazyan, Bagrat; Dziedzic, Jenifer; Hua, Kegang; Abazyan, Sofya; Yang, Chunxia; Mori, Susumu; Pletnikov, Mikhail V; Guilarte, Tomas R

    2014-05-01

    The glutamatergic hypothesis of schizophrenia suggests that hypoactivity of the N-methyl-D-aspartate receptor (NMDAR) is an important factor in the pathophysiology of schizophrenia and related mental disorders. The environmental neurotoxicant, lead (Pb(2+)), is a potent and selective antagonist of the NMDAR. Recent human studies have suggested an association between prenatal Pb(2+) exposure and the increased likelihood of schizophrenia later in life, possibly via interacting with genetic risk factors. In order to test this hypothesis, we examined the neurobehavioral consequences of interaction between Pb(2+) exposure and mutant disrupted in schizophrenia 1 (mDISC1), a risk factor for major psychiatric disorders. Mutant DISC1 and control mice born by the same dams were raised and maintained on a regular diet or a diet containing moderate levels of Pb(2+). Chronic, lifelong exposure of mDISC1 mice to Pb(2+) was not associated with gross developmental abnormalities but produced sex-dependent hyperactivity, exaggerated responses to the NMDAR antagonist, MK-801, mildly impaired prepulse inhibition of the acoustic startle, and enlarged lateral ventricles. Together, these findings support the hypothesis that environmental toxins could contribute to the pathogenesis of mental disease in susceptible individuals.

  2. Theoretical and simulation studies of seeding methods

    Energy Technology Data Exchange (ETDEWEB)

    Pellegrini, Claudio [Univ. of California, Los Angeles, CA (United States)

    2017-12-11

    We report the theoretical and experimental studies done with the support of DOE-Grant DE-SC0009983 to increase an X-ray FEL peak power from the present level of 20 to 40 GW to one or more TW by seeding, undulator tapering and using the new concept of the Double Bunch FEL.

  3. Production studies and documentary participants: a method

    NARCIS (Netherlands)

    Sanders, Willemien

    2016-01-01

    It was only after I finished my PhD thesis that I learned that my research related to production studies. Departing from the question of ethics in documentary filmmaking, I investigated both the perspective of filmmakers and participants on ethical issues in the documentary filmmaking practice,

  4. A new δf method for neoclassical transport studies

    International Nuclear Information System (INIS)

    Wang, W.X.; Nakajima, N.; Okamoto, M.; Murakami, S.

    1999-01-01

    A new δf method is presented in detail to solve the drift kinetic equation for the simulation study of neoclassical transport. It is demonstrated that valid results essentially rely on the correct evaluation of the marker density g in the weight calculation. A new weighting scheme is developed without assuming g in the weight equation for advancing particle weights, unlike previous schemes. This scheme employs an additional weight function to directly solve g from its kinetic equation based on the δf method itself. Therefore, the severe constraint that the real marker distribution must be consistent with the initially assumed g is relaxed. An improved like-particle collision scheme is also presented. By compensating for momentum, energy and particle losses, the conservations of all three quantities are greatly improved during collisions. With the improvement in both the like-particle collision scheme and the weighting scheme, the δf simulation shows a significantly improved performance. The new δf method is applied to the study of ion neoclassical transports due to self-collisions, taking the effect of finite orbit width into account. The ion thermal transport near the magnetic axis is shown to be greatly reduced from its conventional neoclassical level, like that of previous δf simulations. On the other hand, the direct particle loss from the confinement region may strongly increase the ion thermal transport near the edge. It is found that the ion parallel flow near the axis is also largely reduced due to non-standard orbit topology. (author)

  5. Dynamically consistent oil import tariffs

    International Nuclear Information System (INIS)

    Karp, L.; Newbery, D.M.

    1992-01-01

    The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustable resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs

  6. Methods of Teaching Reading to EFL Learners: A Case Study

    Science.gov (United States)

    Sanjaya, Dedi; Rahmah; Sinulingga, Johan; Lubis, Azhar Aziz; Yusuf, Muhammad

    2014-01-01

    Methods of teaching reading skills are not the same in different countries; they depend on the condition and situation of the learners. Observing teaching methods in Malaysia was the purpose of this study, and the results show that 5 methods are applied in classroom activities, namely Grammar Translation Method (GTM),…

  7. Comparative Study of Molecular Basket Sorbents Consisting of Polyallylamine and Polyethylenimine Functionalized SBA-15 for CO2 Capture from Flue Gas.

    Science.gov (United States)

    Wang, Dongxiang; Wang, Xiaoxing; Song, Chunshan

    2017-11-17

    Polyallylamine (PAA)-based molecular basket sorbents (MBS) have been studied for CO2 capture in comparison with polyethylenimine (PEI)-based MBS. Characterization by N2 physisorption, diffuse reflectance infrared Fourier transform spectroscopy (DRIFTS), and thermogravimetric analysis (TGA) showed that PAA (Mn = 15,000) is more rigid and suffers more steric hindrance inside SBA-15 pores than PEI, owing mainly to its different polymer structure. The effects of temperature and PAA loading on the CO2 sorption capacity of PAA-based MBS were examined by TGA using a 100% CO2 gas stream and compared with PEI/SBA-15. The capacity of the PAA/SBA-15 sorbent increased with increasing temperature. The optimum capacity of 88 mg CO2 per g sorbent was obtained at 140 °C for PAA(50)/SBA-15, whereas the optimum sorption temperature was 75 and 90 °C for PEI-I(50)/SBA-15 (PEI-I, Mn = 423) and PEI-II(50)/SBA-15 (PEI-II, Mn = 25,000), respectively. The capacity initially increased with PAA loading and then dropped at high amine contents, owing to the increased diffusion barrier. The highest CO2 capacity of 109 mg CO2 per g sorbent was obtained at a PAA loading of 65 wt%, whereas the PAA(50)/SBA-15 sorbent gave the best amine efficiency of 0.23 mol CO2 per mol N. The effect of moisture was examined in a fixed-bed flow system with simulated flue gas containing 15% CO2 and 4.5% O2 in N2. The presence of moisture significantly enhanced CO2 sorption over PAA(50)/SBA-15 and greatly improved its cyclic stability and regenerability. Compared with PEI/SBA-15, PAA/SBA-15 possesses better thermal stability and higher resistance to oxidative degradation. However, the CO2 sorption rate over the PAA(50)/SBA-15 sorbent was much slower. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
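The reported amine efficiency can be sanity-checked with back-of-the-envelope arithmetic. Assuming an allylamine repeat-unit molar mass of about 57.1 g/mol with one N per unit (an assumption, not stated in the record), the 88 mg CO2 per g capacity of PAA(50)/SBA-15 reproduces the quoted 0.23 mol CO2 per mol N:

```python
# Illustrative back-of-the-envelope check, not a calculation from the paper.
M_CO2 = 44.01        # molar mass of CO2, g/mol
M_PAA_UNIT = 57.10   # g/mol per allylamine repeat unit (one N each) - assumed

def amine_efficiency(capacity_mg_per_g, amine_wt_fraction):
    """mol CO2 sorbed per mol amine N, for a sorbent with the given CO2
    capacity (mg CO2 per g sorbent) and amine loading (weight fraction)."""
    mol_co2 = (capacity_mg_per_g / 1000.0) / M_CO2   # mol CO2 per g sorbent
    mol_n = amine_wt_fraction / M_PAA_UNIT           # mol N per g sorbent
    return mol_co2 / mol_n

# PAA(50)/SBA-15: 88 mg CO2 per g sorbent at 50 wt% PAA loading.
eff = amine_efficiency(88.0, 0.50)
print(round(eff, 2))  # rounds to 0.23, matching the reported best efficiency
```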

  8. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  9. A comparative study of the apparent total tract digestibility of carbohydrates in Icelandic and Danish warmblood horses fed two different haylages and a concentrate consisting of sugar beet pulp and black oats.

    Science.gov (United States)

    Jensen, Rasmus Bovbjerg; Brokner, Christine; Knudsen, Knud Erik Bach; Tauson, Anne-Helene

    2010-10-01

    Four Icelandic (ICE) and four Danish Warmblood (DW) horses were used in a crossover study with two treatments to investigate the effect of breed and the effect of stage of maturity of haylage on the apparent total tract digestibility (ATTD) of a diet consisting of sugar beet pulp, black oats and haylage early or late cut. Fibre was analysed as crude fibre (CF), acid detergent fibre (ADF), neutral detergent fibre (NDF) and dietary fibre (DF = non-starch polysaccharides (NSP) plus lignin). In haylage all analysed fibre fractions increased with advancing stage of maturity, with the cell wall components cellulose, non-cellulosic residue, xylose and lignin causing this increase. Crude protein (CP) and sugars decreased with advancing stage of maturity. Feeding early cut haylage resulted in a significantly (p haylage. There was a significantly (p haylage. Concentrations of total short-chain fatty acids were significantly (p haylage, reflecting the higher fermentability (higher ATTD) of this diet. There was no marked effect of breed on faecal parameters. The DF analysis method gave the most appropriate differentiation of the fibre fractions and their digestibility, compared to the traditional CF, ADF and NDF analyses. A major advantage of the DF analysis is the capacity of recovering soluble fibres. The results suggested that ICE had higher ATTD of DF than DW, and this was caused by a tendency for a higher ATTD of cellulose, but further studies are required to verify that in general.

  10. Consistently violating the non-Gaussian consistency relation

    International Nuclear Information System (INIS)

    Mooij, Sander; Palma, Gonzalo A.

    2015-01-01

    Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations

  11. A comparative study of Averrhoa bilimbi extraction methods

    Science.gov (United States)

    Zulhaimi, H. I.; Rosli, I. R.; Kasim, K. F.; Akmal, H. Muhammad; Nuradibah, M. A.; Sam, S. T.

    2017-09-01

    In recent years, bioactive compounds in plants have become a limelight in the food and pharmaceutical markets, leading to research interest in implementing effective technologies for extracting bioactive substances. Therefore, this study focuses on extraction of Averrhoa bilimbi by different extraction techniques, namely maceration and ultrasound-assisted extraction. A few plant parts of Averrhoa bilimbi were taken as extraction samples, namely fruits, leaves and twigs. Different solvents such as methanol, ethanol and distilled water were utilized in the process. Fruit extracts gave the highest extraction yield compared to other plant parts. Ethanol and distilled water played a more significant role than methanol for all plant parts and both extraction techniques. The results also show that ultrasound-assisted extraction gave results comparable to maceration; moreover, its shorter extraction time is useful for industrial implementation.

  12. Self-consistent radial sheath

    International Nuclear Information System (INIS)

    Hazeltine, R.D.

    1988-12-01

    The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig

  13. Brief Report: Consistency of Search Engine Rankings for Autism Websites

    Science.gov (United States)

    Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.

    2012-01-01

    The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
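One simple way to quantify the ranking consistency the abstract discusses is set overlap between result lists captured at two points in time. The sketch below (with hypothetical URLs, not data from the study) computes the Jaccard overlap of two top-5 lists:

```python
def top_k_overlap(list1, list2):
    """Jaccard overlap of two ranked result lists (order-insensitive):
    1.0 means identical sets of results, 0.0 means nothing in common."""
    s1, s2 = set(list1), set(list2)
    return len(s1 & s2) / len(s1 | s2)

# Hypothetical top-5 search results for the same query a month apart.
january = ["autismspeaks.org", "cdc.gov/autism", "nimh.nih.gov",
           "wikipedia.org/asd", "webmd.com/autism"]
february = ["cdc.gov/autism", "autismspeaks.org", "wikipedia.org/asd",
            "healthline.com/autism", "nimh.nih.gov"]
print(top_k_overlap(january, february))
```

A rank-aware statistic such as Kendall's tau on the shared items would additionally capture reordering, which pure set overlap ignores.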

  14. Gender wage gap studies : consistency and decomposition

    OpenAIRE

    Kunze, Astrid

    2006-01-01

    This paper reviews the empirical literature on the gender wage gap, with particular attention given to the identification of the key parameters in human capital wage regression models. This is of great importance in the literature for two main reasons. First, the main explanatory variables in the wage model, i.e., measures of work experience and the time-out-of-work, are endogenous. As a result, applying traditional estimators may lead to inconsistent parameter estimates. Secon...

  15. Lagrangian multiforms and multidimensional consistency

    Energy Technology Data Exchange (ETDEWEB)

    Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2009-10-30

    We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.

  16. Consistency and Communication in Committees

    OpenAIRE

    Inga Deimen; Felix Ketelaar; Mark T. Le Quement

    2013-01-01

    This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...

  17. Comparative study of three methods of esophageal anastomosis in dogs

    Directory of Open Access Journals (Sweden)

    Z. T. Abd Al-Maseeh

    2009-01-01

    Full Text Available This study was performed to compare three methods of esophageal anastomosis. Twenty-four healthy adult dogs were used in this study. The animals were divided into three groups, each consisting of 8 animals. In group 1, two layers were used to perform the esophageal anastomosis: the first layer was a simple interrupted suture closing the mucosa with the knot inside the lumen, and the second layer was a horizontal interrupted mattress suture closing the other layers of the esophagus. In group 2, one layer of cross interrupted mattress suture was used to close all layers of the esophageal wall, and in group 3, one layer of Schmieden's suture was used to close all layers of the esophageal wall. The results of clinical, radiological and histopathological studies 15 and 30 days after the surgical operation revealed that most of the animals showed different degrees of moderate dysphagia and regurgitation. The radiological study showed significant differences in stenosis. The best results were recorded in the second group, where the mean degree of stenosis at 30 days was 7.69%, compared with 42.80% in the first group and 37.81% in the third group. The histopathological study of group 2 showed rapid healing of the anastomosis site, a lack of granulation tissue, and consequently a lesser degree of stricture and other complications as compared with groups 1 and 3. Schmieden's suture was characterized by its shorter operative time as compared with groups 1 and 2, although accompanied by some complications. In conclusion, this study revealed that the cross mattress suture used in the second group was characterized by faster healing and minimal fibrous tissue formation, manifested by a decreased degree of stenosis as compared with the suture patterns used in the first and third groups.

  18. Optimization method development of the core characteristics of a fast reactor in order to explore possible high performance solutions (a solution being a consistent set of fuel, core, system and safety)

    International Nuclear Information System (INIS)

    Ingremeau, J.-J.X.

    2011-01-01

In the study of any new nuclear reactor, the design of the core is an important step. However, designing and optimising a reactor core is quite complex, as it involves neutronics, thermal-hydraulics and fuel thermomechanics, and the design of such a system is usually achieved through an iterative process involving several different disciplines. In order to solve such a multi-disciplinary system quickly, while observing the appropriate constraints, a new approach has been developed to optimise both the core performance (in-cycle Pu inventory, fuel burn-up, etc.) and the core safety characteristics (safety estimators) of a Fast Neutron Reactor. This new approach, called FARM (Fast Reactor Methodology), uses analytical models and interpolations (meta-models) from CEA reference codes for neutronics, thermal-hydraulics and fuel behaviour, which are coupled to automatically design a core based on several optimization variables. This global core model is then linked to a genetic algorithm and used to explore and optimise new core designs with improved performance. Consideration has also been given to which parameters best define the core performance and how safety can be taken into account. This new approach has been used to optimize the design of three concepts of Gas-cooled Fast Reactor (GFR). For the first one, using a SiC/SiCf-cladded carbide-fuelled helium-bonded pin, the results demonstrate that the CEA reference core obtained with the traditional iterative method was an optimal core, but one among many other possibilities (that is to say, on the Pareto front). The optimization also found several other cores which exhibit some improved features at the expense of other safety or performance estimators. An evolution of this concept using a 'buffer', a new technology being developed at CEA, was subsequently introduced into FARM. The FARM optimisation produced several core designs using this technology and estimated their performance. The results obtained show that
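The Pareto-front selection at the heart of this kind of multi-objective search can be sketched in a few lines. This is a toy illustration of the idea, not the FARM code; the objective names and numbers are invented:

```python
# Hypothetical sketch of Pareto-front selection for core designs: each
# candidate is scored on several estimators (here both minimized) and
# only non-dominated designs are kept.

def pareto_front(designs, objectives):
    """Return designs not dominated by any other (all objectives minimized)."""
    front = []
    for d in designs:
        dominated = any(
            all(o(other) <= o(d) for o in objectives)
            and any(o(other) < o(d) for o in objectives)
            for other in designs
        )
        if not dominated:
            front.append(d)
    return front

# Toy candidates: (in-cycle Pu inventory [t], peak clad temperature [K])
candidates = [(5.0, 1300.0), (4.0, 1400.0), (6.0, 1250.0), (5.5, 1350.0)]
front = pareto_front(candidates,
                     objectives=[lambda d: d[0], lambda d: d[1]])
```

In a FARM-like workflow a genetic algorithm would generate the candidates from the coupled meta-models; here they are listed by hand.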

  19. Perceptions of physiotherapists towards research: a mixed methods study.

    Science.gov (United States)

    Janssen, J; Hale, L; Mirfin-Veitch, B; Harland, T

    2016-06-01

    To explore the perceptions of physiotherapists towards the use of and participation in research. Concurrent mixed methods research, combining in-depth interviews with three questionnaires (demographics, Edmonton Research Orientation Survey, visual analogue scales for confidence and motivation to participate in research). One physiotherapy department in a rehabilitation hospital, consisting of seven specialised areas. Twenty-five subjects {four men and 21 women, mean age 38 [standard deviation (SD) 11] years} who had been registered as a physiotherapist for a mean period of 15 (SD 10) years participated in this study. They were registered with the New Zealand Board of Physiotherapy, held a current practising certificate, and were working as a physiotherapist or physiotherapy/allied health manager at the hospital. The primary outcome measure was in-depth interviews and the secondary outcome measures were the three questionnaires. Physiotherapists were generally positive towards research, but struggled with the concept of research, the available literature and the time to commit to research. Individual confidence and orientation towards research seemed to influence how these barriers were perceived. This study showed that physiotherapists struggle to implement research in their daily practice and become involved in research. Changing physiotherapists' conceptions of research, making it more accessible and providing dedicated research time could facilitate increased involvement in the physiotherapy profession. Copyright © 2015 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  20. Slope failures in surface mines, methods of studying landslides

    Energy Technology Data Exchange (ETDEWEB)

    Flisiak, J; Korman, S; Mazurek, J

    1977-01-01

This paper presents a review of methods of measuring landslide fissures, displacement of ground surface points in the landslide area, and displacement of points inside the landslide. An analysis of the landslide process is given, stressing the various stages and phases of a landslide. Studies carried out by the Institute of Mining Geomechanics of the Technical University of Mining and Metallurgy in Cracow are evaluated. The studies concentrated on the final state of slopes in brown coal surface mines after a landslide occurs. The necessity of developing an apparatus for continuous recording of the displacements of points on a landslide surface is stressed. An apparatus developed by the Institute and used for continuous measuring and recording of displacements is described. The apparatus is used to measure displacements of points during the initial phase of a landslide and during the phase of the largest displacements. The principle of the system consists in locating a number of observation points on the ground and on the slope. The points are connected among themselves by flexible connectors. The connectors are equipped with potentiometric transmitters which transform the relative displacements into electric pulses. These pulses are recorded by a conventional recording apparatus. (55 refs.) (In Polish)

  1. Towards thermodynamical consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest

    2003-01-01

The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics.

  2. Toward thermodynamic consistency of quasiparticle picture

    International Nuclear Information System (INIS)

    Biro, T.S.; Toneev, V.D.; Shanenko, A.A.

    2003-01-01

    The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics

  3. A Consistent Phylogenetic Backbone for the Fungi

    Science.gov (United States)

    Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt

    2012-01-01

    The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
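The consistency criterion described above, keeping only branches (splits) recovered with independent data and different reconstruction methods, can be illustrated with a toy sketch. The taxa and splits here are invented, not taken from the study's 99 genomes:

```python
# Toy sketch of a split-consistency check: a split (one side of a
# bipartition of taxa) is accepted only if it is recovered by trees
# built from independent data sets. Splits are listed by hand here.
splits_genomes = {frozenset({"A", "B"}),
                  frozenset({"C", "D"}),
                  frozenset({"A", "B", "C"})}
splits_ests = {frozenset({"A", "B"}),
               frozenset({"C", "D"}),
               frozenset({"B", "C"})}

# Splits supported by both independent data sets
supported = splits_genomes & splits_ests
```

Real phylogenomic pipelines extract these bipartitions from Newick trees; the set intersection above is the conceptual core of the criterion.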

  4. Modeling and Testing Legacy Data Consistency Requirements

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard

    2003-01-01

An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...

  5. Comparison of the xenon-133 washout method with the microsphere method in dog brain perfusion studies

    International Nuclear Information System (INIS)

    Heikkitae, J.; Kettunen, R.; Ahonen, A.

    1982-01-01

The validity of the xenon-washout method for estimating regional cerebral blood flow was tested against a radioactive microsphere method in anaesthetized dogs. The two-compartment model seemed not to be well suited for cerebral perfusion studies by the Xe-washout method, although bi-exponential analysis of the washout curves gave perfusion values that correlated with the microsphere method but depended on the calculation method.
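A bi-exponential analysis of the kind mentioned above can be sketched as follows. This uses synthetic noiseless data and SciPy's `curve_fit`, not the paper's procedure; the rate constants are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative two-compartment washout model: the clearance curve is the
# sum of a fast and a slow exponential, and regional flow estimates
# follow from the fitted rate constants k1 and k2.
def washout(t, a1, k1, a2, k2):
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

t = np.linspace(0, 10, 200)          # minutes
true_params = (0.7, 1.2, 0.3, 0.15)  # fast and slow compartments
counts = washout(t, *true_params)    # synthetic, noiseless curve

# Fit starting from a rough initial guess (fast near 1.0, slow near 0.1)
popt, _ = curve_fit(washout, t, counts, p0=(0.5, 1.0, 0.5, 0.1))
```

With real, noisy clearance data the initial guess and the separation between the two rate constants strongly affect how reliably the two compartments can be resolved, which is one reason the paper finds the estimates to depend on the calculation method.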

  6. Descriptive study of the Socratic method: evidence for verbal shaping.

    Science.gov (United States)

    Calero-Elvira, Ana; Froján-Parga, María Xesús; Ruiz-Sancho, Elena María; Alpañés-Freitag, Manuel

    2013-12-01

    In this study we analyzed 65 fragments of session recordings in which a cognitive behavioral therapist employed the Socratic method with her patients. Specialized coding instruments were used to categorize the verbal behavior of the psychologist and the patients. First the fragments were classified as more or less successful depending on the overall degree of concordance between the patient's verbal behavior and the therapeutic objectives. Then the fragments were submitted to sequential analysis so as to discover regularities linking the patient's verbal behavior and the therapist's responses to it. Important differences between the more and the less successful fragments involved the therapist's approval or disapproval of verbalizations that approximated therapeutic goals. These approvals and disapprovals were associated with increases and decreases, respectively, in the patient's behavior. These results are consistent with the existence, in this particular case, of a process of shaping through which the therapist modifies the patient's verbal behavior in the overall direction of his or her chosen therapeutic objectives. © 2013.

  7. Do Health Systems Have Consistent Performance Across Locations and Is Consistency Associated With Higher Performance?

    Science.gov (United States)

    Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D

    This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
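The consistency measure used in the study, the standard deviation of performance across a system's clinics, can be sketched on toy data. The system names and scores below are invented:

```python
import statistics

# Minimal sketch of the consistency measure: a health system's
# consistency is the standard deviation of the diabetes-care performance
# scores of its member clinics; a lower SD means more consistent care.
scores = {
    "system_A": [0.62, 0.64, 0.63, 0.61],  # tightly clustered clinics
    "system_B": [0.45, 0.70, 0.55, 0.80],  # widely varying clinics
}
consistency = {name: statistics.stdev(vals) for name, vals in scores.items()}
```

The study compares these within-system SDs against propensity score-matched proxy systems; that matching step is beyond this sketch.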

  8. Studying the method of linearization of exponential calibration curves

    International Nuclear Information System (INIS)

    Bunzh, Z.A.

    1989-01-01

The results of a study of the method for linearization of exponential calibration curves are given. The calibration technique is described, and the proposed method is compared with piecewise-linear approximation and power-series expansion.
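The basic linearization idea, that taking logarithms turns an exponential calibration curve into a straight line, can be sketched as follows. The data are synthetic, and the paper's actual technique may differ:

```python
import numpy as np

# An exponential calibration curve y = a * exp(b * x) becomes a straight
# line in (x, ln y), so an ordinary least-squares line fit recovers both
# parameters a and b.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.5 * np.exp(0.8 * x)            # synthetic calibration points

b, ln_a = np.polyfit(x, np.log(y), 1)  # slope = b, intercept = ln a
a = np.exp(ln_a)
```

On noiseless data this recovers a = 2.5 and b = 0.8 exactly; with noisy measurements the log transform also reweights errors, which is one motivation for comparing against piecewise-linear and series-expansion alternatives.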

  9. Maintaining consistency in distributed systems

    Science.gov (United States)

    Birman, Kenneth P.

    1991-01-01

    In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
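The mutual-exclusion constructs mentioned above can be illustrated with a minimal sketch; Python's `threading.Lock` stands in here for a binary semaphore or monitor:

```python
import threading

# Concurrent increments of shared data guarded by a lock: without the
# lock, interleaved read-modify-write cycles could lose updates.
counter = 0
lock = threading.Lock()

def worker(n):
    global counter
    for _ in range(n):
        with lock:          # mutual exclusion around the critical section
            counter += 1

threads = [threading.Thread(target=worker, args=(10000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Group-oriented properties such as virtual synchrony, discussed in the record, operate at a different level (replicated state across processes) and are not captured by a single-process lock.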

  10. Complex of radioanalytical methods for radioecological study of STS

    International Nuclear Information System (INIS)

    Artemev, O.I.; Larin, V.N.; Ptitskaya, L.D.; Smagulova, G.S.

    1998-01-01

Today the main task of the Institute of Radiation Safety and Ecology is the assessment of the radioecological situation in areas of nuclear testing on the territory of the former Semipalatinsk Test Site (STS). According to the diagram below, the radioecological study begins with field radiometry and environmental sampling, followed by coordinate fixation. This work is performed by the staff of the Radioecology Laboratory, equipped with state-of-the-art dosimetry and radiometry devices. All the devices annually undergo the State Check by the RK Gosstandard Centre in Almaty. Air samples are also collected for determination of radon content. Environmental samples are measured for total gamma activity in order to dispatch and discard samples with an insufficient level of homogenization. Samples are measured with a gamma radiometry installation containing a NaI(Tl) scintillation detector. The installation background is measured many times daily, and the duration of each measurement depends on sample activity. Samples are then measured with alpha and beta radiometers for total alpha and beta activity, which characterizes the radioactive contamination of the sampling locations. Apart from the Radiometry Laboratory, the analytical complex includes the Radiochemistry and Gamma Spectrometry Laboratories. The direct gamma-spectral (instrumental) methods in most cases allow sufficiently rapid information to be obtained about the radionuclides present in a sample. The state-of-the-art equipment together with computer technology provides high quantitative and qualitative precision as well as high productivity. One advantage of the method is that samples retain their state after measurement and can be used for repeated measurements or radiochemical reanalyses. The Gamma Spectrometry Laboratory has three state-of-the-art gamma spectral installations consisting of high-resolution semiconductor detectors and equipped with

  11. Studying collaborative information seeking: Experiences with three methods

    DEFF Research Database (Denmark)

    Hyldegård, Jette Seiden; Hertzum, Morten; Hansen, Preben

    2015-01-01

, however, benefit from a discussion of methodological issues. This chapter describes the application of three methods for collecting and analyzing data in three CIS studies. The three methods are Multidimensional Exploration, used in a CIS study of students' information behavior during a group assignment; Task-structured Observation, used in a CIS study of patent engineers; and Condensed Observation, used in a CIS study of information-systems development. The three methods are presented in the context of the studies for which they were devised, and the experiences gained using the methods are discussed. The chapter shows that different methods can be used for collecting and analyzing data about CIS incidents. Two of the methods focused on tasks and events in work settings, while the third was applied in an educational setting. Commonalities and differences among the methods are discussed to inform decisions...

  12. Narrative Inquiry as Travel Study Method: Affordances and Constraints

    Science.gov (United States)

    Craig, Cheryl J.; Zou, Yali; Poimbeauf, Rita

    2014-01-01

    This article maps how narrative inquiry--the use of story to study human experience--has been employed as both method and form to capture cross-cultural learning associated with Western doctoral students' travel study to eastern destinations. While others were the first to employ this method in the travel study domain, we are the first to…

  13. The study of technological prevention method of road accident ...

    African Journals Online (AJOL)

    The study of technological prevention method of road accident related to driver and vehicle. ... road accident prevention method based on the factors studied. The study of this paper can provide forceful data analysis support for the road traffic safety related research. Keywords: road accident; accident prevention; road safety.

  14. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  15. The Use of Qualitative Case Studies as an Experiential Teaching Method in the Training of Pre-Service Teachers

    Science.gov (United States)

    Arseven, Ilhami

    2018-01-01

    This study presents the suitability of case studies, which is a qualitative research method and can be used as a teaching method in the training of pre-service teachers, for experiential learning theory. The basic view of experiential learning theory on learning and the qualitative case study paradigm are consistent with each other within the…

  16. Consistency in the World Wide Web

    DEFF Research Database (Denmark)

    Thomsen, Jakob Grauenkjær

Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.

  17. Consistent histories and operational quantum theory

    International Nuclear Information System (INIS)

    Rudolph, O.

    1996-01-01

    In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail

  18. A Simple Method for Decreasing the Liquid Junction Potential in a Flow-through-Type Differential pH Sensor Probe Consisting of pH-FETs by Exerting Spatiotemporal Control of the Liquid Junction

    Science.gov (United States)

    Yamada, Akira; Mohri, Satoshi; Nakamura, Michihiro; Naruse, Keiji

    2015-01-01

    The liquid junction potential (LJP), the phenomenon that occurs when two electrolyte solutions of different composition come into contact, prevents accurate measurements in potentiometry. The effect of the LJP is usually remarkable in measurements of diluted solutions with low buffering capacities or low ion concentrations. Our group has constructed a simple method to eliminate the LJP by exerting spatiotemporal control of a liquid junction (LJ) formed between two solutions, a sample solution and a baseline solution (BLS), in a flow-through-type differential pH sensor probe. The method was contrived based on microfluidics. The sensor probe is a differential measurement system composed of two ion-sensitive field-effect transistors (ISFETs) and one Ag/AgCl electrode. With our new method, the border region of the sample solution and BLS is vibrated in order to mix solutions and suppress the overshoot after the sample solution is suctioned into the sensor probe. Compared to the conventional method without vibration, our method shortened the settling time from over two min to 15 s and reduced the measurement error by 86% to within 0.060 pH. This new method will be useful for improving the response characteristics and decreasing the measurement error of many apparatuses that use LJs. PMID:25835300

  19. Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)

    NARCIS (Netherlands)

    Stadje, M.A.; Pelsser, A.

    2014-01-01

Abstract: We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from

  20. Comparative study of discretization methods of microarray data for inferring transcriptional regulatory networks

    Directory of Open Access Journals (Sweden)

    Ji Wei

    2010-10-01

Full Text Available Abstract Background Microarray data discretization is a basic preprocessing step for many gene regulatory network inference algorithms. Some common discretization methods from informatics are used to discretize microarray data. Selection of the discretization method is often arbitrary, and no systematic comparison of different discretization methods has been conducted in the context of gene regulatory network inference from time-series gene expression data. Results In this study, we propose a new discretization method, "bikmeans", and compare its performance with four other widely used discretization methods using different datasets, modeling algorithms and numbers of intervals. Sensitivities, specificities and total accuracies were calculated, and statistical analysis was carried out. The bikmeans method always gave high total accuracies. Conclusions Our results indicate that proper discretization methods can consistently improve gene regulatory network inference independent of network modeling algorithms and datasets. Our new method, bikmeans, resulted in significantly better total accuracies than the other methods.
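A generic k-means-based discretization of a single expression profile can be sketched as follows. This illustrates the family of methods being compared, not the paper's specific bikmeans algorithm, whose exact definition is in the article:

```python
# Discretize one gene's expression values into two levels (0 = low,
# 1 = high) with a tiny 1-D two-means clustering. Assumes the profile
# actually contains both low and high values.
def two_means_discretize(values, iters=20):
    lo, hi = min(values), max(values)  # initialize centers at the extremes
    for _ in range(iters):
        low_pts = [v for v in values if abs(v - lo) <= abs(v - hi)]
        high_pts = [v for v in values if abs(v - lo) > abs(v - hi)]
        lo = sum(low_pts) / len(low_pts)
        hi = sum(high_pts) / len(high_pts)
    return [0 if abs(v - lo) <= abs(v - hi) else 1 for v in values]

profile = [0.1, 0.2, 0.15, 2.9, 3.1, 3.0, 0.12]
levels = two_means_discretize(profile)
```

Inference algorithms then operate on the discrete levels rather than the raw intensities, which is why the choice of discretization can change the reconstructed network.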

  1. Consistent application of codes and standards

    International Nuclear Information System (INIS)

    Scott, M.A.

    1989-01-01

    The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines

  2. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
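The central point, that a consistent nonparametric regression machine fitted to a 0/1 response estimates P(Y=1 | x), can be sketched with scikit-learn. The authors provide sample code in R packages; this is an independent illustration on synthetic data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# A regression forest fitted to a binary (0/1) response approximates the
# conditional probability P(Y=1 | X); here the true probability is
# P(Y=1 | X=x) = x, so predictions should track x itself.
rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 1))
p_true = X[:, 0]
y = (rng.uniform(size=2000) < p_true).astype(float)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
p_hat = rf.predict(np.array([[0.2], [0.8]]))  # estimated probabilities
```

Because the forest averages 0/1 outcomes over local neighborhoods, its output is a probability estimate rather than a hard classification, which is exactly the "probability machine" behavior the paper describes.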

  3. Self-consistent asset pricing models

    Science.gov (United States)

    Malevergne, Y.; Sornette, D.

    2007-08-01

    We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
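The regression discussed above, an asset's return on a proxy portfolio's return yielding an intercept (alpha) and slope (beta), can be sketched on synthetic data; this is not the S&P 500 sample used in the article:

```python
import numpy as np

# OLS regression of a single asset's returns on a proxy portfolio's
# returns. With a proxy different from the true market portfolio, the
# article's point is that a non-zero intercept (alpha) appears; here the
# alpha and beta are planted in the synthetic data.
rng = np.random.default_rng(42)
r_proxy = rng.normal(0.0, 0.02, size=5000)
r_asset = 0.001 + 1.3 * r_proxy + rng.normal(0.0, 0.005, size=5000)

X = np.column_stack([np.ones_like(r_proxy), r_proxy])  # [intercept, slope]
alpha, beta = np.linalg.lstsq(X, r_asset, rcond=None)[0]
```

The self-consistency diagnostics in the article go further, checking orthogonality and normality conditions that link the alphas, betas, residuals and proxy weights across all assets simultaneously.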

  4. Self-consistent velocity dependent effective interactions

    International Nuclear Information System (INIS)

    Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.

    1993-09-01

    The field coupling method is extended to a system with a velocity-dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity-dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of ¹⁴⁸,¹⁵⁴Sm, of the first excited 2⁺ states of Sn isotopes, and of the first excited 3⁻ states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy-weighted sum rule values, and in reducing B(Eλ) values. (author)

  5. Evidence for Consistency of the Glycation Gap in Diabetes

    OpenAIRE

    Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.

    2011-01-01

    OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...

  6. Development of a constraint algorithm for the number of electrons in molecular orbitals consisting mainly of 4f atomic orbitals of rare-earth elements and its introduction to a tight-binding quantum chemical molecular dynamics method

    International Nuclear Information System (INIS)

    Endou, Akira; Onuma, Hiroaki; Jung, Sun-ho

    2007-01-01

    Our original tight-binding quantum chemical molecular dynamics code, Colors, has been successfully applied to the theoretical investigation of complex materials including rare-earth elements, e.g., metal catalysts supported on a CeO₂ surface. To expand our code so as to obtain good convergence for the electronic structure of a calculation system including a rare-earth element, we developed a novel algorithm that provides a constraint condition on the number of electrons occupying the selected molecular orbitals that consist mainly of 4f atomic orbitals of the rare-earth element. This novel algorithm was introduced into Colors. Using Colors, we succeeded in obtaining the classified electronic configurations of the 4f atomic orbitals of Ce⁴⁺ and reduced Ce ions in a CeO₂ bulk model with one oxygen defect, a system for which it is difficult to obtain good convergence using a conventional first-principles quantum chemical calculation code. (author)

  7. Electromagnetic computation methods for lightning surge protection studies

    CERN Document Server

    Baba, Yoshihiro

    2016-01-01

    This book is the first to consolidate current research and to examine the theories of electromagnetic computation methods in relation to lightning surge protection. The authors introduce and compare existing electromagnetic computation methods such as the method of moments (MOM), the partial element equivalent circuit (PEEC), the finite element method (FEM), the transmission-line modeling (TLM) method, and the finite-difference time-domain (FDTD) method. The application of FDTD method to lightning protection studies is a topic that has matured through many practical applications in the past decade, and the authors explain the derivation of Maxwell's equations required by the FDTD, and modeling of various electrical components needed in computing lightning electromagnetic fields and surges with the FDTD method. The book describes the application of FDTD method to current and emerging problems of lightning surge protection of continuously more complex installations, particularly in critical infrastructures of e...

  8. The Consistency Between Clinical and Electrophysiological Diagnoses

    Directory of Open Access Journals (Sweden)

    Esra E. Okuyucu

    2009-09-01

    OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedists, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients had findings related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the agreement between referral and electrophysiological diagnoses across the referring clinics (p = 0.794), but there were statistically significant differences in the degree to which ENMG test results supported different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG

  9. Comparative study of the geostatistical ore reserve estimation method over the conventional methods

    International Nuclear Information System (INIS)

    Kim, Y.C.; Knudsen, H.P.

    1975-01-01

    Part I contains a comprehensive treatment of the comparative study of the geostatistical ore reserve estimation method over the conventional methods. The conventional methods chosen for comparison were: (a) the polygon method, (b) the inverse of the distance squared method, and (c) a method similar to (b) but allowing different weights in different directions. Briefly, the overall result from this comparative study is in favor of the use of geostatistics in most cases because the method has lived up to its theoretical claims. A good exposition on the theory of geostatistics, the adopted study procedures, conclusions and recommended future research are given in Part I. Part II of this report contains the results of the second and the third study objectives, which are to assess the potential benefits that can be derived by the introduction of the geostatistical method to the current state-of-the-art in uranium reserve estimation method and to be instrumental in generating the acceptance of the new method by practitioners through illustrative examples, assuming its superiority and practicality. These are given in the form of illustrative examples on the use of geostatistics and the accompanying computer program user's guide
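For readers unfamiliar with the conventional estimators the study compares against, the "inverse of the distance squared" method can be sketched in a few lines. The coordinates, grades, and helper name below are hypothetical illustrations; a real reserve estimation would add search neighbourhoods, anisotropic weighting, and block discretization.

```python
import numpy as np

def idw_estimate(sample_xy, sample_grade, block_xy, power=2.0):
    """Inverse-distance-weighted grade estimate at one block centre.

    Hypothetical helper illustrating the conventional method (b) of the
    study: each sample is weighted by 1 / distance**power.
    """
    d = np.linalg.norm(sample_xy - block_xy, axis=1)
    if np.any(d == 0):                      # a sample sits exactly on the block centre
        return float(sample_grade[d == 0].mean())
    weights = 1.0 / d ** power
    return float(weights @ sample_grade / weights.sum())

# Three drill-hole samples (coordinates in metres, grades illustrative)
samples = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
grades = np.array([0.30, 0.10, 0.20])

est = idw_estimate(samples, grades, np.array([2.0, 2.0]))
print(est)   # dominated by the nearest (highest-grade) sample
```

Kriging differs from this scheme in that the weights come from a fitted variogram model rather than a fixed distance power, which is what gives geostatistics its theoretical advantage.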

  10. Is the molecular statics method suitable for the study of nanomaterials? A study case of nanowires

    International Nuclear Information System (INIS)

    Chang, I-L; Chen, Y-C

    2007-01-01

    Both molecular statics and molecular dynamics methods were employed to study the mechanical properties of copper nanowires. The size effect on both elastic and plastic properties of square cross-sectional nanowire was examined and compared systematically using two molecular approaches. It was found consistently from both molecular methods that the elastic and plastic properties of nanowires depend on the lateral size of nanowires. As the lateral size of nanowires decreases, the values of Young's modulus decrease and dislocation nucleation stresses increase. However, it was shown that the dislocation nucleation stress would be significantly influenced by the axial periodic length of the nanowire model using the molecular statics method while molecular dynamics simulations at two distinct temperatures (0.01 and 300 K) did not show the same dependence. It was concluded that molecular statics as an energy minimization numerical scheme is quite insensitive to the instability of atomic structure especially without thermal fluctuation and might not be a suitable tool for studying the behaviour of nanomaterials beyond the elastic limit

  11. Self-consistent modelling of resonant tunnelling structures

    DEFF Research Database (Denmark)

    Fiig, T.; Jauho, A.P.

    1992-01-01

    We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges......, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes....

  12. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...

  13. Using the Case Study Method in Teaching College Physics

    Science.gov (United States)

    Burko, Lior M.

    2016-01-01

    The case study teaching method has a long history (starting at least with Socrates) and wide current use in business schools, medical schools, law schools, and a variety of other disciplines. However, relatively little use is made of it in the physical sciences, specifically in physics or astronomy. The case study method should be considered by…

  14. Japan Diabetic Nephropathy Cohort Study: study design, methods, and implementation.

    Science.gov (United States)

    Furuichi, Kengo; Shimizu, Miho; Toyama, Tadashi; Koya, Daisuke; Koshino, Yoshitaka; Abe, Hideharu; Mori, Kiyoshi; Satoh, Hiroaki; Imanishi, Masahito; Iwano, Masayuki; Yamauchi, Hiroyuki; Kusano, Eiji; Fujimoto, Shouichi; Suzuki, Yoshiki; Okuda, Seiya; Kitagawa, Kiyoki; Iwata, Yasunori; Kaneko, Shuichi; Nishi, Shinichi; Yokoyama, Hitoshi; Ueda, Yoshihiko; Haneda, Masakazu; Makino, Hirofumi; Wada, Takashi

    2013-12-01

    Diabetic nephropathy, leading to end-stage renal disease, has a considerable impact on public health and the social economy. However, there are few national registries of diabetic nephropathy in Japan. The aims of this prospective cohort study are to obtain clinical data and urine samples for revising the clinical staging of diabetic nephropathy, and developing new diagnostic markers for early diabetic nephropathy. The Japanese Society of Nephrology established a nationwide, web-based, and prospective registry system. On the system, there are two basic registries: the Japan Renal Biopsy Registry (JRBR) and the Japan Kidney Disease Registry (JKDR). In addition to the two basic registries, we established a new prospective registry on the system: the Japan Diabetic Nephropathy Cohort Study (JDNCS), which collected physical and laboratory data. We analyzed the data of 321 participants (106 female, 215 male; average age 65 years) in the JDNCS. Systolic and diastolic blood pressures were 130.1 and 72.3 mmHg, respectively. Median estimated glomerular filtration rate (eGFR) was 33.3 ml/min/1.73 m². Proteinuria was 1.8 g/gCr, and serum levels of albumin were 3.6 g/dl. The majority of the JDNCS patients presented with preserved eGFR and low albuminuria or low eGFR and advanced proteinuria. In the JRBR and JKDR registries, 484 and 125 participants, respectively, were enrolled as having diabetes mellitus. In comparison with the JRBR and JKDR registries, the JDNCS was characterized by diabetic patients presenting with low proteinuria and moderately preserved eGFR. There are few national registries of diabetic nephropathy to evaluate prognosis in Japan. Future analysis of the JDNCS will provide clinical insights into the epidemiology and renal and cardiovascular outcomes of type 2 diabetic patients in Japan.

  15. Study of inverse methods in remote sensing with laser

    International Nuclear Information System (INIS)

    Jesus, Wellington Carlos de

    2009-01-01

    The Laboratory of Environmental Applications of Lasers at IPEN carries out studies of atmospheric properties, such as the extinction and backscattering coefficients. These coefficients are estimated by an inverse method, the quality of whose estimates is difficult to measure. This work presents a method with a sound statistical basis to retrieve the same coefficients. The new method offers a number of advantages over the method currently in use, including (1) the ability to incorporate different kinds of information under a common retrieval philosophy and (2) several ways of evaluating the quality of the retrieval. We therefore hope to improve the accuracy of the estimates. (author)

  16. DIETFITS Study (Diet Intervention Examining The Factors Interacting with Treatment Success) – Study Design and Methods

    Science.gov (United States)

    Stanton, Michael; Robinson, Jennifer; Kirkpatrick, Susan; Farzinkhou, Sarah; Avery, Erin; Rigdon, Joseph; Offringa, Lisa; Trepanowski, John; Hauser, Michelle; Hartle, Jennifer; Cherin, Rise; King, Abby C.; Ioannidis, John P.A.; Desai, Manisha; Gardner, Christopher D.

    2017-01-01

    Numerous studies have attempted to identify successful dietary strategies for weight loss, and many have focused on Low-Fat vs. Low-Carbohydrate comparisons. Despite relatively small between-group differences in weight loss found in most previous studies, researchers have consistently observed relatively large between-subject differences in weight loss within any given diet group (e.g., ~25 kg weight loss to ~5 kg weight gain). The primary objective of this study was to identify predisposing individual factors at baseline that help explain differential weight loss achieved by individuals assigned to the same diet, particularly a pre-determined multi-locus genotype pattern and insulin resistance status. Secondary objectives included discovery strategies for further identifying potential genetic risk scores. Exploratory objectives included investigation of an extensive set of physiological, psychosocial, dietary, and behavioral variables as moderating and/or mediating variables and/or secondary outcomes. The target population was generally healthy, free-living adults with BMI 28-40 kg/m2 (n=600). The intervention consisted of a 12-month protocol of 22 one-hour evening instructional sessions led by registered dietitians, with ~15-20 participants/class. Key objectives of dietary instruction included focusing on maximizing the dietary quality of both Low-Fat and Low-Carbohydrate diets (i.e., Healthy Low-Fat vs. Healthy Low-Carbohydrate), and maximally differentiating the two diets from one another. Rather than seeking to determine if one dietary approach was better than the other for the general population, this study sought to examine whether greater overall weight loss success could be achieved by matching different people to different diets. Here we present the design and methods of the study. PMID:28027950

  17. DIETFITS study (diet intervention examining the factors interacting with treatment success) - Study design and methods.

    Science.gov (United States)

    Stanton, Michael V; Robinson, Jennifer L; Kirkpatrick, Susan M; Farzinkhou, Sarah; Avery, Erin C; Rigdon, Joseph; Offringa, Lisa C; Trepanowski, John F; Hauser, Michelle E; Hartle, Jennifer C; Cherin, Rise J; King, Abby C; Ioannidis, John P A; Desai, Manisha; Gardner, Christopher D

    2017-02-01

    Numerous studies have attempted to identify successful dietary strategies for weight loss, and many have focused on Low-Fat vs. Low-Carbohydrate comparisons. Despite relatively small between-group differences in weight loss found in most previous studies, researchers have consistently observed relatively large between-subject differences in weight loss within any given diet group (e.g., ~25 kg weight loss to ~5 kg weight gain). The primary objective of this study was to identify predisposing individual factors at baseline that help explain differential weight loss achieved by individuals assigned to the same diet, particularly a pre-determined multi-locus genotype pattern and insulin resistance status. Secondary objectives included discovery strategies for further identifying potential genetic risk scores. Exploratory objectives included investigation of an extensive set of physiological, psychosocial, dietary, and behavioral variables as moderating and/or mediating variables and/or secondary outcomes. The target population was generally healthy, free-living adults with BMI 28-40 kg/m² (n=600). The intervention consisted of a 12-month protocol of 22 one-hour evening instructional sessions led by registered dietitians, with ~15-20 participants/class. Key objectives of dietary instruction included focusing on maximizing the dietary quality of both Low-Fat and Low-Carbohydrate diets (i.e., Healthy Low-Fat vs. Healthy Low-Carbohydrate), and maximally differentiating the two diets from one another. Rather than seeking to determine if one dietary approach was better than the other for the general population, this study sought to examine whether greater overall weight loss success could be achieved by matching different people to different diets. Here we present the design and methods of the study. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Consistent Valuation across Curves Using Pricing Kernels

    Directory of Open Access Journals (Sweden)

    Andrea Macrina

    2018-03-01

    The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised, and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

  19. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children=s Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  20. Toward a consistent RHA-RPA

    International Nuclear Information System (INIS)

    Shepard, J.R.

    1991-01-01

    The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data

  1. Comparison study on cell calculation method of fast reactor

    International Nuclear Information System (INIS)

    Chiba, Gou

    2002-10-01

    Effective cross sections obtained by cell calculations are used in core calculations in current deterministic methods. It is therefore important to calculate the effective cross sections accurately, and several methods have been proposed. In this study, some of these methods are compared with each other using a continuous-energy Monte Carlo method as a reference. The results show that the table look-up method used at the Japan Nuclear Cycle Development Institute (JNC) sometimes differs by over 10% in effective microscopic cross sections and is inferior to the sub-group method. The problem was overcome by introducing a new nuclear constant system developed at JNC, in which an ultra-fine energy group library is used. The system can also deal with resonance interaction effects between nuclides, which the other methods are not able to consider. In addition, a new method was proposed to calculate effective cross sections accurately for power reactor fuel subassemblies, to which the new nuclear constant system cannot be applied. This method uses the sub-group method and the ultra-fine energy group collision probability method. The microscopic effective cross sections obtained by this method agree with the reference values to within 5%. (author)

  2. A Study of Effectiveness of Rational Emotive Behavior Therapy (REBT) with Group Method on Decrease of Stress among Diabetic Patients

    OpenAIRE

    Kianoush Zahrakar

    2012-01-01

    Introduction: The purpose of the present research was to study the effectiveness of Rational Emotive Behavior Therapy (REBT) with a group method in decreasing stress among diabetic patients. Methods: The population of the research consisted of all diabetic patients who are members of the diabetic patients' association of Karaj city. The sample consisted of 30 diabetic patients (15 in the experimental group and 15 in the control group) selected through random sampling. Research design was experiment...

  3. Studies of the Raman Spectra of Cyclic and Acyclic Molecules: Combination and Prediction Spectrum Methods

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Taijin; Assary, Rajeev S.; Marshall, Christopher L.; Gosztola, David J.; Curtiss, Larry A.; Stair, Peter C.

    2012-04-02

    A combination of Raman spectroscopy and density functional methods was employed to investigate the spectral features of selected molecules: furfural, 5-hydroxymethyl furfural (HMF), methanol, acetone, acetic acid, and levulinic acid. The computed and measured spectra are in excellent agreement, consistent with previous studies. Using the combination and prediction spectrum method (CPSM), we were able to predict the important spectral features of two platform chemicals, HMF and levulinic acid. The results show that CPSM is a useful alternative method for predicting vibrational spectra of complex molecules in the biomass transformation process.

  4. Study of test methods for radionuclide migration in aerated zone

    International Nuclear Information System (INIS)

    Li Shushen; Guo Zede; Wang Zhiming

    1993-01-01

    The aerated zone is an important natural barrier against the transport of radionuclides released from LLRW disposal facilities. This paper introduces study methods for radionuclide migration in the aerated zone, including determination of water movement, laboratory simulation tests, and field tracing tests. Results obtained with the different methods are compared. These methods have been used in a five-year cooperative research project between CIRP and JAERI for the establishment of a methodology for safety assessment of shallow land disposal of LLRW

  5. Maxillary sinusitis - a comparative study of different imaging diagnosis methods

    International Nuclear Information System (INIS)

    Hueb, Marcelo Miguel; Borges, Fabiano de Almeida; Pulcinelli, Emilte; Souza, Wandir Ferreira; Borges, Luiz Marcondes

    1999-01-01

    We conducted a prospective study comparing different methods (plain X-rays, computed tomography, and mode-A ultrasonography) for the initial diagnosis of maxillary sinusitis. Twenty patients (40 maxillary sinuses) with a clinical history suggestive of sinusitis were included in this study. The results were classified as abnormal or normal, using computed tomography as the gold standard. The sensitivity of ultrasonography and plain X-rays was 84.6% and 69.2%, respectively. The specificity of both methods was 92.6%. This study suggests that ultrasonography can be used as a good follow-up method for patients with maxillary sinusitis. (author)

  6. Current Mathematical Methods Used in QSAR/QSPR Studies

    Directory of Open Access Journals (Sweden)

    Peixun Liu

    2009-04-01

    This paper gives an overview of the mathematical methods currently used in quantitative structure-activity/property relationship (QSAR/QSPR) studies. Recently, the mathematical methods applied to the regression of QSAR/QSPR models have been developing very fast, and new methods, such as Gene Expression Programming (GEP), Projection Pursuit Regression (PPR) and Local Lazy Regression (LLR), have appeared on the QSAR/QSPR stage. At the same time, the earlier methods, including Multiple Linear Regression (MLR), Partial Least Squares (PLS), Neural Networks (NN), Support Vector Machine (SVM) and so on, are being upgraded to improve their performance in QSAR/QSPR studies. These new and upgraded methods and algorithms are described in detail, and their advantages and disadvantages are evaluated and discussed, to show their application potential in QSAR/QSPR studies in the future.
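As a point of reference for the earliest of the surveyed methods, an MLR fit by ordinary least squares can be sketched in a few lines. The descriptors, coefficients, and data below are invented purely for illustration; real QSAR work uses curated molecular descriptors, cross-validation, and external test sets.

```python
import numpy as np

# Toy MLR-style QSAR sketch: activity modelled as a linear function of
# three descriptors (think logP, molar mass, polar surface area, scaled).
rng = np.random.default_rng(1)
n = 40
X = rng.normal(size=(n, 3))                        # descriptor matrix
true_coef = np.array([1.5, -0.7, 0.3])             # invented "true" relationship
y = X @ true_coef + 2.0 + rng.normal(0, 0.05, n)   # activity with small noise

A = np.column_stack([np.ones(n), X])               # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)       # ordinary least squares fit

y_hat = A @ coef
r2 = 1 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(coef.round(2), round(r2, 3))
```

The newer methods in the review (PLS, SVM, NN, GEP, PPR, LLR) exist largely because this linear, globally fitted form breaks down when descriptors are collinear or the structure-activity relationship is nonlinear.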

  7. Mixed methods research design for pragmatic psychoanalytic studies.

    Science.gov (United States)

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  8. Theory, Method, and Triangulation in the Study of Street Children.

    Science.gov (United States)

    Lucchini, Riccardo

    1996-01-01

    Describes how a comparative study of street children in Montevideo (Uruguay), Rio de Janeiro, and Mexico City contributes to a synergism between theory and method. Notes how theoretical approaches of symbolic interactionism, genetic structuralism, and habitus theory complement interview, participant observation, and content analysis methods;…

  9. Soybean allergen detection methods--a comparison study

    DEFF Research Database (Denmark)

    Pedersen, M. Højgaard; Holzhauser, T.; Bisson, C.

    2008-01-01

    Soybean containing products are widely consumed, thus reliable methods for detection of soy in foods are needed in order to make appropriate risk assessment studies to adequately protect soy allergic patients. Six methods were compared using eight food products with a declared content of soy...

  10. Theoretical prediction of the band offsets at the ZnO/anatase TiO₂ and GaN/ZnO heterojunctions using the self-consistent ab initio DFT/GGA-1/2 method

    Energy Technology Data Exchange (ETDEWEB)

    Fang, D. Q., E-mail: fangdqphy@mail.xjtu.edu.cn; Zhang, S. L. [MOE Key Laboratory for Nonequilibrium Synthesis and Modulation of Condensed Matter, School of Science, Xi’an Jiaotong University, Xi’an 710049 (China)

    2016-01-07

    The band offsets of the ZnO/anatase TiO₂ and GaN/ZnO heterojunctions are calculated using the density functional theory/generalized gradient approximation (DFT/GGA)-1/2 method, which takes into account the self-energy corrections and can give an approximate description of the quasiparticle characteristics of the electronic structure of semiconductors. We present the results of the ionization potential (IP)-based and interfacial offset-based band alignments. In the interfacial offset-based band alignment, to get the natural band offset, we use surface calculations to estimate the change of reference level due to the interfacial strain. Based on the interface models and GGA-1/2 calculations, we find that the valence band maximum and conduction band minimum of ZnO lie 0.64 eV and 0.57 eV, respectively, above those of anatase TiO₂, and 0.84 eV and 1.09 eV below those of GaN, which agrees well with the experimental data. However, a large discrepancy exists between the IP-based band offset and the calculated natural band offset, the mechanism of which is discussed. Our results clarify the band alignment of the ZnO/anatase TiO₂ heterojunction and show good agreement with the GW calculations for the GaN/ZnO heterojunction.

  11. Design study of fuel circulating system using Pd-alloy membrane isotope separation method

    International Nuclear Information System (INIS)

    Naito, T.; Yamada, T.; Yamanaka, T.; Aizawa, T.; Kasahara, T.; Nishikawa, M.; Asami, N.

    1980-01-01

    A design study of the fuel circulating system (FCS) for a tokamak experimental fusion reactor (JXFR) has been carried out to establish the system concept, to plan the development program, and to evaluate the feasibility of the diffusion system. The FCS consists of the main vacuum system, fuel gas refiners, isotope separators, fuel feeders, and auxiliary systems. In the system design, the Pd-alloy membrane permeation method is adopted for fuel refining and isotope separation. All impurities are effectively removed and hydrogen isotopes are sufficiently separated by the Pd-alloy membrane. The isotope separation system consists of a 1st cascade (47 separators) for removing protium and a 2nd cascade (46 separators) for separating deuterium. In the FCS, while the cryogenic distillation method appears to be practicable, the Pd-alloy membrane diffusion method is attractive for isotope separation and refining of the fuel gas. The choice will have to be based on reliability, economic, and safety analyses

  12. Methods for analysing cardiovascular studies with repeated measures

    NARCIS (Netherlands)

    Cleophas, T. J.; Zwinderman, A. H.; van Ouwerkerk, B. M.

    2009-01-01

    Background. Repeated measurements in a single subject are generally more similar than unrepeated measurements in different subjects. Unrepeated analyses of repeated data cause underestimation of the treatment effects. Objective. To review methods adequate for the analysis of cardiovascular studies
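    The point that within-subject measurements are more similar than between-subject ones can be illustrated by comparing paired and unpaired standard errors. A minimal sketch with made-up data (not from the review):

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    """Sample variance (n - 1 denominator)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Hypothetical before/after measurements on the same five subjects.
before = [140.0, 152.0, 135.0, 160.0, 148.0]
after  = [134.0, 147.0, 130.0, 153.0, 143.0]
n = len(before)

# Unpaired ("unrepeated") analysis treats the two samples as independent:
se_unpaired = math.sqrt(var(before) / n + var(after) / n)

# Paired analysis works on within-subject differences, which removes the
# large shared between-subject variability:
diffs = [b - a for b, a in zip(before, after)]
se_paired = math.sqrt(var(diffs) / n)

print(se_paired < se_unpaired)  # → True: the paired SE is far smaller
```

With these numbers the paired standard error is roughly fifteen times smaller than the unpaired one, which is exactly why ignoring the repeated structure underestimates treatment effects.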

  13. Method to deterministically study photonic nanostructures in different experimental instruments

    NARCIS (Netherlands)

    Husken, B.H.; Woldering, L.A.; Blum, Christian; Tjerkstra, R.W.; Vos, Willem L.

    2009-01-01

    We describe an experimental method to recover a single, deterministically fabricated nanostructure in various experimental instruments without the use of artificially fabricated markers, with the aim to study photonic structures. Therefore, a detailed map of the spatial surroundings of the

  14. [Pituitary function of dysgenesic female rats. Studies with the grafting method].

    Science.gov (United States)

    Vanhems, E; Busquet, J

    1975-01-01

    Misulban administered to pregnant rats on the 15th day of gestation provoked gonadal dysgenesia in the offspring. A study of the pituitary function of dysgenesic female rats, performed by the grafting method, showed gonadotrophic hypersecretion.

  15. Household energy studies: the gap between theory and method

    Energy Technology Data Exchange (ETDEWEB)

    Crosbie, T.

    2006-09-15

    At the level of theory it is now widely accepted that energy consumption patterns are a complex technical and socio-cultural phenomenon and to understand this phenomenon, it must be viewed from both engineering and social science perspectives. However, the methodological approaches taken in household energy studies lag behind the theoretical advances made in the last ten or fifteen years. The quantitative research methods traditionally used within the fields of building science, economics, and psychology continue to dominate household energy studies, while the qualitative ethnographic approaches to examining social and cultural phenomena traditionally used within anthropology and sociology are most frequently overlooked. This paper offers a critical review of the research methods used in household energy studies which illustrates the scope and limitations of both qualitative and quantitative research methods in this area of study. In doing so it demonstrates that qualitative research methods are essential to designing effective energy efficiency interventions. [Author].

  16. Self-consistent gravitational self-force

    International Nuclear Information System (INIS)

    Pound, Adam

    2010-01-01

    I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.
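    The contrast between the two expansions can be written schematically (the notation below is assumed for illustration, not quoted from the paper). In the self-consistent outer expansion the perturbation coefficients are functionals of the worldline, which is held fixed rather than expanded:

```latex
% Self-consistent (outer) expansion: the worldline z is NOT expanded;
% each coefficient is a functional of z.
g_{\mu\nu} = g^{(0)}_{\mu\nu}
           + \epsilon\, h^{(1)}_{\mu\nu}[z]
           + \epsilon^{2}\, h^{(2)}_{\mu\nu}[z] + \cdots

% Regular expansion: both the metric and the worldline are expanded
% about a zeroth-order (background geodesic) worldline z_{(0)}.
z^{\mu} = z^{\mu}_{(0)} + \epsilon\, z^{\mu}_{(1)} + \cdots
```

Keeping the worldline unexpanded is what allows the approximation to remain uniform on long time scales: secular drift of the orbit is absorbed into \(z\) instead of growing without bound in the perturbation coefficients.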

  17. The use of radionuclide skeleton visualization method in hygienic studies

    International Nuclear Information System (INIS)

    Likutova, I.V.; Bobkova, T.E.; Belova, E.A.; Bogomazov, M.Ya.

    1984-01-01

    The inhalation, intragastric, and combined effects of two cadmium compounds on rats are studied. Investigations were performed by biochemical methods and by the method of radionuclide visualization of the skeleton, carried out delta hours after RPP introduction in a gamma camera with computer recording of the image for subsequent mathematical processing. Using the method of radionuclide skeleton visualization, pronounced quantitative characteristics of the changes in bone tissue were obtained; the dose dependence of these changes was found to be especially important when estimating the combined effect. Biochemical methods were also used to detect alterations; however, these have not been assessed quantitatively.

  18. Radiochemical studies of some preparation methods for phosphorus

    International Nuclear Information System (INIS)

    Loos-Neskovic, C.; Fedoroff, M.

    1983-01-01

    Various methods of radiochemical separation were tested for the determination of phosphorus in metals and alloys by neutron activation analysis. Classical separation methods revealed shortcomings when applied to this problem. Methods using liquid extraction gave low yields and were not reproducible. Methods based on precipitation gave better results but in most cases were not selective enough. Retention on alumina was not possible without preliminary separations. The authors studied a new radiochemical separation based on the extraction of elemental phosphorus in the gaseous phase after reduction at high temperature with carbon. Measurements with radioactive phosphorus showed that the extraction yield is better than 99%. (author)

  19. Poisson solvers for self-consistent multi-particle simulations

    International Nuclear Information System (INIS)

    Qiang, J; Paret, S

    2014-01-01

    Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space charge effects in high-intensity beams. The Poisson equation has to be solved at each time step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of those numerical methods is O(N log(N)) or O(N) instead of O(N²), where N is the total number of grid points used to solve the Poisson equation.
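    A standard O(N log N) approach of the kind reviewed is an FFT-based spectral solver. A minimal 1D periodic sketch (production beam codes solve the 2D/3D problem, often with open boundaries; this only illustrates the complexity argument):

```python
import numpy as np

def poisson_periodic_1d(rho, length=2 * np.pi):
    """Solve phi'' = -rho on a periodic 1D grid via FFT: O(N log N).

    In Fourier space the equation becomes -k^2 phi_hat = -rho_hat,
    so each mode is solved with a single division. The k = 0 (mean)
    mode is set to zero, which requires rho to have zero mean.
    """
    n = len(rho)
    rho_hat = np.fft.fft(rho)
    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)  # angular wavenumbers
    phi_hat = np.zeros_like(rho_hat)
    nz = k != 0
    phi_hat[nz] = rho_hat[nz] / k[nz] ** 2
    return np.fft.ifft(phi_hat).real

# Verify against a known mode: rho = cos(x) has the solution phi = cos(x).
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
phi = poisson_periodic_1d(np.cos(x))
print(np.max(np.abs(phi - np.cos(x))))  # spectrally accurate, ~1e-16
```

The two FFTs cost O(N log N) and the per-mode division costs O(N), versus the O(N²) of naive dense solves, which is the gap the abstract refers to.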

  20. Parallelization methods study of thermal-hydraulics codes

    International Nuclear Information System (INIS)

    Gaudart, Catherine

    2000-01-01

    The variety of parallelization methods and machines leaves programmers with a wide range of choices. In this study we suggest, in an industrial context, some solutions drawn from experience acquired with different parallelization methods. The study concerns several scientific codes simulating a large variety of thermal-hydraulics phenomena. A survey of parallelization methods and a first analysis of the codes showed the difficulty of applying our process to the applications as a whole. It was therefore necessary to identify and extract a representative part of these applications and parallelization methods, and the linear solver part of the codes was the natural choice; several parallelization methods had already been applied to this part. From these developments one can estimate the work required for a novice programmer to parallelize an application, and the impact of the development constraints. The parallelization methods tested are the numerical library PETSc, the parallelizer PAF, the language HPF, the PEI formalism, and the communication libraries MPI and PVM. In order to test several methods on different applications while minimizing the modifications to the codes, a tool called SPS (Server of Parallel Solvers) was developed. We describe the constraints on code optimization in an industrial context, present the solutions provided by the SPS tool, show the development of the linear solver part with the tested parallelization methods, and finally compare the results against the imposed criteria. (author)
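    None of the abstract's tools (PETSc, PAF, HPF, PEI, MPI, PVM) are shown here; as a minimal illustration of the kind of refactoring a parallel linear solver involves, here is a Jacobi iteration whose matrix-vector product is split into row blocks handled by a worker pool from Python's standard library (thread-based; an MPI- or PVM-style version would distribute the same row blocks across nodes):

```python
from concurrent.futures import ThreadPoolExecutor

def _rows_times_x(block, x):
    """One worker's share of the product: a block of rows times x."""
    return [sum(r_j * x_j for r_j, x_j in zip(row, x)) for row in block]

def parallel_jacobi(a, b, workers=2, iters=200):
    """Jacobi iteration x <- D^-1 (b - R x), with the R x product
    partitioned into contiguous row blocks across a worker pool."""
    n = len(a)
    d = [a[i][i] for i in range(n)]                      # diagonal D
    r = [[v if i != j else 0.0 for j, v in enumerate(row)]
         for i, row in enumerate(a)]                     # off-diagonal R
    step = (n + workers - 1) // workers
    blocks = [r[k:k + step] for k in range(0, n, step)]  # row partition
    x = [0.0] * n
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for _ in range(iters):
            parts = pool.map(_rows_times_x, blocks, [x] * len(blocks))
            rx = [v for part in parts for v in part]     # reassemble R x
            x = [(bi - ri) / di for bi, ri, di in zip(b, rx, d)]
    return x

# Diagonally dominant test system with exact solution [1, 2, 3].
a = [[4.0, 1.0, 0.0],
     [1.0, 5.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [6.0, 14.0, 14.0]
print(parallel_jacobi(a, b))
```

The point of the sketch is structural: only the matrix-vector product needed to change to become parallel, which is why the study isolated the linear solver part rather than reworking the applications as a whole.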