WorldWideScience

Sample records for standard candle analysis

  1. When a Standard Candle Flickers

    DEFF Research Database (Denmark)

    Wilson-Hodge, Colleen A; Cherry, Michael L; Case, Gary L

    2011-01-01

    The Crab Nebula is the only hard X-ray source in the sky that is both bright enough and steady enough to be easily used as a standard candle. As a result, it has been used as a normalization standard by most X-ray/gamma-ray telescopes. Although small-scale variations in the nebula are well known, since the start of science operations of the Fermi Gamma-ray Burst Monitor (GBM) in 2008 August, a ~7% (70 mCrab) decline has been observed in the overall Crab Nebula flux in the 15-50 keV band, measured with the Earth occultation technique. This decline is independently confirmed in the ~15-50 keV band with Swift/BAT, RXTE/PCA, and INTEGRAL/IBIS, and a similar decline is observed in the 50-100 keV band with GBM, Swift/BAT, and INTEGRAL/IBIS. The pulsed flux measured with RXTE/PCA since 1999 is consistent with the pulsar spin-down, indicating that the observed changes are nebular. Correlated variations in the Crab Nebula flux on a ~3 year timescale are also seen independently...

  2. When A Standard Candle Flickers

    Science.gov (United States)

    Wilson-Hodge, Colleen A.; Cherry, Michael L.; Case, Gary L.; Baumgartner, Wayne H.; Beklen, Elif; Bhat, P. Narayana; Briggs, Michael S.; Camero-Arranz, Ascension; Chaplin, Vandiver; Connaughton, Valerie; et al.

    2011-01-01

    The Crab Nebula is the only hard X-ray source in the sky that is both bright enough and steady enough to be easily used as a standard candle. As a result, it has been used as a normalization standard by most X-ray/gamma ray telescopes. Although small-scale variations in the nebula are well-known, since the start of science operations of the Fermi Gamma-ray Burst Monitor (GBM) in August 2008 a 7% (70 mcrab) decline has been observed in the overall Crab Nebula flux in the 15-50 keV band, measured with the Earth occultation technique. This decline is independently confirmed in the 15-50 keV band with three other instruments: the Swift Burst Alert Telescope (Swift/BAT), the Rossi X-ray Timing Explorer Proportional Counter Array (RXTE/PCA), and the INTErnational Gamma-Ray Astrophysics Laboratory Imager on Board INTEGRAL (IBIS). A similar decline is also observed in the 3 - 15 keV data from the RXTE/PCA and in the 50 - 100 keV band with GBM, Swift/BAT, and INTEGRAL/IBIS. The change in the pulsed flux measured with RXTE/PCA since 1999 is consistent with the pulsar spin-down, indicating that the observed changes are nebular. Correlated variations in the Crab Nebula flux on a 3 year timescale are also seen independently with the PCA, BAT, and IBIS from 2005 to 2008, with a flux minimum in April 2007. As of August 2010, the current flux has declined below the 2007 minimum.

  3. Difficulties in Using GRBs as Standard Candles

    Science.gov (United States)

    Goldstein, Adam

    2012-01-01

    Gamma-Ray Bursts have been detected uniformly all over the observable universe, ranging in comoving distance from a few hundred Mpc to a few thousand Mpc, representing the farthest observable objects in the universe. This large distance coverage is highly attractive to those who study cosmology and the history of the early universe since there are no other observed objects that represent such a deep and comprehensive probe of the history of the universe. For this reason, there have been extensive studies into the possibility of using GRBs as standard candles much like Type Ia Supernovae, even though little is known about the physical mechanism that produces the observed burst of gamma-rays. We discuss the attempts at defining GRBs as standard candles, such as the search for a robust luminosity indicator, pseudo-redshift predictions, the complications that emission collimation introduces into the estimation of the rest-frame energetics, and the difficulty introduced by the widely varying observed properties of GRBs. These topics will be examined with supporting data and analyses from both Fermi and Swift observations. Problems with current studies using GRBs as standard candles will be noted as well as potential paths forward to solve these problems.

  4. The Gaia Red Clump as standard candle

    Science.gov (United States)

    Ruiz-Dern, L.; Babusiaux, C.; Danielski, C.; Arenou, F.; Turon, C.; Lallement, R.

    2017-12-01

    Gaia has already provided new high-precision parallaxes for two million objects, allowing standard candles to be recalibrated. Red Clump stars are known to be good standard candles because their luminosity depends only weakly on stellar composition, colour and age. We developed methods to derive some of the main physical parameters needed to characterise the Red Clump as a standard candle. We provide fully empirical calibrations using visual to infrared photometry, the most up-to-date 3D extinction map, and spectroscopic atmosphere parameters. We derived new calibrations for 16 Colour-(G - Ks) and Effective Temperature-(G - Ks) relations and a new calibration of the RC absolute magnitude in the Gaia G and 2MASS Ks bands. These calibrations are then used to estimate the G-band interstellar extinction coefficient k_G. By combining all these relations we implemented a method to determine effective temperatures and interstellar extinctions (A_0), which we will use in particular to derive asteroseismic parameters that can be directly compared with Gaia's results.

  5. Investigating the Effect of Cosmic Opacity on Standard Candles

    Energy Technology Data Exchange (ETDEWEB)

    Hu, J.; Yu, H.; Wang, F. Y., E-mail: fayinwang@nju.edu.cn [School of Astronomy and Space Science, Nanjing University, Nanjing 210093 (China)

    2017-02-10

    Standard candles can probe the evolution of dark energy over a large redshift range. However, cosmic opacity can degrade the quality of standard candles. In this paper, we use the latest observations, including Type Ia supernovae (SNe Ia) from the “joint light-curve analysis” sample and Hubble parameters, to probe the opacity of the universe. A joint fit of the SNe Ia light-curve parameters, cosmological parameters, and opacity is used in order to avoid the cosmological dependence of SNe Ia luminosity distances. The latest gamma-ray bursts are used in order to explore the cosmic opacity at high redshifts, where the cosmic reionization process is also taken into account. We find that the sample supports an almost transparent universe for flat ΛCDM and XCDM models. Meanwhile, free electrons deplete photons from standard candles through (inverse) Compton scattering, which is known to be an important component of opacity. This Compton dimming may play an important role in future supernova surveys. From this analysis, we find that a few per cent of the cosmic opacity is caused by Compton dimming in the two models, which can be corrected for.
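
    For reference, the dimming relation that underlies such opacity tests is standard (it is not an equation quoted from the paper itself): an optical depth τ(z) attenuates the observed flux by e^{-τ}, so

    \[ d_{L,\mathrm{obs}}(z) = d_{L,\mathrm{true}}(z)\, e^{\tau(z)/2}, \qquad \Delta\mu(z) = 2.5\,\log_{10}(e)\,\tau(z) \simeq 1.086\,\tau(z). \]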

  6. Comparison of cosmological models using standard rulers and candles

    OpenAIRE

    Li, Xiaolei; Cao, Shuo; Zheng, Xiaogang; Li, Song; Biesiada, Marek

    2015-01-01

    In this paper, we used standard rulers and standard candles (separately and jointly) to explore five popular dark energy models under the assumption of spatial flatness of the Universe. As standard rulers, we used a data set comprising 118 galactic-scale strong lensing systems (individual standard rulers if properly calibrated for the mass density profile) combined with BAO diagnostics (statistical standard ruler). Supernovae Ia served as standard candles. Unlike in most of the previous statistica...

  7. Discerning dark energy models with high redshift standard candles

    Science.gov (United States)

    Andersen, P.; Hjorth, J.

    2017-12-01

    Following the success of Type Ia supernovae in constraining cosmologies at lower redshift (z ≲ 2), effort has been spent determining whether a similarly useful standardizable candle can be found at higher redshift. In this work, we determine the largest possible magnitude discrepancy between a constant dark energy ΛCDM cosmology and a cosmology in which the equation of state w(z) of dark energy is a function of redshift for high redshift standard candles (z ≳ 2). We discuss a number of popular parametrizations of w(z) with two free parameters, wzCDM cosmologies, including the Chevallier-Polarski-Linder (CPL) parametrization and a generalization thereof, nCPL, as well as the Jassal-Bagla-Padmanabhan parametrization. For each of these parametrizations, we calculate and find the extrema of Δμ, the difference between the distance modulus of a wzCDM cosmology and a fiducial ΛCDM cosmology as a function of redshift, given 68 per cent likelihood constraints on the parameters P = (Ω_{m,0}, w_0, w_a). The parameters are constrained using cosmic microwave background, baryon acoustic oscillation, and Type Ia supernova data using CosmoMC. We find that none of the tested cosmologies can deviate more than 0.05 mag from the fiducial ΛCDM cosmology at high redshift, implying that high redshift standard candles will not aid in discerning between the wzCDM cosmology and the fiducial ΛCDM cosmology. Conversely, this implies that if high redshift standard candles are found to be in disagreement with ΛCDM at high redshift, then this is a problem not only for ΛCDM but for the entire family of wzCDM cosmologies.
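
    For reference, the CPL parametrization named above and the quantity Δμ compared in the paper take the standard forms

    \[ w(z) = w_0 + w_a\,\frac{z}{1+z}, \qquad \frac{\rho_{\mathrm{DE}}(z)}{\rho_{\mathrm{DE},0}} = (1+z)^{3(1+w_0+w_a)}\, e^{-3 w_a z/(1+z)}, \]

    \[ \Delta\mu(z) = \mu_{w_z\mathrm{CDM}}(z) - \mu_{\Lambda\mathrm{CDM}}(z) = 5\,\log_{10}\!\left[\frac{d_L^{\,w_z\mathrm{CDM}}(z)}{d_L^{\,\Lambda\mathrm{CDM}}(z)}\right]. \]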

  8. THE STANDARDIZED CANDLE METHOD FOR TYPE II PLATEAU SUPERNOVAE

    International Nuclear Information System (INIS)

    Olivares E, Felipe; Hamuy, Mario; Pignata, Giuliano; Maza, Jose; Bersten, Melina; Phillips, Mark M.; Morrel, Nidia I.; Suntzeff, Nicholas B.; Filippenko, Alexei V.; Kirshner, Robert P.; Matheson, Thomas

    2010-01-01

    In this paper, we study the 'standardized candle method' using a sample of 37 nearby (redshift z < 0.06) Type II plateau supernovae. The V - I colour toward the end of the plateau can be used to estimate the host-galaxy reddening with a precision of σ(A_V) = 0.2 mag. The correlation between plateau luminosity and expansion velocity previously reported in the literature is recovered. Using this relation and assuming a standard reddening law (R_V = 3.1), we obtain Hubble diagrams (HDs) in the BVI bands with dispersions of ~0.4 mag. Allowing R_V to vary and minimizing the spread in the HDs, we obtain a dispersion range of 0.25-0.30 mag, which implies that these objects can deliver relative distances with precisions of 12%-14%. The resulting best-fit value of R_V is 1.4 ± 0.1.

  9. How Beatrice Tinsley Destroyed Sandage's Quest for a Standard Candle

    Science.gov (United States)

    Mitton, Simon

    2014-01-01

    The goal of cosmology and most extragalactic optical astronomy during the heroic period spanning the half century from Hubble to Sandage (1920s-1970s) was a search for two numbers: the Hubble constant and the deceleration parameter. Standard candles were needed to establish the measure of the universe. In 1968, Beatrice Tinsley, then a postdoctoral fellow in the astronomy department of the University of Texas at Austin, showed that the great enterprise at Palomar of calibrating the galaxies was in need of major revision. At the 132nd AAS Meeting (June 1970, Boulder, Colorado) she presented a paper on the effects of galactic evolution on the magnitude-redshift relation. In her abstract she boldly wrote: "My present conclusion is opposite to that reached by most cosmologists." In fact her claims caused great consternation among cosmologists. In 1972 she published eight papers on the evolution of galaxies and the consequences of that evolution for observational cosmology and the origin of structure.

  10. From a Better Understanding of GRB Prompt Emission to a New Type of Standard Candles?

    Science.gov (United States)

    Guiriec, Sylvain

    2016-07-01

    Recent results revealed the simultaneous existence of multiple components in the prompt emission of gamma-ray bursts (GRBs), leading to a unified spectro-temporal model for the broadband spectrum from the optical regime up to high-energy gamma rays. Unexpectedly, we discovered a relation intrinsic to one specific component of this model: its luminosity is strongly and tightly correlated with its spectral break energy. This new luminosity-hardness relation has the same index for all GRBs when fitted to a power law. In addition, this relation seems to have the same normalization for all GRBs; it is therefore a promising and physically motivated tool that may establish GRBs as cosmological standard candles. In this presentation, I will introduce this new relation, which might eventually be used to (i) estimate GRB distances, (ii) support searches for gravitational waves and cosmic high-energy neutrinos, and (iii) constrain the cosmological parameters. I will give a few examples of GRB redshift estimates using this relation, and I will show why this new result cannot be explained solely by instrumental selection effects and/or measurement/analysis biases.

  11. Red clump stars and Gaia: calibration of the standard candle using a hierarchical probabilistic model

    Science.gov (United States)

    Hawkins, Keith; Leistedt, Boris; Bovy, Jo; Hogg, David W.

    2017-10-01

    Distances to individual stars in our own Galaxy are critical in order to piece together the nature of its velocity and spatial structure. Core helium burning red clump (RC) stars have similar luminosities, are abundant throughout the Galaxy and thus constitute good standard candles. We build a hierarchical probabilistic model to quantify the quality of RC stars as standard candles using parallax measurements from the first Gaia data release. A unique aspect of our methodology is to fully account for (and marginalize over) parallax, photometry and dust correction uncertainties, which leads to more robust results than standard approaches. We determine the absolute magnitude and intrinsic dispersion of the RC in the 2MASS bands J, H, Ks, the Gaia G band and the WISE bands W1, W2, W3 and W4. We find that the absolute magnitude of the RC is -1.61 ± 0.01 (in Ks), +0.44 ± 0.01 (in G), -0.93 ± 0.01 (in J), -1.46 ± 0.01 (in H), -1.68 ± 0.02 (in W1), -1.69 ± 0.02 (in W2), -1.67 ± 0.02 (in W3) and -1.76 ± 0.01 mag (in W4). The mean intrinsic dispersion is ~0.17 ± 0.03 mag across all bands (yielding a typical distance precision of ~8 per cent). Thus RC stars are reliable and precise standard candles. In addition, we have also re-calibrated the zero-point of the absolute magnitude of the RC in each band, which provides a benchmark for future studies to estimate distances to RC stars. Finally, the parallax error shrinkage in the hierarchical model outlined in this work can be used to obtain more precise parallaxes than Gaia for the most distant RC stars across the Galaxy.
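
    The quoted ~8 per cent distance precision follows directly from propagating the magnitude dispersion through the distance modulus, since d ∝ 10^{μ/5}:

    \[ \frac{\sigma_d}{d} = \frac{\ln 10}{5}\,\sigma_M \approx 0.461 \times 0.17 \approx 0.08. \]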

  12. Standard test method for determining atmospheric chloride deposition rate by wet candle method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2002-01-01

    1.1 This test method covers a wet candle device and its use in measuring atmospheric chloride deposition (amount of chloride salts deposited from the atmosphere on a given area per unit time). 1.2 Data on atmospheric chloride deposition can be useful in classifying the corrosivity of a specific area, such as an atmospheric test site. Caution must be exercised, however, to take into consideration the season because airborne chlorides vary widely between seasons. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  13. Z-boson as "the standard candle" for high precision W-boson physics at LHC

    CERN Document Server

    Krasny, M W; Placzek, W; Siodmok, A

    2007-01-01

    In this paper we propose a strategy for measuring the inclusive W-boson production processes at LHC. This strategy exploits simultaneously the unique flexibility of the LHC collider in running variable beam particle species at variable beam energies, and the configuration flexibility of the LHC detectors. We propose their concrete settings for a precision measurement of the Standard Model parameters. These settings optimise the use of the Z-boson and Drell-Yan pair production processes as "the standard reference candles". The presented strategy allows one to factorise and to directly measure those QCD effects which affect the W and Z production processes differently. It reduces to a level of 10^{-4} the impact of uncertainties in the partonic distribution functions (PDFs) and in the transverse momentum of the quarks on the measurement precision. Last but not least, it reduces by a factor of 10 the impact of systematic measurement errors, such as the energy scale and the measurement resolution, on the ...

  14. Standard rulers, candles, and clocks from the low-redshift universe.

    Science.gov (United States)

    Heavens, Alan; Jimenez, Raul; Verde, Licia

    2014-12-12

    We measure the length of the baryon acoustic oscillation (BAO) feature, and the expansion rate of the recent Universe, from low-redshift data only, almost model independently. We make only the following minimal assumptions: homogeneity and isotropy, a metric theory of gravity, a smooth expansion history, and the existence of standard candles (supernovæ) and a standard BAO ruler. The rest is determined by the data, which are compilations of recent BAO and Type Ia supernova results. Making only these assumptions, we find for the first time that the standard ruler has a length of 103.9±2.3 h⁻¹ Mpc. The value is a measurement, in contrast to the model-dependent theoretical prediction determined with model parameters set by Planck data (99.3±2.1 h⁻¹ Mpc). The latter assumes the cold dark matter model with a cosmological constant, and that the ruler is the sound horizon at radiation drag. Adding passive galaxies as standard clocks or a local Hubble constant measurement allows the absolute BAO scale to be determined (142.8±3.7 Mpc), and in the former case the additional information makes the BAO length determination more precise (101.9±1.9 h⁻¹ Mpc). The inverse curvature radius of the Universe is weakly constrained and consistent with zero, independently of the gravity model, provided it is metric. We find the effective number of relativistic species to be N_eff = 3.53±0.32, independent of late-time dark energy or gravity physics.
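
    For reference, the "standard clocks" in this approach are passively evolving galaxies whose differential ages give the expansion rate directly, via the standard relation

    \[ H(z) = -\frac{1}{1+z}\,\frac{dz}{dt}, \]

    which is model independent apart from the assumed metric expansion, while the BAO feature supplies the standard ruler length constrained above.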

  15. A Unified Model for GRB Prompt Emission from Optical to Gamma-Rays: Exploring GRBs as Standard Candles

    Science.gov (United States)

    Guiriec, Sylvain

    2018-01-01

    The Band function traditionally used for gamma-ray bursts (GRBs) often fails to fit their prompt emission spectra. Our new model, composed of three separate components, provides an excellent description of the time-resolved prompt emission: a thermal-like component and two non-thermal components. For the first time, analyses of GRBs with correlated optical and gamma-ray prompt emission show that our new model describes very accurately the whole broadband spectrum from the optical regime to high-energy gamma rays. In addition, this new model enables a new luminosity/hardness relation intrinsic to one of the non-thermal components, showing that GRBs may be standard candles. If statistically confirmed, this relation can be used to (i) constrain the mechanisms powering GRB jets, (ii) estimate GRB distances, (iii) probe the early Universe, and (iv) constrain the cosmological parameters. I will present this new unified model using analyses of GRBs detected with various observatories and instruments such as Fermi, CGRO/BATSE and the combination of the three instruments on board Swift and Suzaku/WAM. I will discuss the striking similarities of GRB spectral shapes, whose components inform on the nature of the prompt emission, as well as the possible universality of the proposed luminosity/hardness relation in the context of our new model.

  16. Measuring the Hubble constant with Type Ia supernovae as near-infrared standard candles

    Science.gov (United States)

    Dhawan, Suhail; Jha, Saurabh W.; Leibundgut, Bruno

    2018-01-01

    The most precise local measurements of H0 rely on observations of Type Ia supernovae (SNe Ia) coupled with Cepheid distances to SN Ia host galaxies. Recent results have shown tension comparing H0 to the value inferred from CMB observations assuming ΛCDM, making it important to check for potential systematic uncertainties in either approach. To date, precise local H0 measurements have used SN Ia distances based on optical photometry, with corrections for light curve shape and colour. Here, we analyse SNe Ia as standard candles in the near-infrared (NIR), where luminosity variations in the supernovae and extinction by dust are both reduced relative to the optical. From a combined fit to 9 nearby calibrator SNe with host Cepheid distances from Riess et al. (2016) and 27 SNe in the Hubble flow, we estimate the absolute peak J magnitude MJ = -18.524 ± 0.041 mag and H0 = 72.8 ± 1.6 (statistical) ±2.7 (systematic) km s-1 Mpc-1. The 2.2% statistical uncertainty demonstrates that the NIR provides a compelling avenue to measuring SN Ia distances, and for our sample the intrinsic (unmodeled) peak J magnitude scatter is just 0.10 mag, even without light curve shape or colour corrections. Our results do not vary significantly with different sample selection criteria, though photometric calibration in the NIR may be a dominant systematic uncertainty. Our findings suggest that tension in the competing H0 distance ladders is likely not a result of supernova systematics that could be expected to vary between optical and NIR wavelengths, like dust extinction. We anticipate further improvements in H0 with a larger calibrator sample of SNe Ia with Cepheid distances, more Hubble flow SNe Ia with NIR light curves, and better use of the full NIR photometric data set beyond simply the peak J-band magnitude.
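
    A minimal sketch of the distance-ladder arithmetic involved, assuming the calibrated M_J from the abstract; the Hubble-flow apparent magnitude and redshift below are hypothetical illustrative values, and the actual analysis jointly fits the full calibrator and Hubble-flow samples:

    C_KM_S = 299792.458   # speed of light (km/s)

    M_J = -18.524         # calibrated peak absolute J magnitude (from the abstract)
    m_J = 17.10           # hypothetical peak apparent J magnitude of a Hubble-flow SN
    z = 0.030             # hypothetical CMB-frame redshift

    mu = m_J - M_J                              # distance modulus (mag)
    d_mpc = 10 ** ((mu + 5.0) / 5.0) / 1.0e6    # luminosity distance (Mpc)
    H0 = C_KM_S * z / d_mpc                     # low-z approximation: v ~ cz
    print(f"mu = {mu:.3f} mag, d = {d_mpc:.1f} Mpc, H0 = {H0:.1f} km/s/Mpc")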

  17. Void effect analysis of Pb-208 of fast reactors with modified CANDLE burn-up scheme

    Science.gov (United States)

    Widiawati, Nina; Su'ud, Zaki

    2015-09-01

    Void effect analysis of Pb-208 as the coolant of fast reactors with a modified CANDLE burn-up scheme has been conducted. The lead-cooled fast reactor (LFR) is one of the fourth-generation reactor designs. The reactor is designed with a thermal power output of 500 MWt. The modified CANDLE burn-up scheme allows the reactor to achieve long-life operation by supplying only natural uranium as the fuel-cycle input. This scheme introduces discrete regions: the fuel is initially put in region 1; after one 10-year burn-up cycle it is shifted to region 2, and region 1 is refilled with fresh natural uranium fuel. The reactor is designed for 100 years of operation with 10 regions arranged axially. The neutronic calculations show that the void coefficients ranged from -0.6695443% at BOC to -0.5273626% at EOC for the 500 MWt reactor. The void coefficients with Pb-208 are more negative than with Pb-nat. The results show that reactors with Pb-208 coolant have a better level of safety than those with Pb-nat.
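
    For context, a standard way to express the void reactivity quoted above (assuming the percentages are reactivity differences in %Δk/k between voided and nominal coolant states) is

    \[ \rho = \frac{k_{\mathrm{eff}} - 1}{k_{\mathrm{eff}}}, \qquad \alpha_{\mathrm{void}} = \rho_{\mathrm{void}} - \rho_{\mathrm{nom}} = \frac{k_{\mathrm{void}} - k_{\mathrm{nom}}}{k_{\mathrm{void}}\,k_{\mathrm{nom}}}, \]

    so a negative value means that voiding the coolant reduces reactivity and tends to shut a power excursion down.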

  18. A Unified Model for GRB Prompt Emission from Optical to Gamma-Rays; Exploring GRBs as Standard Candles

    Science.gov (United States)

    Guiriec, S.; Kouveliotou, C.; Hartmann, D. H.; Granot, J.; Asano, K.; Meszaros, P.; Gill, R.; Gehrels, N.; McEnery, J.

    2016-01-01

    The origin of prompt emission from gamma-ray bursts (GRBs) remains an open question. Correlated prompt optical and gamma-ray emission observed in a handful of GRBs strongly suggests a common emission region, but failure to adequately fit the broadband GRB spectrum prompted the hypothesis of different emission mechanisms for the low- and high-energy radiations. We demonstrate that our multi-component model for GRB gamma-ray prompt emission provides an excellent fit to GRB 110205A from optical to gamma-ray energies. Our results show that the optical and highest-energy gamma-ray emissions have the same spatial and spectral origin, which is different from the bulk of the X-ray and softest gamma-ray radiation. Finally, our accurate redshift estimate for GRB 110205A demonstrates promise for using GRBs as cosmological standard candles.

  19. A UNIFIED MODEL FOR GRB PROMPT EMISSION FROM OPTICAL TO γ -RAYS; EXPLORING GRBs AS STANDARD CANDLES

    Energy Technology Data Exchange (ETDEWEB)

    Guiriec, S.; Kouveliotou, C. [Department of Physics, The George Washington University, 725 21st Street NW, Washington, DC 20052 (United States); Hartmann, D. H. [Department of Physics and Astronomy, Clemson University, Clemson, SC 29634 (United States); Granot, J.; Gill, R. [Department of Natural Sciences, The Open University of Israel, 1 University Road, P.O. Box 808, Raanana 4353701 (Israel); Asano, K. [Institute for Cosmic Ray Research, The University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa, Chiba 277-8582 (Japan); Mészáros, P. [Department of Astronomy and Astrophysics and Department of Physics, Center for Particle and Gravitational Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States); Gehrels, N.; McEnery, J., E-mail: sylvain.guiriec@nasa.gov [NASA Goddard Space Flight Center, Greenbelt, MD 20771 (United States)

    2016-11-01

    The origin of prompt emission from gamma-ray bursts (GRBs) remains an open question. Correlated prompt optical and γ-ray emission observed in a handful of GRBs strongly suggests a common emission region, but failure to adequately fit the broadband GRB spectrum prompted the hypothesis of different emission mechanisms for the low- and high-energy radiations. We demonstrate that our multi-component model for GRB γ-ray prompt emission provides an excellent fit to GRB 110205A from optical to γ-ray energies. Our results show that the optical and highest-energy γ-ray emissions have the same spatial and spectral origin, which is different from the bulk of the X-ray and softest γ-ray radiation. Finally, our accurate redshift estimate for GRB 110205A demonstrates promise for using GRBs as cosmological standard candles.

  20. New mass limit for white dwarfs: super-Chandrasekhar type ia supernova as a new standard candle.

    Science.gov (United States)

    Das, Upasana; Mukhopadhyay, Banibrata

    2013-02-15

    Type Ia supernovae, sparked off by exploding white dwarfs of mass close to the Chandrasekhar limit, play the key role in understanding the expansion rate of the Universe. However, recent observations of several peculiar Type Ia supernovae argue for progenitor masses significantly exceeding the Chandrasekhar limit. We show that strongly magnetized white dwarfs not only can violate the Chandrasekhar mass limit significantly, but also exhibit a different mass limit. We establish from a foundational level that the generic mass limit of white dwarfs is 2.58 solar masses. This explains the origin of overluminous peculiar Type Ia supernovae. Our finding further argues for a possible second standard candle, which has many far-reaching implications, including a possible reconsideration of the expansion history of the Universe.

  1. Safety Analysis of Pb-208 Cooled 800 MWt Modified CANDLE Reactors

    Science.gov (United States)

    Su'ud, Zaki; Widiawati, Nina; Sekimoto, H.; Artoto, A.

    2017-01-01

    Safety analysis of 800 MWt Pb-208 cooled fast reactors with natural uranium as fuel-cycle input, employing an axial-radial combined modified CANDLE burnup scheme, has been performed. The analyses of unprotected loss of flow (ULOF) and unprotected rod run-out transient overpower (UTOP) are discussed. Simulations for 800 MWt Pb-208 cooled fast reactors show that the reactor can inherently cope with complete pumping failure by reducing power through reactivity feedback and removing the remaining heat through natural circulation. Compared to Pb-nat cooled long-life fast reactors, Pb-208 cooled reactors have a smaller Doppler coefficient but a larger coolant density reactivity coefficient. In the UTOP case, the analysis has been performed for external reactivity insertions up to 0.003 dk/k, and in the ULOHS case it is assumed that the secondary cooling system has failed. In all accidents the cladding temperature is the most critical parameter, especially in the UTOP case. In addition, the steam generator design also allows for excess power, which may reach 50% above nominal during a severe UTOP.

  2. SN 2016jhj at redshift 0.34: extending the Type II supernova Hubble diagram using the standard candle method

    Science.gov (United States)

    de Jaeger, T.; Galbany, L.; Filippenko, A. V.; González-Gaitán, S.; Yasuda, N.; Maeda, K.; Tanaka, M.; Morokuma, T.; Moriya, T. J.; Tominaga, N.; Nomoto, K.; Komiyama, Y.; Anderson, J. P.; Brink, T. G.; Carlberg, R. G.; Folatelli, G.; Hamuy, M.; Pignata, G.; Zheng, W.

    2017-12-01

    Although Type Ia supernova cosmology has now reached a mature state, it is important to develop as many independent methods as possible to understand the true nature of dark energy. Recent studies have shown that Type II supernovae (SNe II) offer such a path and could be used as alternative distance indicators. However, the majority of these studies were unable to extend the Hubble diagram above redshift z = 0.3 because of observational limitations. Here, we show that we are now ready to move beyond low redshifts and attempt high-redshift (z ≳ 0.3) SN II cosmology as a result of new-generation deep surveys such as the Subaru/Hyper Suprime-Cam survey. Applying the 'standard candle method' to SN 2016jhj (z = 0.3398 ± 0.0002; discovered by HSC) together with a low-redshift sample, we are able to construct the highest-redshift SN II Hubble diagram to date with an observed dispersion of 0.27 mag (i.e. 12-13 per cent in distance). This work demonstrates the bright future of SN II cosmology in the coming era of large, wide-field surveys like that of the Large Synoptic Survey Telescope.
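
    As a sketch only (the coefficients α and β are fit empirically, and the exact filters, velocity diagnostics, and sign conventions vary between SCM studies), the standard candle method standardizes a plateau-phase apparent magnitude using the photospheric expansion velocity and a colour term,

    \[ m_{\mathrm{corr}} = m + \alpha\,\log_{10}\!\left(\frac{v_{\mathrm{ph}}}{5000\ \mathrm{km\,s^{-1}}}\right) - \beta\,(\mathrm{colour}), \]

    and the 0.27 mag dispersion quoted above is the scatter of such corrected magnitudes about the cosmological distance modulus.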

  3. Accurate weak lensing of standard candles. II. Measuring σ8 with supernovae

    Science.gov (United States)

    Quartin, Miguel; Marra, Valerio; Amendola, Luca

    2014-01-01

    Soon the number of Type Ia supernova (SN) measurements should exceed 100 000. Understanding the effect of weak lensing by matter structures on the supernova brightness will then be more important than ever. Although SN lensing is usually seen as a source of systematic noise, we will show that it can in fact be turned into signal. More precisely, the non-Gaussianity introduced by lensing in the SN Hubble diagram dispersion depends rather sensitively on the amplitude σ8 of the matter power spectrum. By exploiting this relation, we are able to predict constraints on σ8 of 7% (3%) for a catalog of 100 000 (500 000) SNe of average magnitude error 0.12, without having to assume that the intrinsic dispersion and its redshift evolution are known a priori. The intrinsic dispersion has been assumed to be Gaussian; possible intrinsic non-Gaussianities in the data set (due to the SNe themselves and/or to other transients) could potentially be dealt with by means of additional nuisance parameters describing higher moments of the intrinsic dispersion distribution function. This method is independent of and complementary to the standard methods based on cosmic microwave background, cosmic shear, or cluster abundance observables.

  4. A Hubble Space Telescope survey for novae in M87 - III. Are novae good standard candles 15 d after maximum brightness?

    Science.gov (United States)

    Shara, Michael M.; Doyle, Trisha F.; Pagnotta, Ashley; Garland, James T.; Lauer, Tod R.; Zurek, David; Baltz, Edward A.; Goerl, Ariel; Kovetz, Attay; Machac, Tamara; Madrid, Juan P.; Mikołajewska, Joanna; Neill, J. D.; Prialnik, Dina; Welch, D. L.; Yaron, Ofer

    2018-02-01

    Ten weeks of daily imaging of the giant elliptical galaxy M87 with the Hubble Space Telescope (HST) has yielded 41 nova light curves of unprecedented quality for extragalactic cataclysmic variables. We have recently used these light curves to demonstrate that the observational scatter in the so-called maximum-magnitude rate of decline (MMRD) relation for classical novae is so large as to render the nova-MMRD useless as a standard candle. Here, we demonstrate that a modified Buscombe-de Vaucouleurs hypothesis, namely that novae with decline times t2 > 10 d converge to nearly the same absolute magnitude about two weeks after maximum light in a giant elliptical galaxy, is supported by our M87 nova data. For 13 novae with daily sampled light curves, well determined times of maximum light in both the F606W and F814W filters, and decline times t2 > 10 d we find that M87 novae display M606W,15 = -6.37 ± 0.46 and M814W,15 = -6.11 ± 0.43. If very fast novae with decline times t2 < 10 d are excluded, the distances to novae in elliptical galaxies with stellar binary populations similar to those of M87 should be determinable with 1σ accuracies of ± 20 per cent with the above calibrations.
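
    As a worked example of the proposed calibration (the apparent magnitude here is hypothetical, not a measured M87 value), a nova with t2 > 10 d observed at m_{606W,15} = 24.9 would give

    \[ \mu = m_{606W,15} - M_{606W,15} = 24.9 - (-6.37) = 31.27, \qquad d = 10^{(\mu + 5)/5}\ \mathrm{pc} \approx 18\ \mathrm{Mpc}. \]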

  5. Z boson as "the standard candle" for high-precision W boson physics at LHC

    Science.gov (United States)

    Krasny, M. W.; Fayette, F.; Płaczek, W.; Siódmok, A.

    2007-08-01

    In this paper we propose a strategy for measuring the inclusive W boson production processes at LHC. This strategy exploits simultaneously the unique flexibility of the LHC collider in running variable beam particle species at variable beam energies, and the configuration flexibility of the LHC detectors. We propose their concrete settings for a precision measurement of the standard model parameters. These dedicated settings optimise the use of the Z boson and Drell-Yan pair production processes as "the standard reference candles". The presented strategy allows one to factorise and to directly measure those QCD effects that affect the W and Z production processes differently. It reduces to a level of O(10^{-4}) the impact of uncertainties in the partonic distribution functions (PDFs) and in the transverse momentum of the quarks on the measurement precision. Last but not least, it reduces by a factor of 10 the impact of systematic measurement errors, such as the energy scale and the measurement resolution, on the W boson production observables.

  6. Historical floods in flood frequency analysis: Is this game worth the candle?

    Science.gov (United States)

    Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa

    2017-11-01

    In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study focuses on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or the two largest (XM1 and XM2) flood peak flows in a historical M-year-long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with a 100-year return period. The results indicate that the maximal profit of including pre-instrumental floods in the FFA may prove smaller than the cost of reconstructing the historical hydrological information.
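
    A hedged numerical sketch of the likelihood described above, under one common formulation (not necessarily the exact one used in the paper): XM1 is assumed to be the single largest flood in the M-year historical period, so it contributes f(XM1)·F(XM1)^(M-1); the systematic record is modelled with a Gumbel distribution, and all data below are synthetic:

    import numpy as np
    from scipy import optimize, stats

    def neg_log_lik(params, sys_data, xm1, m_hist):
        loc, scale = params
        if scale <= 0:
            return np.inf
        ll = stats.gumbel_r.logpdf(sys_data, loc, scale).sum()       # N systematic years
        ll += stats.gumbel_r.logpdf(xm1, loc, scale)                 # largest historical flood
        ll += (m_hist - 1) * stats.gumbel_r.logcdf(xm1, loc, scale)  # M-1 years below XM1
        return -ll

    rng = np.random.default_rng(1)
    sys_data = stats.gumbel_r.rvs(loc=1000.0, scale=300.0, size=50, random_state=rng)
    xm1, m_hist = 2600.0, 200   # hypothetical largest flood in a 200-year historical period

    res = optimize.minimize(neg_log_lik, x0=[sys_data.mean(), sys_data.std()],
                            args=(sys_data, xm1, m_hist), method="Nelder-Mead")
    loc, scale = res.x
    q100 = stats.gumbel_r.ppf(1.0 - 1.0 / 100.0, loc, scale)  # 100-year flood quantile
    print(f"loc = {loc:.0f}, scale = {scale:.0f}, Q100 = {q100:.0f}")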

  7. Stellar candles for the extragalactic distance scale

    CERN Document Server

    Gieren, Wolfgang

    2003-01-01

    This volume reviews the current status with respect to both theory and observation of the extragalactic distance scale. A sufficient accuracy is required both for a precise determination of the cosmological parameters and also in order to achieve a better understanding of physical processes in extragalactic systems. The "standard candles", used to set up the extragalactic distance scale, reviewed in this book include cepheid variables, RR Lyrae variables, novae, Type Ia and Type II supernovae as well as globular clusters and planetary nebulae.

  8. Optimized Design and Discussion on Middle and Large CANDLE Reactors

    Directory of Open Access Journals (Sweden)

    Xiaoming Chai

    2012-08-01

    CANDLE (Constant Axial shape of Neutron flux, nuclide number densities and power shape During Life of Energy producing reactor) reactors have been intensively researched in the last decades [1–6]. Research shows that this kind of reactor is highly economical, safe, and resource-efficient, extending large-scale fission nuclear energy utilization for thousands of years and benefitting the whole of society. For many developing countries with a large population and high energy demands, such as China and India, middle (1000 MWth) and large (2000 MWth) CANDLE fast reactors are obviously more suitable than small reactors [2]. In this paper, the middle and large CANDLE reactors are investigated with U-Pu and combined ThU-UPu fuel cycles, aiming to utilize the abundant thorium resources and optimize the radial power distribution. To achieve these design purposes, the present designs simply divide the core into two fuel regions in the radial direction. The less active fuel, such as thorium or natural uranium, is loaded in the inner core region, and fuel with low-level enrichment, e.g. 2.0% enriched uranium, is loaded in the outer core region. With this simple core configuration and fuel setting, rather than a complicated method, we can obtain the desired middle and large CANDLE fast cores with reasonable core geometry and thermal-hydraulic parameters that perform safely and economically, as is to be expected from CANDLE. To assist in understanding the CANDLE reactor's attributes, analysis and discussion of the calculation results are provided.

  9. Candle Flames in Microgravity Video

    Science.gov (United States)

    1997-01-01

    This video of a candle flame burning in space was taken by the Candle Flames in Microgravity (CFM) experiment on the Russian Mir space station. It is actually a composite of still photos from a 35mm camera, since the video images were too dim. The images show a hemispherically shaped flame, primarily blue in color, with some yellow early in the flame lifetime. The actual flame is quite dim and difficult to see with the naked eye. Nearly 80 candles were burned in this experiment aboard Mir. NASA scientists have also studied how flames spread in space and how to detect fire in microgravity. Researchers hope that what they learn about fire and combustion from the flame ball experiments will help out here on Earth. Their research could help create things such as better engines for cars and airplanes. Since they use very weak flames, flame balls require little fuel. By studying how this works, engineers may be able to design engines that use far less fuel. In addition, microgravity flame research is an important step in creating new safety precautions for astronauts living in space. By understanding how fire works in space, the astronauts can be better prepared to fight it.

  10. Organic aerosol formation in citronella candle plumes

    OpenAIRE

    Bothe, Melanie; Donahue, Neil McPherson

    2010-01-01

    Citronella candles are widely used as insect repellants, especially outdoors in the evening. Because these essential oils are unsaturated, they have a unique potential to form secondary organic aerosol (SOA) via reaction with ozone, which is also commonly elevated on summer evenings when the candles are often in use. We investigated this process, along with primary aerosol emissions, by briefly placing a citronella tealight candle in a smog chamber and then adding ozone to the chamber. In rep...

  11. A Hubble Space Telescope Survey for Novae in M87. II. Snuffing out the Maximum Magnitude–Rate of Decline Relation for Novae as a Non-standard Candle, and a Prediction of the Existence of Ultrafast Novae

    Energy Technology Data Exchange (ETDEWEB)

    Shara, Michael M.; Doyle, Trisha; Zurek, David [Department of Astrophysics, American Museum of Natural History, Central Park West and 79th Street, New York, NY 10024-5192 (United States); Lauer, Tod R. [National Optical Astronomy Observatory, P.O. Box 26732, Tucson, AZ 85726 (United States); Baltz, Edward A. [KIPAC, SLAC, 2575 Sand Hill Road, M/S 29, Menlo Park, CA 94025 (United States); Kovetz, Attay [School of Physics and Astronomy, Faculty of Exact Sciences, Tel Aviv University, Tel Aviv (Israel); Madrid, Juan P. [CSIRO, Astronomy and Space Science, P.O. Box 76, Epping, NSW 1710 (Australia); Mikołajewska, Joanna [N. Copernicus Astronomical Center, Polish Academy of Sciences, Bartycka 18, PL 00-716 Warsaw (Poland); Neill, J. D. [California Institute of Technology, 1200 East California Boulevard, MC 278-17, Pasadena CA 91125 (United States); Prialnik, Dina [Department of Geosciences, Tel Aviv University, Ramat Aviv, Tel Aviv 69978 (Israel); Welch, D. L. [Department of Physics and Astronomy, McMaster University, Hamilton, L8S 4M1, Ontario (Canada); Yaron, Ofer [Department of Particle Physics and Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel)

    2017-04-20

    The extensive grid of numerical simulations of nova eruptions from the work of Yaron et al. first predicted that some classical novae might significantly deviate from the Maximum Magnitude–Rate of Decline (MMRD) relation, which purports to characterize novae as standard candles. Kasliwal et al. have announced the observational detection of a new class of faint, fast classical novae in the Andromeda galaxy. These objects deviate strongly from the MMRD relationship, as predicted by Yaron et al. Recently, Shara et al. reported the first detections of faint, fast novae in M87. These previously overlooked objects are as common in the giant elliptical galaxy M87 as they are in the giant spiral M31; they comprise about 40% of all classical nova eruptions and greatly increase the observational scatter in the MMRD relation. We use the extensive grid of the nova simulations of Yaron et al. to identify the underlying causes of the existence of faint, fast novae. These are systems that have accreted, and can thus eject, only very low-mass envelopes, of the order of 10^{-7}-10^{-8} M_⊙, on massive white dwarfs. Such binaries include, but are not limited to, the recurrent novae. These same models predict the existence of ultrafast novae that display decline times, t_2, as short as five hours. We outline a strategy for their future detection.

  12. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    ... and G (graphite) phase of carbon present in the candle soots. The electrochemical characterization was performed by cyclic voltammetry, galvanostatic charge/discharge test and impedance spectroscopy in 1MH2SO4 electrolyte. The functionalized candle soot electrode showed an enhanced specific capacitance value of ...

  13. 75 FR 44224 - Grant of Authority for Subzone Status; Yankee Candle Corporation (Candles and Gift Sets); Whately...

    Science.gov (United States)

    2010-07-28

    ... Status; Yankee Candle Corporation (Candles and Gift Sets); Whately and South Deerfield, MA Pursuant to... special-purpose subzone at the candle and gift set manufacturing and distribution facilities of Yankee... activity related to the manufacturing and distribution of candles and gift sets at the facilities of Yankee...

  14. Standard dilution analysis.

    Science.gov (United States)

    Jones, Willis B; Donati, George L; Calloway, Clifton P; Jones, Bradley T

    2015-02-17

    Standard dilution analysis (SDA) is a novel calibration method that may be applied to most instrumental techniques that will accept liquid samples and are capable of monitoring two wavelengths simultaneously. It combines the traditional methods of standard additions and internal standards. Therefore, it simultaneously corrects for matrix effects and for fluctuations due to changes in sample size, orientation, or instrumental parameters. SDA requires only 200 s per sample with inductively coupled plasma optical emission spectrometry (ICP OES). Neither the preparation of a series of standard solutions nor the construction of a universal calibration graph is required. The analysis is performed by combining two solutions in a single container: the first containing 50% sample and 50% standard mixture; the second containing 50% sample and 50% solvent. Data are collected in real time as the first solution is diluted by the second one. The results are used to prepare a plot of the analyte-to-internal standard signal ratio on the y-axis versus the inverse of the internal standard concentration on the x-axis. The analyte concentration in the sample is determined from the ratio of the slope and intercept of that plot. The method has been applied to the determination of FD&C dye Blue No. 1 in mouthwash by molecular absorption spectrometry and to the determination of eight metals in mouthwash, wine, cola, nitric acid, and water by ICP OES. Both the accuracy and precision for SDA are better than those observed for the external calibration, standard additions, and internal standard methods using ICP OES.

  15. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    diamond) and G (graphite) phase of carbon present in the candle soots. The electrochemical characterization was performed by cyclic voltammetry, galvanostatic charge/discharge test and impedance spectroscopy in 1MH2SO4 electrolyte.

  16. 16 CFR 501.7 - Candles.

    Science.gov (United States)

    2010-01-01

    ... quantity of contents shall be expressed in terms of count and measure (e.g., length and diameter), to the extent that diameter of such candles need not be expressed. The requirements of § 500.7 of this chapter...

  17. Social and Economic Impact of the CANDLE Light Source Project

    Science.gov (United States)

    Baghiryan, M.

    Social and economic progress related to the realization of the CANDLE synchrotron light source project in Armenia is discussed. The CANDLE facility would be multidisciplinary and long-lasting. Its impacts include significant improvements in science capacity, education quality, industrial capabilities, investment climate, country image, international relations, and health care, as well as restraining the "brain drain" and creating new workplaces. CANDLE would serve as a universal national infrastructure, establishing Armenia as a country with a knowledge-based economy and a place for doing high-tech business, and would be a powerful tool for achieving the country's leap forward in general.

  18. 75 FR 63200 - Petroleum Wax Candles From China

    Science.gov (United States)

    2010-10-14

    ... COMMISSION Petroleum Wax Candles From China AGENCY: United States International Trade Commission. ACTION: Scheduling of an expedited five-year review concerning the antidumping duty order on petroleum wax candles... whether revocation of the antidumping duty order on petroleum wax candles from China would be likely to...

  19. 75 FR 80843 - Petroleum Wax Candles From China

    Science.gov (United States)

    2010-12-23

    ... COMMISSION Petroleum Wax Candles From China Determination On the basis of the record \\1\\ developed in the... antidumping duty order on petroleum wax candles from China would be likely to lead to continuation or... Petroleum Wax Candles from China: Investigation No. 731-TA-282 (Third Review). Issued: December 17, 2010. By...

  20. Progress in research on chlorate candle technology

    Science.gov (United States)

    Littman, J.

    1970-01-01

    Research and development program improves sodium chlorate candle formulation, production method, and igniter design. Cobalt is used as the fuel, dry processing methods are used to lower the water content, and a device based on pyrotechnic heater concepts is used as the igniter.

  1. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    diamond) and G (graphite) phase of carbon present in the candle ... bute as a potential material for various modern applications. [27,28]. In recent times, the ... used for bio-imaging application and confirmed that these fluorescent carbon nanoparticles ...

  2. The Chemical History of a Candle

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 87-89. Book Review by H R Madhusudan. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0087-0089.

  3. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    Department of Ceramic Engineering, Gangneung-Wonju National University, Gangneung 210-702, Republic of Korea. MS received 15 March 2015; accepted 17 August 2015. Abstract: The electrochemical supercapacitor behaviour of bare, washed and nitric acid functionalized candle flame carbon soots is reported.

  4. The Chemical History of a Candle

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 90-98. Classics, by Michael Faraday. Permanent link: http://www.ias.ac.in/article/fulltext/reso/007/03/0090-0098.

  5. The Chemical History of a Candle

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 87-89. Book Review by H R Madhusudan. Permanent link: http://www.ias.ac.in/article/fulltext/reso/007/03/0087-0089.

  6. Study on core radius minimization for long life Pb-Bi cooled CANDLE burnup scheme based fast reactor

    Science.gov (United States)

    Afifah, Maryam; Miura, Ryosuke; Su'ud, Zaki; Takaki, Naoyuki; Sekimoto, H.

    2015-09-01

    Fast breeder reactors have attracted worldwide development interest because they offer a nearly inexhaustible energy source. One such design is the CANDLE reactor, whose burn-up strategy requires no control rods for burn-up control, maintains constant core characteristics during energy production, and needs no fuel shuffling. The calculation used the core geometry parameters of a sodium-cooled reference core to study the minimum core radius of a long-life Pb-Bi cooled CANDLE reactor, and also to compare the pure coolant effect of LBE and sodium in the same geometry design. The results show that the minimum core radius of the lead-bismuth cooled CANDLE is 100 cm with a 500 MWth thermal output. Lead-bismuth coolant enables a considerably smaller CANDLE reactor size and gives a better void coefficient than sodium, the most common FBR coolant, which is an advantage for safety analysis.

  7. Tallow Candles and Meaty Air in 'Bleak House'

    Directory of Open Access Journals (Sweden)

    Anna Henchman

    2017-12-01

    In Charles Dickens's 'Bleak House' there is a strange (and disgusting) pattern of characters feeling that they can 'taste' the air, and that that air tastes either meaty or greasy. Esther notices that snuffing 'two great office candles in tin candlesticks' at Mrs Jellyby's 'made the room taste strongly of hot tallow', the mutton or beef fat out of which inexpensive candles were made. In 'Bleak House', candles retain their sheepy atmospheres and release them into the surrounding air when consumed. Mrs Jellyby's home and Mr Vholes's office are just two places in which Dickens suggests that the process of turning organic animal bodies into urban commodities (candles, parchment, wigs) has not quite been completed. Candles and parchment are part animal, part object, and they constantly threaten to revert back into their animal forms. The commodification of animal bodies occurs primarily in the city, where parts of formerly living bodies are manufactured into things. Filled with the smell of burning chops or a spontaneously combusted human, Dickens's greasier atmospheres contain animal matter suspended in the air that the characters smell, taste, and touch. Once we realize that the apparent smell of chops and candles is, in fact, Krook's body, this act of taking the air becomes a form of cannibalism that is at least as unsettling as Michael Pollan's recent account of cows being fed cow parts in factory farms. Drawing on this insight and on Allen MacDuffie's analyses of energy systems in 'Bleak House', this article focuses on instances in which Dickens defamiliarizes the human consumption of energy by having his characters unintentionally ingest animal particles. Studying Dickens's treatment of animal fat suspended in air adds a new dimension to recent work on systems of energy expenditure and exchange in an age of industrial capitalism.

  8. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  9. 75 FR 49475 - Petroleum Wax Candles From the People's Republic of China: Preliminary Results of Request for...

    Science.gov (United States)

    2010-08-13

    ... research firm in Malaysia on producers' prices for candles made and sold in Malaysia and stated that the... specially designed for Christmas. That is, they are holiday scenes and symbols. Both candles are square... from the People's Republic of China (PRC). Christmas novelty candles are candles specially designed for...

  10. 75 FR 38121 - Petroleum Wax Candles From China

    Science.gov (United States)

    2010-07-01

    ... paper-cored wicks and containing any amount of petroleum wax, except for candles containing more than 50... investigation. The Commission's designated agency ethics official has advised that a five-year review is not... Office of Government Ethics. Consequently, former employees are not required to seek Commission approval...

  11. Demonstrating Sound Wave Propagation with Candle Flame and Loudspeaker

    Science.gov (United States)

    Hrepic, Zdeslav; Nettles, Corey; Bonilla, Chelsea

    2013-01-01

    The motion of a candle flame in front of a loudspeaker has been suggested as a productive demonstration of the longitudinal wave nature of sound. The demonstration has been used also as a research tool to investigate students' understanding about sound. The underpinning of both applications is the expectation of a horizontal, back-and-forth…

  12. New Scientific Aspects of the "Burning Candle" Experiment

    Science.gov (United States)

    Massalha, Taha

    2016-01-01

    The "burning candle" experiment is used in middle school education programs to prove that air contains a component that is essential to burning (i.e., oxygen). The accepted interpretation taught by teachers in middle school is this: when burning occurs, oxygen is used up, creating an underpressure that causes a rise in water level inside…

  13. Proper Use of Candles During a Power Outage

    Centers for Disease Control (CDC) Podcasts

    2006-08-10

    Home fires are a threat after a natural disaster and fire trucks may have trouble getting to your home. If the power is out, use flashlights or other battery-powered lights if possible, instead of candles.  Created: 8/10/2006 by Emergency Communications System.   Date Released: 8/20/2008.

  14. Using Quasars as Standard Candles for Studying Dark Energy

    DEFF Research Database (Denmark)

    Denney, Kelly D.; Vestergaard, Marianne; Watson, D.

    2012-01-01

    , which relies on the technique of reverberation mapping to measure time delays between the quasar continuum and emission line variability signatures. Measuring this time delay effectively measures the radius between the central source and the emission-line gas. The emission line gas is photo...... forecasts demonstrating the power this method can have over, e.g., SNe, to constrain dark energy parameters by extending to higher redshifts than can currently be probed with any other technique....

  15. Reconstructing cosmological matter perturbations using standard candles and rulers

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Ujjaini [Los Alamos National Laboratory; Sahni, Varun [IUCAA, PUNE; Starobinsky, Alexei A [LANDAU INST, MOSCOW

    2008-01-01

    For a large class of dark energy (DE) models, for which the effective gravitational constant is a constant and there is no direct exchange of energy between DE and dark matter (DM), knowledge of the expansion history suffices to reconstruct the growth factor of linearized density perturbations in the non-relativistic matter component on scales much smaller than the Hubble distance. In this paper, we develop a non-parametric method for extracting information about the perturbative growth factor from data pertaining to the luminosity or angular size distances. A comparison of the reconstructed density contrast with observations of large-scale structure and gravitational lensing can help distinguish DE models such as the cosmological constant and quintessence from models based on modified gravity theories as well as models in which DE and DM are either unified or interact directly. We show that for current supernovae (SNe) data, the linear growth factor at z = 0.3 can be constrained to 5% and the linear growth rate to 6%. With future SNe data, such as that expected from the Joint Dark Energy Mission, we may be able to constrain the growth factor to 2%-3% and the growth rate to 3%-4% at z = 0.3 with this unbiased, model-independent reconstruction method. For future baryon acoustic oscillation data, which would deliver measurements of both the angular diameter distance and the Hubble parameter, it should be possible to constrain the growth factor at z = 2.5 to 5%-9%. These constraints grow tighter as the errors on the data sets shrink. With a large quantity of data expected in the next few years, this method can emerge as a competitive tool for distinguishing between different models of dark energy.
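    The physical basis of the method is that, for this class of DE models, the linear growth of matter perturbations is fixed by the expansion history E(a) = H(a)/H0 alone. Below is a minimal sketch of that forward calculation, assuming a flat LCDM background purely for illustration; the paper's reconstruction is non-parametric, so any smooth E(a) recovered from distance data could be substituted for the lambda function here.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative background: flat LCDM with assumed Omega_m = 0.3.
    Om = 0.3
    E = lambda a: np.sqrt(Om / a**3 + 1.0 - Om)

    def rhs(a, y):
        # Linear growth equation in scale factor a:
        #   d2delta/da2 + (3/a + dlnE/da) ddelta/da - 1.5*Om/(a^5 E^2) delta = 0
        delta, ddelta = y
        h = 1e-6
        dlnE = (np.log(E(a + h)) - np.log(E(a - h))) / (2 * h)  # keeps E(a) swappable
        acc = -(3.0 / a + dlnE) * ddelta + 1.5 * Om / (a**5 * E(a)**2) * delta
        return [ddelta, acc]

    # Start deep in matter domination, where delta grows like a.
    a_i = 1e-3
    sol = solve_ivp(rhs, (a_i, 1.0), [a_i, 1.0], dense_output=True, rtol=1e-8)

    a_obs = 1.0 / (1.0 + 0.3)                 # z = 0.3, as in the paper's forecasts
    delta, ddelta = sol.sol(a_obs)
    delta0 = sol.sol(1.0)[0]
    print("growth factor g(z=0.3) =", delta / delta0)
    print("growth rate  f(z=0.3) =", a_obs * ddelta / delta)   # f = dln(delta)/dln(a)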

  16. Light a CANDLE. An innovative burnup strategy of nuclear reactors

    International Nuclear Information System (INIS)

    Sekimoto, Hiroshi

    2005-11-01

    CANDLE is a new burnup strategy for nuclear reactors, which stands for Constant Axial Shape of Neutron Flux, Nuclide Densities and Power Shape During Life of Energy Production. When this candle-like burnup strategy is adopted, although the fuel is fixed in the reactor core, the burning region moves along the core axis, at a speed proportional to the power output, without changing the spatial distribution of the nuclide number densities, the neutron flux, or the power density. Excess reactivity is not necessary for burnup, and the shape of the power distribution and the core characteristics do not change as burnup progresses. Control rods are not needed to control the burnup. This booklet describes the concept of the CANDLE burnup strategy, with basic explanations of excess neutrons, and its specific application to a high-temperature gas-cooled reactor and to a fast reactor with excellent neutron economy. Supplementary issues concerning the initial core and high burnup are also addressed. (T. Tanaka)
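    The statement that the burning region advances at a speed proportional to the power output can be made concrete with a back-of-envelope energy balance; this estimate and all its numbers are illustrative assumptions, not taken from the booklet:

    $$ v \;=\; \frac{P}{A\,\rho_{\mathrm{HM}}\,B} $$

    where P is the thermal power, A the core cross-sectional area, rho_HM the heavy-metal density, and B the discharge burnup expressed as energy released per unit mass of heavy metal. For, say, P = 3 GW(th), A = 10 m^2, rho_HM = 5x10^3 kg/m^3, and B of roughly 40% FIMA (about 3x10^13 J/kg), the wave advances at about 2x10^-9 m/s, i.e. a few centimetres per year, the slow, candle-like burn the strategy is named for.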

  17. A Double Candle-Flame-Shaped Solar Flare Observed by SDO and STEREO

    Science.gov (United States)

    Gou, T.; Liu, R.; Wang, Y.; Liu, K.; Zhuang, B.; Zhang, Q.; Liu, J.

    2015-12-01

    We investigate an M1.4 flare occurring on 2011 January 28 near the northwest solar limb. The flare loop system exhibits a double candle-flame configuration in SDO/AIA's hot passbands, sharing a much larger cusp-shaped structure. The results of DEM analysis show that each candle flame has a temperature distribution similar to that of the famous Tsuneta flare. STEREO-A provides a view from directly above the flare, and in SECCHI/EUVI 195 Å the post-flare loops are observed to propagate eastward. We performed a 3D reconstruction of the post-flare loops with AIA and EUVI data. With the aid of the squashing factor Q, based on a potential extrapolation of the photospheric field, we recognized that the footpoints of the post-flare loops were slipping along high-Q lines on the photosphere, and the reconstructed loops share similarity with the field lines traced starting from the high-Q lines. The heights of the loops increase as they slip horizontally eastward, giving the loop tops a velocity of about 10 km/s. An extremely large EUV late phase in Fe XVI 33.5 nm observed by SDO/EVE is suggested to be related to the slipping magnetic reconnection occurring in the quasi-separatrix layers (QSLs) whose photospheric footprints are marked by the high-Q lines.

  18. Candle light-style OLED: a plausibly human-friendly safe night light

    Science.gov (United States)

    Jou, Jwo-Huei; Chen, Po-Wei; Hsieh, Chun-Yu; Wang, Ching-Chiun; Chen, Chien-Chih; Tung, F.-C.; Chen, Szu-Hao; Wang, Yi-Shan

    2013-09-01

    Candles emit a sensationally warm light with a very low color temperature, arguably the most suitable for use at night. In response to the need for such a human-friendly night light, we demonstrate the use of several candle-light-complementary organic emitters to mimic candle light in an organic light-emitting diode (OLED). One resultant candle-light-style OLED shows a very high color rendering index, with an efficacy at least 300 times that of candles, or twice that of an incandescent bulb. The device can be fabricated, for example, by using four candle-light-complementary emitters, namely red, yellow, green, and sky-blue phosphorescent dyes, vacuum-deposited into two emission layers separated by a nano-layer of carrier-modulation material. This maximizes both the desired very high color rendering index and the energy efficiency, while keeping blue emission very low and red emission high to obtain the desired low color temperature. With different layer structures, the OLEDs can also be made color-tunable between candle light and dusk hue. Importantly, a romantic, supposedly physiologically friendly candle-light-style emission can hence be driven by electricity, in lieu of the hydrocarbon-burning, greenhouse-gas-releasing candles invented 5,000 years ago.

  19. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
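    The experience-curve model invoked here has the form P(Q) = P0 * Q^b with b < 0, so each doubling of cumulative shipments Q cuts the real price by a constant "learning rate" of 1 - 2^b. A minimal fitting sketch follows; the shipment and price series are invented placeholders, not BLS or shipment data:

    import numpy as np

    # Hypothetical (cumulative shipments, real price) series.
    Q = np.array([1.0, 2.5, 6.0, 15.0, 40.0, 100.0])    # millions of units
    P = np.array([1000., 890., 780., 690., 600., 530.]) # constant dollars per unit

    # Fit log P = b * log Q + ln P0 by least squares.
    b, lnP0 = np.polyfit(np.log(Q), np.log(P), 1)
    learning_rate = 1.0 - 2.0 ** b          # fractional price drop per doubling
    print(f"experience exponent b = {b:.3f}")
    print(f"learning rate = {learning_rate:.1%} per doubling of cumulative production")

    # Project the price at a future cumulative-shipment level instead of
    # assuming it stays constant over the 30-year analysis period.
    Q_future = 400.0
    print("projected price:", np.exp(lnP0) * Q_future ** b)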

  20. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. - Highlights: ► Past appliance standards analyses have assumed constant equipment prices. ► There is considerable evidence of consistent real price declines. ► We incorporate experience curves for several large appliances into the analysis. ► The revised analyses demonstrate larger net present values of potential standards. ► The results imply that past standards analyses may have undervalued benefits.

  1. Qualifications of Candle Filters for Combined Cycle Combustion Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tomasz Wiltowski

    2008-08-31

    The direct firing of coal produces particulate matter that has to be removed for environmental and process reasons. To advance current coal combustion processes, Siemens Westinghouse Power Corporation (SWPC), under the U.S. Department of Energy's auspices, has developed ceramic candle filters that can operate at high temperatures. The Coal Research Center of Southern Illinois University (SIUC), in collaboration with SWPC, developed a program for long-term filter testing at the SIUC Steam Plant, followed by experiments using a single-filter reactor unit. The objectives of this program, funded by the U.S. Department of Energy, were to identify and demonstrate the stability of porous candle filter elements for use in high-temperature atmospheric fluidized-bed combustion (AFBC) process applications. These verifications were accomplished through extended-time slipstream testing of a candle filter array under AFBC conditions using SIUC's existing AFBC boiler. Temperature, mass flow rate, and differential pressure across the filter array were monitored for 45 days. After test exposure at SIUC, the filter elements were characterized using Scanning Electron Microscopy and BET surface area analyses. In addition, a single-filter reactor was built and used to study long-term filter operation, the permeability exhibited by a filter element before and after the slipstream test, and the thermal shock resilience of a used filter, by observing differential pressure changes upon rapid heating and cooling of the filter. The data acquired during the slipstream test and the post-test evaluations demonstrated the suitability of the filter elements for advanced power generation applications.

  2. 76 FR 773 - Petroleum Wax Candles From the People's Republic of China: Continuation of Antidumping Duty Order

    Science.gov (United States)

    2011-01-06

    ... International Trade Administration Petroleum Wax Candles From the People's Republic of China: Continuation of... the antidumping duty order on petroleum wax candles from the People's Republic of China (``PRC... of initiation of the sunset review of the antidumping duty order on petroleum wax candles from the...

  3. Observations on the CANDLE burn-up in various geometries

    International Nuclear Information System (INIS)

    Seifritz, W.

    2007-01-01

    We have examined the geometrical conditions under which an autocatalytically propagating burn-up wave (CANDLE burn-up) is possible. In the process, the sine-Gordon equation finds a new place in the burn-up theory of nuclear fission reactors. The axially burning 'spaghetti' reactor and the azimuthally burning 'pancake' reactor seem to be the most promising geometries for a practical reactor design. Radial and spherical burn waves, in cylindrical and spherical geometry respectively, are impossible in principle. The possible applicability of such fission burn waves to the OKLO phenomenon and to the georeactor at the center of the Earth postulated by Herndon is also discussed. A fast CANDLE reactor can run on depleted uranium alone, so uranium mining and uranium enrichment are no longer necessary. Furthermore, it is also possible to dispense with reprocessing, because the uranium utilization factor is as high as about 40%. This completely new reactor type can thus open a new era of reactor technology

  4. Development of CANDLES low background HPGe detector and half-life measurement of 180Tam

    Science.gov (United States)

    Chan, W. M.; Kishimoto, T.; Umehara, S.; Matsuoka, K.; Suzuki, K.; Yoshida, S.; Nakajima, K.; Iida, T.; Fushimi, K.; Nomachi, M.; Ogawa, I.; Tamagawa, Y.; Hazama, R.; Takemoto, Y.; Nakatani, N.; Takihira, Y.; Tozawa, M.; Kakubata, H.; Trang, V. T. T.; Ohata, T.; Tetsuno, K.; Maeda, T.; Khai, B. T.; Li, X. L.; Batpurev, T.

    2018-01-01

    A low background HPGe detector system was developed at the CANDLES Experimental Hall for multipurpose use. Various low background techniques were employed, including a hermetic shield design, radon gas suppression, and background reduction analysis. A new pulse shape discrimination (PSD) method was created specifically for the coaxial Ge detector. Using this PSD method, microphonics noise and background events in the low energy region below 200 keV can be rejected effectively. Monte Carlo simulation with GEANT4 was performed to obtain the detection efficiency and to study the interaction of gamma rays with the detector system. For rare decay measurement, the detector was used to detect the decay of nature's most stable isomer, tantalum-180m (180Tam). Two phases of the tantalum physics run were completed with a total livetime of 358.2 days, of which Phase II had an upgraded shield configuration. The world's most stringent half-life limit for 180Tam has been achieved.
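    The record does not describe the new PSD method in enough detail to reproduce it; purely as an illustration of the general idea, here is a generic A/E-style (charge-comparison) discriminator for a digitized HPGe charge pulse, with the waveform, sampling step, and acceptance band all assumed:

    import numpy as np

    def a_over_e(waveform, dt=1e-8):
        """Generic A/E pulse-shape parameter for an HPGe charge pulse.

        Illustration only: the CANDLES group's actual PSD algorithm is
        not specified in the record. `waveform` is a 1-D array of the
        digitized charge signal; `dt` is the sampling step in seconds.
        """
        current = np.gradient(waveform, dt)   # differentiate to get the current pulse
        A = current.max()                     # peak of the current signal
        E = waveform.max() - waveform[0]      # net collected charge (energy proxy)
        return A / E

    # Events whose A/E lies outside the band populated by genuine gamma
    # interactions (established with a calibration source) would be
    # rejected as microphonics or other low-energy background.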

  5. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Hoeegh-Krohn and T. Lindstroeem. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2, and on the (λφ⁴)_d model of interacting quantum fields. (orig.)

  6. Accession Medical Standards Analysis and Research Activity

    Science.gov (United States)

    2010-01-01

    ... each Initial Entry Training (IET) site to USMEPCOM, but this reporting is not required by service regulations. The total numbers of reported... classification and reporting from the IET sites to MEPCOM, which is still passive, should be mandated and standardized by DoD/service regulations. Analysis would... [Table fragment: counts and attrition percentages by condition, including hernia (2,029 cases), gastroesophageal reflux disease (GERD; 300 cases), and diabetes.]

  7. Using slow-release permanganate candles to remediate PAH-contaminated water

    International Nuclear Information System (INIS)

    Rauscher, Lindy; Sakulthaew, Chainarong; Comfort, Steve

    2012-01-01

    Highlights: ► We quantified the efficacy of slow-release permanganate-paraffin candles to degrade and mineralize PAHs. ► 14C-labeled PAHs were used to quantify both adsorption and transformation. ► Permanganate-treated PAHs were more biodegradable in soil microcosms. ► A flow-through candle system was used to quantify PAH removal in urban runoff. - Abstract: Surface waters impacted by urban runoff in metropolitan areas are becoming increasingly contaminated with polycyclic aromatic hydrocarbons (PAHs). Slow-release oxidant candles (paraffin-KMnO4) are a relatively new technology being used to treat contaminated groundwater and could potentially be used to treat urban runoff. Given that these candles only release permanganate when submerged, the ephemeral nature of runoff events would influence when the permanganate is released for treating PAHs. Our objective was to determine if slow-release permanganate candles could be used to degrade and mineralize PAHs. Batch experiments quantified PAH degradation rates in the presence of the oxidant candles. Results showed most of the 16 PAHs tested were degraded within 2-4 h. Using 14C-labeled phenanthrene and benzo(a)pyrene, we demonstrated that the wax matrix of the candle initially adsorbs the PAH, but then releases the PAH back into solution as transformed, more water-soluble products. While permanganate was unable to mineralize the PAHs (i.e., convert them to CO2), we found that the permanganate-treated PAHs were much more biodegradable in soil microcosms. To test the concept of using candles to treat PAHs in multiple runoff events, we used a flow-through system where urban runoff water was pumped over a miniature candle in repetitive wet-dry, 24-h cycles. Results showed that the candle was robust in removing PAHs by repeatedly releasing permanganate and degrading the PAHs. These results provide proof-of-concept that permanganate candles could potentially provide a low-cost, low-maintenance approach to treating PAH-contaminated runoff.

  8. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement, etc.) and apply to a relatively narrow concentration range, but they give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, however, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat samples, metal drillings, thin layers on substrates, etc.) that may also contain elements not covered by a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper looks into the basics of quantitative standardless analysis and shows results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  9. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines
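    The arithmetic behind such soil guidelines is a sum of pathway-specific dose-to-source ratios; the sketch below shows the idea with invented numbers (the pathway list, the ratios, and the 25 mrem/yr limit are placeholders, not RESRAD output):

    # Illustrative pathway dose/source ratios (mrem/yr per pCi/g) for one
    # radionuclide; hypothetical values, not RESRAD output.
    dsr = {
        "external gamma":  0.8,
        "dust inhalation": 0.05,
        "plant ingestion": 0.3,
        "water ingestion": 0.15,
    }

    dose_limit = 25.0                 # mrem/yr, example release criterion
    total_dsr = sum(dsr.values())

    # The soil guideline is the activity concentration at which the summed
    # pathway doses just meet the limit; this is the basic logic behind
    # site-specific guideline derivation.
    guideline = dose_limit / total_dsr
    print(f"soil guideline = {guideline:.1f} pCi/g")   # -> 19.2 pCi/g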

  10. Multielement analysis of biological standards by neutron activation analysis

    International Nuclear Information System (INIS)

    Nadkarni, R.A.

    1977-01-01

    Up to 28 elements were determined in two IAEA standards: Animal Muscle H4 and Fish Soluble A 6/74, and three NBS standards: Spinach: SRM-1570, Tomato Leaves: SRM-1573 and Pine Needles: SRM-1575, by instrumental neutron-activation analysis. Seven noble metals were determined in two NBS standards: Coal: SRM-1632 and Coal Fly Ash: SRM-1633, by a radiochemical procedure, while 11 rare earth elements were determined in NBS standard Orchard Leaves: SRM-1571 by instrumental neutron-activation analysis. The results are in good agreement with the certified and/or literature data where available. The irradiations were performed at the Cornell TRIGA Mark II nuclear reactor at a thermal neutron flux of 1-3x10^12 n cm^-2 s^-1. The short-lived species were determined after a 2-minute irradiation in the pneumatic rabbit tube, and the longer-lived species after an 8-hour irradiation in the central thimble facility. The standards and samples were counted on a coaxial 56-cm^3 Ge(Li) detector. The system resolution was 1.96 keV (FWHM), with a peak-to-Compton ratio of 37:1 and a counting efficiency of 13%, all relative to the 1.332 MeV photopeak of Co-60. (T.I.)
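    Measurements like these rest on the relative (comparator) method. In schematic form, a textbook relation rather than anything specific to this paper, the unknown concentration follows from decay-corrected photopeak count rates A_p and sample masses m, provided sample and standard are irradiated and counted under identical flux and geometry:

    $$ c_{\mathrm{sam}} \;=\; c_{\mathrm{std}}\,\frac{(A_p/m)_{\mathrm{sam}}}{(A_p/m)_{\mathrm{std}}} $$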

  11. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum, and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the nuclide library and the underlying formulae, the new software can be easily extended to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
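    The element-identification step the program automates amounts to matching fitted photopeak energies against a nuclide library; here is a minimal sketch in which the tolerance, library contents, and matching rule are simplified assumptions (the real software also weighs intensities, uncertainties, and decay behavior):

    def identify(peaks, library, tol=1.0):
        """Match fitted photopeak energies (keV) against an activation
        library mapping nuclide -> list of gamma-line energies."""
        hits = {}
        for nuclide, lines in library.items():
            matched = [e for e in lines if any(abs(e - p) <= tol for p in peaks)]
            if matched:
                hits[nuclide] = matched
        return hits

    # Tiny example library with two common activation products.
    library = {"24Na": [1368.6, 2754.0], "56Mn": [846.8, 1810.7]}
    print(identify([846.9, 1368.4, 511.0], library))
    # -> {'24Na': [1368.6], '56Mn': [846.8]}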

  12. LOGISTIC REGRESSION ANALYSIS WITH STANDARDIZED MARKERS.

    Science.gov (United States)

    Huang, Ying; Pepe, Margaret S; Feng, Ziding

    2013-09-01

    Two different approaches to analysis of data from diagnostic biomarker studies are commonly employed. Logistic regression is used to fit models for probability of disease given marker values, while ROC curves and risk distributions are used to evaluate classification performance. In this paper we present a method that simultaneously accomplishes both tasks. The key step is to standardize markers relative to the non-diseased population before including them in the logistic regression model. Among the advantages of this method are: (i) ensuring that results from regression and performance assessments are consistent with each other; (ii) allowing covariate adjustment and covariate effects on ROC curves to be handled in a familiar way; and (iii) providing a mechanism to incorporate important assumptions about structure in the ROC curve into the fitted risk model. We develop the method in detail for the problem of combining biomarker datasets derived from multiple studies, populations or biomarker measurement platforms, when ROC curves are similar across data sources. The methods are applicable to both cohort and case-control sampling designs. The dataset motivating this application concerns Prostate Cancer Antigen 3 (PCA3) for diagnosis of prostate cancer in patients with or without previous negative biopsy, where the ROC curves for PCA3 are found to be the same in the two populations. Constrained maximum likelihood and empirical likelihood estimators are derived. The estimators are compared in simulation studies and the methods are illustrated with the PCA3 dataset.
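    The key step named above, standardizing markers against the non-diseased population before logistic regression, can be sketched in a few lines. This is a minimal illustration with simulated data; the percentile-then-probit transform is one simple variant, and the paper's constrained maximum likelihood and empirical likelihood estimators are considerably more involved:

    import numpy as np
    from scipy.stats import norm
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    controls = rng.normal(0.0, 1.0, 500)      # non-diseased marker values
    cases = rng.normal(1.0, 1.2, 300)         # diseased marker values

    def standardize(y, reference):
        # Empirical percentile of y within the non-diseased reference
        # sample, lightly smoothed to stay strictly inside (0, 1).
        ref = np.sort(reference)
        pct = (np.searchsorted(ref, y, side="right") + 1) / (ref.size + 2)
        return norm.ppf(pct)                  # probit of the percentile

    y = np.concatenate([controls, cases])
    d = np.concatenate([np.zeros(controls.size), np.ones(cases.size)])
    z = standardize(y, controls)

    # Logistic regression on the standardized marker: the fitted model
    # now speaks the same language as the ROC analysis.
    model = sm.Logit(d, sm.add_constant(z)).fit(disp=0)
    print(model.params)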

  13. ICT Standardization and use of ICT standards: a firm level analysis

    OpenAIRE

    Riillo, Cesare Fabio Antonio

    2014-01-01

    Standards perform some fundamental economic functions and their relevance for ICT is acknowledged by firms, researchers and policy-makers. This paper investigates the driving forces of formal ICT standards setting (i.e. standardization). Previous quantitative studies have neglected that ICT standards use and engagement in ICT standardization are related activities. Leveraging upon a unique module of the ICT usage survey 2013 for Luxembourg, the analysis explicitly takes into account the use o...

  14. Cancer Driver Log (CanDL): Catalog of Potentially Actionable Cancer Mutations.

    Science.gov (United States)

    Damodaran, Senthilkumar; Miya, Jharna; Kautto, Esko; Zhu, Eliot; Samorodnitsky, Eric; Datta, Jharna; Reeser, Julie W; Roychowdhury, Sameek

    2015-09-01

    Massively parallel sequencing technologies have enabled characterization of genomic alterations across multiple tumor types. Efforts have focused on identifying driver mutations because they represent potential targets for therapy. However, because of the presence of driver and passenger mutations, it is often challenging to assign the clinical relevance of specific mutations observed in patients. Currently, there are multiple databases and tools that provide in silico assessment for potential drivers; however, there is no comprehensive resource for mutations with functional characterization. Therefore, we created an expert-curated database of potentially actionable driver mutations for molecular pathologists to facilitate annotation of cancer genomic testing. We reviewed the scientific literature to identify variants that have been functionally characterized in vitro or in vivo as driver mutations. We obtained the chromosome location and all possible nucleotide positions for each amino acid change and uploaded them to the Cancer Driver Log (CanDL) database with an associated literature reference indicating functional driver evidence. In addition to a simple interface, the database allows users to download all or selected genes as a comma-separated values file for incorporation into their own analysis pipeline. Furthermore, the database includes a mechanism for third-party contributions to support updates for novel driver mutations. Overall, this freely available database will facilitate rapid annotation of mutations in cancer genomic testing in molecular pathology laboratories. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
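    As a concrete, hypothetical illustration of the intended workflow (downloading the log as a CSV and folding it into an annotation pipeline), consider the sketch below; the file name and column names are invented for illustration, not CanDL's actual export schema:

    import csv

    def load_driver_log(path):
        # Index the curated log by (gene, protein change); the column
        # names "Gene", "Protein_Change" and "Citation" are assumptions.
        with open(path, newline="") as fh:
            return {(r["Gene"], r["Protein_Change"]): r["Citation"]
                    for r in csv.DictReader(fh)}

    drivers = load_driver_log("candl_export.csv")

    # Annotate a toy list of variant calls against the driver log.
    calls = [("BRAF", "V600E"), ("TP53", "P72R")]
    for gene, change in calls:
        citation = drivers.get((gene, change))
        status = f"potential driver ({citation})" if citation else "unannotated"
        print(gene, change, "->", status)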

  15. FTIR Study of Combustion Species in Several Regions of a Candle Flame

    Science.gov (United States)

    White, Allen R.

    2013-06-01

    The complex chemical structure of the fuel in a candle flame, paraffin, is broken down into smaller hydrocarbons in the dark region just above the candle wick during combustion. This creates fuel-rich, fuel-lean, hydrocarbon-reaction, and combustion-product regions in the flame that are spectroscopically rich, particularly in the infrared. IR emissions were measured for each reaction region via collection optics focused into an FTIR and used to identify the IR-active species present in that region and, when possible, the temperature of the sampling region. The results of the measurements are useful for combustion reaction modeling as well as for future validation of mass spectrometry sampling systems.

  16. Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.

    Science.gov (United States)

    American Society for Testing and Materials, Philadelphia, PA.

    Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…

  17. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. © The Author 2016. Published by Oxford University Press.

  18. Risk Analysis as Regulatory Science: Toward The Establishment of Standards

    Science.gov (United States)

    Murakami, Michio

    2016-01-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional ‘Standard I’, which has a paternalistic orientation, and ‘Standard II’, established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751

  19. Lung inflammation and genotoxicity in mice lungs after pulmonary exposure to candle light combustion particles

    DEFF Research Database (Denmark)

    Skovmand, Astrid; Damiao Gouveia, Ana Cecilia; Koponen, Ismo Kalevi

    2017-01-01

    Candle burning produces a large amount of particles that contribute substantially to the exposure to indoor particulate matter. The exposures to various types of combustion particles, such as diesel exhaust particles, have been associated with increased risk of lung cancer by mechanisms that invo...

  20. Burning a Candle in a Vessel, a Simple Experiment with a Long History

    Science.gov (United States)

    Vera, Francisco; Rivera, Rodrigo; Nunez, Cesar

    2011-01-01

    The experiment in which a candle is burned inside an inverted vessel partially immersed in water has a history of more than 2,200 years, but even nowadays it is common that students and teachers relate the change in volume of the enclosed air to its oxygen content. Contrary to what many people think, Lavoisier concluded that any change in volume…

  1. CANDLES - Search for Neutrino-less Double Beta Decay of 48Ca

    Science.gov (United States)

    Umehara, Saori; Candles Collaboration

    2014-09-01

    CANDLES is a project to search for the neutrino-less double beta decay (0νββ) of 48Ca. The CANDLES system aims at a highly sensitive measurement through a characteristic detector system and 48Ca enrichment. The system realizes a complete 4π active shield by immersing the CaF2 scintillators in liquid scintillator. The active shield provided by the liquid scintillator effectively rejects background events of external origin. On the other hand, we have studied 48Ca enrichment and have succeeded in obtaining enriched 48Ca, albeit in small amounts. We have now developed the CANDLES III system, which contains 350 g of 48Ca without enrichment, at the Kamioka underground laboratory. Two improvements, a light-concentration system and a new DAQ system, were installed in the CANDLES III system. The light-concentration system improved the energy resolution by increasing the PMT photo-coverage by 80%. The new DAQ system, which operates without dead time, improved the rejection efficiency for a characteristic background origin. We checked the detector performance with the light-concentration system and the new DAQ system. Here we report the detector performance for background rejection and the sensitivity expected with the two improvements.

  2. Neutron Activation Analysis with k0 Standardization

    International Nuclear Information System (INIS)

    Pomme, S.

    1998-01-01

    SCK-CEN's programme on Neutron Activation Analysis with k0-standardisation aims to: (1) develop and implement the k0-standardisation method for NAA; (2) exploit the inherent qualities of NAA, such as accuracy, traceability, and multi-element capability; and (3) acquire technical spin-off for nuclear measurement services. Main achievements in 1997 are reported

  3. Uranium hydrogeochemical and stream sediment reconnaissance of the Candle NTMS quadrangle, Alaska

    International Nuclear Information System (INIS)

    Hardy, L.C.; D'Andrea, R.F. Jr.; Zinkl, R.J.

    1982-07-01

    This report presents results of a Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) of the Candle NTMS quadrangle, Alaska. In addition to this abbreviated data release, more complete data are available to the public in machine-readable form. These machine-readable data, as well as quarterly or semiannual program progress reports containing further information on the HSSR program in general, or on the Los Alamos National Laboratory (LANL) portion of the program in particular, are available from DOE's Technical Library at its Grand Junction Area Office. Presented in this data release are location data, field analyses, and laboratory analyses of several different sample media. For the sake of brevity, many field site observations have not been included in this volume; these data are, however, available on the magnetic tape. Appendices A through D describe the sample media and summarize the analytical results for each medium. The data have been subdivided by one of the Los Alamos National Laboratory sorting programs of Zinkl and others (1981a) into groups of stream-sediment, lake-sediment, stream-water, and lake-water samples. For each group which contains a sufficient number of observations, statistical tables, tables of raw data, and 1:1,000,000 scale maps of pertinent elements have been included in this report. Also included are maps showing results of multivariate statistical analyses. Information on the field and analytical procedures used by the Los Alamos National Laboratory during sample collection and analysis may be found in any HSSR data release prepared by the Laboratory and will not be included in this report

  4. 76 FR 46277 - Petroleum Wax Candles From the People's Republic of China: Final Results of Request for Comments...

    Science.gov (United States)

    2011-08-02

    ... solicited comments from interested parties on the best method to consider whether novelty candles should or..., LLC, and Accent Imports, respectively, for scope rulings to determine whether each company's...

  5. Standardization: using comparative maintenance costs in an economic analysis

    OpenAIRE

    Clark, Roger Nelson

    1987-01-01

    Approved for public release; distribution is unlimited This thesis investigates the use of comparative maintenance costs of functionally interchangeable equipments in similar U.S. Navy shipboard applications in an economic analysis of standardization. The economics of standardization, life-cycle costing, and the Navy 3-M System are discussed in general. An analysis of 3-M System maintenance costs for a selected equipment, diesel engines, is conducted. The potential use of comparative ma...

  6. Compilation of Published PM2.5 Emission Rates for Cooking, Candles and Incense for Use in Modeling of Exposures in Residences

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Tianchao [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Logue, Jennifer M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-08-01

    A recent analysis of health impacts from air pollutant inhalation in homes found that PM2.5 is the most damaging at the population level. Chronic exposure to elevated PM2.5 can damage the human respiratory system and may result in premature death. PM2.5 exposures in homes can be mitigated through various approaches, including kitchen exhaust ventilation, filtration, indoor pollutant source reduction, and designing ventilation systems to reduce the entry of PM2.5 from outdoors. Analysis of the potential benefits and costs of these approaches can be accomplished using computer codes that simulate the key physical processes, including emissions, dilution and ventilation. The largest sources of PM2.5 in residences broadly are entry from outdoors and emissions from indoor combustion. The largest indoor sources are tobacco combustion (smoking), cooking, and the burning of candles and incense. Data on the magnitude of PM2.5 and other pollutant emissions from these events and processes are required to conduct simulations for analysis. The goal of this study was to produce a database of pollutant emission rates associated with cooking and the burning of candles and incense. The target use of these data is indoor air quality modeling.
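    The kind of simulation these emission rates feed is, at its simplest, a single-zone well-mixed mass balance, dC/dt = E/V - (a + k)C; the sketch below illustrates it with placeholder parameter values (emission rate, house volume, air-exchange and deposition rates are assumptions, not values from the report):

    import numpy as np

    # E: source emission rate (ug/min), V: house volume (m3),
    # a: air-exchange rate, k: deposition rate (both 1/h).
    E, V = 500.0, 300.0          # e.g. a sooting candle burning in a small house
    a, k = 0.5, 0.4
    loss = (a + k) / 60.0        # combined loss rate, per minute

    dt, T = 1.0, 240             # 1-min time step, 4-h simulation
    burn = 60                    # source on for the first hour only
    C = np.zeros(T)
    for t in range(1, T):
        source = E / V if t < burn else 0.0
        C[t] = C[t-1] + dt * (source - loss * C[t-1])   # forward-Euler step

    print(f"peak indoor PM2.5: {C.max():.0f} ug/m3")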

  7. Establishing working standards of chromosome aberrations analysis for biological dosimetry

    International Nuclear Information System (INIS)

    Bui Thi Kim Luyen; Tran Que; Pham Ngoc Duy; Nguyen Thi Kim Anh; Ha Thi Ngoc Lien

    2015-01-01

    Biological dosimetry is a dose assessment method using specific biomarkers of radiation. The IAEA (International Atomic Energy Agency) and ISO (International Organization for Standardization) have established that the dicentric chromosome is specific to radiation; it is the gold standard for biodosimetry. Along with the documents published by the IAEA, WHO, ISO and OECD, our results from studies of radiation-induced chromosome aberrations were organized systematically into nine standards dealing with the chromosome aberration test and the micronucleus test in human peripheral blood lymphocytes in vitro. These standards address: the reference dose-effect curves for dose estimation, the minimum detection levels, cell culture, slide preparation, the scoring procedure for chromosome aberrations used in biodosimetry, the criteria for converting aberration frequencies into absorbed dose, and the reporting of results. Following these standards, the automatic analysis devices were calibrated to improve the biological dosimetry method. These standards will be used to acquire and maintain accreditation of the Biological Dosimetry laboratory at the Nuclear Research Institute. (author)
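    The dose-estimation step that such standards codify usually inverts a linear-quadratic calibration curve, Y = c + aD + bD^2, for the observed dicentric yield Y. A minimal sketch follows; the coefficients are plausible placeholders for a gamma-ray curve, and each laboratory must use its own fitted calibration:

    import numpy as np

    # Linear-quadratic dose response for dicentrics: Y = c + a*D + b*D^2
    # (yields per cell, D in Gy). Placeholder coefficients.
    c, a, b = 0.001, 0.02, 0.06

    dicentrics, cells = 25, 500
    Y = dicentrics / cells

    # Solve the quadratic for the positive root.
    D = (-a + np.sqrt(a**2 + 4 * b * (Y - c))) / (2 * b)
    print(f"estimated whole-body dose: {D:.2f} Gy")   # -> about 0.75 Gy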

  8. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    Full Text Available The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  9. Concrete blocks. Analysis of UNE, ISO en standards and comparison with other international standards

    Directory of Open Access Journals (Sweden)

    Álvarez Alonso, Marina

    1990-12-01

    Full Text Available This paper attempts to describe the recently approved UNE standards through a systematic analysis of the main specifications therein contained and the values considered for each of them, as well as the drafts for the ISO and EN concrete block standards. Furthermore, the study tries to place the UNE standards in the international environment through a comparative analysis against a representative sample of the standards prevailing in various geographical regions of the globe, to determine the analogies and differences among them. KEY WORDS: masonry, system analysis, concrete blocks, masonry walls, standards


  10. Standardization of Image Quality Analysis – ISO 19264

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad; Wüller, Dietmar

    2016-01-01

    There are a variety of image quality analysis tools available for the archiving world, which are based on different test charts and analysis algorithms. ISO has formed a working group in 2012 to harmonize these approaches and create a standard way of analyzing the image quality for archiving...

  11. Bootstrap Standard Error Estimates in Dynamic Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Browne, Michael W.

    2010-01-01

    Dynamic factor analysis summarizes changes in scores on a battery of manifest variables over repeated measurements in terms of a time series in a substantially smaller number of latent factors. Algebraic formulae for standard errors of parameter estimates are more difficult to obtain than in the usual intersubject factor analysis because of the…

  12. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards are a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end-user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the Life-Cycle Cost model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
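    A stripped-down version of the LCC comparison shows exactly where the discount rate enters; all numbers below are invented for illustration, not drawn from a rulemaking:

    def lcc(first_cost, annual_energy_cost, discount_rate, lifetime_years):
        """Life-cycle cost: purchase price plus discounted operating costs.
        A simplified sketch of the comparison in standards analysis; the
        real model samples building types, tariffs and usage from survey
        data rather than using single point values."""
        pv = sum(annual_energy_cost / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
        return first_cost + pv

    base = lcc(1000.0, 220.0, 0.05, 15)    # baseline unit
    eff = lcc(1150.0, 170.0, 0.05, 15)     # higher-efficiency unit
    print(f"LCC savings at the considered level: ${base - eff:.0f}")

    Rerunning the comparison with a higher discount rate shrinks the present value of the energy savings and can flip the sign of the result, which is why the discount-rate estimate warrants its own report.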

  13. Standard Guide for Wet Sieve Analysis of Ceramic Whiteware Clays

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This guide covers the wet sieve analysis of ceramic whiteware clays. This guide is intended for use in testing shipments of clay as well as for plant control tests. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  14. National Bureau of Standards coal flyash (SRM 1633a) as a multielement standard for instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Korotev, R.L.

    1987-01-01

    The U.S. National Bureau of Standards standard reference material 1633a (coal flyash) was standardized for the concentrations of 29 elements against chemical standards by instrumental neutron activation analysis. United States Geological Survey basalt standard BCR-1 was analyzed concurrently as a check. SRM 1633a is a good multielement comparator standard for geochemical analysis for 25 of the elements analyzed and is a better standard than rock-powder SRMs commonly used. Analytical data for USGS DTS-1, PCC-1, GSP-1, BIR-1, DNC-1, and W-2; NBS SRMs 278 and 688; and GIT-IWG (French) anorthosite AN-G are also presented. (author)

  15. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.)

  16. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. This model shows the presentation format, the origin, and the details of the minimal information required by CNEN (Comissao Nacional de Energia Nuclear) aiming to evaluate the requests of construction permits and operation licenses made according to the legislation in force. (E.G.)

  17. Ultra Long Period Cepheids: a primary standard candle up to the Hubble flow.

    NARCIS (Netherlands)

    Saha, Abhijit; Fiorentino, Giuliana; Aloisi, Alessandra; van der Marel, Roeland; Annibali, Francesca; Clementini, Gisella; Tosi, Monica; Marconi, Marcella; Musella, Ilaria

    The cosmological distance ladder crucially depends on Classical Cepheids (CCs, with P=3-70d), which are primary distance indicators up to 25 Mpc. Within this volume, only a few SNe Ia have been calibrated through CCs, and even these carry uncertainties from the non-linearity and the metallicity

  18. When a Standard Candle Flickers: Crab Nebula Variations in Hard X-rays

    Science.gov (United States)

    Wilson-Hodge, Colleen A.

    2012-01-01

    The Crab Nebula was surprisingly variable from 2001-2010, with less variability before 2001 and since mid-2010. We presented evidence for spectral softening from RXTE, Swift/BAT, and Fermi GBM during the mid-2008-2010 flux decline. We will miss RXTE, but will continue our monitoring program using Fermi/GBM, MAXI, and Swift/BAT.

  19. Compassionate care? A critical discourse analysis of accreditation standards.

    Science.gov (United States)

    Whitehead, Cynthia; Kuper, Ayelet; Freeman, Risa; Grundland, Batya; Webster, Fiona

    2014-06-01

    We rely upon formal accreditation and curricular standards to articulate the priorities of professional training. The language used in standards affords value to certain constructs and makes others less apparent. Leveraging standards can be a useful way for educators to incorporate certain elements into training. This research was designed to look for ways to embed the teaching and practice of compassionate care into Canadian family medicine residency training. We conducted a Foucauldian critical discourse analysis of compassionate care in recent formal family medicine residency training documents. Critical discourse analysis is premised on the notion that language is connected to practices and to what is accorded value and power. We assembled an archive of texts and examined them to analyse how compassionate care is constructed, how notions of compassionate care relate to other key ideas in the texts, and the implications of these framings. There were very few words, metaphors or statements that related to concepts of compassionate care in our archive. Even potential proxies, notably the doctor-patient relationship and patient-centred care, were not primarily depicted in ways that linked them to ideas of compassion or caring. There was a reduction in language related to compassionate care in the 2013 standards compared with the standards published in 2006. Our research revealed negative findings and a relative absence of the construct of compassionate care in our archival documents. This work demonstrates how a shift in curricular focus can have the unintended consequence of making values that are taken for granted less visible. Given that standards shape training, we must pay attention not only to what we include, but also to what we leave out of formal documents. We risk losing important professional values from training programmes if they are not explicitly highlighted in our standards. © 2014 John Wiley & Sons Ltd.

  20. Development of the standards for probabilistic analysis of security

    International Nuclear Information System (INIS)

    Nelson, P. F.; Gonzalez C, M.

    2008-01-01

    The American Society of Mechanical Engineers (ASME) standard for Probabilistic Safety Analysis (APS), for applications in nuclear plants, was originally limited to a Level 1 APS for internal events. However, recent efforts by the ASME committee on nuclear risk management, together with the American Nuclear Society (ANS) committee on risk-informed standards, have produced an improved standard that combines the original ASME APS standard for internal events with internal fires and external events, with a place reserved for events occurring at low power and shutdown. This integrated standard will be used by nuclear plants and regulators to carry out risk-informed applications. The use of APS has matured to the point that risk management programs have been developed and are being used as part of decision making at nuclear facilities. The standard provides criteria to evaluate the technical capabilities of an APS relative to a particular topic, allowing APS specialists to determine whether the elements of the APS are technically adequate for a particular risk-informed application. Risk-informed applications such as in-service inspection and risk-informed technical specifications save time and resources, not only for the plants but for the regulator as well. (Author)

  1. On criteria for examining analysis quality with standard reference material

    International Nuclear Information System (INIS)

    Yang Huating

    1997-01-01

    The advantages, disadvantages, and applicability of several criteria for examining analysis quality with a standard reference material are discussed. The combined uncertainty of the instrument under examination and of the reference material should be determined on the basis of the specific situation. When the instrument's uncertainty is not available, it is acceptable to substitute a stated multiple of the standard deviation for the uncertainty. The examination should not lead to reporting larger errors for routine measurements than actually exist; overly strict examination criteria should likewise be avoided
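    One widely used criterion of this kind, shown here as a generic zeta-score sketch rather than the specific criteria the paper evaluates, compares the bias against the combined standard uncertainties:

    import numpy as np

    def zeta_score(measured, u_meas, certified, u_cert):
        """zeta = (x - x_ref) / sqrt(u_x^2 + u_ref^2); |zeta| <= 2 is a
        common acceptance limit (~95 % coverage). If u_meas is unknown,
        a stated multiple of the observed standard deviation may be
        substituted, as the record suggests."""
        return abs(measured - certified) / np.hypot(u_meas, u_cert)

    z = zeta_score(10.42, 0.15, 10.20, 0.08)
    print(f"zeta = {z:.2f} ->", "pass" if z <= 2 else "fail")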

  2. Soil texture analysis by laser diffraction - standardization needed

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Palviainen, M.; Kjønaas, O. Janne

    2017-01-01

    Soil texture is a central soil quality property. Laser diffraction (LD) for determination of particle size distribution (PSD) is now widespread due to easy analysis and low cost. However, pretreatment methods and interpretation of the resulting soil PSDs are not standardized. Comparison of LD data... and many newer; ISO 13320:2009). PSD uncertainty caused by pretreatments and PSD bias caused by plate-shaped clay particles still call for more method standardization work. If LD is used more generally, new pedotransfer functions for other soil properties (e.g. water retention) based on sieving...

  3. Provenience studies using neutron activation analysis: the role of standardization

    Energy Technology Data Exchange (ETDEWEB)

    Harbottle, G

    1980-01-01

    This paper covers the historical background of chemical analysis of archaeological artifacts which dates back to 1790 to the first application of neutron activation analysis to archaeological ceramics and goes on to elaborate on the present day status of neutron activation analysis in provenience studies, and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of neutron flux, its intensity, duration and spectral (energy) distribution, plus an exact gamma ray count calibrated for efficiency, corrected for branching rates, etc. However, in practice it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on interlaboratory comparability of ceramic data; how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed. All of this may be summarized under the general heading of Analytical Quality Control - i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined.
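
    The "compare one's unknown to a standard" practice described here is the comparator method of NAA. Below is a minimal sketch of the underlying ratio arithmetic, under the assumption that sample and standard are irradiated and counted identically (decay and geometry corrections omitted for brevity); all names and numbers are illustrative.

```python
def comparator_concentration(counts_sample, mass_sample,
                             counts_standard, mass_standard,
                             conc_standard):
    """Comparator-method NAA: with identical irradiation and counting
    conditions, flux, cross-section and efficiency terms cancel in the ratio."""
    specific_sample = counts_sample / mass_sample        # counts per gram
    specific_standard = counts_standard / mass_standard
    return conc_standard * specific_sample / specific_standard

# Illustrative: a ceramic sherd counted against a standard of known Fe content.
print(comparator_concentration(counts_sample=15200, mass_sample=0.100,
                               counts_standard=9800, mass_standard=0.050,
                               conc_standard=4.2))       # -> ~3.26 (% Fe)
```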

  4. Provenience studies using neutron activation analysis: the role of standardization

    International Nuclear Information System (INIS)

    Harbottle, G.

    1980-01-01

    This paper covers the historical background of chemical analysis of archaeological artifacts which dates back to 1790 to the first application of neutron activation analysis to archaeological ceramics and goes on to elaborate on the present day status of neutron activation analysis in provenience studies, and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of neutron flux, its intensity, duration and spectral (energy) distribution, plus an exact gamma ray count calibrated for efficiency, corrected for branching rates, etc. However, in practice it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on interlaboratory comparability of ceramic data; how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed. All of this may be summarized under the general heading of Analytical Quality Control - i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of Analytical Quality Control to provenience location is also examined

  5. Recommendations for a proposed standard for performing systems analysis

    International Nuclear Information System (INIS)

    LaChance, J.; Whitehead, D.; Drouin, M.

    1998-01-01

    In August 1995, the Nuclear Regulatory Commission (NRC) issued a policy statement proposing improved regulatory decisionmaking by increasing the use of PRA [probabilistic risk assessment] in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. A key aspect of using PRA in risk-informed regulatory activities is establishing the appropriate scope and attributes of the PRA. In this regard, ASME decided to develop a consensus PRA Standard. The objective is to develop a PRA Standard such that the technical quality of nuclear plant PRAs will be sufficient to support risk-informed regulatory applications. This paper presents example recommendations for the systems analysis element of a PRA for incorporation into the ASME PRA Standard

  6. Identification of predominant odorants in thai desserts flavored by smoking with "Tian Op", a traditional Thai scented candle.

    Science.gov (United States)

    Watcharananun, Wanwarang; Cadwallader, Keith R; Huangrak, Kittiphong; Kim, Hun; Lorjaroenphon, Yaowapa

    2009-02-11

    "Tian Op", a traditional Thai scented candle, is used for the smoking and flavoring of sweets, cakes, and other desserts for the purpose of adding a unique aroma to the final product. Gas chromatography-olfactometry, aroma extract dilution analysis, and GC-MS were applied to identify the potent odorants in two types of traditional Thai desserts ("num dok mai" and "gleep lum duan") prepared using a Tian Op smoking process. On the basis of the results of AEDA and calculated odor-activity values, the predominant odorants in the Tian Op flavored desserts were vinyl ketones (C(5)-C(9)), n-aldehydes (C(5)-C(11)), (E)-2-unsaturated aldehydes (C(8)-C(11)), and omega-1-unsaturated aldehydes (C(8) and C(9)). Sensory studies of model mixtures confirmed the importance of n-aldehydes, omega-1-unsaturated aldehydes, and guaiacol as predominant odorants; however, the results showed that vinyl ketones and (E)-2-unsaturated aldehydes, despite having high odor-activity values, may be of only minor importance in the typical aroma profiles of traditional Tian Op smoked desserts.

  7. Fail-safe shut-off valve for filtering systems employing candle filters

    Science.gov (United States)

    VanOsdol, John [Fairmont, WV

    2006-01-03

    The invention relates to an apparatus that acts as a fail-safe shut-off valve. More specifically, the invention relates to a fail-safe shut-off valve that allows fluid flow during normal operating conditions, but prevents the flow of fluids in the event of an upstream system failure that causes over-pressurization. The present invention is particularly well suited for use in conjunction with hot gas filtering systems, which utilize ceramic candle filters. Used in such a hot gas system, the present invention stops the flow of hot gas and prevents any particulate-laden gas from entering the clean side of the system.

  8. DEVELOPMENT AND UTILIZATION OF TEST FACILITY FOR THE STUDY OF CANDLE FILTER SURFACE REGENERATION

    Energy Technology Data Exchange (ETDEWEB)

    Bruce S. Kang; Eric K. Johnson

    2003-07-14

    Hot gas particulate filtration is a basic component in advanced power generation systems such as Integrated Gasification Combined Cycle (IGCC) and Pressurized Fluidized Bed Combustion (PFBC). These systems require effective particulate removal to protect the downstream gas turbine and also to meet environmental emission requirements. The ceramic barrier filter is one of the options for hot gas filtration. Hot gases flow through ceramic candle filters, leaving ash deposited on the outer surface of the filter. A process known as surface regeneration periodically removes the deposited ash by using a high-pressure pulse of gas to back-flush the filter. After this cleaning process has been completed, some residual ash may remain on the filter surface; this residual ash may grow and eventually lead to mechanical failure of the filter. A Room Temperature Test Facility (RTTF) and a High Temperature Test Facility (HTTF) were built to investigate the ash characteristics during surface regeneration at room and selected high temperatures. The RTTF system was used to gain experience with the selected instrumentation and to develop an operating procedure to be used later at elevated temperatures. The HTTF system is capable of conducting surface regeneration tests of a single candle filter at temperatures up to 1500 °F. In order to obtain sequential digital images of the ash particle distribution during the surface regeneration process, a high-resolution, high-speed image acquisition system was integrated into the HTTF system. The regeneration pressure and the transient pressure difference between the inside of the candle filter and the chamber during regeneration were measured using a high-speed PC data acquisition system. The control variables for the high-temperature regeneration tests were (1) face velocity, (2) pressure of the back pulse, and (3) cyclic ash build-up time. Coal ash sample obtained from the Power System Development Facility (PSDF) at Wilsonville, AL was used at the

  9. Standardizing Handoff Communication: Content Analysis of 27 Handoff Mnemonics.

    Science.gov (United States)

    Nasarwanji, Mahiyar F; Badir, Aysel; Gurses, Ayse P

    2016-01-01

    This study synthesizes information contained in 27 mnemonics to identify what information should be communicated during a handoff. Clustering and content analysis resulted in 12 primary information clusters that should be communicated. Given the large amount of information identified, it would be beneficial to use a structured handoff communication tool developed using a participatory approach. In addition, we recommend local standardization of information communicated during handoffs with variation across settings.

  10. 1997 Accession Medical Standards Analysis & Research Activity (AMSARA) Annual Report

    Science.gov (United States)

    1998-05-01

    [Extraction residue in place of an abstract: fragments of a table of disqualifying medical conditions with counts and rates (chronic knee pain, hearing, eating disorder, adjustment disorder, lower-extremity bone injury, hypertension), a citation (Genetic Influences in Childhood-Onset Psychiatric Disorders: Autism and Attention-Deficit/Hyperactivity Disorder. Am J Hum Genet 1997;60:1276-1282), and report keywords (active duty, Attention-Deficit/Hyperactivity Disorder, Armed Forces Qualifying Test, Academic Skills Defect, Accession Medical Standards Analysis and Research Activity).]

  11. Neutron activation analysis for certification of standard reference materials

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Perez Zayas, G.; Hernandez Rivero, A.; Ribeiro Guevara, S.

    1996-01-01

    Neutron activation analysis is used extensively as one of the analytical techniques in the certification of standard reference materials. The characteristics of neutron activation analysis that make it valuable in this role are its accuracy, its multielemental capability to assess homogeneity, its high sensitivity for many elements, and its essentially non-destructive nature. This paper reports the concentrations of 30 elements (major, minor and trace) in four Cuban samples. The samples were irradiated in a thermal neutron flux of 10^12-10^13 n.cm^-2.s^-1. The gamma ray spectra were measured with HPGe detectors and analyzed using the ACTAN program, developed at the Center of Applied Studies for Nuclear Development

  12. A Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Logan, Jeffrey [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  13. Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, P.; Logan, J.; Bird, L.; Short, W.

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  14. Analysis and suggestions on standard system for general nuclear instruments

    International Nuclear Information System (INIS)

    Xiong Zhenglong

    1999-08-01

    The standards system for general nuclear instruments is analyzed, and the following suggestions are made to address its problems: earnestly adopt international standards and promote Chinese standards internationally; appropriately regularize the system's framework and the configuration of individual standards to make the system more scientific, complete and applicable; strengthen the development of technical and basic standards to promote the standardization of nuclear instruments as a whole; and supplement the standards for testing methods while straightening out the hierarchy of standards, to further complete the system. In short, all of these measures aim to enhance the quality, readability and usability of the standards, so that the standards can fully exert their effects

  15. On Picturing a Candle: The Prehistory of Imagery Science.

    Science.gov (United States)

    MacKisack, Matthew; Aldworth, Susan; Macpherson, Fiona; Onians, John; Winlove, Crawford; Zeman, Adam

    2016-01-01

    The past 25 years have seen a rapid growth of knowledge about brain mechanisms involved in visual mental imagery. These advances have largely been made independently of the long history of philosophical - and even psychological - reckoning with imagery and its parent concept 'imagination'. We suggest that the view from these empirical findings can be widened by an appreciation of imagination's intellectual history, and we seek to show how that history both created the conditions for - and presents challenges to - the scientific endeavor. We focus on the neuroscientific literature's most commonly used task - imagining a concrete object - and, after sketching what is known of the neurobiological mechanisms involved, we examine the same basic act of imagining from the perspective of several key positions in the history of philosophy and psychology. We present positions that, firstly, contextualize and inform the neuroscientific account, and secondly, pose conceptual and methodological challenges to the scientific analysis of imagery. We conclude by reflecting on the intellectual history of visualization in the light of contemporary science, and the extent to which such science may resolve long-standing theoretical debates.

  16. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  17. Nuclear microprobe analysis of the standard reference materials

    International Nuclear Information System (INIS)

    Jaksic, M.; Fazinic, S.; Bogdanovic, I.; Tadic, T.

    2002-01-01

    Most presently existing Standard Reference Materials (SRMs) for nuclear analytical methods are certified for an analyzed mass of the order of a few hundred mg. The typical mass of a sample analyzed by PIXE or XRF is very often below 1 mg, and with the development of focused proton or X-ray beams the analyzed mass goes down to the μg or even ng level. It is difficult to produce biological or environmental SRMs with the desired homogeneity at such a small scale. However, the use of fundamental-parameter quantitative evaluation procedures (an absolute method) minimizes the need for SRMs. The PIXE and micro-PIXE setup at our Institute uses the fundamental-parameter approach, so just one standard sample is needed for exact calibration of the quantitative analysis procedure; in our case glass standards, which showed homogeneity down to the micron scale, were used. Of course, it is desirable to use SRMs for quality assurance, so the need for homogeneous materials can be justified even for the micro-PIXE method. In this presentation a brief overview of the PIXE setup calibration is given, along with some recent results of tests of several SRMs
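
    A hedged sketch of what single-standard calibration amounts to: the fundamental-parameter code supplies the physics, and one well-characterized glass standard fixes the absolute counts-to-concentration scale. Names and numbers below are illustrative assumptions, not the Institute's actual procedure.

```python
def calibration_factor(counts_standard, conc_standard):
    """Counts per concentration unit, fixed by a single standard."""
    return counts_standard / conc_standard

def concentration(counts_unknown, cal_factor):
    # Matrix and absorption corrections, handled by the fundamental-parameter
    # code in practice, are omitted here for brevity.
    return counts_unknown / cal_factor

h = calibration_factor(counts_standard=52000, conc_standard=5.0)  # per wt-%
print(concentration(counts_unknown=31000, cal_factor=h))          # ~2.98 wt-%
```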

  18. Fabrication of Water Jet Resistant and Thermally Stable Superhydrophobic Surfaces by Spray Coating of Candle Soot Dispersion.

    Science.gov (United States)

    Qahtan, Talal F; Gondal, Mohammed A; Alade, Ibrahim O; Dastageer, Mohammed A

    2017-08-08

    A facile synthesis method for a highly stable carbon nanoparticle (CNP) dispersion in acetone by incomplete combustion of a paraffin candle flame is presented. The synthesized CNP dispersion is a mixture of graphitic and amorphous carbon nanoparticles in the 20-50 nm size range, exhibiting mesoporosity with an average pore size of 7 nm and a BET surface area of 366 m^2 g^-1. As an application of this material, the carbon nanoparticle dispersion was spray coated (spray-based coating) on a glass surface to fabricate superhydrophobic (water contact angle > 150° and sliding angle < 10°) surfaces, in comparison with surfaces fabricated from direct candle flame soot deposition (candle-based coating). This study proved that water-jet-resistant and thermally stable superhydrophobic surfaces can be easily fabricated by simple spray coating of a CNP dispersion gathered from incomplete combustion of a paraffin candle flame, and this technique can be used for different applications with the potential for large-scale fabrication.

  19. Tracking themes in Manuchehri’s candle conundrum in Arabic and Persian poetry till the end of 7th century AH

    Directory of Open Access Journals (Sweden)

    Parvane Saneai

    2016-09-01

    Manouchehri derived some of the themes of his candle-conundrum ode from Arabic literature. Following him, other poets applied such themes in their own poems. At the beginning of this stream stands the composition of the candle conundrum, whose first traces can be seen in Arabic poetry. Following Manouchehri, poets have used themes such as: shortening the wick of the candle, the candle's yellow face, the candle's laughing and crying, the candle's soul and body, and love and lovers.

  20. Polymer-based candle-shaped microneedle electrodes for electroencephalography on hairy skin

    Science.gov (United States)

    Arai, Miyako; Kudo, Yuta; Miki, Norihisa

    2016-06-01

    In this paper, we report on the optimization of the shape of dry microneedle electrodes for electroencephalography (EEG) at hairy locations and compare the electrodes we developed with conventional wet electrodes. We propose the use of SU-8-based candle-shaped microneedle electrodes (CMEs), which have pillars of 1.0 mm height and 0.4 mm diameter with a gap of 0.43 mm between pillars. Microneedles are formed on the tops of the pillars. The shape was determined by how well the pillars can avoid hairs while supporting the microneedles as they penetrate the stratum corneum. The skin-electrode contact impedances of the fabricated CMEs were found to be higher and less stable than those of conventional wet electrodes. However, the CMEs successfully acquired signals with qualities as good as those of conventional wet electrodes. Given the usability of the CMEs, which require neither skin preparation nor gel, they are promising alternatives to conventional wet electrodes.

  1. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. The specificities that must be respected in execution, relative to the rest of the food industry, stem from the distinctive food-serving procedures in catering, numerous complex recipes and production technologies, staff fluctuation, and old equipment. For effective and permanent implementation, the HACCP concept requires a serious base, and in this case the base is the people handling the food. This paper presents the international ISO standards, the concept of HACCP, and the importance of its application in the tourism and hospitality industry. The HACCP concept is a food safety management system based on the analysis and control of biological, chemical and physical hazards in the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  2. Quality of Standard Reference Materials for Short Time Activation Analysis

    International Nuclear Information System (INIS)

    Ismail, S.S.; Oberleitner, W.

    2003-01-01

    Some environmental reference materials (CFA-1633b, IAEA-SL-1, SARM-1, BCR-176, Coal-1635, IAEA-SL-3, BCR-146, and SRAM-5) were analysed by short-time activation analysis. The results show that these materials can be classified into three groups according to their activities after irradiation. The results were compared in order to create a quality index for the determination of short-lived nuclides at high count rates. It was found that CFA is not a suitable standard for determining very short-lived nuclides (half-lives < 1 min) because the activity it produces is 15-fold higher than that of SL-3. Biological reference materials, such as SRM-1571, SRM-1573, SRM-1575, SRM-1577, IAEA-392, and IAEA-393, were also investigated with a higher counting-efficiency system. The quality of this system and of its well-type detector for investigating short-lived nuclides is discussed

  3. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

    Using some methods of non-standard analysis we modify one of Colombeau's classes of generalized functions. As a result we define a class ε-circumflex of the so-called meta-functions which possesses all good properties of Colombeau's generalized functions, i.e. (i) ε-circumflex is an associative and commutative algebra over the system of the so-called complex meta-numbers C-circumflex; (ii) every meta-function has partial derivatives of any order (which are meta-functions again); (iii) every meta-function is integrable on any compact set of R^n and the integral is a number from C-circumflex; (iv) ε-circumflex contains all tempered distributions S', i.e. S' is contained in ε-circumflex isomorphically with respect to all linear operations (including differentiation). Thus, within the class ε-circumflex the problem of multiplication of the tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε-circumflex). The crucial point is that C-circumflex is a field, in contrast to the system of Colombeau's generalized numbers C-bar, which is a ring only (C-bar is the counterpart of C-circumflex in Colombeau's theory). In this way we simplify and slightly improve the properties of the integral and the notion of "values of the meta-functions", as well as the properties of the whole class ε-circumflex itself, compared with the original Colombeau theory. And, what is perhaps more important, we clarify the connection between non-standard analysis and Colombeau's theory of new generalized functions, in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  4. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  5. The Trends and Prospects of Health Information Standards : Standardization Analysis and Suggestions

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Soo [Dept. of Radiological Science, College of Health Science, Catholic University of Pusan, Pusan (Korea, Republic of)

    2008-03-15

    A ubiquitous health care system, one of the solution technologies now developing out of IT, BT and NT, could give us new medical environments in the future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards: DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks, and has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations (SDOs) operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions; HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for the Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and the health industry. This paper offers suggestions, including a test bed, demonstration and specification, for how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.
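
    To give a concrete flavor of the data exchange these standards govern, here is a minimal sketch of parsing an HL7 v2 pipe-delimited message. The message content is invented and the parser is deliberately naive, not a conformant HL7 implementation.

```python
# A tiny, invented HL7 v2 message: segments separated by carriage returns,
# fields by "|", components by "^".
message = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "202403150830||ADT^A01|MSG00001|P|2.5",
    "PID|1||123456^^^HOSP^MR||DOE^JANE||19800101|F",
])

def parse_hl7(msg):
    """Group segments by their segment ID; each segment is a list of fields."""
    segments = {}
    for segment in msg.split("\r"):
        fields = segment.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

parsed = parse_hl7(message)
print("Patient name:", parsed["PID"][0][5])   # -> DOE^JANE
print("Message type:", parsed["MSH"][0][8])   # -> ADT^A01
```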

  6. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced, two of lead oxide and the others of magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. Each sample was irradiated at three points, according to a set orientation. The measurements were performed at the radiological laboratory of UTFPR using an Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were 5 μA current at 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead, and the results were compared with those expected from the stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, however, a considerable discrepancy was found, which may be the result of a misconfiguration or of contamination of the samples. Finally, a proposal was made to continue the work with an auxiliary calculation to be developed in the next step
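
    The stoichiometric side of such a comparison is just mass fractions computed from the chemical formula. A small sketch with rounded standard atomic weights; the compounds are examples consistent with those named in the abstract.

```python
# Rounded standard atomic weights (g/mol).
ATOMIC_WEIGHT = {"Pb": 207.2, "O": 16.00, "Mg": 24.31, "Cl": 35.45, "I": 126.90}

def mass_fraction(element, formula):
    """Expected mass fraction of `element` in a compound.
    formula maps element -> atom count, e.g. PbO = {"Pb": 1, "O": 1}."""
    molar_mass = sum(ATOMIC_WEIGHT[el] * n for el, n in formula.items())
    return ATOMIC_WEIGHT[element] * formula[element] / molar_mass

print(mass_fraction("Pb", {"Pb": 1, "O": 1}))     # PbO   -> ~0.928
print(mass_fraction("Cl", {"Mg": 1, "Cl": 2}))    # MgCl2 -> ~0.745
```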

  7. The impact of candle burning during All Saints' Day ceremonies on ambient alkyl-substituted benzene concentrations.

    Science.gov (United States)

    Olszowski, Tomasz; Kłos, Andrzej

    2013-11-01

    Research findings concerning benzene, toluene, ethylbenzene, meta-, para- and ortho-xylene as well as styrene (BTEXS) emissions at public cemeteries during All Saints' Day are presented here. Tests were carried out at town cemeteries in Opole and Grodków (southern Poland) and, as a benchmark, in the centres of those same towns. The purpose of the study was to estimate BTEXS emissions caused by candle burning and, equally important, to examine whether the emissions generated by the tested sources were similar to the BTEXS emissions generated by road transport. During the festive period, significant increases in benzene concentrations, by 200 % and 144 %, were noted at the cemeteries in Opole and Grodków, as well as in toluene, by 366 % and 342 %, respectively. Styrene concentrations also increased. It was demonstrated that the ratio of toluene to benzene concentrations in the emissions from burning candles is comparable to the ratio established for transportation emissions.
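
    The toluene-to-benzene (T/B) diagnostic ratio used in this kind of source comparison is straightforward to compute. A hedged sketch with invented concentrations, purely to show the arithmetic.

```python
def percent_increase(festive, baseline):
    """Relative concentration increase during the festive period, in %."""
    return 100.0 * (festive - baseline) / baseline

def t_b_ratio(toluene, benzene):
    """Toluene/benzene ratio, a common marker of traffic-like sources."""
    return toluene / benzene

# Invented concentrations in ug/m3, not the paper's data.
print(percent_increase(festive=6.0, baseline=2.0))  # -> 200.0 (%)
print(t_b_ratio(toluene=9.3, benzene=6.0))          # -> ~1.55
```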

  8. European standardization activities on residual stress analysis by neutron diffraction

    CERN Document Server

    Youtsos, A G

    2002-01-01

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a 'test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of ...

  9. Periodontal Manifestations of Chronic Atypical Neutrophilic Dermatosis With Lipodystrophy and Elevated Temperature (CANDLE) Syndrome in an 11 Year Old Patient

    OpenAIRE

    McKenna, Gerald J.; Ziada, Hassan M.

    2015-01-01

    Introduction: Chronic atypical neutrophilic dermatosis with lipodystrophy and elevated temperature (CANDLE) is an auto inflammatory syndrome caused by an autosomal recessive gene mutation. This very rare syndrome has been reported in only 14 patients worldwide. A number of clinical signs have been reported including joint contractures, muscle atrophy, microcytic anaemia, and panniculitis-induced childhood lipodystrophy. Further symptoms include recurrent fevers, purpuric skin lesions, periorb...

  10. Factor Rotation and Standard Errors in Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.

    2015-01-01

    In this article, we report a surprising phenomenon: Oblique CF-varimax and oblique CF-quartimax rotation produced similar point estimates for rotated factor loadings and factor correlations but different standard error estimates in an empirical example. Influences of factor rotation on asymptotic standard errors are investigated using a numerical…

  11. [Analysis on standardization of patient posture for acupuncture treatment].

    Science.gov (United States)

    Lu, Yonghui

    2018-02-12

    The standardization of patient posture for acupuncture treatment is discussed. Based on the views in Neijing ( Inner Canon of Huangdi ), combined with clinical acupuncture practice, it is argued that patient posture for acupuncture treatment should be standardized according to Neijing. A standardized patient posture is the foundation of acupuncture, a necessity for blood flow, and a requirement of acupuncture technique. The combination of these three elements favors the travel of spirit-qi through the meridians and acupoints, which can regulate the balance of yin and yang to treat disease. In addition, principles and methods for standardizing patient posture are proposed, and the important clinical significance of standardized patient posture for acupuncture treatment is highlighted.

  12. Analysis of improved criteria for mold growth in ASHRAE standard 160 by comparison with field observations

    Science.gov (United States)

    Samuel V. Glass; Stanley D. Gatland II; Kohta Ueno; Christopher J. Schumacher

    2017-01-01

    ASHRAE Standard 160, Criteria for Moisture-Control Design Analysis in Buildings, was published in 2009. The standard sets criteria for moisture design loads, hygrothermal analysis methods, and satisfactory moisture performance of the building envelope. One of the evaluation criteria specifies conditions necessary to avoid mold growth. The current standard requires that...

  13. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  14. European standardization activities on residual stress analysis by neutron diffraction

    International Nuclear Information System (INIS)

    Youtsos, A.G.; Ohms, C.

    2002-01-01

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a 'test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of how the strain data should be analysed to calculate stresses and establish the reliability of the results obtained. (orig.)

  15. European standardization activities on residual stress analysis by neutron diffraction

    Science.gov (United States)

    Youtsos, A. G.; Ohms, C.

    A main objective of a recently completed European research project, RESTAND - residual stress standard using neutron diffraction, was to develop industrial confidence in the application of the neutron-diffraction technique for residual stress measurement and its principal deliverable was a relevant draft code of practice. In fact this draft standard was jointly developed within RESTAND and VAMAS TWA 20 - an international pre-normative research activity. As no such standard is yet available, on the basis of this draft standard document the European Standards Committee on Non-Destructive Testing (CEN TC/138) has established a new ad hoc Work Group (AHG7). The objective of this group is the development of a European pre-standard on a `test method for measurement of residual stress by neutron diffraction'. The document contains the proposed protocol for making the measurements. It includes the scope of the method, an outline of the technique, the calibration and measurement procedures recommended, and details of how the strain data should be analysed to calculate stresses and establish the reliability of the results obtained.

  16. Standard Test Method for Isotopic Analysis of Uranium Hexafluoride by Single-Standard Gas Source Multiple Collector Mass Spectrometer Method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method is applicable to the isotopic analysis of uranium hexafluoride (UF6) with 235U concentrations less than or equal to 5 % and 234U, 236U concentrations of 0.0002 to 0.1 %. 1.2 This test method may be applicable to the analysis of the entire range of 235U isotopic compositions provided that adequate Certified Reference Materials (CRMs or traceable standards) are available. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  17. Analysis of Daylighting Requirements within ASHRAE Standard 90.1

    Energy Technology Data Exchange (ETDEWEB)

    Athalye, Rahul A.; Xie, YuLong; Liu, Bing; Rosenberg, Michael I.

    2013-08-01

    Pacific Northwest National Laboratory (PNNL), under the Building Energy Codes Program (BECP) funded by U.S. Department of Energy (DOE), provides support to the ASHRAE/IES/IESNA Standard 90.1(Standard 90.1) Standing Standards Project Committee (SSPC 90.1) and its subcommittees. In an effort to provide the ASHRAE SSPC 90.1 with data that will improve the daylighting and fenestration requirements in the Standard, PNNL collaborated with Heschong Mahone Group (HMG), now part of TRC Solutions. Combining EnergyPlus, a whole-building energy simulation software developed by DOE, with Radiance, a highly accurate illumination modeling software (Ward 1994), the daylighting requirements within Standard 90.1 were analyzed in greater detail. The initial scope of the study was to evaluate the impact of the fraction of window area compared to exterior wall area (window-to-wall ratio (WWR)) on energy consumption when daylighting controls are implemented. This scope was expanded to study the impact of fenestration visible transmittance (VT), electric lighting controls and daylighted area on building energy consumption.

  18. Standard test method for isotopic analysis of uranium hexafluoride by double standard single-collector gas mass spectrometer method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This is a quantitative test method applicable to determining the mass percent of uranium isotopes in uranium hexafluoride (UF6) samples with 235U concentrations between 0.1 and 5.0 mass %. 1.2 This test method may be applicable for the entire range of 235U concentrations for which adequate standards are available. 1.3 This test method is for analysis by a gas magnetic sector mass spectrometer with a single collector using interpolation to determine the isotopic concentration of an unknown sample between two characterized UF6 standards. 1.4 This test method is to replace the existing test method currently published in Test Methods C761 and is used in the nuclear fuel cycle for UF6 isotopic analyses. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appro...
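
    A hedged sketch of the bracketing idea in 1.3: the unknown's measured ion-current ratio is converted to an assay by linear interpolation between two characterized standards. This illustrates the principle only, not the ASTM procedure or its correction terms.

```python
def interpolate_assay(r_unknown, r_low, assay_low, r_high, assay_high):
    """Linear interpolation of 235U mass percent between two UF6 standards.
    r_*: measured 235U/238U ion-current ratios; assay_*: certified mass %."""
    frac = (r_unknown - r_low) / (r_high - r_low)
    return assay_low + frac * (assay_high - assay_low)

# Illustrative numbers only.
print(interpolate_assay(r_unknown=0.0305,
                        r_low=0.0250, assay_low=2.45,
                        r_high=0.0360, assay_high=3.50))  # -> ~2.98 mass %
```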

  19. Acoustic analysis of diphthongs in Standard South African English

    CSIR Research Space (South Africa)

    Martirosian, O

    2008-11-01

    Full Text Available This work examines the need for diphthongs in a Standard South African English (SSAE) ASR system by replacing them with selected variants and analysing the system results. We define a systematic process to identify and evaluate replacement options for diphthongs and find...

  20. Gasoline taxes or efficiency standards? A heterogeneous household demand analysis

    International Nuclear Information System (INIS)

    Liu, Weiwei

    2015-01-01

    Using detailed consumer expenditure survey data and a flexible semiparametric dynamic demand model, this paper estimates the price elasticity and fuel efficiency elasticity of gasoline demand at the household level. The goal is to assess the effectiveness of gasoline taxes and vehicle fuel efficiency standards on fuel consumption. The results reveal substantial interaction between vehicle fuel efficiency and the price elasticity of gasoline demand: the improvement of vehicle fuel efficiency leads to lower price elasticity and weakens consumers’ sensitivity to gasoline price changes. The offsetting effect also differs across households due to demographic heterogeneity. These findings imply that when gasoline taxes are in place, tightening efficiency standards will partially offset the strength of taxes on reducing fuel consumption. - Highlights: • Model household gasoline demand using a semiparametric approach. • Estimate heterogeneous price elasticity and fuel efficiency elasticity. • Assess the effectiveness of gasoline taxes and efficiency standards. • Efficiency standards offset the impact of gasoline taxes on fuel consumption. • The offsetting effect differs by household demographics
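
    The paper's semiparametric estimator is beyond a snippet, but the elasticity concept it reports can be illustrated: in a log-log demand regression, the coefficient on log price is the price elasticity. A minimal sketch on simulated data, with plain least squares standing in for the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
log_price = rng.normal(1.0, 0.2, n)    # log gasoline price
log_mpg = rng.normal(3.2, 0.15, n)     # log vehicle fuel efficiency
# Simulated demand with price elasticity -0.4 and efficiency elasticity -0.6.
log_gallons = 5.0 - 0.4 * log_price - 0.6 * log_mpg + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), log_price, log_mpg])
beta, *_ = np.linalg.lstsq(X, log_gallons, rcond=None)
print("price elasticity ~", round(beta[1], 2))       # recovers ~ -0.4
print("efficiency elasticity ~", round(beta[2], 2))  # recovers ~ -0.6
```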

  1. 75 FR 27733 - Agency Information Collection Activities; Proposed Collection; Comment Request; Standard for the...

    Science.gov (United States)

    2010-05-18

    ... Fabrics Act (``FFA''), 15 U.S.C. 1193, to reduce unreasonable risks of burn injuries and deaths from fires... mattress pad will resist ignition from a smoldering cigarette. The standard requires manufacturers to... related to mattress fires, particularly those ignited by open flame sources such as lighters, candles and...

  2. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for US DOE to use in making a determination on proposing energy conservation standards for distribution transformers as required by the Energy Policy Act of 1992. Potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. Objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  3. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Science.gov (United States)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions, and applicable human reliability metrics for performing these atomic-level tasks are available. The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. The PRA models are executed using data from human reliability data banks, and the Periodic Table is related to the PRA models via Fault Links.
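
    At the arithmetic level, executing such a PRA model means combining basic-event human error probabilities through AND/OR gates. A hedged sketch assuming independent events; the barrier structure and probabilities below are invented for illustration.

```python
def and_gate(*probs):
    """Probability that all independent events occur (all barriers fail)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Probability that at least one independent event occurs."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Invented basic events for an erroneous command reaching the spacecraft:
p_entry_error = 3e-3      # operator mistypes a command
p_review_miss = 1e-1      # independent reviewer fails to catch it
p_checker_miss = 5e-2     # automated constraint checker fails to flag it

print(and_gate(p_entry_error, p_review_miss, p_checker_miss))  # -> 1.5e-05
```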

  4. Cybersecurity Vulnerability Analysis of the PLC PRIME Standard

    Directory of Open Access Journals (Sweden)

    Miguel Seijo Simó

    2017-01-01

    Full Text Available Security in critical infrastructures such as the power grid is of vital importance. The Smart Grid puts the power grid's classical security approach on the ropes, since it introduces cyber-physical systems where devices, communications, and information systems must be protected. PoweRline Intelligent Metering Evolution (PRIME) is a Narrowband Power-Line Communications (NB-PLC) protocol widely used in the last mile of Advanced Metering Infrastructure (AMI) deployments, playing a key role in the Smart Grid. Therefore, this work aims to unveil the cybersecurity vulnerabilities present in the PRIME standard, proposing solutions and validating and discussing the results obtained.

  5. CANDLE reactor: an option for a simple, safe, highly proliferation-resistant reactor with small waste and efficient fuel use

    International Nuclear Information System (INIS)

    Sekimoto, H.

    2010-01-01

    Innovative nuclear energy systems have been investigated intensively over a long period in the COE-INES program and CRINES activities at the Tokyo Institute of Technology. Five requirements (sustainability, safety, waste, nuclear proliferation, and economy) are considered inevitable requirements for nuclear energy. The characteristics of the small LBE-cooled CANDLE fast reactor developed at this Institute are discussed against these requirements. It clearly satisfies four of them: safety, non-proliferation and safeguards, less waste, and sustainability. For the remaining requirement, economy, a high potential to satisfy it is also shown

  6. Refined analysis of piping systems according to nuclear standard regulations

    International Nuclear Information System (INIS)

    Bisconti, N.; Lazzeri, L.; Strona, P.P.

    1975-01-01

    A number of programs have been selected to perform particular analyses, partly coming from available libraries, such as SAP 4 for static and dynamic analysis, and partly written directly, such as TRATE (for thermal analysis), VASTA and VASTB (to perform the analyses required by ASME III for piping of class A and class B) and CFRS (for the calculation of floor response spectra, etc.). All the programs are automatically linked and directed by a general program (SCATCA for class A and SCATCB for class B piping). The starting point is a list of the fabrication, thermal, geometrical and seismic data. The geometrical data are plotted (to check for possible errors) and fed to SAP for static and dynamic analysis, together with the seismic data and the thermal data (average temperatures) re-elaborated by the TRATE 2 code. The raw data from SAP (weight, thermal, fixed-point displacements, seismic, other dynamic) are collected, reordered and fed to the COMBIN 2 program together with the other data from the thermal analysis (from TRATE 2). The COMBIN 2 program lists all the data; each load set to be considered is provided, for each point, with the necessary data (thermal moments, pressure, average temperatures, thermal gradients), and all the data from the seismic, weight and other dynamic analyses are also provided. This whole body of data is stored on a file and examined by the VASTA code (for class A) or VASTB (for classes B and C) in order to make a decision about the acceptability of the design. Each subprogram may have an independent output in order to check partial results. Details about each program are provided and an example is given, together with a discussion of some particular problems (thermohydraulic set definition, fatigue analysis, etc.)

  7. Drought analysis of Antalya province by standardized precipitation index (SPI

    Directory of Open Access Journals (Sweden)

    Nazmi DİNÇ

    2016-12-01

    Full Text Available Drought, occurring as a result of global warming in our country as well as over the world, is defined as precipitation in a certain time period that is lower than normal, and it negatively affects all living beings. Many drought indices have been developed to define the severity and characteristics of drought over time and space. In this study, drought characteristics were evaluated using the Standardized Precipitation Index (SPI) at meteorological stations located in Alanya, Antalya, Demre, Elmalı, Finike, Gazipaşa, Korkuteli and Manavgat, which have long-term data (1974-2014). For the 3-, 6-, 12- and 24-month time scales, the SPI values show no decreasing trend and were found to lie between 0.99 (normal) and ~-0.99 (drought close to normal). It is concluded that drought can occur in summer as well as in winter.
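
    A hedged sketch of how SPI values are commonly computed (the usual McKee-style recipe: fit a gamma distribution to accumulated precipitation, then map each cumulative probability onto the standard normal); this is not necessarily the exact procedure of the paper, and zero-precipitation handling and per-calendar-month fitting are omitted.

```python
import numpy as np
from scipy import stats

def spi(precip_totals):
    """SPI for a series of accumulated precipitation totals (e.g. 3-month).
    Fits a gamma distribution (location fixed at zero) and transforms each
    value's cumulative probability to a standard normal deviate."""
    x = np.asarray(precip_totals, dtype=float)
    shape, loc, scale = stats.gamma.fit(x, floc=0)
    cdf = stats.gamma.cdf(x, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(1)
totals = rng.gamma(shape=2.0, scale=60.0, size=40)  # simulated totals, mm
print(np.round(spi(totals), 2))  # ~0 is normal; values <= -1 indicate drought
```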

  8. Standard guide for corrosion-related failure analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 This guide covers key issues to be considered when examining metallic failures when corrosion is suspected as either a major or minor causative factor. 1.2 Corrosion-related failures could include one or more of the following: change in surface appearance (for example, tarnish, rust, color change), pin hole leak, catastrophic structural failure (for example, collapse, explosive rupture, implosive rupture, cracking), weld failure, loss of electrical continuity, and loss of functionality (for example, seizure, galling, spalling, swelling). 1.3 Issues covered include overall failure site conditions, operating conditions at the time of failure, history of equipment and its operation, corrosion product sampling, environmental sampling, metallurgical and electrochemical factors, morphology (mode) or failure, and by considering the preceding, deducing the cause(s) of corrosion failure. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibili...

  9. Production of uranium standard samples for spectrographic analysis

    International Nuclear Information System (INIS)

    Neuilly, M.; Leclerc, J.C.

    1969-01-01

    This report describes the conditions of preparation of twelve castings of uranium intended for use as reference samples in spectrographic analysis. Results are given of impurity determinations carried out by several laboratories using different methods, together with the 'probable values' of the concentrations. Samples of these different castings are now available and can be sent to any laboratory which requires them. (authors) [fr]

  10. Proposed minimum reporting standards for data analysis in metabolomics

    NARCIS (Netherlands)

    Goodacre, R.; Broadhurst, D.; Smilde, A.K.; Kristal, B.S.; Baker, J.D.; Beger, R.; Bessant, C.; Connor, S.; Capuani, G.; Craig, A.; Ebbels, T.; Kell, D.B.; Manetti, C.; Newton, J.; Paternostro, G.; Somorjai, R.; Sjöström, M.; Trygg, J.; Wulfert, F.

    2007-01-01

    The goal of this group is to define the reporting requirements associated with the statistical analysis (including univariate, multivariate, informatics, machine learning etc.) of metabolite data with respect to other measured/collected experimental data (often called meta-data). These definitions

  11. Integrated Data Analysis (IDCA) Program - PETN Class 4 Standard

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-08-01

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of PETN Class 4. The PETN was found to have: 1) an impact sensitivity (DH50) range of 6 to 12 cm, 2) a BAM friction sensitivity (F50) range of 7 to 11 kg and a TIL (0/10) of 3.7 to 7.2 kg, 3) an ABL friction sensitivity threshold of 5 or less psig at 8 fps, 4) an ABL ESD sensitivity threshold of 0.031 to 0.326 J/g, and 5) a thermal sensitivity showing an endothermic feature with Tmin = ~141 °C and an exothermic feature with Tmax = ~205 °C.

  12. C. F. Braun. Standard turbine island design, safety analysis report

    International Nuclear Information System (INIS)

    1974-01-01

    A standard turbine island used with a BWR is described. It consists of the turbine-generator; steam system; condensate storage, cleanup, and transfer systems; control and instrumentation; water treatment plant; make-up demineralizer; potable and waste water systems; and a compressed air system. The turbine-generator is a tandem-compound nuclear-type turbine with one double-flow high-pressure section and a six-flow low-pressure section in three double-flow low-pressure casings. The turbine is direct connected to an 1800 rpm synchronous a-c generator. A combined moisture separator and two-stage reheater is provided. The main steam system delivers the steam generated in a BWR to the main turbine stop valves. The condensate system maintains proper water inventory. Protective features prevent loss of the system due to electrical failure of a component and isolates faults to ensure continuity of a power supply from alternate sources. (U.S.)

  13. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes when assessing the potential variability that arises when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean test values of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment, as well as to the use of different instruments of varying age. The results appear to be a good representation of the broader safety testing community, based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
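
    As a rough illustration of the variation statistic quoted above — assuming it is computed as the spread of inter-laboratory results expressed as a percentage of their mean, which is one plausible reading of the report's wording — a minimal sketch (the drop-height values are invented, not IDCA data):

```python
# One plausible reading of "variation about the mean": the spread of
# inter-laboratory results as a percentage of their mean.
def relative_range_pct(values):
    """Spread (max - min) of `values` as a percentage of their mean."""
    mean = sum(values) / len(values)
    return 100.0 * (max(values) - min(values)) / mean

# Hypothetical impact drop heights (cm) reported by four labs, not IDCA data.
dh50_cm = [6.2, 7.0, 7.9, 8.4]
print(f"variation about the mean: {relative_range_pct(dh50_cm):.0f}%")
```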

  14. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Hart, Reid; Athalye, Rahul A.; Rosenberg, Michael I.; Richman, Eric E.; Winiarski, David W.

    2014-03-01

    Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. When the U.S. Department of Energy (DOE) issues an affirmative determination on Standard 90.1, states are statutorily required to certify within two years that they have reviewed and updated the commercial provisions of their building energy code, with respect to energy efficiency, to meet or exceed the revised standard. This report provides a preliminary qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition).

  15. Smart candle soot coated membranes for on-demand immiscible oil/water mixture and emulsion switchable separation.

    Science.gov (United States)

    Li, Jian; Zhao, Zhihong; Li, Dianming; Tian, Haifeng; Zha, Fei; Feng, Hua; Guo, Lin

    2017-09-21

    Oil/water separation is of great importance for the treatment of oily wastewater, including immiscible light/heavy oil-water mixtures and oil-in-water or water-in-oil emulsions. Smart surfaces with responsive wettability have received extensive attention, especially for controllable oil/water separation. However, traditional smart membranes with a wettability switchable between superhydrophobicity and superhydrophilicity are limited to certain responsive materials and continuous external stimuli, such as pH, electrical field, or light irradiation. Herein, a candle soot coated mesh (CSM) with a larger pore size and a candle soot coated PVDF membrane (CSP) with a smaller pore size, both exhibiting underwater superoleophobicity and underoil superhydrophobicity, were successfully fabricated; these can be used for on-demand separation of immiscible oil/water mixtures and surfactant-stabilized oil/water emulsions, respectively. Without any continuous external stimulus, the wettability of the membranes could be reversibly switched between underwater superoleophobicity and underoil superhydrophobicity simply by drying and washing alternately, thus achieving effective and switchable oil/water separation with excellent separation efficiency. We believe that such smart materials will be promising candidates for the removal of oil pollutants in the future.

  16. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Hart, Philip R.; Richman, Eric E.; Athalye, Rahul A.; Winiarski, David W.

    2014-09-04

    This report provides a final qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition). All addenda processed in creating Standard 90.1-2013 were evaluated for their projected impact on energy efficiency. Each addendum was characterized as having a positive, neutral, or negative impact on overall building energy efficiency.

  17. ANSI/ASHRAE/IESNA Standard 90.1-2010 Preliminary Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Rosenberg, Michael I.

    2010-11-01

    The United States (U.S.) Department of Energy (DOE) conducted a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The preliminary analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s preliminary determination. However, out of the 109 addenda, 34 were preliminarily determined to have measurable and quantifiable impact.

  18. Interim versus standard methadone treatment: a benefit-cost analysis.

    Science.gov (United States)

    Schwartz, Robert P; Alexandre, Pierre K; Kelly, Sharon M; O'Grady, Kevin E; Gryczynski, Jan; Jaffe, Jerome H

    2014-03-01

    A benefit-cost analysis was conducted as part of a clinical trial in which newly admitted methadone patients were randomly assigned to interim methadone (IM; methadone without counseling) for the first 4 months of 12 months of methadone treatment, or to 12 months of methadone with one of two counseling conditions. Health, residential drug treatment, and criminal justice costs, income data (in 2010 dollars), and program costs were obtained at treatment entry and at 4- and 12-month follow-up from 200 participants. The net benefits of treatment were greater for the IM condition but, controlling for the baseline variables noted above, the difference between conditions in net monetary benefits was not significant. For the combined sample, there was a pre- to post-treatment net benefit of $1470 (95% CI: -$625, $3584) and a benefit-cost ratio of 1.5 (95% CI: 0.8, 2.3), but under our conservative approach to calculating benefits, these values were not significant.
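
    For readers unfamiliar with how such interval estimates are typically obtained, the sketch below computes a mean net benefit and a benefit-cost ratio with bootstrap confidence intervals. It is a generic illustration on simulated numbers, not the trial's actual method or data:

```python
# Sketch: net benefit and benefit-cost ratio with bootstrap 95% CIs.
# Per-person benefits and costs are simulated stand-ins, not trial data.
import random

random.seed(1)
n = 200
benefits = [random.gauss(4500, 3000) for _ in range(n)]  # hypothetical benefits ($)
costs = [random.gauss(3000, 1500) for _ in range(n)]     # hypothetical costs ($)
pairs = list(zip(benefits, costs))

net_boot, ratio_boot = [], []
for _ in range(2000):
    sample = [random.choice(pairs) for _ in pairs]  # resample participants
    b = sum(x for x, _ in sample)
    c = sum(y for _, y in sample)
    net_boot.append((b - c) / n)   # mean net benefit per participant
    ratio_boot.append(b / c)       # benefit-cost ratio

for name, boot in (("net benefit", net_boot), ("benefit-cost ratio", ratio_boot)):
    boot.sort()
    # indices 50 and 1949 of 2000 sorted replicates ~ 2.5th and 97.5th percentiles
    print(f"{name}: {sum(boot) / len(boot):.2f} "
          f"(95% CI: {boot[50]:.2f}, {boot[1949]:.2f})")
```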

  19. The preparation of synthetic standards for use in instrumental neutron-activation analysis

    International Nuclear Information System (INIS)

    Eddy, B.T.; Watterson, J.I.W.; Erasmus, C.S.

    1979-01-01

    An account is given of the formulation and preparation of synthetic standards suitable for the routine analysis of minerals, ores, and ore concentrates by instrumental neutron activation. Fifteen standards were prepared, each containing from one to seven elements. The standards contain forty-four elements that produce isotopes with half-lives longer than 12 hours. An evaluation of the accuracy and precision of the method of preparation is given.

  20. Canonical integration and analysis of periodic maps using non-standard analysis and Lie methods

    Energy Technology Data Exchange (ETDEWEB)

    Forest, E.; Berz, M.

    1988-06-01

    We describe a method and a way of thinking which is ideally suited for the study of systems represented by canonical integrators. Starting with the continuous description provided by the Hamiltonians, we replace it by a succession of preferably canonical maps. The power series representation of these maps can be extracted with a computer implementation of the tools of Non-Standard Analysis and analyzed by the same tools. For a nearly integrable system, we can define a Floquet ring in a way consistent with our needs. Using the finite time maps, the Floquet ring is defined only at the locations sᵢ where one perturbs or observes the phase space. At most, the total number of locations is equal to the total number of steps of our integrator. We can also produce pseudo-Hamiltonians which describe the motion induced by these maps. 15 refs., 1 fig.

  1. Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards

    Science.gov (United States)

    Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.

    2015-01-01

    With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…

  2. Application of new standardization method in activation analysis with registration of soft gamma radiation

    International Nuclear Information System (INIS)

    Vo Dac Bang; Phan Thu Huong.

    1983-01-01

    An application of the new standardization method for rapid activation mass analysis with registration of strongly absorbed low-energy gamma radiation is described. The method makes it possible to avoid the time-consuming and laborious internal-standard method.

  3. Analysis of Evidence Supporting the Educational Leadership Constituent Council 2011 Educational Leadership Program Standards

    Science.gov (United States)

    Tucker, Pamela D.; Anderson, Erin; Reynolds, Amy L.; Mawhinney, Hanne

    2016-01-01

    This document analysis provides a summary of the research from high-impact journals published between 2008 and 2013 with the explicit purpose of determining the extent to which the current empirical evidence supports the individual 2011 Educational Leadership Constituent Council Program Standards and their elements. We found that the standards are…

  4. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  5. The Nature of Science and the "Next Generation Science Standards": Analysis and Critique

    Science.gov (United States)

    McComas, William F.; Nouri, Noushin

    2016-01-01

    This paper provides a detailed analysis of the inclusion of aspects of nature of science (NOS) in the "Next Generation Science Standards" (NGSS). In this new standards document, NOS elements in eight categories are discussed in Appendix H along with illustrative statements (called exemplars). Many, but not all, of these exemplars are…

  6. Cartographic standards to improve maps produced by the Forest Inventory and Analysis program

    Science.gov (United States)

    Charles H. (Hobie) Perry; Mark D. Nelson

    2009-01-01

    The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...

  7. Design and analysis of control charts for standard deviation with estimated parameters

    NARCIS (Netherlands)

    Schoonhoven, M.; Riaz, M.; Does, R.J.M.M.

    2011-01-01

    This paper concerns the design and analysis of the standard deviation control chart with estimated limits. We consider an extensive range of statistics to estimate the in-control standard deviation (Phase I) and design the control chart for real-time process monitoring (Phase II) by determining the…
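
    The abstract does not give the chart formulas, but the following is a minimal sketch of a textbook Shewhart S chart with limits estimated from Phase I data. The c4-based limits are the standard ones, not necessarily the designs studied in the paper, and the data are simulated:

```python
# Sketch: S chart with Phase I estimated limits and a Phase II check.
import math
import random

def c4(n):
    """Unbiasing constant for the sample standard deviation, subgroup size n."""
    return math.sqrt(2.0 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

def stdev(xs):
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

random.seed(0)
n, m = 5, 25  # subgroup size, number of Phase I subgroups
phase1 = [[random.gauss(10, 1) for _ in range(n)] for _ in range(m)]

s_bar = sum(stdev(g) for g in phase1) / m  # Phase I center line
width = 3 * (s_bar / c4(n)) * math.sqrt(1 - c4(n) ** 2)
ucl, lcl = s_bar + width, max(0.0, s_bar - width)
print(f"S chart: LCL={lcl:.3f}, center={s_bar:.3f}, UCL={ucl:.3f}")

# Phase II: flag a new subgroup whose sample standard deviation leaves the limits.
new_group = [random.gauss(10, 1.8) for _ in range(n)]
print("signal" if not lcl <= stdev(new_group) <= ucl else "in control")
```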

  8. Performance Analysis of a Utility Helicopter with Standard and Advanced Rotors

    National Research Council Canada - National Science Library

    Yeo, Hyeonsoo; Bousman, William G; Johnson, Wayne

    2002-01-01

    Flight test measurements of the performance of the UH-60 Black Hawk helicopter with both standard and advanced rotors are compared with calculations obtained using the comprehensive helicopter analysis CAMRAD II...

  9. Intelligent future wireless networks for energy efficiency: overall analysis and standardization activities

    CSIR Research Space (South Africa)

    Kliks, A

    2013-10-01

    This chapter addresses a number of issues related to standardization and regulatory policies aimed at promoting energy-efficient communications and networking, highlighting the need for a synergic approach. It encompasses the analysis of various...

  10. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    International Nuclear Information System (INIS)

    Tappin, Marcelo R.R.; Pereira, Jislaine F.G.; Lima, Lucilene A.; Siani, Antonio C.; Mazzei, Jose L.; Ramos, Monica F.S.

    2004-01-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar for methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method proved suitable for the classification and quality control of commercial samples of the oils. (author)
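
    A minimal sketch of external-standard quantification of the kind described — fit a linear calibration curve to standard injections, then invert it for a sample — with invented concentrations and peak areas (trans-caryophyllene is assumed as the standard):

```python
# Sketch: external-standard calibration and quantification for GC-FID.
import numpy as np

conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])        # standard conc. (mg/mL)
area = np.array([1.1e4, 2.3e4, 4.5e4, 9.2e4, 1.83e5])  # detector peak areas

slope, intercept = np.polyfit(conc, area, 1)  # linear calibration fit
r = np.corrcoef(conc, area)[0, 1]             # correlation coefficient
print(f"calibration: area = {slope:.3g}*c + {intercept:.3g}, r = {r:.4f}")

sample_area = 6.8e4  # hypothetical sample peak
print(f"sample concentration: {(sample_area - intercept) / slope:.3f} mg/mL")
```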

  11. Draft Regulatory Analysis. Technical support document No. 1: energy efficiency standards for consumer products

    Energy Technology Data Exchange (ETDEWEB)

    None

    1980-06-01

    A Draft Regulatory Analysis is presented that describes the analyses performed by DOE to arrive at proposed energy efficiency standards for refrigerators and refrigerator-freezers, freezers, clothes dryers, water heaters, room air conditioners, kitchen ranges and ovens, central air conditioners (cooling only), and furnaces. Standards for dishwashers, television sets, clothes washers, and humidifiers and dehumidifiers are required to be published in the Federal Register no later than December 1981. Standards for central air conditioners (heat pumps) and home heating equipment are to be published in the Federal Register no later than January 1982. Accordingly, these products are not discussed in this Draft Regulatory Analysis.

  12. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Athalye, Rahul A.; Rosenberg, Michael I.; Xie, YuLong; Wang, Weimin; Hart, Philip R.; Zhang, Jian; Goel, Supriya; Mendon, Vrushali V.

    2014-09-04

    This report provides a final quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in improved energy efficiency in commercial buildings. The final analysis considered each of the 110 addenda to Standard 90.1-2010 that were included in Standard 90.1-2013. PNNL reviewed all addenda included by ASHRAE in creating Standard 90.1-2013 from Standard 90.1-2010, and considered their combined impact on a suite of prototype building models across all U.S. climate zones. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s final determination. However, out of the 110 total addenda, 30 were identified as having a measurable and quantifiable impact.

  13. Analysis of ultrafiltration failure in peritoneal dialysis patients by means of standard peritoneal permeability analysis.

    Science.gov (United States)

    Ho-dac-Pannekeet, M M; Atasever, B; Struijk, D G; Krediet, R T

    1997-01-01

    Ultrafiltration failure (UFF) is a complication of peritoneal dialysis (PD) treatment that occurs especially in long-term patients. Etiological factors include a large effective peritoneal surface area [measured as a high mass transfer area coefficient (MTAC) of creatinine], a high effective lymphatic absorption rate (ELAR), a large residual volume, or combinations of these. The prevalence and etiology of UFF were studied and the contribution of transcellular water transport (TCWT) was analyzed. A new definition of UFF and guidelines for the analysis of its etiology were derived from the results. Setting: the peritoneal dialysis unit of the Academic Medical Center in Amsterdam. Design: cross-sectional study of standard peritoneal permeability analyses (4-hr dwells, dextran 70 as volume marker) with 1.36% glucose in 68 PD patients. Patients with negative net UF (decline in intraperitoneal volume, dIPV) had a lower residual volume (p = 0.03) and a lower transcapillary ultrafiltration rate (TCUFR, p = 0.01). Ultrafiltration failure was associated with a high MTAC creatinine in 3 patients, a high ELAR in 4 patients, and a combination of factors in one. As an additional possible cause, TCWT was studied using the sodium gradient in the first hour of the dwell, corrected for diffusion (dNA). Five patients had dNA > 5 mmol/L, indicating normal TCWT. The 3 patients with dNA below this level had a lower TCUFR (p = 0.04) and a smaller difference between dIPV with 3.86% and with 1.36% glucose (p = 0.04) compared with the dNA > 5 mmol/L group, but no differences were present for MTAC creatinine, ELAR, residual volume, or glucose absorption. In addition to the known factors, impairment of TCWT can be a cause of UFF. A standardized dwell with 1.36% glucose overestimates UFF; therefore, 3.86% glucose should be used for the identification of patients with UFF, especially because it provides additional information on TCWT. Ultrafiltration failure can then be defined as net UF below a fixed threshold in a 4-hour, 3.86% glucose exchange.

  14. Analysis of Standards Efficiency in Digital Television Via Satellite at Ku and Ka Bands

    Directory of Open Access Journals (Sweden)

    Landeros-Ayala Salvador

    2013-06-01

    In this paper, an analysis of the main technical features of digital television standards for satellite transmission is carried out. Based on simulations and link-budget analysis, the standard with the best operational performance is identified, and a comparative efficiency analysis is conducted for the Ku and Ka bands, for both transparent and regenerative transponders, in terms of power, bandwidth, information rate, and link margin, under clear sky, uplink rain, downlink rain, and rain on both links.
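
    For orientation, here is a toy downlink budget of the kind such comparisons rest on. Every number is a placeholder, not a value from the paper:

```python
# Sketch: a minimal satellite downlink budget (all inputs are placeholders).
import math

def fspl_db(freq_ghz, dist_km):
    """Free-space path loss in dB."""
    return 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(dist_km)

eirp_dbw = 52.0    # satellite EIRP
gt_dbk = 18.0      # earth-station G/T
rain_db = 6.0      # rain attenuation (Ka-band-like example)
k_dbw = -228.6     # Boltzmann's constant in dBW/(K*Hz)
rate_dbhz = 10 * math.log10(30e6)  # 30 Mbit/s information rate

cn0 = eirp_dbw - fspl_db(20.0, 38_000) - rain_db + gt_dbk - k_dbw
ebn0 = cn0 - rate_dbhz
margin = ebn0 - 5.0  # assumed required Eb/N0 for the modulation/coding
print(f"C/N0 = {cn0:.1f} dBHz, Eb/N0 = {ebn0:.1f} dB, link margin = {margin:.1f} dB")
```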

  15. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    The objective conditions of Ukraine’s integration into the global business environment require strengthening of accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements under generally accepted basic principles, which are based on common International Financial Reporting Standards (IFRS). An assessment of the convergence of national standards with International Financial Reporting Standards is therefore relevant. However, before conducting such a content analysis it is necessary to define methodological approaches to the selection of key indicators for assessing convergence. The aim of the article is to define methodological approaches to the selection and development of a list of key IFRS elements for the subsequent evaluation of the convergence of national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were selected. Sampling was based on the professional judgment of the author and on the key indicators of each standard, assessed by the usefulness of the accounting information. These indicators make it possible to calculate the specific level of convergence of international and national standards and to determine the extent to which statements prepared under domestic standards correspond to IFRS. In other words, can one assert with some certainty that Ukraine has achieved “good practices in IFRS implementation” or not? This calculation will allow an assessment of the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards to IFRS.

  16. Data Analysis and Statistics in Middle Grades: An Analysis of Content Standards

    Science.gov (United States)

    Sorto, M. Alejandra

    2011-01-01

    The purpose of the study reported herein was to identify the important aspects of statistical knowledge that students in the middle school grades in the United States are expected to learn, as well as what the teachers are expected to teach. A systematic study of 49 states' standards and one set of national standards was used to identify these important…

  17. ANSI/ASHRAE/IESNA Standard 90.1-2007 Final Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Richman, Eric E.; Winiarski, David W.

    2011-05-01

    The United States (U.S.) Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2007 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2004. The final analysis considered each of the 44 addenda to ANSI/ASHRAE/IESNA Standard 90.1-2004 that were included in ANSI/ASHRAE/IESNA Standard 90.1-2007. All 44 addenda processed by ASHRAE in the creation of Standard 90.1-2007 from Standard 90.1-2004 were reviewed by DOE, and their combined impact on a suite of 15 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s final determination. However, out of the 44 addenda, 9 were determined to have measurable and quantifiable impact.

  18. Validation Test For The Result Of Neutron Activation Analysis With Standard Reference Material

    International Nuclear Information System (INIS)

    Rina M, Th; Wardani, Sri

    2001-01-01

    Validity testing with standard reference materials is necessary to ensure the quality of sample analysis results. The analysis of CRM No. 8 (Vehicle Exhaust Particulates) and SRM 1646a (Estuary Sediment) has been done in P2TRR. These analyses are intended to validate the analysis results produced by the NAA laboratory of P2TRR. For CRM No. 8, a subset of the 30 certified elements was successfully analyzed; for SRM 1646a, 21 of the 39 certified elements were completely investigated. The quantitative analysis showed relative differences of 2%-15% compared to the certificate values.

  19. Joint Oil Analysis Program Spectrometer Standards SCP Science (Conostan) Qualification Report for D19-0, D3-100, and D12-XXX Series Standards

    Science.gov (United States)

    2015-05-20

    The Joint Oil Analysis Program previously revised the specification for their spectrometric oil standards in order to incorporate an ICP-AES test method and to transition to commercially manufactured spectrometric oil standards. Historically, a Rotrode-AES test method was the only elemental test method used to verify the quality of the spectrometric oil standards. The Rotrode-AES test method was a labor- and time-intensive process that did…

  20. Operational implications of using 2006 World Health Organization growth standards in nutrition programmes: secondary data analysis.

    Science.gov (United States)

    Seal, Andrew; Kerac, Marko

    2007-04-07

    To assess the implications of adopting the World Health Organization 2006 growth standards in combination with current diagnostic criteria in emergency and non-emergency child feeding programmes. Secondary analysis of data from three standardised nutrition surveys (n=2555) for prevalence of acute malnutrition, using weight-for-height z scores based on the new WHO 2006 growth standards (WHO standards) and on NCHS reference data (NCHS reference). Setting: refugee camps in Algeria, Kenya, and Bangladesh. Population: children aged 6-59 months. Important differences exist in the weight-for-height cut-offs used for defining acute malnutrition obtained from the WHO standards and the NCHS reference data. These vary according to a child's height and according to whether z score or percentage-of-the-median cut-offs are used. If applied and used according to current practice in nutrition programmes, the WHO standards will result in a higher measured prevalence of severe acute malnutrition during surveys but, paradoxically, a decrease in the admission of children to emergency feeding programmes and earlier discharge of recovering patients. The expected impact on case fatality rates of applying the new standards in conjunction with current diagnostic criteria is unknown. A full assessment of the appropriate use of the new WHO standards in the diagnosis of acute malnutrition is urgently needed. This should be completed before the standards are adopted by organisations that run nutrition programmes targeting acute malnutrition.
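
    The WHO standards tabulate LMS parameters from which weight-for-height z scores are computed; the sketch below shows the standard LMS transformation with illustrative parameter values (not actual WHO table entries):

```python
# Sketch: weight-for-height z score via the LMS method.
import math

def lms_zscore(x, L, M, S):
    """z score of measurement x given LMS parameters (WHO-style)."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# Hypothetical child: weight 7.2 kg at a height whose reference row is
# L = -0.35, M = 9.0 kg, S = 0.082 (illustrative values only).
z = lms_zscore(7.2, -0.35, 9.0, 0.082)
print(f"WHZ = {z:.2f} ->", "acute malnutrition (z < -2)" if z < -2 else "not wasted")
```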

  1. Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles

    Science.gov (United States)

    Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey

    2013-09-01

    Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self- consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the

  2. Sleep disordered breathing analysis in a general population using standard pulse oximeter signals.

    Science.gov (United States)

    Barak-Shinar, Deganit; Amos, Yariv; Bogan, Richard K

    2013-09-01

    Obstructive sleep apnea, reported as the apnea-hypopnea index (AHI), is usually measured in sleep laboratories using a high number of electrodes connected to the patient's body. In this study, we examined the use of a standard pulse oximeter system with an automated analysis based on the photoplethysmograph (PPG) signal for the diagnosis of sleep disordered breathing. Using a standard and simple device with high accuracy might provide a convenient diagnostic or screening solution for patient evaluation at home or in other out-of-center testing environments. The study included 140 consecutive patients who were referred routinely to a sleep laboratory [SleepMed Inc.] for the diagnosis of sleep disordered breathing. Each patient underwent an overnight polysomnography (PSG) study according to AASM guidelines in an AASM-accredited sleep laboratory. The automatic analysis is based on photoplethysmographic and saturation signals only. These two signals were recorded for the entire night as part of the full overnight PSG sleep study. The AHI calculated from the PPG analysis was compared to the AHI calculated from the manually scored, gold standard full PSG. The AHI and total respiratory events measured by the pulse oximeter analysis correlated very well with the corresponding results obtained by the gold standard full PSG. The sensitivity and specificity at the AHI ≥ 5 and AHI ≥ 15 levels measured by the analysis are both above 90%. The sensitivity and positive predictive value for the detection of respiratory events are both above 84%. The system tested in this study yielded an acceptable assessment of sleep disordered breathing compared to the gold standard PSG in patients with moderate to severe sleep apnea. Accordingly, and given the convenience and simplicity of the standard pulse oximeter device, the new system can be considered suitable for home and ambulatory diagnosis or screening of sleep disordered breathing patients.
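
    A minimal sketch of how such cutoff-based agreement statistics are computed from paired AHI values (the paired values here are simulated, not the study data):

```python
# Sketch: sensitivity/specificity of oximeter-derived AHI vs. PSG AHI
# at the >= 5 and >= 15 cutoffs, on simulated paired values.
import random

random.seed(3)
pairs = []
for _ in range(140):
    psg = random.expovariate(1 / 12)            # "true" PSG AHI
    oxi = max(0.0, psg + random.gauss(0, 2.5))  # oximeter estimate
    pairs.append((psg, oxi))

for cutoff in (5, 15):
    tp = sum(1 for p, o in pairs if p >= cutoff and o >= cutoff)
    fn = sum(1 for p, o in pairs if p >= cutoff and o < cutoff)
    tn = sum(1 for p, o in pairs if p < cutoff and o < cutoff)
    fp = sum(1 for p, o in pairs if p < cutoff and o >= cutoff)
    print(f"AHI >= {cutoff}: sensitivity {tp / (tp + fn):.2f}, "
          f"specificity {tn / (tn + fp):.2f}")
```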

  3. Preparation of uranium standard solutions for x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Wong, C.M.; Cate, J.L.; Pickles, W.L.

    1978-03-01

    A method has been developed for gravimetrically preparing uranium nitrate standards with an estimated mean error of 0.1% (1 sigma) and a maximum error of 0.2% (1 sigma) for the total uranium weight. Two source materials, depleted uranium dioxide powder and NBS Standard Reference Material 960 uranium metal, were used to prepare stock solutions. The NBS metal proved to be superior because of the small but inherent uncertainty in the stoichiometry of the uranium oxide. These solutions were used to prepare standards in a freeze-dried configuration suitable for x-ray fluorescence analysis. Both gravimetric and freeze-drying techniques are presented. Volumetric preparation was found to be unsatisfactory for 0.1% precision for the sample size of interest. One of the primary considerations in preparing uranium standards for x-ray fluorescence analysis is the development of a technique for dispensing a 50-μl aliquot of a standard solution with a precision of 0.1% and an accuracy of 0.1%. The method developed corrects for variation in aliquoting and for evaporation loss during weighing. Two sets, each containing 50 standards, have been produced. One set has been retained by LLL and one by the Savannah River project.

  4. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1994-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards that are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is no suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards containing about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters, including the activity produced, the necessary flux/bombardment time, the counting time, and the appropriate source-to-detector distance. The result is multi-element standards for CNAA with optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed that is a low-cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large-scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and of the innovative sample handling techniques.
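
    The core calculation such a spreadsheet performs reduces to the standard activation equation A = Nσφ(1 − e^(−λt)). A sketch with gold as the example target — the nuclear data for 197Au/198Au are standard handbook values, while the mass, flux, and irradiation time are arbitrary:

```python
# Sketch: induced activity at end of irradiation for a simple 1-step activation.
import math

N_A = 6.022e23  # Avogadro's number

def induced_activity_bq(mass_g, molar_mass, abundance, sigma_barn,
                        half_life_s, flux, t_irr_s):
    """A = N * sigma * phi * (1 - exp(-lambda * t_irr)), in Bq."""
    n_atoms = mass_g / molar_mass * N_A * abundance
    lam = math.log(2) / half_life_s
    rate = n_atoms * sigma_barn * 1e-24 * flux   # capture rate (1/s)
    return rate * (1 - math.exp(-lam * t_irr_s)) # times saturation factor

# 10 micrograms of gold (197Au: sigma ~ 98.7 b; 198Au: t1/2 ~ 2.695 d)
# irradiated in a 1e13 n/cm^2/s flux for 8 hours:
a = induced_activity_bq(10e-6, 196.97, 1.0, 98.7, 2.695 * 86400, 1e13, 8 * 3600)
print(f"198Au activity at end of irradiation: {a:.3e} Bq")
```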

  5. The k0-based neutron activation analysis: a mono standard to standardless approach of NAA

    International Nuclear Information System (INIS)

    Acharya, R.; Nair, A.G.C.; Sudarshan, K.; Goswami, A.; Reddy, A.V.R.

    2006-01-01

    The k0-based neutron activation analysis (k0-NAA) uses neutron flux parameters, detection efficiency, and nuclear constants, namely k0 and Q0, for the determination of the concentration of elements. Gold (197Au) or any other element having suitable nuclear properties is used as the external or internal single comparator. This article describes the principle of k0-NAA and the standardization of the method by characterization of reactor irradiation sites and calibration of the detector efficiency, along with applications. The method was validated using CRMs obtained from USGS, IAEA, and NIST. Applications of the method include samples such as gemstones (ruby, beryl, and emerald), sediments, manganese nodules and encrustations, cereals, and medicinal and edible leaves. Recently, a k0-based internal mono-standard INAA (IM-NAA) method using in-situ relative efficiency has been standardized by us for the analysis of small and large samples of different shapes and sizes. The method was applied to a new meteorite sample and to large wheat samples. Samples of non-standard size and shape of nuclear cladding materials, namely Zircaloy-2 and -4, stainless steels (SS 316M and D9), and 1S aluminium, were analysed. Standardless analysis of these cladding materials was possible by a mass balance approach, since all the major and minor elements were amenable to NAA. (author)
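
    A sketch of the k0 concentration relation in its usual Hogdahl-convention form; the function name is ours and all numeric inputs are placeholders, not values from the article:

```python
# Sketch of the k0-NAA concentration relation:
# rho_a = rho_m * (Asp_a/Asp_m) * (1/k0) * (f + Q0_m)/(f + Q0_a) * (eps_m/eps_a)
def k0_concentration(rho_m, asp_a, asp_m, k0, f, q0_a, q0_m, eps_a, eps_m):
    """Concentration of analyte a relative to comparator m (e.g., Au)."""
    return rho_m * (asp_a / asp_m) / k0 * (f + q0_m) / (f + q0_a) * (eps_m / eps_a)

rho = k0_concentration(
    rho_m=1.0,                  # comparator concentration (ug/g), placeholder
    asp_a=5200.0, asp_m=880.0,  # decay-corrected specific count rates
    k0=0.85,                    # k0 factor of the analyte gamma line
    f=32.0,                     # thermal-to-epithermal flux ratio
    q0_a=14.0, q0_m=15.7,       # Q0 values (15.7 is the standard Au value)
    eps_a=0.021, eps_m=0.024,   # full-energy peak efficiencies
)
print(f"analyte concentration: {rho:.2f} ug/g")
```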

  6. ASSESSING BUSINESS TRANSACTION STANDARDS AND THEIR ADOPTION A cross case analysis between the SETU and Vektis standards

    NARCIS (Netherlands)

    Berends, Wouter; Folmer, Erwin Johan Albert

    2010-01-01

    Nowadays businesses increasingly want to be interoperable so that they can collaborate with other organizations. Interoperability can be achieved through the use of business transaction standards, by which the organizations that use the standards collectively form a value added network. However the…

  7. Thermal safety analysis of a dry storage cask for the Korean standard spent fuel - 16159

    International Nuclear Information System (INIS)

    Cha, Jeonghun; Kim, S.N.; Choi, K.W.

    2009-01-01

    A conceptual dry storage facility, based on a commercial dry storage facility, was designed for Korean standard spent nuclear fuel (SNF), and a preliminary thermal safety analysis was performed in this study. To perform the preliminary thermal analysis, a thermal analysis method consisting of two parts was proposed. Using this method, the surface temperature of the storage canister corresponding to the SNF cladding temperature was calculated, and an adequate air duct area was determined from the result. The initial temperature of the facility was calculated, and the fire condition and half blockage of the air duct were analyzed. (authors)

  8. A comparative analysis of quality management standards for contract research organisations in clinical trials.

    Science.gov (United States)

    Murray, Elizabeth; McAdam, Rodney

    2007-01-01

    This article compares and contrasts the main quality standards in the highly regulated pharmaceutical industry with specific focus on Good Clinical Practice (GCP), the standard for designing, conducting, recording and reporting clinical trials involving human participants. Comparison is made to ISO quality standards, which can be applied to all industries and types of organisation. The study is then narrowed to that of contract research organisations (CROs) involved in the conduct of clinical trials. The paper concludes that the ISO 9000 series of quality standards can act as a company-wide framework for quality management within such organisations by helping to direct quality efforts on a long-term basis without any loss of compliance. This study is valuable because comparative analysis in this domain is uncommon.

  9. Summary of component reliability data for probabilistic safety analysis of Korean standard nuclear power plant

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.

    2004-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for the PSA of Korean nuclear power plants. We performed a study to develop a component reliability database and software for component reliability analysis. Based on this system, we collected component operation data and failure/repair data during plant operation, up to 1998 for YGN 3,4 and up to 2000 for UCN 3,4, respectively. Recently, we upgraded the database by collecting additional data through 2002 for Korean standard nuclear power plants and performed the component reliability analysis and Bayesian analysis again. In this paper, we present a summary of component reliability data for the probabilistic safety analysis of Korean standard nuclear power plants and describe the plant-specific characteristics compared to generic data.
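
    Bayesian specialization of generic reliability data is conventionally done with a gamma-Poisson (conjugate) update; the sketch below uses an invented prior and invented plant experience, not the Korean plant data:

```python
# Sketch: gamma-Poisson Bayesian update of a component failure rate.
alpha0, beta0 = 0.5, 1.0e5   # generic prior: mean rate alpha0/beta0 = 5e-6 /h
failures, hours = 2, 4.0e5   # hypothetical plant-specific operating experience

# Conjugate update: posterior is Gamma(alpha0 + failures, beta0 + hours).
alpha, beta = alpha0 + failures, beta0 + hours
print(f"posterior mean failure rate: {alpha / beta:.2e} /h")
print(f"posterior: Gamma(alpha={alpha}, beta={beta:.1e} h)")
```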

  10. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, tested for a third and fourth time in the Proficiency Test and averaged with the analysis results from the first and second rounds. The results from averaging all four data sets (1, 2, 3, and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity, compared to the results from Set 1, which was used previously as the values for the RDX standard in IDCA analysis reports.

  11. When Is Hub Gene Selection Better than Standard Meta-Analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S.; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as good as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to…
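
    To make the two ranking strategies concrete, here is a self-contained sketch on simulated multi-set data — a schematic stand-in for WGCNA, not the authors' pipeline — comparing a Fisher-z meta-analysis ranking with a consensus kME ranking (correlation of each gene with a per-set module eigengene):

```python
# Sketch: meta-analysis Z vs. consensus kME gene ranking on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n_sets, n_samples, n_genes = 3, 50, 200
module = rng.normal(size=(n_sets, n_samples))  # latent module activity per set
expr = [0.6 * m[:, None] + rng.normal(size=(n_samples, n_genes))
        for m in module]                        # genes partly driven by the module
trait = [m + rng.normal(size=n_samples) for m in module]

def cor_with(x, y):
    """Correlation of each column of x with vector y."""
    xc = x - x.mean(0)
    yc = y - y.mean()
    return (xc * yc[:, None]).sum(0) / (
        np.sqrt((xc ** 2).sum(0)) * np.sqrt((yc ** 2).sum()))

# (1) meta-analysis: average Fisher-z of gene-trait correlations across sets.
z = np.mean([np.arctanh(cor_with(e, t)) for e, t in zip(expr, trait)], axis=0)

# (2) consensus kME: mean |correlation| with each set's first principal component.
def eigengene(e):
    u, s, vt = np.linalg.svd(e - e.mean(0), full_matrices=False)
    return u[:, 0]

kme = np.mean([np.abs(cor_with(e, eigengene(e))) for e in expr], axis=0)

print("top genes by meta-Z:        ", np.argsort(-np.abs(z))[:5])
print("top genes by consensus kME: ", np.argsort(-kme)[:5])
```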


  13. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometric method with internal standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric method were improved to decrease the random component of the relative error of the method. The influence of U and Np impurities and of corrosion products on the systematic component of the error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are studied.

  14. Naval Research Laboratory Industrial Chemical Analysis and Respiratory Filter Standards Development

    Science.gov (United States)

    2017-09-29

    NRL Memorandum Report NRL/MR/6360--17-9750, Naval Research Laboratory (Code 6362), Washington, DC 20375-5320, by Thomas E. Sutto: industrial chemical analysis and respiratory filter standards development, covering work from 2009 to 2016.

  15. An Analysis of the Impact of Federated Search Products on Library Instruction Using the ACRL Standards

    Science.gov (United States)

    Cox, Christopher

    2006-01-01

    Federated search products are becoming more and more prevalent in academic libraries. What are the implications of this phenomenon for instruction librarians? An analysis of federated search products using the "Information Literacy Competency Standards for Higher Education" and a thorough review of the literature offer insight concerning whether…

  16. Standard techniques for presentation and analysis of crater size-frequency data

    Science.gov (United States)

    1978-01-01

    In September 1977, a crater studies workshop was held for the purpose of developing standard data analysis and presentation techniques. This report contains the unanimous recommendations of the participants. This first meeting considered primarily crater size-frequency data. Future meetings will treat other aspects of crater studies such as morphologies.

  17. Standard model for safety analysis report of hexafluoride production plants from natural uranium

    International Nuclear Information System (INIS)

    1983-01-01

    The standard model for the safety analysis report of uranium hexafluoride production plants using natural uranium is presented, showing the form of presentation, the nature, and the degree of detail of the minimal information required by the Brazilian Nuclear Energy Commission (CNEN). (E.G.)

  18. A Standards-Based Content Analysis of Selected Biological Science Websites

    Science.gov (United States)

    Stewart, Joy E.

    2010-01-01

    The purpose of this study was to analyze the biology content, instructional strategies, and assessment methods of 100 biological science websites that were appropriate for Grade 12 educational purposes. For the analysis of each website, an instrument, developed from the National Science Education Standards (NSES) for Grade 12 Life Science coupled…

  19. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

    Objective: To develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, the hydroalcoholic extract was subjected to preliminary phytochemical analysis, antimicrobial activity testing against certain pathogenic microorganisms, solubility testing, loss on drying, and pH measurement. The extract was also subjected to quantitative analysis, including total phenol, flavonoid, and heavy metal determinations. Quantitative analysis was performed by HPTLC using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids, and glycosides in the extract. The total flavonoid and phenol contents of the Strychnos nux-vomica L. extract were found to be 0.40% and 0.43%, respectively. Results showed that the levels of heavy metals (lead, arsenic, mercury, and cadmium) complied with the standard limits. Total bacterial count and yeast and mould counts were within limits, whereas E. coli and Salmonella were absent from the extract. The contents of strychnine and brucine were found to be 4.75% and 3.91%, respectively. Conclusions: These studies provide valuable information for correct identification of the drug and its discrimination from various adulterants. In the future this study will be helpful for the quantitative analysis as well as the standardization of Strychnos nux-vomica L.

  20. Machinability of drilling T700/LT-03A carbon fiber reinforced plastic (CFRP) composite laminates using candle stick drill and multi-facet drill

    Science.gov (United States)

    Wang, Cheng-Dong; Qiu, Kun-Xian; Chen, Ming; Cai, Xiao-Jiang

    2015-03-01

    Carbon Fiber Reinforced Plastic (CFRP) composite laminates are widely used in aerospace and aircraft structural components due to their superior properties. However, they are regarded as difficult-to-cut materials because of poor surface quality and low productivity. Drilling is the most common hole-making process for CFRP composite laminates, and drilling-induced delamination damage usually occurs severely at the exit side of drilled holes, strongly deteriorating hole quality. In this work, the candle stick drill and the multi-facet drill are employed to evaluate the machinability of drilling T700/LT-03A CFRP composite laminates in terms of thrust force, delamination, hole diameter, and hole surface roughness. The S/N ratio is used to characterize the thrust force, while an ellipse-shaped delamination model is established to quantitatively analyze the delamination. The best combination of drilling parameters is determined by full consideration of the S/N ratios of the thrust force and the delamination. The results indicate that the candle stick drill induces the unexpected ellipse-shaped delamination even at its best drilling parameters of a spindle speed of 10,000 rpm and a feed rate of 0.004 mm/tooth. However, the multi-facet drill, cutting at the relatively low feed rate of 0.004 mm/tooth and the lower spindle speed of 6000 rpm, can effectively prevent the delamination. Overall, the hole quality obtained with the multi-facet drill is far superior to that obtained with the candle stick drill.
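
    The abstract's two key quantities have common textbook forms, which are assumed here (the paper's exact definitions may differ): the Taguchi smaller-the-better S/N ratio for thrust force, and an area-based delamination factor for an elliptical damage zone. The numbers are invented:

```python
# Sketch: Taguchi S/N ratio and an ellipse-based delamination factor.
import math

def sn_smaller_better(values):
    """Taguchi S/N ratio (dB) when smaller responses are better."""
    return -10 * math.log10(sum(v ** 2 for v in values) / len(values))

def ellipse_delamination(a_mm, b_mm, d0_mm):
    """Delamination area of an ellipse (semi-axes a, b) over the hole area."""
    return (math.pi * a_mm * b_mm) / (math.pi * (d0_mm / 2) ** 2)

thrust_n = [38.2, 40.1, 36.9]  # made-up repeated thrust-force readings (N)
print(f"S/N = {sn_smaller_better(thrust_n):.2f} dB")
print(f"delamination ratio = {ellipse_delamination(4.2, 3.6, 6.0):.2f}")
```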

  1. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    Science.gov (United States)

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which ISO standards are used. Although there is evidence of contributions provided by these standards, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, designed on a structuralist world view with a descriptive, deductive method, which aims to analyze the convergence of the management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000, and ISO 26000) with sustainable development and positioning them according to organizational level (strategic, tactical, and operational). The structure was designed based on the Brundtland report concept. The analysis was performed by exploring the generic framework for possible convergence based on the Nadler and Tushman model. The results show that the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.

  2. Two-dimensional gel-based protein standardization verified by western blot analysis.

    Science.gov (United States)

    Haniu, Hisao; Watanabe, Daisuke; Kawashima, Yusuke; Matsumoto, Hiroyuki

    2015-01-01

    In the data presentation of biochemical investigations, the amount of a target protein is shown on the y-axis against an x-axis representing time, concentrations of various agents, or other parameters. Western blotting is a versatile and convenient tool in such analyses to quantify and display the amount of protein. In western blotting, so-called housekeeping gene products, or "housekeeping proteins," are widely used as internal standards. The rationale for using housekeeping proteins to standardize western blots rests on the assumption that the expression of the chosen housekeeping gene is always constant, which can be false under certain physiological or pathological conditions. We have devised a two-dimensional gel-based standardization method in which the protein content of each sample is determined by scanning the total protein density of two-dimensional gels, and the expression of each protein is quantified as the density of that protein divided by the total protein density of the gel. The advantage of this standardization method is that it does not rely on any presumed "housekeeping proteins" that are supposed to be expressed constantly under all physiological conditions. We show that the total density of a two-dimensional gel can provide a reliable protein standardization parameter.
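
    A minimal sketch of the normalization described — each spot expressed as a fraction of the total protein density on the gel — with invented densitometry values:

```python
# Sketch: normalize one spot against the total protein density of the gel.
def normalized_expression(spot_density, all_spot_densities):
    """Express one spot as a fraction of total protein density on the gel."""
    return spot_density / sum(all_spot_densities)

gel = {"target": 1.8e4, "spot_b": 9.5e3, "spot_c": 4.1e5}  # densitometry units
frac = normalized_expression(gel["target"], gel.values())
print(f"target protein = {100 * frac:.2f}% of total gel density")
```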

  3. Analytical standards production for the analysis of pomegranate anthocyanins by HPLC

    Directory of Open Access Journals (Sweden)

    Manuela Cristina Pessanha de Araújo Santiago

    2014-03-01

    Pomegranate (Punica granatum L.) is a fruit with a long medicinal history, especially due to its content of phenolic compounds, such as the anthocyanins, which are reported to be among the most important natural antioxidants. Analysis of the anthocyanins by high performance liquid chromatography (HPLC) can be considered an important tool to evaluate the quality of pomegranate juice. For research laboratories, the major challenge in using HPLC for quantitative analyses is the acquisition of high-purity analytical standards, since these are expensive and in some cases not even commercially available. The aim of this study was to obtain analytical standards for the qualitative and quantitative analysis of the anthocyanins from pomegranate. Five vegetable matrices (pomegranate flower and jambolan, jabuticaba, blackberry and strawberry fruits) were used to isolate each of the six anthocyanins present in pomegranate fruit, using analytical-scale HPLC with non-destructive detection, so that the isolates could subsequently be used as analytical standards. Furthermore, their identities were confirmed by high resolution mass spectrometry. The proposed procedure showed that it is possible to obtain analytical standards of anthocyanins with a high purity grade (98.0 to 99.9%) from natural sources, which proved to be an economical strategy for laboratories to produce standards according to their research requirements.

  4. ANSI/ASHRAE/IES Standard 90.1-2010 Final Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Liu, Bing

    2011-10-31

    The U.S. Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2010 (ASHRAE Standard 90.1-2010, Standard 90.1-2010, or 2010 edition) would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2007 (ASHRAE Standard 90.1-2007, Standard 90.1-2007, or 2007 edition). The final analysis considered each of the 109 addenda to ASHRAE Standard 90.1-2007 that were included in ASHRAE Standard 90.1-2010. All 109 addenda processed by ASHRAE in the creation of Standard 90.1-2010 from Standard 90.1-2007 were reviewed by DOE, and their combined impact on a suite of 16 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, of the 109 addenda, 34 were preliminarily determined to have a measurable and quantifiable impact. A suite of 240 computer energy simulations for building prototypes complying with ASHRAE 90.1-2007 was developed. These prototypes were then modified in accordance with the 34 addenda to create a second suite of corresponding building simulations reflecting the same buildings compliant with Standard 90.1-2010. The building simulations were conducted using the DOE EnergyPlus building simulation software. The resulting energy use from the complete suite of 480 simulation runs was then converted to energy use intensity (EUI, or energy use per unit floor area) metrics (Site EUI, Primary EUI, and energy cost intensity [ECI]) for each simulation. For each edition of the standard, these EUIs were then aggregated to a national basis for each prototype using weighting factors based
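
    The EUI metrics and their national aggregation are straightforward arithmetic. A minimal sketch with hypothetical prototype results and construction-area weights (the report's actual weighting factors are not reproduced here):

    ```python
    def site_eui(annual_site_energy_kbtu, floor_area_ft2):
        """Site energy use intensity: energy use per unit floor area."""
        return annual_site_energy_kbtu / floor_area_ft2

    def national_weighted_eui(euis, weights):
        """Aggregate prototype/climate-zone EUIs to a national figure
        using new-construction-area weighting factors."""
        return sum(e * w for e, w in zip(euis, weights)) / sum(weights)

    # Hypothetical: one prototype simulated in four climate zones.
    euis_2007 = [52.0, 48.5, 61.2, 58.9]   # kBtu/ft2-yr, Standard 90.1-2007 runs
    euis_2010 = [47.1, 44.0, 55.3, 53.6]   # kBtu/ft2-yr, Standard 90.1-2010 runs
    weights = [0.30, 0.20, 0.35, 0.15]     # share of new construction area
    saving = 1 - (national_weighted_eui(euis_2010, weights) /
                  national_weighted_eui(euis_2007, weights))
    print(f"national site-EUI saving: {saving:.1%}")
    ```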

  5. Standard model for the safety analysis report of nuclear fuel reprocessing plants

    International Nuclear Information System (INIS)

    1980-02-01

    This standard establishes the Standard Model for the Safety Analysis Report of Nuclear Fuel Reprocessing Plants, comprising the presentation format and the level of detail of the minimum information required by the CNEN for evaluating requests for a Construction License or an Operation Authorization, in accordance with the legislation in force. This regulation applies to the following basic reports: the Preliminary Safety Analysis Report (PSAR), an integral part of the Construction License request; and the Final Safety Analysis Report (FSAR), an integral part of the Operation Authorization request.

  6. Error analysis for duct leakage tests in ASHRAE standard 152P

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J.W.

    1997-06-01

    This report presents an analysis of random uncertainties in the two methods of testing for duct leakage in Standard 152P of the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE). The test method is titled Standard Method of Test for Determining Steady-State and Seasonal Efficiency of Residential Thermal Distribution Systems. Equations have been derived for the uncertainties in duct leakage for given levels of uncertainty in the measured quantities used as inputs to the calculations. Tables of allowed errors in each of these independent variables, consistent with fixed criteria of overall allowed error, have been developed.
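
    The report's derived equations are not reproduced in this record, but the standard machinery behind such an error analysis is first-order propagation of independent random errors. A minimal sketch under that assumption:

    ```python
    import numpy as np

    def propagated_sigma(partials, sigmas):
        """Root-sum-square propagation for independent inputs:
        sigma_f = sqrt(sum((df/dx_i * sigma_i)**2))."""
        p, s = np.asarray(partials, float), np.asarray(sigmas, float)
        return float(np.sqrt(np.sum((p * s) ** 2)))

    # Hypothetical: duct leakage inferred as a difference of two measured
    # flows, Q = Q1 - Q2, so the partial derivatives are +1 and -1.
    print(propagated_sigma([1.0, -1.0], [5.0, 4.0]))  # cfm
    ```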

  7. An introduction to TR-X: a simplified tool for standardized analysis

    Energy Technology Data Exchange (ETDEWEB)

    Johns, Russell C [Los Alamos National Laboratory; Waters, Laurie S [Los Alamos National Laboratory; Fallgren, Andrew J [Los Alamos National Laboratory; Ghoreson, Gregory G [UNIV OF TEXAS]

    2010-09-09

    TR-X is a multi-platform program that provides a graphical interface to Monte Carlo N-Particle transport (MCNP) and Monte Carlo N-Particle transport eXtended (MCNPX) codes. Included in this interface are tools to reduce the tedium of input file creation, provide standardization of model creation and analysis, and expedite the execution of the created models. TR-X provides tools to make the rapid testing of multiple permutations of these models easier, while also building in standardization that allows multiple solutions to be compared.

  8. Error Analysis of Ia Supernova and Query on Cosmic Dark Energy ...

    Indian Academy of Sciences (India)

    The idea of the single accreting white dwarf model (the 'standard model') of SNIa explosions has been called into question by recent research (Phillips 1993; Wang et al. 2013), including observations of the Tycho SNR from 2008 to 2010. The peak luminosity of SNIa can therefore no longer be taken as a standard candle, owing to the different progenitors of SNIa.

  9. Cost-Effectiveness Analysis of 1-Year Treatment with Golimumab/Standard Care and Standard Care Alone for Ulcerative Colitis in Poland.

    Directory of Open Access Journals (Sweden)

    Ewa Stawowczyk

    The objective of this study was to assess the cost-effectiveness of induction and maintenance treatment of ulcerative colitis for up to 1 year with golimumab/standard care versus standard care alone in Poland. A Markov model was used to estimate the expected costs and effects of golimumab/standard care and of standard care alone. For each treatment option, the costs and quality-adjusted life years were calculated to estimate the incremental cost-utility ratio. The analysis was performed from the perspective of the Polish public payer and society over a 30-year time horizon. The clinical parameters were derived mainly from the PURSUIT-SC and PURSUIT-M clinical trials. Different direct and indirect costs and utility values were assigned to the various model health states. Treatment of ulcerative colitis patients with golimumab/standard care instead of standard care alone resulted in 0.122 additional years of life in full health. Treatment with golimumab/standard care was found to be more expensive than treatment with standard care alone from both the public payer and the social perspective. The incremental cost-utility ratio of golimumab/standard care compared with standard care alone is estimated at 391,252 PLN/QALY gained (93,155 €/QALY gained) from the public payer perspective and 374,377 PLN/QALY gained (89,137 €/QALY gained) from the social perspective. The biologic treatment of ulcerative colitis patients with golimumab/standard care is more effective but also more costly compared with standard care alone.
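
    The incremental cost-utility ratio reported above is just the ratio of incremental cost to incremental QALYs. As an arithmetic illustration (the incremental cost itself is not stated in the abstract, so the figure below is a back-calculation, not a reported result):

    ```python
    # ICUR = (cost_new - cost_old) / (qaly_new - qaly_old)
    delta_qaly = 0.122    # reported QALY gain for golimumab/standard care
    icur = 391_252        # reported PLN per QALY, public payer perspective

    implied_delta_cost = icur * delta_qaly
    print(f"implied incremental cost: {implied_delta_cost:,.0f} PLN")  # ~47,700 PLN
    ```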

  10. Facile Fabrication and Characterization of a PDMS-Derived Candle Soot Coated Stable Biocompatible Superhydrophobic and Superhemophobic Surface.

    Science.gov (United States)

    Iqbal, R; Majhy, B; Sen, A K

    2017-09-13

    We report a simple, inexpensive, rapid, and one-step method for the fabrication of a stable and biocompatible superhydrophobic and superhemophobic surface. The proposed surface comprises candle soot particles embedded in a mixture of PDMS + n-hexane serving as the base material. The mechanism responsible for the superhydrophobic behavior of the surface is explained, and the surface is characterized in terms of its morphology and elemental composition, wetting properties, mechanical and chemical stability, and biocompatibility. The effects of the percentage of n-hexane in PDMS, the thickness of the PDMS + n-hexane layer (in terms of spin-coating speed), and the sooting time on the wetting property of the surface are studied. The proposed surface exhibits nanoscale surface asperities (average roughness of 187 nm), the chemical composition of soot particles, and very high water and blood repellency, along with excellent mechanical and chemical stability and excellent biocompatibility with blood samples and biological cells. The water contact angle and roll-off angle are measured as 160° ± 1° and 2°, respectively, and the blood contact angle is found to be 154° ± 1°, which indicates that the surface is superhydrophobic and superhemophobic. The proposed superhydrophobic and superhemophobic surface offers significantly improved (>40%) cell viability compared with glass and PDMS surfaces.

  11. Role of "standard" fine-group cross section libraries in shielding analysis

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Roussin, R.W.; Oblow, E.M.; Cullen, D.E.; White, J.E.; Wright, R.Q.

    1977-01-01

    The Divisions of Magnetic Fusion Energy (DMFE) and Reactor Development and Demonstration (DRDD) of the United States Energy Research and Development Administration (ERDA) have jointly sponsored the development of a 171-neutron-group, 36-gamma-ray-group, pseudo-composition-independent cross section library based upon ENDF/B-IV. This library (named VITAMIN-C and packaged by RSIC as DLC-41) is intended to be generally applicable to fusion blanket and LMFBR core and shield analysis. The purpose of this paper is to evaluate this library as a possible candidate for specific designation as a "standard" in light of American Nuclear Society standards for fine-group cross section data sets. The rationale and qualification procedure for such a standard are discussed. Finally, current limitations and anticipated extensions to this processed data file are described.

  12. Usage of Latent Class Analysis in Diagnostic Microbiology in the Absence of Gold Standard Test

    Directory of Open Access Journals (Sweden)

    Gul Bayram Abiha

    2016-12-01

    The evaluation of the performance of various diagnostic tests in the absence of a gold standard is an important problem. Latent class analysis (LCA) is a statistical method that has been known for many years and has found a wide field of application, especially in the evaluation of diagnostic tests when no gold standard exists. During the last decade, the LCA method has been widely used to determine the sensitivity and specificity of different microbiological tests. It has been investigated in the diagnosis of Mycobacterium tuberculosis, Mycobacterium bovis, human papilloma virus, Bordetella pertussis, influenza viruses, hepatitis E virus (HEV), hepatitis C virus (HCV) and various other viral infections. Researchers have compared several diagnostic tests for the diagnosis of different pathogens with LCA. We aimed to evaluate the performance of the latent class analysis method as used for microbiological diagnosis of various diseases in several studies. Taking all of these test results into account, we conclude that LCA is a good statistical method for assessing the performance of different tests in the absence of a gold standard. [Archives Medical Review Journal 2016; 25(4): 467-488]
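
    As a concrete illustration of the technique the review surveys, here is a minimal two-class latent class model fitted by expectation-maximization, assuming conditional independence of the tests given the latent disease status. This is a sketch only: identifiability generally requires at least three tests, and real analyses add standard errors, model-fit checks, and multiple starting values.

    ```python
    import numpy as np

    def two_class_lca(x, n_iter=300):
        """Minimal EM for a 2-class latent class model of binary diagnostic
        results (rows = subjects, columns = tests), estimating disease
        prevalence and each test's sensitivity/specificity without a gold
        standard."""
        x = np.asarray(x, dtype=float)
        prev = 0.5
        se = np.full(x.shape[1], 0.8)
        sp = np.full(x.shape[1], 0.8)
        for _ in range(n_iter):
            # E-step: posterior probability that each subject is diseased.
            l1 = prev * np.prod(se ** x * (1 - se) ** (1 - x), axis=1)
            l0 = (1 - prev) * np.prod((1 - sp) ** x * sp ** (1 - x), axis=1)
            post = l1 / (l1 + l0)
            # M-step: re-estimate prevalence, sensitivities, specificities.
            prev = post.mean()
            se = (post[:, None] * x).sum(axis=0) / post.sum()
            sp = ((1 - post)[:, None] * (1 - x)).sum(axis=0) / (1 - post).sum()
        return prev, se, sp

    # Simulated check: 3 imperfect tests, true prevalence 0.30.
    rng = np.random.default_rng(1)
    disease = rng.random(2000) < 0.30
    tests = np.column_stack([np.where(disease, rng.random(2000) < 0.85,
                                      rng.random(2000) < 0.10) for _ in range(3)])
    print(two_class_lca(tests))  # recovers roughly (0.30, ~0.85, ~0.90)
    ```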

  13. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect of guaranteeing the assessment of the student's competence level. The aim was to conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), comprising 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). Participants were 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis considered face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were both higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance; these factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. The assessment tool has proven to be sufficiently reliable and valid for assessing the clinical competence of nursing students using standardized patients.
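
    Internal consistency of the kind reported here (>0.80) is typically Cronbach's alpha. A minimal sketch with hypothetical item scores on the 0/1/2 scale described above:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance
        of the total score). Rows = students, columns = items."""
        x = np.asarray(scores, dtype=float)
        k = x.shape[1]
        return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum()
                              / x.sum(axis=1).var(ddof=1))

    # Hypothetical 0/1/2 ratings for five students on four of the 27 items.
    print(round(cronbach_alpha([[2, 1, 2, 2], [1, 1, 2, 1], [0, 1, 1, 0],
                                [2, 2, 2, 2], [1, 0, 1, 1]]), 3))  # ~0.882
    ```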

  14. An analysis of violations of OSHA's (1987) Occupational Exposure to Benzene Standard.

    Science.gov (United States)

    Williams, Pamela R D

    2014-01-01

    The Occupational Safety and Health Administration (OSHA), which was formed by the Occupational Safety and Health Act of 1970 (OSH Act), establishes enforceable health and safety standards in the workplace and issues violations and penalties for non-compliance with these standards. The purpose of the current study was to evaluate the number and type of violations of the OSHA (1987) Occupational Exposure to Benzene Standard. Violations of the OSHA Hazard Communication Standard (HCS), particularly those that may pertain to specific provisions of the benzene standard, were also assessed. All analyses were based on OSHA inspection data that have been collected since the early 1970s and that are publicly available from the U.S. Department of Labor enforcement website. Analysis of these data shows that fewer than a thousand OSHA violations of the benzene standard have been issued over the last 25+ years. The results for benzene contrast with those for some other toxic and hazardous substances regulated by OSHA, such as blood-borne pathogens, lead, and asbestos, for which tens of thousands of OSHA violations have been issued. The number of benzene standard violations also varies by time period, standard provision, industry sector, and other factors. In particular, the greatest number of benzene standard violations occurred during the late 1980s to early/mid 1990s, soon after the 1987 final benzene rule was promulgated. The majority of benzene standard violations pertain to noncompliance with specific provisions and subprovisions of the standard dealing with initial exposure monitoring requirements, the communication of hazards to employees, and medical surveillance programs. Only a small fraction of HCS violations are attributed, at least in part, to potential benzene hazards in the workplace. In addition, most benzene standard violations are associated with specific industries within the manufacturing sector.

  15. The Efficacy of Standardized Patient Feedback in Clinical Teaching: A Mixed Methods Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Doyle Howley, PhD

    2004-01-01

    Introduction. The purpose of the current study was to investigate the effects of oral feedback from standardized patients on medical students' overall perceptions of an educational exercise. We chose a mixed-methods approach to better understand the following research questions: Does satisfaction with the standardized patient exercise differ between students who receive oral feedback and those who do not? What is the quality of oral feedback provided by standardized patients? Procedures. To address the first question, a basic randomized design comparing treatment (those receiving SP feedback) to control (those not receiving SP feedback) was conducted. To address the second question, students in the treatment group were surveyed about their impressions of the quality of the feedback provided to them by their SP. One hundred and thirty-six first-year medical students were divided into treatment and control groups, and each interviewed one standardized patient during a single 20-minute encounter. Standardized patients were trained to simulate one of two outpatient cases and to provide feedback using standard training materials. Both treatment and control groups completed a rating scale and questionnaire regarding their satisfaction with the encounter, and students in the treatment group responded to additional questions regarding the quality of the SP feedback. Results. A one-way multivariate analysis of variance (MANOVA) revealed significant differences between the control and treatment groups on the seven combined dependent variables, Wilks' Λ = .890, F(7, 127) = 2.25, p = .034, η² = .110. Students reported that the quality of SP feedback was very strong, and additional qualitative analysis revealed further evidence to support the efficacy of providing oral SP feedback in a formative pre-clinical educational activity.

  16. Negative-pressure therapy versus standard wound care: a meta-analysis of randomized trials.

    Science.gov (United States)

    Suissa, Daniel; Danino, Alain; Nikolis, Andreas

    2011-11-01

    Several randomized controlled trials comparing negative-pressure therapy to standard wound care for chronic wounds have been published. Although these studies suggest a benefit for negative-pressure therapy, the majority of the review articles on the topic conclude that the studies are inconclusive. The authors conducted a quantitative meta-analysis of the effectiveness of negative-pressure therapy for the management of chronic wounds. The MEDLINE, EMBASE, and Cochrane databases were searched from 1993 to March of 2010 for randomized controlled trials comparing negative-pressure therapy to standard wound care for chronic wounds. Measures of wound size and time to healing, along with the corresponding p values, were extracted from the randomized controlled trials. Relative change ratios of wound size and ratios of median time to healing were combined using a random effects model for meta-analysis. Ten trials of negative-pressure therapy versus standard wound care were found. In the negative-pressure therapy group, wound size had decreased significantly more than in the standard wound care group (relative change ratio, 0.77; 95 percent confidence interval, 0.63 to 0.96). Time to healing was significantly shorter in the negative-pressure therapy group in comparison with the standard wound care group (ratio of median time to healing, 0.74; 95 percent confidence interval, 0.70 to 0.78). This quantitative meta-analysis of randomized trials suggests that negative-pressure therapy appears to be an effective treatment for chronic wounds. An effect of publication bias cannot be ruled out. Level of evidence: Therapeutic, II.
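
    The pooling step behind a result like the relative change ratio of 0.77 (95% CI, 0.63 to 0.96) is typically inverse-variance random-effects pooling of log-transformed ratios. A minimal DerSimonian-Laird sketch with hypothetical per-trial inputs (the authors' exact weights and data are not given in the abstract):

    ```python
    import numpy as np

    def random_effects_pool(log_ratios, variances):
        """DerSimonian-Laird random-effects pooling of per-trial log ratios
        (e.g., log relative change in wound size); returns the pooled ratio
        and its 95% confidence interval."""
        y, v = np.asarray(log_ratios, float), np.asarray(variances, float)
        w = 1.0 / v
        fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - fixed) ** 2)                      # Cochran's Q
        tau2 = max(0.0, (q - (len(y) - 1)) /
                   (np.sum(w) - np.sum(w ** 2) / np.sum(w)))  # between-trial variance
        w_star = 1.0 / (v + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return np.exp(pooled), (np.exp(pooled - 1.96 * se),
                                np.exp(pooled + 1.96 * se))

    # Hypothetical log relative-change ratios and variances from five trials.
    print(random_effects_pool([-0.36, -0.11, -0.22, -0.40, -0.05],
                              [0.02, 0.03, 0.015, 0.04, 0.025]))
    ```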

  17. The Nature of Science and the Next Generation Science Standards: Analysis and Critique

    Science.gov (United States)

    McComas, William F.; Nouri, Noushin

    2016-08-01

    This paper provides a detailed analysis of the inclusion of aspects of nature of science (NOS) in the Next Generation Science Standards (NGSS). In this new standards document, NOS elements in eight categories are discussed in Appendix H along with illustrative statements (called exemplars). Many, but not all, of these exemplars are linked to the standards by their association with either the "practices of science" or "crosscutting concepts," but curiously not with the recommendations for science content. The study investigated all aspects of NOS in NGSS including the accuracy and inclusion of the supporting exemplar statements and the relationship of NOS in NGSS to other aspects of NOS to support teaching and learning science. We found that while 92% of these exemplars are acceptable, only 78% of those written actually appear with the standards. "Science as a way of knowing" is a recommended NOS category in NGSS but is not included with the standards. Also, several other NOS elements fail to be included at all grade levels thus limiting their impact. Finally, NGSS fails to include or insufficiently emphasizes several frequently recommended NOS elements such as creativity and subjectivity. The paper concludes with a list of concerns and solutions to the challenges of NOS in NGSS.

  18. Are standards effective in improving automobile fuel economy? An international panel analysis

    International Nuclear Information System (INIS)

    Clerides, Sofronis; Zachariadis, Theodoros

    2007-01-01

    Although the adoption of fuel economy standards has induced fuel savings in new motor vehicles, there are arguments against standards and in favour of fuel tax increases, because the latter may have lower welfare costs. We therefore analyze the impact of standards and fuel prices on the fuel consumption of new cars with the aid of a cross-section time series analysis of data from 18 countries. To our knowledge, this study is the first that attempts to explore this issue econometrically at an international level. We built an unbalanced panel comprising 384 observations from the US, Canada, Australia, Japan, Switzerland and 13 EU countries, spanning the period between 1975 and 2003. We specified a dynamic panel model of fuel economy and estimated the model for the whole sample and also for North America and Europe separately. Based on these estimates, we derive three important policy conclusions. First, it seems that if there were no FE standards or voluntary targets in force, transportation energy use would increase more rapidly. Second, if CO2 targets are not to be tightened in Europe, retail fuel prices might have to double in order to attain the currently discussed target of 120 g CO2/km in the future. Third, without higher fuel prices and/or tighter FE standards, one should not expect any marked improvements in fuel economy under 'business as usual' conditions. European policy makers may need to consider this issue carefully, because some recent European studies tend to be optimistic in this respect.
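
    The specification sketched in the abstract (current fuel economy regressed on its own lag, fuel price, and a standards indicator, with country effects) can be illustrated with a least-squares-dummy-variables regression. The authors' actual estimator and data are not reproduced here; the panel below is entirely hypothetical:

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical country-year panel of new-car fuel consumption (L/100 km).
    df = pd.DataFrame({
        "country":  ["US"] * 5 + ["DE"] * 5,
        "year":     list(range(1999, 2004)) * 2,
        "fe":       [10.5, 10.3, 10.2, 10.0, 9.9, 8.2, 8.1, 7.9, 7.8, 7.7],
        "price":    [0.40, 0.42, 0.45, 0.48, 0.50, 1.05, 1.10, 1.15, 1.18, 1.20],
        "standard": [1, 1, 1, 1, 1, 0, 0, 0, 1, 1],  # FE standard/target in force
    })
    df["fe_lag"] = df.groupby("country")["fe"].shift(1)

    # Dynamic panel with country fixed effects (simple LSDV approximation).
    model = smf.ols("fe ~ fe_lag + price + standard + C(country)",
                    data=df.dropna()).fit()
    print(model.params)
    ```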

  19. The BrainMap strategy for standardization, sharing, and meta-analysis of neuroimaging data

    Directory of Open Access Journals (Sweden)

    Bzdok Danilo

    2011-09-01

    Background: Neuroimaging researchers have developed rigorous community data and metadata standards that encourage meta-analysis as a method for establishing robust and meaningful convergence of knowledge of human brain structure and function. Capitalizing on these standards, the BrainMap project offers databases, software applications, and other associated tools for supporting and promoting quantitative coordinate-based meta-analysis of the structural and functional neuroimaging literature. Findings: In this report, we describe recent technical updates to the project and provide an educational description for performing meta-analyses in the BrainMap environment. Conclusions: The BrainMap project will continue to evolve in response to the meta-analytic needs of biomedical researchers in the structural and functional neuroimaging communities. Future work on the BrainMap project regarding software and hardware advances is also discussed.

  20. Standard Practice for Processing Aerospace Liquid Samples for Particulate Contamination Analysis Using Membrane Filters

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice covers the processing of liquids in preparation for particulate contamination analysis using membrane filters and is limited only by liquid-to-membrane filter compatibility. 1.2 The practice covers the procedure for filtering a measured volume of liquid through a membrane filter. When this practice is used, the particulate matter will be randomly distributed on the filter surface for subsequent contamination analysis methods. 1.3 The practice describes procedures for handling particles in the size range between 2 and 1000 μm with minimum losses. 1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard.

  1. Behavioral Public Economics: Welfare and Policy Analysis with Non-Standard Decision-Makers

    OpenAIRE

    B. Douglas Bernheim; Antonio Rangel

    2005-01-01

    This paper has two goals. First, we discuss several emerging approaches to applied welfare analysis under non-standard (“behavioral”) assumptions concerning consumer choice. This provides a foundation for Behavioral Public Economics. Second, we illustrate applications of these approaches by surveying behavioral studies of policy problems involving saving, addiction, and public goods. We argue that the literature on behavioral public economics, though in its infancy, has already fundamentally ...

  2. Economic analysis of passive houses and low-energy houses compared to standard houses

    OpenAIRE

    Audenaert, Amaryllis; De Cleyn, S; Vankerckhove, B

    2007-01-01

    As the energy demand used for space heating accounts for 78% of EU15 household delivered energy consumption, significant reductions in energy demand can be achieved by promoting low-energy buildings. Our study investigates three building types: the standard house, the low-energy house and the passive house. As more far-reaching measures concerning energy savings usually lead to higher investments, the aim of our study is to perform an economic analysis in order to determine the economic viability of the three building types.

  3. [Standard NF EN ISO 15189: comparative analysis with GBEA and implementation of the new reference support].

    Science.gov (United States)

    Rogowski, Julien; Annaix, Véronique

    2010-01-01

    As part of the reflections on the healthcare system in France, Ballereau's report, published in 2008, proposes changes to medical biology underpinned by quality. The new reference system will be ISO 15189, which specifies requirements for competence and quality. It differs from the GBEA (Guidelines for Good Execution of Analyses) in its requirement for a quality management system. We carried out a comparative study of these two reference standards to identify their differences, and thus the elements to be developed or supported in the accreditation process.

  4. Analysis of standard fracture toughness test based on digital image correlation data

    Czech Academy of Sciences Publication Activity Database

    Jandejsek, Ivan; Gajdoš, Lubomír; Šperl, Martin; Vavřík, Daniel

    2017-01-01

    Vol. 182, September (2017), p. 607-620. ISSN 0013-7944. R&D Projects: GA ČR(CZ) GA15-07210S; GA TA ČR(CZ) TE02000162. Keywords: DIC; full-field measurement; J-integral; CTOD; ASTM standard. Subject RIV: JL - Materials Fatigue, Friction Mechanics. OECD field: Audio engineering, reliability analysis. Impact factor: 2.151, year: 2016. http://www.sciencedirect.com/science/article/pii/S0013794417305799

  5. Post flight analysis of NASA standard star trackers recovered from the solar maximum mission

    Science.gov (United States)

    Newman, P.

    1985-01-01

    The flight hardware returned after the Solar Maximum Mission Repair Mission was analyzed to determine the effects of 4 years in space. The NASA Standard Star Tracker would be a good candidate for such analysis because it is moderately complex and had a very elaborate calibration during the acceptance procedure. However, the recovery process extensively damaged the cathode of the image dissector detector making proper operation of the tracker and a comparison with preflight characteristics impossible. Otherwise, the tracker functioned nominally during testing.

  6. Cost-Utility Analysis of Infliximab with Standard Care versus Standard Care Alone for Induction and Maintenance Treatment of Patients with Ulcerative Colitis in Poland.

    Science.gov (United States)

    Stawowczyk, Ewa; Kawalec, Paweł; Pilc, Andrzej

    2016-05-01

    To assess the cost-effectiveness of infliximab with standard care (e.g., azathioprine, prednisolone, mesalazine, and 6-mercaptopurine) versus standard care alone for induction and maintenance treatment of patients with ulcerative colitis (UC) in Poland, a cost-utility decision analytic model was developed. A Markov model was used to estimate the expected costs and effects of infliximab/standard care and standard care alone. For each treatment option, costs and quality-adjusted life-years (QALYs) were calculated to estimate the incremental cost-utility ratio. The target population consisted of a hypothetical cohort of adult patients with moderately to severely active UC who had an inadequate response to standard treatment, including corticosteroids and 6-mercaptopurine or azathioprine, or who were intolerant to or had medical contraindications to such therapies. The analysis was performed from the perspective of the Polish public payer over a 30-year time horizon. The clinical parameters were derived mainly from the Active Ulcerative Colitis Trial (ACT) 1 and ACT 2 and from the Ulcerative Colitis Long-term Remission and Maintenance with Adalimumab (ULTRA) 2 clinical trial. Different costs and utility values were assigned to the various health states in the model; utility values were derived from a previously published study. Treatment of patients who received infliximab/standard care instead of standard care alone resulted in 0.174 additional QALYs. Treatment with infliximab/standard care was found to be more expensive than treatment with standard care alone from the Polish National Health Fund perspective. The incremental cost-utility ratio of infliximab/standard care compared with standard care alone was estimated to be 402,420 Polish zlotys (PLN)/QALY gained (95% confidence interval [CI] 253,936-531,450 PLN/QALY gained), which is equivalent to $106,743 (U.S. dollars)/QALY gained (95% CI $67,357-140,968 [U.S. dollars]/QALY gained).

  7. Designing a Standard Model for Development and Execution of an Analysis Project Plan

    OpenAIRE

    Willis, Leslie T.

    2012-01-01

    Within the Operational Simulation and Analysis (OS and A) branch of the U.S. Army Armament Research, Development, and Engineering Center (ARDEC) at Picatinny Arsenal, there exists no standard model for development and execution of an Analysis Project Plan. A project plan is a formal document which, when agreed upon by parties involved, guides the execution and control of a project. Having such a plan is important to the OS and A branch and ARDEC as a whole because it documents decisions, faci...

  8. Integrated Data Collection Analysis (IDCA) Program - RDX Standard Data Set 2

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Air Force Research Lab. (AFRL), Tyndall Air Force Base, FL (United States); Shelley, Timothy J. [Applied Research Associates, Tyndall Air Force Base, FL (United States); Reyes, Jose A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-02-20

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, tested for the second time in the proficiency study. Compared with the first round (Set 1), this RDX testing (Set 2) was found to have about the same impact sensitivity, greater BAM friction sensitivity, lower ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity.

  9. The Influence of Adaptation and Standardization of the Marketing Mix on Performance: a Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Vinícius Andrade Brei

    2011-07-01

    This article analyzes the relationship between strategies of standardization and adaptation of the marketing mix and performance in an international context. We carried out a meta-analysis on a sample of 23 studies published between 1992 and 2010. The sample was analyzed based on measures of the effect size (ES), that is, the strength of the relation (Wolf, 1986), between standardization/adaptation and performance. The results suggest a relationship of medium strength (ES ranging from .133 to .209). The results support the existence of a positive impact of both marketing mix adaptation and standardization on performance. However, our results suggest that companies should slightly emphasize marketing mix adaptation (ES mean = .168) over standardization (ES mean = .134) when entering a new international market. Results also indicate that, among the adaptation choices, price (ES = .209) should be the first element of the marketing mix to be adapted, followed by promotion (ES = .155), product (ES = .154), and distribution (ES = .141). Finally, we suggest some new research paths, such as the use of quantitative methods to compare degrees of adaptation applied to different segments, regions, and sectors, among other suggestions.

  10. THE CRITICAL ANALYSIS OF LIMITED SOUTH ASIAN CORPORATE GOVERNANCE STANDARDS AFTER FINANCIAL CRISIS

    Directory of Open Access Journals (Sweden)

    Dinh Tran Ngoc Huy

    2015-12-01

    After the recent global crisis and the corporate scandals and bankruptcies in the US and Europe, there is clear evidence of weaknesses in corporate governance, risk management, and audit systems. The 2009 India Code of Corporate Governance also revealed certain weaknesses during the crisis period, although the corporate structure proved fairly durable. Hence, this paper takes a different analytical approach, and among its aims is to offer some systematic observations. First, it classifies a limited set of representative South Asian corporate governance (CG) standards into two groups: group 1, covering the latest CG principles of India and Malaysia, and group 2, a so-called relatively good CG group, comprising the corporate governance principles of Thailand and Indonesia, while using the ACCA, OECD, and ICGN principles as references. Second, through analysis, it identifies differences and advantages between the above sets of standards, which have been used as reference principles by many relevant organizations. Third, it establishes a selected comparative set of standards for a representative South Asian corporate governance system in accordance with international standards. Last but not least, the paper offers some ideas and policy suggestions.

  11. Comparative Analysis of Norwegian Passive House Criteria and of Criteria related to the Concept of International Passive House Standard

    DEFF Research Database (Denmark)

    Anton, Karin; Vestergaard, Inge

    2013-01-01

    The analysis shows differences in the definition of passive house criteria. It also identifies aspects of the passive house concept that are not completely transferred by the Norwegian passive house standard.

  12. The Analysis of the Psychological Tests Used in Educational Institutions According to the Testing Standards

    Directory of Open Access Journals (Sweden)

    Ezgi MOR DİRLİK

    2017-12-01

    The purpose of this research is to analyze four psychological tests, frequently used in the Guidance and Research Centers and in the guidance services of schools, according to the standards for educational and psychological testing of the APA (American Psychological Association) and the test adaptation standards of the ITC (International Test Commission). The tests were determined by purposive sampling from among the psychological tests most frequently used in Guidance and Research Centers and school guidance services. These tests are: the Scale of Academic Self-Concept (Akademik Benlik Kavramı Ölçeği, ABKÖ), the Gazi Evaluation of Early Childhood Development Tool (Gazi Erken Çocukluk Gelişimi Değerlendirme Aracı, GEÇDA), Primary Mental Abilities 7-11 (TKT 7-11), and the Wechsler Intelligence Scale for Children - Revised (WISC-R). In this research, the chapters related to validity, reliability, and test development and revision of the "Standards for Educational and Psychological Testing" (APA, 1999) and the adaptation standards developed by the ITC were translated into Turkish, and a checklist was created from these documents. The checklist has two forms, short and long. The tests were analyzed by the researcher according to the short form of the checklist. To examine the reliability of these analyses, the analyses were repeated after three weeks. The data were entered into the Statistical Package for the Social Sciences (SPSS 20.0) and descriptive analysis was performed. As a result of this research, the degree to which the psychological tests meet the standards in the checklist was determined, along with the features of the tests that should be improved with respect to validity, reliability, test development and revision, and test adaptation. In conclusion, the standards analyzed have not been met satisfactorily by ABKÖ and GEÇDA.

  13. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    Energy Technology Data Exchange (ETDEWEB)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-08-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the portion of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  14. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the portion of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  15. Is It Working? Distractor Analysis Results from the Test Of Astronomy STandards (TOAST) Assessment Instrument

    Science.gov (United States)

    Slater, Stephanie

    2009-05-01

    The Test Of Astronomy STandards (TOAST) assessment instrument is a multiple-choice survey tightly aligned to the consensus learning goals stated by the American Astronomical Society - Chair's Conference on ASTRO 101, the American Association for the Advancement of Science's Project 2061 Benchmarks, and the National Research Council's National Science Education Standards. Researchers from the Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team at the University of Wyoming's Science and Math Teaching Center (UWYO SMTC) have been conducting a question-by-question distractor analysis to determine the sensitivity and effectiveness of each item. In brief, the frequency with which each possible answer choice, known as a foil or distractor on a multiple-choice test, is selected is determined and compared to the existing literature on the teaching and learning of astronomy. In addition to having acceptable statistical difficulty and discrimination values, a well-functioning assessment item will show students selecting distractors in proportions consistent with known misconceptions and reasoning difficulties. Our distractor analysis suggests that all items are functioning as expected. These results add weight to the validity of the Test Of Astronomy STandards (TOAST) assessment instrument, which is designed to help instructors and researchers measure the impact of course-length instructional strategies in undergraduate science survey courses with learning goals tightly aligned to the consensus goals of the astronomy education community.
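
    The mechanics of such an item and distractor analysis are simple to reproduce. A minimal sketch with hypothetical response data and answer key (the TOAST items themselves are not reproduced here):

    ```python
    import pandas as pd

    def distractor_analysis(choices, key):
        """Classical item analysis: difficulty (proportion correct),
        discrimination (point-biserial vs. the rest of the test), and the
        distractor selection frequencies compared against the literature."""
        correct = pd.DataFrame({q: (choices[q] == a).astype(int)
                                for q, a in key.items()})
        total = correct.sum(axis=1)
        report = {}
        for q in key:
            rest = total - correct[q]               # corrected total score
            report[q] = {
                "difficulty": correct[q].mean(),
                "discrimination": correct[q].corr(rest),
                "distractor_freq": choices[q].value_counts(normalize=True).to_dict(),
            }
        return report

    # Hypothetical selections of six students on two items.
    choices = pd.DataFrame({"q1": list("ABABCA"), "q2": list("DDCDBD")})
    print(distractor_analysis(choices, key={"q1": "A", "q2": "D"}))
    ```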

  16. The Gold Standard Paradox in Digital Image Analysis: Manual Versus Automated Scoring as Ground Truth.

    Science.gov (United States)

    Aeffner, Famke; Wilson, Kristin; Martin, Nathan T; Black, Joshua C; Hendriks, Cris L Luengo; Bolon, Brad; Rudmann, Daniel G; Gianani, Roberto; Koegler, Sally R; Krueger, Joseph; Young, G Dave

    2017-09-01

    Context: Novel therapeutics often target complex cellular mechanisms. Increasingly, quantitative methods like digital tissue image analysis (tIA) are required to evaluate correspondingly complex biomarkers to elucidate subtle phenotypes that can inform treatment decisions with these targeted therapies. These tIA systems need a gold standard, or reference method, to establish analytical validity. Conventional, subjective histopathologic scores assigned by an experienced pathologist are the gold standard in anatomic pathology and are an attractive reference method: the pathologist's score can establish the ground truth against which a tIA solution's analytical performance is assessed. The paradox of this validation strategy, however, is that tIA is often used to assist pathologists in scoring complex biomarkers precisely because it is more objective and reproducible than manual evaluation alone, overcoming known biases in a human's visual evaluation of tissue, and because it can generate endpoints that cannot be generated by a human observer. Objective: To discuss common visual and cognitive traps known in traditional pathology-based scoring paradigms that may impact characterization of tIA-assisted scoring accuracy, sensitivity, and specificity. Data sources: This manuscript reviews the literature of recent decades on traditional subjective pathology scoring paradigms and the known cognitive and visual traps relevant to these scoring paradigms. Conclusions: Awareness of the gold standard paradox is necessary when using traditional pathologist scores to analytically validate a tIA tool, because image analysis is used specifically to overcome known sources of bias in the visual assessment of tissue sections.

  17. Pharmacognostic standardization and physicochemical analysis of the leaves of Barleria montana Wight & Nees

    Directory of Open Access Journals (Sweden)

    Sriram Sridharan

    2016-03-01

    Objective: To investigate the pharmacognostic features and physicochemical properties of the leaves of Barleria montana Wight & Nees. Methods: The leaf samples were subjected to organoleptic, microscopic and macroscopic analysis. Physicochemical properties and the fluorescence of the sample under UV and daylight were studied as per World Health Organization norms. Results: Microscopic analysis showed that the plant possesses dorsiventral leaves, lamina, glandular trichomes, calcium carbonate cystoliths and an adaxial epidermis. Physicochemical characters such as ash and moisture content, extractive values, foreign matter and fluorescence characteristics of the leaf samples were determined and reported. Conclusions: The results obtained from these studies can be used as reliable markers in the identification and standardization of this plant as a herbal remedy.

  18. A Prospective Analysis of the Costs, Benefits, and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Krishnan, Venkat [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-12-01

    This report evaluates the future costs, benefits, and other impacts of renewable energy used to meet current state renewable portfolio standards (RPSs). It also examines a future scenario in which RPSs are expanded. The analysis examines changes in electric system costs and retail electricity prices, which include all fixed and operating costs, including capital costs for all renewable, non-renewable, and supporting (e.g., transmission and storage) electric sector infrastructure; fossil fuel, uranium, and biomass fuel costs; and plant operations and maintenance expenditures. The analysis evaluates three specific benefits: air pollution, greenhouse gas emissions, and water use. It also analyzes two other impacts: the renewable energy workforce and economic development, and natural gas price suppression. This analysis finds that the benefits of renewable energy used to meet RPS policies exceed the costs, even when considering the highest-cost and lowest-benefit outcomes.

  19. Standard test methods for chemical, mass spectrometric, spectrochemical, nuclear, and radiochemical analysis of uranium hexafluoride

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 These test methods cover procedures for subsampling and for chemical, mass spectrometric, spectrochemical, nuclear, and radiochemical analysis of uranium hexafluoride (UF6). Most of these test methods are in routine use to determine conformance to UF6 specifications in the Enrichment and Conversion Facilities. 1.2 The analytical procedures in this document appear in the following order (Note 1: Subcommittee C26.05 will confer with C26.02 concerning the renumbered sections in Test Methods C761 to determine how concerns with renumbering these sections, as analytical methods are replaced with stand-alone analytical methods, are best addressed in subsequent publications): Subsampling of Uranium Hexafluoride, Sections 7-10; Gravimetric Determination of Uranium, Sections 11-19; Titrimetric Determination of Uranium, Section 20; Preparation of High-Purity U3O8, Section 21; Isotopic Analysis, Section 22; Isotopic Analysis by Double-Standard Mass-Spectrometer Method, Sections 23-29; Determination of Hydrocarbons, Chlorocarbons, and Partially Substitut...

  20. Uncertainty analysis of standardized measurements of random-incidence absorption and scattering coefficients.

    Science.gov (United States)

    Müller-Trapet, Markus; Vorländer, Michael

    2015-01-01

    This work presents an analysis of the effect of some uncertainties encountered when measuring absorption or scattering coefficients in the reverberation chamber according to International Organization for Standardization/American Society for Testing and Materials standards. This especially relates to the uncertainty due to spatial fluctuations of the sound field. By analyzing the mathematical definition of the respective coefficient, a relationship between the properties of the chamber and the test specimen and the uncertainty in the measured quantity is determined and analyzed. The validation of the established equations is presented through comparisons with measurement data. This study analytically explains the main sources of error and provides a method to obtain the product of the necessary minimum number of measurement positions and the band center frequency to achieve a given maximum uncertainty in the desired quantity. It is shown that this number depends on the ratio of room volume to sample surface area and the reverberation time of the empty chamber.

  1. Design of Standards and Labeling programs in Chile: Techno-Economic Analysis for Refrigerators

    Energy Technology Data Exchange (ETDEWEB)

    Letschert, Virginie E. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; McNeil, Michael A. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; Pavon, Mariana [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; Lutz, Wolfgang F. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.

    2013-05-01

    Lawrence Berkeley National Laboratory is a global leader in the study of energy efficiency and its effective implementation through government policy. The Energy Analysis and Environmental Impacts Department of LBNL's Environmental Energy Technologies Division provides technical assistance to help federal, state and local government agencies in the United States, and throughout the world, develop long-term strategies, policy, and programs to encourage energy efficiency in all sectors and industries. In the past, LBNL has assisted the staff of various countries' government agencies and their contractors by providing methodologies to analyze the cost-effectiveness of regulations and assess the overall national impacts of efficiency programs. The paper presents the work done in collaboration with the Ministry of Energy (MoE) in Chile and the Collaborative Labeling and Appliance Standards Program (CLASP) on designing Minimum Energy Performance Standards (MEPS) and extending the current labeling program for refrigerators.

  2. Standardizing risk analysis for the evaluation of oil and gas properties

    International Nuclear Information System (INIS)

    Robinson, J. G.

    1996-01-01

    Notwithstanding the advances made in 3-D seismic, horizontal drilling, and completion techniques, as well as in computer applications, all of which have improved our ability to find and produce new reserves, the associated risks, both technical and economic, have also increased. Various efforts to standardize reserve evaluation and to deal with the growing uncertainty are discussed; most of these have failed in the face of great reluctance to change. The objective of this paper is to emphasize the need to incorporate and standardize the application of risk, to propose a revised 'expected value concept' for the economic evaluation of reserves, and to dispel the myth that statistical procedures are difficult, time-consuming, and expensive to apply. The essential characteristics of the various statistical procedures used in risk assessment, such as Monte Carlo simulation, expected value determination, and decision tree analysis, are summarized. 8 refs., 9 tabs., 11 figs.
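
    The 'expected value concept' and Monte Carlo simulation mentioned above are easy to demonstrate, which supports the paper's point that these procedures need not be time-consuming or expensive. A minimal sketch with entirely hypothetical prospect economics:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Hypothetical prospect: 35% chance of success, lognormal reserves,
    # a fixed per-barrel netback, and a sunk dry-hole cost.
    p_success = 0.35
    reserves = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=n)   # MMboe
    npv_success = reserves * 4.0e6 - 3.0e6                          # $ net of development
    dry_hole_npv = -1.5e6

    npv = np.where(rng.random(n) < p_success, npv_success, dry_hole_npv)
    print(f"expected value: ${npv.mean():,.0f}")
    print("P90/P50/P10 NPV:", np.percentile(npv, [10, 50, 90]).round(0))
    ```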

  3. Using multiple continuous fine particle monitors to characterize tobacco, incense, candle, cooking, wood burning, and vehicular sources in indoor, outdoor, and in-transit settings

    Science.gov (United States)

    Ott, Wayne R.; Siegmann, Hans C.

    This study employed two continuous particle monitors operating on different measurement principles to measure concentrations simultaneously from common combustion sources in indoor, outdoor, and in-transit settings. The pair of instruments use (a) photo-charging (PC), operating on the principle of ionization of fine particles, which responds to surface particulate polycyclic aromatic hydrocarbons (PPAHs), and (b) diffusion charging (DC), calibrated to measure the active surface area of fine particles. The sources studied included: (1) secondhand smoke (cigarettes, cigars, and pipes), (2) incense (stick and cone), (3) candles used as food warmers, (4) cooking (toasting bread and frying meat), (5) fireplaces and ambient wood smoke, and (6) in-vehicle exposures traveling on California arterials and interstate highways. The ratio of the PC to the DC readings, or the PC/DC ratio, was found to differ across major categories of sources. Cooking, burning toast, and using a "canned heat" food warmer gave PC/DC ratios close to zero. Controlled experiments with 10 cigarettes averaged 0.15 ng mm⁻² (ranging from 0.11 to 0.19 ng mm⁻²), which was similar to the PC/DC ratio for a cigar, although a pipe was slightly lower (0.09 ng mm⁻²). Large incense sticks had PC/DC ratios similar to those of cigarettes and cigars. The PC/DC ratios for ambient wood smoke averaged 0.29 ng mm⁻² on 6 dates, about twice those of cigarettes and cigars, reflecting a higher ratio of PAH to active surface area. The smoke from two artificial logs in a residential fireplace had a PC/DC ratio of 0.33-0.35 ng mm⁻². The emissions from candles were found to vary, depending on how the candles were burned: a candle that flickered and generated soot gave a higher PC/DC ratio than one burning uniformly in still air. Inserting a piece of metal into the candle's flame caused high PPAH emissions, with a record PC/DC reading of 1.8 ng mm⁻². In-vehicle exposures measured on 43- and 50-min drives on a

  4. Confirmatory factors analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating-scale questionnaire with reliability 0.986. The sample group included 500 science teachers from World-Class Standard Schools, selected using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted using Mplus for Windows. The results are as follows: the confirmatory factor analysis of science teacher leadership in the Thailand World-Class Standard Schools revealed that the model significantly correlated with the empirical data. The consistency index values were χ² = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The six factor loadings ranged between 0.871 and 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.
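
    For readers unfamiliar with the fit indices quoted above, the sketch below computes RMSEA from the reported χ², degrees of freedom and sample size under one common formulation; software packages differ slightly in the exact formula, which is why the result (~0.020) only approximates the reported 0.022.

    ```python
    import math

    def rmsea(chi2: float, df: int, n: int) -> float:
        """Root mean square error of approximation (one common formulation)."""
        return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

    # Values reported in the abstract: chi2 = 105.655, df = 88, N = 500
    print(f"RMSEA = {rmsea(105.655, 88, 500):.3f}")  # ~0.020, near the reported 0.022
    ```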

  5. Accurate determination of arsenic in arsenobetaine standard solutions of BCR-626 and NMIJ CRM 7901-a by neutron activation analysis coupled with internal standard method.

    Science.gov (United States)

    Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki

    2010-09-15

    Neutron activation analysis (NAA) coupled with an internal standard method was applied to the determination of As in certified reference materials (CRMs) of arsenobetaine (AB) standard solutions, to verify their certified values. Gold was used as an internal standard to compensate for differences in neutron exposure within an irradiation capsule and to improve sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 μg of As. The analytical reliability of the proposed method was evaluated by k₀-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499 ± 55) mg kg⁻¹ (k = 2) and (10.16 ± 0.15) mg kg⁻¹ (k = 2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
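
    To illustrate the internal-standard idea, the toy calculation below normalizes the As activity by the Au monitor so that sample-to-sample differences in neutron exposure cancel; all counts and masses are invented for illustration, not taken from the paper.

    ```python
    # Illustrative internal-standard correction (hypothetical numbers): the Au
    # monitor normalizes out sample-to-sample differences in neutron exposure.
    sample_as_counts = 15800.0   # As peak counts in the sample
    sample_au_counts = 9100.0    # Au internal-standard counts in the sample
    std_as_counts = 20100.0      # counts for a comparator with known As mass
    std_au_counts = 9800.0
    std_as_mass_ug = 1.00        # known As mass in the comparator (ug)

    as_mass_ug = std_as_mass_ug * (sample_as_counts / sample_au_counts) \
                                / (std_as_counts / std_au_counts)
    print(f"As in sample: {as_mass_ug:.3f} ug")
    ```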

  6. Economic analysis of passive houses and low-energy houses compared with standard houses

    International Nuclear Information System (INIS)

    Audenaert, A.; Cleyn, S.H. de; Vankerckhove, B.

    2008-01-01

    As the energy demand used for space heating accounts for 78% of EU15 household delivered energy consumption, significant reductions in energy demand can be achieved by promoting low-energy buildings. Our study investigates three building types: the standard house, the low-energy house and the passive house. As more far-reaching measures concerning energy savings usually lead to higher investments, the aim of our study is to perform an economic analysis in order to determine the economic viability of the three building types

  7. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    The reliability data of Korean NPP that reflects the plant specific characteristics is necessary for PSA and Risk Informed Application. We have performed a project to develop the component reliability DB and calculate the component reliability such as failure rate and unavailability. We have collected the component operation data and failure/repair data of Korean standard NPPs. We have analyzed failure data by developing a data analysis method which incorporates the domestic data situation. And then we have compared the reliability results with the generic data for the foreign NPPs

  8. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS.

    Science.gov (United States)

    Creech, J B; Baker, J A; Handler, M R; Bizzarro, M

    2014-01-10

    We report a method for the chemical purification of Pt from geological materials by ion-exchange chromatography for subsequent Pt stable isotope analysis by multiple-collector inductively coupled plasma mass spectrometry (MC-ICPMS), using a ¹⁹⁶Pt-¹⁹⁸Pt double-spike to correct for instrumental mass bias. Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily dissolved in acid in preparation for anion-exchange chemistry. Pt was recovered from the anion-exchange resin in concentrated HNO₃ acid after elution of matrix elements, including the other platinum group elements (PGE), in dilute HCl and HNO₃ acids. The separation method has been calibrated using a precious metal standard solution doped with a range of synthetic matrices, and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng g⁻¹ to 4 μg g⁻¹. This analytical method has been shown to have an external reproducibility on δ¹⁹⁸Pt (the per mil difference in the ¹⁹⁸Pt/¹⁹⁴Pt ratio from the IRMM-010 standard) of ±0.040 (2 sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2 sd). Pt stable isotope data for the full set of reference materials have a range of δ¹⁹⁸Pt values with offsets of up to 0.4‰ from the IRMM-010 standard, which are readily resolved with this technique. These

  9. Quantile regression and clustering analysis of standardized precipitation index in the Tarim River Basin, Xinjiang, China

    Science.gov (United States)

    Yang, Peng; Xia, Jun; Zhang, Yongyong; Han, Jian; Wu, Xia

    2017-11-01

    Because drought is a very common and widespread natural disaster, it has attracted a great deal of academic interest. Based on 12-month standardized precipitation indices (SPI12) calculated from precipitation data recorded between 1960 and 2015 at 22 weather stations in the Tarim River Basin (TRB), this study aims to identify the trends of SPI and of drought duration, severity, and frequency at various quantiles, and to perform cluster analysis of drought events in the TRB. The results indicated that (1) both precipitation and temperature at most stations in the TRB exhibited significant positive trends during 1960-2015; (2) multiple scales of SPIs changed significantly around 1986; (3) based on quantile regression analysis of temporal drought changes, the positive SPI slopes indicated less severe and less frequent droughts at lower quantiles, but clear variation was detected in the drought frequency; and (4) the trends found for severe droughts differed significantly from those for drought frequency overall.
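
    A minimal sketch of the quantile-regression trend analysis described above, using statsmodels on a synthetic SPI series; the station data, column names, and the trend itself are hypothetical, for illustration only.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical SPI12 series for one station (the study uses 22 stations, 1960-2015)
    rng = np.random.default_rng(0)
    years = np.arange(1960, 2016)
    df = pd.DataFrame({"year": years,
                       "spi": 0.01 * (years - 1960) + rng.normal(0, 1, years.size)})

    # Trend in the lower tail (droughts) vs. the median and upper tail
    for q in (0.1, 0.5, 0.9):
        slope = smf.quantreg("spi ~ year", df).fit(q=q).params["year"]
        print(f"q={q:.1f}: SPI trend = {slope:+.4f} per year")
    ```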

  10. Statistical analysis of solid waste composition data: Arithmetic mean, standard deviation and correlation coefficients

    DEFF Research Database (Denmark)

    Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte

    2017-01-01

    Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal … and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing mean, standard deviation and correlation coefficients.
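
    As a concrete illustration of the transformation step the authors call for, here is a minimal sketch of the centred log-ratio (clr) transform, one standard way to open closed compositional data before computing means or correlations; the waste composition values are hypothetical.

    ```python
    import numpy as np

    def clr(composition: np.ndarray) -> np.ndarray:
        """Centred log-ratio transform for a composition (parts summing to 100%)."""
        x = np.asarray(composition, dtype=float)
        g = np.exp(np.mean(np.log(x)))          # geometric mean of the parts
        return np.log(x / g)

    # Hypothetical waste composition (% of total): food, paper, plastic, glass, other
    sample = np.array([42.0, 23.0, 15.0, 8.0, 12.0])
    print(clr(sample))          # transformed values are free of the sum constraint
    print(clr(sample).sum())    # clr values sum to ~0 by construction
    ```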

  11. Retrospective Analysis of the Benefits and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-06

    This analysis is the first-ever comprehensive assessment of the benefits and impacts of state renewable portfolio standards (RPSs). This joint National Renewable Energy Laboratory-Lawrence Berkeley National Laboratory project provides a retrospective analysis of RPS program benefits and impacts, including greenhouse gas emissions reductions, air pollution emission reductions, water use reductions, gross jobs and economic development impacts, wholesale electricity price reduction impacts, and natural gas price reduction impacts. Wherever possible, benefits and impacts are quantified in monetary terms. The paper will inform state policymakers, RPS program administrators, industry, and others about the costs and benefits of state RPS programs. In particular, the work seeks to inform decision-making surrounding ongoing legislative proposals to scale back, freeze, or expand existing RPS programs, as well as future discussions about increasing RPS targets or otherwise increasing renewable energy associated with Clean Power Plan compliance or other emission-reduction goals.

  12. A unification of models for meta-analysis of diagnostic accuracy studies without a gold standard.

    Science.gov (United States)

    Liu, Yulun; Chen, Yong; Chu, Haitao

    2015-06-01

    Several statistical methods for meta-analysis of diagnostic accuracy studies have been discussed in the presence of a gold standard. However, in practice, the selected reference test may be imperfect due to measurement error, non-existence, invasive nature, or expensive cost of a gold standard. It has been suggested that treating an imperfect reference test as a gold standard can lead to substantial bias in the estimation of diagnostic test accuracy. Recently, two models have been proposed to account for imperfect reference test, namely, a multivariate generalized linear mixed model (MGLMM) and a hierarchical summary receiver operating characteristic (HSROC) model. Both models are very flexible in accounting for heterogeneity in accuracies of tests across studies as well as the dependence between tests. In this article, we show that these two models, although with different formulations, are closely related and are equivalent in the absence of study-level covariates. Furthermore, we provide the exact relations between the parameters of these two models and assumptions under which two models can be reduced to equivalent submodels. On the other hand, we show that some submodels of the MGLMM do not have corresponding equivalent submodels of the HSROC model, and vice versa. With three real examples, we illustrate the cases when fitting the MGLMM and HSROC models leads to equivalent submodels and hence identical inference, and the cases when the inferences from two models are slightly different. Our results generalize the important relations between the bivariate generalized linear mixed model and HSROC model when the reference test is a gold standard. © 2014, The International Biometric Society.

  13. Analysis of RIA standard curve by log-logistic and cubic log-logit models

    International Nuclear Information System (INIS)

    Yamada, Hideo; Kuroda, Akira; Yatabe, Tami; Inaba, Taeko; Chiba, Kazuo

    1981-01-01

    In order to improve the goodness-of-fit in RIA standard curve analysis, programs for computing log-logistic and cubic log-logit fits were written in BASIC using a personal computer P-6060 (Olivetti). The iterative least squares method based on Taylor series was applied for non-linear estimation of the logistic and log-logistic fits. Here 'log-logistic' denotes Y = (a − d)/(1 + (log(X)/c)^b) + d. As weights, either 1, 1/var(Y) or 1/σ² was used in the logistic or log-logistic fits, and either Y²(1 − Y)², Y²(1 − Y)²/var(Y), or Y²(1 − Y)²/σ² was used in the quadratic or cubic log-logit fits. The term var(Y) represents the square of the pure error, and σ² represents the estimated variance calculated using the equation log(σ² + 1) = log(A) + J log(y). As indicators of goodness-of-fit, MSL/S_e², CMD% and WRV (see text) were used. Better regression was obtained for alpha-fetoprotein by log-logistic than by logistic. The cortisol standard curve was fitted much better with cubic log-logit than with quadratic log-logit. The predicted precision of the AFP standard curve was below 5% with log-logistic instead of 8% with logistic analysis. The predicted precision obtained using cubic log-logit was about five times lower than that with quadratic log-logit. The importance of selecting good models in RIA data processing was stressed in conjunction with the intrinsic precision of the radioimmunoassay system indicated by the predicted precision. (author)
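
    The quoted log-logistic model is straightforward to fit with modern tools; the sketch below uses scipy's curve_fit on a synthetic standard curve rather than the iterative Taylor-series least squares of the paper, and all doses and responses are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def log_logistic(x, a, b, c, d):
        # Y = (a - d) / (1 + (log(X)/c)**b) + d, as quoted in the abstract
        return (a - d) / (1.0 + (np.log(x) / c) ** b) + d

    # Hypothetical standard curve: dose (arbitrary units > 1) vs. bound fraction
    x = np.array([2.0, 5.0, 15.0, 50.0, 150.0, 500.0])
    true = log_logistic(x, 0.95, 2.5, np.log(40.0), 0.05)
    y = true + np.array([0.01, -0.012, 0.008, -0.01, 0.009, -0.006])  # "noise"

    popt, _ = curve_fit(log_logistic, x, y, p0=[1.0, 2.0, 3.0, 0.0])
    print("a, b, c, d =", np.round(popt, 3))
    ```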

  14. The Apprentice: an Innovative Approach to Meet the Behavior Analysis Certification Board's Supervision Standards.

    Science.gov (United States)

    Hartley, Breanne K; Courtney, William T; Rosswurm, Mary; LaMarca, Vincent J

    2016-12-01

    The Behavior Analysis Certification Board continues to increase the standards for supervision of trainees, which is needed in order for the field to continually improve. However, this presents a challenge for organizations to meet the needs of both their clients and their supervisees based on these increasing standards. Throughout the ages, experts in all trades have passed along their knowledge and skill through apprenticeship opportunities. An apprenticeship supervision model is described that allows Board Certified Behavior Analysts to supervise future behavior analysts by mentoring, educating, and training supervisees on the science of human behavior in a format that is mutually beneficial. This innovative supervision model is discussed as it applies to an applied behavior analysis human service organization with the goal of creating a system that results in high-quality supervision in a cost-effective manner while providing maximal learning for the supervisee. The organization's previous supervision difficulties are described prior to implementing the apprenticeship supervision model, and the benefits of developing and using the apprenticeship supervision model are outlined.

  15. Suitable pellets standards development for LA-ICPMS analysis of Al2O3 powders

    International Nuclear Information System (INIS)

    Ferraz, Israel Elias; Sousa, Talita Alves de; Silva, Ieda de Souza; Gomide, Ricardo Goncalves; Oliveira, Luis Claudio de

    2013-01-01

    Chemical and physical characterization of aluminium oxides is of special interest for the nuclear industry, despite the arduous chemical digestion process involved. Laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) is therefore an attractive method of analysis. However, due to the lack of suitable matrix-matched certified reference materials (CRMs) for such powders and for ceramic pellet analysis, LA-ICPMS has not yet been fully applied. Furthermore, establishing calibration curves for trace element quantification using external standards raises a significant problem. In this context, this work aimed to develop suitable standard pellets for establishing calibration curves for the determination of impurities in aluminium oxide powders by LA-ICPMS. Two different analytical strategies were pursued: (I) boric acid pressed pellets and (II) lithium tetraborate melted pellets, both spiked with high-purity oxides of Si, Mg, Ca, Na, Fe, Cr and Ni. Analytical strategy (II), which presented the best analytical parameters, was selected; a certified reference material was analyzed and the results compared. The limits of detection, linearity, precision, accuracy and recovery study results are presented and discussed. (author)

  16. Determination of 25 elements in biological standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Guzzi, G.; Pietra, R.; Sabbioni, E.

    1974-12-01

    The Standard and Certified Reference Materials programme of the JRC includes the determination of trace elements in complex biological samples delivered by the U.S. National Bureau of Standards: Bovine Liver (NBS SRM 1577), Orchard Leaves (NBS SRM 1571) and Tomato Leaves. The study was performed using neutron activation analysis. Due to the very low concentration of some elements, radiochemical group or elemental separation procedures were necessary. The paper describes the techniques used to analyse 25 elements. Computer-assisted instrumental neutron activation analysis with high-resolution Ge(Li) spectrometry was considerably advantageous in the determination of Na, K, Cl, Mn, Fe, Rb and Co, and in some cases of Ca, Zn, Cs, Sc, and Cr. For low contents of Ca, Mg, Ni and Si, special chemical separation schemes followed by Cerenkov counting have been developed. Two other separation procedures allowing the determination of As, Cd, Ga, Hg, Mo, Cu, Sr, Se, Ba and P have been set up. The first, simplified procedure involves the use of high-resolution Ge(Li) detectors; the second, more complete one involves a larger number of shorter measurements performed by simpler and more sensitive techniques, such as NaI(Tl) scintillation spectrometry and Cerenkov counting. The results obtained are presented and discussed

  17. Spectral analysis of a standard test track profile during passage of an agricultural tractor

    Directory of Open Access Journals (Sweden)

    M. Cutini

    2013-09-01

    National statistics on work safety point to a decreasing trend in work-related injuries and fatalities, but an increasing number of reports of occupational diseases. Mechanical vibrations, in particular whole-body vibrations (WBV), are considered in this frame. This study aims to reproduce and analyse the vertical displacement of the wheels of an agricultural tractor during passage over a standard surface (ISO 5008), in order to define potential correlations between the surface contour and its effects on vehicle dynamics and driver comfort, by analysis of the signals acting under the tractor tires. An agricultural tractor, in four setting conditions and at four different forward speeds, was tested on an ISO 5008 standard test track. The accelerations at the hubs of the tractor were acquired and subsequently reproduced on a four-actuator hydraulic test bench at the CRA-ING laboratories, Treviglio, Italy. Analysis of the generated spectra showed that a rough surface converts part of the energy associated with the forward speed of the vehicle into vertical accelerations that excite the elastic parts (i.e. tires, suspensions). These phenomena indicate that the magnitude of the vehicle’s vibration is set by the combination of surface roughness and forward speed in amplitude, and by the elastic properties of the vehicle in frequency.

  18. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to the individual steps of the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, referred to as Standard Laboratory Modules (SLMs), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of polychlorinated biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed
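
    A toy sketch of the 'plug and play' SLM/SAME idea: modules share one standardized behaviour and a host chains them into a complete analysis. The interface and module names are hypothetical, invented for illustration, not the CAA program's actual API.

    ```python
    from typing import Protocol

    class StandardLabModule(Protocol):
        """Hypothetical interface: every SLM exposes one standardized behaviour."""
        def process(self, sample: dict) -> dict: ...

    class SoxhletExtractor:
        def process(self, sample: dict) -> dict:
            return {**sample, "extract": True}

    class GasChromatograph:
        def process(self, sample: dict) -> dict:
            return {**sample, "pcb_ppm": 1.8}   # placeholder measurement

    def run_same(sample: dict, modules: list[StandardLabModule]) -> dict:
        # The SAME host orchestrates the modules in sequence, "plug and play"
        for module in modules:
            sample = module.process(sample)
        return sample

    print(run_same({"id": "soil-001"}, [SoxhletExtractor(), GasChromatograph()]))
    ```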

  19. Catalogue of standards at the Safety Analysis department library, the first of March 1987

    International Nuclear Information System (INIS)

    1987-01-01

    This report is a compilation of ANS and ANSI/ANS, ANSI/ASME American standards, Regulatory Guides (Power Reactor Division), and IEEE standards, to which are added ANSI American standards and AFNOR French standards, all of them updated to date. This computerized compilation is divided into three parts: 1) main descriptor categories; 2) a list of American standards in chronological order, a list of AFNOR French standards, and a list of Regulatory Guides in numerical order; 3) a list of descriptors in alphabetical order [fr

  20. Achieving the 30% Goal: Energy and Cost Savings Analysis of ASHRAE Standard 90.1-2010

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A.; Rosenberg, Michael I.; Richman, Eric E.; Wang, Weimin; Xie, YuLong; Zhang, Jian; Cho, Heejin; Mendon, Vrushali V.; Athalye, Rahul A.; Liu, Bing

    2011-05-24

    This Technical Support Document presents the energy and cost savings analysis that PNNL conducted to measure the potential energy savings of 90.1-2010 relative to 90.1-2004. PNNL conducted this analysis with inputs from many other contributors and sources of information. In particular, guidance and direction were provided by the Simulation Working Group under the auspices of the SSPC90.1. This report documents the approach and methodologies that PNNL developed to evaluate the energy savings achieved from the use of ASHRAE/IES Standard 90.1-2010. Specifically, this report describes PNNL's Progress Indicator process and methodology, EnergyPlus simulation framework, and prototype model descriptions. The report covers the combined upgrades from 90.1-2004 to 90.1-2010, a total of 153 addenda. PNNL reviewed and considered all 153 addenda for quantitative analysis in the Progress Indicator process; 53 of those are included in the quantitative analysis. This report provides information on the categorization of all of the addenda, a summary of their content, and a deeper explanation of the impact and modeling of the 53 identified addenda with quantitative savings.

  1. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    Science.gov (United States)

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore the methodological and institutional challenges in applying the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on a review of the scientific literature and the analysis of…

  2. Looking Backwards with the "Personnel Evaluation Standards": An Analysis of the Development and Implementation of a Statewide Teacher Assessment Program.

    Science.gov (United States)

    Ellett, Chad D.; And Others

    1996-01-01

    The "Personnel Evaluation Standards" of D. Stufflebeam (1988) were used as a framework for the historical analysis of Louisiana's effort to implement a statewide program to evaluate its 45,000 teachers for the purpose of renewable professional certification. Using the "Standards" provided insights into the evaluation process…

  3. Low-molecular-weight heparin versus standard heparin in general and orthopaedic surgery: a meta-analysis

    NARCIS (Netherlands)

    Nurmohamed, M. T.; Rosendaal, F. R.; Büller, H. R.; Dekker, E.; Hommes, D. W.; Vandenbroucke, J. P.; Briët, E.

    1992-01-01

    Low-molecular-weight heparins (LMWHs) have theoretical advantages over standard heparin as postoperative thromboprophylactic agents. We conducted a meta-analysis of studies reported between 1984 and April, 1991, in which LMWHs were compared with standard heparin for postoperative prophylaxis. We

  4. Evaluation of spectrographic standards for the carrier-distillation analysis of PuO2

    International Nuclear Information System (INIS)

    Martell, C.J.; Myers, W.M.

    1976-05-01

    Three plutonium metals whose impurity contents have been accurately determined are used to evaluate spectrographic standards. Best results are obtained when (1) highly impure samples are diluted, (2) the internal standard, cobalt, is used, (3) a linear curve is fitted to the standard data that bracket the impurity concentration, and (4) plutonium standards containing 22 impurities are used

  6. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Science.gov (United States)

    Witwer, Kenneth W.; Buzás, Edit I.; Bemis, Lynne T.; Bora, Adriana; Lässer, Cecilia; Lötvall, Jan; Nolte-‘t Hoen, Esther N.; Piper, Melissa G.; Sivaraman, Sarada; Skog, Johan; Théry, Clotilde; Wauben, Marca H.; Hochberg, Fred

    2013-01-01

    The emergence of publications on extracellular RNA (exRNA) and extracellular vesicles (EV) has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV) in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA)”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments. PMID:24009894

  7. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    Science.gov (United States)

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. Determination of arsenic in food and dietary supplement standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Paul, R.L.

    2013-01-01

    Arsenic was measured in food and dietary supplement standard reference materials by neutron activation analysis for the purpose of assigning certified or reference As mass fractions and to assess material homogeneity. Instrumental neutron activation analysis was used to value-assign As in candidate SRM 3532 Calcium Dietary Supplement and candidate SRM 3262 Hypericum perforatum (St. John's Wort) Aerial Parts down to about 100 μg/kg. Values were also determined for two additional candidate St. John's Wort SRMs with lower As mass fractions; however, interference from ²⁴Na and ⁸²Br limited the reproducibility of the method below 100 μg/kg. For measurement of lower As mass fractions, a radiochemical neutron activation analysis method, with extraction of As³⁺ into diethyl-dithiocarbamate in chloroform and detection limits down to 0.1 μg/kg, was used to value-assign As mass fractions for SRM 3280 Multivitamin/Multielement Tablets, for candidate SRM 3233 Fortified Breakfast Cereal, and at <10 μg/kg in candidate SRM 1845a Whole Egg Powder. (author)

  9. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    Science.gov (United States)

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on a review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and as a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  10. Early stage hot spot analysis through standard cell base random pattern generation

    Science.gov (United States)

    Jeon, Joong-Won; Song, Jaewan; Kim, Jeong-Lim; Park, Seongyul; Yang, Seung-Hune; Lee, Sooryong; Kang, Hokyu; Madkour, Kareem; ElManhawy, Wael; Lee, SeungJo; Kwan, Joe

    2017-04-01

    Due to the limited availability of DRC-clean patterns during process and RET recipe development, OPC recipes are not tested with high pattern coverage. A wide variety of patterns can help OPC engineers detect patterns sensitive to lithographic effects, so random pattern generation is needed to secure a robust OPC recipe. However, simple random patterns that do not reflect real product layout styles cannot cover patterning hotspots at production level, so using them for OPC optimization is ineffective; it is important to generate random patterns similar to real product patterns. This paper presents a strategy for generating random patterns based on design architecture information and preventing hotspots in the early process development stage through a tool called Layout Schema Generator (LSG). Using LSG, we generate standard-cell-based random patterns reflecting the real design cell structure: fin pitch, gate pitch and cell height. The output standard cells from LSG are fed to an analysis methodology that assesses their hotspot severity by assigning a score according to their optical image parameters (NILS, MEEF, %PV band), so that potential hotspots can be defined by their ranking. This flow is demonstrated on Samsung 7 nm technology, optimizing the OPC recipe early enough in the process to avoid problematic patterns.

  11. Uterine electromyogram database and processing function interface: An open standard analysis platform for electrohysterogram signals.

    Science.gov (United States)

    Terrien, Jérémy; Marque, Catherine; Gondry, Jean; Steingrimsdottir, Thora; Karlsson, Brynjar

    2010-02-01

    The uterine electromyogram or electrohysterogram (EHG) is one of the most promising biophysical markers of preterm labor. At this time, no recording parameter standard exists for EHG recordings, which can be a problem for the establishment of international multicentric trials. In this paper, we present a management and processing system dedicated to storing and processing EHG signals. This system can process EHG signals recorded under different experimental conditions, i.e. at different sampling frequencies. Signal management is performed through an easy-to-use graphical user interface. Other available functions include visualization, preprocessing and analysis of EHG signals. The proposed processing functions provide temporal, spectral and time-scale parameters drawn from the EHG literature. The results obtained from real signals recorded in two hospitals in two different countries are in accordance with the literature and demonstrate the potential of the proposed system. The incorporation of new functions is easy, due to a standardization of the EHG data formats. Copyright 2009 Elsevier Ltd. All rights reserved.
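
    As an example of the spectral parameters such a platform might compute, the sketch below estimates the median frequency of a synthetic EHG-like signal from its Welch power spectrum; the sampling rate and the signal itself are invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import welch

    # Hypothetical EHG burst sampled at 20 Hz (rates differ between centres,
    # which is why the platform standardizes the data format)
    fs = 20.0
    t = np.arange(0, 60, 1 / fs)
    rng = np.random.default_rng(1)
    ehg = np.sin(2 * np.pi * 0.45 * t) + 0.5 * rng.normal(size=t.size)

    f, pxx = welch(ehg, fs=fs, nperseg=256)
    cum = np.cumsum(pxx) / np.sum(pxx)
    median_freq = f[np.searchsorted(cum, 0.5)]
    print(f"median frequency = {median_freq:.2f} Hz")  # one classic EHG parameter
    ```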

  12. Standard practice for extreme value analysis of nonmetallic inclusions in steel and other microstructural features

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice describes a methodology to statistically characterize the distribution of the largest indigenous nonmetallic inclusions in steel specimens based upon quantitative metallographic measurements. The practice is not suitable for assessing exogenous inclusions. 1.2 Based upon the statistical analysis, the nonmetallic content of different lots of steels can be compared. 1.3 This practice deals only with the recommended test methods and nothing in it should be construed as defining or establishing limits of acceptability. 1.4 The measured values are stated in SI units. For measurements obtained from light microscopy, linear feature parameters shall be reported as micrometers, and feature areas shall be reported as square micrometers. 1.5 The methodology can be extended to other materials and to other microstructural features. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...
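
    A minimal sketch of the extreme-value approach the practice describes: fit a Gumbel distribution to per-area maximum inclusion sizes and extrapolate to a return period. The data and return period are hypothetical, and the exact reporting rules of the standard are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import gumbel_r

    # Hypothetical maxima: largest inclusion length (um) found in each of 24
    # control areas, in the spirit of the extreme-value method described above
    maxima = np.array([18.2, 22.5, 15.9, 27.1, 19.8, 24.3, 16.4, 21.0,
                       29.5, 17.7, 23.9, 20.6, 26.2, 18.9, 22.1, 25.4,
                       16.8, 21.7, 19.2, 28.0, 17.3, 23.1, 20.1, 24.8])

    loc, scale = gumbel_r.fit(maxima)
    # Characteristic largest inclusion at a chosen return period T (e.g. T = 1000)
    T = 1000.0
    L_max = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"Gumbel loc={loc:.1f}, scale={scale:.1f}; predicted max ~{L_max:.0f} um")
    ```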

  13. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  14. Preliminary analysis of the efficiency of non-standard divertor configurations in DEMO

    Directory of Open Access Journals (Sweden)

    F. Subba

    2017-08-01

    The standard Single Null (SN) divertor is currently expected to be installed in DEMO. However, a number of alternative configurations are being evaluated in parallel as backup solutions, in case the standard divertor does not extrapolate successfully from ITER to a fusion power plant. We used the SOLPS code to produce a preliminary analysis of two such configurations, the X-Divertor (XD) and the Super X-Divertor (SX), and compare them to the SN solution. Considering the nominal power flowing into the SOL (P_SOL = 150 MW), we estimated the amplitude of the acceptable DEMO operational space. The acceptability criterion was chosen as a plasma temperature at the target lower than 5 eV, providing low sputtering and at least partial detachment, while the operational space was defined in terms of the electron density at the outboard mid-plane separatrix and of the seeded impurity (Ar only in the present study) concentration. It was found that both the XD and the SXD extend the DEMO operational space, although the advantages detected so far are not dramatic. The most promising configuration seems to be the XD, which can produce acceptable target temperatures at moderate outboard mid-plane electron density (n_omp = 4.5 × 10¹⁹ m⁻³) and Z_eff = 1.3.

  15. Statistical analysis of supersymmetric dark matter in the minimal supersymmetric standard model after WMAP

    International Nuclear Information System (INIS)

    Profumo, S.; Yaguna, C.E.

    2004-01-01

    We study supersymmetric dark matter in the general flavor-diagonal minimal supersymmetric standard model by means of an extensive random scan of its parameter space. We find that, in contrast with the standard minimal supergravity lore, the large majority of viable models features either a Higgsino- or a wino-like lightest neutralino, and yields a relic abundance well below the Wilkinson Microwave Anisotropy Probe (WMAP) bound. Among the models with neutralino relic density within the WMAP range, Higgsino-like neutralinos are still dominant, though a sizable fraction of binos is also present. In this latter case, coannihilations are shown to be essential in order to obtain the correct neutralino abundance. We then carry out a statistical analysis and a general discussion of neutralino dark matter direct detection and of indirect neutralino detection at neutrino telescopes and at antimatter search experiments. We point out that current data exclude only a marginal portion of the viable parameter space, and that models whose thermal relic abundance lies in the WMAP range will be significantly probed only by future direct detection experiments. Finally, we emphasize the importance of relic density enhancement mechanisms for indirect detection prospects, in particular at future antimatter search experiments

  16. Analysis of thermal radiation in ion traps for optical frequency standards

    Science.gov (United States)

    Doležal, M.; Balling, P.; Nisbet-Jones, P. B. R.; King, S. A.; Jones, J. M.; Klein, H. A.; Gill, P.; Lindvall, T.; Wallin, A. E.; Merimaa, M.; Tamm, C.; Sanner, C.; Huntemann, N.; Scharnhorst, N.; Leroux, I. D.; Schmidt, P. O.; Burgermeister, T.; Mehlstäubler, T. E.; Peik, E.

    2015-12-01

    In many of the high-precision optical frequency standards with trapped atoms or ions that are under development to date, the ac Stark shift induced by thermal radiation leads to a major contribution to the systematic uncertainty. We present an analysis of the inhomogeneous thermal environment experienced by ions in various types of ion traps. Finite element models which allow the determination of the temperature of the trap structure and the temperature of the radiation were developed for five ion trap designs, including operational traps at PTB and NPL and further optimized designs. Models were refined based on comparison with infrared camera measurements until an agreement of better than 10% of the measured temperature rise at critical test points was reached. The effective temperature rises of the radiation seen by the ion range from 0.8 K to 2.1 K at standard working conditions. The corresponding fractional frequency shift uncertainties resulting from the uncertainty in temperature are in the 10⁻¹⁸ range for optical clocks based on the Sr⁺ and Yb⁺ E2 transitions, and even lower for Yb⁺ E3, In⁺ and Al⁺. Issues critical for heating of the trap structure and its predictability were identified and design recommendations developed.
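
    Because the blackbody-radiation Stark shift scales as T⁴, the temperature uncertainties quoted above translate into fractional frequency uncertainties roughly as sketched below; the clock coefficient and temperatures are hypothetical, chosen only to land in the 10⁻¹⁸ regime mentioned in the abstract.

    ```python
    # Back-of-envelope propagation (not from the paper): the BBR Stark shift
    # scales as T^4, so u(shift)/shift = 4 * u(T)/T for small u(T).
    def bbr_shift_uncertainty(shift_at_300k: float, t: float, u_t: float) -> float:
        shift = shift_at_300k * (t / 300.0) ** 4
        return abs(shift) * 4.0 * u_t / t

    # Hypothetical clock: fractional BBR shift of -5e-16 at 300 K; the ion sees
    # 300 K + 1.5 K with 0.15 K uncertainty (10% of the rise, as in the text)
    u = bbr_shift_uncertainty(-5e-16, 301.5, 0.15)
    print(f"fractional frequency uncertainty ~ {u:.1e}")  # ~1e-18 territory
    ```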

  17. Analysis of Standards and Specific Documentation about Equipment of Dimensional Metrology

    Science.gov (United States)

    Martin, M. J.; Flores, I.; Sebastian, M. A.

    2009-11-01

    Currently, the certification of quality systems and the accreditation of laboratories for metrology and testing are activities of great interest within the framework of advanced production systems. In this context, the availability of complete and efficient standardized documents, as well as specific documents edited by agencies with experience in this field, is especially important to obtain better results at lower cost. This work tries to establish the foundations for evaluating a documentation system for Dimensional Metrology equipment. No integrated and complete system exists at the international level, so the Spanish case is analyzed as an example for the general study. In this paper we consider three types of instruments commonly used in the field of Dimensional Metrology (vernier calliper, micrometer calliper and mechanical dial gauge) and analyze the contents of the UNE standards that affect them directly, together with the two collections of documents produced and edited by the Centro Español de Metrología (CEM): "calibration procedures" and "use manuals." Given the results of this analysis, a discussion of the metrological characteristics covered by the documents in question is developed, and recommendations for their use and improvement are proposed.

  18. Standard test methods for determining average grain size using semiautomatic and automatic image analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2015-01-01

    1.1 These test methods are used to determine grain size from measurements of grain intercept lengths, intercept counts, intersection counts, grain boundary length, and grain areas. 1.2 These measurements are made with a semiautomatic digitizing tablet or by automatic image analysis using an image of the grain structure produced by a microscope. 1.3 These test methods are applicable to any type of grain structure or grain size distribution as long as the grain boundaries can be clearly delineated by etching and subsequent image processing, if necessary. 1.4 These test methods are applicable to measurement of other grain-like microstructures, such as cell structures. 1.5 This standard deals only with the recommended test methods and nothing in it should be construed as defining or establishing limits of acceptability or fitness for purpose of the materials tested. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user ...

  19. A Comparative Energetic Analysis of Active and Passive Emission Control Systems Adopting Standard Emission Test Cycles

    Directory of Open Access Journals (Sweden)

    Angelo Algieri

    2012-01-01

    The present work aims at analysing and comparing the thermal performance of active and passive aftertreatment systems. A one-dimensional transient model has been developed in order to evaluate the heat exchange between the solid and the exhaust gas and to estimate the energy effectiveness of the apparatus. Furthermore, the effect of the engine operating conditions on the performance of emission control systems has been investigated considering standard emission test cycles. The analysis has demonstrated that the active flow control presents the higher thermal inertia and appears more suitable for maintaining the converter's initial temperature level for a longer time after variations in engine load. Conversely, the traditional passive flow control is preferable when rapid “cooling” or “heating” of the solid phase is required. Moreover, the investigation has highlighted the significant influence of the cycle time and converter length on the energy performance of the aftertreatment apparatus.

  20. Microspectrophotometric studies of Romanowsky stained blood cells. I. Subtraction analysis of a standardized procedure.

    Science.gov (United States)

    Galbraith, W; Marshall, P N; Bacus, J W

    1980-08-01

    This paper describes a microspectrophotometric study of blood smears stained by a simple, standardized Romanowsky technique, using only the dyes azure B and eosin. Absorbance spectra are presented for twenty-two classes of cellular object, and for the two dyes in solution, together with tabulations of spectral maxima and suitable wavelengths for use in automated image processing. The colours of objects stained with azure B/eosin are discussed in terms of absorbance spectra. By a spectral subtraction technique, it is shown that the differential colouration of various cell structures may be explained satisfactorily in terms of the varying proportions of only four dye components: the monomers and dimers of azure B and eosin. Polymerization was found to occur both in solution and on binding to biopolymers. A similar analysis of a conventional Romanowsky stain would present much greater difficulties, due to the greater number of dye components, which, however, contribute little to the colours observed.
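
    The subtraction analysis amounts to decomposing each measured spectrum into a non-negative mixture of the four dye component spectra; the sketch below does this with non-negative least squares on synthetic spectra, since the real reference spectra are not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(3)
    wavelengths = 60                      # sampled points across the visible band

    # Hypothetical reference spectra of the four dye components named above:
    # azure B monomer/dimer and eosin monomer/dimer (columns of A)
    A = np.abs(rng.normal(size=(wavelengths, 4)))

    # A "measured" cell spectrum built from known proportions plus a little noise
    true_weights = np.array([0.6, 0.2, 0.9, 0.1])
    measured = A @ true_weights + 0.01 * rng.normal(size=wavelengths)

    weights, residual = nnls(A, measured)  # non-negative least squares
    print("recovered dye proportions:", np.round(weights, 2))
    ```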

  1. Cyber crime: can a standard risk analysis help in the challenges facing business continuity managers?

    Science.gov (United States)

    Vande Putte, Danny; Verhelst, Marc

    Risk management has never been easy. Finding efficient mitigating measures is not always straightforward. Finding measures for cyber crime, however, is a really huge challenge because cyber threats are changing all the time. As the sophistication of these threats is growing, their impact increases. Moreover, society and its economy have become increasingly dependent on information and communication technologies. Standard risk analysis methodologies will help to score the cyber risk and to place it in the risk tolerance matrix. This will allow business continuity managers to figure out if there is still a gap with the maximum tolerable outage for time-critical business processes and if extra business continuity measures are necessary to fill the gap.
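
    A toy illustration of scoring a risk and placing it in a tolerance matrix, as the authors suggest; the matrix, scales and verdict labels are hypothetical, not taken from the paper.

    ```python
    # Toy illustration (not from the paper): score a risk and place it in a
    # likelihood x impact tolerance matrix.
    TOLERANCE = [  # rows: likelihood 1-3, columns: impact 1-3
        ["accept", "accept", "monitor"],
        ["accept", "monitor", "mitigate"],
        ["monitor", "mitigate", "mitigate"],
    ]

    def classify(likelihood: int, impact: int) -> str:
        return TOLERANCE[likelihood - 1][impact - 1]

    # Cyber risk: likelihood keeps rising as threats grow more sophisticated
    print(classify(likelihood=3, impact=3))   # -> 'mitigate'
    ```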

  2. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    Several groups are working on improving the existing standard and recommendations on WECS power performance measurement and analysis. One of them, besides the group working on this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 Ref. [9], to current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones regard the inherent characteristics of complex terrain, specifically the issue of site calibration and the uncertainties due to it. (Author)

  3. Fuzzy system for risk analysis in software projects through the attributes of quality standards iso 25000

    Directory of Open Access Journals (Sweden)

    Chau Sen Shia

    2014-02-01

    With the growth in demand for products and services in the IT area, companies encounter difficulties in establishing metrics or measures of service quality that address qualitative values measurably in their planning. In this work, fuzzy logic, the SQuaRE standard (for measurement of the quality of software products), the Likert scale, the GQM method (Goal-Question-Metric, an indicator of software quality) and Boehm's project risk analysis model were used to assess the quality of services and to support decision-making, according to demand and requests for software development. With the aim of improving the quality of the services provided, the application is used to integrate the team and follow the life cycle of a project from its initial phase, and to assist in the comparison with the proposed schedule during requirements elicitation.

  4. Analysis of Minimum Efficiency Performance Standards for Residential General Service Lighting in Chile

    Energy Technology Data Exchange (ETDEWEB)

    Letschert, Virginie E.; McNeil, Michael A.; Leiva Ibanez, Francisco Humberto; Ruiz, Ana Maria; Pavon, Mariana; Hall, Stephen

    2011-06-01

    Minimum Efficiency Performance Standards (MEPS) have been chosen as part of Chile's national energy efficiency action plan. As a first MEPS, the Ministry of Energy has decided to focus on a regulation for lighting that would ban the sale of inefficient bulbs, effectively phasing out the use of incandescent lamps. Following major economies such as the US (EISA, 2007), the EU (Ecodesign, 2009) and Australia (AS/NZS, 2008), which planned phase-outs based on minimum efficacy requirements, the Ministry of Energy has undertaken an impact analysis of a MEPS on the residential lighting sector. Fundacion Chile (FC) and Lawrence Berkeley National Laboratory (LBNL) collaborated with the Ministry of Energy and the National Energy Efficiency Program (Programa Pais de Eficiencia Energetica, or PPEE) to produce a techno-economic analysis of this future policy measure. LBNL has developed for CLASP (CLASP, 2007) a spreadsheet tool called the Policy Analysis Modeling System (PAMS) that allows evaluation of costs and benefits at the consumer level, as well as a wide range of impacts at the national level, such as energy savings, the net present value of savings, greenhouse gas (CO2) emission reductions and avoided generation capacity attributable to a specific policy. Because Chile has historically followed European schemes in energy efficiency programs (test procedures, labelling program definitions), we take Ecodesign Commission Regulation No 244/2009 as the starting point in defining our phase-out program, which means a tiered phase-out based on minimum efficacy per lumen category. The following data were collected in order to perform the techno-economic analysis: (1) retail prices, efficiency and wattage category in the current market; (2) usage data (hours of lamp use per day); and (3) stock data and penetration of efficient lamps in the market. Using these data, PAMS calculates the costs and benefits of efficiency standards from two distinct but related perspectives: (1) The
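
    In the spirit of the consumer-level calculation PAMS performs, the sketch below computes the net present value of switching one lamp to a more efficient technology; every price, wattage and tariff here is hypothetical, not from the Chilean analysis.

    ```python
    # Rough consumer-level cost-benefit in the spirit of PAMS; all numbers
    # are hypothetical, for illustration only.
    def npv_savings(extra_price: float, watts_saved: float, hours_per_day: float,
                    tariff: float, life_years: float, discount: float) -> float:
        annual_kwh = watts_saved * hours_per_day * 365 / 1000.0
        annual_saving = annual_kwh * tariff
        pv = sum(annual_saving / (1 + discount) ** t
                 for t in range(1, int(life_years) + 1))
        return pv - extra_price

    # CFL vs. incandescent: +2.0 USD price, 45 W saved, 3 h/day, 0.15 USD/kWh
    print(f"NPV per lamp: {npv_savings(2.0, 45.0, 3.0, 0.15, 6, 0.05):.2f} USD")
    ```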

  5. Status analysis of Chinese standards on enclosure equipment and proposed countermeasures

    International Nuclear Information System (INIS)

    Wu Luping

    1998-12-01

    Enclosure equipment, such as glove boxes, tong boxes etc., is an important kind of equipment for the nuclear industry and nuclear scientific research. The status of the establishment and implementation of Chinese standards on enclosure equipment is briefly described. Some problems and deficiencies existing in these standards are pointed out. The ISO standard projects on containment enclosures, as well as their present progress, are introduced. Measures for updating Chinese standards on enclosure equipment in accordance with the principle of adopting international standards are recommended. Some issues which should be taken into account in adopting ISO standards on containment enclosures are also discussed.

  6. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples.

  7. A sensitivity analysis of the New Zealand standard model of foot and mouth disease.

    Science.gov (United States)

    Owen, K; Stevenson, M A; Sanson, R L

    2011-08-01

    Disease simulation models can be a valuable tool for planning a response to exotic disease incursions, as they provide a fast, low-cost mechanism for identifying the likely outcomes of a range of outbreak scenarios and disease control strategies. To use these tools effectively and with confidence, decision-makers must understand the simplifications and framing assumptions that underlie a model's structure. Sensitivity analysis, the analytical process of identifying which input variables are the key drivers of the model's output, is a crucial process in developing this understanding. This paper describes the application of a sampling-based sensitivity analysis to the New Zealand standard model (NZSM). This model is a parameter set developed for the InterSpread Plus model platform to allow the exploration of different outbreak scenarios for an epidemic of foot and mouth disease in New Zealand. Based on 200 iterations of the NZSM, run for a simulation period of 60 days, settings related to farm-to-saleyard movements and the detection of disease during the active surveillance phase of the epidemic had the greatest influence on the predicted number of infected premises. A small number of counter-intuitive findings indicated areas of model design, implementation and/or parameterisation that should be investigated further. A potentially useful result from this work would be information to aid the grouping or elimination of non-influential model settings. This would go some way towards reducing the overall complexity of the NZSM, while still allowing it to remain fit for purpose.
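    A minimal sketch of sampling-based sensitivity analysis in this spirit: rank-correlate each sampled input setting with the model output across iterations. The toy "model" and the input names are invented stand-ins, not the NZSM parameters.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n = 200                                        # iterations, as in the study
        inputs = {
            "saleyard_movement_rate": rng.uniform(0.1, 1.0, n),
            "detection_probability":  rng.uniform(0.2, 0.9, n),
            "herd_contact_rate":      rng.uniform(0.5, 2.0, n),
        }
        # Toy stand-in for "predicted number of infected premises"
        output = (300 * inputs["saleyard_movement_rate"]
                  / inputs["detection_probability"]
                  + 20 * rng.standard_normal(n))

        # Spearman rank correlation as a simple sampling-based sensitivity index
        for name, x in inputs.items():
            rho, p = stats.spearmanr(x, output)
            print(f"{name:26s} rho = {rho:+.2f} (p = {p:.1e})")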

  8. OSPAR standard method and software for statistical analysis of beach litter data.

    Science.gov (United States)

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
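    A minimal sketch of the core trend tests named above, using SciPy; the annual counts are made-up illustrative data, and Kendall's tau against time is used here as a stand-in for the Mann-Kendall statistic.

        import numpy as np
        from scipy import stats

        years = np.array([2009, 2010, 2011, 2012, 2013, 2014])
        counts = np.array([120, 105, 98, 90, 85, 70])   # hypothetical annual litter counts

        # Monotonic trend test via Kendall's tau of counts against time
        tau, p_value = stats.kendalltau(years, counts)

        # Theil-Sen slope: median of pairwise slopes, robust to outliers
        slope, intercept, lo, hi = stats.theilslopes(counts, years)

        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3f}")
        print(f"Theil-Sen slope = {slope:.1f} items/year (95% CI {lo:.1f} to {hi:.1f})")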

  9. SRAC: JAERI thermal reactor standard code system for reactor design and analysis

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Takano, Hideki; Horikami, Kunihiko; Ishiguro, Yukio; Kaneko, Kunio; Hara, Toshiharu.

    1983-01-01

    The SRAC (Standard Reactor Analysis Code) is a code system for nuclear reactor analysis and design. It is composed of neutron cross section libraries and auxiliary processing codes, neutron spectrum routines, a variety of transport and 1-, 2- and 3-D diffusion routines, and dynamic parameter and cell burn-up routines. By making the best use of the individual code functions in the SRAC system, the user can select either the exact method for an accurate estimate of reactor characteristics or an economical method aiming at shorter computer time, depending on the purpose of the study. The user can select cell or core calculation; fixed source or eigenvalue problem; transport (collision probability or Sn) theory or diffusion theory. Moreover, smearing and collapsing of macroscopic cross sections are done separately at the user's selection, and special attention is paid to double heterogeneity. Various techniques are employed to access the data storage and to optimize the internal data transfer. Benchmark calculations using the SRAC system have been made extensively for the Keff values of various types of critical assemblies (light water, heavy water and graphite moderated systems, and fast reactor systems). The calculated results show good prediction of the experimental Keff values. (author)

  10. Nephele: A cloud platform for simplified, standardized, and reproducible microbiome data analysis.

    Science.gov (United States)

    Weber, Nick; Liou, David; Dommer, Jennifer; MacMenamin, Philip; Quiñones, Mariam; Misner, Ian; Oler, Andrew; Wan, Joe; Kim, Lewis; McCarthy, Meghan Coakley; Ezeji, Samuel; Noble, Karlynn; Hurt, Darrell E

    2017-09-28

    Widespread interest in the study of the microbiome has resulted in data proliferation and the development of powerful computational tools. However, many scientific researchers lack the time, training, or infrastructure to work with large datasets or to install and use command line tools. The National Institute of Allergy and Infectious Diseases (NIAID) has created Nephele, a cloud-based microbiome data analysis platform with standardized pipelines and a simple web interface for transforming raw data into biological insights. Nephele integrates common microbiome analysis tools as well as valuable reference datasets like the healthy human subjects cohort of the Human Microbiome Project (HMP). Nephele is built on the Amazon Web Services cloud, which provides centralized and automated storage and compute capacity, thereby reducing the burden on researchers and their institutions. https://nephele.niaid.nih.gov. darrell.hurt@nih.gov. Pipeline source code, sample inputs, and results data are available at https://github.com/niaid/Nephele and at https://nephele.niaid.nih.gov/#guide. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  11. Generation and Standardized, Systemic Phenotypic Analysis of Pou3f3L423P Mutant Mice.

    Directory of Open Access Journals (Sweden)

    Sudhir Kumar

    Full Text Available Increased levels of blood plasma urea were used as a phenotypic parameter for establishing novel mouse models for kidney diseases on the genetic background of C3H inbred mice in the phenotype-driven Munich ENU mouse mutagenesis project. The phenotypically recessive mutant line HST011 was established and further analyzed. The causative mutation was detected in the POU domain, class 3 transcription factor 3 (Pou3f3) gene, and leads to the amino acid exchange Pou3f3L423P, thereby affecting the conserved homeobox domain of the protein. Homozygous Pou3f3 knockout mice have been published and show perinatal death. Line Pou3f3L423P is a viable mouse model harboring a homozygous Pou3f3 mutation. Standardized, systemic phenotypic analysis of homozygous mutants was carried out in the German Mouse Clinic. The main phenotypic changes were low body weight and a state of low energy stores, kidney dysfunction and secondary effects thereof including low bone mineralization, multiple behavioral and neurological defects including locomotor, vestibular, auditory and nociceptive impairments, as well as multiple subtle changes in immunological parameters. Genome-wide transcriptome profiling analysis of kidney and brain of Pou3f3L423P homozygous mutants identified significantly regulated genes as compared to wild-type controls.

  12. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples.

  13. Statistical analysis of solid waste composition data: Arithmetic mean, standard deviation and correlation coefficients.

    Science.gov (United States)

    Edjabou, Maklawe Essonanawe; Martín-Fernández, Josep Antoni; Scheutz, Charlotte; Astrup, Thomas Fruergaard

    2017-11-01

    Data for fractional solid waste composition provide relative magnitudes of individual waste fractions, the percentages of which always sum to 100, thereby connecting them intrinsically. Due to this sum constraint, waste composition data represent closed data, and their interpretation and analysis require statistical methods other than classical statistics, which are suitable only for non-constrained data such as absolute values. However, the closed characteristics of waste composition data are often ignored when analysed. The results of this study showed, for example, that unavoidable animal-derived food waste amounted to 2.21±3.12% with a confidence interval of (−4.03; 8.45), which highlights the problem of biased negative proportions. A Pearson's correlation test, applied to waste fraction generation (kg mass), indicated a positive correlation between avoidable vegetable food waste and plastic packaging. However, correlation tests applied to waste fraction compositions (percentage values) showed a negative association in this regard, thus demonstrating that statistical analyses applied to compositional waste fraction data, without addressing the closed characteristics of these data, have the potential to generate spurious or misleading results. Therefore, compositional data should be transformed adequately prior to any statistical analysis, such as computing means, standard deviations and correlation coefficients. Copyright © 2017 Elsevier Ltd. All rights reserved.
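    A minimal sketch of one such transformation, the centred log-ratio (clr), applied before computing correlations; the waste fractions below are invented for illustration.

        import numpy as np

        def clr(composition):
            # Centred log-ratio transform of one composition (parts summing to 1)
            x = np.asarray(composition, dtype=float)
            g = np.exp(np.mean(np.log(x)))          # geometric mean of the parts
            return np.log(x / g)

        # Three hypothetical samples: [food waste, plastic, paper] fractions
        samples = np.array([[0.45, 0.25, 0.30],
                            [0.50, 0.20, 0.30],
                            [0.40, 0.30, 0.30]])
        transformed = np.apply_along_axis(clr, 1, samples)

        # Correlations on clr coordinates avoid the spurious negative associations
        # that the closure (sum-to-100%) constraint induces on raw percentages.
        print(np.corrcoef(transformed, rowvar=False))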

  14. Statistical analysis of fluorescence correlation spectroscopy: the standard deviation and bias.

    Science.gov (United States)

    Saffarian, Saveez; Elson, Elliot L

    2003-03-01

    We present a detailed statistical analysis of fluorescence correlation spectroscopy for a wide range of timescales. The derivation is completely analytical and can provide an excellent tool for planning and analysis of FCS experiments. The dependence of the signal-to-noise ratio on different measurement conditions is extensively studied. We find that in addition to the shot noise and the noise associated with correlated molecular dynamics there is another source of noise that appears at very large lag times. We call this the "particle noise," as its behavior is governed by the number of particles that have entered and left the laser beam sample volume during large dwell times. The standard deviations of all the points on the correlation function are calculated analytically and shown to be in good agreement with experiments. We have also investigated the bias associated with experimental correlation function measurements. A "phase diagram" for FCS experiments is constructed that demonstrates the significance of the bias for any given experiment. We demonstrate that the value of the bias can be calculated and added back as a first-order correction to the experimental correlation function.

  15. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the "propagation of errors" approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the "counting uncertainty" can also be ascertained.
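    A minimal sketch of a CSU calculation by propagation of errors for a single determination, assuming a simple product/quotient measurement model; the model A = N/(eff × yield × mass) and all numbers are illustrative, not the Laboratory's equations.

        import math

        N, u_N = 1500.0, math.sqrt(1500.0)   # net counts; counting uncertainty ~ sqrt(N)
        eff, u_eff = 0.32, 0.01              # detector efficiency
        chem_yield, u_yield = 0.85, 0.03     # chemical yield (tracer recovery)
        mass, u_mass = 0.500, 0.001          # sample aliquot mass, kg

        A = N / (eff * chem_yield * mass)    # activity-like result (arbitrary units)

        # For a pure product/quotient model, relative uncertainties add in quadrature
        rel_u = math.sqrt((u_N / N) ** 2 + (u_eff / eff) ** 2 +
                          (u_yield / chem_yield) ** 2 + (u_mass / mass) ** 2)
        u_A = A * rel_u

        print(f"A = {A:.1f} ± {u_A:.1f} (CSU, k=1); counting term share: "
              f"{(u_N / N) ** 2 / rel_u ** 2:.0%}")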

  16. Analysis of China's radiation environment monitoring standard system

    International Nuclear Information System (INIS)

    Lu Weiwei; Yue Huiguo; Yuan Zhilun; Yu Zhengwei; Huang Donghui; Wu Yongle

    2014-01-01

    In order to establish and improve the radiation environment monitoring standard system and to provide technical support for radiation environment monitoring, this work first retrieves the existing radiation environment monitoring standards and clarifies their domestic status. Based on the Radiation Environment Monitoring Technology Specification (HJ/T 61-2001) issued by the Ministry of Environmental Protection, the Radiation Environment Monitoring Program (Provisional) (Central Office (2003) No. 56), the Radiation Environment Monitoring Capacity Assessment Program (Department of Nuclear Safety Management, Ministry of Environmental Protection, January 2011) and other relevant Ministry of Environmental Protection documents, the radiation environment monitoring standards and methods were reviewed to find missing items and to propose revision requirements for the system. Summarizing the national standards, environmental protection industry standards, nuclear industry standards, health industry standards, and inspection and quarantine industry standards, a total of 145 standards cover environmental radiation monitoring, radiation monitoring of pollution sources, and emergency response and early warning monitoring, and basically meet the needs of radiation environment monitoring in terms of both management and technology; 28 missing items were identified. The research found 57 standards under revision and 47 under formulation. After these revisions, China's standard system will be further improved, and radiation environment monitoring work will be further strengthened. (authors)

  17. The Application of Cost/Benefit Analysis in the Development of Voluntary Standards

    Science.gov (United States)

    1986-07-01

    standards emerge. (Deming, 1985, p. 52) The question this raises is precisely the focus of the study: whether the voluntary standards concept as it now ... Deming, W. Edwards, "Loss from Failure of US Industry to Have More Voluntary Standards," Standards Engineering, v. 37, number 3, May/June 1985.

  18. A complete analysis of a nuclear building to nuclear safety standards

    International Nuclear Information System (INIS)

    Bergeretto, G.; Giuliano, V.; Lazzeri, L.

    1975-01-01

    The nuclear standards impose on the designer the necessity of examining the loads, stresses and strains in a nuclear building even under extreme loading conditions, due both to plant malfunctions and to environmental accidents. It is then necessary to generate, combine and examine a tremendous amount of data; indeed, the lack of symmetry and the general complexity of the structures, together with the large number of loading combinations, make an automatic analysis quite necessary. A largely automated procedure for solving the problem is presented, consisting of a series of computer programs linked together as follows. After the seismic analysis has been performed by the SADE code, these data, together with the data coming from thermal specifications, weight, accident descriptions, etc., are fed into a finite element computer code (SAP4) for analysis. They are processed and combined by a computer code (COMBIN) according to the loading conditions (the usual list in Italy is given and briefly discussed), so that for each point (or each selected zone) under each loading condition the applied loads are listed. These data are fed to another computer code (DTP), which determines the amount of reinforcing bars necessary to accommodate the most severe of the loading conditions. The ACI 318/71 and Italian regulation procedures are followed; the characteristics of the program are briefly described and discussed. Some particular problems are also discussed, e.g. the thermal stresses due to normal and accident conditions; the inelastic behavior of some frame elements (due to concrete cracking) is considered by means of an 'ad hoc' code. Typical examples are presented and the results are discussed, showing a relatively large benefit in considering this inelastic effect.

  19. Standard techniques for presentation and analysis of crater size-frequency data. [on moon and planetary surfaces

    Science.gov (United States)

    Arvidson, R.; Boyce, J.; Chapman, C.; Cintala, M.; Fulchignoni, M.; Moore, H.; Soderblom, L.; Neukum, G.; Schultz, P.; Strom, R.

    1979-01-01

    In September 1977 a crater studies workshop was held for the purpose of developing standardized data analysis and presentation techniques. The present report contains the unanimous recommendations of the participants. Recommendations are devoted primarily to crater size-frequency data and refer to cumulative and relative size-frequency distribution plots and to morphological analysis.

  20. Retrospective Analysis of NIST Standard Reference Material 1450, Fibrous Glass Board, for Thermal Insulation Measurements

    Science.gov (United States)

    Zarr, Robert R; Heckert, N Alan; Leigh, Stefan D

    2014-01-01

    Thermal conductivity data acquired previously for the establishment of Standard Reference Material (SRM) 1450, Fibrous Glass Board, as well as subsequent renewals 1450a, 1450b, 1450c, and 1450d, are re-analyzed collectively and as individual data sets. Additional data sets for proto-1450 material lots are also included in the analysis. The data cover 36 years of activity by the National Institute of Standards and Technology (NIST) in developing and providing thermal insulation SRMs, specifically high-density molded fibrous-glass board, to the public. Collectively, the data sets cover two nominal thicknesses of 13 mm and 25 mm, bulk densities from 60 kg·m⁻³ to 180 kg·m⁻³, and mean temperatures from 100 K to 340 K. The analysis repetitively fits six models to the individual data sets. The most general form of the nested set of multilinear models used is λ(ρ,T) = a0 + a1·ρ + a2·T + a3·T³ + a4·exp(−((T − a5)/a6)²), where λ(ρ,T) is the predicted thermal conductivity (W·m⁻¹·K⁻¹), ρ is the bulk density (kg·m⁻³), T is the mean temperature (K) and the ai (i = 0, 1, …, 6) are the regression coefficients. The least squares fit results for each model across all data sets are analyzed using both graphical and analytic techniques. The prevailing generic model for the majority of data sets is the bilinear model in ρ and T: λ(ρ,T) = a0 + a1·ρ + a2·T. One data set supports the inclusion of a cubic temperature term, and two data sets with low-temperature data support the inclusion of an exponential term in T to improve the model predictions. Physical interpretations of the model function terms are described. Recommendations for future renewals of SRM 1450 are provided. An Addendum provides historical background on the origin of this SRM and the influence of the SRM on external measurement programs. PMID:26601034
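    A minimal sketch of fitting the prevailing bilinear model by ordinary least squares; the six data points are invented for illustration, not SRM 1450 measurements.

        import numpy as np

        rho = np.array([60, 60, 120, 120, 180, 180], dtype=float)     # kg m^-3
        T = np.array([280, 320, 280, 320, 280, 320], dtype=float)     # K
        lam = np.array([0.031, 0.035, 0.032, 0.036, 0.034, 0.038])    # W m^-1 K^-1

        # Design matrix for lambda(rho, T) = a0 + a1*rho + a2*T
        X = np.column_stack([np.ones_like(rho), rho, T])
        coeffs, residuals, rank, _ = np.linalg.lstsq(X, lam, rcond=None)
        a0, a1, a2 = coeffs
        print(f"lambda ~= {a0:.4f} + {a1:.2e}*rho + {a2:.2e}*T")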

  1. Comprehensive approach to the validation of the standard method for total reflection X-ray fluorescence analysis of water.

    Science.gov (United States)

    Borgese, Laura; Dalipi, Rogerta; Riboldi, Alessandro; Bilo, Fabjola; Zacco, Annalisa; Federici, Stefania; Bettinelli, Maurizio; Bontempi, Elza; Depero, Laura Eleonora

    2018-05-01

    In this work, we present the validation of the chemical method for total reflection X-ray fluorescence (TXRF) analysis of water, proposed as a standard to the International Standard Organization. The complete experimental procedure to define the linear calibration range, element sensitivities, limits of detection and quantification, and precision and accuracy is presented for a commercial TXRF spectrometer equipped with a Mo X-ray tube. Least squares linear regression, including all statistical tests, is performed separately for each element of interest to extract sensitivities. Relative sensitivities with respect to Ga, as internal standard, are calculated. The accuracy and precision of the quantification procedure using Ga as internal standard are evaluated with reference water samples. A detailed discussion of the calibration procedure and the limitations of the use of this method for quantitative analysis of water is presented. Copyright © 2018 Elsevier B.V. All rights reserved.
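    A minimal sketch of internal-standard quantification as commonly used in TXRF, where the analyte concentration follows from its net intensity relative to the Ga spike and a relative sensitivity; all numbers and the Fe example are illustrative assumptions.

        def txrf_concentration(n_analyte, n_is, c_is, rel_sensitivity):
            # C_x = (N_x / N_IS) * C_IS / S_rel, with S_rel the sensitivity of the
            # analyte relative to the internal standard (here Ga)
            return (n_analyte / n_is) * c_is / rel_sensitivity

        # Hypothetical example: Fe in water with a 1.0 mg/L Ga spike
        c_fe = txrf_concentration(n_analyte=5200.0,  # net Fe K-alpha counts
                                  n_is=4100.0,       # net Ga K-alpha counts
                                  c_is=1.0,          # mg/L of Ga added
                                  rel_sensitivity=0.88)
        print(f"Fe = {c_fe:.2f} mg/L")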

  2. Geostatistical analysis of soil properties at field scale using standardized data

    Science.gov (United States)

    Millan, H.; Tarquis, A. M.; Pérez, L. D.; Matos, J.; González-Posada, M.

    2012-04-01

    … co-regionalization between different soil properties, which is of interest for delineating management zones within sugarcane fields. Cross-semivariograms showed larger correlation ranges than the individual, univariate semivariograms (A > 29 m). All the findings were supported by multivariate spatial analysis, which showed the influence of soil tillage operations, harvesting machinery and irrigation water distribution on the status of the investigated area. Reference: Millán, H.; Tarquis, A.M.; Pérez, L.D.; Mato, J.; González-Posada, M. Spatial variability patterns of some Vertisol properties at a field scale using standardized data. Soil & Tillage Research, doi:10.1016/j.still.2011.11.003, 2012 (in press). Acknowledgements: Funding provided by CEIGRAM (Research Centre for the Management of Agricultural and Environmental Risks) and by the Spanish Ministerio de Ciencia e Innovación (MICINN) through project no. AGL2010-21501/AGR is greatly appreciated.
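    A minimal sketch of the empirical semivariogram underlying such an analysis, for a 1-D transect; positions, values and lag choices are invented for illustration.

        import numpy as np

        def empirical_semivariogram(coords, values, lags, tol):
            coords = np.asarray(coords, dtype=float)
            values = np.asarray(values, dtype=float)
            d = np.abs(coords[:, None] - coords[None, :])         # pairwise distances
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2   # semivariance terms
            upper = np.triu(np.ones_like(d, dtype=bool), k=1)     # count each pair once
            gamma = []
            for h in lags:
                mask = upper & (d > h - tol) & (d <= h + tol)
                gamma.append(sq[mask].mean() if mask.any() else np.nan)
            return np.array(gamma)

        rng = np.random.default_rng(2)
        x = np.sort(rng.uniform(0, 100, 80))                      # sample positions, m
        z = np.sin(x / 15.0) + 0.2 * rng.standard_normal(80)      # spatially correlated property
        print(empirical_semivariogram(x, z, lags=[5, 10, 20, 40], tol=2.5))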

  3. Thermal analysis of a solar collector with a standard approach and software used to study windows

    Energy Technology Data Exchange (ETDEWEB)

    Simko, T.; Harrison, S.J. [Queen' s Univ., Kingston, ON (Canada). Dept. of Mechanical and Materials Engineering

    2007-07-01

    A method of calculating the overall heat loss coefficient of a solar collector was presented. The method was based on a standard approach used to obtain total window U-values. A model of the solar collector was developed with a finite element analysis (FEA) program. Heat loss from the solar collector was represented as the product of the gross collector area, the overall heat loss coefficient, and the difference between the assumed mean absorber plate temperature and the ambient temperature. The edge heat loss coefficient was approximated by assuming that there was a 1-D sideways heat flow through the edge area of the collector. Regional heat loss coefficients obtained with the model were then used to calculate the overall heat loss coefficient. Equations used for parallel tube type collectors were applied to the serpentine tube collector. The sightline of the solar collector was defined as the position along the top cover below the absorber plate. The same definitions for the extents of the frame, edge and center-of-glass regions of a window were applied to the collector. Multiple U-values were defined to account for heat flows outward across the top, bottom, and side surfaces of the collector. The absorber plate was simulated as isothermal. Results were then compared with an experimental study in order to validate the method. The method was also compared with results obtained from a conventional analysis for estimating heat loss coefficients. It was concluded that the new method provided more accurate results than those obtained using the conventional method. 16 refs., 1 tab., 5 figs.
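    A minimal sketch of the area-weighted combination of regional heat loss coefficients into an overall value, in the window-style spirit described above; the region names, U-values, areas and temperatures are illustrative assumptions.

        regions = {               # region: (U-value W m^-2 K^-1, area m^2)
            "center-of-glass": (3.0, 1.6),
            "edge":            (4.2, 0.5),
            "frame":           (5.5, 0.3),
        }
        total_area = sum(area for _, area in regions.values())
        # Area-weighted overall heat loss coefficient
        u_overall = sum(u * area for u, area in regions.values()) / total_area
        # Q = U * A * (T_plate - T_ambient), with assumed temperatures in Celsius
        q_loss = u_overall * total_area * (70 - 20)
        print(f"U_overall = {u_overall:.2f} W m^-2 K^-1, heat loss = {q_loss:.0f} W")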

  4. Integrated Data Collection Analysis (IDCA) Program - RDX Type II Class 5 Standard, Data Set 1

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorenson, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Moran, Jesse S. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Inc., Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Whipple, Richard E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2011-04-11

    This document describes the results of the first reference sample material—RDX Type II Class 5—examined in the proficiency study for small-scale safety and thermal (SSST) testing of explosive materials for the Integrated Data Collection Analysis (IDCA) Program. The IDCA program is conducting proficiency testing on homemade explosives (HMEs). The reference sample materials are being studied to establish the accuracy of traditional explosives safety testing for each performing laboratory. These results will be used for comparison to results from testing HMEs. This effort, funded by the Department of Homeland Security (DHS), ultimately will put the issues of safe handling of these materials in perspective with standard military explosives. The results of the study will add SSST testing results for a broad suite of different HMEs to the literature, potentially suggest new guidelines and methods for HME testing, and possibly establish what are the needed accuracies in SSST testing to develop safe handling practices. Described here are the results for impact, friction, electrostatic discharge, and scanning calorimetry analysis of a reference sample of RDX Type II Class 5. The results from each participating testing laboratory are compared using identical test material and preparation methods wherever possible. Note, however, the test procedures differ among the laboratories. These results are then compared to historical data from various sources. The performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Air Force Research Laboratory/ RXQL (AFRL), Indian Head Division, Naval Surface Warfare Center, (IHD-NSWC), and Sandia National Laboratories (SNL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to understand how to compare results when test protocols are not identical.

  5. Comparison of Standard and Novel Signal Analysis Approaches to Obstructive Sleep Apnoea Classification

    Directory of Open Access Journals (Sweden)

    Aoife Roebuck

    2015-08-01

    Full Text Available Obstructive sleep apnoea (OSA) is a disorder characterised by repeated pauses in breathing during sleep, which lead to deoxygenation and voiced chokes at the end of each episode. OSA is associated with daytime sleepiness and an increased risk of serious conditions such as cardiovascular disease, diabetes and stroke. Between 2% and 7% of the adult population globally has OSA, but it is estimated that up to 90% of those are undiagnosed and untreated. Diagnosis of OSA requires expensive and cumbersome screening. Audio offers a potential non-contact alternative, particularly with the ubiquity of excellent signal processing on every phone. Previous studies have focused on the classification of snoring and apnoeic chokes. However, such approaches require accurate identification of events. This leads to limited accuracy and small study populations. In this work we propose an alternative approach which uses multiscale entropy (MSE) coefficients presented to a classifier to identify disorder in vocal patterns indicative of sleep apnoea. A database of 858 patients was used, the largest reported in this domain. Apnoeic choke, snore, and noise events encoded with speech analysis features were input into a linear classifier. Coefficients of MSE derived from the first 4 hours of each recording were used to train and test a random forest to classify patients as apnoeic or not. Standard speech analysis approaches for event classification achieved an out-of-sample accuracy (Ac) of 76.9% with a sensitivity (Se) of 29.2% and a specificity (Sp) of 88.7%, but high variance. For OSA severity classification, MSE provided an out-of-sample Ac of 79.9%, Se of 66.0% and Sp of 88.8%. Including demographic information improved the MSE-based classification performance to Ac = 80.5%, Se = 69.2%, Sp = 87.9%. These results indicate that audio recordings could be used in screening for OSA, but are generally under-sensitive.
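    A minimal sketch of the MSE feature family: sample entropy computed on successively coarse-grained versions of a signal. The parameter defaults (m = 2, r = 0.2·std, re-estimated per scale for brevity) are common conventions, not necessarily those of the study.

        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            x = np.asarray(x, dtype=float)
            tol = r * np.std(x)        # tolerance re-estimated per scale (a simplification)
            def pair_count(length):
                # unordered template pairs within tolerance (Chebyshev distance)
                t = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
                d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
                return (np.sum(d <= tol) - len(t)) / 2.0   # exclude self-matches
            b, a = pair_count(m), pair_count(m + 1)
            return -np.log(a / b) if a > 0 and b > 0 else np.inf

        def multiscale_entropy(x, scales=(1, 2, 3, 4, 5)):
            x = np.asarray(x, dtype=float)
            # Coarse-grain by averaging non-overlapping windows of each scale
            return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1))
                    for s in scales]

        rng = np.random.default_rng(0)
        print(multiscale_entropy(rng.standard_normal(1000)))  # white noise: falls with scale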

  6. A seismic analysis of Korean standard PWR fuels under transition core conditions

    International Nuclear Information System (INIS)

    Kim, Hyeong Koo; Park, Nam Kyu; Jang, Young Ki; Kim, Jae Ik; Kim, Kyu Tae

    2005-01-01

    The PLUS7 fuel was developed to achieve higher thermal performance, burnup and more safety margin than the conventional fuel used in the Korean Standard Nuclear Plants (KSNPs), and to sustain structural integrity under the increased seismic requirement in Korea. In this study, a series of seismic analyses has been performed in order to evaluate the structural integrity of fuel assemblies under seismic loads in the KSNPs during transition cores, in which the Guardian fuel, the resident fuel in the KSNP reactors, is replaced with the PLUS7 fuel. For the analysis, transition core seismic models have been developed based on the possible fuel loading patterns. The maximum impact forces on the spacer grids and the various stresses acting on the fuel components have been evaluated and compared with the through-grid strength of the spacer grids and the stress criteria specified in the ASME code for each fuel component, respectively. Three parameters regarded as important in governing fuel assembly dynamic behavior were then evaluated to clarify their effects on the fuel impact and stress responses. As a result of the study, it has been confirmed that both the PLUS7 and the Guardian fuel sustain their structural integrity under transition core conditions. When the damping ratio is constant, increasing the natural frequency of the fuel assembly results in a decrease in impact force. The fuel assembly flexural stiffness increases the stresses in the fuel assembly, but not the impact force, while the spacer grid stiffness is directly related to the impact force response. (author)

  7. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
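    A minimal sketch of the idea behind robust referencing: estimate the average reference only from channels not flagged as noisy, then re-reference all channels. The amplitude-based detector below is far simpler than PREP's actual multi-criterion detection and is an illustrative assumption.

        import numpy as np

        def robust_average_reference(eeg, z_thresh=3.0):
            # eeg: channels x samples array; returns re-referenced data and bad channels
            amp = np.std(eeg, axis=1)                    # per-channel amplitude
            mad = np.median(np.abs(amp - np.median(amp)))
            z = (amp - np.median(amp)) / (1.4826 * mad)  # robust z-score on amplitude
            bad = np.where(np.abs(z) > z_thresh)[0]
            good = np.setdiff1d(np.arange(eeg.shape[0]), bad)
            reference = eeg[good].mean(axis=0)           # reference from good channels only
            return eeg - reference, bad

        rng = np.random.default_rng(1)
        data = rng.standard_normal((32, 5000))
        data[7] *= 20                                    # inject one noisy channel
        referenced, bad_channels = robust_average_reference(data)
        print("flagged as noisy:", bad_channels)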

  8. The PREP Pipeline: Standardized preprocessing for large-scale EEG analysis

    Directory of Open Access Journals (Sweden)

    Nima Bigdely-Shamlo

    2015-06-01

    Full Text Available The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode/.

  9. Comparison of standard PCR/cloning to single genome sequencing for analysis of HIV-1 populations.

    Science.gov (United States)

    Jordan, Michael R; Kearney, Mary; Palmer, Sarah; Shao, Wei; Maldarelli, Frank; Coakley, Eoin P; Chappey, Colombe; Wanke, Christine; Coffin, John M

    2010-09-01

    To compare standard PCR/cloning and single genome sequencing (SGS) in their ability to reflect actual intra-patient polymorphism of HIV-1 populations, a total of 530 HIV-1 pro-pol sequences obtained by both sequencing techniques from a set of 17 ART naïve patient specimens was analyzed. For each specimen, 12 and 15 sequences, on average, were characterized by the two techniques. Using phylogenetic analysis, tests for panmixia and entropy, and Bland-Altman plots, no difference in population structure or genetic diversity was shown in 14 of the 17 subjects. Evidence of sampling bias by the presence of subsets of identical sequences was found by either method. Overall, the study shows that neither method was more biased than the other, and providing that an adequate number of PCR templates is analyzed, and that the bulk sequencing captures the diversity of the viral population, either method is likely to provide a similar measure of population diversity. Copyright 2010 Elsevier B.V. All rights reserved.

  10. A Spectral-line Analysis of the G8 III Standard ε VIR

    Energy Technology Data Exchange (ETDEWEB)

    Gray, David F., E-mail: dfgray@uwo.ca [Department of Physics and Astronomy University of Western Ontario, 1151 Richmond Street, London, Ontario N6A 3K7 (Canada)

    2017-08-10

    Eleven seasons of spectroscopic data comprising 107 exposures of the stable G8 III standard star ε Vir are analyzed for the projected rotation rate and granulation parameters. A Fourier analysis of the line shapes yields v sin i = 3.06 ± 0.20 km s⁻¹ and a radial-tangential macroturbulence dispersion ζ_RT = 5.16 ± 0.08 km s⁻¹. The radial velocity over nine seasons is constant to 18 m s⁻¹. The absolute radial velocity with granulation blueshifts (but not gravitational redshift) removed is −14120 ± 75 m s⁻¹. Line-depth ratios show the temperature to be constant to 0.7 K over 11 years, although a small secular rise or cyclic variation ∼1 K cannot be ruled out. The third-signature plot shows that the star has granulation velocities 10% larger than the Sun's. Mapping the Fe I λ6253 line bisector on to the third-signature plot indicates a normal-for-giants flux deficit area of 12.8%, indicating a ∼134 K temperature difference between granules and lanes. Deficit velocities of GK giants are seen to shift to higher values with higher luminosity, ∼0.75 km s⁻¹ over ΔM_V ∼ 1.5, indicating larger velocity differences between granules and lanes for giants higher in the HR diagram.

  11. Standard review plan for reviewing safety analysis reports for dry metallic spent fuel storage casks

    International Nuclear Information System (INIS)

    1988-01-01

    The Cask Standard Review Plan (CSRP) has been prepared as guidance to be used in the review of Cask Safety Analysis Reports (CSARs) for storage packages. The principal purpose of the CSRP is to assure the quality and uniformity of storage cask reviews and to present a well-defined base from which to evaluate proposed changes in the scope and requirements of reviews. The CSRP also sets forth solutions and approaches determined to be acceptable in the past by the NRC staff in dealing with a specific safety issue or safety-related design area. These solutions and approaches are presented in this form so that reviewers can take consistent and well-understood positions as the same safety issues arise in future cases. An applicant submitting a CSAR does not have to follow the solutions or approaches presented in the CSRP. However, applicants should recognize that the NRC staff has spent substantial time and effort in reviewing and developing their positions for the issues. A corresponding amount of time and effort will probably be required to review and accept new or different solutions and approaches.

  12. Analysis and design of 40 m high steel chimney according to Eurocode and SRPS standards

    Directory of Open Access Journals (Sweden)

    Bešević Miroslav T.

    2015-01-01

    Full Text Available This paper presents a structural analysis according to Eurocode and SRPS standards. The analysed structure is a 40 m high steel chimney, made of a 33 m long upper segment of circular cross-section with constant diameter D = 1834 mm and a bottom segment of variable circular cross-section reaching D = 2234 mm at the foundation level. Wind loads are dominant for this kind of structure. In the workload analysis, the basic wind speed is treated thoroughly according to SRPS U.C7.110 and, respectively, according to Eurocode 1 for the location of Prahovo. The model was designed for the effects of self-weight, permanent load and wind load according to second-order theory for the pre-formed load combinations. The load combinations do not contain partial load safety factors. The chimney shaft segments, of different thicknesses along the height of the pillar, are designed for cross-section bearing capacity, buckling due to normal stress, buckling due to tangential stress, and buckling control due to shear stress.

  13. Comparative analysis of the dynamic characteristics of the "standard" movement hands qualified athletes who specialize in synchronized swimming

    Directory of Open Access Journals (Sweden)

    Gordeeva M.V.

    2012-02-01

    Full Text Available The purpose of this work is a comparative analysis of the dynamic characteristics of different types of «standard» hand movements. The possibility of performing two kinds of «standard» hand movement is grounded theoretically and confirmed practically. The dynamic characteristics enable the construction of a comparative biokinematic model of the two types of «standard» hand movement and their efficiency. The most essential indicators of the hand movements are identified. It is established that the direction of the sportswoman's advancement depends on the orientation of the hand in either direction. No distinctions were found in the phase of distant capture between the indicators of the resistance force.

  14. How the Gold Standard Functioned in Portugal: An Analysis of Some Macroeconomic Aspects

    OpenAIRE

    António Portugal Duarte; João Sousa Andrade

    2011-01-01

    The purpose of this study is to improve understanding of the gold standard period in Portugal through comparison with other monetary systems that were operated afterwards. Portugal was the first country in Europe to join Great Britain in the gold standard, in 1854, and it adhered to it for quite a long time. The principle of free gold convertibility of the Portuguese currency at a fixed price was abandoned in 1891, even though the classical gold standard as an international monetary system ...

  15. [Development of national and international standards of population age distribution for medical statistics, health-demographic analysis and risk assessment].

    Science.gov (United States)

    Demin, V F; Pal'tsev, M A; Chaban, E A

    2013-01-01

    The current European standard (CES) and the World standard population age distribution are widely used in medical and demographic studies performed by international (WHO, etc.) and national organizations. The Russian Federal State Statistics Service (RosStat) uses the CES in demographic yearbooks and other publications. The standard is applied in the calculation of the standardized mortality rate (SMR) of populations in different countries and territories. The CES is also used in risk assessment. The standards are based on the idea of assessing mortality against a uniform standard, so as to make the mortality rates of populations in different countries and regions, of different genders and in different calendar years comparable. Analysis of the results of test calculations of SMR values for the populations of Russia and other countries with the use of the current standards revealed serious shortcomings in the latter and set the task of improving them. A new concept for the development of standards, based on the concept of a stable equilibrium age distribution of the population and the survivorship function, is proposed.
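    A minimal sketch of the calculation such standards serve, direct age standardization: a weighted sum of age-specific rates using the standard population's age distribution. Rates and weights below are invented for illustration.

        import numpy as np

        age_specific_rates = np.array([0.2, 0.5, 1.5, 6.0, 25.0])    # deaths per 1000, by age band
        standard_weights = np.array([0.25, 0.25, 0.25, 0.15, 0.10])  # standard population shares

        # Directly standardized rate: weighting by a common standard age
        # distribution makes rates comparable across countries and years
        sdr = np.sum(age_specific_rates * standard_weights)
        print(f"standardized mortality rate = {sdr:.2f} per 1000")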

  16. Standard test method for uranium analysis in natural and waste water by X-ray fluorescence

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2004-01-01

    1.1 This test method applies for the determination of trace uranium content in waste water. It covers concentrations of U between 0.05 mg/L and 2 mg/L. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  17. Deep brain stimulation for psychiatric diseases: a pooled analysis of published studies employing disease-specific standardized outcome scales.

    Science.gov (United States)

    Nangunoori, Raj; Tomycz, Nestor D; Quigley, Matthew; Oh, Michael Y; Whiting, Donald M

    2013-01-01

    Deep brain stimulation (DBS) has emerged in recent years as a novel therapy in the treatment of refractory psychiatric disease, including major depressive disorder (MDD), obsessive-compulsive disorder (OCD), and Tourette's syndrome (TS). Standardized outcome scales were crucial in establishing that DBS was an effective therapy for movement disorders. In order to better characterize the evidence supporting DBS for various psychiatric diseases, we performed a pooled analysis of those studies which incorporated specific standardized rating scales. A Medline search was conducted to identify all studies reporting DBS for MDD, OCD, and TS. The search yielded a total of 49 articles, of which 24 were included: 4 related to MDD (n = 48), 10 to OCD (n = 64), and 10 to TS (n = 46). A meta-analysis of DBS for MDD, OCD, and TS in studies employing disease-specific standardized outcome scales showed that the outcome scales all improved in a statistically significant fashion for these psychiatric diseases. Our pooled analysis suggests that DBS for TS has the highest efficacy amongst the psychiatric diseases currently being treated with DBS, followed by OCD and MDD. DBS for psychiatric diseases remains investigational; however, even when studies failing to incorporate standardized outcome scales are excluded, there is statistically significant evidence that DBS can improve symptoms in MDD, OCD, and TS. Standardized disease-specific outcome scales facilitate pooled analysis and should be a required metric in future studies of DBS for psychiatric disease.
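    A minimal sketch of the generic mechanism behind such a pooled analysis: fixed-effect, inverse-variance pooling of per-study effects on a standardized scale. The study effects and standard errors below are invented, not the MDD/OCD/TS data.

        import math

        # (mean improvement on a standardized outcome scale, standard error) per study
        studies = [(0.45, 0.15), (0.60, 0.20), (0.30, 0.10)]

        # Inverse-variance weights: precise studies count for more
        weights = [1 / se ** 2 for _, se in studies]
        pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
        pooled_se = math.sqrt(1 / sum(weights))
        z = pooled / pooled_se
        print(f"pooled effect = {pooled:.2f} ± {pooled_se:.2f} (z = {z:.1f})")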

  18. [Studies on the standardization of parameters for jaw movement analysis--6 degree-of-freedom jaw movements analysis].

    Science.gov (United States)

    Takeuchi, Hisahiro; Bando, Eiichi; Abe, Susumu

    2008-07-01

    To establish standardized evaluating methods for jaw movement analysis, we investigated evaluating parameters for 6 degree-of-freedom jaw movement data. Recorded data of jaw border movements from 20 male adults were employed as basic samples. The main parameters were as follows: 1. The displacement of the intercondylar midpoint: the length of a straight line between two positions of this point, at the intercuspal position and another jaw position. 2. The angle of the intercondylar axes: the angle between two positions of the intercondylar axis, at the intercuspal position and another jaw position. 3. The angle of the incisal-condylar planes: the angle between two positions of the plane, at the intercuspal position and another jaw position (this plane was defined by the incisal point and the condylar points of both sides). 4. The mandibular motion range index: quantitative values calculated with 2 of the 3 parameters described above. The mandibular motion range index showed a close correlation with the respective projected areas of the incisal paths: with the projected area of sagittal border movements on the sagittal plane, r = 0.82 (p < …); with the projected areas of border movements on the frontal plane, r = 0.92 for left lateral border movements (p < …) and r = 0.84 for right lateral border movements (p < …). The parameters quantified jaw movement data and the relative relationship between the intercuspal position and other jaw positions. They were independent of reference coordinate systems and could measure jaw movement quantitatively.
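    A minimal sketch of the first two parameters defined above, computed from left/right condylar points at two jaw positions; the coordinates are invented for illustration.

        import numpy as np

        def jaw_parameters(left1, right1, left2, right2):
            # Intercondylar midpoints at the two jaw positions
            mid1, mid2 = (left1 + right1) / 2, (left2 + right2) / 2
            displacement = np.linalg.norm(mid2 - mid1)     # straight-line distance, mm
            # Angle between the two intercondylar axes
            ax1, ax2 = right1 - left1, right2 - left2
            cosang = np.dot(ax1, ax2) / (np.linalg.norm(ax1) * np.linalg.norm(ax2))
            angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
            return displacement, angle

        # Intercuspal position vs. a hypothetical opened position (coordinates in mm)
        d, a = jaw_parameters(np.array([-55.0, 0.0, 0.0]), np.array([55.0, 0.0, 0.0]),
                              np.array([-55.0, -2.0, -3.0]), np.array([55.0, -1.0, -3.5]))
        print(f"midpoint displacement = {d:.1f} mm, axis angle = {a:.2f} deg")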

  19. Validation of Proposed Metrics for Two-Body Abrasion Scratch Test Analysis Standards

    Science.gov (United States)

    Street, Kenneth W., Jr.; Kobrick, Ryan L.; Klaus, David M.

    2013-01-01

    Abrasion of mechanical components and fabrics by soil on Earth is typically minimized by the effects of atmosphere and water. Potentially abrasive particles lose sharp and pointed geometrical features through erosion. In environments where such erosion does not exist, such as the vacuum of the Moon, particles retain sharp geometries associated with fracturing of their parent particles by micrometeorite impacts. The relationship between hardness of the abrasive and that of the material being abraded is well understood, such that the abrasive ability of a material can be estimated as a function of the ratio of the hardness of the two interacting materials. Knowing the abrasive nature of an environment (abrasive)/construction material is crucial to designing durable equipment for use in such surroundings. The objective of this work was to evaluate a set of standardized metrics proposed for characterizing a surface that has been scratched from a two-body abrasion test. This is achieved by defining a new abrasion region termed Zone of Interaction (ZOI). The ZOI describes the full surface profile of all peaks and valleys, rather than just measuring a scratch width. The ZOI has been found to be at least twice the size of a standard width measurement; in some cases, considerably greater, indicating that at least half of the disturbed surface area would be neglected without this insight. The ZOI is used to calculate a more robust data set of volume measurements that can be used to computationally reconstruct a resultant profile for detailed analysis. Documenting additional changes to various surface roughness parameters also allows key material attributes of importance to ultimate design applications to be quantified, such as depth of penetration and final abraded surface roughness. Furthermore, by investigating the use of custom scratch tips for specific needs, the usefulness of having an abrasion metric that can measure the displaced volume in this standardized
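    A minimal sketch of the volume bookkeeping a ZOI-style metric implies: integrate a measured profile across the full disturbed width, counting the groove and the pile-up ridges separately. The synthetic profile below is an illustrative assumption, not test data.

        import numpy as np

        x = np.linspace(-200.0, 200.0, 801)                          # lateral position, um
        groove = -5.0 * np.exp(-(x / 40.0) ** 2)                     # scratch groove
        pileup = 1.5 * np.exp(-(((np.abs(x) - 70.0) / 20.0) ** 2))   # ridges either side
        profile = groove + pileup                  # height relative to original surface

        dx = x[1] - x[0]
        groove_area = np.sum(np.clip(-profile, 0.0, None)) * dx   # um^2 below surface
        pileup_area = np.sum(np.clip(profile, 0.0, None)) * dx    # um^2 above surface
        print(f"groove cross-section = {groove_area:.0f} um^2, "
              f"pile-up cross-section = {pileup_area:.0f} um^2")
        # Multiplying such cross-sections by scratch length approximates the displaced
        # volume over the full ZOI width, not just a single scratch-width measurement.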

  20. The Common Core State Standards Initiative: An Event History Analysis of State Adoption

    Science.gov (United States)

    LaVenia, Mark; Cohen-Vogel, Lora; Lang, Laura B.

    2015-01-01

    Today, with states' near-universal adoption of the Common Core State Standards, the political system has achieved that which was not possible less than 2 decades ago. Just why this is so remains unanswered. Some observers have attributed states' embrace of the standards to the substantial financial incentives that the federal government embedded…

  1. Color-Blind Leadership: A Critical Race Theory Analysis of the ISLLC and ELCC Standards

    Science.gov (United States)

    Davis, Bradley W.; Gooden, Mark A.; Micheaux, Donna J.

    2015-01-01

    Purpose: Working from the driving research question--"is the explicit consideration of race present in the ISLLC and ELCC standards?"--this article explores the implications of a school leadership landscape reliant on a collection of color-blind leadership standards to guide the preparation and practice of school leaders. In doing so, we…

  2. ANALYSIS OF THE IMPLEMENTATION OF INTERNATIONAL FINANCIAL REPORTING STANDARDS BY COUNTRIES WITH ECONOMIES IN TRANSITION

    Directory of Open Access Journals (Sweden)

    B. Zasadnyi

    2013-08-01

    Full Text Available The article examines the experience of applying international financial reporting standards in different countries with economies in transition. Based on this experience, the main advantages and disadvantages of implementing international financial reporting standards for the financial reporting of Ukrainian companies are identified.

  3. Evaluation of four building energy analysis computer programs against ASHRAE standard 140-2007

    CSIR Research Space (South Africa)

    Szewczuk, S

    2014-08-01

    Full Text Available … standard or code of practice. Agrément requested the CSIR to evaluate a range of building energy simulation computer programs. The standard against which these computer programs were to be evaluated was developed by the American Society of Heating...

  4. Measurement standards and the general problem of reference points in chemical analysis

    International Nuclear Information System (INIS)

    Richter, W.; Dube, G.

    2002-01-01

    Besides the measurement standards available in general metrology in the form of the realisations of the units of measurement, measurement standards of chemical composition are needed for the vast field of chemical measurement (measurements of the chemical composition), because it is the main aim of such measurements to quantify non-isolated substances, often in complicated matrices, to which the 'classical' measurement standards and their lower-level derivatives are not directly applicable. At present, material artefacts as well as standard measurement devices serve as chemical measurement standards. These are measurement standards in the full metrological sense only, however, if they are firmly linked to the SI unit in which the composition represented by the standard is expressed. This requirement has the consequence that only a very restricted number of really reliable chemical measurement standards exist at present. Since it is very difficult and time consuming to increase this number substantially and, on the other hand, reliable reference points are increasingly needed for all kinds of chemical measurements, primary methods of measurement and high-level reference measurements will play an increasingly important role for the establishment of worldwide comparability and hence mutual acceptance of chemical measurement results. (author)

  5. How the Environment Is Positioned in the "Next Generation Science Standards": A Critical Discourse Analysis

    Science.gov (United States)

    Hufnagel, Elizabeth; Kelly, Gregory J.; Henderson, Joseph A.

    2018-01-01

    The purpose of this paper is to describe how the environment and environmental issues are conceptualized and positioned in the Next Generation Science Standards (NGSS) to examine underlying assumptions about the environment. The NGSS are a recent set of science standards in the USA, organized and led by Achieve Inc., that propose science education…

  6. Discourse Surrounding the International Education Standards for Professional Accountants (IES): A Content Analysis Approach

    Science.gov (United States)

    Sugahara, Satoshi; Wilson, Rachel

    2013-01-01

    The development and implementation of the International Education Standards (IES) for professional accountants is currently an important issue in accounting education and for educators interested in a shift toward international education standards more broadly. The purpose of this study is to investigate professional and research discourse…

  7. A Comprehensive Analysis of High School Genetics Standards: Are States Keeping Pace with Modern Genetics?

    Science.gov (United States)

    Dougherty, M. J.; Pleasants, C.; Solow, L.; Wong, A.; Zhang, H.

    2011-01-01

    Science education in the United States will increasingly be driven by testing and accountability requirements, such as those mandated by the No Child Left Behind Act, which rely heavily on learning outcomes, or "standards," that are currently developed on a state-by-state basis. Those standards, in turn, drive curriculum and instruction.…

  8. Global rule-setting for business: A critical analysis of multi-stakeholder standards

    NARCIS (Netherlands)

    Fransen, L.W.; Kolk, A.

    2007-01-01

    In the field of global rule-setting for responsible business behaviour, multi-stakeholder standards have emerged in recent years because of their potential for effective consensus-building, knowledge-sharing and interest representation. Proponents also hold that multi-stakeholder standards could…

  9. An Analysis of Geography Content in Relation to Geography for Life Standards in Oman

    Science.gov (United States)

    Al-Nofli, Mohammed Abdullah

    2018-01-01

    Since the publication of "Geography for Life: National Geography Standards" in the United States (Geography Education Standards Project, 1994), it has been widely used to develop quality curriculum materials for what students should know and be able to do in geography. This study compared geography content taught in Omani public schools…

  10. Identifying Professional Teaching Standards Using Rasch Model Analysis: The Case of Northern Cyprus

    Science.gov (United States)

    Alibaba Erden, Hale; Özer, Bekir

    2013-01-01

    Problem Statement: The Teachers' Act defined for the state-school teachers of North Cyprus shows that teachers are not selected according to any specific standards. In North Cyprus, apart from the exam topics defined in the teachers' exam regulations, there is no identified standard for teachers. Training qualified teachers based upon…

  11. Standard Review Plan for the review of safety analysis reports for nuclear power plants: LWR edition

    International Nuclear Information System (INIS)

    1987-06-01

    The Standard Review Plan (SRP) is prepared for the guidance of staff reviewers in the Office of Nuclear Reactor Regulation in performing safety reviews of applications to construct or operate nuclear power plants. The principal purpose of the SRP is to assure the quality and uniformity of staff reviews and to present a well-defined base from which to evaluate proposed changes in the scope and requirements of reviews. It is also a purpose of the SRP to make information about regulatory matters widely available and to improve communication and understanding of the staff review process by interested members of the public and the nuclear power industry. The safety review is primarily based on the information provided by an applicant in a Safety Analysis Report (SAR). The SAR must be sufficiently detailed to permit the staff to determine whether the plant can be built and operated without undue risk to the health and safety of the public. The SAR is the principal document in which the applicant provides the information needed to understand the basis upon which this conclusion has been reached. The individual SRP sections address, in detail, who performs the review, the matters that are reviewed, the basis for review, how the review is accomplished, and the conclusions that are sought. The safety review is performed by 25 primary branches. One of the objectives of the SRP is to assign the review responsibilities to the various branches and to define the sometimes complex interfaces between them. Each SRP section identifies the branch that has the primary review responsibility for that section. In some review areas the primary branch may require support, and the branches that are assigned these secondary review responsibilities are also identified for each SRP section.

  12. A new principle for the standardization of long paragraphs for reading speed analysis.

    Science.gov (United States)

    Radner, Wolfgang; Radner, Stephan; Diendorfer, Gabriela

    2016-01-01

    To investigate the reliability, validity, and statistical comparability of long paragraphs that were developed to be equivalent in construction and difficulty. Seven long paragraphs were developed that were equal in syntax, morphology, and number and position of words (111), with the same number of syllables (179) and number of characters (660). For validity analyses, the paragraphs were compared with the mean reading speed of a set of seven sentence optotypes of the RADNER Reading Charts (mean of 7 × 14 = 98 words read). Reliability analyses were performed by calculating the Cronbach's alpha value and the corrected total item correlation. Sixty participants (aged 20-77 years) read the paragraphs and the sentences (distance 40 cm; font: Times New Roman 12 pt). Test items were presented randomly; reading time was measured with a stopwatch. Reliability analysis yielded a Cronbach's alpha value of 0.988. When the long paragraphs were compared in pairwise fashion, significant differences were found in 13 of the 21 pairs (p < 0.05). The mean reading speed was 173.34 ± 24.01 words per minute (wpm) for the long paragraphs and 198.26 ± 28.60 wpm for the sentence optotypes. The maximum difference in reading speed was 5.55 % for the long paragraphs and 2.95 % for the short sentence optotypes. The correlation between long paragraphs and sentence optotypes was high (r = 0.9243). Despite good reliability and equivalence in construction and degree of difficulty, a statistically significant difference in reading speed can occur between long paragraphs. Since statistical significance should be dependent only on the persons tested, either standardizing long paragraphs for statistical equality of reading speed measurements or increasing the number of presented paragraphs is recommended for comparative investigations.
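
    The reliability statistic quoted above is straightforward to reproduce. The following is a minimal sketch (hypothetical data, not from the paper) of computing Cronbach's alpha for a matrix of reading-speed measurements, with participants as rows and paragraphs as columns:

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            """Cronbach's alpha for an (n_participants, n_items) score matrix."""
            k = scores.shape[1]                         # number of items (paragraphs)
            item_vars = scores.var(axis=0, ddof=1)      # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)  # variance of participant totals
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        # Hypothetical reading speeds (wpm) for 5 participants x 7 paragraphs:
        # a per-person baseline plus per-paragraph noise.
        rng = np.random.default_rng(0)
        speeds = 170 + rng.normal(0, 10, size=(5, 1)) + rng.normal(0, 5, size=(5, 7))
        print(f"alpha = {cronbach_alpha(speeds):.3f}")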

  13. Prevalence of spontaneous Brugada ECG pattern recorded at standard intercostal leads: A meta-analysis.

    Science.gov (United States)

    Shi, Shaobo; Barajas-Martinez, Hector; Liu, Tao; Sun, Yaxun; Yang, Bo; Huang, Congxin; Hu, Dan

    2018-03-01

    Typical Brugada ECG pattern is the keystone in the diagnosis of Brugada syndrome. However, the exact prevalence remains unclear, especially in Asia. The present study was designed to systematically evaluate the prevalence of spontaneous Brugada ECG pattern recorded at standard leads. We searched Medline, Embase and the Chinese National Knowledge Infrastructure (CNKI) for studies of the prevalence of Brugada ECG pattern published between January 1, 2003, and September 1, 2016. Pooled prevalences of type 1 and type 2-3 Brugada ECG patterns were estimated in a random-effects model, and prevalence data were grouped by study characteristics. Meta-regression analyses were performed to explore potential sources of heterogeneity, and sensitivity analyses were conducted to assess the effect of each study on the overall prevalence. Thirty-nine eligible studies involving 558,689 subjects were identified. Pooled prevalence of type 1 and type 2-3 Brugada ECG pattern was 0.03% (95%CI, 0.01%-0.06%) and 0.42% (95%CI, 0.28%-0.59%), respectively. Region, sample size, and year of publication were the main sources of heterogeneity. The prevalence of type 1 Brugada ECG pattern was higher in male, Asian, adult, patient, and febrile subjects, but the relation between fever and type 2-3 Brugada ECG pattern was not significant. Sensitivity analysis showed that no single study unduly affected the pooled prevalence of type 1 or type 2-3 Brugada ECG pattern. Brugada ECG pattern is not rare, and it is especially preponderant in adult Asian males and febrile subjects. Clinical screening and further examination for Brugada syndrome in these populations need to be highlighted.
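
    The random-effects pooling described above can be sketched compactly. The code below is a hypothetical illustration of DerSimonian-Laird pooling of logit-transformed prevalences; the study counts are invented, and the paper's exact model may differ in details such as the transformation used:

        import numpy as np

        def pooled_prevalence_dl(events, totals):
            """DerSimonian-Laird random-effects pooling of logit prevalences."""
            events = np.asarray(events, float)
            totals = np.asarray(totals, float)
            p = (events + 0.5) / (totals + 1.0)        # continuity-corrected proportions
            y = np.log(p / (1 - p))                    # logit transform
            v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)  # approx. variances
            w = 1 / v                                  # fixed-effect weights
            y_fixed = (w * y).sum() / w.sum()
            q = (w * (y - y_fixed) ** 2).sum()         # Cochran's Q
            tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
            w_re = 1 / (v + tau2)                      # random-effects weights
            y_re = (w_re * y).sum() / w_re.sum()
            se = np.sqrt(1 / w_re.sum())
            inv = lambda x: 1 / (1 + np.exp(-x))       # back-transform to a proportion
            return inv(y_re), (inv(y_re - 1.96 * se), inv(y_re + 1.96 * se))

        # Hypothetical study counts: (type 1 pattern cases, subjects screened).
        prev, ci = pooled_prevalence_dl([3, 5, 1, 8], [12000, 20000, 7000, 30000])
        print(f"pooled prevalence = {100*prev:.3f}% "
              f"(95% CI {100*ci[0]:.3f}-{100*ci[1]:.3f}%)")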

  14. A Prospective Analysis of the Costs, Benefits, and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Krishnan, Venkat [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-12-31

    As states have gained experience with renewable portfolio standards (RPS) policies, many have made significant revisions to existing programs. In 2015 and 2016, seven states raised and extended their final RPS targets, while another state enacted a new RPS policy (Barbose 2016b). Interest in expanding and strengthening state RPS programs may continue, while efforts like recent proposals in many states to repeal or freeze existing RPS policies may also persist. In either context, questions about the potential costs, benefits, and other impacts of RPS programs are usually central to the decision-making process. This report follows on previous analyses that have focused on the historical costs, benefits, and other impacts of existing state RPS programs (Heeter et al. 2014; Wiser et al. 2016a). This report examines RPS outcomes prospectively, considering both current RPS policies as well as a potential expansion of those policies. The goal of this work is to provide a consistent and independent analytical methodology for that examination. This analysis relies on National Renewable Energy Laboratory’s (NREL’s) Regional Energy Deployment System (ReEDS) model to estimate changes to the U.S. electric power sector across a number of scenarios and sensitivity cases, focusing on the 2015–2050 timeframe. Based on those modeled results, we evaluate the costs, benefits, and other impacts of renewable energy contributing to RPS compliance using the suite of methods employed in a number of recent studies sponsored by the U.S. Department of Energy (DOE): a report examining retrospective benefits and impacts of RPS programs (Wiser et al. 2016a), the Wind Vision report (DOE 2015), the On the Path to SunShot report focusing on environmental benefits (Wiser et al. 2016b), and the Hydropower Vision report (DOE 2016).

  15. Three-dimensional analysis of AP600 standard plant shield building roof

    Energy Technology Data Exchange (ETDEWEB)

    Greimann, L.; Fanous, F.; Safar, S.; Khalil, A.; Bluhm, D.

    1999-06-01

    The AP600 passive containment vessel is surrounded by a concrete cylindrical shell covered with a truncated conical roof. This roof supports the passive containment cooling system (PCS) annular tank, shield plate and other nonstructural attachments. When the shield building is subjected to different loading combinations as defined in the Standard Review Plan (SRP), some of the sections in the shield building could experience forces in excess of their design values. This report summarizes the three-dimensional finite element analysis that was conducted to review the adequacy of the proposed Westinghouse shield building design. The ANSYS finite element software was utilized to analyze the Shield Building Roof (SBR) under dead, snow, wind, thermal and seismic loadings. A three-dimensional model was developed that included a portion of the shield building cylindrical shell, the conical roof and its attachments, the eccentricities at the cone-cylinder connection and at the compression ring, and the PCS tank. Mesh sensitivity studies were conducted to select appropriate element sizes in the cylinder, the cone, near the air intakes and in the vicinity of the eccentricities. A study was also carried out to correctly idealize the water-structure interaction in the PCS tank. Response spectrum analysis was used to calculate the internal forces at different sections in the SBR under the Safe Shutdown Earthquake (SSE). Forty-nine structural modes and twenty sloshing modes were used. Two horizontal components of the SSE together with a vertical component were used. Modal stress resultants were combined taking into account the effects of closely spaced modes. The three earthquake directions were combined by the Square Root of the Sum of the Squares (SRSS) method. Two load combinations were studied; the combination that included dead, snow, fluid, thermal and seismic loads was selected as the most critical. Interaction diagrams for critical sections were developed and used to check the design.

  16. Three-dimensional analysis of AP600 standard plant shield building roof

    International Nuclear Information System (INIS)

    Greimann, L.; Fanous, F.; Safar, S.; Khalil, A.; Bluhm, D.

    1999-01-01

    The AP600 passive containment vessel is surrounded by a concrete cylindrical shell covered with a truncated conical roof. This roof supports the passive containment cooling system (PCS) annular tank, shield plate and other nonstructural attachments. When the shield building is subjected to different loading combinations as defined in the Standard Review Plan (SRP), some of the sections in the shield building could experience forces in excess of their design values. This report summarizes the three-dimensional finite element analysis that was conducted to review the adequacy of the proposed Westinghouse shield building design. The ANSYS finite element software was utilized to analyze the Shield Building Roof (SBR) under dead, snow, wind, thermal and seismic loadings. A three-dimensional model was developed that included a portion of the shield building cylindrical shell, the conical roof and its attachments, the eccentricities at the cone-cylinder connection and at the compression ring, and the PCS tank. Mesh sensitivity studies were conducted to select appropriate element sizes in the cylinder, the cone, near the air intakes and in the vicinity of the eccentricities. A study was also carried out to correctly idealize the water-structure interaction in the PCS tank. Response spectrum analysis was used to calculate the internal forces at different sections in the SBR under the Safe Shutdown Earthquake (SSE). Forty-nine structural modes and twenty sloshing modes were used. Two horizontal components of the SSE together with a vertical component were used. Modal stress resultants were combined taking into account the effects of closely spaced modes. The three earthquake directions were combined by the Square Root of the Sum of the Squares (SRSS) method. Two load combinations were studied; the combination that included dead, snow, fluid, thermal and seismic loads was selected as the most critical. Interaction diagrams for critical sections were developed and used to check the design.
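
    The directional combination rule named in both records is a one-line computation. As a hypothetical illustration, the sketch below combines peak responses from the two horizontal and one vertical SSE components by the SRSS rule; the force values are invented:

        import numpy as np

        # Hypothetical peak section forces (kN) at one critical section, from
        # separate response-spectrum runs for each earthquake direction.
        r_x, r_y, r_z = 120.0, 95.0, 40.0

        # SRSS combination of the three directional responses.
        r_srss = np.sqrt(r_x**2 + r_y**2 + r_z**2)
        print(f"combined SRSS response = {r_srss:.1f} kN")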

  17. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis without the previous limitations on sample space and reagent use. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: Recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is a freely available software that permits the automated analysis and visualization of high-throughput qPCR data. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
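
    The standard-curve workflow that DAG Expression automates can be outlined in a few lines. The sketch below (hypothetical dilutions and Cq values; this is not the software's code) fits each assay's standard curve, interpolates sample Cq values to relative quantities, and normalizes the target gene to a reference gene:

        import numpy as np

        def standard_curve(dilutions, cqs):
            """Fit Cq = slope*log10(dilution) + intercept; return (slope, intercept)."""
            slope, intercept = np.polyfit(np.log10(dilutions), cqs, 1)
            return slope, intercept

        def relative_quantity(cq, slope, intercept):
            """Interpolate a sample Cq back to a quantity on the standard curve."""
            return 10 ** ((cq - intercept) / slope)

        # Hypothetical 4-point, 10-fold dilution series for two assays.
        dil = [1, 0.1, 0.01, 0.001]
        s_t = standard_curve(dil, [18.1, 21.5, 24.9, 28.3])   # target assay
        s_r = standard_curve(dil, [15.2, 18.6, 22.0, 25.4])   # reference assay

        # Normalized expression = target quantity / reference quantity for a sample.
        norm = relative_quantity(22.3, *s_t) / relative_quantity(17.0, *s_r)
        print(f"normalized relative expression = {norm:.3f}")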

  18. Human milk fortifier with high versus standard protein content for promoting growth of preterm infants: A meta-analysis.

    Science.gov (United States)

    Liu, Tian-Tian; Dang, Dan; Lv, Xiao-Ming; Wang, Teng-Fei; Du, Jin-Feng; Wu, Hui

    2015-06-01

    To compare the growth of preterm infants fed human milk with a standard protein fortifier versus human milk fortifier (HMF) with a higher-than-standard protein content. Published articles reporting randomized controlled trials and prospective observational intervention studies listed on the PubMed®, Embase®, CINAHL and Cochrane Library databases were searched using the keywords 'fortifier', 'human milk', 'breastfeeding', 'breast milk' and 'human milk fortifier'. The mean difference with 95% confidence intervals was used to compare the effect of HMF with a higher-than-standard protein content on infant growth characteristics. Five studies with 352 infants with birth weight ≤ 1750 g and gestational age ≤ 34 weeks who were fed human milk were included in this meta-analysis. Infants in the experimental groups, given human milk with the higher-than-standard protein fortifier, achieved significantly greater weight and length at the end of the study, and greater gains in weight, length and head circumference, compared with control groups fed human milk with the standard HMF. HMF with a higher-than-standard protein content can improve preterm infant growth compared with standard HMF.

  19. Use of benefit-cost analysis in establishing Federal radiation protection standards: a review

    Energy Technology Data Exchange (ETDEWEB)

    Erickson, L.E.

    1979-10-01

    This paper complements other work which has evaluated the cost impacts of radiation standards on the nuclear industry. It focuses on the approaches to valuation of the health and safety benefits of radiation standards and the actual and appropriate processes of benefit-cost comparison. A brief historical review of the rationale(s) for the levels of radiation standards prior to 1970 is given. The Nuclear Regulatory Commission (NRC) established numerical design objectives for light water reactors (LWRs). The process of establishing these numerical design criteria below the radiation protection standards set in 10 CFR 20 is reviewed. EPA's 40 CFR 190 environmental standards for the uranium fuel cycle have lower values than NRC's radiation protection standards in 10 CFR 20. The task of allocating EPA's 40 CFR 190 standards to the various portions of the fuel cycle was left to the implementing agency, NRC. So whether or not EPA's standards for the uranium fuel cycle are more stringent for LWRs than NRC's numerical design objectives depends on how EPA's standards are implemented by NRC. In setting the numerical levels in Appendix I to 10 CFR 50 and 40 CFR 190, NRC and EPA, respectively, focused on the costs of compliance with various levels of radiation control. A major portion of the paper is devoted to a review and critique of the available methods for valuing health and safety benefits. All current approaches try to estimate a constant value of life and use this to value the expected number of lives saved. This paper argues that it is more appropriate to seek a value of a reduction in risks to health and life that varies with the extent of these risks. Additional research to do this is recommended. (DC)

  20. Standard addition strip for quantitative electrostatic spray ionization mass spectrometry analysis: determination of caffeine in drinks.

    Science.gov (United States)

    Tobolkina, Elena; Qiao, Liang; Roussel, Christophe; Girault, Hubert H

    2014-12-01

    Standard addition strips were prepared for the quantitative determination of caffeine in different beverages by electrostatic spray ionization mass spectrometry (ESTASI-MS). The gist of this approach is to dry spots of caffeine solutions of different concentrations on a polymer strip, then to deposit a drop of sample mixed with an internal standard (here, theobromine) on each spot, and to measure the mass spectrometry signals of caffeine and theobromine by ESTASI-MS. This strip approach is very convenient and provides quantitative analyses as accurate as the classical standard addition method by MS or liquid chromatography.
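
    The underlying standard addition calculation is a linear regression of the internal-standard-normalized signal against the spiked concentration, with the unknown concentration read from the x-intercept. A minimal sketch with invented numbers:

        import numpy as np

        # Hypothetical spiked caffeine concentrations (mg/L) dried on the strip
        # spots, and the corresponding caffeine/theobromine MS intensity ratios.
        added = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
        ratio = np.array([0.52, 0.78, 1.05, 1.33, 1.58])

        # Fit ratio = slope*added + intercept; the magnitude of the x-intercept
        # estimates the caffeine concentration already present in the sample.
        slope, intercept = np.polyfit(added, ratio, 1)
        unknown = intercept / slope
        print(f"estimated caffeine concentration = {unknown:.1f} mg/L")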

  1. The standard deviation method: data analysis by classical means and by neural networks

    International Nuclear Information System (INIS)

    Bugmann, G.; Stockar, U. von; Lister, J.B.

    1989-08-01

    The Standard Deviation Method is a method for determining particle size which can be used, for instance, to determine air-bubble sizes in a fermentation bio-reactor. The transmission coefficient of an ultrasound beam through a gassy liquid is measured repetitively. Due to the displacements and random positions of the bubbles, the measurements show a scatter whose standard deviation is dependent on the bubble size. The precise relationship between the measured standard deviation, the transmission and the particle size has been obtained from a set of computer-simulated data. (author) 9 figs., 5 refs
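
    A minimal sketch of the measurement side of the method, assuming repeated transmission readings and a monotone calibration curve relating scatter to bubble size (both the readings and the calibration points are hypothetical; the paper obtained the true relationship from simulated data):

        import numpy as np

        # Hypothetical repeated ultrasound transmission coefficients through the
        # bubbly liquid; scatter arises from bubbles moving between measurements.
        readings = np.array([0.62, 0.58, 0.65, 0.55, 0.60, 0.57, 0.63, 0.59])
        scatter = readings.std(ddof=1)

        # Hypothetical calibration (scatter -> bubble diameter in mm), assumed to
        # come from simulations; interpolate the measured scatter to a size.
        cal_scatter = np.array([0.01, 0.02, 0.04, 0.08])
        cal_size_mm = np.array([0.5, 1.0, 2.0, 4.0])
        size = np.interp(scatter, cal_scatter, cal_size_mm)
        print(f"scatter = {scatter:.3f} -> estimated bubble size = {size:.2f} mm")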

  2. Standard guide for precision electroformed wet sieve analysis of nonplastic ceramic powders

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This guide covers the determination of the particle size distribution of pulverized alumina and quartz for particle sizes from 45 to 5 μm by wet sieving. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.2.1 The only exception is in Section 5, Apparatus, 5.1, where there is no relevant SI equivalent. 1.3 This standard does not purport to address the safety concerns associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  3. Analysis and Comparison of Thickness and Bending Measurements from Fabric Touch Tester (FTT) and Standard Methods

    Directory of Open Access Journals (Sweden)

    Musa Atiyyah Binti Haji

    2018-03-01

    Fabric Touch Tester (FTT) is a relatively new device from SDL Atlas to determine the touch properties of fabrics. It simultaneously measures 13 touch-related fabric physical properties in four modules that include bending and thickness measurements. This study aims to comparatively analyze the thickness and bending measurements made by the FTT and the common standard methods used in the textile industry. The results obtained with the FTT for 11 different fabrics were compared with those of the standard methods. Despite the different measurement principles, a good correlation was found between the two methods used for the assessment of thickness and bending. As the FTT is a new tool for textile comfort measurement and no standard yet exists, these findings are essential to determine the reliability of the measurements and how they relate to the well-established standard methods.

  4. Electric vehicle charging technologies analysis and standards : final research project report.

    Science.gov (United States)

    2017-02-01

    This project has evaluated the technologies and standards associated with Electric Vehicle Service Equipment (EVSE) and the related infrastructure, and the major cost issue related to electric vehicle (EV) charging -- the cost of utility power. T…

  5. Development of a standard operating procedure for analysis of ammonia concentrations in coal fly ash.

    Science.gov (United States)

    2015-04-01

    Research was performed to support the development and recommendation of a standard operating procedure (SOP) for analyzing the ammonia content in fly ash intended for use in concrete. A review of existing ash producers found that several differen…

  6. AMSARA: Accession Medical Standards Analysis and Research Activity. Report of 2007 Attrition and Morbidity Data for 2006 Accessions

    Science.gov (United States)

    2008-08-04

    [Fragment of a table of conditions and counts: ICD-9 313, disturbance of emotions specific to childhood and adolescence; 784, headaches; 696.1, psoriasis (current or history).] …closely with epidemiologists at Walter Reed Army Institute of Research on the concept of a Military Medical Standard Analysis and Evaluation Data Set

  7. Development of a viability standard curve for microencapsulated probiotic bacteria using confocal microscopy and image analysis software.

    Science.gov (United States)

    Moore, Sarah; Kailasapathy, Kasipathy; Phillips, Michael; Jones, Mark R

    2015-07-01

    Microencapsulation is proposed to protect probiotic strains from food processing procedures and to maintain probiotic viability. Little research has described the in situ viability of microencapsulated probiotics. This study successfully developed a real-time viability standard curve for microencapsulated bacteria using confocal microscopy, fluorescent dyes and image analysis software.

  8. The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan

    Science.gov (United States)

    Fullmer, Patricia

    2009-01-01

    This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…

  9. Flexible regression models for ROC and risk analysis, with or without a gold standard

    OpenAIRE

    Branscum, AJ; Johnson, WO; Hanson, TE; Baron, AT

    2015-01-01

    A novel semiparametric regression model is developed for evaluating the covariate-specific accuracy of a continuous medical test or biomarker. Ideally, studies designed to estimate or compare medical test accuracy will use a separate, flawless gold-standard procedure to determine the true disease status of sampled individuals. We treat this as a special case of the more complicated and increasingly common scenario in which disease status is unknown because a gold-standard procedure does not e...

  10. Japanese Cost Accounting Systems - analysis of the cost accounting systems of the Japanese cost accounting standard

    OpenAIRE

    Peter, Winter

    2005-01-01

    This paper aims at providing an insight into Japanese cost accounting. Firstly, the development of cost accounting in Japan is delineated. Subsequently, the cost accounting systems codified in the Japanese cost accounting standard are analysed based on the classification according to Hoitsch/Schmitz. Lastly, a critical appraisal of the cost accounting systems of the Japanese cost accounting standard is conducted, along with a comparison to German and American cost accounting systems.

  11. End-Use Opportunity Analysis from Progress Indicator Results for ASHRAE Standard 90.1-2013

    Energy Technology Data Exchange (ETDEWEB)

    Hart, Philip R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xie, YuLong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    This report and an accompanying spreadsheet (PNNL 2014a) compile the end-use building simulation results for prototype buildings throughout the United States. The results represent the energy use under each edition of ASHRAE Standard 90.1, Energy Standard for Buildings Except Low-Rise Residential Buildings (ASHRAE 2004, 2007, 2010, 2013). PNNL examined the simulation results to determine how the remaining energy was used.

  12. [Analysis of varieties and standards of Scrophulariaceae plants used in Tibetan medicine].

    Science.gov (United States)

    Cao, Lan; Mu, Ze-jing; Zhong, Wei-hong; Zhong, Wei-jin; He, Jun-wei; Du, Xiao-lang; Zhong, Guo-yue

    2015-12-01

    In this paper, the popular domestic varieties and quality standards of Scrophulariaceae plants used in Tibetan medicine were analyzed. The results showed that 11 genera and 99 species (including varieties), as well as 28 medicinal material varieties, of Scrophulariaceae plants are recorded in the relevant literature. In the relevant Tibetan standards and literature, there are great differences in the varieties, sources, medicinal parts and efficacies of the plants. About 41.4% (41 species) are endemic plants, and only about 15.2% (15 species) of the original plants have legally recorded medicinal standards; except for the medicinal materials of Scrophularia ningpoensis, Lagotis brevituba, Picrorhiza scrophulariiflora and Veronica eriogyne, most varieties have no complete quality standard. Consequently, it is necessary to reinforce herbal textual research and the investigation of resources and present use, to study the material basis and biological activity of the species resources, to establish quality standards, to standardize the medical terms of the plants, and to promote the standardization of Tibetan medicinal varieties, terminologies and sources, so as to enrich the varieties of Tibetan medicinal materials and Chinese medicinal resources.

  13. Material characterization of the clay bonded silicon carbide candle filters and ash formations in the W-APF system after 500 hours of hot gas filtration at AEP. Appendix to Advanced Particle Filter: Technical progress report No. 11, January--March 1993

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, M.A.

    1993-04-05

    (1) After 500 hours of operation in the pressurized fluidized-bed combustion gas environment, the fibrous outer membrane along the clay bonded silicon carbide Schumacher Dia Schumalith candles remained intact. The fibrous outer membrane did not permit penetration of fines through the filter wall. (2) An approximate 10-15% loss of material strength occurred within the intact candle clay bonded silicon carbide matrix after 500 hours of exposure to the PFBC gas environment. A relatively uniform strength change resulted within the intact candles throughout the vessel (i.e., top to bottom plenums), as well as within the various cluster ring positions (i.e., outer versus inner ring candle filters). A somewhat higher loss of material strength, i.e., 25%, was detected in fractured candle segments removed from the W-APF ash hopper. (3) Sulfur, which is present in the pressurized fluidized-bed combustion gas system, induced phase changes along the surface of the binder which coats the silicon carbide grains in the Schumacher Dia Schumalith candle filter matrix.

  14. Towards Uniform Accelerometry Analysis: A Standardization Methodology to Minimize Measurement Bias Due to Systematic Accelerometer Wear-Time Variation

    Directory of Open Access Journals (Sweden)

    Tarun R. Katapally, Nazeem Muhajarine

    2014-06-01

    Accelerometers are predominantly used to objectively measure the entire range of activity intensities – sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data of the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed in which case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and…
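
    The standardization step described above amounts to rescaling each case's activity minutes from the observed wear-time to an analyst-specified controlled wear-time. A minimal sketch, assuming simple linear rescaling and hypothetical values:

        def standardize(minutes_by_intensity, observed_weartime_min, controlled_weartime_min):
            """Linearly rescale activity minutes to a controlled wear-time."""
            factor = controlled_weartime_min / observed_weartime_min
            return {k: v * factor for k, v in minutes_by_intensity.items()}

        # Hypothetical day: 13 h worn (780 min), rescaled to a controlled
        # 10-h (600 min) wear-time.
        day = {"SED": 520.0, "LPA": 210.0, "MVPA": 50.0}   # minutes, sums to 780
        print(standardize(day, observed_weartime_min=780.0, controlled_weartime_min=600.0))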

  15. Development of standards for probabilistic safety analysis; Desarrollo de los estandares para analisis probabilistico de seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, P. F.; Gonzalez C, M. [UNAM, Facultad de Ingenieria, Departamento de Sistemas Energeticos, Paseo Cuauhnahuac 8532, Jiutepec, Morelos 62550 (Mexico)]. e-mail: pnelson@lairn.fi-p.unam.mx

    2008-07-01

    The American Society of Mechanical Engineers (ASME) standard for Probabilistic Safety Analysis (PSA), for applications in nuclear plants, was originally limited to a Level 1 PSA of internal events. However, recent efforts by the ASME committee on nuclear risk management, together with the American Nuclear Society (ANS) committee for risk-informed standards, have produced an improved standard that combines the original ASME standard for Level 1 internal events with internal fires and external events, with a placeholder reserved for events that occur at low power and shutdown. This integrated standard will be used by nuclear plants and regulators to carry out risk-informed applications. The use of PSA has matured to the point that risk management programs have been developed and PSA is being used as part of decision making in nuclear facilities. The standard provides criteria to evaluate the technical capability of a PSA relative to a particular issue, allowing PSA specialists to determine whether the elements of the PSA are technically adequate for a particular risk-informed application. Risk-informed applications such as in-service inspection and risk-informed technical specifications save time and resources, not only for the plants but also for the regulator. (Author)

  16. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS

    DEFF Research Database (Denmark)

    Creech, John Benjamin; Baker, J. A.; Handler, M. R.

    2014-01-01

    The chemical separation technique was tested on a Pt metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng/g to 4 μg/g. The analytical method has been shown to have an external reproducibility on δ¹⁹⁸Pt (the permil difference in the ¹⁹⁸Pt/¹⁹⁴Pt ratio from the IRMM-010 standard) of ±0.040 (2sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2sd). Pt stable isotope data for the full set of reference materials have…

  17. Comparative Analysis and Considerations for PV Interconnection Standards in the United States and China

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-01-01

    The main objectives of this report are to evaluate China's photovoltaic (PV) interconnection standards and their U.S. counterparts and to propose recommendations for future revisions to these standards. This report references the 2013 report Comparative Study of Standards for Grid-Connected PV System in China, the U.S. and European Countries, which compares U.S., European, and Chinese PV grid interconnection standards; reviews various metrics for the characterization of distribution networks with PV; and suggests modifications to China's PV interconnection standards and requirements. The recommendations are accompanied by assessments of four high-penetration PV grid interconnection cases in the United States to illustrate solutions implemented to resolve issues encountered at different sites. PV penetration in both China and the United States has increased significantly during the past several years, presenting comparable challenges depending on the conditions of the grid at the point of interconnection; solutions are generally unique to each interconnected PV installation or PV plant.

  18. Standard test method for determining nodularity and nodule count in ductile iron using image analysis

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method is used to determine the percent nodularity and the nodule count per unit area (that is, number of nodules per mm2) using a light microscopical image of graphite in nodular cast iron. Images generated by other devices, such as a scanning electron microscope, are not specifically addressed, but can be utilized if the system is calibrated in both x and y directions. 1.2 Measurement of secondary or temper carbon in other types of cast iron, for example, malleable cast iron or in graphitic tool steels, is not specifically included in this standard because of the different graphite shapes and sizes inherent to such grades. 1.3 This standard deals only with the recommended test method and nothing in it should be construed as defining or establishing limits of acceptability or fitness for purpose of the material tested. 1.4 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.5 This standard does not purport to address al…

  19. Proposal for Standard Index and Analysis of Financial Performance in 2014 of Brazilian Soccer Clubs of Serie A

    Directory of Open Access Journals (Sweden)

    Rafael da Costa Jahara

    2017-02-01

    This study aims to develop a standard index for analysis of the financial performance of the Brazilian football clubs that participated in Serie A in 2014. To prepare the standard index, economic and financial indicators of liquidity, profitability and debt are used, together with a solvency analysis of the clubs using the Kanitz model. Based on studies of standard indices for comparing company performance, the balance sheets and income statements of twenty clubs were selected for analysis. This study is classified as descriptive exploratory research with a multi-case, quantitative approach, based on document analysis of information extracted from financial reports published on the websites of the respective clubs. The results show that the clubs generally perform poorly when analyzed individually on liquidity, indebtedness, profitability and solvency indicators. However, this result cannot explain the competitive performance of the teams in the league.

  20. ANALYSIS OF THE TEACHERS’ INVOLVEMENT IN THE DISCUSSION OF THE APPLICATION OF THE FEDERAL STATE EDUCATIONAL STANDARDS VIA ONLINE RESOURCES

    Directory of Open Access Journals (Sweden)

    С Н Вачкова

    2017-12-01

    This article presents the results of research on the extent of teachers' involvement in current problems emerging in educational activities. The paper discusses the concept of involvement, its functions and scientific approaches to its analysis; suggests an original definition and structure of this concept; and describes the chosen methodology of analysis, the research database, the nature of the sample, and the analysis tools. The basis of the present research was the Internet portal "Public expertise of normative documents in education". The quantitative results are described in detail, including indicators of teachers' participation in discussing problems of education in relation to the normative educational documents of the Federal State Educational Standards of primary, basic and secondary general education. The results show indicators of teachers' activity and the problems expressed in applying the Federal State Educational Standards.

  1. Chemical analysis of coal by energy dispersive x-ray fluorescence utilizing artificial standards

    International Nuclear Information System (INIS)

    Wheeler, B.D.

    1982-01-01

    Accurate determinations of the elemental composition of coal by classical methods can be quite difficult and are normally very time consuming. X-ray fluorescence utilizing the powder method, however, has the ability of providing accurate and rapid analyses. Unfortunately, well-characterized standards, although available, are not plentiful. In addition, the durability and stability of ground and pelletized coal samples are poor, resulting in deterioration with time. As a result, artificial coal standards were prepared from certified geological materials by fusing in lithium tetraborate in percentages approximating expected ash contents and compositions in coal. Since the lithium tetraborate comprises about the same percentage of the standard as does the carbon, hydrogen, and oxygen in coal, the ground and pelletized coal sample can be assayed against the fused calibration curves by compensating for the differences in the mass absorption coefficients of the two matrices. 5 figures, 4 tables
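
    The compensation step described above can be sketched as a simple intensity correction before the fused-bead calibration curve is applied. The coefficients and intensity below are hypothetical; a real implementation would compute mass absorption coefficients from the full matrix composition at each analyte line:

        def matrix_corrected_intensity(i_measured, mu_sample, mu_standard):
            """Scale a measured intensity by the sample/standard mass-absorption ratio."""
            return i_measured * (mu_sample / mu_standard)

        # Hypothetical Fe Ka intensity from a coal pellet, with mass absorption
        # coefficients (cm^2/g) of the coal and fused lithium tetraborate matrices.
        i_corr = matrix_corrected_intensity(i_measured=1520.0, mu_sample=38.0,
                                            mu_standard=52.0)
        print(f"corrected intensity = {i_corr:.0f} counts/s")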

  2. A Microfluidic Immunostaining System Enables Quality Assured and Standardized Immunohistochemical Biomarker Analysis

    Science.gov (United States)

    Kwon, Seyong; Cho, Chang Hyun; Kwon, Youngmee; Lee, Eun Sook; Park, Je-Kyun

    2017-04-01

    Immunohistochemistry (IHC) plays an important role in biomarker-driven cancer therapy. Although there has been a high demand for standardized and quality assured IHC, it has rarely been achieved due to the complexity of IHC testing and the subjective validation-based process flow of IHC quality control. We present here a microfluidic immunostaining system for the standardization of IHC by creating a microfluidic linearly graded antibody (Ab)-staining device and a reference cell microarray. Unlike conventional efforts, our system deals primarily with the screening of biomarker staining conditions for quantitative quality assurance testing in IHC. We characterized the microfluidic matching of Ab staining intensity using three HER2 Abs produced by different manufacturers. The quality of HER2 Ab was also validated using tissues of breast cancer patients, demonstrating that our system is an efficient and powerful tool for the standardization and quality assurance of IHC.

  3. EMPIRICAL ANALYSIS OF INTEGRATION WITHIN THE STANDARDS-BASED INTEGRATED MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Stanislav Karapetrović

    2010-03-01

    The overall goal of this paper is to empirically analyze the augmentation and integration of ISO 9001-based quality management systems with ISO 14001-based environmental and other standardized management systems. Results from a survey of 298 organizations headquartered in the Spanish regions of Catalonia and the Basque Country are presented. All surveyed organizations were registered to both ISO 9001:2000 and ISO 14001:2004, while some had further management system standard certificates, for example in occupational health and safety and social responsibility. Various aspects of augmentation and integration, such as the usage of additional subsystem standards, the process of integration and the conduct of audits, are discussed through largely descriptive analyses.

  4. Standardization of proton-induced x-ray emission technique for analysis of thick samples

    Science.gov (United States)

    Ali, Shad; Zeb, Johar; Ahad, Abdul; Ahmad, Ishfaq; Haneef, M.; Akbar, Jehan

    2015-09-01

    This paper describes the standardization of the proton-induced x-ray emission (PIXE) technique for finding the elemental composition of thick samples. For the standardization, three different samples of standard reference materials (SRMs) were analyzed using this technique and the data were compared with the already known data for these certified SRMs. The samples were selected in order to cover the maximum range of elements in the periodic table. Each sample was irradiated for three different values of collected beam charge at three different times. A proton beam of 2.57 MeV obtained using a 5UDH-II Pelletron accelerator was used for the excitation of x-rays from the samples. The acquired experimental data were analyzed using the GUPIXWIN software. The results show that the SRM data and the data obtained using the PIXE technique are in good agreement.

  5. Analysis of Open Automated Demand Response Deployments in California and Guidelines to Transition to Industry Standards

    OpenAIRE

    Ghatikar, Girish

    2014-01-01

    This report reviews the Open Automated Demand Response (OpenADR) deployments within the territories serviced by California's investor-owned utilities (IOUs) and the transition from the OpenADR 1.0 specification to the formal standard, OpenADR 2.0. As demand response service providers and customers start adopting OpenADR 2.0, it is necessary to ensure that the existing Automated Demand Response (AutoDR) infrastructure investment continues to be useful and takes advantage of the formal standard …

  6. Mass fragmentographic analysis of total cholesterol in serum using a heptadeuterated internal standard

    International Nuclear Information System (INIS)

    Wolthers, B.G.; Hindriks, F.R.; Muskiet, F.A.J.; Groen, A.

    1980-01-01

    A mass fragmentographic method for the determination of total cholesterol in serum using heptadeuterated [25,26,26,26,27,27,27-²H]cholesterol as internal standard is presented. The results obtained are compared with a colorimetric and a gas chromatographic method, which were previously proposed as reference methods. Criteria for the development of absolute measurement by means of mass fragmentography and stable isotopically labelled internal standards are given. The conclusion is drawn that, at present, mass fragmentographic methods for the determination of total cholesterol in serum do not fulfil the criteria required for absolute methods. (Auth.)

  7. Mechanical properties test and microstructure analysis of polyoxymethylene (POM) micro injection moulded standard parts

    DEFF Research Database (Denmark)

    Tosello, Guido; Lucchetta, Giovanni; Hansen, Hans Nørgaard

    2009-01-01

    The tensile mechanical properties and the microstructure of micro injection moulded polyoxymethylene (POM) test parts were investigated in this paper. The effects of different injection moulding processing conditions on ultimate tensile stress and strain at break were analyzed. Additionally, the effects of miniaturization on the mechanical properties were investigated by executing injection moulding with both a standard tool designed according to ISO 527-2 and a miniaturized test part obtained from the standard design by a downscaling factor of 10. The experiments have been performed according…

  8. [Process control in acute pain management. An analysis of the degree of organization of applied standard protocols].

    Science.gov (United States)

    Erlenwein, J; Emons, M I; Hecke, A; Nestler, N; Przemeck, M; Bauer, M; Meißner, W; Petzke, F

    2014-10-01

    The aim of this study was to analyze the degree of organization of different standard protocols for acute pain management, as well as the derivation and definition of typical but structurally different models. A total of 85 hospitals provided their written standardized protocols for analysis. Protocols for defined target processes from 76 hospitals and another protocol used by more than one hospital were included in the analysis. The suggested courses of action were theoretically simulated to identify and characterize process types in a multistage evaluation process. The analysis included 148 standards. Four differentiated process types were defined ("standardized order", "analgesic ladder", "algorithm", "therapy path"), each with an increasing level of organization. These four types had the following distribution: 27 % (n = 40) "standardized order", 47 % (n = 70) "analgesic ladder", 22 % (n = 33) "algorithm", 4 % (n = 5) "therapy path". Models with a higher degree of organization included more control elements, such as action and intervention triggers or safety and supervisory elements, and were also associated with formally better access to medication. For models with a lower degree of organization, immediate courses of action were more dependent on individual decisions. Although not quantifiable, this was particularly evident when simulating downstream courses of action. Interfaces between areas of hospital activity and cross-departmental validity were only considered in a fraction of the protocols. Concepts from clinics with a certificate in (acute) pain management were more strongly process-oriented. For children, there were proportionately more simple concepts with a lower degree of organization and fewer controlling elements. This is the first analysis of a large sample of standardized protocols for acute pain management focusing on the degree of organization and its possible influence on courses of action. The analysis…

  9. [Interpretation and analysis of the requirements of the standard EN ISO 15189: 2012].

    Science.gov (United States)

    Vassault, A

    2013-06-01

    The requirements related to the quality management system of the standard EN ISO 15189 (2012), together with the requirements of the French regulation as reported in the COFRAC document SH REF 02, translate into actions to implement, documents to write and make available, and traceability to ensure (records).

  10. A conceptual analysis of standard setting in large-scale assessments

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1994-01-01

    Elements of arbitrariness in the standard setting process are explored, and an alternative to the use of cut scores is presented. The first part of the paper analyzes the use of cut scores in large-scale assessments, discussing three different functions: (1) cut scores define the qualifications used

  11. How Is Cultural Diversity Positioned in Teacher Professional Standards? An International Analysis

    Science.gov (United States)

    Santoro, Ninetta; Kennedy, Aileen

    2016-01-01

    Unprecedented levels of global mobility mean that culturally homogenous classrooms are now increasingly rare. This brings with it challenges for teachers and raises issues about what constitutes quality teaching and teachers. Professional standards are commonly seen as a key policy instrument through which teacher quality can be enhanced. This…

  12. Analysis of a non-standard mixed finite element method with applications to superconvergence

    NARCIS (Netherlands)

    Brandts, J.H.

    2009-01-01

    We show that a non-standard mixed finite element method proposed by Barrios and Gatica in 2007, is a higher order perturbation of the least-squares mixed finite element method. Therefore, it is also superconvergent whenever the least-squares mixed finite element method is superconvergent.

  13. Toward a Qualitative Analysis of Standardized Tests Using an Information Processing Model.

    Science.gov (United States)

    Armour-Thomas, Eleanor

    The use of standardized tests and test data to detect and address differences in cognitive styles is advocated here. To this end, the paper describes the componential theory of intelligence addressed by Sternberg et. al. This theory defines the components of intelligence by function and level of generality, including: (1) metacomponents: higher…

  14. 78 FR 14717 - Energy Conservation Standards for Set-Top Boxes: Availability of Initial Analysis

    Science.gov (United States)

    2013-03-07

    ... quantitative output of this model is the industry net present value (INPV), which DOE calculates as the sum of... consumer cost of capital and puts the LCC in present-value terms. The PBP represents the number of years... present value (NPV) of total consumer costs and savings expected to result from potential new standards at...

  15. Values Underpinning STEM Education in the USA: An Analysis of the Next Generation Science Standards

    Science.gov (United States)

    Hoeg, Darren G.; Bencze, John Lawrence

    2017-01-01

    The Next Generation Science Standards (NGSS) were designed to address poor science and math performance in United States schools by inculcating globally competitive science, technology, engineering, and mathematics literacies relevant to participation in future society. Considering the complex network of influences involved in the development of…

  16. DELAMINATION AND XRF ANALYSIS OF NIST LEAD IN PAINT FILM STANDARDS

    Science.gov (United States)

    The objectives of this protocol were to remove the laminate coating from lead paint film standards acquired from NIST by means of surface heating. The average XRF value did not change after removal of the polymer coating, suggesting that this protocol is satisfactory for renderin…

  17. Bioelectrical impedance analysis (BIA): a proposal for standardization of the classical method in adults

    Science.gov (United States)

    González-Correa, C. H.; Caicedo-Eraso, J. C.

    2012-12-01

    The accuracy of BIA measurements is limited by different sources of error, such as the physical model, cross-sectional area, ethnicity, body hydration, age and level of body fat, among other variables. Population-specific equations are required, as manufacturers' equations can produce overestimation when applied to other populations. The classical hand-to-foot measurement has shown better correlation with hydrodensitometry than foot-to-foot or hand-to-hand configurations. However, there is no accepted standard for BIA procedures. This is compounded when a BIA study's methodology is poorly reported; hence the comparability between results is poor, which reduces the reliability of the method. Standardization of methods would be the first step for BIA studies to move forward and subsequently improve their accuracy. Standardized procedures could also minimize the impact of these variables on study results. The aim of this study was to propose a protocol, as a checklist, to standardize BIA procedures and produce comparable results from future studies performed with the classic hand-foot configuration in adults.

  18. Citation analysis of Computer Standards & Interfaces: Technical or also non-technical focus?

    NARCIS (Netherlands)

    G. van de Kaa (Geerten); H.J. de Vries (Henk); B. Baskaran (Balakumaran)

    2015-01-01

    This paper analyzes the extent to which research published in Computer Standards & Interfaces (CSI) has a technical focus. We find that CSI has been following its scope very closely in the last three years and that the majority of its publications have a technical focus. Articles published…

  19. Marketplace analysis demonstrates quality control standards needed for black raspberry dietary supplements

    Science.gov (United States)

    There is currently no standard for the minimum anthocyanin concentration a black raspberry dietary supplement must contain for legal sale in the US. All consumer available black raspberry products (n=19), packaged as dietary supplements or otherwise prepared (freeze-dried whole and pre-ground powder...

  20. Randomization and Data-Analysis Items in Quality Standards for Single-Case Experimental Studies

    Science.gov (United States)

    Heyvaert, Mieke; Wendt, Oliver; Van den Noortgate, Wim; Onghena, Patrick

    2015-01-01

    Reporting standards and critical appraisal tools serve as beacons for researchers, reviewers, and research consumers. Parallel to existing guidelines for researchers to report and evaluate group-comparison studies, single-case experimental (SCE) researchers are in need of guidelines for reporting and evaluating SCE studies. A systematic search was…

  1. Improving the Memory Sections of the Standardized Assessment of Concussion Using Item Analysis

    Science.gov (United States)

    McElhiney, Danielle; Kang, Minsoo; Starkey, Chad; Ragan, Brian

    2014-01-01

    The purpose of the study was to improve the immediate and delayed memory sections of the Standardized Assessment of Concussion (SAC) by identifying a list of more psychometrically sound items (words). A total of 200 participants with no history of concussion in the previous six months (aged 19.60 ± 2.20 years; n = 93 men, n = 107 women)…

  2. Common Core Standards, Professional Texts, and Diverse Learners: A Qualitative Content Analysis

    Science.gov (United States)

    Yanoff, Elizabeth; LaDuke, Aja; Lindner, Mary

    2014-01-01

    This research study questioned the degree to which six professional texts guiding implementation of the Common Core Standards in reading address the needs of diverse learners. For the purposes of this research, diverse learners were specifically defined as above grade level readers, below grade level readers, and English learners. The researchers…

  3. Standardization of pathways to adulthood?: an analysis of Dutch cohorts born between 1850 and 1900

    NARCIS (Netherlands)

    Bras, H.; Liefbroer, A.C.; Elzinga, C.

    2010-01-01

    This article examines pathways to adulthood among Dutch cohorts born in the second half of the nineteenth century. Although largely overlooked by previous studies, theory suggests that life courses of young adults born during this period were already influenced by a process of standardization, in…

  4. Black English: A Strength Analysis of Non-Standard English in Black Dialect.

    Science.gov (United States)

    Bousquet, Robert J.

    Many black students speak a nonprestige dialect called black English, which places them at a disadvantage academically and socially. This monograph describes the features of black English, defines its use, discusses several theories of its origin, and offers some methods for teaching black students standard spoken usage as another style of speech.…

  5. An analysis of the impact of Renewable Portfolio Standards on residential electricity prices

    Science.gov (United States)

    Larson, Andrew James

    A Renewable Portfolio Standard (RPS) has become a popular policy for states seeking to increase the amount of renewable energy generated for consumers of electricity. The success of these state programs has prompted debate about the viability of a national RPS. The impact that these state-level policies have had on the price consumers pay for electricity is the subject of some debate. Several federal organizations have conducted studies of the impact that a national RPS would have on electricity prices paid by consumers. NREL and the US EIA use models of the inputs to electricity generation to examine the future price impact of changes in the generation mix, and both show marginal increases in prices paid by end users. Other empirical research has produced similar results, showing that the existence of an RPS increases the price of electricity. These studies, however, miss important aspects of RPS policies that may change how we view these price increases. By examining the previous empirical research on RPS policies, this study identifies the controls necessary to build an effective model. These controls are used in a fixed effects model, sketched below, that estimates how the controls and variables of interest affect electricity prices paid by residential consumers. The study uses a panel data set from 1990 to 2014 and controls for generating capacity, the regulatory status of utilities in each state, state demographic characteristics, and fuel prices. The results of the regressions indicate that prices are likely to be higher in states that have an RPS than in states that do not. Several of the characteristics mentioned above also have price impacts, so RPS policies must be discussed in the context of other factors that contribute to electricity prices. In particular, the regulatory status of utilities in each state is an important determinant of price as…
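
    Since the abstract hinges on a fixed effects design, a minimal sketch of the within (entity-demeaning) estimator may help. The column names and numbers below are hypothetical placeholders, not the study's panel data:

        import numpy as np
        import pandas as pd

        # Within estimator: subtract each state's mean so that time-invariant
        # state traits drop out, then run OLS on the demeaned data.
        df = pd.DataFrame({
            "state": ["A"] * 3 + ["B"] * 3,
            "price": [9.1, 9.4, 9.9, 11.2, 11.5, 12.1],   # cents/kWh, residential
            "rps":   [0, 1, 1, 0, 0, 1],                  # RPS in force that year?
            "fuel":  [3.2, 3.5, 4.1, 3.1, 3.4, 4.0],      # fuel price index
        })
        cols = ["price", "rps", "fuel"]
        demeaned = df.groupby("state")[cols].transform(lambda s: s - s.mean())
        X = demeaned[["rps", "fuel"]].to_numpy()
        beta, *_ = np.linalg.lstsq(X, demeaned["price"].to_numpy(), rcond=None)
        print(beta)  # beta[0]: within-state price change associated with an RPS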

  6. Analysis of Indonesian educational system standard with KSIM cross-impact method

    Science.gov (United States)

    Arridjal, F.; Aldila, D.; Bustamam, A.

    2017-07-01

    The results of the 2012 Programme for International Student Assessment (PISA) show that Indonesia ranked 64th out of 65 countries in mean mathematics score. In the 2013 Learning Curve mapping, Indonesia was among the ten countries with the lowest performance on cognitive skills, ranking 37th out of 40 countries. Competency is built on three aspects, one of which is the cognitive aspect. The poor mapping results on the cognitive aspect reflect the low graduate competency produced by the Indonesia National Education System (INES). INES adopts the concept of Eight Educational System Standards (EESS), one of which is the graduate competency standard, which is connected directly to Indonesia's students. This research aims to model INES using the KSIM cross-impact method. Linear regression models of the EESS were constructed using national accreditation data for senior high schools in Indonesia. The results were then interpreted as impact values in the construction of the KSIM cross-impact model of INES. The construction is used to analyze the interaction of the EESS and to simulate numerically possible public policies in the education sector, i.e., stimulating the growth of the education staff, content, process, and infrastructure standards. All public policy simulations were carried out with two methods: a multiplier impact method and a constant intervention method. The numerical simulations show that stimulating the growth of the content standard in the KSIM cross-impact construction of the EESS is the best public policy option for maximizing the growth of the graduate competency standard.
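
    For readers unfamiliar with the method, below is a minimal sketch of a KSIM cross-impact update as commonly presented in the systems literature; the impact matrix is a made-up toy, not the paper's calibrated EESS model:

        import numpy as np

        def ksim_step(x, A, dt=0.1):
            # One KSIM update: states live in (0, 1); A[i, j] is the impact of
            # variable j on variable i. A positive net impact gives an exponent
            # below 1, which pushes x[i] upward, and vice versa.
            ax = np.abs(A) @ x
            sx = A @ x
            p = (1 + 0.5 * dt * (ax - sx)) / (1 + 0.5 * dt * (ax + sx))
            return x ** p

        # Hypothetical 3-variable toy system (content standard, process standard,
        # graduate competency), not the paper's calibrated matrix:
        A = np.array([[0.0, 0.2, 0.0],
                      [0.3, 0.0, 0.0],
                      [0.6, 0.4, 0.0]])
        x = np.array([0.5, 0.5, 0.4])
        for _ in range(100):
            x = ksim_step(x, A)
        print(x)  # trajectories remain bounded in (0, 1) by construction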

  7. Comparison of Low- and Standard-Dose CT for the Diagnosis of Acute Appendicitis: A Meta-Analysis.

    Science.gov (United States)

    Yun, Seong Jong; Ryu, Chang-Woo; Choi, Na Young; Kim, Hyun Cheol; Oh, Ji Young; Yang, Dal Mo

    2017-06-01

    A meta-analysis was performed to compare low-dose CT and standard-dose CT in the diagnosis of acute appendicitis with an emphasis on diagnostic value. A systematic literature search for articles published through June 2016 was performed to identify studies that compared low-dose CT with standard-dose CT for the evaluation of patients suspected of having acute appendicitis. Summary estimates of sensitivity and specificity with 95% CIs were calculated using a bivariate random-effects model. Meta-regression was used to perform statistical comparisons of low-dose CT and standard-dose CT. Of 154 studies, nine studies investigating a total of 2957 patients were included in this meta-analysis. The pooled sensitivity and specificity of low-dose CT were 96.25% (95% CI, 91.88-98.31%) and 93.22% (95% CI, 88.75-96.00%), respectively. The pooled sensitivity and specificity of standard-dose CT were 96.40% (95% CI, 93.55-98.02%) and 92.17% (95% CI, 88.24-94.86%), respectively. In a joint model estimation of meta-regression, low- and standard-dose CT did not show a statistically significant difference (p = 0.71). Both low- and standard-dose CT seem to be characterized by high positive and negative predictive values across a broad spectrum of pretest probabilities for acute appendicitis. Low-dose CT is highly effective for the diagnosis of suspected appendicitis and can be considered a valid alternative first-line imaging test that reduces the potential risk of exposure to ionizing radiation.

  8. The standard system for conducting the TNA (Training Needs Analysis) of Staff (report from the EU Erasmus+ project SMART)

    DEFF Research Database (Denmark)

    Jensen, Ulla Højmark

    2016-01-01

    The Training Needs Analysis (TNA) has been carried out with the staff of the partner organisations. A standard system for conducting a quantitative and a qualitative training needs analysis had been developed, and it has been used as a framework for the analysis by the 4 partners: Limerick, Palermo, Copenhagen, and Esbjerg, each with … adaptation and translation of the standardised system to suit their own individual context. Limerick and Palermo have completed both a quantitative and a qualitative training needs analysis; Copenhagen and Esbjerg have completed a qualitative training needs analysis. This report summarises the findings of the four partners' … training needs analyses. The needs of teachers/trainers and organisations highlighted in the three multiplier events are also included in the summary conclusions.

  9. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    Science.gov (United States)

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95% credible interval when Bayesian analysis has been employed.
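
    The flavor of the approach can be conveyed with a basic rejection-ABC sketch: draw candidate (mu, sigma) pairs from rough priors, simulate samples, and keep the draws whose simulated median/min/max best match the reported summaries. The priors, normal data model, distance, and acceptance rule here are simplifications, not the authors' exact algorithm:

        import numpy as np

        rng = np.random.default_rng(0)

        def abc_mean_sd(median, minimum, maximum, n, n_draws=20_000, keep=200):
            # Rough uniform priors over plausible ranges (an assumption of this sketch)
            mu = rng.uniform(minimum, maximum, n_draws)
            sd = rng.uniform(1e-6, maximum - minimum, n_draws)
            target = np.array([median, minimum, maximum])
            dist = np.empty(n_draws)
            for k in range(n_draws):
                sim = rng.normal(mu[k], sd[k], n)   # normal data model, also an assumption
                stats_k = np.array([np.median(sim), sim.min(), sim.max()])
                dist[k] = np.linalg.norm(stats_k - target)
            best = np.argsort(dist)[:keep]          # accept the draws closest to the summaries
            return mu[best].mean(), sd[best].mean()

        mu_hat, sd_hat = abc_mean_sd(median=50.0, minimum=20.0, maximum=85.0, n=40)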

  10. Use of an internal standard 233U, 236U to improve the accuracy of isotopic uranium analysis by thermal ionization mass spectrometry. Application to isotope dilution analysis

    International Nuclear Information System (INIS)

    Chevalier, C.; Hagemann, R.; Lucas, M.; Devillers, C.

    1982-01-01

    A method using a calibrated mixture of the isotopes 233U and 236U has been developed to correct for the isotopic fractionation that limits the accuracy of isotopic analysis by thermal ionization mass spectrometry. The 236/233 internal standard ratio is calibrated against the 235/238 ratios of uranium isotopic standards. To analyze an unknown sample, the sample is mixed with the internal standard; the difference between the true and observed values of the 236/233 ratio allows the determination of a correction factor, which is applied to the measured 235/238 ratio. Since 1978, 235U abundance measurements on series of samples have been performed using this technique; data are obtained with an accuracy better than 0.05%. It is intended to apply this method to the precise determination of the 238/233 ratio for uranium concentration measurements by isotope dilution.
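
    The correction described can be sketched as follows, assuming a simple linear mass-fractionation law (the record does not state the exact law or sign conventions the authors used):

        def fractionation_corrected_ratio(r236_233_obs, r236_233_true, r235_238_obs):
            # Derive the per-amu fractionation from the 233U/236U internal standard
            # (mass difference +3), then apply it to the measured 235/238 ratio
            # (mass difference -3). A linear fractionation law is assumed.
            eps = (r236_233_obs / r236_233_true - 1.0) / (236 - 233)
            return r235_238_obs / (1.0 + eps * (235 - 238))

        # Made-up ratios: a 0.3% bias over 3 amu implies about 0.1% per amu.
        print(fractionation_corrected_ratio(1.003, 1.000, 0.00720))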

  11. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane corresponding to the most coronal contact point between the crestal bone and the implant. The available PBIC was calculated for each sample. The cross-sectional slices were then processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm), followed by the standard implants (178.07 mm and 185.37 mm) and the remaining short implants (130.70 mm and 110.70 mm). Wide-diameter short implants show a surface area comparable to that of standard implants. Micro-CT analysis is a promising technique for evaluating surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  12. Standards and innovation in emerging fields: Pushing breakthrough innovation or enrolling actors? An analysis of eco-district standards in France and Denmark

    OpenAIRE

    Aurélien Acquier; Eva Boxenbaum; Rebecca Pinheiro-Croisel

    2011-01-01

    Standards and norms are central objects for institutional studies. However, their role in innovation and the creation of novelty remains unclear, particularly in new/emerging fields. Accordingly, this paper investigates the relationship between standard setting and innovation in the context of emerging organizational fields. We consider standardization in emerging fields as a socio-technical process, which must simultaneously promote a certain degree of innovation a…

  13. Ultrasonic vocalizations (USV) in the three standard laboratory mouse strains: developmental analysis.

    Science.gov (United States)

    Wiaderkiewicz, Jan; Głowacka, Marta; Grabowska, Marta; Barski, Jarosław-Jerzy

    2013-01-01

    Mice, like some other rodent species, communicate with specialized sounds in the ultrasonic range, called ultrasonic vocalizations (USV). Evaluation of this behavioral activity enables estimation of social interactions in animal models of autism spectrum disorders (ASD). Because transgenic mouse models are generated, in most cases, on the mixed 129SV/C57BL6 genetic background, we were interested in whether the parameters that characterize USV differ between these two mouse strains. In addition, we wanted to compare these strains with the BALB/c line. To analyze USV, we applied the standard isolation test to newborn animals and compared standard parameters. The results indicate clear differences between the 129SV and C57BL6 strains with respect to all analyzed USV parameters. Both strains also behave differently when compared with the BALB/c strain. For this reason, in experiments utilizing transgenic animals, the contribution of different genetic backgrounds has to be carefully considered.

  14. Meeting the measurement uncertainty and traceability requirements of ISO/IEC standard 17025 in chemical analysis.

    Science.gov (United States)

    King, B

    2001-11-01

    The new laboratory accreditation standard, ISO/IEC 17025, reflects current thinking on good measurement practice by requiring more explicit and more demanding attention to a number of activities. These include client interactions, method validation, traceability, and measurement uncertainty. Since the publication of the standard in 1999, there has been extensive debate about its interpretation. It is the author's view that if good quality practices are already in place, and if the new requirements are introduced in a manner that is fit for purpose, the additional work required to comply can be expected to be modest. The paper argues that the rigour required in addressing the issues should be driven by customer requirements, and the factors that need to be considered in this regard are discussed. The issues addressed include the benefits, interim arrangements, specifying the analytical requirement, establishing traceability, evaluating the uncertainty, and reporting the information.

  15. Preliminary researches to standardize a method of quantitative analysis on Lactobacillus acidophilus in poultry feed

    Directory of Open Access Journals (Sweden)

    Daniele Gallazzi

    2010-01-01

    The study focuses on the method and the problems of quantitative analysis in research on Lactobacillus acidophilus after its addition to commercial poultry feed, whose rough grinding makes it unsuitable for the "IDF Standard quantitative method for lactic acid bacteria count at 37°C" employed for dairy products. Poultry feed was prepared every month. A sample was collected before and after adding Lactobacillus acidophilus, and analyses were carried out at T = 0, 15, and 28 days of food storage at 4-6°C. The best outcomes (more than 30% more cells recovered than with the standard method) were obtained from samples subjected to homogenization and the addition of skim milk powder.

  16. Genetic and Antigenic Analysis of Adenovirus Type 3 Strains Showing Intermediate Behavior in Standard Seroneutralization Test

    Directory of Open Access Journals (Sweden)

    Márcia TB Moraes

    1998-03-01

    During an epidemiological survey of acute respiratory infection in Rio de Janeiro, among 208 adenovirus isolates we found two strains that we were not able, by a standard neutralization procedure, to assign to type 3 or type 7. However, the DNA restriction patterns of the two strains obtained with different enzymes were analyzed and showed a typical Ad3h profile. Using a cross-neutralization test in which both Ad3p and Ad7p antisera were used at different concentrations against 100 TCID50 of each adenovirus standard and of both isolates, we were able to confirm that the two isolates belong to serotype 3. A hemagglutination inhibition test also corroborated the identification of both strains as adenovirus type 3. Comparing the Ad3h and Ad3p genomes, we observed 16 different restriction enzyme sites, three of which were located in genomic regions encoding polypeptides involved in neutralization sites.

  17. Extending and simplifying the standard Köhn-pipette technique for grain size analysis

    Science.gov (United States)

    Hirsch, Florian; Raab, Thomas

    2014-05-01

    Grain size distribution is a fundamental parameter for characterizing the physical properties of soils and sediments. Manifold approaches exist; according to DIN ISO 11277, soil texture is analyzed by default with the combined pipette sieving and sedimentation method developed by Köhn. With this standard technique, subfractions of sand and silt as well as the total clay content can be determined, but the differentiation of clay subfractions is impossible. As the differentiation of the clay subfractions yields relevant information about pedogenesis, we present a protocol based on standard granulometric techniques with easy-to-handle and low-cost equipment. The protocol was tested on a set of soil samples covering a range of grain size distributions. We used a three-step procedure to obtain the grain size distribution of soil samples, taking into account the subfractions of sand, silt, and clay by a combination of sedimentation, centrifugal sedimentation, and wet sieving. The pipetting was done with a piston-stroke pipette instead of the complex pipette referred to in DIN ISO 11277. Our first results show that the applied protocol is less prone to operating errors than the standard Köhn pipette technique. Furthermore, even a less experienced laboratory worker can handle 10 samples in one day. Analyses of a Luvisol profile, sampled at high spatial resolution, showed that the lessivation process is characterized by translocation of fine clay from the eluvial horizon to the illuvial horizon. Our protocol is therefore a fast alternative for detecting lessivation, which is otherwise only clearly identifiable by micromorphological investigation and not by the standard Köhn pipette technique.
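
    Pipette sedimentation methods of this kind rest on Stokes' law; the sketch below computes the settling time used to schedule pipette withdrawals, with water properties at 20 °C and a quartz-like particle density assumed (illustrative values, not the authors' protocol constants):

        def settling_time_s(d_um, depth_cm=10.0, visc_pa_s=1.002e-3,
                            rho_particle=2650.0, rho_water=998.2, g=9.81):
            # Stokes settling velocity for a sphere, then the time to fall
            # the withdrawal depth.
            d_m = d_um * 1e-6
            v = (rho_particle - rho_water) * g * d_m**2 / (18.0 * visc_pa_s)  # m/s
            return (depth_cm / 100.0) / v

        print(settling_time_s(2.0) / 3600.0)  # ~7.7 h for the 2 um clay boundary at 10 cm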

  18. Analysis of ductile-brittle transition shifts for standard and miniature bending specimens of irradiated steel

    International Nuclear Information System (INIS)

    Korshunov, M.E.; Korolev, Yu.N.; Krasikov, E.A.; Gabuev, N.N.; Tykhmeev, D.Yu.

    1996-01-01

    A study was made to reveal whether there is a correlation between the shifts in the temperature curves obtained when testing thin plates and those obtained when testing standard specimens in impact bending and fracture toughness. The tests were carried out on steel 25Kh3NM specimens irradiated to a neutron fluence of 6 x 10^19 cm^-2. A conclusion is drawn about the possibility of evaluating the degree of radiation-induced embrittlement of reactor steels on the basis of thin-plate testing under quasistatic loads.

  19. Status and Analysis on Effects of Energy Efficiency Standards for Industrial Boilers in China

    Science.gov (United States)

    Liu, Ren; Chen, Lili; Liu, Meng; Ding, Qing; Zhao, Yuejin

    2017-11-01

    Energy conservation and environmental protection is a basic policy of China and an important part of its ecological civilization construction. Industrial boilers in China are characterized by large numbers, wide distribution, high energy consumption, and heavy environmental pollution, which are key problems for energy conservation and environmental protection in China. At the same time, industrial boilers are important equipment for the national economy and people's daily life, and energy conservation runs through all stages from type selection, purchase, installation, and acceptance to fuel management, operation, maintenance, and service. Since 2009, China has implemented national mandatory standards and regulations for industrial boilers such as GB24500-2009, The Minimum Allowable Values of Energy Efficiency and Energy Efficiency Grades of Industrial Boilers, and TSG G002-2010, Supervision Regulation on Energy-Saving Technology for Boilers, which have clearly promoted energy conservation in industrial boilers; however, some problems remain amid the rapid development of energy-saving technologies for industrial boilers. In this paper, the implementation of energy efficiency standards for industrial boilers in China and its significance are analyzed based on survey data, and some suggestions are proposed for the energy efficiency standards for industrial boilers.

  20. Analysis on effects of energy efficiency regulations & standards for industrial boilers in China

    Science.gov (United States)

    Liu, Ren; Chen, Lili; Zhao, Yuejin; Liu, Meng

    2017-11-01

    Industrial boilers in China are characterized by large numbers, wide distribution, high energy consumption, and heavy environmental pollution, which are key problems for energy conservation and environmental protection in China. At the same time, industrial boilers are important equipment for the national economy and people's daily life, and energy conservation runs through all stages from type selection, purchase, installation, and acceptance to fuel management, operation, maintenance, and service. Since 2009, China has implemented national mandatory standards and regulations for industrial boilers such as GB24500-2009, The Minimum Allowable Values of Energy Efficiency and Energy Efficiency Grades of Industrial Boilers, and TSG G002-2010, Supervision Regulation on Energy-Saving Technology for Boilers, which have clearly promoted energy conservation in industrial boilers; however, some problems remain amid the rapid development of energy-saving technologies for industrial boilers. In this paper, the implementation of energy efficiency standards for industrial boilers in China and its significance are analyzed based on survey data, and some suggestions are proposed for the energy efficiency standards for industrial boilers. Supported by Project 2015424050 of the Special Fund for Quality Control Research in the Public Interest.

  1. Comparison of Standard Wind Turbine Models with Vendor Models for Power System Stability Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.; Muljadi, Eduard

    2016-11-01

    The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. The Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed implementation. Active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are discussed in the paper.

  2. Standardization of pathways to adulthood? An analysis of Dutch cohorts born between 1850 and 1900.

    Science.gov (United States)

    Bras, Hilde; Liefbroer, Aart C; Elzinga, Cees H

    2010-11-01

    This article examines pathways to adulthood among Dutch cohorts born in the second half of the nineteenth century. Although largely overlooked by previous studies, theory suggests that life courses of young adults born during this period were already influenced by a process of standardization, in the sense that their life courses became more similar over time. Using data from a Dutch registry-based sample, we examine household trajectories: that is, sequences of living arrangements of young adults aged 15-40. Our study shows that for successive cohorts, household trajectories became more similar. We identified six types of trajectories: early death, life-cycle service, early family formation, late family formation, singlehood, and childless but with partner. Over time, early family formation gradually became the "standard" trajectory to adulthood. However, late family formation and singlehood, common pathways within the preindustrial western European marriage pattern, remained widespread among cohorts born in the late nineteenth century. Laboring-class youths, farmers' daughters, young people of mixed religious background, and urban-born youngsters were the nineteenth-century forerunners of a standard pathway to adulthood.

  3. Comparison of Kayzero for Windows and k0-IAEA software packages for k0 standardization in neutron activation analysis

    Czech Academy of Sciences Publication Activity Database

    Kubešová, Marie; Kučera, Jan

    2011-01-01

    Vol. 654, No. 1 (2011), pp. 206-212. ISSN 0168-9002. R&D Projects: GA ČR GA202/09/0363. Institutional research plan: CEZ:AV0Z10480505. Keywords: neutron activation analysis; k0 standardization; Kayzero for Windows program; k0-IAEA program. Subject RIV: BG - Nuclear, Atomic and Molecular Physics, Colliders. Impact factor: 1.207, year: 2011

  4. Analysis of climate change impact on meteorological and hydrological droughts through relative standardized indices

    Science.gov (United States)

    Marcos, Patricia; Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel

    2017-04-01

    Southern Mediterranean basins are prone to droughts due to high temporal and spatial rainfall variability. In addition, semiarid Mediterranean regions emerge as notable climate change hotspots, with high uncertainty about the impacts of climate change on future droughts. Standardized drought indices have traditionally been used to assess and identify drought events because of their simplicity and their flexibility in comparing the departure from normal conditions across regions and timescales. Nevertheless, the statistical foundation of these indices assumes stationarity of certain aspects of the climatic variables, which can no longer be assumed under climate change. In recent years, several modifications have therefore been proposed to cope with these limitations. This contribution provides a framework for analyzing climate change impacts on meteorological and hydrological droughts, considering the predicted shifts in precipitation and temperature and the uncertainty of the assumed distribution parameters. To characterize drought in a climate change context, relative standardized indices are applied instead of the traditional ones: the relative Standardized Precipitation Index (rSPI), the relative Standardized Precipitation Evapotranspiration Index (rSPEI), and a relative Standardized Flow Index (rSFI). The behavior of the rSPI is contrasted with that of the multiscalar rSPEI. A modification of the Thornthwaite scheme is presented to improve the representation of the intra-annual variation of potential evapotranspiration (PET) in continental climate areas. The uncertainty due to the selected hydrological model is assessed by comparing the performance and outcomes of three conceptual lumped-parameter models (Temez, GR2M, and HBV-light). The Temez model, which showed the best fit in our case study, was selected to obtain the runoff for the rSFI. To address the uncertainty of the index distribution parameters, bootstrapping was combined with the computation of the…
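
    As a minimal sketch of how a standardized index such as the SPI is computed (the relative variants fit the distribution on a reference baseline and evaluate scenario data against it), assuming a gamma data model and simplified zero-precipitation handling:

        import numpy as np
        from scipy import stats

        def spi(precip, ref=None):
            # Fit a gamma distribution to the reference precipitation totals and
            # map each value through its CDF to a standard normal quantile.
            ref = precip if ref is None else ref
            shape, loc, scale = stats.gamma.fit(ref, floc=0)
            cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
            return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

        # rSPI-style use: fit on a historical baseline, evaluate scenario data.
        baseline = np.random.default_rng(0).gamma(2.0, 30.0, 360)  # synthetic monthly totals
        print(spi(baseline[-12:], ref=baseline))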

  5. Subtle alterations in cerebrovascular reactivity in mild cognitive impairment detected by graph theoretical analysis and not by the standard approach

    Directory of Open Access Journals (Sweden)

    Carlos A. Sánchez-Catasús

    2017-01-01

    There is growing support that cerebrovascular reactivity (CVR) in response to a vasodilatory challenge, also defined as the cerebrovascular reserve, is reduced in Alzheimer's disease dementia. However, this is less clear in patients with mild cognitive impairment (MCI). The current standard analysis may not reflect subtle abnormalities in CVR. In this study, we aimed to investigate vasodilatory-induced changes in the topology of the cerebral blood flow correlation (CBFcorr) network to study possible network-related CVR abnormalities in MCI. For this purpose, four CBFcorr networks were constructed: two using CBF SPECT data at baseline and under the vasodilatory challenge of acetazolamide (ACZ), obtained from a group of 26 MCI patients, and two equivalent networks from a group of 26 matched cognitively normal controls. The mean strength of association (SA) and clustering coefficient (C) were used to evaluate ACZ-induced changes in the topology of the CBFcorr networks. We found that cognitively normal adults and MCI patients show different patterns of C and SA changes. The observed differences included the medial prefrontal cortices and inferior parietal lobe, areas involved in the cognitive dysfunction of MCI. In contrast, no substantial differences were detected by standard CVR analysis. These results suggest that graph theoretical analysis of ACZ-induced changes in the topology of CBFcorr networks allows the identification of subtle network-related CVR alterations in MCI that could not be detected by the standard approach.

  6. Three dimensional versus standard miniplate fixation in the management of mandibular fractures: A meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Silajiding, Kubila; Wusiman, Patiguli; Yusufu, Bilikezi; Moming, Adili

    2017-09-01

    The aim of this meta-analysis is to evaluate the efficacy of the 3-dimensional miniplate system in comparison with the standard miniplate system for the treatment of mandibular fractures (MFs). A systematic review was conducted according to PRISMA guidelines, examining Medline-Ovid, Embase, and PubMed databases. The primary search objective was to identify all papers reporting the results of randomized control trials (RCTs) for the treatment of adults with mandibular fractures, with the aim of comparing the different techniques. The incidence of complications was evaluated; nine studies including 283 patients with different fracture sites were enrolled in the analysis. The results showed no significant differences in overall complications (odds ratio [OR], 0.92; 95% confidence interval [CI], 0.552-1.542; P = 0.81), postoperative infections (OR, 0.99; 95% CI, 0.40-2.48; P = 0.89), wound dehiscence (OR, 0.96; 95% CI, 0.13-7.37; P = 0.96), paresthesia (OR, 0.47; 95% CI, 0.20-1.07; P = 0.11), or malocclusion (OR, 1.8; 95% CI, 0.39-8.32; P = 0.47) between standard miniplates and 3-dimensional miniplates for treating mandibular fractures. Mandibular fractures treated with 3-dimensional miniplates and standard miniplates presented similar short-term complication rates, and the low postoperative maxillomandibular fixation rate of using standard miniplates also indicated that the standard miniplate has a promising application in the treatment of mandibular fractures. Copyright © 2017. Published by Elsevier Taiwan.

  7. Predictive genomic and metabolomic analysis for the standardization of enzyme data

    Directory of Open Access Journals (Sweden)

    Masaaki Kotera

    2014-05-01

    The IUBMB's Enzyme List is a valuable library of individual experimental facts on enzyme activities, providing the standard classification and nomenclature of enzymes. Empirical knowledge about the relationships between enzyme protein sequences (or structures) and their functions (the capability of catalyzing chemical reactions) has been accumulating in the public literature and databases. This provides a complementary approach to standardizing and organizing enzyme data, i.e., predicting the possible enzymes, reactions, and metabolites that remain to be identified experimentally. We therefore suggest the necessity of classifying enzymes based on the evidence and the different perspectives obtained from various experimental works. The KEGG (Kyoto Encyclopedia of Genes and Genomes) database describes enzymes from many different viewpoints, including the IUBMB's enzyme nomenclature/classification (EC numbers), similarity groups of enzyme reactions (KEGG Reaction Class, RCLASS) based solely on chemical structure transformation patterns, and similarity groups of enzyme genes (KEGG Orthology, KO) based on orthologous groups that can be mapped to the KEGG PATHWAY and BRITE functional hierarchies. Some unique identifiers were additionally introduced in the KEGG database beyond the EC numbers established by the IUBMB: R, RP, and RC numbers distinguish reactions, reactant pairs, and RCLASS, respectively. Genes, including enzyme genes, have their own ID numbers in specific organisms and are classified into ortholog groups identified by K numbers. In this review, we explain the concept and methodology of this formulation with some concrete example cases. We propose that it would be beneficial to create a standard classification scheme that deals with both experimentally identified and theoretically predicted enzymes.

  8. Acucise™ endopyelotomy in a porcine model: procedure standardization and analysis of safety and immediate efficacy

    Directory of Open Access Journals (Sweden)

    Andreoni Cássio

    2004-01-01

    PURPOSE: The study presented here was done to test the technical reliability and immediate efficacy of the Acucise device using a standardized technique. MATERIALS AND METHODS: 56 Acucise procedures were performed in pigs by a single surgeon using a standardized technique: insert a 5F angiographic catheter bilaterally up to the midureter; perform a retrograde pyelogram; advance an Amplatz super-stiff guidewire up to the level of the renal pelvis; remove the angiographic catheters; advance the Acucise catheter balloon to the level of the ureteropelvic junction (UPJ); remove the super-stiff guidewire, aspirate the contrast medium in the renal pelvis, and replace it with distilled water; activate the Acucise at 75 watts of pure cutting current; keep the balloon fully inflated for 10 minutes; perform a retrograde ureteropyelogram to document extravasation; remove the Acucise catheter, pass a ureteral stent, and remove the guidewire. RESULTS: In no case did the Acucise device malfunction. The electrocautery activation time was 2.2 seconds (range, 2 to 4 seconds). Extravasation of contrast medium, visible by fluoroscopy, occurred in 53 of the 56 cases (94.6%). In no case was there any evidence of intraoperative hemorrhage. CONCLUSIONS: This study revealed that performing Acucise endopyelotomy routinely in a standardized manner can largely preclude intraoperative device malfunction and eliminate complications while achieving a successful incision of the UPJ. With the guidelines used in this study, we believe that Acucise endopyelotomy can be completed successfully and safely in the majority of selected patients with UPJ obstruction.

  9. [Situation analysis and standard formulation of pesticide residues in traditional Chinese medicines].

    Science.gov (United States)

    Yang, Wan-Zhen; Kang, Chuan-Zhi; Ji, Rui-Feng; Zhou, Li; Wang, Sheng; Li, Zhen-Hao; Ma, Zhong-Hua; Guo, Lan-Ping

    2017-06-01

    The Chinese Pharmacopoeia provides Maximum Residue Limits (MRLs) for nine pesticides in traditional Chinese medicines (TCMs), but the number of pesticides used in production is far greater than those listed in the pharmacopoeia. The lack of standards makes it hard to reflect correctly the real situation of pesticide residues in TCMs. This paper analyzes data on pesticide residues in TCMs from 7,089 items in 140 reports, judging the exceedance rate of pesticides in TCMs against the MRLs of the European Pharmacopoeia, which are widely accepted in many countries. The results show that: (1) Pesticide residues in 18 kinds of TCMs exceed the MRLs, while those in 137 kinds, such as Atractylodis Macrocephalae Rhizoma, Menthae Haplocalycis Herba, and Fritillariae Thunbergii Bulbus, are below the MRLs. The average exceedance rate across all TCMs is 1.72%; the average exceedance rates for organochlorines, organophosphates, and pyrethroids are 2.26%, 1.51%, and 0.37%, respectively. (2) The average exceedance rate across pesticides is 2.00%: for 8.33% of pesticides the exceedance rate is above 5%, for 18.75% it is between 1% and 5%, and for another 18.75% it is between 0% and 1%; the remaining 29 pesticides (60.42%) show no exceedances. Some reports, such as those of the Greenpeace organization, have exaggerated pesticide residues in TCMs, but the pesticide residue question is still worthy of attention. We therefore propose to amend the pesticide residue standards of the Chinese Pharmacopoeia, to add the pesticide species used in traditional Chinese medicine production while retaining the existing types of pesticide residues, and to strengthen systematic research on pesticide residues in TCMs, providing a basis for setting standards and promoting import and export trade in TCMs. Copyright© by the Chinese Pharmaceutical Association.

  10. Performance analysis of air-standard Diesel cycle using an alternative irreversible heat transfer approach

    International Nuclear Information System (INIS)

    Al-Hinti, I.; Akash, B.; Abu-Nada, E.; Al-Sarkhi, A.

    2008-01-01

    This study presents an investigation of the air-standard Diesel cycle under irreversible heat transfer conditions. The effects of various engine parameters are presented. An alternative approach is used to evaluate the net power output and cycle thermal efficiency from more realistic parameters such as air-fuel ratio, fuel mass flow rate, intake temperature, and engine design parameters. It is shown that for a given fuel flow rate, thermal efficiency and maximum power output increase with decreasing air-fuel ratio. Also, for a given air-fuel ratio, the maximum power output increases with increasing fuel rate, although the effect on the thermal efficiency is limited.
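
    For orientation, the classical air-standard Diesel cycle result that such analyses build on can be sketched as below; tying efficiency to fuel-side quantities through a heating value is an illustrative assumption, not the authors' irreversible-heat-transfer model:

        def diesel_efficiency(r, rc, gamma=1.4):
            # Ideal air-standard Diesel cycle: r = compression ratio,
            # rc = cutoff ratio (heat-transfer irreversibilities ignored here).
            return 1 - (1 / r**(gamma - 1)) * (rc**gamma - 1) / (gamma * (rc - 1))

        def power_output_kw(mdot_fuel_kg_s, lhv_kj_kg, eta):
            # Illustrative link from fuel flow and heating value to net power.
            return mdot_fuel_kg_s * lhv_kj_kg * eta

        eta = diesel_efficiency(r=18, rc=2.0)       # about 0.63 for these inputs
        print(power_output_kw(0.002, 42_500, eta))  # kW for a hypothetical fuel flow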

  11. Bayesian analysis of longitudinal Johne's disease diagnostic data without a gold standard test

    DEFF Research Database (Denmark)

    Wang, C.; Turnbull, B.W.; Nielsen, Søren Saxmose

    2011-01-01

    A Bayesian methodology was developed based on a latent change-point model to evaluate the performance of milk ELISA and fecal culture tests using longitudinal Johne's disease diagnostic data. The situation of no perfect reference test, that is, no "gold standard," was considered. A change-point process … The proposed method achieves an area under the receiver operating characteristic curve (AUC) of 0.984 and is superior to the raw ELISA (AUC = 0.911) and fecal culture (sensitivity = 0.358, specificity = 0.980) tests for Johne's disease diagnosis.

  12. CSER-98-002: Criticality analysis for the storage of special nuclear material sources and standards in the WRAP Facility

    Energy Technology Data Exchange (ETDEWEB)

    Goldberg, H.J.

    1998-06-22

    The Waste Receiving and Processing (WRAP) Facility will store uranium and transuranic (TRU) sources and standards for certification that WRAP meets the requirements of the Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP). In addition, WRAP must meet internal requirements for testing and validation of measuring instruments for nondestructive assay (NDA). In order to be certified for WIPP, WRAP will participate in the NDA Performance Demonstration Program (PDP). This program is a blind test of the NDA capabilities for TRU waste. It is intended to ensure that the NDA capabilities of this facility satisfy the requirements of the quality assurance program plan for the WIPP. The PDP standards have been provided by the Los Alamos National Laboratory (LANL) for this program. These standards will be used in the WRAP facility. To internally check the accuracy and sensitivity of the NDA instruments, a further set of sources and standards will also be used by the facility. Each sealed source or standard will be referred to herein as a unit. Various combinations of these units will be placed in test drums and/or boxes which will be subject to their own limits until unloaded. There will be two sealed test drums with five grams of weapons grade plutonium loaded in them. These drums will be appropriately marked and will be subject to the unit limits rather than the drum limits. This analysis shows that the storage and use of special nuclear material sources and standards within the limited control facility of WRAP (Rooms 101 and 104) is safe from a criticality standpoint. With the form, geometry, and masses involved with this evaluation, a criticality is not possible. The limits given in Section 2 should be imposed on facility operations.

  13. Evaluation of standard and advanced preprocessing methods for the univariate analysis of blood serum 1H-NMR spectra.

    Science.gov (United States)

    De Meyer, Tim; Sinnaeve, Davy; Van Gasse, Bjorn; Rietzschel, Ernst-R; De Buyzere, Marc L; Langlois, Michel R; Bekaert, Sofie; Martins, José C; Van Criekinge, Wim

    2010-10-01

    Proton nuclear magnetic resonance ((1)H-NMR)-based metabolomics enables the high-resolution and high-throughput assessment of a broad spectrum of metabolites in biofluids. Despite the straightforward character of the experimental methodology, the analysis of spectral profiles is rather complex, particularly due to the numerous data preprocessing steps required. Here, we evaluate how several of the most common preprocessing procedures affect the subsequent univariate analyses of blood serum spectra, with a particular focus on how the standard methods perform compared with more advanced alternatives. Carr-Purcell-Meiboom-Gill 1D (1)H spectra were obtained for 240 serum samples from healthy subjects of the Asklepios study. We studied the impact of different preprocessing steps (integral [the standard method] versus probabilistic quotient normalization; no binning versus equidistant [standard] or adaptive-intelligent binning; mean [standard] versus maximum bin intensity data summation) on the resonance intensities of three different types of metabolites: triglycerides, glucose, and creatinine. The effects were evaluated by correlating the differently preprocessed NMR data with independently measured metabolite concentrations. The analyses revealed that the standard methods performed worse, and that a combination of probabilistic quotient normalization after adaptive-intelligent binning with maximum intensity variable definition yielded the best overall results (triglycerides, R = 0.98; glucose, R = 0.76; creatinine, R = 0.70). Therefore, at least for serum metabolomics, these or equivalent methods should be preferred over the standard preprocessing methods, particularly for univariate analyses.
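
    A minimal sketch of probabilistic quotient normalization, one of the preprocessing choices compared above, assuming the commonly used median-spectrum reference:

        import numpy as np

        def pqn(spectra):
            # Integral-normalize each spectrum, take the median spectrum as the
            # reference, and divide each spectrum by its median quotient to the
            # reference (the "most probable" dilution factor).
            X = spectra / spectra.sum(axis=1, keepdims=True)
            ref = np.median(X, axis=0)
            quotients = X / ref
            dilution = np.median(quotients, axis=1, keepdims=True)
            return X / dilution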

  14. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
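
    A sketch of one simple convention follows: multiplying each coefficient by its predictor's standard deviation gives a per-SD effect on the log-odds. Menard's fully standardized variant rescales further by the dispersion of the predicted logits, which is omitted here, and the data are synthetic:

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 3))
        y = (X @ np.array([0.8, -0.4, 0.1]) + rng.normal(size=200) > 0).astype(int)

        fit = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
        b = fit.params[1:]            # raw coefficients (skip the intercept)
        b_std = b * X.std(axis=0)     # change in log-odds per 1-SD predictor change
        print(b_std)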

  15. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    Directory of Open Access Journals (Sweden)

    Helen L Storey

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or for inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We describe the landscape of comparisons performed, show the results of a meta-analysis of the accuracy of the more common combinations, and evaluate sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance, but some existing diagnostics may be useful as part of a CRS. Additionally, based on the findings of the meta-analysis and a constructed numerical example demonstrating the use of a CRS, we propose necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement on and adoption of a standardized CRS by all investigators is requisite, and would improve the comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates.

  16. Full-spectrum versus standard colonoscopy for improving polyp detection rate: A systematic review and meta-analysis.

    Science.gov (United States)

    Facciorusso, Antonio; Del Prete, Valentina; Buccino, Vincenzo; Valle, Nicola Della; Nacchiero, Maurizio Cosimo; Muscatiello, Nicola

    2018-02-01

    Full-spectrum endoscopy is a new endoscopic platform allowing a panoramic 330-degree view of the colon, but evidence of its superiority over standard colonoscopy is still lacking. Our study is the first meta-analysis comparing the efficacy of full-spectrum endoscopy with standard colonoscopy. Through a systematic literature review up to May 2017, we identified eight randomized controlled trials. Primary outcomes were polyp detection rate and adenoma detection rate; cecal intubation time and total colonoscopy time were secondary outcomes. Direct meta-analysis was performed using a random effects model. No difference in polyp detection rate or adenoma detection rate was found (risk ratio: 1.00, 95% confidence interval 0.89-1.12, P = 0.96, and 1.05, 0.94-1.17, P = 0.40, respectively). Adenoma miss rate was significantly in favor of full-spectrum endoscopy (risk ratio: 0.35, 0.25-0.48, P < 0.001), including in subgroup analyses of larger (> 5 mm) and pedunculated lesions (risk ratio: 0.38, 0.09-1.60, P = 0.19, and risk ratio: 0.15, 0.01-3.00, P = 0.21, respectively). Cecal intubation time did not differ between the two techniques (mean standardized difference: 0.22 min, -1.18 to 1.62, P = 0.76), while total colonoscopy time was significantly shorter with full-spectrum endoscopy (mean difference: -2.60, -4.60 to -0.61, P = 0.01). Sensitivity analysis confirmed all the findings. Full-spectrum endoscopy appears to be a promising and reliable technology able to significantly decrease the number of adenomas missed and procedural times, while its superiority over standard colonoscopy in terms of adenoma detection rate remains unclear. © 2017 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
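
    Pooling in meta-analyses like this one is commonly done with DerSimonian-Laird random effects on log risk ratios; a compact sketch with made-up inputs (not the paper's data) is:

        import numpy as np

        def dl_pooled_rr(log_rr, var_log_rr):
            # Fixed-effect weights and Cochran's Q
            w = 1.0 / var_log_rr
            fixed = np.sum(w * log_rr) / np.sum(w)
            q = np.sum(w * (log_rr - fixed) ** 2)
            # DerSimonian-Laird between-study variance estimate
            tau2 = max(0.0, (q - (len(log_rr) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
            w_re = 1.0 / (var_log_rr + tau2)
            pooled = np.sum(w_re * log_rr) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

        rr, lo, hi = dl_pooled_rr(np.log(np.array([0.9, 1.1, 1.0])),
                                  np.array([0.01, 0.02, 0.015]))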

  17. Standardization and evaluation of the CAMP reaction for the prompt, presumptive identification of Streptococcus agalactiae (Lancefield group B) in clinical material.

    Science.gov (United States)

    Darling, C L

    1975-02-01

    Primary cultures of clinical material were screened for the presence of colonies suspected of being Streptococcus agalactiae (Lancefield group B). Sixty-three such cultures and 108 other isolates of beta-hemolytic streptococci (groups A, C, and G), encountered during the first 3 months of the investigation, were studied by Lancefield grouping, sodium hippurate hydrolysis, and a standardized CAMP test. All streptococci were inoculated perpendicularly to streaks of a beta-toxin-producing staphylococcus on sheep blood agar plates and incubated aerobically in a candle jar and anaerobically at 37°C. Plates were examined after 5 to 6 and 18 h of incubation. The production of a distinct "arrowhead" of hemolysis was indicative of a positive CAMP reaction. All group B streptococci produced a positive CAMP reaction in the candle jar or anaerobically, usually within 5 to 6 h, and aerobically after 18 h of incubation. All group A streptococci produced a positive reaction only under anaerobic conditions. Groups C and G streptococci were negative under all atmospheres. The CAMP reaction is a prompt and reliable procedure for the presumptive identification of group B streptococci when a candle jar atmosphere is used during incubation.

  18. QUALITY STANDARDS FOR DISTANCE LEARNING IN HIGHER EDUCATION: A COMPARATIVE ANALYSIS OF CANADIAN AND RUSSIAN PRACTICES

    OpenAIRE

    Natalia V. Buhanova; Konstantin V. Kuz’min; Larisa E. Petrova; Sergey A. Chemezov

    2015-01-01

    The aim of the investigation is to perform a comparative analysis of quality assessment and quality assurance policies in postsecondary education in Canada and the Russian Federation. Methods. The theoretical methods involve comparative analysis and synthesis, induction and deduction, extrapolation and modelling. Results. Russia and Canada have different policies on quality assurance in distance learning and are at different stages of implementation of distance learning into postsecondary ...

  19. Application of quantitative salt iodine analysis compared with the standard method.

    Science.gov (United States)

    Chongchirasiri, S; Pattanachak, S; Putrasreni, N; Suwanik, R; Pattanachak, H; Tojinda, N; Pleehachinda, R

    2001-06-01

    Fifty iodated salt samples (from producers, households, markets, etc.) were investigated at the Research Nuclear Medicine Building, Siriraj Hospital. Two methods for the determination of iodine in salt are described. The standard method, as recommended by the Programme Against Micronutrient Malnutrition (PAMM), the Micronutrient Initiative (MI), and the International Council for Control of Iodine Deficiency Disorders (ICCIDD), was the iodometric titration method. The quantitative starch-KI salt iodine method was developed in our laboratory for validation purposes. This method is high in precision, accuracy, sensitivity, and specificity; the coefficient of variation (%CV) for intra- and inter-assay was below 10, and iodine contents as low as 10 ppm could be detected. The proposed starch-KI method offers some advantages: it is not complicated, is easier to learn and perform competently, can be applied as a spot qualitative test, and can readily be performed outside the laboratory. The results obtained by the starch-KI method correlated well with the standard method (y = 0.98x - 3.22, r = 0.99).
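
    The iodometric back-calculation behind such titrations can be sketched as follows; the 21.15 mg-per-milliequivalent factor assumes iodine present as iodate, as in standard PAMM/MI/ICCIDD procedures, and the numbers in the example are made up:

        def salt_iodine_ppm(titrant_ml, thio_normality, salt_g):
            # Each milliequivalent of thiosulfate corresponds to 126.9 / 6 = 21.15 mg
            # of iodine (iodate liberates six electron-equivalents of iodine); treat
            # this back-calculation as a sketch of the titration arithmetic.
            iodine_mg = titrant_ml * thio_normality * 21.15
            return iodine_mg * 1000.0 / salt_g   # mg iodine per kg salt (ppm)

        print(salt_iodine_ppm(5.0, 0.005, 10.0))  # about 52.9 ppm for these inputs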

  20. A nationwide curriculum analysis of integrated plastic surgery training: is training standardized?

    Science.gov (United States)

    Schneider, Lisa F; Barr, Jason; Saadeh, Pierre B

    2013-12-01

    The integrated model of plastic surgery education, shortly to become the standard for all 6-year programs, has set minimal but no maximal exposure to plastic surgery. The authors hypothesized that the first 3 years of integrated training would show variability among residency programs. Rotation schedules for all 42 integrated programs were analyzed for plastic surgery versus 18 non-plastic-surgery rotations for postgraduate years 1, 2, and 3, as well as cumulatively for the first 3 years. Rotations "strongly suggested" by the Residency Review Committee on Plastic Surgery and the American Board of Plastic Surgery were also examined. Time spent on plastic surgery in postgraduate years 1 through 3 ranged widely, from 3 to 19 months (mean, 9.1 months; SD, ±4.9 months). General surgery exposure also varied dramatically, from 8 to 21 months (mean, 16.3 months; SD, ±4.0 months). Surgical subspecialty rotations ranged substantially, from 1 to 6 months (SD, ±1.0 months). Plastic surgery exposure was greater in programs based within plastic surgery departments than within divisions (13.8 versus 8.3 months, p < …). Plastic surgery experience in the first 3 years of residency training thus varies by a greater than 6-fold difference among integrated programs; the same was found in the 2.5-fold and 6-fold differences in general surgery and subspecialty surgery experiences. Since standardized residency training is an expectation of both accrediting bodies and the public, this variability may warrant closer attention.

  1. Analysis of Open Automated Demand Response Deployments in California and Guidelines to Transition to Industry Standards

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Riess, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-01-02

    This report reviews the Open Automated Demand Response (OpenADR) deployments within the territories serviced by California's investor-owned utilities (IOUs) and the transition from the OpenADR 1.0 specification to the formal standard, OpenADR 2.0. As demand response service providers and customers start adopting OpenADR 2.0, it is necessary to ensure that the existing Automated Demand Response (AutoDR) infrastructure investment continues to be useful and takes advantage of the formal standard and its many benefits. This study focused on OpenADR deployments and systems used by the California IOUs and included a summary of the OpenADR deployment from the U.S. Department of Energy-funded demonstration conducted by the Sacramento Municipal Utility District (SMUD). Lawrence Berkeley National Laboratory collected and analyzed data about OpenADR 1.0 deployments, categorized architectures, developed a data model mapping to understand the technical compatibility of each version, and compared the capabilities and features of the two specifications. The findings, for the first time, provided evidence of the total enabled load shed and average first cost for system enablement in the IOU and SMUD service territories. The OpenADR 2.0a profile specification semantically supports AutoDR system architectures and data propagation with a testing and certification program that promotes interoperability, scaled deployments by multiple vendors, and provides additional features that support future services.

  2. Three-camera setup to record simultaneously standardized high-definition video for smile analysis.

    Science.gov (United States)

    Husain, Akhter; Makhija, Parmanand G; Ummer, Aseena Alungal; Kuijpers-Jagtman, Anne Marie; Kuijpers, Mette A R

    2017-11-01

    Our objective was to develop a photographic setup that would simultaneously capture subjects' smiles from 3 views, both statically and dynamically, and to develop software to crop the produced video clip and slice the frames to study the smile at different stages. Facial images were taken of 96 subjects, aged 18 to 28 years, in natural head position using a standardized setup of 3 digital single-lens reflex cameras, with a reference sticker (10 × 10 mm) on the forehead of each subject. To test the reproducibility of the setup, 1 operator took 3 images of all subjects on the same day and, in a subset of 26 subjects, on 3 different days. For the same-day observations, correlation coefficients varied between 0.87 and 0.93. For the observations on 3 different days, correlation coefficients were also high. The duplicate measurement error and the mean difference between measurements were small and not significant, pointing to good reliability. This new technique to capture standardized high-definition video and still images simultaneously from 3 positions is a reliable and practical tool. The technique is easy to learn and implement in the orthodontic office. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  3. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording or measuring process. Monitoring of river water quality aims not only to recognize water quality dynamics, but also to evaluate the data in order to create river management and water pollution policy, so as to maintain human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the significant water quality parameters. This research aimed to identify the significant parameters by using two transformation or standardization methods on the water quality data: the river Water Quality Index, WQI (Indeks Kualitas Air Sungai, IKAs) method, and the transformation or standardization method with mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. PCA, Principal Component Analysis (Analisa Komponen Utama, AKU), with the help of the Scilab software, was used to process the secondary data on water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add In software. The research showed that only 18 of the total 35 water quality parameters had passable data quality. The two transformation or standardization methods gave different significant parameter types and counts. With the mean 0, variance 1 standardization, the significant water quality parameters relative to the mean concentration of each parameter were TDS, SO4, EC, TSS, NO3N, COD, BOD5, Grease Oil and NH3N. On the river WQI transformation or standardization, the water quality significant parameter showed the level of
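
    A minimal Python sketch of the abstract's second approach, standardization to mean 0 and variance 1 followed by PCA; the study itself used Scilab, and the data here are randomly generated stand-ins for the monitored parameters.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(120, 9))   # hypothetical samples x parameters (e.g. TDS, SO4, EC, ...)

        # Standardize each parameter to mean 0, variance 1 so variabilities are comparable.
        Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

        # PCA via eigendecomposition of the covariance (here: correlation) matrix.
        C = np.cov(Z, rowvar=False)
        eigval, eigvec = np.linalg.eigh(C)
        order = np.argsort(eigval)[::-1]
        eigval, eigvec = eigval[order], eigvec[:, order]

        print("variance explained by PC1, PC2:", eigval[:2] / eigval.sum())
        # Parameters with large absolute loadings on the leading PCs are the "significant" ones.
        print("PC1 loadings:", np.round(eigvec[:, 0], 2))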

  4. IR-drop analysis for validating power grids and standard cell architectures in sub-10nm node designs

    Science.gov (United States)

    Ban, Yongchan; Wang, Chenchen; Zeng, Jia; Kye, Jongwook

    2017-03-01

    Since chip performance and power are highly dependent on the operating voltage, a robust power distribution network (PDN) is of utmost importance in designs to provide a reliable voltage without voltage (IR) drop. However, the rapid increase of parasitic resistance and capacitance (RC) in interconnects makes IR-drop much worse with technology scaling. This paper shows various IR-drop analyses in sub-10nm designs. The major objectives are to validate standard cell architectures, where different sizes of power/ground and metal tracks are validated, and to validate the PDN architecture, where types of power hook-up approaches are evaluated with IR-drop calculation. To estimate IR-drops in 10nm-and-below technologies, we first prepare physically routed designs for given standard cell libraries, where we use open RISC RTL, synthesize the CPU, and apply placement & routing with process-design kits (PDK). Then, static and dynamic IR-drop flows are set up with commercial tools. Using the IR-drop flow, we compare standard cell architectures and analyze impacts on performance, power, and area (PPA) relative to previous technology-node designs. With this IR-drop flow, we can optimize the best PDN structure against IR-drop as well as the choice of standard cell library.
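
    The study relies on commercial tools, but the static IR-drop computation underneath is nodal analysis, solving G·v = i for the grid voltages. A toy sketch under assumed (hypothetical) resistances and cell currents:

        import numpy as np

        # Toy 1-D power rail: the pad drives VDD; nodes 1..N each sink cell current.
        N, VDD = 5, 0.75          # hypothetical node count and supply (V)
        R = 0.5                   # hypothetical segment resistance (ohm)
        I_cell = 1e-3             # hypothetical current drawn at each node (A)

        # Conductance matrix for the chain of resistors (pad node eliminated).
        g = 1.0 / R
        G = np.zeros((N, N))
        for k in range(N):
            G[k, k] = 2 * g if k < N - 1 else g
            if k > 0:
                G[k, k - 1] = G[k - 1, k] = -g

        # Current injections: each node sinks I_cell; node 1 also sees the pad through R.
        i = np.full(N, -I_cell)
        i[0] += g * VDD

        v = np.linalg.solve(G, i)
        print("worst-case IR-drop (mV):", (VDD - v.min()) * 1e3)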

  5. Validation of next generation sequencing technologies in comparison to current diagnostic gold standards for BRAF, EGFR and KRAS mutational analysis.

    Science.gov (United States)

    McCourt, Clare M; McArt, Darragh G; Mills, Ken; Catherwood, Mark A; Maxwell, Perry; Waugh, David J; Hamilton, Peter; O'Sullivan, Joe M; Salto-Tellez, Manuel

    2013-01-01

    Next Generation Sequencing (NGS) has the potential of becoming an important tool in clinical diagnosis and therapeutic decision-making in oncology owing to its enhanced sensitivity in DNA mutation detection, fast turnaround of samples in comparison to current gold standard methods, and the potential to sequence a large number of cancer-driving genes at the one time. We aim to test the diagnostic accuracy of current NGS technology in the analysis of mutations that represent the current standard-of-care, and its reliability to generate concomitant information on other key genes in human oncogenesis. Thirteen clinical samples (8 lung adenocarcinomas, 3 colon carcinomas and 2 malignant melanomas), already genotyped for EGFR, KRAS and BRAF mutations by current standard-of-care methods (Sanger Sequencing and q-PCR), were analysed for detection of mutations in the same three genes using two NGS platforms, and for an additional 43 genes with one of these platforms. The results were analysed using closed platform-specific proprietary bioinformatics software as well as open third party applications. Our results indicate that the existing format of the NGS technology performed well in detecting the clinically relevant mutations stated above but may not be reliable for a broader unsupervised analysis of the wider genome in its current design. Our study represents a diagnostically led validation of the major strengths and weaknesses of this technology before consideration for diagnostic use.

  6. Remnant preservation in anterior cruciate ligament reconstruction versus standard techniques: a meta-analysis of randomized controlled trials.

    Science.gov (United States)

    Ma, Tianjun; Zeng, Chun; Pan, Jianying; Zhao, Chang; Fang, Hang; Cai, Daozhang

    2017-01-01

    Preserving the remnant during anterior cruciate ligament (ACL) reconstruction is considered beneficial for graft healing, but it might increase technical difficulty and complications. This study compared outcomes of the remnant-preservation technique during ACL reconstruction versus the standard procedure with debridement of the remnant. We searched PubMed, EMBASE, and the Cochrane Library for randomized controlled trials comparing the outcomes of ACL reconstruction with and without remnant preservation. The risk of bias was assessed in accordance with the Cochrane Collaboration's risk-of-bias tool. A meta-analysis was performed to compare results. Six randomized controlled trials with 346 patients were included. Statistically significant differences in favor of the remnant-preservation technique were observed for the Lysholm Score, arthrometer measurements, and tibial tunnel enlargement. There was no significant difference between the remnant-preservation technique and the standard procedure with respect to the IKDC (International Knee Documentation Committee) grade, IKDC score, Lachman Test, Pivot-shift Test, range of motion (ROM), and the incidence of cyclops lesion. This meta-analysis of randomized controlled trials showed that ACL reconstruction with remnant preservation cannot provide superior clinical outcomes compared with the standard procedure.
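
    For context on the pooling step of such a meta-analysis, the sketch below combines hypothetical per-trial mean differences (e.g., in Lysholm Score) by generic inverse-variance weighting; it is an illustration of the technique, not the authors' computation.

        import numpy as np

        # Hypothetical per-trial mean differences and their standard errors.
        effects = np.array([2.1, 3.5, 1.2, 4.0, 2.8, 0.9])
        se      = np.array([1.1, 1.5, 0.9, 2.0, 1.3, 1.0])

        w = 1.0 / se**2                      # inverse-variance weights
        pooled = np.sum(w * effects) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
        print(f"pooled difference = {pooled:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")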

  7. Limitation of deuterium labelled methoximes as internal standards in the mass spectral analysis of prostaglandins

    International Nuclear Information System (INIS)

    Herold, D.A.; Smith, B.J.; Ross, R.M.; Marquis, F.; Ayers, C.R.; Wills, M.R.; Savory, J.

    1987-01-01

    A reported method for the preparation of D3-methoxime derivatives as internal standards for prostaglandin assays by gas chromatography-mass spectrometry was evaluated. Sample derivatization resulted in 1.5-86% exchange of the D3-methoxime in a series of prostaglandins. Exchange was minimal when the methoxime was on the 5-membered ring, whereas acyclic methoximes exhibited extensive exchange. Induced strain energy due to the steric interaction of the hydroxyl group and the C13-C20 alkyl side chain with the gem-dimethoxylamine transition state is offered as an explanation for the unusual stability of PGE2. The use of 18O exchange of the carboxylic acid function is presented as an alternative for the preparation of unavailable labelled eicosanoids.

  8. A Prospective Analysis of the Costs, Benefits, and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Krishnan, Venkat [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-12-01

    This is the third in a series of reports exploring the costs, benefits, and other impacts of state renewable portfolio standards (RPS). This report evaluates the effects of renewable electricity used to meet aggregate RPS demand growth prospectively, over the period 2015-2050, under both current RPS policies as well as a potential expansion of those policies. Relying on a well-vetted suite of methods, the report quantifies: the costs to the electric system and retail electricity price impacts; the potential societal benefits associated with reduced greenhouse gas emissions, air pollution emissions, and water use; workforce requirements and economic development effects; and consumer savings associated with reduced natural gas prices. The study quantifies these effects in both physical and monetary terms, where possible, at both national and regional levels, and characterizes key uncertainties. The two prior studies in the series have focused, instead, on the historical costs and on the historical benefits and impacts of state RPS policies.

  9. A Retrospective Analysis of the Benefits and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-01

    This is the second in a series of reports exploring the costs, benefits, and other impacts of state renewable portfolio standards (RPS), both retrospectively and prospectively. This report focuses on the benefits and impacts of all state RPS programs, in aggregate, for the year 2013 (the most-recent year for which the requisite data were available). Relying on a well-vetted set of methods, the study evaluates a number of important benefits and impacts in both physical and monetary terms, where possible, and characterizes key uncertainties. The prior study in this series focused on historical RPS compliance costs, and future work will evaluate costs, benefits, and other impacts of RPS policies prospectively.

  10. Standardization of data processing and statistical analysis in comparative plant proteomics experiment.

    Science.gov (United States)

    Valledor, Luis; Romero-Rodríguez, M Cristina; Jorrin-Novo, Jesus V

    2014-01-01

    Two-dimensional gel electrophoresis remains the most widely used technique for protein separation in plant proteomics experiments. Despite the continuous technical advances and improvements in current 2-DE protocols, an adequate and correct experimental design and statistical analysis of the data tend to be ignored or not properly documented in the current literature. Both a proper experimental design and an appropriate statistical analysis are required in order to confidently discuss the results and draw conclusions from experimental data. In this chapter, we describe a model procedure for a correct experimental design and a complete statistical analysis of a proteomics dataset. Our model procedure covers all of the steps in data mining and processing, starting with data preprocessing (transformation, missing value imputation, definition of outliers) and univariate statistics (parametric and nonparametric tests), and finishing with multivariate statistics (clustering, heat-mapping, PCA, ICA, PLS-DA).
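
    A condensed sketch of the kind of pipeline the chapter outlines: log transformation, median imputation of missing values, and a univariate parametric test per spot. The spot volumes below are simulated placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        # Hypothetical spot volumes: 30 protein spots x (4 control + 4 treated) gels.
        data = rng.lognormal(mean=8, sigma=0.6, size=(30, 8))
        data[rng.random(data.shape) < 0.05] = np.nan      # some missing spots

        log_data = np.log2(data)                          # variance-stabilizing transform

        # Impute missing values with the per-spot median.
        med = np.nanmedian(log_data, axis=1, keepdims=True)
        log_data = np.where(np.isnan(log_data), med, log_data)

        # Univariate parametric test (t-test) per spot, control vs. treated.
        t, p = stats.ttest_ind(log_data[:, :4], log_data[:, 4:], axis=1)
        print("spots with p < 0.05:", int(np.sum(p < 0.05)))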

  11. Concept of risk analysis developed in the Standard AR 3.1.3

    International Nuclear Information System (INIS)

    Moreira, O.; Perl, H.

    2012-01-01

    The individual risk concept developed in the norm AR 3.1.3 (Criterion of acceptance for nuclear power plants) is based on the following idea: the probability of death for a person who has been exposed to a given radiation dose. Mathematically it is expressed as r = P(E ∧ F), where r is the individual risk, E the event 'exposure' and F the event 'death as a result of the exposure'. In principle this probability is a conditional probability, making it feasible to apply Bayes' theorem. Accordingly we have P(E ∧ F) = P(E) × P(F|E), which constitutes the first hypothesis of the probabilistic model used in the aforementioned standard. The standard AR 3.1.3 defines risk in mathematical form in order to estimate the risk quantitatively (as a probability). The whole theoretical development rests on the validity of applying Bayes' theorem to events which are not simultaneous, as is the case when health consequences are suffered after exposure to radiation. Applying Bayes' theorem to conditional probabilities for events that are not simultaneous is neither direct nor trivial. In this paper we analyze the two probabilistic hypotheses for applying Bayes' theorem to non-simultaneous conditioned events, the advantages and disadvantages of both probabilistic concepts, and the validity of applying them under certain conditions. Applying Bayes to non-simultaneous events brings additional hypotheses, which will be presented and discussed in this paper (author)
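
    Numerically, the standard's first hypothesis is a single product. In the sketch below both probabilities are assumed values chosen for illustration, not figures from the norm:

        # r = P(E and F) = P(E) * P(F|E), per the model described above.
        p_exposure = 1e-4        # hypothetical probability of exposure E
        p_death_given_e = 5e-2   # hypothetical probability of death F given exposure E
        r = p_exposure * p_death_given_e
        print(f"individual risk r = {r:.1e}")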

  12. Ergonomic Analysis of Tricycle Sidecar Seats: Basis for Proposed Standard Design

    Directory of Open Access Journals (Sweden)

    Michael C. Godoy

    2015-12-01

    Full Text Available Ergonomics (also called human factors engineering) is the study of human characteristics for the appropriate design of the living and work environment. It is applied in various industrial areas, including transportation. The tricycle, being one of the most common means of public transportation in Lipa City, has various adaptations to suit the culture and environment. The purpose of this study is to analyze the variability in the design of tricycles in Lipa City, Philippines, and to propose a standard, ergonomically designed tricycle sidecar seat for a greater population. The study was conducted at 26 tricycle terminals with 232 tricycle samples within the Lipa City proper, including the public market area, where 400 commuters were given questionnaires to determine the risk factors associated with the existing tricycle sidecar seat design. Anthropometric measurements of 100 male and 100 female commuters were obtained together with the sidecar dimensions of the 232 tricycles to substantiate the observed variations in design. Using design for the average and design for the extremes, it was found that most of the tricycles in Lipa City, Philippines have an inappropriately inclined seat and a lowered sidecar seat pan height, which can result in leg and abdominal pain; a narrowed seat pan depth, which causes pressure on the buttocks and legs; a narrowed backrest width, which can cause upper and lower back pain; a low backrest height, which can cause upper back pain and can also result in abdominal pain; and an inclined backrest with limited vertical clearance, which can cause upper back and neck pain. The researcher proposed a sidecar seat design standard which can be used by the Land Transportation Office and the Land Transportation Franchising and Regulatory Board to provide ease, comfort, and convenience to passengers.
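
    The "design for the average and design for the extremes" step reduces to computing means and percentiles of the anthropometric data. A sketch with simulated measurements (the variable and values are hypothetical):

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical popliteal heights (cm) for 200 commuters.
        popliteal = rng.normal(loc=42.0, scale=2.5, size=200)

        mean_h = popliteal.mean()                       # design for the average
        p5, p95 = np.percentile(popliteal, [5, 95])     # design for the extremes
        print(f"seat pan height: mean {mean_h:.1f} cm, 5th-95th pct {p5:.1f}-{p95:.1f} cm")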

  13. Special tablets containing cellulose binder and Sr internal standard for simplifying X-ray fluorescence analysis of powder samples

    Science.gov (United States)

    Mzyk, Zofia; Anyszkiewicz, Jacek; Gorewoda, Tadeusz

    2015-12-01

    The addition of a constant amount of SrCO3 was found to be a suitable internal standard for analysis by wavelength-dispersive X-ray fluorescence spectrometry, correcting the matrix and grain-size effects of many constituents. Weighing constant amounts of SrCO3, binder and sample allowed the preparation time for analysis to be extended, and special tablets containing binder and SrCO3 were developed. Several substances were tested as binders, among which microcrystalline cellulose was chosen for further study. The prepared tablets were checked for their weight stability and the repeatability of the SrCO3 addition. The tablets were then used to prepare pellets from geological samples for X-ray fluorescence analysis. The exemplary application and calibration curves for several analytes confirmed that the prepared tablets could be useful for pelletizing such materials to compensate for matrix effects.
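
    The internal-standard principle is that the analyte-to-Sr intensity ratio, rather than the raw intensity, is calibrated against concentration, so matrix and grain-size effects common to both lines cancel. A generic sketch with hypothetical intensities, not the authors' procedure:

        import numpy as np

        # Hypothetical calibration standards: known analyte concentrations (wt%),
        # measured analyte line intensities, and Sr internal-standard intensities.
        conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        i_an = np.array([120., 260., 490., 1020., 1980.])
        i_sr = np.array([1000., 1050., 980., 1010., 990.])   # constant SrCO3 addition

        ratio = i_an / i_sr                  # effects common to both lines cancel
        slope, intercept = np.polyfit(conc, ratio, 1)

        # Quantify an unknown from its measured ratio.
        unknown_ratio = 0.75
        print("estimated concentration (wt%):", (unknown_ratio - intercept) / slope)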

  14. Characterization of Primary Standards for Use in the HPLC Analysis of the Procyanidin Content of Cocoa and Chocolate Containing Products

    Directory of Open Access Journals (Sweden)

    Mark J. Payne

    2009-10-01

    Full Text Available This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers (DP = 2) to decamers (DP = 10) for the determination of procyanidins in cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined, and these data were used to determine relative response factors. These response factors were compared with response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicate divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far-reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.
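
    A sketch of the response-factor bookkeeping described here: each standard's peak area is normalized by its purity-corrected mass, and factors are expressed relative to a reference oligomer. All values are hypothetical.

        # Hypothetical HPLC-fluorescence calibration for procyanidin standards.
        standards = {
            # DP: (peak_area, weighed_mass_ug, purity_fraction)
            2: (5.2e5, 10.0, 0.95),
            3: (4.1e5, 10.0, 0.90),
            4: (3.3e5, 10.0, 0.88),
        }

        def response_factor(area, mass_ug, purity):
            return area / (mass_ug * purity)      # area per ug of pure analyte

        rf = {dp: response_factor(*v) for dp, v in standards.items()}
        rrf = {dp: rf[dp] / rf[2] for dp in rf}   # relative to the dimer (DP = 2)
        print(rrf)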

  15. Standard Test Method for Application and Analysis of Helium Accumulation Fluence Monitors for Reactor Vessel Surveillance, E706 (IIIC)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method describes the concept and use of helium accumulation for neutron fluence dosimetry for reactor vessel surveillance. Although this test method is directed toward applications in vessel surveillance, the concepts and techniques are equally applicable to the general field of neutron dosimetry. The various applications of this test method for reactor vessel surveillance are as follows: 1.1.1 Helium accumulation fluence monitor (HAFM) capsules, 1.1.2 Unencapsulated, or cadmium or gadolinium covered, radiometric monitors (RM) and HAFM wires for helium analysis, 1.1.3 Charpy test block samples for helium accumulation, and 1.1.4 Reactor vessel (RV) wall samples for helium accumulation. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  16. Statistical Analysis of a Round-Robin Measurement Survey of Two Candidate Materials for a Seebeck Coefficient Standard Reference Material

    Directory of Open Access Journals (Sweden)

    Lu, Z. Q. J.

    2009-01-01

    Full Text Available In an effort to develop a Standard Reference Material (SRM™) for the Seebeck coefficient, we have conducted a round-robin measurement survey of two candidate materials, undoped Bi2Te3 and Constantan (55 % Cu and 45 % Ni alloy). Measurements were performed in two rounds by twelve laboratories involved in active thermoelectric research, using a number of different commercial and custom-built measurement systems and techniques. In this paper we report the detailed statistical analyses of the interlaboratory measurement results and the statistical methodology for the analysis of irregularly sampled measurement curves in the interlaboratory study setting. Based on these results, we have selected Bi2Te3 as the prototype standard material. Once available, this SRM will be useful for future interlaboratory data comparison and instrument calibrations.

  17. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    Science.gov (United States)

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  18. Towards internationally acceptable standards for food additives and contaminants based on the use of risk analysis

    NARCIS (Netherlands)

    Huggett, A.; Petersen, B.J.; Walker, R.; Fisher, C.E.; Notermans, S.H.W.; Rombouts, F.M.; Abbott, P.; Debackere, M.; Hathaway, S.C.; Hecker, E.F.F.; Knaap, A.G.A.; Kuznesof, P.M.; Meyland, I.; Moy, G.; Narbonne, J.-F.; Paakkanen, J.; Smith, M.R.; Tennant, D.; Wagstaffe, P.; Wargo, J.; Würtzen, G.

    1998-01-01

    Internationally acceptable norms need to incorporate sound science and consistent risk management principles in an open and transparent manner, as set out in the Agreement on the Application of Sanitary and Phytosanitary Measures (the SPS Agreement). The process of risk analysis provides a procedure

  19. Measuring the readability of sustainability reports: A corpus-based analysis through standard formulae and NLP

    NARCIS (Netherlands)

    Smeuninx, N.; De Clerck, B.; Aerts, Walter

    2016-01-01

    This study characterises and problematises the language of corporate reporting along region, industry, genre, and content lines by applying readability formulae and more advanced natural language processing (NLP)-based analysis to a manually assembled 2.75-million-word corpus. Readability formulae
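
    One of the classic readability formulae such corpus studies apply is the Flesch Reading Ease score, 206.835 - 1.015(words/sentences) - 84.6(syllables/words). Whether this study used Flesch specifically is not stated, so the sketch below, with a crude vowel-group syllable counter, is only a generic illustration:

        import re

        def flesch_reading_ease(text: str) -> float:
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            # Crude syllable estimate: count vowel groups, minimum one per word.
            syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
            n = max(1, len(words))
            return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

        print(flesch_reading_ease("We deliver sustainable value to our stakeholders."))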

  20. Mechanical properties test and microstructure analysis of polyoxymethylene (POM) micro injection moulded standard parts

    DEFF Research Database (Denmark)

    Tosello, Guido; Lucchetta, Giovanni; Hansen, Hans Nørgaard

    2009-01-01

    to factorial plans, in which the factors of interest were mould temperature, melt temperature and the dimensional range of the specimen (i.e. macro and micro parts). Microstructure analysis was performed by means of plastography techniques and revealed that high mould and melt temperatures resulted in a thin skin...

  1. A Policy Analysis of Student Attendance Standards Related to State Education Policies

    Science.gov (United States)

    Guilliams, Mary Elizabeth

    2014-01-01

    This paper is a project report of a policy analysis of state attendance information available to public schools. Current state attendance information rarely expands beyond compulsory attendance law. It is vague, non-existent or difficult to find. Research provides strong links between student attendance and achievement. Informed school leaders…

  2. Standard guide for qualification of laboratory analysts for the analysis of nuclear fuel cycle materials

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2003-01-01

    1.1 This guide covers the qualification of analysts to perform chemical analysis or physical measurements of nuclear fuel cycle materials. The guidance is general in that it is applicable to all analytical methods, but must be applied method by method. Also, the guidance is general in that it may be applied to initial qualification or requalification.

  3. Standard review plan for the review of safety analysis reports for nuclear power plants

    International Nuclear Information System (INIS)

    1984-04-01

    Revised information is presented concerning the stress analysis of engineered safety systems; control rod drive systems; reactor core isolation cooling system; residual heat removal system; emergency core cooling system; station service water system; reactor auxiliary cooling water systems; main steam supply system; and condensate and feedwater system.

  4. Visual memory after epilepsy surgery in children: a standardized regression-based analysis of group and individual outcomes.

    Science.gov (United States)

    Meekes, Joost; Braams, Olga B; Braun, Kees P J; Jennekens-Schinkel, Aag; van Rijen, Peter C; Alpherts, Willem C J; Hendriks, Marc P H; van Nieuwenhuizen, Onno

    2014-07-01

    Visual memory is vulnerable to epilepsy surgery in adults, but studies in children suggest no change or small improvements. We investigated visual memory after epilepsy surgery, both group-wise and in individual children, using two techniques to assess change: 1) repeated measures analysis of variance (ANOVA) and 2) an empirically based technique for detecting cognitive change [standardized regression-based (SRB) analysis]. A prospective cohort consisting of 21 children completed comprehensive assessments of memory both before surgery (T0) and 6 (T1), 12 (T2), and 24 months (T3) after surgery. For each patient, two age- and gender-matched controls were assessed with the same tests at the same intervals. Repeated measures ANOVA replicated the results of previous studies reporting no change or minor improvements after surgery. However, group analysis of SRB results eliminated virtually all improvements, indicating that the ANOVA results were confounded by practice effects. Standardized regression-based group results showed that in fact patients scored lower after surgery than would be predicted based on their presurgical performance. Analysis of individual SRB results showed that per visual memory measure, an average of 18% of patients obtained a significantly negative SRB score, whereas, on average, only 2% obtained a significantly positive SRB score. At T3, the number of significantly negative SRB scores outweighed the number of significantly positive SRB scores in 62% of patients. There were no clear associations of clinical variables (including side and site of surgery and postsurgical seizure freedom) with memory outcome. The present analysis revealed that given their individual presurgical functioning, many children obtained disappointing results on some visual memory tests after epilepsy surgery. Comparison of the SRB analysis with ANOVA results emphasizes the importance of empirically based techniques for detecting cognitive effects of epilepsy surgery in
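
    The SRB logic can be sketched as follows: regress the controls' post-surgery scores on their pre-surgery scores, then express each patient's observed post-surgery score as a deviation from the prediction in residual-standard-deviation units. The data below are simulated, not the cohort's:

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical control data: pre-surgery (T0) and follow-up (T1) memory scores.
        pre_c = rng.normal(50, 10, 42)
        post_c = 5 + 0.9 * pre_c + rng.normal(0, 4, 42)   # practice effect built in

        b, a = np.polyfit(pre_c, post_c, 1)               # prediction line from controls
        resid_sd = np.std(post_c - (a + b * pre_c), ddof=2)

        def srb(pre_patient, post_patient):
            """Standardized regression-based change score for one patient."""
            predicted = a + b * pre_patient
            return (post_patient - predicted) / resid_sd

        print(srb(pre_patient=48.0, post_patient=46.0))   # negative => worse than expected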

  5. Solid energy calibration standards for P K-edge XANES: electronic structure analysis of PPh4Br.

    Science.gov (United States)

    Blake, Anastasia V; Wei, Haochuan; Donahue, Courtney M; Lee, Kyounghoon; Keith, Jason M; Daly, Scott R

    2018-03-01

    P K-edge X-ray absorption near-edge structure (XANES) spectroscopy is a powerful method for analyzing the electronic structure of organic and inorganic phosphorus compounds. Like all XANES experiments, P K-edge XANES requires well defined and readily accessible calibration standards for energy referencing so that spectra collected at different beamlines or under different conditions can be compared. This is especially true for ligand K-edge X-ray absorption spectroscopy, which has well established energy calibration standards for Cl (Cs2CuCl4) and S (Na2S2O3·5H2O), but not neighboring P. This paper presents a review of common P K-edge XANES energy calibration standards and an analysis of PPh4Br as a potential alternative. The P K-edge XANES region of commercially available PPh4Br revealed a single, highly resolved pre-edge feature with a maximum at 2146.96 eV. PPh4Br also showed no evidence of photodecomposition when repeatedly scanned over the course of several days. In contrast, we found that PPh3 rapidly decomposes under identical conditions. Density functional theory calculations performed on PPh3 and PPh4+ revealed large differences in the molecular orbital energies that were ascribed to differences in the phosphorus oxidation state (III versus V) and molecular charge (neutral versus +1). Time-dependent density functional theory calculations corroborated the experimental data and allowed the spectral features to be assigned. The first pre-edge feature in the P K-edge XANES spectrum of PPh4Br was assigned to P 1s → P-C π* transitions, whereas those at higher energy were P 1s → P-C σ*. Overall, the analysis suggests that PPh4Br is an excellent alternative to other solid energy calibration standards commonly used in P K-edge XANES experiments.

  6. Application of k0-based internal mono standard instrumental neutron activation analysis method for composition analysis of stainless steel clad sample

    International Nuclear Information System (INIS)

    Acharya, R.; Nair, A.G.C.; Reddy, A.V.R.; Goswami, A.

    2004-01-01

    The k0-based internal mono standard instrumental neutron activation analysis (INAA) method was used for the composition analysis of some irregularly shaped stainless steel (SS) samples of type SS 316M, which is used as fuel cladding in the Indian fast breeder test reactor (FBTR). The method utilizes an in situ relative detection efficiency, derived from γ-rays of the activation products present in the sample, to overcome γ-ray self-attenuation. Samples were neutron activated using the thermal column as well as the core position of the reactor, and the assay of radioactivity was carried out by high-resolution gamma-ray spectrometry. The elements determined were Fe, Cr, Ni, Mo, Mn, Co, Cu, As and W. Since all the major elements (Fe, Cr, Ni, Mo and Mn) were amenable to NAA, the relative elemental concentrations with respect to Fe obtained by this method were converted to their absolute values by mass balance. The results were compared with the specified compositions and found to be satisfactory. In order to validate the results obtained by this standardless approach, subsamples of SS 316M in solution form were analyzed by the prevalent relative and k0 methods of INAA, and the results were found to be in good agreement. The accuracy of the internal mono standard INAA method was evaluated by analyzing an alloy steel certified reference material, CRM 225/1 of British Chemical Standards (BCS).
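
    A sketch of the closing mass-balance step: with every major element measured relative to Fe, the relative values are scaled so that the majors sum to 100 wt%. The ratios below are made up for illustration:

        # Hypothetical concentrations relative to Fe from in situ efficiency analysis.
        rel_to_fe = {"Fe": 1.0, "Cr": 0.26, "Ni": 0.19, "Mo": 0.036, "Mn": 0.025}

        # Mass balance: if these elements make up the whole alloy, scale to 100 wt%.
        total = sum(rel_to_fe.values())
        wt_percent = {el: 100.0 * r / total for el, r in rel_to_fe.items()}
        print(wt_percent)   # e.g. Fe ~ 66 wt% for these made-up ratios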

  7. Economic and environmental analysis of standard, high efficiency, rainwater flushed, and composting toilets.

    Science.gov (United States)

    Anand, C; Apul, D S

    2011-03-01

    The current sanitation technology in developed countries is based on diluting human excreta with large volumes of centrally provided potable water. This approach is a poor use of water resources and is also inefficient, expensive, and energy intensive. The goal of this study was to compare the standard sanitation technology (Scenario 1) with alternative technologies that require less or no potable water use in toilets. The alternative technologies considered were high efficiency toilets flushed with potable water (Scenario 2), standard toilets flushed with rainwater (Scenario 3), high efficiency toilets flushed with rainwater (Scenario 4), and composting toilets (Scenario 5). Cost, energy, and carbon implications of these five design scenarios were studied using two existing University of Toledo buildings. The results showed that the alternative systems modeled in Scenarios 2, 4, and 5 were viable options both from an investment and an environmental performance perspective. High efficiency fixtures that use potable water (Scenario 2) are often the most preferred method in high efficiency buildings due to reduced water use and associated reductions in annual water and wastewater costs. However, the cost, energy, and CO2EE analyses all showed that Scenarios 4 and 5 were preferable over Scenario 2. Cost payback periods of Scenarios 2, 4 and 5 were less than 10 years; in the future, increases in the price of water and wastewater services would further decrease the payback periods. The centralized water and wastewater services have high carbon footprints; therefore, if carbon footprint reduction is a primary goal of a building complex, alternative technologies that require less potable water and generate less wastewater can largely reduce the carbon footprint. High efficiency fixtures flushed with rainwater (Scenario 4) and composting toilets (Scenario 5) required considerably less energy than the direct energy demands of buildings. However, the annual carbon footprint of these technologies

  8. Population Health Metrics Research Consortium gold standard verbal autopsy validation study: design, implementation, and development of analysis datasets

    Directory of Open Access Journals (Sweden)

    Ohno Summer

    2011-08-01

    Full Text Available Abstract Background Verbal autopsy methods are critically important for evaluating the leading causes of death in populations without adequate vital registration systems. With a myriad of analytical and data collection approaches, it is essential to create a high quality validation dataset from different populations to evaluate comparative method performance and make recommendations for future verbal autopsy implementation. This study was undertaken to compile a set of strictly defined gold standard deaths for which verbal autopsies were collected to validate the accuracy of different methods of verbal autopsy cause of death assignment. Methods Data collection was implemented in six sites in four countries: Andhra Pradesh, India; Bohol, Philippines; Dar es Salaam, Tanzania; Mexico City, Mexico; Pemba Island, Tanzania; and Uttar Pradesh, India. The Population Health Metrics Research Consortium (PHMRC) developed stringent diagnostic criteria including laboratory, pathology, and medical imaging findings to identify gold standard deaths in health facilities, as well as an enhanced verbal autopsy instrument based on World Health Organization (WHO) standards. A cause list was constructed based on the WHO Global Burden of Disease estimates of the leading causes of death, the potential to identify unique signs and symptoms, and the likely existence of sufficient medical technology to ascertain gold standard cases. Blinded verbal autopsies were collected on all gold standard deaths. Results Over 12,000 verbal autopsies on deaths with gold standard diagnoses were collected (7,836 adults, 2,075 children, 1,629 neonates, and 1,002 stillbirths). Difficulties in finding sufficient cases to meet gold standard criteria, as well as problems with misclassification for certain causes, meant that the target list of causes for analysis was reduced to 34 for adults, 21 for children, and 10 for neonates, excluding stillbirths. To ensure strict independence for the validation of

  9. The tsunami probabilistic risk assessment (PRA). Example of accident sequence analysis of tsunami PRA according to the standard for procedure of tsunami PRA for nuclear power plants

    International Nuclear Information System (INIS)

    Ohara, Norihiro; Hasegawa, Keiko; Kuroiwa, Katsuya

    2013-01-01

    After the Fukushima Daiichi nuclear power plant (NPP) accident, a standard for the procedure of tsunami PRA for NPPs was established by the Standardization Committee of the AESJ. An industry group has been conducting tsunami PRA analysis for PWRs based on the standard, in cooperation with electric utilities. This article introduces an overview of the standard and examples of accident sequence analysis of tsunami PRA studied by the industry group according to the standard. The standard consists of (1) investigation of the NPP's composition, characteristics and site information, (2) selection of components relevant to tsunami PRA and initiating events and identification of accident sequences, (3) evaluation of tsunami hazards, (4) fragility evaluation of buildings and components, and (5) evaluation of accident sequences. Based on the evaluation, countermeasures for further improvement of safety against tsunami can be identified by sensitivity analysis. (T. Tanaka)

  10. Analysis of high burnup fuel behavior under control rod ejection accident in Korea standard nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chan Bok; Lee, Chung Chan; Kim, Oh Hwan; Kim, Jong Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-07-01

    Test results of high burnup fuel behavior under RIA (reactivity insertion accident) indicated that fuel might fail at a fuel enthalpy lower than that in the current fuel failure criteria. A revised fuel failure criterion was derived by conservative assumptions and analysis of fuel failure mechanisms, and applied to the analysis of the control rod ejection accident in the 1,000 MWe Korea standard PWR. Except that a three-dimensional core analysis was performed instead of the conventional zero-dimensional analysis, all the other conservative assumptions were kept. Analysis results showed that less than one percent of the fuel rods in the core failed, which was much less than the conventional fuel failure fraction, 9.8 %, even though a newly derived fuel failure criterion (fuel failure occurs at a power level lower than that in the current fuel failure criteria) was applied, since the transient fuel rod power level was significantly decreased by analyzing the transient core three-dimensionally. Therefore, it can be said that the results of the radiological consequence analysis for the control rod ejection accident in the FSAR, where the fuel failure fraction was assumed to be 9.8 %, are still bounding. 18 tabs., 48 figs., 39 refs. (Author).

  11. Molecular identification of mumps virus genotypes from clinical samples: standardized method of analysis.

    Science.gov (United States)

    Palacios, G; Jabado, O; Cisterna, D; de Ory, F; Renwick, N; Echevarria, J E; Castellanos, A; Mosquera, M; Freire, M C; Campos, R H; Lipkin, W I

    2005-04-01

    A sensitive nested reverse transcription-PCR assay, targeting a short fragment of the gene encoding the small hydrophobic protein (SH gene), was developed to allow rapid characterization of mumps virus in clinical samples. The sensitivity and specificity of the assay were established using representative genotypes A, B, C, D, E, and F. Mumps virus RNA was characterized directly from cerebrospinal fluid (CSF) samples and in extracts of mumps virus isolates from patients with various clinical syndromes. Direct sequencing of products and subsequent phylogenetic analysis enabled genetic classification. A simple web-based system of sequence analysis was established. The study also allowed characterization of mumps virus strains from Argentina as part of a new subgenotype. This PCR assay for characterization of mumps infections coupled to a web-based analytical program provides a rapid method for identification of known and novel strains.

  12. Standard test methods for chemical, mass spectrometric, and spectrochemical analysis of nuclear-grade boron carbide

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2004-01-01

    1.1 These test methods cover procedures for the chemical, mass spectrometric, and spectrochemical analysis of nuclear-grade boron carbide powder and pellets to determine compliance with specifications. 1.2 The analytical procedures appear in the following order (sections): Total Carbon by Combustion and Gravimetry, 7-17; Total Boron by Titrimetry, 18-28; Isotopic Composition by Mass Spectrometry, 29-38; Chloride and Fluoride Separation by Pyrohydrolysis, 39-45; Chloride by Constant-Current Coulometry, 46-54; Fluoride by Ion-Selective Electrode, 55-63; Water by Constant-Voltage Coulometry, 64-72; Impurities by Spectrochemical Analysis, 73-81; Soluble Boron by Titrimetry, 82-95; Soluble Carbon by a Manometric Measurement, 96-105; Metallic Impurities by a Direct Reader Spectrometric Method, 106-114.

  13. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis.

    Science.gov (United States)

    Rodríguez-Barranco, Miguel; Tobías, Aurelio; Redondo, Daniel; Molina-Portillo, Elena; Sánchez, María José

    2017-03-17

    Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all, studies, the effects need to be homogenized. We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow including all results in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that significantly differ from the normal distribution. We illustrate this procedure with an application to a meta-analysis on the potential effects on neurodevelopment in children exposed to arsenic and manganese. The procedure proposed has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.
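
    The core of such homogenization is a change-of-units identity. As one generic example (not the authors' full formula set): if a study fit log(Y) = a + b·log(X), the relative change in Y for a c% increase in X is (1 + c/100)^b - 1. A minimal sketch:

        def relative_change_log_log(b: float, pct_increase_x: float) -> float:
            """Percent change in Y for a c% increase in X under log(Y) = a + b*log(X)."""
            return 100.0 * ((1.0 + pct_increase_x / 100.0) ** b - 1.0)

        # Hypothetical coefficient: b = -0.15 (e.g. a cognitive score vs. an exposure).
        print(relative_change_log_log(b=-0.15, pct_increase_x=50.0))  # about -5.9 %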

  14. Standardizing effect size from linear regression models with log-transformed variables for meta-analysis

    Directory of Open Access Journals (Sweden)

    Miguel Rodríguez-Barranco

    2017-03-01

    Full Text Available Abstract Background Meta-analysis is very useful to summarize the effect of a treatment or a risk factor for a given disease. Often studies report results based on log-transformed variables in order to achieve the principal assumptions of a linear regression model. If this is the case for some, but not all, studies, the effects need to be homogenized. Methods We derived a set of formulae to transform absolute changes into relative ones, and vice versa, to allow including all results in a meta-analysis. We applied our procedure to all possible combinations of log-transformed independent or dependent variables. We also evaluated it in a simulation based on two variables either normally or asymmetrically distributed. Results In all the scenarios, and based on different change criteria, the effect size estimated by the derived set of formulae was equivalent to the real effect size. To avoid biased estimates of the effect, this procedure should be used with caution in the case of independent variables with asymmetric distributions that significantly differ from the normal distribution. We illustrate this procedure with an application to a meta-analysis on the potential effects on neurodevelopment in children exposed to arsenic and manganese. Conclusions The procedure proposed has been shown to be valid and capable of expressing the effect size of a linear regression model based on different change criteria in the variables. Homogenizing the results from different studies beforehand allows them to be combined in a meta-analysis, independently of whether the transformations had been performed on the dependent and/or independent variables.

  15. Standardization of Licorice and TCM Formulations Using Eastern Blot Fingerprinting Analysis

    Directory of Open Access Journals (Sweden)

    Yukihiro Shoyama

    2013-01-01

    Full Text Available To prepare the antiglycyrrhizin (GC) monoclonal antibody (MAb), GC was treated with NaIO4, resulting in an aldehyde which can be combined with a carrier protein. The antigen conjugate was analyzed by matrix-assisted laser desorption/ionization TOF mass spectrometry to determine the number of haptens in the conjugate. Anti-GC MAb was prepared from a hybridoma, established by fusing spleen cells producing anti-GC MAb with myeloma cells after immunization. The TCM and licorice extracts were developed by TLC and blotted onto a polyvinylidene difluoride (PVDF) membrane. The membrane was treated with NaIO4 and protein, an enzyme-labeled secondary MAb was applied, and finally the substrate was added. A clear spot appeared on the PVDF membrane, identifying GC against a background containing a large amount of impurities. In eastern blotting, the GC molecule is divided into two functions: the aglycone part is recognized as an epitope, and the sugar moiety can be bound to the membrane. The specific reactivity of the sugar moiety in the GC molecule against anti-GC MAb might be modified by the NaIO4 treatment on the membrane, because glycyrrhetic acid 3-O-glucuronide can be stained although the cross-reactivity is only 4.3%. Eastern blotting for GC can not only be applied for the standardization of licorice and TCM, but can also be extended to other bioactive products.

  16. A retrospective analysis of benefits and impacts of U.S. renewable portfolio standards

    Energy Technology Data Exchange (ETDEWEB)

    Barbose, Galen; Wiser, Ryan; Heeter, Jenny; Mai, Trieu; Bird, Lori; Bolinger, Mark; Carpenter, Alberta; Heath, Garvin; Keyser, David; Macknick, Jordan; Mills, Andrew; Millstein, Dev

    2016-09-01

    As states consider revising or developing renewable portfolio standards (RPS), they are evaluating policy costs, benefits, and other impacts. We present the first U.S. national-level assessment of state RPS program benefits and impacts, focusing on new renewable electricity resources used to meet RPS compliance obligations in 2013. In our central-case scenario, reductions in life-cycle greenhouse gas emissions from displaced fossil fuel-generated electricity resulted in $2.2 billion of global benefits. Health and environmental benefits from reductions in criteria air pollutants (sulfur dioxide, nitrogen oxides, and particulate matter 2.5) were even greater, estimated at $5.2 billion in the central case. Further benefits accrued in the form of reductions in water withdrawals and consumption for power generation. Finally, although best considered resource transfers rather than net societal benefits, new renewable electricity generation used for RPS compliance in 2013 also supported nearly 200,000 U.S.-based gross jobs and reduced wholesale electricity prices and natural gas prices, saving consumers a combined $1.3-$4.9 billion. In total, the estimated benefits and impacts well exceed previous estimates of RPS compliance costs.

  17. Comparative analysis of photograph-based clinical goniometry to standard techniques.

    Science.gov (United States)

    Crasto, Jared A; Sayari, Arash J; Gray, Robert R-L; Askari, Morad

    2015-06-01

    Assessment of joint range of motion (ROM) is an accepted evaluation of disability as well as an indicator of recovery from musculoskeletal injuries. Many goniometric techniques have been described to measure ROM, with variable validity due to inter-rater reliability. In this report, we assessed the validity of photograph-based goniometry in the measurement of ROM and its inter-rater reliability, and compared it to two other commonly used techniques. We examined three methods for measuring ROM in the upper extremity: manual goniometry (MG), visual estimation (VE), and photograph-based goniometry (PBG). Eight motions of the upper extremity were measured in 69 participants at an academic medical center. We found visual estimation and photograph-based goniometry to be clinically valid when tested against manual goniometry (r avg. 0.58, range 0.28 to 0.87). Photograph-based measurements afforded a satisfactory degree of inter-rater reliability (ICC avg. 0.77, range 0.28 to 0.96). Our study supports photograph-based goniometry as the new standard goniometric technique, as it has been clinically validated and is performed with greater consistency and better inter-rater reliability when compared with manual goniometry. It also allows for better documentation of measurements and potential incorporation into medical records, in direct contrast to visual estimation.

  18. Monte Carlo analysis of the Neutron Standards Laboratory of the CIEMAT

    International Nuclear Information System (INIS)

    Vega C, H. R.; Mendez V, R.; Guzman G, K. A.

    2014-10-01

    The neutron field produced by the calibration sources of the Neutron Standards Laboratory of the Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas (CIEMAT) was characterized by means of Monte Carlo methods. The laboratory has two neutron calibration sources, 241AmBe and 252Cf, which are stored in a water pool and are placed on the calibration bench using remotely controlled positioning systems. To characterize the neutron field, a three-dimensional model of the room was built, including the stainless steel bench, the irradiation table and the storage pool. The source model included the double steel encapsulation as cladding. In order to determine the effect produced by the presence of the different components of the room, the neutron spectra, the total flux and the ambient dose equivalent rate at 100 cm from the source were evaluated during the characterization. The presence of the walls, floor and ceiling of the room causes the largest modification of the spectra and of the integral values of the flux and the ambient dose equivalent rate. (Author)

  19. PhenStat: A Tool Kit for Standardized Analysis of High Throughput Phenotypic Data.

    Directory of Open Access Journals (Sweden)

    Natalja Kurbatova

    Full Text Available The lack of reproducibility with animal phenotyping experiments is a growing concern among the biomedical community. One contributing factor is the inadequate description of statistical analysis methods that prevents researchers from replicating results even when the original data are provided. Here we present PhenStat, a freely available R package that provides a variety of statistical methods for the identification of phenotypic associations. The methods have been developed for high throughput phenotyping pipelines implemented across various experimental designs, with an emphasis on managing temporal variation. PhenStat is targeted at two user groups: small-scale users who wish to interact with and test data from large resources, and large-scale users who require an automated statistical analysis pipeline. The software provides guidance to the user for selecting appropriate analysis methods based on the dataset and is designed to allow for additions and modifications as needed. The package was tested on mouse and rat data and is used by the International Mouse Phenotyping Consortium (IMPC). By providing raw data and the version of PhenStat used, resources like the IMPC give users the ability to replicate and explore results within their own computing environment.

  20. Quantitative portable gamma spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Enghauser, M.W.; Ebara, S.B.

    1997-01-01

    Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. The method can be used with various sample and shielding configurations where analysis on a laboratory based gamma spectroscopy system is impractical. The portable gamma spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. Lastly, the method is only applicable to nuclides which emit gamma rays and cannot be used for pure beta emitters. In addition, if sample self absorption and shielding is significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma rays. The following presents the analysis technique and presents verification results demonstrating the accuracy of the method
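
    The source does not spell out its MDA formula; a common choice is Currie's approximation, MDA = (2.71 + 4.65√B)/(ε·γ·t), which the sketch below implements with hypothetical inputs:

        import math

        def mda_currie(background_counts: float, efficiency: float,
                       gamma_yield: float, live_time_s: float) -> float:
            """Minimum Detectable Activity (Bq), Currie's approximation."""
            ld = 2.71 + 4.65 * math.sqrt(background_counts)   # detection limit, counts
            return ld / (efficiency * gamma_yield * live_time_s)

        # Hypothetical: 400 background counts, 1.2% absolute efficiency,
        # 85% gamma emission probability, 600 s count.
        print(f"MDA = {mda_currie(400, 0.012, 0.85, 600):.2f} Bq")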

  1. Food certification based on isotopic analysis, according to the European standards

    International Nuclear Information System (INIS)

    Costinel, Diana; Ionete, Roxana Elena; Vremera, Raluca; Stanciu, Vasile; Iordache, Andreea

    2007-01-01

    Full text: Under current EU research projects, several public research institutions, universities and private companies are collaborating to develop new methods of evidencing food adulteration and consequently assessing food safety. The use of mass spectrometry (MS) to determine the ratio of stable isotopes in bio-molecules now provides the means to prove the natural origin of a wide variety of foodstuffs - and therefore to identify fraud and consequently to reject improper products or certify food quality. Isotope analysis has been officially adopted by the EU as a means of controlling the adulteration of some foodstuffs. A network of research organizations has developed the use of isotopic analysis to support training and technology transfer and to encourage uptake of the technique. Proficiency-testing schemes were also developed to ensure the correct use of isotopic techniques in national testing laboratories. In addition, ensuring food quality and safety is a requirement which must be fulfilled for integration in the EU. The present paper emphasizes the isotopic analysis of the D/H, 18O/16O and 13C/12C ratios in food (honey, juice, wines) using a new-generation isotope ratio MS, a Finnigan Delta V Plus, coupled to three flexible continuous-flow preparation devices (GasBench II, TC Elemental Analyser and GC-C/TC). (authors)

  2. Quantitative portable gamma-spectroscopy sample analysis for non-standard sample geometries

    International Nuclear Information System (INIS)

    Ebara, S.B.

    1998-01-01

    Utilizing a portable spectroscopy system, a quantitative method for analysis of samples containing a mixture of fission and activation products in nonstandard geometries was developed. This method was not developed to replace other methods such as Monte Carlo or discrete ordinates, but rather to offer an alternative, rapid solution. The method can be used with various sample and shielding configurations where analysis on a laboratory-based gamma-spectroscopy system is impractical. The portable gamma-spectroscopy method involves calibration of the detector and modeling of the sample and shielding to identify and quantify the radionuclides present in the sample. The method utilizes the intrinsic efficiency of the detector and the unattenuated gamma fluence rate at the detector surface per unit activity from the sample to calculate the nuclide activity and Minimum Detectable Activity (MDA). For a complex geometry, a computer code written for shielding applications (MICROSHIELD) is utilized to determine the unattenuated gamma fluence rate per unit activity at the detector surface. The method is only applicable to nuclides which emit gamma-rays and cannot be used for pure beta or alpha emitters. In addition, if sample self-absorption and shielding are significant, the attenuation will result in high MDAs for nuclides which solely emit low-energy gamma-rays. The following presents the analysis technique and verification results using actual experimental data, rather than comparisons to other approximations such as Monte Carlo techniques, to demonstrate the accuracy of the method given a known geometry and source term. (author)
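
    A minimal sketch, using generic textbook formulas and made-up inputs, of the activity and MDA arithmetic the abstract describes; in practice the fluence rate per unit activity would come from a shielding code such as MICROSHIELD.

        # Sketch: nuclide activity and Currie-style MDA from a portable
        # gamma-spectroscopy measurement (all numeric inputs hypothetical).
        import math

        net_rate = 12.4        # net photopeak count rate, counts/s
        eps_intrinsic = 0.21   # detector intrinsic peak efficiency
        area = 19.6            # detector face area, cm^2
        phi_per_bq = 2.3e-6    # unattenuated fluence rate at the detector
                               # surface per unit activity, gammas/cm^2/s/Bq

        # Expected count rate per Bq = efficiency x fluence rate x area
        cps_per_bq = eps_intrinsic * phi_per_bq * area
        activity_bq = net_rate / cps_per_bq
        print(f"activity = {activity_bq:.3e} Bq")

        # Currie MDA for counting time t (s) with B background counts
        t, B = 600.0, 450.0
        mda_bq = (2.71 + 4.65 * math.sqrt(B)) / (cps_per_bq * t)
        print(f"MDA = {mda_bq:.3e} Bq")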

  3. Analysis and estimation of the readiness of students of Hebei province for taking the state standards of physical preparedness.

    Directory of Open Access Journals (Sweden)

    Van Likhua

    2011-04-01

    Full Text Available The results of an analysis of the self-appraisal of personal readiness among students of non-specialized higher educational establishments of Hebei province are presented. The research uses questionnaire data from 1000 students of five higher educational establishments. Criteria of student readiness are formulated and the cross-correlation links between the criteria are established; the analysis reveals an interconnection between the personal and public activity of students which influences their readiness for taking the standards.

  4. Classification of Magnetic Nanoparticle Systems—Synthesis, Standardization and Analysis Methods in the NanoMag Project

    Directory of Open Access Journals (Sweden)

    Sara Bogren

    2015-08-01

    Full Text Available This study presents a classification of different magnetic single- and multi-core particle systems using their measured dynamic magnetic properties together with their nanocrystal and particle sizes. The dynamic magnetic properties are measured with AC (dynamic) susceptometry and magnetorelaxometry, and the size parameters are determined from electron microscopy and dynamic light scattering. Using these methods, we also show that the nanocrystal size and the particle morphology determine the dynamic magnetic properties of both single- and multi-core particles. The presented results are obtained from the four-year EU NMP FP7 project NanoMag, which is focused on the standardization of analysis methods for magnetic nanoparticles.

  5. A meta-analysis of hypnosis for chronic pain problems: a comparison between hypnosis, standard care, and other psychological interventions.

    Science.gov (United States)

    Adachi, Tomonori; Fujino, Haruo; Nakae, Aya; Mashimo, Takashi; Sasaki, Jun

    2014-01-01

    Hypnosis is regarded as an effective treatment for psychological and physical ailments. However, its efficacy as a strategy for managing chronic pain has not been assessed through meta-analytical methods. The objective of the current study was to conduct a meta-analysis to assess the efficacy of hypnosis for managing chronic pain. When compared with standard care, hypnosis provided a moderate treatment benefit. Hypnosis also showed a moderately superior effect compared to other psychological interventions for a nonheadache group. The results suggest that hypnosis is efficacious for managing chronic pain. Given that large heterogeneity among the included studies was identified, the nature of hypnosis treatment is further discussed.

  6. Implementation of a management system in accordance with IAEA GS-R-3 Standard. A gap analysis

    International Nuclear Information System (INIS)

    Dicianu, I.; Oprea, M.

    2009-01-01

    Full text: The design and implementation of an Integrated Management System at SNN SA Headquarters has become necessary as the CNCAN norms are already under revision to comply with the IAEA GS-R-3 standard. The purpose of this analysis is to draft a project for the transition from a Quality Management System (QMS) to an Integrated Management System (IMS) complying with GS-R-3 requirements. Four steps were identified for developing this project. STEP 1 - To justify the necessity of the IMS implementation to meet the SNN SA Headquarters top management commitments. The requirements for implementing an IMS are analyzed and a comprehensive document is issued to (and maybe discussed with) the SNN General Director in order to obtain the top management adherence/commitment to the project implementation. The document will show the strong and the weak points which should be considered in developing the project. The references for the project are: IAEA Safety Standard GS-R-3 'The Management System for Facilities and Activities'; ISO 14001:2004 Standard 'Environmental Management System Requirements'; OHSAS 18001:2007 Standard 'Occupational Health and Safety Management Systems. Requirements'. Also considered are IAEA Safety Guide GS-G-3.1 'Application of the Management System for Facilities and Activities', IAEA Draft Safety Guide DS-349 'Application of the Management System for Nuclear Facilities', and the CNCAN norms (as they will be revised). STEP 2 - The performance of a comparative analysis of the requirements of GS-R-3, ISO 14001 and OHSAS 18001 versus the provisions of the QMS already implemented in SNN; this analysis is shown as a comparative table. STEP 3 - Identification of the IMS processes. An overall analysis of the current processes described in the SNN QMS Manual is performed and, based on this, the additional processes that have to be documented for the proper implementation of an IMS are identified.

  7. A Comparison of AOP Classification Based on Difficulty, Importance, and Frequency by Cluster Analysis and Standardized Mean

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Jung, Wondea

    2014-01-01

    In Korea, there are plants that have more than one hundred kinds of abnormal operation procedures (AOPs). Therefore, operators have started to recognize the importance of classifying the AOPs. They should pay attention to those AOPs that require emergency measures against an abnormal status that has a more serious effect on plant safety and/or occurs more often. We suggested a measure of prioritizing AOPs for training purposes based on difficulty, importance, and frequency. A DIF analysis based on how difficult a task is, how important it is, and how frequently it occurs is a well-known method of assessing performance and prioritizing training needs and planning. We used an SDIF-mean (Standardized DIF-mean) to prioritize AOPs in a previous paper. For the SDIF-mean, we standardized the three kinds of data respectively. The results of this research will be utilized not only to understand the AOP characteristics at a job analysis level but also to develop an effective AOP training program. The purpose of this paper is to perform a cluster analysis for an AOP classification and to compare the results of the cluster analysis with those obtained by a standardized mean based on difficulty, importance, and frequency. In this paper, we categorized AOPs into three groups by a cluster analysis based on D, I, and F; clustering is the classification of similar objects into groups so that each group shares some common characteristics. In addition, we compared the result of the cluster analysis in this paper with the classification result by the SDIF-mean in the previous paper. From the comparison, we found that a reevaluation can be required to assign a training interval for those AOPs of group C' in the previous paper that have a lower SDIF-mean. The reason for this is that some of the AOPs of group C' have quite high D and I values while they have the lowest frequencies. From an educational point of view, AOPs in the group which have the highest difficulty and importance, but
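
    A minimal sketch of the SDIF-mean and clustering idea described above, using illustrative ratings rather than the paper's data.

        # Sketch: standardize difficulty (D), importance (I) and frequency
        # (F), compute an SDIF-mean per AOP, and cluster into three groups.
        import numpy as np
        from scipy.stats import zscore
        from sklearn.cluster import KMeans

        # rows = AOPs, columns = D, I, F (hypothetical ratings)
        X = np.array([[4.1, 4.5, 0.2],
                      [2.0, 3.1, 5.0],
                      [4.8, 4.9, 0.1],
                      [1.2, 2.0, 3.5],
                      [3.3, 3.8, 1.0]])

        Z = zscore(X, axis=0)        # standardize each of D, I, F
        sdif_mean = Z.mean(axis=1)   # SDIF-mean per AOP
        groups = KMeans(n_clusters=3, n_init=10,
                        random_state=0).fit_predict(Z)
        for i, (s, g) in enumerate(zip(sdif_mean, groups)):
            print(f"AOP {i}: SDIF-mean = {s:+.2f}, cluster = {g}")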

  8. Simplified Symptom Pattern Method for verbal autopsy analysis: multisite validation study using clinical diagnostic gold standards

    Directory of Open Access Journals (Sweden)

    Lozano Rafael

    2011-08-01

    Full Text Available Abstract Background Verbal autopsy can be a useful tool for generating cause of death data in data-sparse regions around the world. The Symptom Pattern (SP Method is one promising approach to analyzing verbal autopsy data, but it has not been tested rigorously with gold standard diagnostic criteria. We propose a simplified version of SP and evaluate its performance using verbal autopsy data with accompanying true cause of death. Methods We investigated specific parameters in SP's Bayesian framework that allow for its optimal performance in both assigning individual cause of death and in determining cause-specific mortality fractions. We evaluated these outcomes of the method separately for adult, child, and neonatal verbal autopsies in 500 different population constructs of verbal autopsy data to analyze its ability in various settings. Results We determined that a modified, simpler version of Symptom Pattern (termed Simplified Symptom Pattern, or SSP performs better than the previously-developed approach. Across 500 samples of verbal autopsy testing data, SSP achieves a median cause-specific mortality fraction accuracy of 0.710 for adults, 0.739 for children, and 0.751 for neonates. In individual cause of death assignment in the same testing environment, SSP achieves 45.8% chance-corrected concordance for adults, 51.5% for children, and 32.5% for neonates. Conclusions The Simplified Symptom Pattern Method for verbal autopsy can yield reliable and reasonably accurate results for both individual cause of death assignment and for determining cause-specific mortality fractions. The method demonstrates that verbal autopsies coupled with SSP can be a useful tool for analyzing mortality patterns and determining individual cause of death from verbal autopsy data.

  9. A Retrospective Analysis of the Benefits and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, Ryan H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    This report, the second in the series, analyzes historical benefits and impacts of all state RPS policies, in aggregate, employing a consistent and well-vetted set of methods and data sets. The analysis focuses on three specific benefits: greenhouse gas emissions, air pollution, and water use. It also analyzes three other impacts: gross job additions, wholesale electricity market price suppression, and natural gas price suppression. These are an important subset, but by no means a comprehensive set, of all possible effects associated with RPS policies. These benefits and impacts are also subject to many uncertainties, which are described and, to the extent possible, quantified within the report.

  10. Can anthropometry measure gender discrimination? An analysis using WHO standards to assess the growth of Bangladeshi children.

    Science.gov (United States)

    Moestue, Helen

    2009-08-01

    To examine the potential of anthropometry as a tool to measure gender discrimination, with particular attention to the WHO growth standards. Surveillance data collected from 1990 to 1999 were analysed. Height-for-age Z-scores were calculated using three norms: the WHO standards, the 1978 National Center for Health Statistics (NCHS) reference and the 1990 British growth reference (UK90). Setting: Bangladesh. Subjects: boys and girls aged 6-59 months (n = 504 358). The three sets of growth curves provided conflicting pictures of the relative growth of girls and boys by age and over time. Conclusions on sex differences in growth depended also on the method used to analyse the curves, be it according to the shape or the relative position of the sex-specific curves. The shapes of the WHO-generated curves uniquely implied that Bangladeshi girls faltered faster or caught up slower than boys throughout their pre-school years, a finding consistent with the literature. In contrast, analysis of the relative position of the curves suggested that girls had higher WHO Z-scores than boys below 24 months of age. Further research is needed to help establish whether and how the WHO international standards can measure gender discrimination in practice, which continues to be a serious problem in many parts of the world.
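
    A minimal sketch of the Z-score computation underlying such comparisons, using the LMS form employed by modern growth references; the L, M, S values below are placeholders, not actual WHO table entries.

        # Sketch: height-for-age Z-score from LMS reference values.
        from math import log

        def haz(height_cm: float, L: float, M: float, S: float) -> float:
            """Z = ((x/M)**L - 1) / (L*S); the log form handles L == 0."""
            if L == 0:
                return log(height_cm / M) / S
            return ((height_cm / M) ** L - 1.0) / (L * S)

        # Example: an 82.0 cm child against placeholder reference values.
        print(f"HAZ = {haz(82.0, L=1.0, M=85.7, S=0.035):.2f}")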

  11. Analysis of Possibilities of Detecting the Manipulation of Financial Statements in Terms of the IFRS and Czech Accounting Standards

    Directory of Open Access Journals (Sweden)

    Zita Drábková

    2015-01-01

    Full Text Available The main objective of financial statements is to give information. The diversity of interests and objectives of the individual groups of users and creators of financial statements presents a risk of manipulation of financial statements in the context of the true and fair view as defined in the national accounting legislation. The paper is concerned with the different possibilities of detecting the manipulation of financial statements in terms of the Czech Accounting Standards and IFRS. The paper analyzes selected risk-detection models for the manipulation of financial statements using creative accounting methods, off-balance-sheet financing methods and accounting fraud in specific case studies of a selected accounting unit under the Czech Accounting Standards. Based on the analysis and a comparison of its results, the paper presents and evaluates the options available to users of financial statements for evaluating the risk of manipulation of financial statements beyond the scope of a fair and true view. The evaluation further includes a comparison of the uses of these models with respect to the International Financial Reporting Standards.

  12. Evaluation of a standard breast tangent technique: a dose-volume analysis of tangential irradiation using three-dimensional tools

    International Nuclear Information System (INIS)

    Krasin, Matthew; McCall, Anne; King, Stephanie; Olson, Mary; Emami, Bahman

    2000-01-01

    Purpose: A thorough dose-volume analysis of a standard tangential radiation technique has not been published. We evaluated the adequacy of a tangential radiation technique in delivering dose to the breast and regional lymphatics, as well as dose delivered to underlying critical structures. Methods and Materials: Treatment plans of 25 consecutive women with breast cancer undergoing lumpectomy and adjuvant breast radiotherapy were studied. Patients underwent two-dimensional (2D) treatment planning followed by treatment with standard breast tangents. These 2D plans were reconstructed without modification on our three-dimensional treatment planning system and analyzed with regard to dose-volume parameters. Results: Adequate coverage of the breast (defined as 95% of the target receiving at least 95% of the prescribed dose) was achieved in 16 of 25 patients, with all patients having at least 85% of the breast volume treated to 95% of the prescribed dose. Only 1 patient (4%) had adequate coverage of the Level I axilla, and no patient had adequate coverage of the Level II axilla, Level III axilla, or the internal mammary lymph nodes. Conclusion: Three-dimensional treatment planning is superior in quantification of the dose received by the breast, regional lymphatics, and critical structures. The standard breast tangent technique delivers an adequate dose to the breast but does not therapeutically treat the regional lymph nodes in the majority of patients. If coverage of the axilla or internal mammary lymph nodes is desired, alternate beam arrangements or treatment fields will be necessary

  13. Comparative analysis of JKR Sarawak form of contract and Malaysia Standard form of building contract (PWD203A)

    Science.gov (United States)

    Yunus, A. I. A.; Muhammad, W. M. N. W.; Saaid, M. N. F.

    2018-04-01

    Standard forms of contract are normally used in the Malaysian construction industry to establish the legal relation between contracting parties. Generally, most Malaysian federal government construction projects use PWD203A, a standard form of contract to be used where Bills of Quantities form part of the contract, issued by the Public Works Department (PWD/JKR). On the other hand, in Sarawak, the largest state in Malaysia, the state government has issued its own standard form of contract, namely the JKR Sarawak Form of Contract 2006. Even though both forms have been used widely in the construction industry, there is still a lack of understanding of both forms. The aim of this paper is to identify the significant provisions in both forms of contract. Document analysis has been adopted to conduct an in-depth review of both forms. It is found that both forms of contract have differences and similarities in several provisions, specifically in matters relating to definitions and general provisions; execution of the works; payments, completion and final account; and delay, dispute resolution and determination.

  14. Standard test methods for chemical and spectrochemical analysis of nuclear-Grade silver-indium-cadmium alloys

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1990-01-01

    1.1 These test methods cover procedures for the chemical and spectrochemical analysis of nuclear grade silver-indium-cadmium (Ag-In-Cd) alloys to determine compliance with specifications. 1.2 The analytical procedures appear in the following order: Silver, Indium, and Cadmium by a Titration Method (Sections 7-15); Trace Impurities by a Carrier-Distillation Spectrochemical Method (Sections 16-22). 1.3 The values stated in SI units are to be regarded as the standard. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. For specific hazard and precautionary statements, see Section 5 and Practices E50. 7.1 This test method is applicable to the determination of silver, indium, and cadmium in alloys of approximately 80 % silver, 15 % indium, and 5 % cadmium used in nuclear reactor control r...

  15. Direct cost analysis of intensive care unit stay in four European countries: applying a standardized costing methodology.

    Science.gov (United States)

    Tan, Siok Swan; Bakker, Jan; Hoogendoorn, Marga E; Kapila, Atul; Martin, Joerg; Pezzi, Angelo; Pittoni, Giovanni; Spronk, Peter E; Welte, Robert; Hakkaart-van Roijen, Leona

    2012-01-01

    The objective of the present study was to measure and compare the direct costs of intensive care unit (ICU) days at seven ICU departments in Germany, Italy, the Netherlands, and the United Kingdom by means of a standardized costing methodology. A retrospective cost analysis of ICU patients was performed from the hospital's perspective. The standardized costing methodology was developed on the basis of the availability of data at the seven ICU departments. It entailed the application of the bottom-up approach for "hotel and nutrition" and the top-down approach for "diagnostics," "consumables," and "labor." Direct costs per ICU day ranged from €1168 to €2025. Even though the distribution of costs varied by cost component, labor was the most important cost driver at all departments. The costs for "labor" amounted to €1629 at department G but were fairly similar at the other departments (€711 ± 115). Direct costs of ICU days vary widely between the seven departments. Our standardized costing methodology could serve as a valuable instrument to compare actual cost differences, such as those resulting from differences in patient case-mix. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  16. Temperature calibration procedure for thin film substrates for thermo-ellipsometric analysis using melting point standards

    Energy Technology Data Exchange (ETDEWEB)

    Kappert, Emiel J.; Raaijmakers, Michiel J.T.; Ogieglo, Wojciech; Nijmeijer, Arian; Huiskes, Cindy; Benes, Nieck E., E-mail: n.e.benes@utwente.nl

    2015-02-10

    Highlights: • Facile temperature calibration method for thermo-ellipsometric analysis. • The melting point of thin films of indium, lead, zinc, and water can be detected by ellipsometry. • In-situ calibration of ellipsometry hot stage, without using any external equipment. • High-accuracy temperature calibration (±1.3 °C). - Abstract: Precise and accurate temperature control is pertinent to studying thermally activated processes in thin films. Here, we present a calibration method for the substrate–film interface temperature using spectroscopic ellipsometry. The method is adapted from temperature calibration methods that are well developed for thermogravimetric analysis and differential scanning calorimetry instruments, and is based on probing a transition temperature. Indium, lead, and zinc could be spread on a substrate, and the phase transition of these metals could be detected by a change in the Ψ signal of the ellipsometer. For water, the phase transition could be detected by a loss of signal intensity as a result of light scattering by the ice crystals. The combined approach allowed for construction of a linear calibration curve with an accuracy of 1.3 °C or lower over the full temperature range.
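
    A minimal sketch of such a melting-point calibration: fit a line mapping hypothetical hot-stage readings onto the known transition temperatures of the standards named in the abstract (ice 0 °C, indium 156.6 °C, lead 327.5 °C, zinc 419.5 °C).

        # Sketch: linear temperature calibration from melting-point
        # standards; the "indicated" readings are hypothetical.
        import numpy as np

        true_T = np.array([0.0, 156.6, 327.5, 419.5])       # known transitions
        indicated_T = np.array([1.8, 160.2, 333.9, 427.6])  # hot-stage readout

        slope, intercept = np.polyfit(indicated_T, true_T, 1)

        def calibrate(reading: float) -> float:
            return slope * reading + intercept

        print(f"T_true = {slope:.4f} * T_indicated + {intercept:.2f}")
        print(f"reading 250.0 -> {calibrate(250.0):.1f} °C")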

  17. Computational Fluid Dynamic Analysis of the VHTR Lower Plenum Standard Problem

    International Nuclear Information System (INIS)

    Johnson, Richard W.; Schultz, Richard R.

    2009-01-01

    The United States Department of Energy is promoting the resurgence of nuclear power in the U.S. for both electrical power generation and production of process heat required for industrial processes such as the manufacture of hydrogen for use as a fuel in automobiles. The DOE project is called the next generation nuclear plant (NGNP) and is based on a Generation IV reactor concept called the very high temperature reactor (VHTR), which will use helium as the coolant at temperatures ranging from 450 °C to perhaps 1000 °C. While computational fluid dynamics (CFD) has not been used for past safety analysis for nuclear reactors in the U.S., it is being considered for safety analysis for existing and future reactors. It is fully recognized that CFD simulation codes will have to be validated for flow physics reasonably close to actual fluid dynamic conditions expected in normal and accident operational situations. To this end, experimental data have been obtained in a scaled model of a narrow slice of the lower plenum of a prismatic VHTR. The present report presents results of CFD examinations of these data to explore potential issues with the geometry, the initial conditions, the flow dynamics and the data needed to fully specify the inlet and boundary conditions; results for several turbulence models are examined. Issues are addressed and recommendations about the data are made

  18. Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: a primer and applications.

    Science.gov (United States)

    Shadish, William R; Hedges, Larry V; Pustejovsky, James E

    2014-04-01

    This article presents a d-statistic for single-case designs that is in the same metric as the d-statistic used in between-subjects designs such as randomized experiments and offers some reasons why such a statistic would be useful in SCD research. The d has a formal statistical development, is accompanied by appropriate power analyses, and can be estimated using user-friendly SPSS macros. We discuss both advantages and disadvantages of d compared to other approaches such as previous d-statistics, overlap statistics, and multilevel modeling. It requires at least three cases for computation and assumes normally distributed outcomes and stationarity, assumptions that are discussed in some detail. We also show how to test these assumptions. The core of the article then demonstrates in depth how to compute d for one study, including estimation of the autocorrelation and the ratio of between case variance to total variance (between case plus within case variance), how to compute power using a macro, and how to use the d to conduct a meta-analysis of studies using single-case designs in the free program R, including syntax in an appendix. This syntax includes how to read data, compute fixed and random effect average effect sizes, prepare a forest plot and a cumulative meta-analysis, estimate various influence statistics to identify studies contributing to heterogeneity and effect size, and do various kinds of publication bias analyses. This d may prove useful for both the analysis and meta-analysis of data from SCDs. Copyright © 2013 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
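
    A minimal sketch of the inverse-variance averaging step of such a meta-analysis, using generic fixed- and random-effects (DerSimonian-Laird) formulas rather than the authors' R syntax; the effect sizes and variances are made up.

        # Sketch: fixed- and random-effects averages of per-study d values.
        import numpy as np

        d = np.array([0.42, 0.80, 0.15, 0.60])  # effect sizes
        v = np.array([0.05, 0.08, 0.04, 0.06])  # sampling variances

        w = 1.0 / v                              # fixed-effect weights
        d_fixed = np.sum(w * d) / np.sum(w)

        # DerSimonian-Laird between-study variance tau^2
        Q = np.sum(w * (d - d_fixed) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(d) - 1)) / c)

        w_re = 1.0 / (v + tau2)                  # random-effects weights
        d_random = np.sum(w_re * d) / np.sum(w_re)
        se_random = np.sqrt(1.0 / np.sum(w_re))
        print(f"fixed = {d_fixed:.3f}, random = {d_random:.3f} "
              f"(SE {se_random:.3f}, tau^2 = {tau2:.3f})")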

  19. Learning and coping strategies versus standard education in cardiac rehabilitation: a cost-utility analysis alongside a randomised controlled trial.

    Science.gov (United States)

    Dehbarez, Nasrin Tayyari; Lynggaard, Vibeke; May, Ole; Søgaard, Rikke

    2015-09-28

    Learning and coping education strategies (LC) were implemented to enhance patient attendance in the cardiac rehabilitation programme. This study assessed the cost-utility of LC compared to standard education (standard) as part of a rehabilitation programme for patients with ischemic heart disease and heart failure. The study was conducted alongside a randomised controlled trial with 825 patients who were allocated to LC or standard rehabilitation and followed for 5 months. The LC approach was identical to the standard approach in terms of physical training and education, but with the addition of individual interviews and weekly team evaluations by professionals. A societal cost perspective including the cost of intervention, health care, informal time and productivity loss was applied. Cost was based on a micro-costing approach for the intervention and national administrative registries for other cost categories. Quality adjusted life years (QALY) were based on SF-6D measurements at baseline, after intervention and follow-up using British preference weights. Multiple imputation was used to handle non-response on the SF-6D. Conventional cost-effectiveness methodology was employed to estimate the net benefit of the LC and to illustrate cost-effectiveness acceptability curves. The statistical analysis was based on means and bootstrapped standard errors. An additional cost of DKK 6,043 (95% CI -5,697; 17,783) and a QALY gain of 0.005 (95% CI -0.001; 0.012) was estimated for LC. However, the better utility scores in both arms were due to higher utility while receiving the intervention rather than to better health after the intervention. The probability that LC would be cost-effective did not exceed 29% for any threshold value of willingness to pay per QALY. An alternative scenario analysis restricted to a health care perspective showed that the probability of cost-effectiveness increased to 62% over the threshold values. The LC was unlikely to be cost-effective within 5
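
    A minimal sketch of how a cost-effectiveness acceptability curve of this kind is typically produced from bootstrapped net monetary benefit; the patient-level data below are simulated, not the trial's.

        # Sketch: bootstrap CEAC from incremental cost and QALY data.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 400
        d_cost = rng.normal(6043, 30000, n)  # per-patient extra cost, DKK
        d_qaly = rng.normal(0.005, 0.05, n)  # per-patient QALY gain

        B = 2000                             # bootstrap replicates
        idx = rng.integers(0, n, size=(B, n))
        mean_cost = d_cost[idx].mean(axis=1)
        mean_qaly = d_qaly[idx].mean(axis=1)

        for wtp in (0.0, 100_000.0, 200_000.0, 300_000.0):
            nmb = wtp * mean_qaly - mean_cost  # net monetary benefit
            p_ce = (nmb > 0).mean()            # P(cost-effective)
            print(f"WTP {wtp:>9,.0f} DKK/QALY: P(cost-effective) = {p_ce:.2f}")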

  20. Image analysis as an adjunct to manual HER-2 immunohistochemical review: a diagnostic tool to standardize interpretation.

    LENUS (Irish Health Repository)

    Dobson, Lynne

    2010-07-01

    AIMS: Accurate determination of HER-2 status is critical to identify patients for whom trastuzumab treatment will be of benefit. Although the recommended primary method of evaluation is immunohistochemistry, numerous reports of variability in interpretation have raised uncertainty about the reliability of results. Recent guidelines have suggested that image analysis could be an effective tool for achieving consistent interpretation, and this study aimed to assess whether this technology has potential as a diagnostic support tool. METHODS AND RESULTS: Across a cohort of 275 cases, image analysis could accurately classify HER-2 status, with 91% agreement between computer-aided classification and the pathology review. Assessment of the continuity of membranous immunoreactivity in addition to intensity of reactivity was critical to distinguish between negative and equivocal cases and enabled image analysis to report a lower referral rate of cases for confirmatory fluorescence in situ hybridization (FISH) testing. An excellent concordance rate of 95% was observed between FISH and the automated review across 136 informative cases. CONCLUSIONS: This study has validated that image analysis can robustly and accurately evaluate HER-2 status in immunohistochemically stained tissue. Based on these findings, image analysis has great potential as a diagnostic support tool for pathologists and biomedical scientists, and may significantly improve the standardization of HER-2 testing by providing a quantitative reference method for interpretation.

  1. Longitudinal analysis of standardized test scores of students in the Science Writing Heuristic approach

    Science.gov (United States)

    Chanlen, Niphon

    The purpose of this study was to examine the longitudinal impacts of the Science Writing Heuristic (SWH) approach on student science achievement measured by the Iowa Test of Basic Skills (ITBS). A number of studies have reported positive impact of an inquiry-based instruction on student achievement, critical thinking skills, reasoning skills, attitude toward science, etc. So far, studies have focused on exploring how an intervention affects student achievement using teacher/researcher-generated measurement. Only a few studies have attempted to explore the long-term impacts of an intervention on student science achievement measured by standardized tests. The students' science and reading ITBS data was collected from 2000 to 2011 from a school district which had adopted the SWH approach as the main approach in science classrooms since 2002. The data consisted of 12,350 data points from 3,039 students. The multilevel model for change with discontinuity in elevation and slope technique was used to analyze changes in student science achievement growth trajectories prior and after adopting the SWH approach. The results showed that the SWH approach positively impacted students by initially raising science achievement scores. The initial impact was maintained and gradually increased when students were continuously exposed to the SWH approach. Disadvantaged students who were at risk of having low science achievement had bigger benefits from experience with the SWH approach. As a result, existing problematic achievement gaps were narrowed down. Moreover, students who started experience with the SWH approach as early as elementary school seemed to have better science achievement growth compared to students who started experiencing with the SWH approach only in high school. The results found in this study not only confirmed the positive impacts of the SWH approach on student achievement, but also demonstrated additive impacts found when students had longitudinal experiences

  2. Regulatory Impact Analysis: Amendments to the National Emission Standards for Hazardous Air Pollutants (NESHAP) and New Source Performance Standards (NSPS) for the Portland Cement Manufacturing Industry Final Report

    Science.gov (United States)

    For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impacts analysis, and documents the RIA methods and results for the 2010 rules.

  3. Standardization of dosimetry and damage analysis work for U.S. LWR, FBR, and MFR development program

    International Nuclear Information System (INIS)

    McElroy, W.N.; Doran, D.G.; Gold, R.; Morgan, W.C.; Grundl, J.A.; McGarry, E.D.; Kam, F.B.K.; Swank, J.H.; Odette, G.R.

    1978-01-01

    The accuracy requirements for various measured/calculated exposure and correlation parameters associated with current dosimetry and damage analysis procedures and practices depend on the accuracy needs of reactor development efforts in testing, design, safety, operations, and surveillance programs. Present state-of-the-art accuracies are estimated to be in the range of ±2 to 30 percent (1 sigma), depending on the particular parameter. There now appears to be international agreement, at least for the long term, that most reactor fuels and materials programs will not be able to accept an uncertainty greater than about ±5 percent (1 sigma). The current status of dosimetry and damage analysis standardization work within the U.S. for the LWR, FBR and MFR programs is reviewed in this paper

  4. Evaluation and standardization of neutron activation analysis according to the K0 method in the RP-10 reactor

    International Nuclear Information System (INIS)

    Montoya R, E.

    1995-01-01

    An irradiation facility of the RP-10 Research Nuclear Reactor has been characterized and standardized for use with the K0 method of neutron activation analysis under the Hoegdahl convention; the behaviour of the method has also been evaluated with regard to the accuracy and precision of the results obtained in the quantitative multi-elemental analysis of several certified reference materials. In order to prove that the analytical method is totally under statistical control, the Heydorn method has been used. It has been verified that the method is exact, precise and reliable for determining aluminium, antimony, arsenic, bromine, calcium, chlorine, copper, magnesium, manganese, sodium, titanium, vanadium, zinc and other elements. The different formalisms employed to calculate the reaction rate integral per nucleus in the activation are also discussed with regard to the use of K0 constants. (author). 58 refs., 18 tabs., 6 figs

  5. Plutonium analysis from controlled-potential coulometry for the certification of the MP3 standard material

    International Nuclear Information System (INIS)

    Ruas, A.; Dalier, V.; Pivato, J.

    2008-01-01

    Coulometry is an assay method in which the quantity of the element analyzed is determined by measuring a quantity of electricity. As a contribution to the certification of the new metal plutonium reference material (MP3), controlled-potential coulometry (CPC) has many advantages: it is a high-accuracy absolute chemical analysis technique. Many studies are now conducted on plutonium solutions, to improve the operating conditions and the current apparatus, for mass determination with a precision of 0.1%. The different preliminary experimental results are discussed and the apparatus described. The coulometry cell assembly comprises a motor connected to a stirrer designed to prevent splashing, an inlet tube for inert gas, three electrodes, and a thermocouple for measuring the temperature. The measuring system includes a potentiostat, a CPU, a calibrated current generator, a temperature indicator and a voltmeter, all maintained at a constant temperature. Current integration is made by electronic components introduced in the potentiostat and the CPU. (authors)

  6. Standard test methods for analysis of sintered gadolinium oxide-uranium dioxide pellets

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2006-01-01

    1.1 These test methods cover procedures for the analysis of sintered gadolinium oxide-uranium dioxide pellets to determine compliance with specifications. 1.2 The analytical procedures appear in the following order: Carbon (Total) by Direct Combustion-Thermal Conductivity Method (C1408, Test Method for Carbon (Total) in Uranium Oxide Powders and Pellets by Direct Combustion-Infrared Detection Method); Chlorine and Fluorine by Pyrohydrolysis Ion-Selective Electrode Method (C1502, Test Method for Determination of Total Chlorine and Fluorine in Uranium Dioxide and Gadolinium Oxide); Gadolinia Content by Energy-Dispersive X-Ray Spectrometry (C1456, Test Method for Determination of Uranium or Gadolinium, or Both, in Gadolinium Oxide-Uranium Oxide Pellets or by X-Ray Fluorescence (XRF)); Hydrogen by Inert Gas Fusion (C1457, Test Method for Determination of Total Hydrogen Content of Uranium Oxide Powders and Pellets by Carrier Gas Extraction); Isotopic Uranium Composition by Multiple-Filament Surface-Ioni...

  7. Who Is Learning About Climate Change in American Schools? An Analysis of Climate Change in Curriculum Standards

    Science.gov (United States)

    Golden, B. W.; Francis, T. K.

    2014-12-01

    This work attempts to answer the question "how much, if any, climate change exists in middle and high school curricula in the United States?" A necessary first step towards this answer involves an examination of Global Climate Change (GCC) coverage in the requisite standards documents. Until recently, each state had its own science framework, with four states (at the time of writing) having already adopted the new Next Generation Science Standards (NGSS) (Achieve, Inc., 2013). This work reports on an analysis of the extent to which GCC exists within the content frameworks of each state, including the NGSS. The analysis began with a word search for such content as "climate change", "greenhouse effect", and "global warming". We then searched through the remainder of the documents in question to understand the nuance of each framework. Each framework was then scored on a scale from zero (no mention of climate change) to four (climate change is explicit, an anthropogenic potential cause is emphasized, and GCC appears within at least one standard of its own). Eighteen states scored a zero, while only five states scored a four. This is particularly troubling in light of recent statements of scientific consensus (AAAS, 2006; 2009; AGU, 2013; IPCC, 2007). While the NGSS scored well, it is unclear what this means in terms of actual students encountering the subject of climate change in actual classrooms. Attention is given to some still-problematic aspects of GCC content, including its placement largely within courses not required for graduation, as well as the murky details of the yet-to-be-determined processes by which individual states will choose to test, or not to test, the subject matter. The authors conclude that as of 2013, there is little evidence that students in most states are required to take courses which include significant aspects of GCC in their curricula.

  8. A confirmatory factor analysis of the WMS-III in a clinical sample with crossvalidation in the standardization sample.

    Science.gov (United States)

    Bradley Burton, D; Ryan, Joseph J; Axelrod, Bradley N; Schellenberger, Tony; Richards, Heather M

    2003-08-01

    A maximum likelihood confirmatory factor analysis (CFA) of the Wechsler Memory Scale-III (WMS-III) was performed by applying LISREL 8 to a general clinical sample (n=281). Analyses were designed to determine which of seven hypothesized oblique factor solutions could best explain memory as measured by the WMS-III. Competing latent variable models were identified in previous studies. Results in the clinical sample were crossvalidated by testing all models in the WMS-III standardization samples (combined n=1,250). Findings in both the clinical and standardization samples supported a four-factor model containing auditory memory, visual memory, working memory, and learning factors. Our analysis differed from that presented in the WMS-III manual and by other authors. We tested our models in a clinical sample and included selected word list subtests in order to test the viability of a learning dimension. Consistent with prior research, we were also unable to empirically support the viability of the immediate and delayed memory indices, despite allowing the error terms between the immediate and delayed memory subtests to correlate.

  9. Standard test method for analysis of uranium and thorium in soils by energy dispersive X-Ray fluorescence spectroscopy

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method covers the energy dispersive X-ray fluorescence (EDXRF) spectrochemical analysis of trace levels of uranium and thorium in soils. Any sample matrix that differs from the general ground soil composition used for calibration (that is, fertilizer or a sample of mostly rock) would have to be calibrated separately to determine the effect of the different matrix composition. 1.2 The analysis is performed after an initial drying and grinding of the sample, and the results are reported on a dry basis. The sample preparation technique used incorporates into the sample any rocks and organic material present in the soil. This test method of sample preparation differs from other techniques that involve tumbling and sieving the sample. 1.3 Linear calibration is performed over a concentration range from 20 to 1000 μg per gram for uranium and thorium. 1.4 The values stated in SI units are to be regarded as the standard. The inch-pound units in parentheses are for information only. 1.5 This standard...

  10. An analysis of Renewable Portfolio Standard policy formulation and its influence on state level energy prices

    Science.gov (United States)

    McCollester, Peter Colin

    Over the past two decades, environmental concern has crept to the forefront of the world policy agenda. This concern has manifested itself differently throughout the world. In the United States, it has come in the form of Renewable Portfolio Standards (RPS), which have become one of the primary policy tools that states use to encourage renewable energy generation. The advent of RPS has spurred intense debate at the federal and state levels, centering on the economic merits of promoting renewable energy generation. Detractors argue that RPS will raise electricity rates, since generation from renewable sources is typically costlier than energy generated from fossil fuels. At this point, the evidence on the relationship between RPS and electricity prices remains unclear. Researchers have attempted to understand this relationship through a variety of means, the most common being regression-based models, which utilize readily available United States Energy Information Administration (US EIA) data and have uncovered a number of important independent variables which are incorporated into the model in this study. Examples include personal income, state population, and deregulation of an energy market. In addition to empirical studies, the National Renewable Energy Laboratory (NREL) has created complex mathematical models which generate scenario projections based on a number of assumptions. While interesting, these are forward-looking tools and as such have not yielded a tremendous amount of insight into the underlying policy mechanics of RPS. A challenge of addressing this topic which is worth noting is that much of the available research analyzing the merits of RPS caters to distinct political or private-sector agendas. The research gathered for this study is comprehensive, and attempts to avoid studies with any clear political, ideological, or financial motivation. Using the insights from previous researchers, this study develops a rigorous fixed effects regression model to

  11. Throughput analysis of the IEEE 802.4 token bus standard under heavy load

    Science.gov (United States)

    Pang, Joseph; Tobagi, Fouad

    1987-01-01

    It has become clear in the last few years that there is a trend towards integrated digital services. Parallel to the development of public Integrated Services Digital Network (ISDN) is service integration in the local area (e.g., a campus, a building, an aircraft). The types of services to be integrated depend very much on the specific local environment. However, applications tend to generate data traffic belonging to one of two classes. According to IEEE 802.4 terminology, the first major class of traffic is termed synchronous, such as packetized voice and data generated from other applications with real-time constraints, and the second class is called asynchronous which includes most computer data traffic such as file transfer or facsimile. The IEEE 802.4 token bus protocol which was designed to support both synchronous and asynchronous traffic is examined. The protocol is basically a timer-controlled token bus access scheme. By a suitable choice of the design parameters, it can be shown that access delay is bounded for synchronous traffic. As well, the bandwidth allocated to asynchronous traffic can be controlled. A throughput analysis of the protocol under heavy load with constant channel occupation of synchronous traffic and constant token-passing times is presented.
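
    A minimal sketch of a heavy-load efficiency calculation under the simplifying assumptions stated above (constant channel occupation and constant token-passing times); this is a textbook-style model, not the paper's exact analysis.

        # Sketch: token bus channel efficiency when every station always
        # has traffic: each token cycle spends n*t_hold transmitting data
        # and n*t_token passing the token.
        def token_bus_efficiency(n_stations: int,
                                 t_hold_ms: float,
                                 t_token_ms: float) -> float:
            data_time = n_stations * t_hold_ms
            overhead = n_stations * t_token_ms
            return data_time / (data_time + overhead)

        # 32 stations, 5 ms token holding time, 0.2 ms token-passing time
        print(f"efficiency = {token_bus_efficiency(32, 5.0, 0.2):.3f}")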

  12. Analysis of Fe V and Ni V Wavelength Standards in the Vacuum Ultraviolet

    Science.gov (United States)

    Ward, Jacob Wolfgang; Nave, Gillian

    2015-01-01

    The recent publication [1] by J.C. Berengut et al. tests for a potential variation in the fine-structure constant in the presence of high gravitational potentials through spectral analysis of white-dwarf stars. The spectrum of the white-dwarf star studied in the paper, G191-B2B, has prominent Fe V and Ni V lines, which were used to determine any variation in the fine-structure constant via observed shifts in the wavelengths of Fe V and Ni V in the vacuum ultraviolet region. The results of the paper indicate no such variation, but suggest that refined laboratory values for the observed wavelengths could greatly reduce the uncertainty associated with the paper's findings. An investigation of Fe V and Ni V spectra in the vacuum ultraviolet region has been conducted to reduce the wavelength uncertainties currently limiting modern astrophysical studies of this nature. The analyzed spectra were produced by a sliding-spark light source with electrodes made of invar, an iron-nickel alloy, at peak currents of 750-2000 A. The use of invar ensures that systematic errors in the calibration are common to both species. The spectra were recorded with the NIST Normal Incidence Vacuum Spectrograph on phosphor image plate and photographic plate detectors. Calibration was done with a Pt II spectrum produced by a platinum-neon hollow cathode lamp. [1] J. C. Berengut, V. V. Flambaum, A. Ong, et al., Phys. Rev. Lett. 111, 010801 (2013)

  13. Advanced mathematical on-line analysis in nuclear experiments. Usage of parallel computing CUDA routines in standard ROOT analysis

    Directory of Open Access Journals (Sweden)

    Grzeszczuk A.

    2015-01-01

    Full Text Available Compute Unified Device Architecture (CUDA) is a parallel computing platform developed by Nvidia to increase the speed of graphics processing through parallel calculation. The success of this solution has opened General-Purpose Graphics Processing Unit (GPGPU) technology to applications not coupled with graphics. GPGPU systems can be applied as an effective tool for reducing the huge volume of data from pulse shape analysis measurements, either by on-line recalculation or by a very fast compression system. The simplified structure of the CUDA system and its programming model, illustrated on the example of an Nvidia GeForce GTX 580 card, are presented in our poster contribution, both in a stand-alone version and as a ROOT application.

  14. New tendencies in isotopic analysis of pesticide residues from wines by mass spectrometry in concordance with the European standards

    International Nuclear Information System (INIS)

    Costinel, Diana; Lazar, Roxana Elena; Vremera, Raluca; Irimescu, Rodica; Saros-Rogobete, Gili

    2006-01-01

    Multi-isotope analysis, the determination of isotope ratios by mass spectrometry or magnetic resonance spectroscopy, is increasingly used in the food industry and by national food control laboratories as a method of authenticating both raw materials and finished products. These highly sophisticated techniques are capable of determining the botanical and geographical origin of a wide variety of foodstuffs, thus providing a means of detecting product adulteration and controlling mislabelling practices which are virtually impossible to circumvent. The European Union has officially adopted the use of isotope analysis as a means of controlling sugar addition in wines. Its successful implementation in the wine-producing Member States has considerably reduced the financial losses which the Community had incurred due to over-capitalisation. Coupling a mass spectrometer with a gas chromatograph is used for the quantitative and qualitative analysis of traces of pesticides in food. The presence of pesticides in foods is harmful to the nervous system and the cardiovascular apparatus and decreases the immunity of the human body. In addition, ensuring food quality and safety is a requirement which must be fulfilled for integration in the EU. The subject of this paper is the presentation of the results of isotopic analysis tests for pesticide residues in wines, in concordance with the European standards. (authors)

  15. Non-matrix Matched Glass Disk Calibration Standards Improve XRF Micronutrient Analysis of Wheat Grain across Five Laboratories in India.

    Science.gov (United States)

    Guild, Georgia E; Stangoulis, James C R

    2016-01-01

    Within the HarvestPlus program there are many collaborators currently using X-Ray Fluorescence (XRF) spectroscopy to measure Fe and Zn in their target crops. In India, five HarvestPlus wheat collaborators have laboratories that conduct this analysis and their throughput has increased significantly. The benefits of using XRF are its ease of use, minimal sample preparation and high throughput analysis. The lack of commercially available calibration standards has led to a need for alternative calibration arrangements for many of the instruments. Consequently, the majority of instruments have either been installed with an electronic transfer of an original grain calibration set developed by a preferred lab, or a locally supplied calibration. Unfortunately, neither of these methods has been entirely successful. The electronic transfer is unable to account for small variations between the instruments, whereas the use of a locally provided calibration set is heavily reliant on the accuracy of the reference analysis method, which is particularly difficult to achieve when analyzing low levels of micronutrient. Consequently, we have developed a calibration method that uses non-matrix matched glass disks. Here we present the validation of this method and show this calibration approach can improve the reproducibility and accuracy of whole grain wheat analysis on 5 different XRF instruments across the HarvestPlus breeding program.

  16. Summary of Preliminary Criticality Analysis for Peach Bottom Fuel in the DOE Standardized Spent Nuclear Fuel Canister

    International Nuclear Information System (INIS)

    Henrikson, D.J.

    1999-01-01

    The Department of Energy's (DOE's) National Spent Nuclear Fuel Program is developing a standardized set of canisters for DOE spent nuclear fuel (SNF). These canisters will be used for DOE SNF handling, interim storage, transportation, and disposal in the national repository. Several fuels are being examined in conjunction with the DOE SNF canisters. This report summarizes the preliminary criticality safety analysis that addresses general fissile loading limits for Peach Bottom graphite fuel in the DOE SNF canister. The canister is considered both alone and inside the 5-HLW/DOE Long Spent Fuel Co-disposal Waste Package, and in intact and degraded conditions. Results are appropriate for a single DOE SNF canister. Specific facilities, equipment, canister internal structures, and scenarios for handling, storage, and transportation have not yet been defined and are not evaluated in this analysis. The analysis assumes that the DOE SNF canister is designed so that it maintains reasonable geometric integrity. Parameters important to the results are the canister outer diameter, inner diameter, and wall thickness. These parameters are assumed to have nominal dimensions of 45.7-cm (18.0-in.), 43.815-cm (17.25-in), and 0.953-cm (0.375-in.), respectively. Based on the analysis results, the recommended fissile loading for the DOE SNF canister is 13 Peach Bottom fuel elements if no internal steel is present, and 15 Peach Bottom fuel elements if credit is taken for internal steel

  17. [The Russian and international standards of age-related allocation of population for medical statistics, medical demographic analysis and risk assessment].

    Science.gov (United States)

    Demin, V F; Paltsev, M A

    2013-01-01

    The current European standard for the age-related allocation of population is widely implemented in the medical demographic studies of international (WHO, etc.) and national organizations. Rosstat also implements this standard in its demographic yearbooks and other publications. The standard is applied in computing standardized mortality indicators for the populations of different countries and territories and also in assessing risk factors. The standard is based on the idea of evaluating mortality against an integrated standard in order to compare the mortality of populations, genders, and calendar years between different countries. The analysis of test calculations of the standardized mortality indicator for the populations of Russia and the EU countries using the current European standard revealed serious shortcomings. For example, unfounded overstatement of the standardized mortality indicator for males and its understatement for females artificially widens the already large difference between male and female mortality in the Russian Federation. Against this background, calculation of the standardized mortality indicator for particular causes of death yields erroneous values because the concurrence of risks is neglected. Given the necessity of improving the standard, a new concept for the development of national and international standards is proposed, based on the notion of a balanced age-related allocation of population and its numerical values.

  18. Environmental Stress Testing of the Single Sample Cylinder: A Proven Consensus Standard for Internal Gas Analysis (IGA) or Residual Gas Analysis (RGA)

    Science.gov (United States)

    Schuessler, Philipp WH

    2010-01-01

    In August 2008, Schuessler Consulting was contracted by NASA GSFC in support of the NASA Electronic Parts and Packaging (NEPP) program to perform two separate studies on moisture-laden air in a stainless steel cylinder that had been designed to become a consensus standard for Test Method 1018. This Test Method was originally released for hybrids under Mil. Std. 883 but was quickly utilized on other microelectronic devices under the auspices of Mil. Std. 750. The cylinder had subsequently been fabricated for the 750 community. It was back-filled with moist air and analyzed over a period of time under a previous NASA contract. It had been shown that moisture in the 4000-5000 ppm range could be analyzed rather precisely with a mass spectrometer, commonly referred to as a Residual Gas Analyzer (RGA). The scope of this study was to ascertain whether the composition and precision varied as a function of thermal shock at sub-zero temperatures and whether there was consensus when the standard was submitted to other RGA units. It was demonstrated and published that the consensus standard would yield precise RGA data for moisture within +/- 1% when optimized for a given RGA unit. It has subsequently been shown in this study at Oneida Research Services that sub-zero storage did not affect that precision when a well-defined protocol for the analysis was followed. The consensus standard was taken to a second facility for analysis, where it was found that moisture adsorption on the transfer lines caused precision to drop to +/- 12%. The Single Sample Cylinder (SSC) is a one-liter stainless steel cylinder with associated sampling valves and has considerable weight and volume, but this considerable size allows approximately 300 gas samples of the same composition to be delivered to any RGA unit. Lastly, a smaller cylinder, approximately 75 cc, of a second consensus standard was fabricated and tested with a different mix of fixed gases where moisture was kept in the
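
    The precision figures quoted above express percent-level relative variation in repeated moisture readings; the sketch below computes a percent relative standard deviation from invented RGA readings simply to show the arithmetic.

      import statistics

      # Percent relative standard deviation of repeated RGA moisture readings.
      # Readings are invented; the record quotes +/- 1% for an optimized unit.
      readings_ppm = [4480.0, 4510.0, 4495.0, 4520.0, 4490.0]

      mean = statistics.mean(readings_ppm)
      rsd_percent = 100.0 * statistics.stdev(readings_ppm) / mean
      print(f"mean = {mean:.0f} ppm, RSD = {rsd_percent:.2f}%")  # ~0.36%, within +/- 1%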

  19. Quantitative Analysis of Torso FDG-PET Scans by Using Anatomical Standardization of Normal Cases from Thorough Physical Examinations.

    Directory of Open Access Journals (Sweden)

    Takeshi Hara

    Full Text Available Understanding the standardized uptake value (SUV) of 2-deoxy-2-[18F]fluoro-d-glucose positron emission tomography (FDG-PET) depends on the background accumulation of glucose, because the SUV often varies with the status of the patient. The purpose of this study was to develop a new method for the quantitative analysis of SUVs in FDG-PET scan images. The method includes an anatomical standardization and a statistical comparison with normal cases using Z-scores, as often used in SPM or 3D-SSP approaches for brain function analysis. Our scheme consists of two parts: the construction of a normal model and the determination of SUV scores as a Z-score index measuring the abnormality of an FDG-PET scan image. To construct the normal torso model, all of the normal images were registered into one shape, which indicates the normal range of SUV at every voxel. The image deformation process consisted of a whole-body rigid registration of the shoulder-to-bladder region, a liver registration, and a non-linear registration of the body surface using the thin-plate spline technique. To validate the usefulness of our method, we manually segmented suspicious regions on FDG-PET images and obtained the Z-scores of the regions based on the corresponding voxels, which store the mean and standard deviation from the normal model. We collected 243 normal cases (143 males and 100 females) to construct the normal model. We also extracted 432 abnormal spots from 63 abnormal cases (73 cancer lesions) to validate the Z-scores. The Z-scores of 417 of the 432 abnormal spots were higher than 2.0, which statistically indicates the severity of the spots. In conclusion, the Z-scores obtained by our computerized scheme with anatomical standardization of the torso region would be useful for the visualization and detection of subtle lesions on FDG-PET scan images even when the SUV alone may not clearly show an abnormality.
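
    A minimal sketch of the voxelwise Z-score described here, assuming SUV volumes already registered into the standardized torso shape; the array shapes, random test volumes, and placement of the 2.0 threshold are illustrative assumptions, not the authors' implementation.

      import numpy as np

      def voxel_z_scores(suv, normal_mean, normal_std, eps=1e-6):
          """Z = (SUV - normal mean) / normal std, computed per voxel after
          registration into the common standardized space."""
          return (suv - normal_mean) / np.maximum(normal_std, eps)

      # Hypothetical volumes already in the standardized space.
      rng = np.random.default_rng(0)
      normal_mean = rng.uniform(0.5, 2.0, (64, 64, 64))
      normal_std = rng.uniform(0.1, 0.5, (64, 64, 64))
      scan = normal_mean + rng.normal(0.0, 1.0, (64, 64, 64)) * normal_std

      z = voxel_z_scores(scan, normal_mean, normal_std)
      print(int((z > 2.0).sum()), "voxels above the Z = 2.0 threshold")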

  20. A Content Analysis of Immigration in Traditional, New, and Non-Gateway State Standards for U.S. History and Civics

    Science.gov (United States)

    Hilburn, Jeremy; Journell, Wayne; Buchanan, Lisa Brown

    2016-01-01

    In this content analysis of state U.S. History and Civics standards, we compared the treatment of immigration across three types of states with differing immigration demographics. Analyzing standards from 18 states through a critical race methodology perspective, our findings indicated three sets of tensions: a unified American story versus local…