WorldWideScience

Sample records for standard candle analysis

  1. Investigating the Effect of Cosmic Opacity on Standard Candles

    International Nuclear Information System (INIS)

    Hu, J.; Yu, H.; Wang, F. Y.

    2017-01-01

    Standard candles can probe the evolution of dark energy over a large redshift range, but cosmic opacity can degrade their quality. In this paper, we use the latest observations, including Type Ia supernovae (SNe Ia) from the “joint light-curve analysis” sample and Hubble parameters, to probe the opacity of the universe. A joint fit of the SNe Ia light-curve parameters, cosmological parameters, and opacity is used in order to avoid the cosmological dependence of SNe Ia luminosity distances. The latest gamma-ray bursts are used to explore cosmic opacity at high redshifts, where the cosmic reionization process is taken into account. We find that the sample supports an almost transparent universe for flat ΛCDM and XCDM models. Meanwhile, free electrons deplete photons from standard candles through (inverse) Compton scattering, which is an important component of opacity. This Compton dimming may play an important role in future supernova surveys. From this analysis, we find that a few per cent of the cosmic opacity is caused by Compton dimming in the two models, which can be corrected for.
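    A common parametrization (an assumption here; the abstract does not spell out its formulas) relates the observed and true luminosity distances through the optical depth τ(z) as D_L,obs = D_L,true · e^(τ/2), so an opacity τ adds Δμ ≈ 1.086 τ magnitudes to the distance modulus. A minimal sketch in Python:

```python
import math

def opacity_dimming(tau: float) -> float:
    """Magnitude offset Delta-mu produced by an optical depth tau,
    from D_L_obs = D_L_true * exp(tau / 2)."""
    return 2.5 * tau / math.log(10)  # ~1.086 * tau

# A few per cent opacity (tau ~ 0.02) dims a candle by only ~0.02 mag,
# yet that is comparable to the precision targets of supernova surveys.
print(round(opacity_dimming(0.02), 4))
```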

  2. How Real Detector Thresholds Create False Standard Candles

    International Nuclear Information System (INIS)

    Shahmoradi, Amir; Nemiroff, Robert

    2009-01-01

    GRB satellites are relatively inefficient detectors of dim hard bursts. For example, given two bursts of identical peak luminosity near the detection threshold, a dim soft burst will be preferentially detected over a dim hard burst. This means that a high-E_peak burst needs a higher peak luminosity to be detected than a low-E_peak GRB. This purely detector-created attribute will appear as a correlation between E_peak and luminosity, and should not be interpreted as a real standard candle effect. This result derives from Monte Carlo simulations utilizing a wide range of initial GRB spectra, and retriggering to create a final "detected" sample. In sum, E_peak is not a good standard candle, and its appearance as such in seeming correlations such as the Amati and other L_iso vs. E_peak relations is likely a ghost of real energy-related detection thresholds.

  3. Quasars as Cosmological Standard Candles

    International Nuclear Information System (INIS)

    Negrete, C. Alenka; Dultzin, Deborah; Marziani, Paola; Sulentic, Jack W.; Esparza-Arredondo, Donají; Martínez-Aldama, Mary L.; Del Olmo, Ascensión

    2017-01-01

    We propose the use of quasars with accretion rates near the Eddington limit (extreme quasars) as standard candles. The selection criteria are based on the Eigenvector 1 (E1) formalism. Our first sample is a selection of 334 optical quasar spectra from the SDSS DR7 database with S/N > 20. Using the E1, we define primary and secondary selection criteria in the optical spectral range. We show that it is possible to derive a redshift-independent estimate of luminosity for extreme Eddington ratio sources. Our results are consistent with concordance cosmology, but we need to work with other spectral ranges to take into account the quasar orientation, among other constraints.

  4. Quasars as Cosmological Standard Candles

    Energy Technology Data Exchange (ETDEWEB)

    Negrete, C. Alenka [CONACYT Research Fellow - Instituto de Astronomía, UNAM, Mexico City (Mexico); Dultzin, Deborah [Instituto de Astronomía, UNAM, Mexico City (Mexico); Marziani, Paola [INAF, Osservatorio Astronomico di Padova, Padua (Italy); Sulentic, Jack W. [Instituto de Astrofísica de Andalucía, IAA-CSIC, Granada (Spain); Esparza-Arredondo, Donají [Instituto de Radioastronomía y Astrofísica, Morelia (Mexico); Martínez-Aldama, Mary L.; Del Olmo, Ascensión, E-mail: alenka@astro.unam.mx [Instituto de Astrofísica de Andalucía, IAA-CSIC, Granada (Spain)

    2017-12-15

    We propose the use of quasars with accretion rates near the Eddington limit (extreme quasars) as standard candles. The selection criteria are based on the Eigenvector 1 (E1) formalism. Our first sample is a selection of 334 optical quasar spectra from the SDSS DR7 database with S/N > 20. Using the E1, we define primary and secondary selection criteria in the optical spectral range. We show that it is possible to derive a redshift-independent estimate of luminosity for extreme Eddington ratio sources. Our results are consistent with concordance cosmology, but we need to work with other spectral ranges to take into account the quasar orientation, among other constraints.

  5. Flammability Parameters of Candles

    Directory of Open Access Journals (Sweden)

    Balog Karol

    2017-06-01

    The paper deals with the assessment of selected fire safety characteristics of candles. The weight loss of a candle during burning, the candle burning rate, the soot index, the heat release rate, and the yield of carbon oxides were determined. The soot index was determined according to EN 15426:2007 - Candles - Specification for sooting behaviour. All samples met the prescribed limit on soot production. Weight loss, heat release rate, and the yield of carbon oxides were determined for one selected sample. While the yield of CO increased during the measurement, the yield of CO₂ decreased by half within 40 minutes.

  6. 75 FR 44224 - Grant of Authority for Subzone Status; Yankee Candle Corporation (Candles and Gift Sets); Whately...

    Science.gov (United States)

    2010-07-28

    ... Status; Yankee Candle Corporation (Candles and Gift Sets); Whately and South Deerfield, MA Pursuant to... special-purpose subzone at the candle and gift set manufacturing and distribution facilities of Yankee... activity related to the manufacturing and distribution of candles and gift sets at the facilities of Yankee...

  7. Standard test method for determining atmospheric chloride deposition rate by wet candle method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2002-01-01

    1.1 This test method covers a wet candle device and its use in measuring atmospheric chloride deposition (the amount of chloride salts deposited from the atmosphere on a given area per unit time). 1.2 Data on atmospheric chloride deposition can be useful in classifying the corrosivity of a specific area, such as an atmospheric test site. Caution must be exercised, however, to take the season into consideration, because airborne chlorides vary widely between seasons. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
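    The reported quantity is simply the collected chloride mass normalized by the candle's exposed area and the exposure time (the mg/(m²·day) units below are a common reporting convention, not quoted from the standard's text):

```python
def chloride_deposition_rate(mass_mg: float, area_m2: float, days: float) -> float:
    """Atmospheric chloride deposition rate in mg/(m^2 * day):
    chloride mass collected on the wet candle, divided by the
    candle's exposed area and the length of the exposure period."""
    return mass_mg / (area_m2 * days)

# e.g. 15 mg of chloride collected on a 0.01 m^2 candle over 30 days:
print(chloride_deposition_rate(15.0, 0.01, 30.0))  # 50.0 mg/(m^2*day)
```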

  8. Social and Economic Impact of the Candle Light Source Project Candle project impact

    Science.gov (United States)

    Baghiryan, M.

    Social and economic progress related to the realization of the CANDLE synchrotron light source project in Armenia is discussed. CANDLE's service is multidisciplinary and long-lasting. Its impacts include significant improvements in science capacity, education quality, industrial capabilities, the investment climate, the country's image, international relations, and health care, as well as restraining the "brain drain" and creating new workplaces. CANDLE will serve as a universal national infrastructure, establishing Armenia as a country with a knowledge-based economy and a place for doing high-tech business, and will be a powerful tool for achieving the country's leap forward.

  9. Application of Candle burnup to small fast reactor

    International Nuclear Information System (INIS)

    Sekimoto, H.; Satoshi, T.

    2004-01-01

    A new reactor burnup strategy, CANDLE (Constant Axial shape of Neutron flux, nuclide densities and power shape During Life of Energy producing reactor), was proposed, in which the shapes of the neutron flux, nuclide density, and power density distributions remain constant but move in the axial direction. An equilibrium state was successfully obtained for a large fast reactor (core radius 2 m, reflector thickness 0.5 m) using a newly developed direct analysis code. However, it is difficult to apply this burnup strategy to small reactors, since their neutron leakage becomes large and the neutron economy worsens. Fuel enrichment should be increased in order to sustain criticality, but higher enrichment of fresh fuel makes CANDLE burnup difficult. We tried to find small reactor designs that can realize CANDLE burnup. We have successfully found a design which is not CANDLE burnup in the strict sense, but which qualitatively satisfies the characteristics mentioned at the top of this abstract. In the final paper, a general description of CANDLE burnup and some results for the obtained small fast reactor design are presented. (author)

  10. How Beatrice Tinsley Destroyed Sandage's Quest for a Standard Candle

    Science.gov (United States)

    Mitton, Simon

    2014-01-01

    The goal of cosmology and most extragalactic optical astronomy during the heroic period spanning the half century from Hubble to Sandage (1920s-1970s) was a search for two numbers: the Hubble constant and the deceleration parameter. Standard candles were needed to establish the measure of the universe. In 1968, Beatrice Tinsley, then a postdoctoral fellow in the astronomy department of the University of Texas at Austin, showed that the great enterprise at Palomar of calibrating the galaxies was in need of major revision. At the 132nd AAS Meeting (June 1970, Boulder, Colorado) she presented a paper on the effect of galactic evolution on the magnitude-redshift relation. In her abstract she boldly wrote: "My present conclusion is opposite to that reached by most cosmologists." In fact her claims caused great consternation among cosmologists. In 1972 she published eight papers on the evolution of galaxies and the effects of that evolution on observational cosmology and the origin of structure.

  11. Standard rulers, candles, and clocks from the low-redshift universe.

    Science.gov (United States)

    Heavens, Alan; Jimenez, Raul; Verde, Licia

    2014-12-12

    We measure the length of the baryon acoustic oscillation (BAO) feature, and the expansion rate of the recent Universe, from low-redshift data only, almost model independently. We make only the following minimal assumptions: homogeneity and isotropy, a metric theory of gravity, a smooth expansion history, and the existence of standard candles (supernovæ) and a standard BAO ruler. The rest is determined by the data, which are compilations of recent BAO and Type Ia supernova results. Making only these assumptions, we find for the first time that the standard ruler has a length of 103.9±2.3h⁻¹ Mpc. The value is a measurement, in contrast to the model-dependent theoretical prediction determined with model parameters set by Planck data (99.3±2.1h⁻¹ Mpc). The latter assumes the cold dark matter model with a cosmological constant, and that the ruler is the sound horizon at radiation drag. Adding passive galaxies as standard clocks or a local Hubble constant measurement allows the absolute BAO scale to be determined (142.8±3.7 Mpc), and in the former case the additional information makes the BAO length determination more precise (101.9±1.9h⁻¹ Mpc). The inverse curvature radius of the Universe is weakly constrained and consistent with zero, independently of the gravity model, provided it is metric. We find the effective number of relativistic species to be N(eff)=3.53±0.32, independent of late-time dark energy or gravity physics.

  12. Type Ia supernovae, standardizable candles, and gravity

    Science.gov (United States)

    Wright, Bill S.; Li, Baojiu

    2018-04-01

    Type Ia supernovae (SNe Ia) are generally accepted to act as standardizable candles, and their use in cosmology led to the first confirmation of the as yet unexplained accelerated cosmic expansion. Many of the theoretical models to explain the cosmic acceleration assume modifications to Einsteinian general relativity which accelerate the expansion, but the question of whether such modifications also affect the ability of SNe Ia to be standardizable candles has rarely been addressed. This paper is an attempt to answer this question. For this we adopt a semianalytical model to calculate SNe Ia light curves in non-standard gravity. We use this model to show that the average rescaled intrinsic peak luminosity—a quantity that is assumed to be constant with redshift in standard analyses of Type Ia supernova (SN Ia) cosmology data—depends on the strength of gravity in the supernova's local environment, because the latter determines the Chandrasekhar mass—the mass of the SN Ia's white dwarf progenitor right before the explosion. This means that SNe Ia are no longer standardizable candles in scenarios where the strength of gravity evolves over time, and therefore the cosmology implied by the existing SN Ia data will be different when analysed in the context of such models. As an example, we show that the observational SN Ia cosmology data can be fitted both by a model where (Ω_M, Ω_Λ) = (0.62, 0.38) and Newton's constant G varies as G(z) = G0(1+z)^(-1/4), and by the standard model where (Ω_M, Ω_Λ) = (0.3, 0.7) and G is constant, when the Universe is assumed to be flat.
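    The scaling behind this argument can be sketched numerically. The Chandrasekhar mass scales as M_Ch ∝ G^(-3/2); if the peak luminosity is taken, simplistically, to scale with M_Ch (the paper itself uses a semianalytical light-curve model), then the quoted G(z) = G0(1+z)^(-1/4) implies brighter SNe Ia at higher redshift:

```python
import math

G_EXPONENT = -0.25  # G(z) = G0 * (1 + z)**(-1/4), the example model above

def peak_magnitude_shift(z: float) -> float:
    """Shift in SN Ia peak magnitude at redshift z under the crude
    assumption L ~ M_Ch ~ G**(-3/2); negative means brighter."""
    g_ratio = (1.0 + z) ** G_EXPONENT   # G(z) / G0
    l_ratio = g_ratio ** -1.5           # L / L0, via the Chandrasekhar mass
    return -2.5 * math.log10(l_ratio)   # magnitude shift Delta-m

# At z = 1 the shift is about -0.28 mag, large enough to bias
# a cosmological fit that assumes a constant intrinsic luminosity.
print(round(peak_magnitude_shift(1.0), 3))
```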

  13. Candle Soot-Driven Performance Enhancement in Pyroelectric Energy Conversion

    Science.gov (United States)

    Azad, Puneet; Singh, V. P.; Vaish, Rahul

    2018-05-01

    We observed a substantial enhancement in pyroelectric output with the help of a candle soot coating on the surface of lead zirconate titanate (PZT). Candle soot of varying thicknesses was coated by directly exposing the pyroelectric material to a candle flame. The open-circuit pyroelectric voltage and closed-circuit pyroelectric current were recorded while applying infrared heating across the uncoated and candle soot-coated samples for different heating and cooling cycles. In comparison to the uncoated sample, the maximum open-circuit voltage improves by a factor of seven for the candle soot-coated sample, and the electric current increases by a factor of eight across a resistance of 10 Ω. Moreover, the harvested energy is enhanced by a factor of 50 for the candle soot-coated sample. The results indicate that candle soot coating is an effective and economical method to improve the infrared sensing performance of pyroelectric materials.

  14. Organic aerosol formation in citronella candle plumes.

    Science.gov (United States)

    Bothe, Melanie; Donahue, Neil McPherson

    2010-09-01

    Citronella candles are widely used as insect repellants, especially outdoors in the evening. Because these essential oils are unsaturated, they have a unique potential to form secondary organic aerosol (SOA) via reaction with ozone, which is also commonly elevated on summer evenings when the candles are often in use. We investigated this process, along with primary aerosol emissions, by briefly placing a citronella tealight candle in a smog chamber and then adding ozone to the chamber. In repeated experiments, we observed rapid and substantial SOA formation after ozone addition; this process must therefore be considered when assessing the risks and benefits of using citronella candles to repel insects.

  15. 75 FR 63200 - Petroleum Wax Candles From China

    Science.gov (United States)

    2010-10-14

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 731-TA-282 (Third Review)] Petroleum Wax Candles... five-year review concerning the antidumping duty order on petroleum wax candles from China. SUMMARY... antidumping duty order on petroleum wax candles from China would be likely to lead to continuation or...

  16. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    ... and G (graphite) phase of carbon present in the candle soots. The electrochemical characterization was performed by cyclic voltammetry, galvanostatic charge/discharge testing, and impedance spectroscopy in 1 M H₂SO₄ electrolyte. The functionalized candle soot electrode showed an enhanced specific capacitance value of ...

  17. Organic aerosol formation in citronella candle plumes

    OpenAIRE

    Bothe, Melanie; Donahue, Neil McPherson

    2010-01-01

    Citronella candles are widely used as insect repellants, especially outdoors in the evening. Because these essential oils are unsaturated, they have a unique potential to form secondary organic aerosol (SOA) via reaction with ozone, which is also commonly elevated on summer evenings when the candles are often in use. We investigated this process, along with primary aerosol emissions, by briefly placing a citronella tealight candle in a smog chamber and then adding ozone to the chamber. In repeated experiments, we observed rapid and substantial SOA formation after ozone addition; this process must therefore be considered when assessing the risks and benefits of using citronella candles to repel insects.

  18. Optimized Design and Discussion on Middle and Large CANDLE Reactors

    Directory of Open Access Journals (Sweden)

    Xiaoming Chai

    2012-08-01

    CANDLE (Constant Axial shape of Neutron flux, nuclide number densities and power shape During Life of Energy producing reactor) reactors have been intensively researched in recent decades [1–6]. Research shows that this kind of reactor is highly economical and safe, and uses resources efficiently, extending large-scale fission nuclear energy utilization for thousands of years and benefitting the whole of society. For many developing countries with large populations and high energy demands, such as China and India, middle-sized (1000 MWth) and large (2000 MWth) CANDLE fast reactors are obviously more suitable than small reactors [2]. In this paper, middle and large CANDLE reactors are investigated with U-Pu and combined Th-U/U-Pu fuel cycles, aiming to utilize the abundant thorium resources and optimize the radial power distribution. To achieve these design goals, the core was simply divided into two fuel regions in the radial direction: the less active fuel, such as thorium or natural uranium, was loaded in the inner core region, and fuel with low-level enrichment, e.g. 2.0% enriched uranium, was loaded in the outer core region. With this simple core configuration and fuel arrangement, rather than a complicated method, we can obtain the desired middle and large CANDLE fast cores with reasonable core geometry and thermal-hydraulic parameters that perform safely and economically, as is to be expected from CANDLE. To assist in understanding the CANDLE reactor's attributes, an analysis and discussion of the calculation results is provided.

  19. Z boson as ''the standard candle'' for high-precision W boson physics at LHC

    International Nuclear Information System (INIS)

    Krasny, M.W.; Fayette, F.; Placzek, W.; Siodmok, A.

    2007-01-01

    In this paper we propose a strategy for measuring inclusive W boson production processes at the LHC. This strategy simultaneously exploits the unique flexibility of the LHC collider in running variable beam particle species at variable beam energies, and the configuration flexibility of the LHC detectors. We propose concrete settings for a precision measurement of the Standard Model parameters. These dedicated settings optimize the use of the Z boson and Drell-Yan pair production processes as "the standard reference candles". The presented strategy allows one to factorize and directly measure those QCD effects that affect the W and Z production processes differently. It reduces to a level of O(10⁻⁴) the impact of uncertainties in the parton distribution functions (PDFs) and in the transverse momentum of the quarks on the measurement precision. Last but not least, it reduces by a factor of 10 the impact of systematic measurement errors, such as the energy scale and the measurement resolution, on the W boson production observables. (orig.)

  20. Using slow-release permanganate candles to remediate PAH-contaminated water

    International Nuclear Information System (INIS)

    Rauscher, Lindy; Sakulthaew, Chainarong; Comfort, Steve

    2012-01-01

    Highlights: ► We quantified the efficacy of slow-release permanganate-paraffin candles to degrade and mineralize PAHs. ► ¹⁴C-labeled PAHs were used to quantify both adsorption and transformation. ► Permanganate-treated PAHs were more biodegradable in soil microcosms. ► A flow-through candle system was used to quantify PAH removal in urban runoff. - Abstract: Surface waters impacted by urban runoff in metropolitan areas are becoming increasingly contaminated with polycyclic aromatic hydrocarbons (PAHs). Slow-release oxidant candles (paraffin–KMnO₄) are a relatively new technology being used to treat contaminated groundwater and could potentially be used to treat urban runoff. Given that these candles only release permanganate when submerged, the ephemeral nature of runoff events would influence when the permanganate is released for treating PAHs. Our objective was to determine if slow-release permanganate candles could be used to degrade and mineralize PAHs. Batch experiments quantified PAH degradation rates in the presence of the oxidant candles. Results showed most of the 16 PAHs tested were degraded within 2–4 h. Using ¹⁴C-labeled phenanthrene and benzo(a)pyrene, we demonstrated that the wax matrix of the candle initially adsorbs the PAH, but then releases the PAH back into solution as transformed, more water-soluble products. While permanganate was unable to mineralize the PAHs (i.e., convert them to CO₂), we found that the permanganate-treated PAHs were much more biodegradable in soil microcosms. To test the concept of using candles to treat PAHs in multiple runoff events, we used a flow-through system where urban runoff water was pumped over a miniature candle in repetitive wet–dry, 24-h cycles. Results showed that the candle was robust in removing PAHs by repeatedly releasing permanganate and degrading the PAHs. These results provide proof-of-concept that permanganate candles could potentially provide a low-cost, low-maintenance approach to remediating PAH-contaminated water.

  1. Using slow-release permanganate candles to remediate PAH-contaminated water

    Energy Technology Data Exchange (ETDEWEB)

    Rauscher, Lindy, E-mail: purplerauscher@neb.rr.com [School of Natural Resources, University of Nebraska, Lincoln, NE 68583-0915 (United States); Sakulthaew, Chainarong, E-mail: chainarong@huskers.unl.edu [School of Natural Resources, University of Nebraska, Lincoln, NE 68583-0915 (United States); Department of Veterinary Technology, Kasetsart University, Bangkok 10900 (Thailand); Comfort, Steve, E-mail: scomfort1@unl.edu [School of Natural Resources, University of Nebraska, Lincoln, NE 68583-0915 (United States)

    2012-11-30

    Highlights: ► We quantified the efficacy of slow-release permanganate-paraffin candles to degrade and mineralize PAHs. ► ¹⁴C-labeled PAHs were used to quantify both adsorption and transformation. ► Permanganate-treated PAHs were more biodegradable in soil microcosms. ► A flow-through candle system was used to quantify PAH removal in urban runoff. - Abstract: Surface waters impacted by urban runoff in metropolitan areas are becoming increasingly contaminated with polycyclic aromatic hydrocarbons (PAHs). Slow-release oxidant candles (paraffin-KMnO₄) are a relatively new technology being used to treat contaminated groundwater and could potentially be used to treat urban runoff. Given that these candles only release permanganate when submerged, the ephemeral nature of runoff events would influence when the permanganate is released for treating PAHs. Our objective was to determine if slow-release permanganate candles could be used to degrade and mineralize PAHs. Batch experiments quantified PAH degradation rates in the presence of the oxidant candles. Results showed most of the 16 PAHs tested were degraded within 2-4 h. Using ¹⁴C-labeled phenanthrene and benzo(a)pyrene, we demonstrated that the wax matrix of the candle initially adsorbs the PAH, but then releases the PAH back into solution as transformed, more water-soluble products. While permanganate was unable to mineralize the PAHs (i.e., convert them to CO₂), we found that the permanganate-treated PAHs were much more biodegradable in soil microcosms. To test the concept of using candles to treat PAHs in multiple runoff events, we used a flow-through system where urban runoff water was pumped over a miniature candle in repetitive wet-dry, 24-h cycles. Results showed that the candle was robust in removing PAHs by repeatedly releasing permanganate and degrading the PAHs. These results provide proof-of-concept that permanganate candles could potentially provide a low-cost, low-maintenance approach to remediating PAH-contaminated water.

  2. Using slow-release permanganate candles to remediate PAH-contaminated water.

    Science.gov (United States)

    Rauscher, Lindy; Sakulthaew, Chainarong; Comfort, Steve

    2012-11-30

    Surface waters impacted by urban runoff in metropolitan areas are becoming increasingly contaminated with polycyclic aromatic hydrocarbons (PAHs). Slow-release oxidant candles (paraffin-KMnO₄) are a relatively new technology being used to treat contaminated groundwater and could potentially be used to treat urban runoff. Given that these candles only release permanganate when submerged, the ephemeral nature of runoff events would influence when the permanganate is released for treating PAHs. Our objective was to determine if slow-release permanganate candles could be used to degrade and mineralize PAHs. Batch experiments quantified PAH degradation rates in the presence of the oxidant candles. Results showed most of the 16 PAHs tested were degraded within 2-4 h. Using ¹⁴C-labeled phenanthrene and benzo(a)pyrene, we demonstrated that the wax matrix of the candle initially adsorbs the PAH, but then releases the PAH back into solution as transformed, more water-soluble products. While permanganate was unable to mineralize the PAHs (i.e., convert them to CO₂), we found that the permanganate-treated PAHs were much more biodegradable in soil microcosms. To test the concept of using candles to treat PAHs in multiple runoff events, we used a flow-through system where urban runoff water was pumped over a miniature candle in repetitive wet-dry, 24-h cycles. Results showed that the candle was robust in removing PAHs by repeatedly releasing permanganate and degrading the PAHs. These results provide proof-of-concept that permanganate candles could potentially provide a low-cost, low-maintenance approach to remediating PAH-contaminated water. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. 'CANDLE' burnup regime after LWR regime

    International Nuclear Information System (INIS)

    Sekimoto, Hiroshi; Nagata, Akito

    2008-01-01

    The CANDLE (Constant Axial shape of Neutron flux, nuclide densities and power shape During Life of Energy producing reactor) burnup strategy offers many merits. From a safety point of view, the change of excess reactivity along burnup is theoretically zero, and core characteristics such as power feedback coefficients and the power peaking factor do not change along burnup. Applying this burnup strategy to neutron-rich fast reactors yields excellent performance. Only natural or depleted uranium is required for the replacement fuel, and about 40% of the natural or depleted uranium undergoes fission without conventional reprocessing and enrichment. If an LWR produced energy of X joules, a CANDLE reactor can produce about 50X joules from the depleted uranium left at the enrichment facility for the LWR fuel. If LWRs have produced energy sufficient for a full 20 years, we can produce energy for 1000 years by using CANDLE reactors with depleted uranium. We need not mine any uranium ore, and need no reprocessing facility. The burnup of the spent fuel becomes 10 times higher; therefore, the amount of spent fuel per unit of energy produced is reduced to one-tenth. The details of the scenario of a CANDLE burnup regime after the LWR regime will be presented at the symposium. (author)

  4. Stellar candles for the extragalactic distance scale

    CERN Document Server

    Gieren, Wolfgang

    2003-01-01

    This volume reviews the current status of both theory and observation of the extragalactic distance scale. Sufficient accuracy is required both for a precise determination of the cosmological parameters and for a better understanding of physical processes in extragalactic systems. The "standard candles" used to set up the extragalactic distance scale that are reviewed in this book include Cepheid variables, RR Lyrae variables, novae, Type Ia and Type II supernovae, as well as globular clusters and planetary nebulae.

  5. 75 FR 80843 - Petroleum Wax Candles From China

    Science.gov (United States)

    2010-12-23

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 731-TA-282 (Third Review)] Petroleum Wax Candles... Tariff Act of 1930 (19 U.S.C. 1675(c)), that revocation of the antidumping duty order on petroleum wax... contained in USITC Publication 4207 (December 2010), entitled Petroleum Wax Candles from China...

  6. Study on core radius minimization for long life Pb-Bi cooled CANDLE burnup scheme based fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Afifah, Maryam, E-mail: maryam.afifah210692@gmail.com; Su’ud, Zaki [Nuclear Research Group, FMIPA, Bandung Institute of Technology Jl. Ganesha 10, Bandung 40132 (Indonesia); Miura, Ryosuke; Takaki, Naoyuki [Department of Nuclear Safety Engineering, Tokyo City University 1-28-1 Tamazutsumi, Setagaya, Tokyo 158-8557 (Japan); Sekimoto, H. [Emeritus Prof. of Research Laboratory for Nuclear Reactors, Tokyo Inst. of Technology (Japan)

    2015-09-30

    Fast breeder reactors have attracted worldwide development interest because they offer a nearly inexhaustible energy source. One such concept is the CANDLE reactor, whose burnup scheme requires no control rods for burnup compensation, maintains constant core characteristics during energy production, and needs no fuel shuffling. The calculation used a basic reactor analysis, taking the core geometry parameters of a sodium-cooled design as the reference core, to study the minimum core radius of a long-life Pb-Bi cooled CANDLE reactor and to compare the pure coolant effect of LBE and sodium in the same geometry design. The results show that the minimum core radius of the lead-bismuth cooled CANDLE is 100 cm, with a thermal output of 500 MWth. Lead-bismuth coolant enables a considerable reduction in reactor size for a CANDLE reactor and gives a better void coefficient than sodium, the most common FBR coolant, which is an advantage for safety analysis.

  7. TYPE II-P SUPERNOVAE FROM THE SDSS-II SUPERNOVA SURVEY AND THE STANDARDIZED CANDLE METHOD

    International Nuclear Information System (INIS)

    D'Andrea, Chris B.; Sako, Masao; Dilday, Benjamin; Jha, Saurabh; Frieman, Joshua A.; Kessler, Richard; Holtzman, Jon; Konishi, Kohki; Yasuda, Naoki; Schneider, D. P.; Sollerman, Jesper; Wheeler, J. Craig; Cinabro, David; Nichol, Robert C.; Lampeitl, Hubert; Smith, Mathew; Atlee, David W.; Bassett, Bruce; Castander, Francisco J.; Goobar, Ariel

    2010-01-01

    We apply the Standardized Candle Method (SCM) for Type II Plateau supernovae (SNe II-P), which relates the velocity of the ejecta of a SN to its luminosity during the plateau, to 15 SNe II-P discovered over the three-season run of the Sloan Digital Sky Survey-II Supernova Survey. The redshifts of these SNe (z ≥ 0.027) are significantly higher than in previous SCM work, and the sample provides about as many SNe II-P (at z > 0.01) as all of the current literature on the SCM combined. We find that the SDSS SNe have a very small intrinsic I-band dispersion (0.22 mag), which can be attributed to selection effects. When the SCM is applied to the combined SDSS-plus-literature set of SNe II-P, the dispersion increases to 0.29 mag, larger than the scatter for either set of SNe separately. We show that the standardization cannot be further improved by eliminating SNe with positive plateau decline rates, as proposed in Poznanski et al. We thoroughly examine all potential systematic effects and conclude that for the SCM to be useful for cosmology, the methods currently used to determine the Fe II velocity at day 50 must be improved, and spectral templates able to encompass the intrinsic variations of Type II-P SNe will be needed.
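    Schematically, the SCM corrects the plateau magnitude with the ejecta velocity (and a color term); the functional form below follows the common v/5000 km s⁻¹ normalization, but the coefficient values are illustrative placeholders, not fits from this paper:

```python
import math

def scm_distance_modulus(m_I: float, v_fe: float, color_VI: float,
                         alpha: float = 6.7, beta: float = 1.4,
                         M0: float = -17.4) -> float:
    """Schematic Standardized Candle Method: correct the plateau I-band
    magnitude m_I with the day-50 Fe II velocity v_fe (km/s) and the
    V-I color. alpha, beta and M0 are illustrative placeholders."""
    return (m_I + alpha * math.log10(v_fe / 5000.0)
            - beta * color_VI - M0)

# A SN at the velocity zero point (5000 km/s) with zero color term:
print(scm_distance_modulus(20.0, 5000.0, 0.0))  # 37.4
```

Faster-expanding (more luminous) SNe get a positive velocity correction, pulling their standardized distance moduli onto a common relation.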

  8. 16 CFR 501.7 - Candles.

    Science.gov (United States)

    2010-01-01

    ... quantity of contents shall be expressed in terms of count and measure (e.g., length and diameter), to the extent that diameter of such candles need not be expressed. The requirements of § 500.7 of this chapter...

  9. Lighting that One Little Candle.

    Science.gov (United States)

    Scarnati, James T.; Tice, Craig J.

    1988-01-01

    Describes a lesson in which fifth graders made observations of candles. Discusses the progress of the lesson and the necessity of instructing students in what and how to watch and measure. Stresses that this can be easily accomplished inexpensively with imagination. (CW)

  10. Line shapes of atomic-candle-type Rabi resonances

    International Nuclear Information System (INIS)

    Coffer, J.G.; Camparo, J.C.; Sickmiller, B.; Presser, A.

    2002-01-01

    When atoms interact with a phase-modulated field, the probability of finding the atom in the excited state oscillates at the second harmonic of the modulation frequency, 2ω_m. The amplitude of this oscillating probability is a resonant function of the Rabi frequency Ω, and this is termed a β Rabi resonance. In this work, we examine the line shape of the β Rabi resonance both theoretically and experimentally. We find that a small-signal theory of the β-Rabi-resonance condition captures much of the line shape's character and, in particular, that the resonance's 'line Q' (i.e., 2δΩ_{1/2}/Ω) is proportional to the modulation frequency. This result can be applied to the atomic candle, where β Rabi resonances are employed to stabilize field strength. Considering our results in the context of developing an optical atomic candle, we find that a free-running diode laser's intensity noise could be improved by orders of magnitude using the atomic-candle concept.

  11. Nondestructive Evaluation of Ceramic Candle Filters Using Vibration Response

    International Nuclear Information System (INIS)

    Chen, Roger H.L.; Kiriakidis, Alejandro C.; Peng, Steve W.

    1997-01-01

    This study aims at the development of an effective nondestructive evaluation technique to predict the remaining useful life of a ceramic candle filter during a power plant's annual maintenance shutdown. The objective of the present on-going study is to establish the vibration signatures of ceramic candle filters at varying degradation levels due to different operating hours, and to study the various factors involving the establishment of the signatures

  12. Pulse cleaning flow models and numerical computation of candle ceramic filters.

    Science.gov (United States)

    Tian, Gui-shan; Ma, Zhen-ji; Zhang, Xin-yi; Xu, Ting-xiang

    2002-04-01

    Analytical and numerical models are developed for the reverse pulse cleaning system of candle ceramic filters. A standard turbulence model is shown, from experimental and one-dimensional computational results, to be suitable for the design computation of the reverse pulse cleaning system. The computed results can be used to guide the design of the reverse pulse cleaning system toward the optimum Venturi geometry, and general conclusions and design methods are obtained from them.

  13. 9 CFR 590.506 - Candling and transfer-room facilities and equipment.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Candling and transfer-room facilities... INSPECTION ACT) Sanitary, Processing, and Facility Requirements § 590.506 Candling and transfer-room... containers are furnished daily. (h) Shell egg conveyors shall be constructed so that they can be thoroughly...

  14. Electrochemical supercapacitor behaviour of functionalized candle ...

    Indian Academy of Sciences (India)

    ... soots have excellent optical properties, like luminescence ... SEM images of bare candle soot, which clearly display that the sample is composed of ... oxygenated compounds present on the surface of BCS apparently ...

  15. Study on small long-life LBE cooled fast reactor with CANDLE burn-up. Part 1. Steady state research

    International Nuclear Information System (INIS)

    Yan, Mingyu; Sekimoto, Hiroshi

    2008-01-01

    Small long-life reactors are required for some local areas. A CANDLE small long-life fast reactor, which requires no control rods, mining, enrichment, or reprocessing plants, can satisfy this demand. In a CANDLE reactor, the shapes of the neutron flux, nuclide number densities, and power density distributions remain constant and only shift in the axial direction. A core of 1.0 m radius and 2.0 m length can realize CANDLE burn-up with nitride (enriched N-15) natural uranium as fresh fuel. Lead-bismuth is used as coolant. From the steady-state analysis, we obtained the burn-up velocity, output power distribution, core temperature distribution, etc. The burn-up velocity is less than 1.0 cm/year, which easily enables a long-life design. The core-averaged discharge burn-up of the fuel is about 40%. (author)
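
    The quoted numbers imply a very long core life by simple arithmetic. A back-of-the-envelope sketch for the 2.0 m core above, which (as a simplifying assumption) ignores the axial extent of the initial ignition region:

```python
# CANDLE burn-up: the burning region travels axially at the burn-up velocity,
# so the core life is roughly (axial fuel length) / (burn-up velocity).
core_length_cm = 200.0       # 2.0 m core length quoted in the abstract
burnup_velocity_cm_yr = 1.0  # upper bound quoted in the abstract

core_life_years = core_length_cm / burnup_velocity_cm_yr
print(core_life_years)  # 200.0 -> many decades of operation without refueling
```

    Since 1.0 cm/year is an upper bound, 200 years is a lower bound on the nominal core life, which is why the abstract calls a long-life design "easy" at this burn-up velocity.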

  16. DESIGN AND DEVELOPMENT SYSTEM OF ELECTRONIC CANDLE BASED ON MICROCONTROLLER

    Directory of Open Access Journals (Sweden)

    Yeli Fitri Lianuari

    2017-12-01

    Full Text Available This paper describes the design and development of a microcontroller-based electronic candle. The circuit is a simple electronic design that uses a condenser microphone as a sensor, a component well suited to an electronic-candle application: an LED stands in for the wax flame, a pushbutton acts as the lighter, and the LED flame can be extinguished by blowing on the microphone. The concept is simple but can be developed into more innovative and interesting circuits. One extension adds a running-LED output that makes the candle flame appear livelier and more interesting; the running LED works by using a transistor as a switch that controls the power of the Az IC 368 and IC Az 418 M J. To simplify the development process, the combination of the electronic candle with the running LED can be simulated using the Livewire and Proteus software. The result of this design and development is not a new invention, but it can be a source of inspiration in electronics and can be developed further, for example into a lamp that switches off with a single clap.

  17. 76 FR 773 - Petroleum Wax Candles From the People's Republic of China: Continuation of Antidumping Duty Order

    Science.gov (United States)

    2011-01-06

    ... DEPARTMENT OF COMMERCE International Trade Administration [A-570-504] Petroleum Wax Candles From... Trade Commission (``ITC'') that revocation of the antidumping duty order on petroleum wax candles from... order on petroleum wax candles from the PRC pursuant to section 751(c)(2) of the Tariff Act of 1930, as...

  18. Start-up fuel and power flattening of sodium-cooled candle core

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Sagawa, Yu; Umino, Akitake; Sekimoto, Hiroshi

    2013-01-01

    The hard neutron spectrum and unique power shape of CANDLE enable its distinctive performance, such as achieving burnup above 30% and removing the need for both enrichment and reprocessing. On the other hand, they also cause several challenging problems. One is how the initial fuel can be prepared to start up the first CANDLE reactor, because the equilibrium fuel composition that enables stable CANDLE burning is complex in both the axial and radial directions. Another prominent problem is the high radial power peaking factor, which worsens the averaged burnup (i.e., the resource utilization factor in once-through mode) and shortens the lifetime of structural materials. The purpose of this study is to solve these two problems. Several ideas for core configurations and start-up fuel using single-enrichment uranium, with iron as a substitute for fission products, are studied. As a result, it is found that low-enriched uranium can ignite the core, but all concepts examined here exceeded heat limits; adjustment of the enrichment and of the heights of the active and burnt zones is left for future work. Sodium duct assemblies and thorium fuel assemblies loaded in the center region are studied as measures to reduce the radial power peaking factor. Replacing 37 fuel assemblies by thorium fuel assemblies in the zeroth to third rows provides well-balanced performance with a flattened radial power distribution. The CANDLE core loaded with natural uranium in the outer region and thorium in the center achieved 35.6% averaged burnup and a 7.0-year cladding lifetime, owing to mitigated local fast-neutron irradiation at the center. Using thorium with natural or depleted uranium in a CANDLE reactor is also beneficial for diversifying fission resources and extending the available term of fission energy without expanding the need for enrichment and reprocessing.

  19. Light a CANDLE. An innovative burnup strategy of nuclear reactors

    International Nuclear Information System (INIS)

    Sekimoto, Hiroshi

    2005-11-01

    CANDLE is a new burnup strategy for nuclear reactors, which stands for Constant Axial Shape of Neutron Flux, Nuclide Densities and Power Shape During Life of Energy Production. When this candle-like burnup strategy is adopted, although the fuel is fixed in the reactor core, the burning region moves along the core axis at a speed proportional to the power output, without changing the spatial distributions of the nuclide number densities, neutron flux, and power density. Excess reactivity is not necessary for burnup, and the shape of the power distribution and the core characteristics do not change as burnup progresses. It is not necessary to use control rods to control the burnup. This booklet describes the concept of the CANDLE burnup strategy, with basic explanations of excess neutrons and specific applications to a high-temperature gas-cooled reactor and to a fast reactor with excellent neutron economy. Supplementary issues concerning the initial core and high burnup are also addressed. (T. Tanaka)

  20. A comparison of Candle Auctions and Hard Close Auctions with Common Values

    OpenAIRE

    Sascha Füllbrunn

    2009-01-01

    With this study, we contribute to the literature on auction design by presenting a new auction format: the Candle auction, a popular auction in the Middle Ages. Considering a common-value framework, we show theoretically and experimentally that the Candle auction, in which bidding is allowed until a stochastic deadline, yields a better outcome for the seller than the Hard Close auction, the popular eBay online auction format.

  1. Proper Use of Candles During a Power Outage

    Centers for Disease Control (CDC) Podcasts

    Home fires are a threat after a natural disaster and fire trucks may have trouble getting to your home. If the power is out, use flashlights or other battery-powered lights if possible, instead of candles.

  2. TEMPERATURE AND ELECTRON DENSITY DIAGNOSTICS OF A CANDLE-FLAME-SHAPED FLARE

    Energy Technology Data Exchange (ETDEWEB)

    Guidoni, S. E. [NASA Goddard Space Flight Center/CUA, Code 674, 8800 Greenbelt Road, Greenbelt, MD 20771 (United States); McKenzie, D. E.; Longcope, D. W.; Yoshimura, K. [Department of Physics, Montana State University, Bozeman, MT 59717-3840 (United States); Plowman, J. E., E-mail: silvina.e.guidoni@nasa.gov [High Altitude Observatory, National Center for Atmospheric Research P.O. Box 3000, Boulder, CO 80307-3000 (United States)

    2015-02-10

    Candle-flame-shaped flares are archetypical structures that provide indirect evidence of magnetic reconnection. A flare resembling Tsuneta's famous 1992 candle-flame flare occurred on 2011 January 28; we present its temperature and electron density diagnostics. This flare was observed with Solar Dynamics Observatory/Atmospheric Imaging Assembly (SDO/AIA), Hinode/X-Ray Telescope (XRT), and Solar Terrestrial Relations Observatory Ahead (STEREO-A)/Extreme Ultraviolet Imager, resulting in high-resolution, broad temperature coverage, and stereoscopic views of this iconic structure. The high-temperature images reveal a brightening that grows in size to form a tower-like structure at the top of the posteruption flare arcade, a feature that has been observed in other long-duration events. Despite the extensive work on the standard reconnection scenario, there is no complete agreement among models regarding the nature of this high-intensity elongated structure. Electron density maps reveal that reconnected loops that are successively connected at their tops to the tower develop a density asymmetry of about a factor of two between the two legs, giving the appearance of ''half-loops''. We calculate average temperatures with a new fast differential emission measure (DEM) method that uses SDO/AIA data and analyze the heating and cooling of salient features of the flare. Using STEREO observations, we show that the tower and the half-loop brightenings are not a line-of-sight projection effect of the type studied by Forbes and Acton. This conclusion opens the door for physics-based explanations of these puzzling, recurrent solar flare features, previously attributed to projection effects. We corroborate the results of our DEM analysis by comparing them with temperature analyses from Hinode/XRT.

  3. Design study on small CANDLE reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sekimoto, H; Yan, M [Research Laboratory for Nuclear Reactors, Tokyo Institute of Technology (Japan)

    2007-07-01

    A new reactor burnup strategy, CANDLE, was proposed, in which the shapes of the neutron flux, nuclide densities, and power density distributions remain constant but move in the axial direction. The important points here are that the solid fuel is fixed at each position and that no movable burnup-reactivity control mechanisms, such as control rods, are required. This burnup strategy brings many merits. The change of excess reactivity along burnup is theoretically zero, so shim rods are not required, and the reactor becomes free from accidents induced by unexpected control-rod withdrawal. The core characteristics, such as power feedback coefficients and the power peaking factor, do not change along burnup. Therefore, operation becomes much easier than in conventional reactors, especially for high-burnup reactors. The transportation and storage of replacement fuel become easy and safe, since the fuel is free from criticality accidents. In our previous works it appeared that applying this burnup strategy to neutron-rich fast reactors gives excellent performance. Only natural or depleted uranium is required for the replacement fuel. The average burnup of the spent fuel is about 40%, which is equivalent to 40% utilization of the natural uranium without reprocessing and enrichment. This concept is easier to realize in a large reactor, since the neutron leakage becomes small and the neutron economy improves. In the present paper we try to design a small CANDLE reactor whose performance is similar to that of the large reactor by increasing the fuel volume ratio of the core, since such performance is strongly required for local-area use. A small long-life reactor is needed in some local areas, and the characteristic that only natural uranium is required after the second core is a strong additional merit in this case. A core of 1.0 m radius and 2.0 m length can realize CANDLE burn-up with nitride (enriched N-15) natural uranium as fresh fuel. Lead-Bismuth is used as coolant.

  4. Design study on small CANDLE reactor

    International Nuclear Information System (INIS)

    Sekimoto, H.; Yan, M.

    2007-01-01

    A new reactor burnup strategy, CANDLE, was proposed, in which the shapes of the neutron flux, nuclide densities, and power density distributions remain constant but move in the axial direction. The important points here are that the solid fuel is fixed at each position and that no movable burnup-reactivity control mechanisms, such as control rods, are required. This burnup strategy brings many merits. The change of excess reactivity along burnup is theoretically zero, so shim rods are not required, and the reactor becomes free from accidents induced by unexpected control-rod withdrawal. The core characteristics, such as power feedback coefficients and the power peaking factor, do not change along burnup. Therefore, operation becomes much easier than in conventional reactors, especially for high-burnup reactors. The transportation and storage of replacement fuel become easy and safe, since the fuel is free from criticality accidents. In our previous works it appeared that applying this burnup strategy to neutron-rich fast reactors gives excellent performance. Only natural or depleted uranium is required for the replacement fuel. The average burnup of the spent fuel is about 40%, which is equivalent to 40% utilization of the natural uranium without reprocessing and enrichment. This concept is easier to realize in a large reactor, since the neutron leakage becomes small and the neutron economy improves. In the present paper we try to design a small CANDLE reactor whose performance is similar to that of the large reactor by increasing the fuel volume ratio of the core, since such performance is strongly required for local-area use. A small long-life reactor is needed in some local areas, and the characteristic that only natural uranium is required after the second core is a strong additional merit in this case. A core of 1.0 m radius and 2.0 m length can realize CANDLE burn-up with nitride (enriched N-15) natural uranium as fresh fuel. Lead-Bismuth is used as coolant.

  5. SN 2016jhj at redshift 0.34: extending the Type II supernova Hubble diagram using the standard candle method

    Science.gov (United States)

    de Jaeger, T.; Galbany, L.; Filippenko, A. V.; González-Gaitán, S.; Yasuda, N.; Maeda, K.; Tanaka, M.; Morokuma, T.; Moriya, T. J.; Tominaga, N.; Nomoto, K.; Komiyama, Y.; Anderson, J. P.; Brink, T. G.; Carlberg, R. G.; Folatelli, G.; Hamuy, M.; Pignata, G.; Zheng, W.

    2017-12-01

    Although Type Ia supernova cosmology has now reached a mature state, it is important to develop as many independent methods as possible to understand the true nature of dark energy. Recent studies have shown that Type II supernovae (SNe II) offer such a path and could be used as alternative distance indicators. However, the majority of these studies were unable to extend the Hubble diagram above redshift z = 0.3 because of observational limitations. Here, we show that we are now ready to move beyond low redshifts and attempt high-redshift (z ≳ 0.3) SN II cosmology as a result of new-generation deep surveys such as the Subaru/Hyper Suprime-Cam survey. Applying the 'standard candle method' to SN 2016jhj (z = 0.3398 ± 0.0002; discovered by HSC) together with a low-redshift sample, we are able to construct the highest-redshift SN II Hubble diagram to date with an observed dispersion of 0.27 mag (i.e. 12-13 per cent in distance). This work demonstrates the bright future of SN II cosmology in the coming era of large, wide-field surveys like that of the Large Synoptic Survey Telescope.
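
    The conversion from magnitude scatter to fractional distance precision quoted above follows directly from the distance modulus μ = 5 log₁₀(D/10 pc). A short check, with the 0.27 mag dispersion taken from the abstract:

```python
import math

def mag_scatter_to_distance_fraction(sigma_mag):
    """Propagate a distance-modulus scatter to a fractional distance error:
    mu = 5*log10(D / 10 pc)  =>  dD/D = ln(10)/5 * d(mu)."""
    return math.log(10.0) / 5.0 * sigma_mag

frac = mag_scatter_to_distance_fraction(0.27)
print(round(100 * frac, 1))  # 12.4 per cent, matching the quoted 12-13%
```

    The same rule of thumb (about 46% of the magnitude scatter) applies to any standard-candle Hubble diagram, including the SCM dispersions quoted elsewhere in these records.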

  6. Proper Use of Candles During a Power Outage

    Centers for Disease Control (CDC) Podcasts

    2006-08-10

    Home fires are a threat after a natural disaster and fire trucks may have trouble getting to your home. If the power is out, use flashlights or other battery-powered lights if possible, instead of candles.  Created: 8/10/2006 by Emergency Communications System.   Date Released: 8/20/2008.

  7. New Scientific Aspects of the "Burning Candle" Experiment

    Science.gov (United States)

    Massalha, Taha

    2016-01-01

    The "burning candle" experiment is used in middle school education programs to prove that air contains a component that is essential to burning (i.e., oxygen). The accepted interpretation taught by teachers in middle school is this: when burning occurs, oxygen is used up, creating an underpressure that causes a rise in water level inside…

  8. Fabrication of Water Jet Resistant and Thermally Stable Superhydrophobic Surfaces by Spray Coating of Candle Soot Dispersion.

    Science.gov (United States)

    Qahtan, Talal F; Gondal, Mohammed A; Alade, Ibrahim O; Dastageer, Mohammed A

    2017-08-08

    A facile synthesis method for a highly stable carbon nanoparticle (CNP) dispersion in acetone, produced by incomplete combustion of a paraffin candle flame, is presented. The synthesized CNP dispersion is a mixture of graphitic and amorphous carbon nanoparticles in the 20-50 nm size range, exhibiting mesoporosity with an average pore size of 7 nm and a BET surface area of 366 m² g⁻¹. As an application of this material, the carbon nanoparticle dispersion was spray coated (spray-based coating) onto a glass surface to fabricate superhydrophobic surfaces (water contact angle > 150° and low sliding angle), compared with surfaces fabricated by direct candle-flame soot deposition (candle-based coating). This study proved that water-jet-resistant and thermally stable superhydrophobic surfaces can be easily fabricated by simple spray coating of a CNP dispersion obtained from incomplete combustion of a paraffin candle flame, and this technique can be used for different applications, with potential for large-scale fabrication.

  9. 75 FR 49475 - Petroleum Wax Candles From the People's Republic of China: Preliminary Results of Request for...

    Science.gov (United States)

    2010-08-13

    ...''); the National Retail Federation (``NRF''); HSE USA, Inc. (``HSE''); Universal Candle Company (``UC... research firm in Malaysia on producers' prices for candles made and sold in Malaysia and stated that the... Act of 2002, Public Law 107-296, Sec. 1502, 116 Stat. 2135, 2308-09 2002); Reorganization Plan...

  10. Observations on the CANDLE burn-up in various geometries

    International Nuclear Information System (INIS)

    Seifritz, W.

    2007-01-01

    We have examined all geometrical conditions under which an autocatalytically propagating burn-up wave (CANDLE burn-up) is possible. Thereby, the Sine-Gordon equation finds a new place in the burn-up theory of nuclear fission reactors. The axially burning 'spaghetti' reactor and the azimuthally burning 'pancake' reactor seem to be the most promising geometries for a practical reactor design; radial and spherical burn-waves in cylindrical and spherical geometry, respectively, are fundamentally impossible. The possible applicability of such fission burn-waves to the OKLO phenomenon and to the GEOREACTOR at the center of the Earth postulated by Herndon is also discussed. A fast CANDLE reactor can operate on depleted uranium alone, so uranium mining and uranium enrichment are no longer necessary. Furthermore, reprocessing can be dispensed with, because the uranium utilization factor is as high as about 40%. This completely new reactor type can thus open a new era of reactor technology.

  11. Candle soot nanoparticles-polydimethylsiloxane composites for laser ultrasound transducers

    Science.gov (United States)

    Chang, Wei-Yi; Huang, Wenbin; Kim, Jinwook; Li, Sibo; Jiang, Xiaoning

    2015-10-01

    Generation of high-power laser ultrasound strongly demands advanced materials with efficient laser-energy absorption, fast thermal diffusion, and large thermoelastic expansion. In this study, a candle soot nanoparticles-polydimethylsiloxane (CSNPs-PDMS) composite was investigated as the functional layer of an optoacoustic transducer with high energy-conversion efficiency. The mean diameter of the collected candle-soot carbon nanoparticles is about 45 nm, and the light absorption ratio at 532 nm wavelength is up to 96.24%. The prototyped CSNPs-PDMS nano-composite laser ultrasound transducer was characterized and compared with transducers using Cr-PDMS, carbon black (CB)-PDMS, and carbon nano-fiber (CNFs)-PDMS composites, respectively. The energy-conversion coefficient and -6 dB frequency bandwidth of the CSNPs-PDMS composite laser ultrasound transducer were measured to be 4.41 × 10⁻³ and 21 MHz, respectively. The unprecedented laser ultrasound transduction performance of the CSNPs-PDMS nano-composite is promising for a broad range of ultrasound therapy applications.

  12. FTIR Study of Combustion Species in Several Regions of a Candle Flame

    Science.gov (United States)

    White, Allen R.

    2013-06-01

    The complex chemical structure of the fuel in a candle flame, paraffin, is broken down into smaller hydrocarbons in the dark region just above the candle wick during combustion. This creates fuel-rich, fuel-lean, hydrocarbon-reaction, and combustion-product regions in the flame that are spectroscopically rich, particularly in the infrared. IR emissions from each reaction region were collected via optics focused into an FTIR and used to identify the IR-active species present in that region and, when possible, the temperature of the sampling region. The results of the measurements are useful for combustion reaction modeling as well as for future validation of mass spectrometry sampling systems.

  13. The Histological and Immunohistochemical Features of the Skin Lesions in CANDLE Syndrome

    Science.gov (United States)

    Torrelo, Antonio; Colmenero, Isabel; Requena, Luis; Paller, Amy S.; Ramot, Yuval; Lee, Chyi-Chia Richard; Vera, Angel; Zlotogorski, Abraham; Goldbach-Mansky, Raphaela; Kutzner, Heinz

    2015-01-01

    Chronic atypical neutrophilic dermatosis with lipodystrophy and elevated temperature (CANDLE) syndrome is a newly characterized autoinflammatory disorder, caused by mutations in PSMB8. It is characterized by early-onset fevers, accompanied by a widespread, violaceous and often annular, cutaneous eruption. While the exact pathogenesis of this syndrome is still obscure, it is postulated that the inflammatory disease manifestations stem from excess secretion of interferons. Based on preliminary blood cytokine and gene expression studies, the signature seems to come mostly from type I interferons, which are proposed to lead to the recruitment of immature myeloid cells into the dermis and subcutis. In this study, we systematically analyzed skin biopsies from 6 CANDLE syndrome patients by routine histopathology and immunohistochemistry methods. Skin lesions showed the presence of extensive mixed dermal and subcutaneous inflammatory infiltrate, composed of mononuclear cells, atypical myeloid cells, neutrophils, eosinophils and some mature lymphocytes. Positive LEDER and myeloperoxidase staining supported the presence of myeloid cells. Positive CD68/PMG1 and CD163 staining confirmed the existence of histiocytes and monocytic macrophages in the inflammatory infiltrate. CD123 staining was positive, demonstrating the presence of plasmacytoid dendritic cells. Uncovering the unique histopathologic and immunohistochemical features of CANDLE syndrome provides tools for rapid and specific diagnosis of this disorder as well as further insight into the pathogenesis of this severe, life-threatening condition. PMID:26091509

  14. A Double Candle-Flame-Shaped Solar Flare Observed by SDO and STEREO

    Science.gov (United States)

    Gou, T.; Liu, R.; Wang, Y.; Liu, K.; Zhuang, B.; Zhang, Q.; Liu, J.

    2015-12-01

    We investigate an M1.4 flare occurring on 2011 January 28 near the northwest solar limb. The flare loop system exhibits a double candle-flame configuration in SDO/AIA's hot passbands, sharing a much larger cusp-shaped structure. The results of DEM analysis show that each candle flame has a temperature distribution similar to that of the famous Tsuneta flare. STEREO-A provides a view from directly above the flare, and in SECCHI/EUVI 195 Å the post-flare loops are observed to propagate eastward. We performed a 3D reconstruction of the post-flare loops with AIA and EUVI data. With the aid of the squashing factor Q, based on a potential extrapolation of the photospheric field, we recognized that the footpoints of the post-flare loops were slipping along high-Q lines on the photosphere, and the reconstructed loops share similarity with the field lines traced starting from the high-Q lines. The heights of the loops increase as they slip horizontally eastward, giving the loop tops a velocity of about 10 km/s. An extremely large EUV late phase in Fe XVI 33.5 nm observed by SDO/EVE is suggested to be related to the slipping magnetic reconnection occurring in the quasi-separatrix layers (QSLs), whose photospheric footprints are marked by the high-Q lines.

  15. Using slow-release permanganate candles to remove TCE from a low permeable aquifer at a former landfill.

    Science.gov (United States)

    Christenson, Mark D; Kambhu, Ann; Comfort, Steve D

    2012-10-01

    Past disposal of industrial solvents into unregulated landfills is a significant source of groundwater contamination. In 2009, we began investigating a former unregulated landfill with known trichloroethene (TCE) contamination. Our objective was to pinpoint the location of the plume and treat the TCE using in situ chemical oxidation (ISCO). We accomplished this by using electrical resistivity imaging (ERI) to survey the landfill and map the subsurface lithology. We then used the ERI survey maps to guide direct-push groundwater sampling. A TCE plume (100-600 μg L⁻¹) was identified in a low-permeability silty-clay aquifer (K_h = 0.5 m d⁻¹) within 6 m of the ground surface. To treat the TCE, we manufactured slow-release potassium permanganate candles (SRPCs) that were 91.4 cm long and either 5.1 cm or 7.6 cm in diameter. For comparison, we inserted equal masses of SRPCs (7.6-cm versus 5.1-cm diameter) into the low-permeability aquifer in staggered rows that intersected the TCE plume. The 5.1-cm diameter candles were inserted using direct-push rods, while the 7.6-cm SRPCs were placed in 10 permanent wells. Pneumatic circulators that emitted small air bubbles were placed below the 7.6-cm SRPCs in the second year. Results 15 months after installation showed significant TCE reductions in the 7.6-cm candle treatment zone (67-85%) and decreases of 10-66% in wells impacted by the direct-push candles. These results support using slow-release permanganate candles as a means of treating chlorinated solvents in low-permeability aquifers. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. SYNTHESIS ALKANOLAMIDE TETRAHIDROXY OCTADECANOATE COMPOUND FROM CANDLE NUT OIL

    Directory of Open Access Journals (Sweden)

    Daniel Daniel

    2010-06-01

    Full Text Available Candle nut oil can be transesterified with methanol, using concentrated H2SO4 as a catalyst, to form fatty acid methyl esters. Methyl linoleate can be separated from the fatty acid methyl ester (FAME) mixture by partition column chromatography; it was then treated with ethanolamine under basic conditions, in benzene as solvent with sodium methylate as catalyst, at reflux for 6 hours to form an alkanolamide. The alkanolamide was epoxidized with tert-butyl hydroperoxide and peroxygenase as catalyst, refluxed for 6 hours at 40 °C under nitrogen, to form the epoxy alkanolamide octadecanoate, which was then hydrolyzed with 0.1 M HCl to form alkanolamide tetrahydroxy octadecanoate (polyol). The alkanolamide tetrahydroxy octadecanoate was separated by column chromatography on silica gel H 40, with a 90:10:1 (v/v/v) mixture of chloroform, ethyl acetate, and formic acid as eluent. The HLB value determined for alkanolamide tetrahydroxy octadecanoate is 13.096; this compound is therefore particularly suitable for application as an o/w emulsifier. All of the reaction steps were confirmed by FT-IR, 1H-NMR, GC-MS, gas chromatography, and TLC.   Keywords: Esterification, Candle nut oil, Surfactant, Amidation, Polyol.

  17. A Large and Pristine Sample of Standard Candles across the Milky Way: ∼100,000 Red Clump Stars with 3% Contamination

    Science.gov (United States)

    Ting, Yuan-Sen; Hawkins, Keith; Rix, Hans-Walter

    2018-05-01

    Core helium-burning red clump (RC) stars are excellent standard candles in the Milky Way. These stars may have more precise distance estimates from spectrophotometry than from Gaia parallaxes beyond 3 kpc. However, RC stars have values of T eff and {log}g that are very similar to some red giant branch (RGB) stars. Especially for low-resolution spectroscopic studies, where T eff, {log}g, and [Fe/H] can only be estimated with limited precision, separating RC stars from the RGB through established methods can incur ∼20% contamination. Recently, Hawkins et al. demonstrated that the additional information in single-epoch spectra, such as the C/N ratio, can be exploited to cleanly differentiate RC and RGB stars. In this second paper of the series, we establish a data-driven mapping from spectral flux space to independently determined asteroseismic parameters, the frequency and the period spacing. From this, we identify 210,371 RC stars from the publicly available LAMOST DR3 and APOGEE DR14 data, with ∼9% contamination. We provide an RC sample of 92,249 stars with a contamination of only ∼3% by restricting the combined analysis to LAMOST stars with S/Npix ≥ 75. This demonstrates that high-signal-to-noise ratio (S/N), low-resolution spectra covering a broad wavelength range can identify RC samples at least as pristine as their high-resolution counterparts. As ongoing and upcoming surveys such as TESS, DESI, and LAMOST continue to improve the overlapping spectroscopic-asteroseismic training sample, the method presented in this study provides an efficient and straightforward way to derive a vast yet pristine sample of RC stars to reveal the three-dimensional (3D) structure of the Milky Way.
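The pipeline described above ultimately reduces to a sample cut: keep only stars whose data-driven classification is confidently RC and whose spectra exceed a signal-to-noise threshold, trading sample size for purity. A minimal sketch of that selection logic (the field names, probability threshold, and toy catalog are hypothetical; only the S/Npix ≥ 75 cut is from the abstract):

```python
# Purity/size trade-off: restrict a classified sample by an S/N cut and a
# classification-confidence cut. All records and thresholds except the S/N
# value of 75 are illustrative, not from the paper's catalog.

def select_rc(stars, snr_cut=75.0, p_rc_cut=0.5):
    """Keep stars classified as red clump (p_rc above threshold) with S/N above cut."""
    return [s for s in stars if s["snr"] >= snr_cut and s["p_rc"] >= p_rc_cut]

stars = [
    {"id": 1, "snr": 120.0, "p_rc": 0.95},  # high-S/N, confident RC -> kept
    {"id": 2, "snr": 40.0,  "p_rc": 0.80},  # RC-like but low S/N -> dropped
    {"id": 3, "snr": 90.0,  "p_rc": 0.20},  # likely RGB -> dropped
]
clean = select_rc(stars)
print([s["id"] for s in clean])  # [1]
```

Raising either threshold shrinks the sample but lowers contamination, which is exactly the ∼9% to ∼3% trade the abstract reports when restricting to S/Npix ≥ 75.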

  18. Lung inflammation and genotoxicity in mice lungs after pulmonary exposure to candle light combustion particles

    DEFF Research Database (Denmark)

    Skovmand, Astrid; Damiao Gouveia, Ana Cecilia; Koponen, Ismo Kalevi

    2017-01-01

    Candle burning produces a large amount of particles that contribute substantially to the exposure to indoor particulate matter. Exposures to various types of combustion particles, such as diesel exhaust particles, have been associated with increased risk of lung cancer by mechanisms that involve oxidative stress, inflammation and genotoxicity. The aim of this study was to compare the pulmonary effects of candle light combustion particles (CP) with two benchmark diesel exhaust particles (A-DEP and SRM2975). Intratracheal (i.t.) instillation of CP (5 mg/kg bodyweight) in C57BL/6n mice produced […] A-DEP or SRM2975. The i.t. instillation of CP did not generate oxidative damage to DNA in lung tissue, measured as DNA strand breaks and human 8-oxoguanine glycosylase-sensitive sites by the comet assay. The lack of genotoxic response was confirmed in lung epithelial (A549) cells, although the exposure to CP […]

  19. Innovative Energy Planning and Nuclear Option Using CANDLE Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Sekimoto, H; Nagata, A; Mingyu, Y [Tokyo Institute of Technology, Tokyo (Japan)]

    2008-07-01

    A new reactor burn-up strategy, CANDLE (Constant Axial shape of Neutron flux, nuclide densities and power shape During Life of Energy producing reactor), has been proposed, in which the shapes of the neutron flux, nuclide density, and power density distributions remain constant but move upward (or downward) along the core axis. This burn-up strategy offers many merits. The change of excess reactivity over burn-up is theoretically zero for the ideal equilibrium condition, so shim rods are not required and the reactor is free from accidents induced by unexpected control rod withdrawal. The core characteristics, such as power feedback coefficients and the power peaking factor, do not change over the operating life; operation therefore becomes much easier than in conventional reactors. The infinite-medium neutron multiplication factor of the replacement fuel is less than unity, so the transportation and storage of replacement fuel are easy and safe, being free from criticality accidents. A small long-life fast reactor with the CANDLE burn-up concept has been investigated with depleted uranium as the replacement fuel. Both the core diameter and height are chosen to be 2.0 m, and the thermal power is 200 MW. Lead-bismuth is used as the coolant, and nitride (enriched N-15) fuel is employed. The velocity of the burning region is less than 1.0 cm/year, which readily enables a long-life design. The core-averaged discharge burn-up is about 40 percent, roughly ten times that of a light water reactor, so the spent fuel volume becomes one-tenth of light water reactor spent fuel. If a light water reactor of a certain power output has been operated for 40 years, a CANDLE reactor could be operated for 2000 years at the same power output using only the depleted uranium left over from fuel production for the light water reactor. The system does not need any reprocessing or enrichment. Therefore, the reactor operation becomes very safe, and the waste […]

  20. Innovative Energy Planning and Nuclear Option Using CANDLE Reactors

    International Nuclear Information System (INIS)

    Sekimoto, H.; Nagata, A.; Mingyu, Y.

    2008-01-01

    A new reactor burn-up strategy, CANDLE (Constant Axial shape of Neutron flux, nuclide densities and power shape During Life of Energy producing reactor), has been proposed, in which the shapes of the neutron flux, nuclide density, and power density distributions remain constant but move upward (or downward) along the core axis. This burn-up strategy offers many merits. The change of excess reactivity over burn-up is theoretically zero for the ideal equilibrium condition, so shim rods are not required and the reactor is free from accidents induced by unexpected control rod withdrawal. The core characteristics, such as power feedback coefficients and the power peaking factor, do not change over the operating life; operation therefore becomes much easier than in conventional reactors. The infinite-medium neutron multiplication factor of the replacement fuel is less than unity, so the transportation and storage of replacement fuel are easy and safe, being free from criticality accidents. A small long-life fast reactor with the CANDLE burn-up concept has been investigated with depleted uranium as the replacement fuel. Both the core diameter and height are chosen to be 2.0 m, and the thermal power is 200 MW. Lead-bismuth is used as the coolant, and nitride (enriched N-15) fuel is employed. The velocity of the burning region is less than 1.0 cm/year, which readily enables a long-life design. The core-averaged discharge burn-up is about 40 percent, roughly ten times that of a light water reactor, so the spent fuel volume becomes one-tenth of light water reactor spent fuel. If a light water reactor of a certain power output has been operated for 40 years, a CANDLE reactor could be operated for 2000 years at the same power output using only the depleted uranium left over from fuel production for the light water reactor. The system does not need any reprocessing or enrichment. Therefore, the reactor operation becomes very safe, and the waste […]
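The 40-year-to-2000-year claim above can be checked with round numbers. Under two assumptions not stated in the abstract (a once-through LWR discharge burnup of roughly 4%, i.e. one-tenth of CANDLE's 40%, and roughly 5 kg of depleted uranium left in enrichment tails per kg of enriched LWR fuel produced), the arithmetic works out as follows:

```python
# Back-of-envelope check of the claim that a CANDLE reactor could run for
# 2000 years on the depleted uranium left over from 40 years of LWR operation.
# Both factors below are assumed round numbers, not values from the abstract.

lwr_years = 40
burnup_gain = 10        # assumed: ~40% CANDLE discharge burnup vs ~4% in an LWR
tails_mass_factor = 5   # assumed: ~5 kg depleted U per kg of enriched LWR fuel

candle_years = lwr_years * burnup_gain * tails_mass_factor
print(candle_years)  # 2000
```

This is only a consistency sketch; the abstract's figure presumably comes from a detailed core depletion calculation, not this multiplication.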

  1. A Hubble Space Telescope survey for novae in M87 - III. Are novae good standard candles 15 d after maximum brightness?

    Science.gov (United States)

    Shara, Michael M.; Doyle, Trisha F.; Pagnotta, Ashley; Garland, James T.; Lauer, Tod R.; Zurek, David; Baltz, Edward A.; Goerl, Ariel; Kovetz, Attay; Machac, Tamara; Madrid, Juan P.; Mikołajewska, Joanna; Neill, J. D.; Prialnik, Dina; Welch, D. L.; Yaron, Ofer

    2018-02-01

    Ten weeks of daily imaging of the giant elliptical galaxy M87 with the Hubble Space Telescope (HST) has yielded 41 nova light curves of unprecedented quality for extragalactic cataclysmic variables. We have recently used these light curves to demonstrate that the observational scatter in the so-called maximum-magnitude rate of decline (MMRD) relation for classical novae is so large as to render the nova-MMRD useless as a standard candle. Here, we demonstrate that a modified Buscombe-de Vaucouleurs hypothesis, namely that novae with decline times t2 > 10 d converge to nearly the same absolute magnitude about two weeks after maximum light in a giant elliptical galaxy, is supported by our M87 nova data. For 13 novae with daily sampled light curves, well determined times of maximum light in both the F606W and F814W filters, and decline times t2 > 10 d we find that M87 novae display M606W,15 = -6.37 ± 0.46 and M814W,15 = -6.11 ± 0.43. If very fast novae with decline times t2 < 10 d are excluded, the distances to novae in elliptical galaxies with stellar binary populations similar to those of M87 should be determinable with 1σ accuracies of ± 20 per cent with the above calibrations.
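With the day-15 calibration quoted above (M606W,15 = -6.37 ± 0.46), a nova's apparent magnitude 15 days after maximum gives a distance through the standard distance modulus m − M = 5 log10(d / 10 pc). A short sketch (the example apparent magnitude is hypothetical):

```python
# Distance from the day-15 nova calibration quoted in the abstract.
# M_15 = -6.37 is from the paper; the example apparent magnitude is not.

def distance_mpc(m_15: float, M_15: float = -6.37) -> float:
    """Distance in Mpc via the distance modulus m - M = 5 log10(d / 10 pc)."""
    d_pc = 10.0 ** ((m_15 - M_15 + 5.0) / 5.0)
    return d_pc / 1.0e6

print(f"{distance_mpc(24.8):.1f} Mpc")  # 17.1 Mpc for a hypothetical m_15 = 24.8
```

Note that a ±0.43-0.46 mag scatter in the calibration maps to a fractional distance uncertainty of 10^(σ/5) − 1 ≈ 20-24%, consistent with the ±20 per cent quoted in the abstract.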

  2. The Candle and the Mirror: One Author's Journey as an Outsider.

    Science.gov (United States)

    Moreillon, Judi

    1999-01-01

    Chronicles the author's journey as an outsider who authored a book for children about the harvest traditions of the Tohono O'odham people. Describes how her concern about the lack of literature to serve as a mirror and a candle to reflect and illuminate the lives of Tohono O'odham children led her on a journey that was both painful and affirming.…

  3. Compilation of Published PM2.5 Emission Rates for Cooking, Candles and Incense for Use in Modeling of Exposures in Residences

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Tianchao [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Singer, Brett C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Logue, Jennifer M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-08-01

    A recent analysis of health impacts from air pollutant inhalation in homes found that PM2.5 is the most damaging pollutant at the population level. Chronic exposure to elevated PM2.5 can damage the human respiratory system and may result in premature death. PM2.5 exposures in homes can be mitigated through various approaches, including kitchen exhaust ventilation, filtration, indoor pollutant source reduction, and designing ventilation systems to reduce the entry of PM2.5 from outdoors. Analysis of the potential benefits and costs of the various approaches can be accomplished using computer codes that simulate the key physical processes, including emissions, dilution, and ventilation. The largest sources of PM2.5 in residences, broadly, are entry from outdoors and emissions from indoor combustion; the largest indoor sources are tobacco combustion (smoking), cooking, and the burning of candles and incense. Data on the magnitude of PM2.5 and other pollutant emissions from these events and processes are required to conduct such simulations. The goal of this study was to produce a database of pollutant emission rates associated with cooking and the burning of candles and incense. The target use of these data is indoor air quality modeling.

  4. DEVELOPMENT OF AN ADHESIVE CANDLE FILTER SAFEGUARD DEVICE

    International Nuclear Information System (INIS)

    John P. Hurley; Ann K. Henderson; Jan W. Nowok; Michael L. Swanson

    2002-01-01

    In order to reach the highest possible efficiencies in a coal-fired turbine-based power system, the turbine should be directly fired with the products of coal conversion. Two main types of systems employ these turbines: those based on pressurized fluidized-bed combustors and those based on integrated gasification combined cycles. In both systems, suspended particulates must be cleaned from the gas stream before it enters the turbine so as to prevent fouling and erosion of the turbine blades. To produce the cleanest gas, barrier filters are being developed and are in use in several facilities. Barrier filters are composed of porous, high-temperature materials that allow the hot gas to pass but collect the particulates on the surface. The three main configurations of barrier filters are candle, cross-flow, and tube filters. Both candle and tube filters have been tested extensively. They are composed of a coarsely porous ceramic that serves as a structural support, overlain with a thin, microporous ceramic layer on the dirty-gas side that serves as the primary filter surface. They are highly efficient at removing particulate matter from the gas stream and, because of their ceramic construction, are resistant to gas and ash corrosion. However, ceramics are brittle, and individual elements can fail, allowing particulates to pass through the hole left by the filter element and erode the turbine. Preventing all failures of individual ceramic filter elements is not possible at the present state of development of the technology. Therefore, safeguard devices (SGDs) must be employed to prevent the particulates streaming through occasional broken filters from reaching the turbine. However, the SGD must allow for the free passage of gas when it is not activated. Upon the breaking of a filter, the SGD must either mechanically close or quickly plug with filter dust to prevent additional dust from reaching the turbine. Production of a dependable, rapidly closing autonomous mechanical […]

  5. The impact of candle burning during All Saints' Day ceremonies on ambient alkyl-substituted benzene concentrations.

    Science.gov (United States)

    Olszowski, Tomasz; Kłos, Andrzej

    2013-11-01

    Research findings concerning benzene, toluene, ethylbenzene, meta-, para- and ortho-xylene, and styrene (BTEXS) emissions at public cemeteries during All Saints' Day are presented here. Tests were carried out at town-located cemeteries in Opole and Grodków (southern Poland) and, as a benchmark, at the centres of those same towns. The purpose of the study was to estimate BTEXS emissions caused by candle burning and, equally important, to examine whether the emissions generated by the tested sources were similar to the BTEXS emissions generated by road transport. During the festive period, significant increases in benzene concentrations, by 200% and 144%, were noted at the cemeteries in Opole and Grodków, as well as in toluene, by 366% and 342%, respectively. Styrene concentrations also increased. It was demonstrated that the ratio of toluene to benzene concentrations from emissions caused by the burning candles is comparable to the ratio established for transportation emissions.
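The study's two key derived quantities are simple: the festive-period percentage increase for each compound, and the toluene-to-benzene (T/B) ratio used to compare candle and traffic emissions. A minimal sketch (all concentration values below are hypothetical, chosen only to mirror the quoted 200% rise):

```python
# Two derived quantities from the BTEXS study above. Concentrations are in
# arbitrary units and are hypothetical, not measured values from the paper.

def pct_increase(festive: float, baseline: float) -> float:
    """Percentage increase of the festive-period concentration over baseline."""
    return 100.0 * (festive - baseline) / baseline

def tb_ratio(toluene: float, benzene: float) -> float:
    """Toluene-to-benzene concentration ratio, used as a source signature."""
    return toluene / benzene

print(pct_increase(festive=6.0, baseline=2.0))       # 200.0
print(f"{tb_ratio(toluene=9.3, benzene=4.2):.2f}")   # 2.21
```

Comparing the T/B ratio against the value measured near traffic is what lets the authors argue the candle source resembles transportation emissions.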

  6. Development of CANDLES low background HPGe detector and half-life measurement of 180Tam

    Science.gov (United States)

    Chan, W. M.; Kishimoto, T.; Umehara, S.; Matsuoka, K.; Suzuki, K.; Yoshida, S.; Nakajima, K.; Iida, T.; Fushimi, K.; Nomachi, M.; Ogawa, I.; Tamagawa, Y.; Hazama, R.; Takemoto, Y.; Nakatani, N.; Takihira, Y.; Tozawa, M.; Kakubata, H.; Trang, V. T. T.; Ohata, T.; Tetsuno, K.; Maeda, T.; Khai, B. T.; Li, X. L.; Batpurev, T.

    2018-01-01

    A low background HPGe detector system was developed at the CANDLES Experimental Hall for multipurpose use. Various low background techniques were employed, including a hermetic shield design, radon gas suppression, and background reduction analysis. A new pulse shape discrimination (PSD) method was specially created for the coaxial Ge detector. Using this PSD method, microphonics noise and background events in the low-energy region below 200 keV can be rejected effectively. Monte Carlo simulation with GEANT4 was performed to obtain the detection efficiency and to study the interaction of gamma-rays with the detector system. For rare-decay measurement, the detector was used to search for the decay of tantalum-180m (180Tam), nature's most stable isomer. Two phases of the tantalum physics run were completed with a total livetime of 358.2 days, with Phase II using an upgraded shield configuration. The world's most stringent half-life limit on 180Tam has been achieved.
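A half-life limit like the one reported for 180Tam is conventionally derived from the number of source atoms, the detection efficiency, the livetime, and an upper limit on the observed signal counts. A generic, hedged sketch of that relation (all inputs except the 358.2-day livetime are hypothetical; this is not the paper's analysis):

```python
import math

# Generic lower-limit formula for a rare-decay search, in the no-signal limit:
#   T_half > ln(2) * N_atoms * efficiency * livetime / N_limit
# where N_limit is the upper limit on signal counts. Inputs are hypothetical
# except the 358.2-day livetime quoted in the abstract.

def half_life_limit(n_atoms: float, efficiency: float,
                    livetime_yr: float, n_limit: float) -> float:
    """Lower limit on half-life (years) from a counting experiment."""
    return math.log(2) * n_atoms * efficiency * livetime_yr / n_limit

limit = half_life_limit(n_atoms=1e23, efficiency=0.03,
                        livetime_yr=358.2 / 365.25, n_limit=10)
print(f"T_half > {limit:.2e} years")  # ≈ 2.04e+20 years for these toy inputs
```

The real analysis would additionally fold in branching ratios, peak-specific efficiencies from the GEANT4 simulation, and a proper statistical upper limit on counts.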

  7. Study on the application of CANDLE burnup strategy to several nuclear reactors. JAERI's nuclear research promotion program, H13-002 (Contract research)

    International Nuclear Information System (INIS)

    Kunitomi, Kazuhiko

    2005-03-01

    The CANDLE burnup strategy is a new reactor burnup concept in which the distributions of fuel nuclide densities, neutron flux, and power density move at the same constant speed from the bottom to the top (or from the top to the bottom) of the core, without any change in their shapes. Therefore, no burnup control mechanisms are required, and the reactor characteristics do not change with burnup. The reactor is simple and safe. When this burnup scheme is applied to some neutron-rich fast reactors, either natural or depleted uranium can be used as the fresh fuel after the second core, and the burnup of the discharged fuel is about 40%. This means that nuclear energy can be utilized for many hundreds of years without new mining, enrichment, or reprocessing, and the amount of spent fuel can be reduced considerably. However, some innovative technologies must be developed to achieve such a high fuel burnup. Although the development of innovative fuel will take a long time, intermediate re-cladding may be easier to employ. Compared to fast reactors, the application of CANDLE burnup to prismatic-fuel high-temperature gas-cooled reactors is very easy. In this report the application of CANDLE burnup to both of these reactor types is studied. (author)

  8. Smart candle soot coated membranes for on-demand immiscible oil/water mixture and emulsion switchable separation.

    Science.gov (United States)

    Li, Jian; Zhao, Zhihong; Li, Dianming; Tian, Haifeng; Zha, Fei; Feng, Hua; Guo, Lin

    2017-09-21

    Oil/water separation is of great importance for the treatment of oily wastewater, including immiscible light/heavy oil-water mixtures and oil-in-water or water-in-oil emulsions. Smart surfaces with responsive wettability have received extensive attention, especially for controllable oil/water separation. However, traditional smart membranes with a wettability switchable between superhydrophobicity and superhydrophilicity are limited to certain responsive materials and require continuous external stimuli, such as pH, an electrical field, or light irradiation. Herein, a candle soot coated mesh (CSM) with a larger pore size and a candle soot coated PVDF membrane (CSP) with a smaller pore size, both exhibiting underwater superoleophobicity and underoil superhydrophobicity, were successfully fabricated; they can be used for on-demand separation of immiscible oil/water mixtures and surfactant-stabilized oil/water emulsions, respectively. Without any continuous external stimulus, the wettability of the membranes could be reversibly switched between underwater superoleophobicity and underoil superhydrophobicity simply by drying and washing alternately, thus achieving effective and switchable oil/water separation with excellent separation efficiency. We believe that such smart materials will be promising candidates for the removal of oil pollutants in the future.

  9. Isolation of bacteria from diabetic foot ulcers with special reference to anaerobe isolation by simple two-step combustion technique in candle jar

    Directory of Open Access Journals (Sweden)

    Jayeeta Haldar

    2017-01-01

    Results: All 43 samples were culture positive, of which aerobic Gram-negative bacteria (GNB) predominated, followed by Staphylococcus aureus, Enterococcus and diphtheroids. Anaerobes, isolated from 21 samples, were Peptostreptococcus, Bacteroides, Porphyromonas, Veillonella spp. and Clostridium perfringens by both the GasPak and the in-house developed, modified candle jar techniques. Imipenem and metronidazole were the most effective drugs against the anaerobes, while clindamycin, penicillin and cefoxitin were the least effective. Aerobic GNB were found to be multidrug resistant, especially to penicillin and cephalosporins; the most effective drug was piperacillin-tazobactam. Interpretation & conclusions: For the isolation of anaerobes from clinical specimens such as diabetic foot ulcers, the modified candle jar technique was found to be as reliable as the GasPak system. This modified technique needs to be tested on many other clinical materials that have not yet been evaluated.

  10. Haugh Unit: Gold Standard of Egg Quality

    Science.gov (United States)

    Rapidly determining shell egg quality in an objective manner is not an easy task. Candling is most often utilized as a quick method for assessing egg quality in a non-destructive manner, but it is a highly subjective method. As you have experienced this week, when candling, it is almost impossibl...

  11. Using multiple continuous fine particle monitors to characterize tobacco, incense, candle, cooking, wood burning, and vehicular sources in indoor, outdoor, and in-transit settings

    Science.gov (United States)

    Ott, Wayne R.; Siegmann, Hans C.

    This study employed two continuous particle monitors operating on different measurement principles to measure concentrations simultaneously from common combustion sources in indoor, outdoor, and in-transit settings. The pair of instruments use (a) photo-charging (PC), operating on the principle of photo-ionization of fine particles, which responds to surface particulate polycyclic aromatic hydrocarbons (PPAHs), and (b) diffusion charging (DC), calibrated to measure the active surface area of fine particles. The sources studied included: (1) secondhand smoke (cigarettes, cigars, and pipes), (2) incense (stick and cone), (3) candles used as food warmers, (4) cooking (toasting bread and frying meat), (5) fireplaces and ambient wood smoke, and (6) in-vehicle exposures traveling on California arterials and interstate highways. The ratio of the PC to the DC readings, the PC/DC ratio, was found to differ across the major categories of sources. Cooking, burning toast, and using a "canned heat" food warmer gave PC/DC ratios close to zero. Controlled experiments with 10 cigarettes averaged 0.15 ng mm-2 (ranging from 0.11 to 0.19 ng mm-2), which was similar to the PC/DC ratio for a cigar, although a pipe was slightly lower (0.09 ng mm-2). Large incense sticks had PC/DC ratios similar to those of cigarettes and cigars. The PC/DC ratios for ambient wood smoke averaged 0.29 ng mm-2 on 6 dates, about twice those of cigarettes and cigars, reflecting a higher ratio of PAH to active surface area. The smoke from two artificial logs in a residential fireplace had a PC/DC ratio of 0.33-0.35 ng mm-2. The emissions from candles were found to vary, depending on how the candles were burned: if the candle flickered and generated soot, a higher PC/DC ratio resulted than if the candle burned uniformly in still air. Inserting a piece of metal into the candle's flame caused high PPAH emissions, with a record PC/DC reading of 1.8 ng mm-2. In-vehicle exposures measured on 43- and 50-min drives on a […]
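The discriminating quantity in this study is the PC/DC ratio (PPAH signal over active surface area, in ng mm-2), which differs systematically by source category. A hedged sketch of using the abstract's quoted ratios as rough class boundaries (the exact thresholds are illustrative, not fitted values from the paper):

```python
# Toy source classifier built on the PC/DC ratios quoted in the abstract:
# cooking ~0, cigarettes ~0.15, ambient wood smoke ~0.29, disturbed candle
# flame up to ~1.8 ng/mm^2. Boundary values are illustrative assumptions.

def classify_pc_dc(ratio: float) -> str:
    """Rough combustion-source category from a PC/DC ratio in ng/mm^2."""
    if ratio < 0.05:
        return "cooking/food-warmer"   # PC/DC close to zero
    if ratio < 0.25:
        return "tobacco/incense"       # cigarettes averaged ~0.15
    if ratio < 0.50:
        return "wood smoke"            # ambient wood smoke ~0.29
    return "sooting flame"             # flame-disturbed candle, up to ~1.8

print(classify_pc_dc(0.15))  # tobacco/incense
print(classify_pc_dc(0.33))  # wood smoke
```

In practice the categories overlap (e.g. incense versus cigarettes), so the ratio is a source signature rather than a definitive classifier.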

  12. Study of Standard Model processes with leptons of high transverse momentum with the ATLAS detector

    CERN Document Server

    Petridis, Andreas

    This PhD thesis takes up various aspects of experimental particle physics by analyzing the first data of the ATLAS detector. The main subject of the thesis is the production cross section measurement of the ZZ process at sqrt{s} = 7 TeV in proton-proton collisions. To this end, the author has contributed to various topics, such as detector-related issues, the study of standard candle processes, and Monte Carlo studies. Specifically, this thesis presents a detailed study of the improvement of the hit position and peaking time resolution of the CSCs, with 0.6% and 0.4% relative errors respectively. The thesis also contributed to the first inclusive Z cross section measurement at sqrt{s} = 7 TeV, analyzing the first 316 nb-1 of data recorded by the ATLAS detector. The Z->ll cross section measurement is used as a standard candle for detector performance assessment as well as for the tuning of theoretical predictions in the new energy regime. The fiducial and total cross sections have […]

  13. When a Standard Candle Flickers

    DEFF Research Database (Denmark)

    Wilson-Hodge, Colleen A; Cherry, Michael L; Case, Gary L

    2011-01-01

    […] –100 keV band with GBM, Swift/BAT, and INTEGRAL/IBIS. The pulsed flux measured with RXTE/PCA since 1999 is consistent with the pulsar spin-down, indicating that the observed changes are nebular. Correlated variations in the Crab Nebula flux on a ∼3 year timescale are also seen independently […]

  14. Polymer-based candle-shaped microneedle electrodes for electroencephalography on hairy skin

    Science.gov (United States)

    Arai, Miyako; Kudo, Yuta; Miki, Norihisa

    2016-06-01

    In this paper, we report on the optimization of the shape of dry microneedle electrodes for electroencephalography (EEG) at hairy locations and compare the electrodes we developed with conventional wet electrodes. We propose the use of SU-8-based candle-shaped microneedle electrodes (CMEs), which have pillars of 1.0 mm height and 0.4 mm diameter with a gap of 0.43 mm between pillars; microneedles are formed on the tops of the pillars. The shape was determined by how well the pillars avoid hairs and support the microneedles as they penetrate the stratum corneum. The skin-electrode contact impedances of the fabricated CMEs were found to be higher and less stable than those of conventional wet electrodes. However, the CMEs successfully acquired signals with qualities as good as those of conventional wet electrodes. Given the usability of the CMEs, which require neither skin preparation nor gel, they are promising alternatives to conventional wet electrodes.

  15. Superluminous supernovae as standardizable candles and high-redshift distance probes

    Energy Technology Data Exchange (ETDEWEB)

    Inserra, C.; Smartt, S. J., E-mail: c.inserra@qub.ac.uk [Astrophysics Research Centre, School of Mathematics and Physics, Queen's University Belfast, Belfast BT7 1NN (United Kingdom)

    2014-12-01

    We investigate the use of type Ic superluminous supernovae (SLSN Ic) as standardizable candles and distance indicators. Their appeal as cosmological probes stems from their remarkable peak luminosities, hot blackbody temperatures, and bright rest-frame ultraviolet emission. We present a sample of 16 published SLSN, from redshifts 0.1 to 1.2, and calculate accurate K corrections to determine uniform magnitudes in two synthetic rest-frame filter bandpasses with central wavelengths at 400 nm and 520 nm. At 400 nm, we find an encouragingly low scatter in their uncorrected, raw mean magnitudes, with M(400) = –21.86 ± 0.35 mag for the full sample of 16 objects. We investigate the correlation between their decline rates and peak magnitudes and find that the brighter events appear to decline more slowly. In a manner similar to the Phillips relation for type Ia SNe (SNe Ia), we define a ΔM20 decline relation, which correlates peak magnitude with the decline over 20 days and can reduce the scatter in standardized peak magnitudes to ±0.22 mag. We further show that M(400) appears to have a strong color dependence: redder objects are fainter and also become redder faster. Using this peak magnitude-color evolution relation, a surprisingly low scatter of between ±0.08 mag and ±0.13 mag can be found in peak magnitudes, depending on sample selection. However, we caution that only 8 to 10 objects currently have enough data to test this peak magnitude-color evolution relation. We conclude that SLSN Ic are promising distance indicators in the high-redshift universe, in regimes beyond those possible with SNe Ia. Although the empirical relationships are encouraging, the unknown progenitor systems, how they may evolve with redshift, and the uncertain explosion physics are of some concern. The two major measurement uncertainties are the limited number of low-redshift, well-studied objects available to test these relationships and internal dust extinction in the host galaxies.
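A decline-rate standardization like the ΔM20 relation corrects the raw peak magnitude using its correlation with the decline over the first 20 days, in the spirit of the Phillips relation. A toy sketch (the slope, reference decline, and example magnitudes are hypothetical; only the qualitative trend, that brighter events decline more slowly, is from the abstract):

```python
# Toy Delta-M20 standardization: remove the correlation between peak magnitude
# and 20-day decline. The slope and reference decline are hypothetical fit
# parameters, not values from the paper.

def standardized_peak(m_peak: float, dm20: float,
                      slope: float = 0.75, dm20_ref: float = 1.0) -> float:
    """Peak magnitude (mag) corrected for the decline over 20 days (mag)."""
    return m_peak - slope * (dm20 - dm20_ref)

# A bright slow decliner and a fainter fast decliner standardize to the same
# value under this toy relation:
print(f"{standardized_peak(-22.1, 0.6):.1f}")  # -21.8
print(f"{standardized_peak(-21.5, 1.4):.1f}")  # -21.8
```

The slope sign encodes the observed trend: larger ΔM20 (faster decline) corresponds to an intrinsically fainter event, so the correction pulls both toward a common standardized magnitude, shrinking the scatter as described above.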

  16. CANDLE reactor: an option for simple, safe, high nuclear proliferation resistant , small waste and efficient fuel use reactor

    International Nuclear Information System (INIS)

    Sekimoto, H.

    2010-01-01

    Innovative nuclear energy systems have been investigated intensively over a long period in the COE-INES program and CRINES activities at the Tokyo Institute of Technology. Five requirements (sustainability, safety, waste, nuclear proliferation, and economy) are considered inevitable requirements for nuclear energy. The characteristics of the small LBE-cooled CANDLE fast reactor developed at this institute are discussed against these requirements. It clearly satisfies four of them: safety, nonproliferation and safeguards, less waste, and sustainability. For the remaining requirement, economy, the reactor is also shown to have high potential.

  17. Identification of predominant odorants in thai desserts flavored by smoking with "Tian Op", a traditional Thai scented candle.

    Science.gov (United States)

    Watcharananun, Wanwarang; Cadwallader, Keith R; Huangrak, Kittiphong; Kim, Hun; Lorjaroenphon, Yaowapa

    2009-02-11

    "Tian Op", a traditional Thai scented candle, is used for the smoking and flavoring of sweets, cakes, and other desserts for the purpose of adding a unique aroma to the final product. Gas chromatography-olfactometry, aroma extract dilution analysis, and GC-MS were applied to identify the potent odorants in two types of traditional Thai desserts ("num dok mai" and "gleep lum duan") prepared using a Tian Op smoking process. On the basis of the results of AEDA and calculated odor-activity values, the predominant odorants in the Tian Op flavored desserts were vinyl ketones (C(5)-C(9)), n-aldehydes (C(5)-C(11)), (E)-2-unsaturated aldehydes (C(8)-C(11)), and omega-1-unsaturated aldehydes (C(8) and C(9)). Sensory studies of model mixtures confirmed the importance of n-aldehydes, omega-1-unsaturated aldehydes, and guaiacol as predominant odorants; however, the results showed that vinyl ketones and (E)-2-unsaturated aldehydes, despite having high odor-activity values, may be of only minor importance in the typical aroma profiles of traditional Tian Op smoked desserts.

  18. Uranium hydrogeochemical and stream sediment reconnaissance of the Candle NTMS quadrangle, Alaska

    International Nuclear Information System (INIS)

    Hardy, L.C.; D'Andrea, R.F. Jr.; Zinkl, R.J.

    1982-07-01

    This report presents results of a Hydrogeochemical and Stream Sediment Reconnaissance (HSSR) of the Candle NTMS quadrangle, Alaska. In addition to this abbreviated data release, more complete data are available to the public in machine-readable form. These machine-readable data, as well as quarterly or semiannual program progress reports containing further information on the HSSR program in general, or on the Los Alamos National Laboratory (LANL) portion of the program in particular, are available from DOE's Technical Library at its Grand Junction Area Office. Presented in this data release are location data, field analyses, and laboratory analyses of several different sample media. For the sake of brevity, many field site observations have not been included in this volume; these data are, however, available on the magnetic tape. Appendices A through D describe the sample media and summarize the analytical results for each medium. The data have been subdivided by one of the Los Alamos National Laboratory sorting programs of Zinkl and others (1981a) into groups of stream-sediment, lake-sediment, stream-water, and lake-water samples. For each group which contains a sufficient number of observations, statistical tables, tables of raw data, and 1:1,000,000 scale maps of pertinent elements have been included in this report. Also included are maps showing results of multivariate statistical analyses. Information on the field and analytical procedures used by the Los Alamos National Laboratory during sample collection and analysis may be found in any HSSR data release prepared by the Laboratory and will not be included in this report

  19. A Hubble Space Telescope Survey for Novae in M87. II. Snuffing out the Maximum Magnitude–Rate of Decline Relation for Novae as a Non-standard Candle, and a Prediction of the Existence of Ultrafast Novae

    Energy Technology Data Exchange (ETDEWEB)

    Shara, Michael M.; Doyle, Trisha; Zurek, David [Department of Astrophysics, American Museum of Natural History, Central Park West and 79th Street, New York, NY 10024-5192 (United States); Lauer, Tod R. [National Optical Astronomy Observatory, P.O. Box 26732, Tucson, AZ 85726 (United States); Baltz, Edward A. [KIPAC, SLAC, 2575 Sand Hill Road, M/S 29, Menlo Park, CA 94025 (United States); Kovetz, Attay [School of Physics and Astronomy, Faculty of Exact Sciences, Tel Aviv University, Tel Aviv (Israel); Madrid, Juan P. [CSIRO, Astronomy and Space Science, P.O. Box 76, Epping, NSW 1710 (Australia); Mikołajewska, Joanna [N. Copernicus Astronomical Center, Polish Academy of Sciences, Bartycka 18, PL 00-716 Warsaw (Poland); Neill, J. D. [California Institute of Technology, 1200 East California Boulevard, MC 278-17, Pasadena CA 91125 (United States); Prialnik, Dina [Department of Geosciences, Tel Aviv University, Ramat Aviv, Tel Aviv 69978 (Israel); Welch, D. L. [Department of Physics and Astronomy, McMaster University, Hamilton, L8S 4M1, Ontario (Canada); Yaron, Ofer [Department of Particle Physics and Astrophysics, Weizmann Institute of Science, 76100 Rehovot (Israel)

    2017-04-20

    The extensive grid of numerical simulations of nova eruptions from the work of Yaron et al. first predicted that some classical novae might significantly deviate from the Maximum Magnitude–Rate of Decline (MMRD) relation, which purports to characterize novae as standard candles. Kasliwal et al. have announced the observational detection of a new class of faint, fast classical novae in the Andromeda galaxy. These objects deviate strongly from the MMRD relationship, as predicted by Yaron et al. Recently, Shara et al. reported the first detections of faint, fast novae in M87. These previously overlooked objects are as common in the giant elliptical galaxy M87 as they are in the giant spiral M31; they comprise about 40% of all classical nova eruptions and greatly increase the observational scatter in the MMRD relation. We use the extensive grid of the nova simulations of Yaron et al. to identify the underlying causes of the existence of faint, fast novae. These are systems that have accreted, and can thus eject, only very low-mass envelopes, of the order of 10⁻⁷-10⁻⁸ M⊙, on massive white dwarfs. Such binaries include, but are not limited to, the recurrent novae. These same models predict the existence of ultrafast novae that display decline times, t₂, as short as five hours. We outline a strategy for their future detection.

  20. ASSESSMENT OF THE PCFBC-EXPOSED AND ACCELERATED LIFE-TESTED CANDLE FILTERS; TOPICAL

    International Nuclear Information System (INIS)

    M.A. Alvin

    1999-01-01

    Development of hot gas filtration technology has been the focus of DOE/FETC and Siemens Westinghouse Power Corporation during the past twenty years. Systems development during this time has successfully led to the generation and implementation of high temperature Siemens Westinghouse particulate filtration systems that are currently installed and operational at Demonstration Plant sites, and which are ready for installation at commercial plant sites. Concurrently, materials development has advanced the use of commercially available oxide- and nonoxide-based monoliths, and has fostered the manufacture and use of second generation, oxide-based, continuous fiber reinforced ceramic composites and filament wound materials. This report summarizes the material characterization results for commercially available and second generation filter materials tested in Siemens Westinghouse's advanced, high temperature, particulate removal system at the Foster Wheeler pressurized circulating fluidized-bed combustion pilot-scale test facility in Karhula, Finland, and subsequent extended accelerated life testing of aged elements in the Siemens Westinghouse pressurized fluidized-bed combustion simulator test facility in Pittsburgh, PA. The viability of operating candle filters successfully for over 1 year of service life has been shown in these efforts. Continued testing to demonstrate the feasibility of acquiring three years of service operation on aged filter elements is recommended.

  1. "Type Ia Supernovae: Tools for Studying Dark Energy" Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Woosley, Stan [Lick Observatory, San Jose, CA (United States); Kasen, Dan [Univ. of California, Berkeley, CA (United States)

    2017-05-10

    Final technical report for project "Type Ia Supernovae: Tools for the Study of Dark Energy" awarded jointly to scientists at the University of California, Santa Cruz and Berkeley, for computer modeling, theory and data analysis relevant to the use of Type Ia supernovae as standard candles for cosmology.

  2. Accurate weak lensing of standard candles. II. Measuring σ8 with supernovae

    Science.gov (United States)

    Quartin, Miguel; Marra, Valerio; Amendola, Luca

    2014-01-01

    Soon the number of type Ia supernova (SN) measurements should exceed 100 000. Understanding the effect of weak lensing by matter structures on the supernova brightness will then be more important than ever. Although SN lensing is usually seen as a source of systematic noise, we will show that it can in fact be turned into signal. More precisely, the non-Gaussianity introduced by lensing in the SN Hubble diagram dispersion depends rather sensitively on the amplitude σ8 of the matter power spectrum. By exploiting this relation, we are able to predict constraints on σ8 of 7% (3%) for a catalog of 100 000 (500 000) SNe of average magnitude error 0.12, without having to assume that the intrinsic dispersion and its redshift evolution are known a priori. The intrinsic dispersion has been assumed to be Gaussian; possible intrinsic non-Gaussianities in the data set (due to the SNe themselves and/or to other transients) could potentially be dealt with by means of additional nuisance parameters describing higher moments of the intrinsic dispersion distribution function. This method is independent of and complementary to the standard methods based on cosmic microwave background, cosmic shear, or cluster abundance observables.
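The idea of turning lensing scatter into signal can be illustrated with a toy Monte Carlo. All numbers below are invented for illustration and are not taken from the paper: intrinsic scatter is modeled as Gaussian, while lensing magnification adds a small skewed component, so the skewness of the Hubble-diagram residuals carries the lensing information that, in the real analysis, scales with σ8.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (all values invented): Gaussian intrinsic scatter plus a small
# skewed lensing term built from a zero-mean shifted lognormal variate.
n_sne = 100_000
sigma_int = 0.12    # mag, intrinsic Gaussian dispersion
sigma_lens = 0.04   # mag, rough scale of the lensing contribution

residuals = (rng.normal(0.0, sigma_int, n_sne)
             + sigma_lens * (rng.lognormal(0.0, 1.0, n_sne) - np.exp(0.5)))

# The third standardized moment (skewness) of the residuals is the kind of
# non-Gaussian signature the method exploits; pure Gaussian scatter has none.
skew = np.mean((residuals - residuals.mean())**3) / residuals.std()**3
print(f"sample skewness = {skew:.3f}")
```

In the paper's approach this sensitivity is turned around: the measured higher moments constrain the lensing amplitude, and hence σ8, with the intrinsic dispersion treated as a nuisance parameter.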

  3. Proposed minimum reporting standards for chemical analysis Chemical Analysis Working Group (CAWG) Metabolomics Standards Initiative (MSI)

    Science.gov (United States)

    Amberg, Alexander; Barrett, Dave; Beale, Michael H.; Beger, Richard; Daykin, Clare A.; Fan, Teresa W.-M.; Fiehn, Oliver; Goodacre, Royston; Griffin, Julian L.; Hankemeier, Thomas; Hardy, Nigel; Harnly, James; Higashi, Richard; Kopka, Joachim; Lane, Andrew N.; Lindon, John C.; Marriott, Philip; Nicholls, Andrew W.; Reily, Michael D.; Thaden, John J.; Viant, Mark R.

    2013-01-01

    There is a general consensus that supports the need for standardized reporting of metadata or information describing large-scale metabolomics and other functional genomics data sets. Reporting of standard metadata provides a biological and empirical context for the data, facilitates experimental replication, and enables the re-interrogation and comparison of data by others. Accordingly, the Metabolomics Standards Initiative is building a general consensus concerning the minimum reporting standards for metabolomics experiments; the Chemical Analysis Working Group (CAWG) is one part of this community effort. This article proposes the minimum reporting standards related to the chemical analysis aspects of metabolomics experiments including: sample preparation, experimental analysis, quality control, metabolite identification, and data pre-processing. These minimum standards currently focus mostly upon mass spectrometry and nuclear magnetic resonance spectroscopy due to the popularity of these techniques in metabolomics. However, additional input concerning other techniques is welcomed and can be provided via the CAWG on-line discussion forum at http://msi-workgroups.sourceforge.net/ or http://Msi-workgroups-feedback@lists.sourceforge.net. Further, community input related to this document can also be provided via this electronic forum. PMID:24039616

  4. Measurement of the Inclusive $Z \\to ee$ Production Cross Section in Proton-Proton Collisions at $\\sqrt{s}$ = 7TeV and $Z \\to ee$ Decays as Standard Candles for Luminosity at the Large Hadron Collider

    Energy Technology Data Exchange (ETDEWEB)

    Werner, Jeremy [Princeton U.]

    2011-01-01

    This thesis comprises a precision measurement of the inclusive \Zee production cross section in proton-proton collisions provided by the Large Hadron Collider (LHC) at a center-of-mass energy of $\sqrt{s}=7$~TeV and the absolute luminosity based on \Zee decays. The data was collected by the Compact Muon Solenoid (CMS) detector near Geneva, Switzerland during the year of 2010 and corresponds to an integrated luminosity of $\int\mathcal{L}dt = 35.9\pm 1.4$~pb$^{-1}$. Electronic decays of $Z$ bosons allow one of the first electroweak measurements at the LHC, making the cross section measurement a benchmark of physics performance after the first year of CMS detector and LHC machine operations. It is the first systematic-uncertainty-limited \Zee cross section measurement performed at $\sqrt{s}=7$~TeV. The measured cross section pertaining to the invariant mass window $M_{ee}\in (60,120)$~GeV is reported as: $\sigma(pp\to Z+X) \times \mathcal{B}( Z\to ee ) = 997 \pm 11 \mathrm{(stat)} \pm 19 \mathrm{(syst)} \pm 40 \mathrm{(lumi)} \textrm{ pb}$, which agrees with the theoretical prediction calculated to NNLO in QCD. Leveraging \Zee decays as "standard candles" for measuring the absolute luminosity at the LHC is examined; they are produced copiously, are well understood, and have clean detector signatures. Thus the consistency of the inclusive \Zee production cross section measurement with the theoretical prediction motivates inverting the measurement to instead use the \Zee signal yield to measure the luminosity. The result, which agrees with the primary relative CMS luminosity measurement calibrated using Van der Meer separation scans, is not only the most precise absolute luminosity measurement performed to date at a hadron collider, but also the first one based on a physics signal at the LHC.
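The "inverted" use of the cross-section formula can be sketched in a few lines. The event counts, efficiency, and theory value below are hypothetical placeholders, not the thesis numbers:

```python
# Hypothetical inputs (invented for illustration); the real analysis also
# handles acceptance corrections, efficiency scale factors, and systematics.
n_observed = 8500.0      # selected Z -> ee candidates (assumed)
background = 100.0       # estimated background events (assumed)
efficiency = 0.60        # total acceptance x selection efficiency (assumed)
sigma_b_theory = 970.0   # NNLO sigma x B in pb for 60 < M_ee < 120 GeV (assumed)

# The usual measurement is sigma = (N - B) / (eps * L); taking sigma from
# theory instead and solving for L gives the standard-candle luminosity:
lumi_pb = (n_observed - background) / (efficiency * sigma_b_theory)
print(f"integrated luminosity ~ {lumi_pb:.1f} /pb")
```

The design choice is simply which quantity is treated as known: the same counting relation yields either a cross section (given L) or a luminosity (given the theoretical cross section).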

  5. Type Ia Supernova Intrinsic Magnitude Dispersion and the Fitting of Cosmological Parameters

    Science.gov (United States)

    Kim, A. G.

    2011-02-01

    I present an analysis for fitting cosmological parameters from a Hubble diagram of a standard candle with unknown intrinsic magnitude dispersion. The dispersion is determined from the data, simultaneously with the cosmological parameters. This contrasts with the strategies used to date. The advantages of the presented analysis are that it is done in a single fit (it is not iterative), it provides a statistically founded and unbiased estimate of the intrinsic dispersion, and its cosmological-parameter uncertainties account for the intrinsic-dispersion uncertainty. Applied to Type Ia supernovae, my strategy provides a statistical measure to test for subtypes and assess the significance of any magnitude corrections applied to the calibrated candle. Parameter bias and differences between likelihood distributions produced by the presented and currently used fitters are negligibly small for existing and projected supernova data sets.
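A minimal sketch of the idea, using mock data and a single magnitude-offset parameter (both invented here): the log-variance term in the Gaussian likelihood is what allows the intrinsic dispersion to be determined from the data simultaneously with the other parameters, in a single non-iterative fit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Mock Hubble-diagram residuals (values invented): measurement errors of
# 0.10 mag plus a true intrinsic dispersion of 0.15 mag.
n = 500
true_sig_int = 0.15
mag_err = np.full(n, 0.10)
resid = rng.normal(0.0, np.hypot(true_sig_int, mag_err))

def nll(sig_int):
    """Negative log-likelihood with the magnitude offset profiled out."""
    var = mag_err**2 + sig_int**2
    w = 1.0 / var
    offset = np.sum(w * resid) / np.sum(w)   # weighted-mean offset
    # The log(var) term penalizes inflating sig_int, so the data fix it.
    return 0.5 * np.sum((resid - offset)**2 / var + np.log(var))

grid = np.linspace(0.0, 0.5, 501)
sig_fit = grid[np.argmin([nll(s) for s in grid])]
print(f"fitted intrinsic dispersion = {sig_fit:.3f} mag")
```

Without the log(var) term the likelihood would be maximized by arbitrarily large dispersion; with it, the fit recovers an unbiased estimate whose uncertainty propagates into the cosmological parameters.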

  6. Background studies of high energy γ rays from (n,γ) reactions in the CANDLES experiment

    Science.gov (United States)

    Nakajima, K.; Iida, T.; Akutagawa, K.; Batpurev, T.; Chan, W. M.; Dokaku, F.; Fushimi, K.; Kakubata, H.; Kanagawa, K.; Katagiri, S.; Kawasaki, K.; Khai, B. T.; Kino, H.; Kinoshita, E.; Kishimoto, T.; Hazama, R.; Hiraoka, H.; Hiyama, T.; Ishikawa, M.; Li, X.; Maeda, T.; Matsuoka, K.; Moser, M.; Nomachi, M.; Ogawa, I.; Ohata, T.; Sato, H.; Shamoto, K.; Shimada, M.; Shokati, M.; Takahashi, N.; Takemoto, Y.; Takihira, Y.; Tamagawa, Y.; Tozawa, M.; Teranishi, K.; Tetsuno, K.; Trang, V. T. T.; Tsuzuki, M.; Umehara, S.; Wang, W.; Yoshida, S.; Yotsunaga, N.

    2018-07-01

    High energy γ rays of several MeV produced by (n,γ) reactions can be a significant background for low-background measurements in underground laboratories, such as double beta decay experiments. In the CANDLES project, which aims to observe the neutrinoless double beta decay of 48Ca, γ rays caused by (n,γ) reactions were found to be the most significant background. The profile of the background was studied by measurements with a neutron source and a simulation with a validity check of neutron processes in Geant4. The observed spectrum of γ rays from (n,γ) reactions was well reproduced by the simulated spectra, which originated from the surrounding rock and a detector tank made of stainless steel. The environmental neutron flux was derived from the observed event rate of γ rays from (n,γ) reactions using the simulation. The thermal and non-thermal neutron fluxes were found to be (1.3 ± 0.6) × 10⁻⁶ cm⁻²s⁻¹ and (1.1 ± 0.5) × 10⁻⁵ cm⁻²s⁻¹, respectively. It is necessary to install an additional shield to reduce the background from (n,γ) reactions to the required level.

  7. Z boson as "the standard candle" for high-precision W boson physics at LHC

    Science.gov (United States)

    Krasny, M. W.; Fayette, F.; Płaczek, W.; Siódmok, A.

    2007-08-01

    In this paper we propose a strategy for measuring the inclusive W boson production processes at the LHC. This strategy exploits simultaneously the unique flexibility of the LHC collider in running variable beam particle species at variable beam energies, and the configuration flexibility of the LHC detectors. We propose their concrete settings for a precision measurement of the standard model parameters. These dedicated settings optimise the use of the Z boson and Drell-Yan pair production processes as "the standard reference candles". The presented strategy allows one to factorise and to directly measure those QCD effects that affect the W and Z production processes differently. It reduces to a level of O(10⁻⁴) the impact of uncertainties in the partonic distribution functions (PDFs) and in the transverse momentum of the quarks on the measurement precision. Last but not least, it reduces by a factor of 10 the impact of systematic measurement errors, such as the energy scale and the measurement resolution, on the W boson production observables.

  8. Incorporating Experience Curves in Appliance Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery; Kantner, Colleen; Lekov, Alex; Meyers, Stephen; Rosenquist, Gregory; Buskirk, Robert Van; Yang, Hung-Chia; Desroches, Louis-Benoit

    2011-10-31

    The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as experience and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
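The experience-curve model referred to above is a one-parameter power law, P = P0 * X**(-b), where X is cumulative production; the slope b is conventionally summarized as a "learning rate", the fractional price decline per doubling of cumulative production. A sketch with synthetic data (not the BLS series used in the analysis):

```python
import numpy as np

# Synthetic experience-curve data (invented): price falls as a power law
# of cumulative shipments, P = P0 * X**(-b).
cum_shipments = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])  # millions of units
prices = 400.0 * cum_shipments**-0.3                         # constant dollars

# Fit the experience parameter b as the log-log slope.
b = -np.polyfit(np.log(cum_shipments), np.log(prices), 1)[0]

# Conventional summary: price declines by (1 - 2**-b) per doubling
# of cumulative production.
learning_rate = 1.0 - 2.0**-b
print(f"b = {b:.2f}, learning rate = {learning_rate:.1%}")
```

Projecting prices forward along such a fitted curve, instead of holding them constant, is what raises the estimated net present value of a potential standard level.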

  9. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  10. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time, manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software also could be operated in browser/server mode, which gives the possibility to use it anywhere the internet is accessible. By switching the nuclide library and the related formula behind, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  11. Multielement analysis of biological standards by neutron activation analysis

    International Nuclear Information System (INIS)

    Nadkarni, R.A.

    1977-01-01

    Up to 28 elements were determined in two IAEA standards: Animal Muscle H4 and Fish Soluble A 6/74, and three NBS standards: Spinach: SRM-1570, Tomato Leaves: SRM-1573 and Pine Needles: SRM-1575 by instrumental neutron-activation analysis. Seven noble metals were determined in two NBS standards: Coal: SRM-1632 and Coal Fly Ash: SRM-1633 by a radiochemical procedure, while 11 rare earth elements were determined in the NBS standard Orchard Leaves: SRM-1571 by instrumental neutron-activation analysis. The results are in good agreement with the certified and/or literature data where available. The irradiations were performed at the Cornell TRIGA Mark II nuclear reactor at a thermal neutron flux of 1-3×10¹² n cm⁻² sec⁻¹. The short-lived species were determined after a 2-minute irradiation in the pneumatic rabbit tube, and the longer-lived species after an 8-hour irradiation in the central thimble facility. The standards and samples were counted on a coaxial 56-cm³ Ge(Li) detector. The system resolution was 1.96 keV (FWHM) with a peak to Compton ratio of 37:1 and counting efficiency of 13%, all compared to the 1.332 MeV photopeak of Co-60. (T.I.)
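In instrumental NAA of this kind, concentrations follow from the comparator relation c_sample = c_std * (A_sample/A_std) * (m_std/m_sample), with activities decay-corrected to a common reference time. A hedged sketch with invented numbers (the half-life, peak areas, and masses are placeholders, not values from this study):

```python
import math

# Hypothetical comparator-method NAA calculation; all values are invented.
half_life_s = 3600.0                 # half-life of the activation product
lam = math.log(2.0) / half_life_s    # decay constant

def decay_corrected(activity, delay_s):
    """Correct a measured count rate back to the end of irradiation."""
    return activity * math.exp(lam * delay_s)

# Peak count rates (counts/s) and masses for sample and co-irradiated standard.
a_sample = decay_corrected(120.0, delay_s=1800.0)
a_std = decay_corrected(450.0, delay_s=600.0)
m_sample, m_std = 0.10, 0.10         # grams
c_std = 50.0                         # micrograms per gram in the standard

c_sample = c_std * (a_sample / a_std) * (m_std / m_sample)
print(f"element concentration ~ {c_sample:.1f} ug/g")
```

The ratio form cancels the neutron flux, cross section, and counting efficiency, which is why co-irradiated standards make the method largely self-calibrating.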

  12. Analysis of standard substance human hair

    International Nuclear Information System (INIS)

    Zou Shuyun; Zhang Yongbao

    2005-01-01

    The human hair samples as standard substances were analyzed by the neutron activation analysis (NAA) on the miniature neutron source reactor. 19 elements, i.e. Al, As, Ba, Br, Ca, Cl, Cr, Co, Cu, Fe, Hg, I, Mg, Mn, Na, S, Se, V and Zn, were measured. The average content, standard deviation, relative standard deviation and the detection limit under the present research conditions were given for each element, and the results showed that the measured values of the samples were in agreement with the recommended values, which indicated that NAA can be used to analyze standard substance human hair with a relatively high accuracy. (authors)

  13. Questionnaire regarding the international Freiberg conference on IGCC and XtL technologies. Analysis of 75 questionnaires

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    The Puertollano IGCC Plant, owned by ELCOGAS, feeds a mixture (50/50% by weight) of local coal with a high ash content (approximately 45%) and pet-coke into its pressurised entrained flow gasifier. Ash is removed from the bottom of the gasifier as vitrified slag, although a fraction is converted into fly ash (2.5-3 t/h) and entrained by the syngas. To remove this fly ash, it is filtered in two candle filter vessels with more than 1,000 candles each, using nitrogen for on-line cleaning. The filtering system has suffered malfunctions resulting in blinding of the internal candle surface and an increase in the candle pressure drop (DP). The model of candle filter was changed and modifications were performed, without the desired results. Therefore, the identification of suitable hot gas filtration technologies capable of overcoming the current and future severe operational constraints experienced is of the utmost importance for IGCC units. In this sense, a pilot plant which allows the testing of alternative filtering elements, pulse cleaning strategies, on-line particulate monitoring and off-line cleaning procedures has come into operation at the ESI-University of Seville facilities. The design has been conceived as a versatile pilot unit, able to hold both bags and ceramic candles, which are to be tested in a wide range of operating conditions. The pilot is processing air laden with real fly ash provided by ELCOGAS, and high pressure nitrogen for the cleaning operation. This paper describes the design and operation of the pilot as well as the testing plan currently being carried out. (orig.)

  14. Using Quasars as Standard Candles for Studying Dark Energy

    DEFF Research Database (Denmark)

    Denney, Kelly D.; Vestergaard, Marianne; Watson, D.

    2012-01-01

    …which relies on the technique of reverberation mapping to measure time delays between the quasar continuum and emission-line variability signatures. Measuring this time delay effectively measures the radius between the central source and the emission-line gas. The emission-line gas is photo-ionized by the continuum photons, and the radius to this emission-line region scales tightly with the nuclear luminosity, a consequence of the photoionization physics responsible for regulating the production of line-emitting photons. Hence, measuring the radius of the emission-line gas provides a measure…

  15. Di-boson signatures as standard candles for partial compositeness

    Energy Technology Data Exchange (ETDEWEB)

    Belyaev, Alexander [School of Physics & Astronomy, University of Southampton, Southampton (United Kingdom); Particle Physics Department, Rutherford Appleton Laboratory, Chilton, Didcot, Oxon OX11 0QX (United Kingdom); Cacciapaglia, Giacomo; Cai, Haiying [University of Lyon, Université Lyon 1, CNRS/IN2P3, IPNL, F-69622, Villeurbanne (France); Ferretti, Gabriele [Department of Physics, Chalmers University of Technology, Fysikgården, 41296 Göteborg (Sweden); Flacke, Thomas [Center for Theoretical Physics of the Universe, Institute for Basic Science (IBS), Daejeon, 34051 (Korea, Republic of); Department of Physics, Korea University, Seoul 136-713 (Korea, Republic of); Parolini, Alberto [Department of Physics, Korea University, Seoul 136-713 (Korea, Republic of); Serodio, Hugo [Department of Physics, Korea University, Seoul 136-713 (Korea, Republic of); Department of Astronomy and Theoretical Physics, Lund University, SE-223 62 Lund (Sweden)

    2017-01-23

    Composite Higgs Models are often constructed including fermionic top partners with a mass around the TeV scale, with the top partners playing the role of stabilizing the Higgs potential and enforcing partial compositeness for the top quark. A class of models of this kind can be formulated in terms of fermionic strongly coupled gauge theories. A common feature they all share is the presence of specific additional scalar resonances, namely two neutral singlets and a colored octet, described by a simple effective Lagrangian. We study the phenomenology of these scalars, both in a model independent and model dependent way, including the bounds from all the available searches in the relevant channels with di-boson and di-top final states. We develop a generic framework which can be used to constrain any model containing pseudo-scalar singlets or octets. Using it, we find that such signatures provide strong bounds on the compositeness scale complementary to the traditional EWPT and Higgs couplings deviations. In many cases a relatively light scalar can be on the verge of discovery as a first sign of new physics.

  16. Photon and proton activation analysis of iron and steel standards using the internal standard method coupled with the standard addition method

    International Nuclear Information System (INIS)

    Masumoto, K.; Hara, M.; Hasegawa, D.; Iino, E.; Yagi, M.

    1997-01-01

    The internal standard method coupled with the standard addition method has been applied to photon activation analysis and proton activation analysis of minor elements and trace impurities in various types of iron and steel samples issued by the Iron and Steel Institute of Japan (ISIJ). Samples and standard addition samples were first dissolved to mix homogeneously with an internal standard and the elements to be determined, then solidified as a silica gel to give a similar matrix composition and geometry. Cerium and yttrium were used as internal standards in photon and proton activation, respectively. In photon activation, a 20 MeV electron beam was used for bremsstrahlung irradiation to reduce matrix activity and nuclear interference reactions, and the results were compared with those of 30 MeV irradiation. In proton activation, iron was removed by the MIBK extraction method after dissolving samples, to reduce the radioactivity of ⁵⁶Co produced from iron via the ⁵⁶Fe(p,n)⁵⁶Co reaction. The results of proton and photon activation analysis were in good agreement with the standard values of ISIJ. (author)
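The arithmetic behind the combined method can be sketched as follows (all values invented): normalizing the analyte signal to the internal-standard signal cancels irradiation and counting-geometry factors, and the standard addition then fixes the remaining calibration slope without external standards.

```python
# Hypothetical standard-addition calculation with an internal standard;
# every number below is invented for illustration.
ratio_sample = 0.80   # analyte/internal-standard signal ratio, unspiked
ratio_spiked = 1.40   # same ratio after adding delta_c of the analyte
delta_c = 15.0        # added analyte concentration, micrograms per gram

# Both ratios are linear in concentration with the same (unknown) slope k:
#   ratio_sample = k * c          and   ratio_spiked = k * (c + delta_c)
# Dividing the two equations eliminates k and gives c directly.
c = delta_c * ratio_sample / (ratio_spiked - ratio_sample)
print(f"analyte concentration ~ {c:.1f} ug/g")
```

Because only ratios enter, fluctuations in beam intensity or detector efficiency between the two measurements drop out, which is the point of pairing the two techniques.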

  17. Concrete blocks. Analysis of UNE, ISO en standards and comparison with other international standards

    Directory of Open Access Journals (Sweden)

    Álvarez Alonso, Marina

    1990-12-01

    This paper attempts to describe the recently approved UNE standards through a systematic analysis of the main specifications therein contained and the values considered for each of them, as well as the drafts for ISO and EN concrete block standards. Furthermore, the study tries to place the set of UNE standards in the international environment through a comparative analysis against a representative sample of the standards prevailing in various geographical regions of the globe to determine the analogies and differences among them. KEY WORDS: masonry, system analysis, concrete blocks, masonry walls, standards


  18. Incorporating experience curves in appliance standards analysis

    International Nuclear Information System (INIS)

    Desroches, Louis-Benoit; Garbesi, Karina; Kantner, Colleen; Van Buskirk, Robert; Yang, Hung-Chia

    2013-01-01

    There exists considerable evidence that manufacturing costs and consumer prices of residential appliances have decreased in real terms over the last several decades. This phenomenon is generally attributable to manufacturing efficiency gained with cumulative experience producing a certain good, and is modeled by an empirical experience curve. The technical analyses conducted in support of U.S. energy conservation standards for residential appliances and commercial equipment have, until recently, assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. This assumption does not reflect real market price dynamics. Using price data from the Bureau of Labor Statistics, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These experience curves were incorporated into recent energy conservation standards analyses for these products. Including experience curves increases the national consumer net present value of potential standard levels. In some cases a potential standard level exhibits a net benefit when considering experience, whereas without experience it exhibits a net cost. These results highlight the importance of modeling more representative market prices. - Highlights: ► Past appliance standards analyses have assumed constant equipment prices. ► There is considerable evidence of consistent real price declines. ► We incorporate experience curves for several large appliances into the analysis. ► The revised analyses demonstrate larger net present values of potential standards. ► The results imply that past standards analyses may have undervalued benefits.

  19. Facile Fabrication and Characterization of a PDMS-Derived Candle Soot Coated Stable Biocompatible Superhydrophobic and Superhemophobic Surface.

    Science.gov (United States)

    Iqbal, R; Majhy, B; Sen, A K

    2017-09-13

    We report a simple, inexpensive, rapid, and one-step method for the fabrication of a stable and biocompatible superhydrophobic and superhemophobic surface. The proposed surface comprises candle soot particles embedded in a mixture of PDMS + n-hexane serving as the base material. The mechanism responsible for the superhydrophobic behavior of the surface is explained, and the surface is characterized based on its morphology and elemental composition, wetting properties, mechanical and chemical stability, and biocompatibility. The effects of %n-hexane in PDMS, the thickness of the PDMS + n-hexane layer (in terms of spin coating speed), and sooting time on the wetting property of the surface are studied. The proposed surface exhibits nanoscale surface asperities (average roughness of 187 nm), the chemical composition of soot particles, very high water and blood repellency, excellent mechanical and chemical stability, and excellent biocompatibility with blood samples and biological cells. The water contact angle and roll-off angle are measured as 160° ± 1° and 2°, respectively, and the blood contact angle is found to be 154° ± 1°, which indicates that the surface is superhydrophobic and superhemophobic. The proposed surface offers significantly improved (>40%) cell viability compared to glass and PDMS surfaces.

  20. Supernova brightening from chameleon-photon mixing

    International Nuclear Information System (INIS)

    Burrage, C.

    2008-01-01

    Measurements of standard candles and measurements of standard rulers give an inconsistent picture of the history of the universe. This discrepancy can be explained if photon number is not conserved, since computations of the luminosity distance must then be modified. I show that photon number is not conserved when photons mix with chameleons in the presence of a magnetic field. The strong magnetic fields in a supernova mean that the probability of a photon converting into a chameleon in the interior of the supernova is high; this results in a large flux of chameleons at the surface of the supernova. Chameleons and photons also mix as a result of the intergalactic magnetic field. These two effects combined cause the image of the supernova to be brightened, resulting in a model that fits both observations of standard candles and observations of standard rulers.
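
The effect of photon-number non-conservation on a standard candle can be quantified in a few lines: if the flux reaching the observer is a factor P of the emitted flux (P < 1 for dimming by opacity, P > 1 for net brightening as in the chameleon scenario above), the inferred distance modulus shifts by -2.5 log10 P. A hedged sketch (function name and numbers are illustrative):

```python
import math

def distance_modulus_shift(survival_prob):
    """Shift in the inferred distance modulus when the observed flux is
    `survival_prob` times the emitted flux.

    F_obs = P * F_em  =>  d_L,inferred = d_L,true / sqrt(P)
                      =>  delta_mu = 5*log10(1/sqrt(P)) = -2.5*log10(P).
    """
    return -2.5 * math.log10(survival_prob)

# 10% photon loss dims the source, so it is inferred to be farther
# away than it really is (positive shift in distance modulus).
dmu = distance_modulus_shift(0.9)
```

For P > 1 (extra photons reconverted from chameleons) the shift is negative: the supernova appears brighter and hence closer, which is the direction needed to reconcile candle and ruler data in this model.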

  1. Learning from the scatter in type ia supernovae

    Energy Technology Data Exchange (ETDEWEB)

    Dodelson, Scott; /Fermilab /Chicago U., Astron. Astrophys. Ctr.; Vallinotto, Alberto; /Fermilab /Chicago U.

    2005-11-01

    Type Ia supernovae are standard candles, so their mean apparent magnitude has been exploited to learn about the redshift-distance relationship. Besides the intrinsic scatter in this standard candle, additional scatter is caused by gravitational magnification by large-scale structure. Here the authors probe the dependence of this dispersion on cosmological parameters and show that information about the amplitude of clustering, σ_8, is contained in the scatter. In principle, it will be possible to constrain σ_8 to within 5% with observations of 2000 Type Ia supernovae. They identify three sources of systematic error (evolution of the intrinsic scatter, baryon contributions to lensing, and non-Gaussianity of lensing) which will make this measurement difficult.
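
Since the intrinsic and lensing contributions are independent, they add in quadrature, which a small Monte Carlo check makes concrete. A sketch under assumed scatter values (0.10 mag intrinsic, 0.04 mag lensing; both numbers illustrative, not from the paper):

```python
import math
import random

def total_scatter(sigma_int, sigma_lens):
    """Independent Gaussian scatters add in quadrature."""
    return math.hypot(sigma_int, sigma_lens)

def simulated_dispersion(n, sigma_int, sigma_lens, seed=0):
    """Monte Carlo check: draw n supernova magnitude residuals with
    intrinsic plus lensing scatter and measure the sample dispersion."""
    rng = random.Random(seed)
    resid = [rng.gauss(0.0, sigma_int) + rng.gauss(0.0, sigma_lens)
             for _ in range(n)]
    mean = sum(resid) / n
    var = sum((r - mean) ** 2 for r in resid) / (n - 1)
    return math.sqrt(var)

# With 2000 supernovae (the sample size quoted above) the recovered
# dispersion tracks the quadrature sum closely.
sigma = simulated_dispersion(2000, sigma_int=0.10, sigma_lens=0.04)
```

Because the lensing term scales with the clustering amplitude, measuring the excess of the total dispersion over the intrinsic one is what carries the σ_8 information.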

  2. ASTM standards for fire debris analysis: a review.

    Science.gov (United States)

    Stauffer, Eric; Lentini, John J

    2003-03-12

    The American Society for Testing and Materials (ASTM) recently updated its standards E 1387 and E 1618 for the analysis of fire debris. The changes in the classification of ignitable liquids are presented in this review. Furthermore, a new standard on extraction of fire debris with solid phase microextraction (SPME) was released. Advantages and drawbacks of this technique are presented and discussed. Also, the standard on cleanup by acid stripping has not been reapproved. Fire debris analysts that use the standards should be aware of these changes.

  3. Standard Compliant Hazard and Threat Analysis for the Automotive Domain

    Directory of Open Access Journals (Sweden)

    Kristian Beckers

    2016-06-01

    The automotive industry has successfully collaborated to release the ISO 26262 standard for developing safe software for cars. The standard describes in detail how to conduct hazard analysis and risk assessments to determine the necessary safety measures for each feature. However, the standard does not concern threat analysis for malicious attackers or how to select appropriate security countermeasures. We propose the application of ISO 27001 for this purpose and show how it can be applied together with ISO 26262. We show how ISO 26262 documentation can be re-used and enhanced to satisfy the analysis and documentation demands of the ISO 27001 standard. We illustrate our approach based on an electronic steering column lock system.

  4. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated against the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for analyzing diagnosis error probability are suggested as part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules.

  5. 78 FR 45447 - Revisions to Modeling, Data, and Analysis Reliability Standard

    Science.gov (United States)

    2013-07-29

    ...; Order No. 782] Revisions to Modeling, Data, and Analysis Reliability Standard AGENCY: Federal Energy... Analysis (MOD) Reliability Standard MOD- 028-2, submitted to the Commission for approval by the North... Organization. The Commission finds that the proposed Reliability Standard represents an improvement over the...

  6. Precision measurements of Standard Model parameters and Review of Drell-Yan and vector boson plus jets measurements with the ATLAS detector

    International Nuclear Information System (INIS)

    Calace, Noemi

    2016-01-01

    The inclusive productions of the W boson and the on- or off-shell Z/γ* boson are standard candles at hadron colliders, while the production of light- and heavy-flavour jets in association with a W or Z boson is an important process for studying quantum chromodynamics (QCD) in multi-scale environments. Their production cross-sections, integrated and differential in several variables, have been measured at 7 and 8 TeV centre-of-mass energies and are compared to higher-order QCD calculations and Monte Carlo simulations. These measurements improve our knowledge of the parton densities of the proton, and test soft resummation effects and hard emissions at small and large momentum transfers and in multi-scale processes. Precision measurements of fundamental Standard Model parameters in Drell-Yan final states are also performed to describe the angular distributions of the decay leptons. Run-1 studies carried out by the ATLAS Collaboration are reviewed and first LHC Run-2 results are included.

  7. Rare earths analysis of rock samples by instrumental neutron activation analysis, internal standard method

    International Nuclear Information System (INIS)

    Silachyov, I.

    2016-01-01

    The application of instrumental neutron activation analysis to the determination of rare earth elements (REE) in rock samples via long-lived activation products is considered in this work. Two different methods are statistically compared: the well-established external standard method, carried out using standard reference materials, and the internal standard method (ISM), which uses Fe, determined through X-ray fluorescence analysis, as the element-comparator. The ISM proved to be the more precise method over a wide range of REE contents and can be recommended for routine practice. (author)
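
The internal standard method reduces to a ratio measurement: the analyte's gamma-line intensity is compared with that of the element-comparator (Fe), whose concentration is known independently from XRF. A much-simplified sketch (the function, the relative sensitivity factor `k_rel`, and all numbers are hypothetical, not values from the paper):

```python
def internal_standard_concentration(i_analyte, i_comparator,
                                    c_comparator, k_rel):
    """Internal-standard quantification: the analyte concentration follows
    from the ratio of its count rate to that of the element-comparator
    (here Fe, known from XRF), scaled by a relative sensitivity factor
    k_rel measured once on a reference material.
    """
    return c_comparator * (i_analyte / i_comparator) / k_rel

# Hypothetical numbers: 1200 counts/s for a REE line, 60000 counts/s
# for an Fe line, 5.2 wt% Fe from XRF, relative sensitivity 4.0.
c = internal_standard_concentration(1200.0, 60000.0, 5.2, 4.0)
```

Because the comparator is measured in the same specimen, flux and geometry factors cancel in the ratio, which is the practical advantage of the ISM over external standards.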

  8. Standardization: using comparative maintenance costs in an economic analysis

    OpenAIRE

    Clark, Roger Nelson

    1987-01-01

    Approved for public release; distribution is unlimited This thesis investigates the use of comparative maintenance costs of functionally interchangeable equipments in similar U.S. Navy shipboard applications in an economic analysis of standardization. The economics of standardization, life-cycle costing, and the Navy 3-M System are discussed in general. An analysis of 3-M System maintenance costs for a selected equipment, diesel engines, is conducted. The potential use of comparative ma...

  9. Spectral analysis of the binary nucleus of the planetary nebula Hen 2-428 - first results

    Science.gov (United States)

    Finch, Nicolle L.; Reindl, Nicole; Barstow, Martin A.; Casewell, Sarah L.; Geier, Stephan; Bertolami, Marcelo M. Miller; Taubenberger, Stefan

    2018-04-01

    Identifying progenitor systems for the double-degenerate scenario is crucial to check the reliability of type Ia supernovae as cosmological standard candles. Santander-Garcia et al. (2015) claimed that Hen 2-428 has a double-degenerate core whose combined mass significantly exceeds the Chandrasekhar limit. Together with the short orbital period (4.2 hours), the authors concluded that the system should merge within a Hubble time, triggering a type Ia supernova event. Garcia-Berro et al. (2016) explored alternative scenarios to explain the observational evidence, as the high-mass conclusion is highly unlikely within predictions from stellar evolution theory. They conclude that the evidence supporting the supernova-progenitor status of the system is premature. Here we present the first quantitative spectral analysis of Hen 2-428, which allows us to derive the effective temperatures, surface gravities and helium abundances of the two CSPNe based on state-of-the-art non-LTE model atmospheres. These results provide constraints for further studies of this particularly interesting system.

  10. Annual Book of ASTM Standards, Part 23: Water; Atmospheric Analysis.

    Science.gov (United States)

    American Society for Testing and Materials, Philadelphia, PA.

    Standards for water and atmospheric analysis are compiled in this segment, Part 23, of the American Society for Testing and Materials (ASTM) annual book of standards. It contains all current formally approved ASTM standard and tentative test methods, definitions, recommended practices, proposed methods, classifications, and specifications. One…

  11. Spectral analysis of the binary nucleus of the planetary nebula Hen 2-428 – first results

    Directory of Open Access Journals (Sweden)

    Finch Nicolle L.

    2018-04-01

    Identifying progenitor systems for the double-degenerate scenario is crucial to check the reliability of type Ia supernovae as cosmological standard candles. Santander-Garcia et al. (2015) claimed that Hen 2-428 has a double-degenerate core whose combined mass significantly exceeds the Chandrasekhar limit. Together with the short orbital period (4.2 hours), the authors concluded that the system should merge within a Hubble time, triggering a type Ia supernova event. Garcia-Berro et al. (2016) explored alternative scenarios to explain the observational evidence, as the high-mass conclusion is highly unlikely within predictions from stellar evolution theory. They conclude that the evidence supporting the supernova-progenitor status of the system is premature. Here we present the first quantitative spectral analysis of Hen 2-428, which allows us to derive the effective temperatures, surface gravities and helium abundances of the two CSPNe based on state-of-the-art non-LTE model atmospheres. These results provide constraints for further studies of this particularly interesting system.

  12. Evaluation of pressed powders and thin section standards for multi-elemental analysis by conventional and micro-PIXE analysis

    International Nuclear Information System (INIS)

    Homma-Takeda, Shino; Iso, Hiroyuki; Ito, Masaki

    2010-01-01

    For multi-elemental analysis, various standards are used to quantify the elements contained in environmental and biological samples. In this paper, standards of two different configurations, pressed powders and thin sections, were assessed for their suitability as standards by conventional and micro-PIXE analysis. The homogeneity of manganese, iron, zinc (Zn), copper and yttrium added to the pressed powder standard materials was validated; the relative standard deviation (RSD) of the X-ray intensity of the standards was about 2% over the analyzed area, and the metal concentrations were acceptable. (author)

  13. Gamma-Ray Burst Prompt Correlations

    Directory of Open Access Journals (Sweden)

    M. G. Dainotti

    2018-01-01

    The mechanism responsible for the prompt emission of gamma-ray bursts (GRBs) is still a debated issue. The prompt-phase GRB correlations can allow discriminating among the most plausible theoretical models of this emission. We present an overview of the observational two-parameter correlations, their physical interpretations, and their use as redshift estimators and possibly as cosmological tools. The challenge nowadays is to turn GRBs, the farthest stellar-scale objects observed (up to redshift z = 9.4), into standard candles through well-established and robust correlations. However, GRBs, spanning several orders of magnitude in their energetics, are far from being standard candles. We describe the advances in prompt correlation research over the past decades, with particular focus on the discoveries of the last 20 years.
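
Two-parameter prompt correlations of the kind surveyed here are usually fitted as straight lines in log-log space. A minimal ordinary-least-squares sketch (not the authors' fitting method, which typically also models intrinsic scatter and selection effects):

```python
import math

def fit_loglog_correlation(x, y):
    """Ordinary least-squares fit of log10(y) = a + b*log10(x), the usual
    form of two-parameter GRB prompt correlations (e.g. Epeak vs. Liso)."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxy = sum((u - mx) * (v - my) for u, v in zip(lx, ly))
    sxx = sum((u - mx) ** 2 for u in lx)
    b = sxy / sxx          # slope (correlation index)
    a = my - b * mx        # normalization
    return a, b

# An exact power law y = 10^2 * x^0.5 is recovered exactly.
a, b = fit_loglog_correlation([10.0, 100.0, 1000.0],
                              [10**2.5, 10**3.0, 10**3.5])
```

As the head of this record set notes, detector thresholds can imprint an apparent slope on such fits, so the recovered (a, b) must be checked against selection effects before cosmological use.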

  14. SDSS-II: Determination of shape and color parameter coefficients for SALT-II fit model

    Energy Technology Data Exchange (ETDEWEB)

    Dojcsak, L.; Marriner, J.; /Fermilab

    2010-08-01

    In this study we examine the SALT-II model of Type Ia supernova analysis, which determines distance moduli based on the known absolute standard-candle magnitude of Type Ia supernovae. We look at the determination of the shape and color parameter coefficients, α and β respectively, in the SALT-II model, with the intrinsic error determined from the data. Using the SNANA software package provided for the analysis of Type Ia supernovae, we use a standard Monte Carlo simulation to generate data with known parameters as a tool for analyzing trends in the model under certain assumptions about the intrinsic error. To find the best standard-candle model, we try to minimize the residuals on the Hubble diagram by calculating the correct shape and color parameter coefficients. We can estimate the magnitude of the intrinsic errors required to obtain results with χ²/degree of freedom = 1, and we can use the simulation to estimate the amount of color smearing indicated by the data for our model. We find that the color-smearing model works as a general estimate of the color smearing, and that the RMS distribution in the variables provides one method of estimating the intrinsic errors needed by the data to obtain the correct results for α and β. We then apply the resulting intrinsic error matrix to the real data and show our results.
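
The SALT-II standardization referred to above is conventionally written as the Tripp relation, mu = m_B - M + alpha*x1 - beta*c. A minimal sketch (the absolute magnitude and all inputs are illustrative values, not results from this study):

```python
def salt2_distance_modulus(m_b, x1, c, alpha, beta, m_abs=-19.3):
    """Tripp-style SALT-II standardization: correct the peak apparent
    magnitude m_b for light-curve shape (x1) and color (c).

        mu = m_b - M + alpha*x1 - beta*c
    """
    return m_b - m_abs + alpha * x1 - beta * c

# Hypothetical supernova: m_b = 23.0, slightly broad light curve
# (x1 = 0.5), blue color (c = -0.05), with typical coefficient
# values alpha = 0.14, beta = 3.1.
mu = salt2_distance_modulus(23.0, 0.5, -0.05, alpha=0.14, beta=3.1)
```

Fitting alpha and beta is exactly the minimization of Hubble-diagram residuals the abstract describes; the assumed intrinsic error enters through the weights of that fit.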

  15. A statistically self-consistent type Ia supernova data analysis

    International Nuclear Information System (INIS)

    Lago, B.L.; Calvao, M.O.; Joras, S.E.; Reis, R.R.R.; Waga, I.; Giostri, R.

    2011-01-01

    Type Ia supernovae are one of the main cosmological probes nowadays and are used as standardized candles in distance measurements. The standardization processes, among which SALT2 and MLCS2k2 are the most widely used, are based on empirical relations and leave room for a residual dispersion in the light curves of the supernovae. This dispersion is introduced into the chi-squared used to fit the parameters of the model, through the expression for the variance of the data, as an attempt to quantify our ignorance in modeling the supernovae properly. The procedure used to assign a value to this dispersion is statistically inconsistent and excludes the possibility of comparing different cosmological models. In addition, the SALT2 light-curve fitter introduces parameters into the model for the variance that are also used in the model for the data. In the chi-squared context, the minimization of such a quantity yields, in the best-case scenario, a bias. An iterative method has been developed to perform this minimization, but it is not well grounded, although it is used by several groups. We propose an analysis of the Type Ia supernova data that is based on the likelihood itself and makes it possible to address both inconsistencies mentioned above in a straightforward way. (author)
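
The inconsistency can be made concrete: the plain chi-squared decreases monotonically as the assumed residual dispersion grows, so minimizing it over that dispersion is ill-posed, while the full -2 ln L retains the ln(variance) term and has a genuine minimum. A sketch with made-up residuals (all numbers illustrative):

```python
import math

def neg2_log_likelihood(residuals, sigma_obs, sigma_int):
    """Full Gaussian -2 ln L, keeping the ln(variance) term that the
    plain chi-squared statistic drops.  var_i = sigma_obs_i^2 + sigma_int^2."""
    total = 0.0
    for r, s in zip(residuals, sigma_obs):
        var = s * s + sigma_int * sigma_int
        total += r * r / var + math.log(2.0 * math.pi * var)
    return total

# Plain chi^2 always shrinks as sigma_int grows, driving it to infinity;
# -2 ln L instead turns over and has a well-defined minimum.
res = [0.12, -0.08, 0.15, -0.11, 0.05]
sig = [0.05] * 5
vals = [neg2_log_likelihood(res, sig, s) for s in (0.0, 0.1, 0.5)]
```

The interior value of sigma_int here gives the smallest -2 ln L, illustrating why a likelihood-based treatment restores statistical consistency and allows model comparison.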

  16. Comparative study of standard space and real space analysis of quantitative MR brain data.

    Science.gov (United States)

    Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M

    2011-06-01

    To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space, and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T1-weighted, quantitative T1, and B0 field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T1 datasets. Regional relaxation values and histograms for both gray and white matter tissue classes were then extracted and compared. Regional mean T1 values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T1 histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors, biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.

  17. Extended standard vector analysis for plasma physics

    International Nuclear Information System (INIS)

    Wimmel, H.K.

    1982-02-01

    Standard vector analysis in 3-dimensional space, as found in most tables and textbooks, is complemented by a number of basic formulas that seem to be largely unknown, but are important in themselves and for some plasma physics applications, as is shown by several examples. (orig.)

  18. Comparison of high temperature gas particulate collectors for low level radwaste incinerator volume reduction systems

    International Nuclear Information System (INIS)

    Moscardini, R.L.; Johnston, J.R.; Waters, R.M.; Zievers, J.F.

    1983-01-01

    Incinerator system off-gases must be treated to prevent the release of particulates, noxious gases and radioactive elements to the environment. Fabric filters, venturi scrubbers, cyclone separators, and ceramic or metal filter candles have been used for particulate removal. Dry high-temperature particulate collectors have the advantage of not creating additional liquid wastes. This paper presents a graphical comparison of different methods for filtering particles from high-temperature incineration system off-gases. Eight methods of off-gas handling are compared; a much larger set exists, but a judicious selection of different but related systems was made for this paper based on experience with the Combustion Engineering Waste Incineration System (CE/WIS) prototype. The eight types are: inertial devices, electrostatic precipitators (ESP), standard fabric bags, woven ceramic bags, granular beds, sintered metal tubes, felted ceramic bags, and ceramic filter candles. For high-temperature LLRW particulate collection in incinerator off-gas systems, ceramic filter candles are the best overall choice.

  19. Development of suitable plastic standards for X-ray fluorescence analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mans, Christian; Hanning, Stephanie; Simons, Christoph; Wegner, Anne; Janssen, Anton; Kreyenschmidt, Martin (all: University of Applied Sciences Muenster, Department of Chemical Engineering, Advanced Analytical Chemistry, Stegerwaldstr. 39, 48565 Steinfurt, Germany; contact e-mail: c.mans@fh-muenster.de)

    2007-02-15

    For the enforcement of the EU directives 'Restriction of the use of certain Hazardous Substances' and 'Waste Electrical and Electronic Equipment' using X-ray fluorescence analysis, suitable standard materials are required. Plastic standards based on acrylonitrile-butadiene-styrene terpolymer, containing the regulated elements Br, Cd, Cr, Hg and Pb, were developed and produced as granulates and solid bodies. The calibration materials were not generated by dilution from one master batch; rather, the element concentrations were distributed over nine independent calibration samples. This was necessary to enable inter-element corrections and empirically constant mass-absorption coefficients. The produced standard materials are characterized by a homogeneous element distribution, which is more than sufficient for X-ray fluorescence analysis. Concentrations of all elements except Br could be determined by inductively coupled plasma atomic emission spectroscopy after microwave-assisted digestion. The concentration of Br was determined by neutron activation analysis at the Hahn-Meitner-Institut in Berlin, Germany. The correlation of the X-ray fluorescence measurements with the values determined by inductively coupled plasma atomic emission spectroscopy and neutron activation analysis showed very good linearity.

  20. Provenience studies using neutron activation analysis: the role of standardization

    International Nuclear Information System (INIS)

    Harbottle, G.

    1980-01-01

    This paper covers the historical background of the chemical analysis of archaeological artifacts, from its beginnings in 1790 up to the first application of neutron activation analysis to archaeological ceramics, and goes on to elaborate on the present-day status of neutron activation analysis in provenience studies and the role of standardization. In principle, the concentrations of elements in a neutron-activated specimen can be calculated from an exact knowledge of the neutron flux (its intensity, duration and spectral (energy) distribution) plus an exact gamma-ray count calibrated for efficiency, corrected for branching ratios, etc. In practice, however, it is far easier to compare one's unknown to a standard of known or assumed composition. The practice has been for different laboratories to use different standards. With analyses being run in the thousands throughout the world, a great benefit would be derived if analyses could be exchanged among all users and/or generators of data. The emphasis of this paper is on the interlaboratory comparability of ceramic data: how far are we from it, what has been proposed in the past to achieve this goal, and what is being proposed now. All of this may be summarized under the general heading of analytical quality control, i.e., how to achieve precise and accurate analysis. The author proposes that anyone wishing to analyze archaeological ceramics should simply use his own standard, but attempt to calibrate that standard as nearly as possible to absolute (i.e., accurate) concentration values. The relationship of analytical quality control to provenience location is also examined.

  1. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of the posited mechanisms that shape human performance, the methods of characterizing and analytically modeling human behavior, and the techniques employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized, or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their own practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments.

  2. Setting Standards for Medically-Based Running Analysis

    Science.gov (United States)

    Vincent, Heather K.; Herman, Daniel C.; Lear-Barnes, Leslie; Barnes, Robert; Chen, Cong; Greenberg, Scott; Vincent, Kevin R.

    2015-01-01

    Setting standards for medically based running analyses is necessary to ensure that runners receive a high-quality service from practitioners. Medical and training history, physical and functional tests, and motion analysis of running at self-selected and faster speeds are key features of a comprehensive analysis. Self-reported history and movement symmetry are critical factors that require follow-up therapy or long-term management. Pain or injury is typically the result of a functional deficit above or below the site along the kinematic chain. PMID:25014394

  3. ANSI/ASHRAE/IESNA Standard 90.1-2007 Final Determination Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Liu, Bing; Richman, Eric E.; Winiarski, David W.

    2011-05-01

    The United States (U.S.) Department of Energy (DOE) conducted a final quantitative analysis to assess whether buildings constructed according to the requirements of the American National Standards Institute (ANSI)/American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE)/Illuminating Engineering Society of North America (IESNA) Standard 90.1-2007 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IESNA Standard 90.1-2004. The final analysis considered each of the 44 addenda to ANSI/ASHRAE/IESNA Standard 90.1-2004 that were included in ANSI/ASHRAE/IESNA Standard 90.1-2007. All 44 addenda processed by ASHRAE in the creation of Standard 90.1-2007 from Standard 90.1-2004 were reviewed by DOE, and their combined impact on a suite of 15 building prototype models in 15 ASHRAE climate zones was considered. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE's final determination. However, out of the 44 addenda, 9 were preliminarily determined to have measurable and quantifiable impact.

  4. Gold-standard for computer-assisted morphological sperm analysis.

    Science.gov (United States)

    Chang, Violeta; Garcia, Alejandra; Hitschfeld, Nancy; Härtel, Steffen

    2017-04-01

    Published algorithms for the classification of human sperm heads are based on relatively small image databases that are not open to the public, so no direct comparison is available for competing methods. We describe a gold-standard for morphological sperm analysis (SCIAN-MorphoSpermGS), a dataset of sperm head images with expert-classification labels in one of the following classes: normal, tapered, pyriform, small or amorphous. This gold-standard is for evaluating and comparing known techniques and future improvements to present approaches for the classification of human sperm heads for semen analysis. Although this paper does not provide a computational tool for morphological sperm analysis, we present a set of experiments comparing common sperm head description and classification techniques. This classification baseline is intended as a reference for future improvements to present approaches for human sperm head classification. The gold-standard provides a label for each sperm head, achieved by majority voting among experts. The classification baseline compares four supervised learning methods (1-Nearest Neighbor, naive Bayes, decision trees and Support Vector Machine (SVM)) and three shape-based descriptors (Hu moments, Zernike moments and Fourier descriptors), reporting the accuracy and the true positive rate for each experiment. We used Fleiss' kappa coefficient to evaluate the inter-expert agreement and Fisher's exact test for inter-expert variability and statistically significant differences between descriptors and learning techniques. Our results confirm the high degree of inter-expert variability in morphological sperm analysis. Regarding the classification baseline, we show that none of the standard descriptors or classification approaches is best suited to the problem of sperm head classification. We discovered that the correct classification rate was highly variable when trying to discriminate among non-normal sperm heads.
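
The majority-voting step used to build the gold-standard labels is straightforward to sketch; the tie-handling rule here (leave ties unresolved) is an assumption for illustration, not necessarily the authors' choice:

```python
from collections import Counter

def majority_label(expert_labels):
    """Gold-standard label for one sperm head image: the class chosen by
    a strict plurality of experts; ties are left unresolved (None)."""
    counts = Counter(expert_labels).most_common()
    if len(counts) > 1 and counts[0][1] == counts[1][1]:
        return None  # no single class wins the vote
    return counts[0][0]

# Five hypothetical expert votes for one image.
lab = majority_label(["normal", "normal", "tapered", "normal", "amorphous"])
```

Images without a plurality winner would then either be excluded or adjudicated, which is one reason inter-expert agreement (Fleiss' kappa) matters for the size and quality of the final dataset.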

  5. Preparation of uranium standard solutions for x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Wong, C.M.; Cate, J.L.; Pickles, W.L.

    1978-03-01

    A method has been developed for gravimetrically preparing uranium nitrate standards with an estimated mean error of 0.1% (1 sigma) and a maximum error of 0.2% (1 sigma) for the total uranium weight. Two source materials, depleted uranium dioxide powder and NBS Standard Reference Material 960 uranium metal, were used to prepare stock solutions. The NBS metal proved superior because of the small but inherent uncertainty in the stoichiometry of the uranium oxide. These solutions were used to prepare standards in a freeze-dried configuration suitable for x-ray fluorescence analysis. Both the gravimetric and the freeze-drying techniques are presented. Volumetric preparation was found to be unsatisfactory for 0.1% precision at the sample size of interest. One of the primary considerations in preparing uranium standards for x-ray fluorescence analysis is the development of a technique for dispensing a 50-μl aliquot of a standard solution with a precision of 0.1% and an accuracy of 0.1%. The method developed corrects for variation in aliquoting and for evaporation loss during weighing. Two sets, each containing 50 standards, have been produced; one set has been retained by LLL and one by the Savannah River project.

  6. Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles

    Science.gov (United States)

    Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey

    2013-09-01

    Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and the U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELVs), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. The independent reviews confirmed the

  7. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries, matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product-specific (iron ore, bauxite, alumina, mineral sands, cement, etc.) and apply to a relatively narrow concentration range, but give the best precision and accuracy for those materials. A wide range of CRMs is available, and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis is required of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates, etc.) that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements, and the relative intensities of element peaks in a scan can give a rough idea of their concentrations. More often, however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  8. Least squares analysis of fission neutron standard fields

    International Nuclear Information System (INIS)

    Griffin, P.J.; Williams, J.G.

    1997-01-01

    A least squares analysis of fission neutron standard fields has been performed using the latest dosimetry cross sections. Discrepant nuclear data are identified, and adjusted spectra for the ²⁵²Cf spontaneous fission and ²³⁵U thermal fission fields are presented
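The adjustment described can be sketched as a generalized least-squares update of a prior multigroup spectrum constrained by measured reaction rates through dosimetry cross sections; the numbers below are illustrative stand-ins, not evaluated nuclear data:

```python
import numpy as np

def gls_adjust(x0, Vx, A, m, Vm):
    """Generalized least-squares adjustment of a prior spectrum.

    x0 : prior group fluxes, with covariance Vx
    A  : dosimetry cross-section matrix (reactions x groups)
    m  : measured reaction rates, with covariance Vm
    Returns the adjusted spectrum and its reduced covariance.
    """
    S = A @ Vx @ A.T + Vm               # covariance of (measured - predicted) rates
    K = Vx @ A.T @ np.linalg.inv(S)     # gain matrix
    x = x0 + K @ (m - A @ x0)           # spectrum pulled toward the measurements
    Vxp = Vx - K @ A @ Vx               # adjustment reduces the spectrum variance
    return x, Vxp

# Illustrative 3-group spectrum with two reaction-rate measurements
x0 = np.array([1.0, 2.0, 1.5])
Vx = np.diag([0.04, 0.09, 0.04])
A = np.array([[0.5, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
m = A @ np.array([1.1, 1.9, 1.6])       # rates implied by a "true" spectrum
Vm = np.diag([1e-4, 1e-4])
x_adj, V_adj = gls_adjust(x0, Vx, A, m, Vm)
```

In this framework, discrepant nuclear data show up as measured rates that the prior spectrum and cross sections cannot reproduce within the stated covariances.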

  9. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    Science.gov (United States)

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. © The Author 2016. Published by Oxford University Press.

  10. Development of international standards for surface analysis by ISO technical committee 201 on surface chemical analysis

    International Nuclear Information System (INIS)

    Powell, C.J.

    1999-01-01

    Full text: The International Organization for Standardization (ISO) established Technical Committee 201 on Surface Chemical Analysis in 1991 to develop documentary standards for surface analysis. ISO/TC 201 met first in 1992 and has met annually since. This committee now has eight subcommittees (Terminology, General Procedures, Data Management and Treatment, Depth Profiling, AES, SIMS, XPS, and Glow Discharge Spectroscopy (GDS)) and one working group (Total X-Ray Fluorescence Spectroscopy). Each subcommittee has one or more working groups to develop standards on particular topics. Australia has observer-member status on ISO/TC 201 and on all ISO/TC 201 subcommittees except GDS where it has participator-member status. I will outline the organization of ISO/TC 201 and summarize the standards that have been or are being developed. Copyright (1999) Australian X-ray Analytical Association Inc

  11. Newly developed standard reference materials for organic contaminant analysis

    Energy Technology Data Exchange (ETDEWEB)

    Poster, D.; Kucklick, J.; Schantz, M.; Porter, B.; Wise, S. [National Inst. of Stand. and Technol., Gaithersburg, MD (USA). Center for Anal. Chem.

    2004-09-15

    The National Institute of Standards and Technology (NIST) has issued a number of Standard Reference Materials (SRM) for specified analytes. The SRMs comprise biota and biologically related materials as well as sediment- and particle-related materials. The certified compounds for analysis are polychlorinated biphenyls (PCB), polycyclic aromatic hydrocarbons (PAH) and their nitro-analogues, chlorinated pesticides, methylmercury, organic tin compounds, fatty acids, and polybrominated diphenyl ethers (PBDE). The authors report on the origin of the materials and the analytical methods. (uke)

  12. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Wang, Weimin; Zhang, Jian; Mendon, Vrushali V.; Athalye, Rahul A.; Xie, YuLong; Hart, Reid; Goel, Supriya

    2014-03-01

    This report provides a preliminary quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in energy savings compared with buildings constructed to ANSI/ASHRAE/IES Standard 90.1-2010.

  13. Echo Hunting

    DEFF Research Database (Denmark)

    King, Anthea L.

    Broad line active galactic nuclei (AGN) have been proposed as potential standardisable candles by Watson et al. (2011), using a technique called reverberation mapping. This thesis investigates whether AGN are useful high redshift standard candles and how to optimise the scientific output of the ongo...

  14. Standards for Standardized Logistic Regression Coefficients

    Science.gov (United States)

    Menard, Scott

    2011-01-01

    Standardized coefficients in logistic regression analysis have the same utility as standardized coefficients in linear regression analysis. Although there has been no consensus on the best way to construct standardized logistic regression coefficients, there is now sufficient evidence to suggest a single best approach to the construction of a…
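One widely used construction (a sketch of a common semi-standardization, not necessarily the single best approach the abstract alludes to) multiplies each raw logistic coefficient by its predictor's standard deviation; this equals the coefficient obtained by refitting on z-scored predictors:

```python
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton's method.
    An intercept column is added internally; returns (intercept, coefs)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))   # predicted probabilities
        H = (Xb.T * (p * (1.0 - p))) @ Xb   # observed information matrix
        g = Xb.T @ (y - p)                  # gradient of the log-likelihood
        w = w + np.linalg.solve(H, g)
    return w[0], w[1:]

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2)) * np.array([1.0, 5.0])  # 2nd predictor on a larger scale
true_logit = 0.8 * X[:, 0] + 0.1 * X[:, 1]
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

_, b = fit_logistic(X, y)
b_std = b * X.std(axis=0)               # semi-standardized coefficients

# The same numbers fall out of refitting on z-scored predictors:
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, bz = fit_logistic(Z, y)
```

Standardization puts predictors measured on different scales onto a common footing, which is the same service standardized coefficients provide in linear regression.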

  15. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessments (PSAs) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA, but focuses on standardizing and specifying the analysis process, quantification rules, and criteria so as to minimize the deviation of analysis results across different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  16. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessments (PSAs) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA, but focuses on standardizing and specifying the analysis process, quantification rules, and criteria so as to minimize the deviation of analysis results across different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  17. Commercial Discount Rate Estimation for Efficiency Standards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fujita, K. Sydny [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-04-13

    Underlying each of the Department of Energy's (DOE's) federal appliance and equipment standards is a set of complex analyses of the projected costs and benefits of regulation. Any new or amended standard must be designed to achieve significant additional energy conservation, provided that it is technologically feasible and economically justified (42 U.S.C. 6295(o)(2)(A)). A proposed standard is considered economically justified when its benefits exceed its burdens, as represented by the projected net present value of costs and benefits. DOE performs multiple analyses to evaluate the balance of costs and benefits of commercial appliance and equipment efficiency standards, at the national and individual building or business level, each framed to capture different nuances of the complex impact of standards on the commercial end-user population. The Life-Cycle Cost (LCC) analysis models the combined impact of appliance first cost and operating cost changes on a representative commercial building sample in order to identify the fraction of customers achieving LCC savings or incurring net cost at the considered efficiency levels. Thus, the choice of commercial discount rate value(s) used to calculate the present value of energy cost savings within the LCC model implicitly plays a key role in estimating the economic impact of potential standard levels. This report is intended to provide a more in-depth discussion of the commercial discount rate estimation process than can be readily included in standard rulemaking Technical Support Documents (TSDs).
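The leverage the discount rate has on an LCC comparison can be sketched with a minimal calculation (the dollar figures are illustrative, not DOE's actual inputs):

```python
def lifecycle_cost(first_cost, annual_operating_cost, discount_rate, lifetime_years):
    """First cost plus the present value of operating costs over the lifetime."""
    pv_factor = sum(1.0 / (1.0 + discount_rate) ** t
                    for t in range(1, lifetime_years + 1))
    return first_cost + annual_operating_cost * pv_factor

# Hypothetical baseline vs. higher-efficiency unit: +$300 first cost, -$60/yr energy
base = lifecycle_cost(1000.0, 300.0, discount_rate=0.07, lifetime_years=15)
eff = lifecycle_cost(1300.0, 240.0, discount_rate=0.07, lifetime_years=15)
lcc_savings = base - eff   # positive: the efficient unit wins at this discount rate
```

At a sufficiently high discount rate the same comparison flips sign, which is why the discount rate value drives the estimated fraction of customers achieving LCC savings.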

  18. When Is Hub Gene Selection Better than Standard Meta-Analysis?

    Science.gov (United States)

    Langfelder, Peter; Mischel, Paul S.; Horvath, Steve

    2013-01-01

    Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques applied to
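The meta-analysis side of such a comparison can be illustrated with Stouffer's method for combining per-study evidence (a generic sketch; the study's actual pipeline, based on WGCNA consensus modules, is more involved):

```python
import math

def stouffer_z(z_scores, weights=None):
    """Combine per-study Z statistics into one meta-analysis Z.
    With equal weights this reduces to sum(z) / sqrt(k)."""
    if weights is None:
        weights = [1.0] * len(z_scores)
    num = sum(w * z for w, z in zip(weights, z_scores))
    den = math.sqrt(sum(w * w for w in weights))
    return num / den

# A gene showing consistent moderate evidence in three data sets
z_meta = stouffer_z([1.8, 2.1, 1.6])   # stronger combined evidence than any single study
```

The combined Z (and its p-value) is the kind of per-gene statistic the article contrasts with intramodular hub status in consensus modules.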

  19. When is hub gene selection better than standard meta-analysis?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    Full Text Available Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) Finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2). The article also reports a comparison of meta-analysis techniques

  1. Standard Guide for Wet Sieve Analysis of Ceramic Whiteware Clays

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This guide covers the wet sieve analysis of ceramic whiteware clays. This guide is intended for use in testing shipments of clay as well as for plant control tests. 1.2 The values stated in inch-pound units are to be regarded as standard. The values given in parentheses are mathematical conversions to SI units that are provided for information only and are not considered standard. 1.3 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.
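Whatever the apparatus details, reducing wet sieve data comes down to simple mass fractions; the following is a generic sketch, not part of the ASTM guide itself:

```python
def sieve_percent_retained(retained_grams, sample_grams):
    """Percent of the dry sample retained on each sieve (coarsest first)
    and the running cumulative percent retained."""
    pct = [100.0 * m / sample_grams for m in retained_grams]
    cumulative = []
    running = 0.0
    for p in pct:
        running += p
        cumulative.append(running)
    return pct, cumulative

# 100 g sample on three sieves; the remainder passed the finest sieve
pct, cum = sieve_percent_retained([5.0, 12.5, 20.0], sample_grams=100.0)
```

For plant control work, tracking the cumulative percent retained per shipment is what makes drifting clay gradations visible.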

  2. Journal of Astrophysics and Astronomy | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... The field of Very High Energy (VHE) gamma ray astronomy using the Atmospheric Cerenkov Technique has entered an interesting phase with detection of various galactic and extragalactic sources. Among galactic sources, only the Crab nebula has been established as a standard candle.

  3. The Planetary Nebula Spectrograph : The green light for galaxy kinematics

    NARCIS (Netherlands)

    Douglas, NG; Arnaboldi, M; Freeman, KC; Kuijken, K; Merrifield, MR; Romanowsky, AJ; Taylor, K; Capaccioli, M; Axelrod, T; Gilmozzi, R; Hart, J; Bloxham, G; Jones, D

    2002-01-01

    Planetary nebulae (PNe) are now well established as probes of galaxy dynamics and as standard candles in distance determinations. Motivated by the need to improve the efficiency of planetary nebulae searches and the speed with which their radial velocities are determined, a dedicated instrument-the

  4. Establishing working standards of chromosome aberrations analysis for biological dosimetry

    International Nuclear Information System (INIS)

    Bui Thi Kim Luyen; Tran Que; Pham Ngoc Duy; Nguyen Thi Kim Anh; Ha Thi Ngoc Lien

    2015-01-01

    Biological dosimetry is a dose assessment method that uses specific biological markers of radiation. The IAEA (International Atomic Energy Agency) and ISO (International Organization for Standardization) have defined the dicentric chromosome as specific to radiation; it is a gold standard for biodosimetry. Along with the documents published by the IAEA, WHO, ISO, and OECD, our results from studies on radiation-induced chromosome aberrations were organized systematically into nine standards dealing with the chromosome aberration test and the micronucleus test in human peripheral blood lymphocytes in vitro. These standards address: the reference dose-effect curves for dose estimation, the minimum detection levels, cell culture, slide preparation, the scoring procedure for chromosome aberrations used in biodosimetry, the criteria for converting aberration frequencies into absorbed dose, and the reporting of results. Following these standards, the automatic analysis devices were calibrated to improve the biological dosimetry method. These standards will be used to acquire and maintain accreditation of the Biological Dosimetry laboratory at the Nuclear Research Institute. (author)
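Converting a scored dicentric yield into absorbed dose typically inverts a linear-quadratic calibration curve, Y = C + αD + βD²; the coefficients below are illustrative placeholders, not laboratory-calibrated values:

```python
import math

def dose_from_dicentric_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Solve y = c + alpha*D + beta*D**2 for absorbed dose D (Gy),
    taking the positive root. Coefficients are illustrative only."""
    disc = alpha ** 2 + 4.0 * beta * (y - c)
    if disc < 0:
        raise ValueError("observed yield is below the background term")
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)

# 0.35 dicentrics per cell maps back to an absorbed dose of a few Gy
dose = dose_from_dicentric_yield(0.35)
```

In practice a laboratory would use its own fitted curve with uncertainties, and quote confidence limits on the estimated dose rather than a point value.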

  5. Integrated Data Collection Analysis (IDCA) Program — RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms, Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-04

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results of impact, friction, electrostatic discharge, and differential scanning calorimetry analyses of the RDX Type II Class 5 standard, performed a third and fourth time in the proficiency test and averaged with the analysis results from the first and second times. The results from averaging all four data sets (1, 2, 3, and 4) suggest a material with slightly more impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity compared to the results from Set 1, which was previously used as the values for the RDX standard in IDCA analysis reports.

  6. Recommendations for a proposed standard for performing systems analysis

    International Nuclear Information System (INIS)

    LaChance, J.; Whitehead, D.; Drouin, M.

    1998-01-01

    In August 1995, the Nuclear Regulatory Commission (NRC) issued a policy statement proposing improved regulatory decision-making by increasing the use of PRA [probabilistic risk assessment] in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. A key aspect in using PRA in risk-informed regulatory activities is establishing the appropriate scope and attributes of the PRA. In this regard, ASME decided to develop a consensus PRA Standard. The objective is to develop a PRA Standard such that the technical quality of nuclear plant PRAs will be sufficient to support risk-informed regulatory applications. This paper presents example recommendations for the systems analysis element of a PRA for incorporation into the ASME PRA Standard
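Systems analysis in a PRA quantifies accident logic such as fault trees; a minimal sketch of independent basic events combined through AND/OR gates (a generic illustration, not content of the ASME Standard):

```python
def and_gate(probs):
    """All independent inputs must fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """At least one independent input fails."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Two redundant pumps (both must fail) in parallel with a valve failure path
p_pumps = and_gate([1e-2, 1e-2])
p_top = or_gate([p_pumps, 5e-4])
```

A standard's attributes for systems analysis govern, among other things, how such basic-event data and dependency assumptions are justified and documented.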

  7. Tooth contact analysis of spur gears. Part 1-SAM analysis of standard gears

    Directory of Open Access Journals (Sweden)

    Creţu Spiridon

    2017-01-01

    Full Text Available Involute gears are sensitive to misalignment of their axes, which causes transmission errors and perturbs the pressure distribution along the tooth flank. The concentrated contacts in gears are then no longer of Hertzian type. A semi-analytical method was developed to find the contact area, the pressure distribution, and the subsurface stress state. The matrix of initial separations is found analytically for standard and non-standard spur gears. Misalignment, as well as flank crowning and flank end relief, is included in the numerical analysis.

  8. ANSI/ASHRAE/IES Standard 90.1-2013 Preliminary Determination: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Hart, Reid; Athalye, Rahul A.; Rosenberg, Michael I.; Richman, Eric E.; Winiarski, David W.

    2014-03-01

    Section 304(b) of the Energy Conservation and Production Act (ECPA), as amended, requires the Secretary of Energy to make a determination each time a revised version of ASHRAE Standard 90.1 is published with respect to whether the revised standard would improve energy efficiency in commercial buildings. When the U.S. Department of Energy (DOE) issues an affirmative determination on Standard 90.1, states are statutorily required to certify within two years that they have reviewed and updated the commercial provisions of their building energy code, with respect to energy efficiency, to meet or exceed the revised standard. This report provides a preliminary qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition).

  9. International cooperative analysis of standard substance, IAEA-0390

    International Nuclear Information System (INIS)

    Kawamoto, Keizo; Takada, Jitsuya; Moriyama, Hirotake; Akaboshi, Mitsuhiko

    1999-01-01

    Three kinds of algae (IAEA-0391, IAEA-0392, and IAEA-0393) were defined as biological standard substances for monitoring environmental pollution by the Analytical Quality Control Service of the IAEA (IAEA-AQCS). In this study, these standard substances were analyzed by ICP-MS and the results compared with those of simultaneously conducted instrumental neutron activation analysis (INAA). The cultures of the three algae were cooperatively prepared by IAEA-AQCS and the Microbial Institute of Czechoslovakia. After drying and sterilization by Co-60 exposure, the samples were sent to KURRI. When the results obtained at KURRI were compared with the values recommended through statistical treatment of the data obtained by IAEA, the values for five elements (Fe, Cr, Mg, Mn, and Na) coincided well for each of IAEA-0391, IAEA-0392, and IAEA-0393, and the values for As, Ca, Cd, Co, Cu, K, and Zn were nearly coincident. For Hg and La, the data from INAA and ICP-MS differed considerably from the recommended IAEA values for all samples. (M.N.)

  10. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Quantitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Athalye, Rahul A.; Rosenberg, Michael I.; Xie, YuLong; Wang, Weimin; Hart, Philip R.; Zhang, Jian; Goel, Supriya; Mendon, Vrushali V.

    2014-09-04

    This report provides a final quantitative analysis to assess whether buildings constructed according to the requirements of ANSI/ASHRAE/IES Standard 90.1-2013 would result in improved energy efficiency in commercial buildings. The final analysis considered each of the 110 addenda to Standard 90.1-2010 that were included in Standard 90.1-2013. PNNL reviewed all addenda included by ASHRAE in creating Standard 90.1-2013 from Standard 90.1-2010, and considered their combined impact on a suite of prototype building models across all U.S. climate zones. Most addenda were deemed to have little quantifiable impact on building efficiency for the purpose of DOE’s final determination. However, out of the 110 total addenda, 30 were identified as having a measurable and quantifiable impact.

  11. Testing and Improving the Luminosity Relations for Gamma-Ray Bursts

    Science.gov (United States)

    Collazzi, Andrew C.

    2012-01-01

    Gamma Ray Bursts (GRBs) have several luminosity relations where a measurable property of a burst light curve or spectrum is correlated with the burst luminosity. These luminosity relations are calibrated for the fraction of bursts with spectroscopic redshifts and hence known luminosities. GRBs have thus become known as a type of "standard candle," where standard candle is meant in the usual sense that luminosities can be derived from measurable properties of the bursts. GRBs can therefore be used for the same cosmology applications as Type Ia supernovae, including the construction of the Hubble Diagram and measuring the massive star formation rate. The greatest disadvantage of using GRBs as standard candles is that their accuracy is lower than desired. With the recent advent of GRBs as a new standard candle, every effort must be made to test and improve the distance measures. Here, methods are employed to do just that. First, generalized forms of two tests are performed on the luminosity relations. All the luminosity relations pass one of these tests, and all but two pass the other. Even with this failure, redundancies in using multiple luminosity relations allow all the luminosity relations to retain value. Next, the "Firmani relation" is shown to have poorer accuracy than first advertised. It is also shown to be derivable from two other luminosity relations. For these reasons, the Firmani relation is useless for cosmology. The Amati relation is then revisited and shown to be an artifact of a combination of selection effects. Therefore, the Amati relation is also not good for cosmology. Fourthly, the systematic errors involved in measuring a luminosity indicator (Epeak) are measured. The result is an irreducible systematic error of 28%. Finally, the work concludes with a discussion about the impact of the work and the future of GRB luminosity relations.

  12. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of probabilistic safety assessments (PSAs) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method is based on THERP and ASEP HRA, which are widely used for conventional HRA, but focuses on standardizing and specifying the analysis process, quantification rules, and criteria so as to minimize the deviation of analysis results across different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  13. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at the Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria to minimize the deviation of analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  14. Radiation Safety Analysis In The NFEC For Assessing Possible Implementation Of The ICRP-60 Standard

    International Nuclear Information System (INIS)

    Yowono, I.

    1998-01-01

    A radiation safety analysis of the three facilities in the nuclear fuel element center (NFEC), for assessing the possible implementation of the ICRP-60 standard, has been performed. The analysis covered the radiation doses received by workers, dose rates in the working area, surface contamination levels, air contamination levels, and the level of radioactive gas release to the environment, and was based on the BATAN regulation and the ICRP-60 standard. The results showed that the highest radiation dose received was only around 15% of the limit set in the ICRP-60 standard and only 6% of the limit set in the BATAN regulation. Thus ICRP-60 could be implemented as the radiation safety standard without changing the laboratory design

  15. Standard gamma-ray spectra for the comparison of spectral analysis software

    International Nuclear Information System (INIS)

    Woods, S.; Hemingway, J.; Bowles, N.

    1997-01-01

    Three sets of standard γ-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  16. Standard gamma-ray spectra for the comparison of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Woods, S.; Hemingway, J.; Bowles, N. [and others

    1997-08-01

    Three sets of standard {gamma}-ray spectra have been produced for use in assessing the performance of spectral analysis software. The origin of and rationale behind the spectra are described. Nine representative analysis systems have been tested both in terms of component performance and in terms of overall performance and the problems encountered in the analysis are discussed. (author)

  17. Standard test method for isotopic analysis of uranium hexafluoride by double standard single-collector gas mass spectrometer method

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This is a quantitative test method applicable to determining the mass percent of uranium isotopes in uranium hexafluoride (UF6) samples with 235U concentrations between 0.1 and 5.0 mass %. 1.2 This test method may be applicable for the entire range of 235U concentrations for which adequate standards are available. 1.3 This test method is for analysis by a gas magnetic sector mass spectrometer with a single collector using interpolation to determine the isotopic concentration of an unknown sample between two characterized UF6 standards. 1.4 This test method is to replace the existing test method currently published in Test Methods C761 and is used in the nuclear fuel cycle for UF6 isotopic analyses. 1.5 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appro...

  18. Sleep disordered breathing analysis in a general population using standard pulse oximeter signals.

    Science.gov (United States)

    Barak-Shinar, Deganit; Amos, Yariv; Bogan, Richard K

    2013-09-01

    Obstructive sleep apnea, reported as the apnea-hypopnea index (AHI), is usually measured in sleep laboratories using a high number of electrodes connected to the patient's body. In this study, we examined the use of a standard pulse oximeter system with an automated analysis based on the photoplethysmograph (PPG) signal for the diagnosis of sleep disordered breathing. Using a standard and simple device with high accuracy might provide a convenient diagnostic or screening solution for patient evaluation at home or in other out-of-center testing environments. The study included 140 consecutive patients who were routinely referred to a sleep laboratory [SleepMed Inc.] for the diagnosis of sleep disordered breathing. Each patient underwent an overnight polysomnography (PSG) study according to AASM guidelines in an AASM-accredited sleep laboratory. The automatic analysis is based on the photoplethysmographic and saturation signals only; those two signals were recorded for the entire night as part of the full overnight PSG sleep study. The AHI calculated from the PPG analysis was compared with the AHI calculated from the manually scored gold standard full PSG. The AHI and total respiratory events measured by the pulse oximeter analysis correlated very well with the corresponding results obtained by the gold standard full PSG. At the AHI ≥ 5 and AHI ≥ 15 cutoffs, the sensitivity and specificity of the analysis were both above 90%. The sensitivity and positive predictive value for the detection of respiratory events were both above 84%. The tested system yielded results for sleep disordered breathing that were acceptable compared with the gold standard PSG in patients with moderate to severe sleep apnea. Accordingly, and given the convenience and simplicity of the standard pulse oximeter device, the new system can be considered suitable for home and ambulatory diagnosis or screening of sleep disordered breathing patients.
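The headline accuracy figures in validation studies of this kind come from a simple 2x2 comparison of the automated AHI against the PSG-scored AHI at a cutoff. A minimal sketch with made-up data (not the study's):

```python
def sens_spec(auto_ahi, psg_ahi, threshold):
    """Sensitivity and specificity of an automated AHI against the PSG
    gold standard at a given cutoff (e.g. AHI >= 5 or AHI >= 15)."""
    pairs = list(zip(auto_ahi, psg_ahi))
    tp = sum(a >= threshold and p >= threshold for a, p in pairs)
    fn = sum(a < threshold and p >= threshold for a, p in pairs)
    tn = sum(a < threshold and p < threshold for a, p in pairs)
    fp = sum(a >= threshold and p < threshold for a, p in pairs)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Illustrative (automated, PSG) AHI pairs for six hypothetical patients.
auto = [3, 7, 12, 22, 40, 4]
psg = [2, 8, 14, 25, 38, 6]
sens, spec = sens_spec(auto, psg, threshold=5)
```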

  19. Analysis of standard problem six (Semiscale test S-02-6) data

    International Nuclear Information System (INIS)

    Cartmill, C.E.

    1977-08-01

    Test S-02-6 of the Semiscale Mod-1 blowdown heat transfer test series was conducted to supply data for the U.S. Nuclear Regulatory Commission Standard Problem Six. To determine the credibility of the data and thus establish the validity of Standard Problem Six, an analysis of the results of Test S-02-6 was performed and is presented. This analysis consisted of investigations of system hydraulic and core thermal data. The credibility of the system hydraulic data was investigated through comparisons of the data with data and calculations from related sources (Test S-02-4) and, when necessary, through assessment of physical events. The credibility of the core thermal data was based on a thorough analysis of physical events. The results of these investigations substantiate the validity of Test S-02-6 data

  20. The preparation of synthetic standards for use in instrumental neutron-activation analysis

    International Nuclear Information System (INIS)

    Eddy, B.T.; Watterson, J.I.W.; Erasmus, C.S.

    1979-01-01

    An account is given of the formulation and preparation of synthetic standards suitable for the routine analysis of minerals, ores, and ore concentrates by instrumental neutron activation. Fifteen standards were prepared, each containing from one to seven elements. The standards contain forty-four elements that produce isotopes with half-lives longer than 12 hours. An evaluation of the accuracy and precision of the method of preparation is given

  1. Design and analysis of control charts for standard deviation with estimated parameters

    NARCIS (Netherlands)

    Schoonhoven, M.; Riaz, M.; Does, R.J.M.M.

    2011-01-01

    This paper concerns the design and analysis of the standard deviation control chart with estimated limits. We consider an extensive range of statistics to estimate the in-control standard deviation (Phase I) and design the control chart for real-time process monitoring (Phase II) by determining the
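Although the record is truncated, the two-phase scheme it names is standard: estimate the in-control standard deviation from Phase I subgroups, then monitor Phase II subgroups against limits built from that estimate. A textbook-style sketch, assuming subgroups of size 5 and the classical 3-sigma S-chart constants (B3 = 0, B4 = 2.089), which are not taken from this paper:

```python
import statistics

def s_chart_limits(phase1_subgroups, B3=0.0, B4=2.089):
    """Phase I: estimate the in-control spread by the mean subgroup SD (s-bar).
    Phase II: classical S-chart limits LCL = B3*s_bar, UCL = B4*s_bar
    (B3, B4 are the textbook constants for subgroups of size n = 5)."""
    s_bar = statistics.mean(statistics.stdev(g) for g in phase1_subgroups)
    return B3 * s_bar, s_bar, B4 * s_bar

# Phase I: illustrative in-control reference data.
phase1 = [[10, 11, 9, 10, 10], [12, 10, 11, 9, 10], [9, 10, 10, 11, 10]]
lcl, s_bar, ucl = s_chart_limits(phase1)

# Phase II: a new subgroup signals if its SD falls outside the limits.
new_subgroup = [10, 11, 10, 30, 9]
signals = not (lcl <= statistics.stdev(new_subgroup) <= ucl)
```

The paper's point is precisely that the quality of such a chart depends on which Phase I statistic is used to estimate the in-control standard deviation.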

  2. Phytochemical analysis and standardization of Strychnos nux-vomica extract through HPTLC techniques

    Directory of Open Access Journals (Sweden)

    Dinesh Kumar Patel

    2012-05-01

    Full Text Available Objective: The objective is to develop a novel qualitative and quantitative method by which different phytoconstituents of Strychnos nux-vomica L. can be determined. Methods: To profile the phytoconstituents of Strychnos nux-vomica, the hydroalcoholic extract of Strychnos nux-vomica was subjected to preliminary phytochemical analysis, antimicrobial activity testing against certain pathogenic microorganisms, a solubility test, loss on drying, and pH measurement. The extract was also subjected to quantitative analysis, including total phenol, flavonoid, and heavy metal analysis. Quantitative analysis was performed through HPTLC methods using strychnine and brucine as standard markers. Results: Phytochemical analysis revealed the presence of alkaloids, carbohydrates, tannins, steroids, triterpenoids, and glycosides in the extract. The total flavonoid and phenol contents of the Strychnos nux-vomica L. extract were found to be 0.40% and 0.43%, respectively. Results showed that the levels of heavy metals (lead, arsenic, mercury, and cadmium) comply with the standard limits. The total bacterial count and the yeast and mould contents were within limits, whereas E. coli and Salmonella were absent from the extract. The contents of strychnine and brucine were found to be 4.75% and 3.91%, respectively. Conclusions: These studies provide valuable information for correct identification and selection of the drug among various adulterants. In the future this study will be helpful for quantitative analysis as well as standardization of Strychnos nux-vomica L.

  3. ANSI/ASHRAE/IES Standard 90.1-2013 Determination of Energy Savings: Qualitative Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halverson, Mark A.; Rosenberg, Michael I.; Hart, Philip R.; Richman, Eric E.; Athalye, Rahul A.; Winiarski, David W.

    2014-09-04

    This report provides a final qualitative analysis of all addenda to ANSI/ASHRAE/IES Standard 90.1-2010 (referred to as Standard 90.1-2010 or 2010 edition) that were included in ANSI/ASHRAE/IES Standard 90.1-2013 (referred to as Standard 90.1-2013 or 2013 edition). All addenda in creating Standard 90.1-2013 were evaluated for their projected impact on energy efficiency. Each addendum was characterized as having a positive, neutral, or negative impact on overall building energy efficiency.

  4. Cosmology with coalescing massive black holes

    International Nuclear Information System (INIS)

    Hughes, Scott A; Holz, Daniel E

    2003-01-01

    The gravitational waves generated in the coalescence of massive binary black holes will be measurable by LISA to enormous distances. Redshifts z ∼ 10 or larger (depending somewhat on the mass of the binary) can potentially be probed by such measurements, suggesting that binary coalescences can be made into cosmological tools. We discuss two particularly interesting types of probe. First, by combining gravitational-wave measurements with information about the cosmography of the universe, we can study the evolution of black-hole masses and merger rates as a function of redshift, providing information about the growth of structures at high redshift and possibly constraining hierarchical merger scenarios. Second, if it is possible to associate an 'electromagnetic' counterpart with a coalescence, it may be possible to measure both redshift and luminosity distance to an event with less than ∼1% error. Such a measurement would constitute an amazingly precise cosmological standard candle. Unfortunately, gravitational lensing uncertainties will reduce the quality of this candle significantly. Though not as amazing as might have been hoped, such a candle would nonetheless very usefully complement other distance-redshift probes, in particular providing a valuable check on systematic effects in such measurements

  5. Stakeholder analysis for adopting a personal health record standard in Korea.

    Science.gov (United States)

    Kang, Min-Jeoung; Jung, Chai Young; Kim, Soyoun; Boo, Yookyung; Lee, Yuri; Kim, Sundo

    Interest in health information exchanges (HIEs) is increasing. Several countries have adopted core health data standards with appropriate strategies. This study was conducted to determine the feasibility of a continuity of care record (CCR) as the standard for an electronic version of the official transfer note and the HIE in Korean healthcare. A technical review of the CCR standard and analysis of stakeholders' views were undertaken. Transfer notes were reviewed and matched with CCR standard categories. The standard for the Korean coding system was selected. Stakeholder analysis included an online survey of members of the Korean Society of Medical Informatics, a public hearing to derive opinions of consumers, doctors, vendors, academic societies and policy makers about the policy process, and a focus group meeting with EMR vendors to determine which HIE objects were technically applicable. Data objects in the official transfer note form matched CCR standards. Korean Classification of Diseases, Korean Standard Terminology of Medicine, Electronic Data Interchange code (EDI code), Logical Observation Identifiers Names and Codes, and Korean drug codes (KD code) were recommended as the Korean coding standard. 'Social history', 'payers', and 'encounters' were mostly marked as optional or unnecessary sections, and 'allergies', 'alerts', 'medication list', 'problems/diagnoses', 'results', and 'procedures' as mandatory. Unlike the US, 'social history' was considered optional and 'advance directives' mandatory. At the public hearing there was some objection from the Korean Medical Association to the HIE on legal grounds in terms of intellectual property and patients' personal information. Other groups showed positive or neutral responses. 
Focus group members divided CCR data objects into three phases based on predicted adoption time in the CCR: (i) immediate adoption; (ii) short-term adoption ('alerts', 'family history'); and (iii) long-term adoption ('results', 'advanced directives

  6. The UFFO (Ultra Fast Flash Observatory) Pathfinder: Science and Mission

    DEFF Research Database (Denmark)

    Chen, P.; Ahmad, S.; Ahn, K.

    in a more rigorous test of current internal shock models, probe the extremes of bulk Lorentz factors, provide the first early and detailed measurements of fast-rise GRB optical light curves, and help verify the prospect of GRB as a new standard candle. We will describe the science and the mission...

  7. 14 CFR Appendix J to Part 25 - Emergency Evacuation

    Science.gov (United States)

    2010-01-01

    ... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Pt. 25, App. J Appendix J to Part 25—Emergency Evacuation...-candles prior to the activation of the airplane emergency lighting system. The source(s) of the initial... airplane emergency lighting system. (b) The airplane must be in a normal attitude with landing gear...

  8. The Great Attractor: At the Limits of Hubble's Law of the Expanding Universe.

    Science.gov (United States)

    Murdin, Paul

    1991-01-01

    Presents the origin and mathematics of Hubble's Law of the expanding universe. Discusses limitations to this law and the related concepts of standard candles, elliptical galaxies, and streaming motions, which are conspicuous deviations from the law. The third of three models proposed as explanations for streaming motions is designated: The Great…

  9. Standardized Effect Size Measures for Mediation Analysis in Cluster-Randomized Trials

    Science.gov (United States)

    Stapleton, Laura M.; Pituch, Keenan A.; Dion, Eric

    2015-01-01

    This article presents 3 standardized effect size measures to use when sharing results of an analysis of mediation of treatment effects for cluster-randomized trials. The authors discuss 3 examples of mediation analysis (upper-level mediation, cross-level mediation, and cross-level mediation with a contextual effect) with demonstration of the…

  10. The development of a standard format for accelerator data analysis

    International Nuclear Information System (INIS)

    Cohen, S.

    1989-01-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and "understood" by a large set of applications running on the control and data-analysis computers at LAMPF as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. It is a straightforward process to code a parser for both the control computer and PCs once a consensus on the file syntax has been established. This paper describes the file format and the methods used to integrate the format into existing diagnostic and control software.
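The abstract does not spell out the file syntax, so as a hedged illustration only, here is a minimal parser for a hypothetical section/keyword layout of the kind such a diagnostic-data format might use; the point of the paper is that one function like this could serve both the control computers and the PCs:

```python
def parse_records(text):
    """Parse a hypothetical 'KEY = value' record format with '[device]'
    section headers, one per beam-line diagnostic device."""
    records, current = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):  # skip blanks and comments
            continue
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
            records[current] = {}
        elif "=" in line and current is not None:
            key, _, value = line.partition("=")
            records[current][key.strip()] = value.strip()
    return records

# Illustrative file contents; device name and keys are invented.
sample = """
[wire_scanner_07]
position = 12.5
units = mm
"""
data = parse_records(sample)
```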

  11. Nuclear microprobe analysis of the standard reference materials

    International Nuclear Information System (INIS)

    Jaksic, M.; Fazinic, S.; Bogdanovic, I.; Tadic, T.

    2002-01-01

    Most of the presently existing Standard Reference Materials (SRMs) for nuclear analytical methods are certified for an analyzed mass of the order of a few hundred mg. The typical mass of a sample analyzed by PIXE or XRF methods is very often below 1 mg. With the development of focused proton or X-ray beams, the masses that can typically be analyzed go down to the μg or even ng level. It is difficult to make biological or environmental SRMs with the desired homogeneity at such a small scale. However, the use of fundamental-parameter quantitative evaluation procedures (an absolute method) minimizes the need for SRMs. In the PIXE and micro-PIXE setup at our Institute, the fundamental-parameter approach is used. For exact calibration of the quantitative analysis procedure, just one standard sample is needed. In our case, glass standards that showed homogeneity down to the micron scale were used. Of course, it is desirable to use SRMs for quality assurance, and therefore the need for homogeneous materials is justified even for the micro-PIXE method. In this presentation, a brief overview of the PIXE setup calibration is given, along with some recent results of tests of several SRMs

  12. Soil texture analysis by laser diffraction - standardization needed

    DEFF Research Database (Denmark)

    Callesen, Ingeborg; Palviainen, M.; Kjønaas, O. Janne

    2017-01-01

    Soil texture is a central soil quality property. Laser diffraction (LD) for determination of particle size distribution (PSD) is now widespread due to easy analysis and low cost. However, pretreatment methods and interpretation of the resulting soil PSDs are not standardized. Comparison of LD data... with sedimentation and sieving data may cause misinterpretation and confusion. In literature that reports PSDs based on LD, pretreatment methods, operating procedures and data methods are often underreported or not reported, although literature stressing their importance exists (e.g. Konert and Vandenberghe, 1997... and many newer; ISO 13320:2009). PSD uncertainty caused by pretreatments and PSD bias caused by plate-shaped clay particles still call for more method standardization work. If LD is used more generally, new pedotransfer functions for other soil properties (e.g. water retention) based on sieving

  13. Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, P.; Logan, J.; Bird, L.; Short, W.

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  14. On criteria for examining analysis quality with standard reference material

    International Nuclear Information System (INIS)

    Yang Huating

    1997-01-01

    The advantages, disadvantages, and applicability of some criteria for examining analysis quality with standard reference materials are discussed. The combination of the uncertainties of the instrument examined and of the reference material should be determined on the basis of the specific situation. Without data on the instrument's uncertainty, it is acceptable to substitute a suitable multiple of the standard deviation for the uncertainty. The examination should not lead to larger reported errors in routine measurements than actually exist. Overly strict examination should also be avoided
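A common form of such a criterion compares the measured bias against the combined uncertainty of the instrument and the reference material. A sketch under that assumption (the abstract names no specific formula, and the coverage factor k = 2 is illustrative):

```python
import math

def within_criterion(measured, certified, u_lab, u_ref, k=2.0):
    """Accept if |measured - certified| <= k * sqrt(u_lab**2 + u_ref**2).
    When the instrument uncertainty u_lab is unknown, the abstract suggests
    substituting a suitable multiple of the observed standard deviation."""
    return abs(measured - certified) <= k * math.sqrt(u_lab**2 + u_ref**2)

# Illustrative numbers: a result of 10.3 against a certified value of 10.0.
ok = within_criterion(measured=10.3, certified=10.0, u_lab=0.15, u_ref=0.10)
```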

  15. Synthetic multielement standards used for instrumental neutron activation analysis as rock imitations

    International Nuclear Information System (INIS)

    Leypunskaya, D.I.; Drynkin, V.I.; Belenky, B.V.; Kolomijtsev, M.A.; Dundera, V.Yu.; Pachulia, N.V.

    1975-01-01

    Complex (multielemental) standards representing the microelement composition of standard rocks such as trap ST-1 (USSR), gabbrodiorite SGD-1 (USSR), albitized granite SG-1 (USSR), basalt BCR-1 (USA), and granodiorite GSP-1 (USA) have been synthesized. It has been shown that the concentration of each microelement in the synthetic standards can be specified with high precision. A comparative investigation of the synthetic imitations and the above natural standard rocks has been carried out. It has been found that the results of instrumental neutron activation analysis using the synthetic standards are as good as those obtained with natural standard rocks. The results have also been used to substantiate the versatility of the preparation method, i.e., to generalize the possibility of using this method to prepare synthetic standards representing the microelement composition of any natural rocks with various compositions and concentrations of microelements. (T.G.)

  16. Initial testing of a neutron activation analysis system by analysing standard reference materials

    International Nuclear Information System (INIS)

    Suhaimi Hamzah; Roslan Idris; Abdul Khalik Haji Wood; Che Seman Mahmood; Abdul Rahim Mohamad Noor.

    1983-01-01

    This paper describes the data acquisition and processing system in our laboratories (ND6600), the methods of activation analysis, and the results obtained from our analysis of the IAEA standard reference material SL-1 (lake sediment) and NBS coal ash 1632a. These standards were analysed in order to check the capability of the system, which was designed to enable the user to independently collect and process data from multiple radiation detectors. (author)

  17. Discussion on the Standardization of Shielding Materials — Sensitivity Analysis of Material Compositions

    Directory of Open Access Journals (Sweden)

    Ogata Tomohiro

    2017-01-01

    Full Text Available The overview of standardization activities for shielding materials is described. We propose a basic approach for standardizing material composition used in radiation shielding design for nuclear and accelerator facilities. We have collected concrete composition data from actual concrete samples to organize a representative composition and its variance data. Then the sensitivity analysis of the composition variance has been performed through a simple 1-D dose calculation. Recent findings from the analysis are summarized.

  18. Cleanup standards and pathways analysis methods

    International Nuclear Information System (INIS)

    Devgun, J.S.

    1993-01-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect the public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines

  19. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. 
The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients
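The internal-consistency figure reported (above 0.80) is typically Cronbach's alpha. A minimal sketch of that computation, with made-up 0/1/2 item scores in the SAT-SPS scoring style:

```python
import statistics

def cronbach_alpha(item_scores):
    """item_scores: one list of scores per item, respondents in the same order.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(item_scores)
    n = len(item_scores[0])
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    sum_item_var = sum(statistics.variance(item) for item in item_scores)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))

# Three hypothetical items scored 0 ('incorrect or not performed'),
# 1 ('acceptable'), or 2 ('correct') for five hypothetical students.
alpha = cronbach_alpha([[2, 1, 2, 0, 1], [2, 1, 1, 0, 1], [2, 2, 1, 0, 1]])
```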

  20. The development of a standard format for accelerator data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, S [Los Alamos National Lab., NM (USA)

    1990-08-01

    The purpose of specifying a standard file format is to facilitate the analysis of data sampled by accelerator beam-diagnostic instrumentation. The format's design needs to be flexible enough to allow storage of information from disparate diagnostic devices placed in the beam line. The goal of this project was to establish a standard file layout and syntax that can be generated and 'understood' by a large set of applications running on the control and data-analysis computers at LAMPF, as well as applications on personal computers. Only one file-parsing algorithm is needed for all computing systems. Once a consensus on the file syntax has been established, it is a straightforward process to code a parser for both the control computer and PCs. This paper describes the file format and the method used to integrate the format into existing diagnostic and control software. (orig.).

  1. Environmental protection standards - from the point of view of systems analysis

    Energy Technology Data Exchange (ETDEWEB)

    Becker, K

    1978-11-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors.

  2. Environmental protection standards - from the point of view of systems analysis

    International Nuclear Information System (INIS)

    Becker, K.

    1978-01-01

    A project of the International Institute of Applied Systems Analysis (IIASA) in Laxenburg castle near Vienna is reviewed where standards for environmental protection are interpreted from the point of view of systems analysis. Some examples are given to show how results are influenced not only by technical and economic factors but also by psychological and political factors. (orig.) [de

  3. Non standard analysis, polymer models, quantum fields

    International Nuclear Information System (INIS)

    Albeverio, S.

    1984-01-01

    We give an elementary introduction to non-standard analysis and its applications to the theory of stochastic processes. This is based on a joint book with J.E. Fenstad, R. Høegh-Krohn and T. Lindstrøm. In particular we give a discussion of a hyperfinite theory of Dirichlet forms with applications to the study of the Hamiltonian for a quantum mechanical particle in the potential created by a polymer. We also discuss new results on the existence of attractive polymer measures in dimension d = 1, 2 and on the ((φ²)²)_d-model of interacting quantum fields. (orig.)

  4. Performance Analysis of a Utility Helicopter with Standard and Advanced Rotors

    National Research Council Canada - National Science Library

    Yeo, Hyeonsoo; Bousman, William G; Johnson, Wayne

    2002-01-01

    Flight test measurements of the performance of the UH-60 Black Hawk helicopter with both standard and advanced rotors are compared with calculations obtained using the comprehensive helicopter analysis CAMRAD II...

  5. 41 CFR 102-74.180 - What illumination levels must Federal agencies maintain on Federal facilities?

    Science.gov (United States)

    2010-07-01

    ...) 30 foot-candles in work areas during working hours, measured at 30 inches above floor level; (c) 10 foot-candles, but not less than 1 foot-candle, in non-work areas, during working hours (normally this... surfaces, measured at a height of 30 inches above floor level, during working hours (for visually difficult...

  6. Analysis of Standards Efficiency in Digital Television Via Satellite at Ku and Ka Bands

    Directory of Open Access Journals (Sweden)

    Landeros-Ayala Salvador

    2013-06-01

Full Text Available In this paper, an analysis of the main technical features of digital television standards for satellite transmission is carried out. Based on simulations and link budget analysis, the standard with the best operational performance is identified, and a comparative efficiency analysis is conducted for the Ku and Ka bands for both transparent and regenerative transponders in terms of power, bandwidth, information rate and link margin, covering clear sky, uplink rain, downlink rain and rain on both links.
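The comparison the abstract describes rests on standard link-budget arithmetic. A minimal clear-sky downlink sketch is below; every numeric input (EIRP, frequency, slant range, G/T, information rate) is an assumed example value, not a figure from the paper.

```python
import math

# Illustrative clear-sky downlink budget; all inputs are assumed examples.
eirp_dbw = 52.0           # satellite EIRP, dBW
freq_ghz = 12.0           # Ku-band downlink frequency, GHz
slant_range_km = 38_000   # distance to the earth station, km
g_over_t_dbk = 20.0       # earth-station figure of merit, dB/K
info_rate_bps = 30e6      # information rate, 30 Mbit/s
boltzmann_dbw = -228.6    # Boltzmann constant, dBW/(K*Hz)

# free-space loss: 92.45 + 20*log10(f_GHz) + 20*log10(d_km)
fsl_db = 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(slant_range_km)
cn0_dbhz = eirp_dbw - fsl_db + g_over_t_dbk - boltzmann_dbw  # carrier-to-noise density
ebn0_db = cn0_dbhz - 10 * math.log10(info_rate_bps)          # per-bit SNR
```

The same arithmetic, re-run with a rain-attenuation term subtracted from the received power, gives the degraded-link margins the paper compares across bands.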

  7. A dedicated on-line system for the preparation and validation of standard beads in XRF analysis

    International Nuclear Information System (INIS)

    Yamamoto, Yasuyuki; Ogasawara, Noriko; Nakata, Akio; Shoji, Shizuko.

    1995-01-01

A dedicated on-line system for X-ray Fluorescence (XRF) analysis with the glass-bead method was developed, in which the preparation of standard beads was automated, including the proper choice of reagents, assignment of bead compositions and validation of the prepared beads. This system features: a. the Fundamental Parameter (FP) method for validation of standard beads; b. an original database of high-purity reagents for standards; c. automatic calculation of a suitable composition for each standard bead, given a concentration range for each element and the number of standard beads. 1) The calculation is based on random numbers, and makes a random assignment of composition for each bead. 2) The calculation results are automatically stored in a computer as a condition file for quantitative analysis. 3) The amount of material for a standard mixture is corrected if the valence or chemical compound of an analysis element differs from that of the standard material in the database. In order to realize these features, many high-purity reagents were examined for purity and other characteristics to test their suitability as standard materials, and software for the on-line processing was developed in-house. (author)

  8. The Status of the Ultra Fast Flash Observatory - Pathfinder

    DEFF Research Database (Denmark)

    Nam, J. W.; Ahmad, S.; Ahn, K. B.

    2014-01-01

    The Ultra Fast Flash Observatory (UFFO) is a project to study early optical emissions from Gamma Ray Bursts (GRBs). The primary scientific goal of UFFO is to see if GRBs can be calibrated with their rising times, so that they could be used as new standard candles. In order to minimize delay in op...

  9. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of polychlorinated biphenyls (PCBs) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed
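The "plug and play" SLM architecture the abstract describes amounts to a common module interface chained by a host. The sketch below uses the module names from the abstract, but the interface, method names and dummy data are assumptions for illustration only.

```python
from abc import ABC, abstractmethod

# Minimal sketch of the SLM/SAME idea: every module exposes the same
# interface, and the host chains modules into a complete analysis.
class StandardLabModule(ABC):
    @abstractmethod
    def process(self, sample: dict) -> dict: ...

class SoxhletExtractor(StandardLabModule):
    def process(self, sample):
        return {**sample, "extracted": True}

class GasChromatograph(StandardLabModule):
    def process(self, sample):
        return {**sample, "chromatogram": [0.1, 0.7, 0.2]}  # dummy signal

def run_same(modules, sample):
    # the host control system orchestrates the whole analytical process
    for module in modules:
        sample = module.process(sample)
    return sample

result = run_same([SoxhletExtractor(), GasChromatograph()], {"id": "soil-001"})
```

Because every block honors the same interface, swapping a module (say, a different clean-up column) changes only the list passed to the host, which is the point of the standardized-module design.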

  10. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  11. Identifying variables that influence manufacturing product quality

    Directory of Open Access Journals (Sweden)

    Marek Krynke

    2014-10-01

Full Text Available In the article, a risk analysis of the production process of selected products in a plant producing votive candles was conducted. The Pareto-Lorenz diagram and the FMEA method were used, which indicated the most important areas affecting the production of the selected candle elements. A synthesis of intangible factors affecting production in the audited company was also carried out, with particular emphasis on the operation of the production system. The factors determining the validity of the studies were examined, following the BOST 14 principle of Toyota management. The most important areas of the company, those positively affecting the production process, were identified.
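The two tools named above combine naturally: FMEA scores each failure mode with a risk priority number, and a Pareto-Lorenz cut isolates the "vital few" modes. A rough sketch follows; the failure modes and severity/occurrence/detection ratings are invented for illustration, not taken from the audited plant.

```python
# Hypothetical failure modes with (severity, occurrence, detection) ratings.
failure_modes = {
    "wick off-centre":    (7, 5, 4),
    "air bubbles in wax": (4, 6, 3),
    "cracked container":  (8, 2, 5),
}

# FMEA risk priority number: RPN = severity * occurrence * detection
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)

# Pareto-Lorenz cut: the "vital few" modes covering ~80% of total risk
total = sum(rpn.values())
cumulative, vital_few = 0, []
for mode, value in ranked:
    cumulative += value
    vital_few.append(mode)
    if cumulative / total >= 0.8:
        break
```

Corrective actions are then targeted at the `vital_few` list first, which is the prioritization logic both methods are used for.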

  12. Photon energy scale determination and commissioning with radiative Z decays

    CERN Document Server

    Bondu, Olivier

    2012-01-01

The CMS electromagnetic calorimeter (ECAL) is composed of 75848 lead-tungstate scintillating crystals. It has been designed to be fast, compact, and radiation-hard, with fine granularity and excellent energy resolution. Obtaining the design resolution is a crucial challenge for the SM Higgs search in the two-photon channel at the LHC, and more generally good photon calibration and knowledge of the photon energy scale are required for analyses with photons in the final state. The behavior of photons and electrons in the calorimeter is not identical, making the use of a dedicated standard candle for photons, complementary to the canonical high-yield $Z^0$ decay to electrons, highly desirable. The use of $Z^0$ decays to a pair of muons, where one of the muons emits a Bremsstrahlung photon, can be such a standard candle. These events, which can be cleanly selected, are a source of high-purity, relatively high-pT photons. Their kinematics are well-constrained by the $Z^0$ boson mass and the precision on the muon ...

  13. Colombeau's generalized functions and non-standard analysis

    International Nuclear Information System (INIS)

    Todorov, T.D.

    1987-10-01

Using some methods of the Non-Standard Analysis we modify one of Colombeau's classes of generalized functions. As a result we define a class ε̂ of the so-called meta-functions which possesses all good properties of Colombeau's generalized functions, i.e. (i) ε̂ is an associative and commutative algebra over the system of the so-called complex meta-numbers Ĉ; (ii) every meta-function has partial derivatives of any order (which are meta-functions again); (iii) every meta-function is integrable on any compact set of Rⁿ and the integral is a number from Ĉ; (iv) ε̂ contains all tempered distributions S', i.e. S' is contained in ε̂ isomorphically with respect to all linear operations (including the differentiation). Thus, within the class ε̂ the problem of multiplication of the tempered distributions is satisfactorily solved (every two distributions in S' have a well-defined product in ε̂). The crucial point is that Ĉ is a field, in contrast to the system of Colombeau's generalized numbers C̄, which is a ring only (C̄ is the counterpart of Ĉ in Colombeau's theory). In this way we simplify and improve slightly the properties of the integral and the notion of ''values of the meta-functions'', as well as the properties of the whole class ε̂ itself, if compared with the original Colombeau theory. And, what is maybe more important, we clarify the connection between the Non-Standard Analysis and Colombeau's theory of new generalized functions, in the framework of which the problem of multiplication of distributions was recently solved. (author). 14 refs

  14. Cartographic standards to improve maps produced by the Forest Inventory and Analysis program

    Science.gov (United States)

    Charles H. (Hobie) Perry; Mark D. Nelson

    2009-01-01

    The Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis (FIA) program is incorporating an increasing number of cartographic products in reports, publications, and presentations. To create greater quality and consistency within the national FIA program, a Geospatial Standards team developed cartographic design standards for FIA map...

  15. National Green Building Standard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    none,

    2012-07-01

    DOE's Building America Program is a research and development program to improve the energy performance of new and existing homes. The ultimate goal of the Building America Program is to achieve examples of cost-effective, energy efficient solutions for all U.S. climate zones. Periodic maintenance of an ANSI standard by review of the entire document and action to revise or reaffirm it on a schedule not to exceed five years is required by ANSI. In compliance, a consensus group has once again been formed and the National Green Building Standard is currently being reviewed to comply with the periodic maintenance requirement of an ANSI standard.

  16. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1995-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques. (author) 7 refs.; 5 tabs

  17. Methods for preparing comparative standards and field samples for neutron activation analysis of soil

    International Nuclear Information System (INIS)

    Glasgow, D.C.; Dyer, F.F.; Robinson, L.

    1994-01-01

    One of the more difficult problems associated with comparative neutron activation analysis (CNAA) is the preparation of standards which are tailor-made to the desired irradiation and counting conditions. Frequently, there simply is not a suitable standard available commercially, or the resulting gamma spectrum is convoluted with interferences. In a recent soil analysis project, the need arose for standards which contained about 35 elements. In response, a computer spreadsheet was developed to calculate the appropriate amount of each element so that the resulting gamma spectrum is relatively free of interferences. Incorporated in the program are options for calculating all of the irradiation and counting parameters including activity produced, necessary flux/bombardment time, counting time, and appropriate source-to-detector distance. The result is multi-element standards for CNAA which have optimal concentrations. The program retains ease of use without sacrificing capability. In addition to optimized standard production, a novel soil homogenization technique was developed which is a low cost, highly efficient alternative to commercially available homogenization systems. Comparative neutron activation analysis for large scale projects has been made easier through these advancements. This paper contains details of the design and function of the NAA spreadsheet and innovative sample handling techniques
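The irradiation quantities such a spreadsheet computes follow the standard activation equation, A = N·σ·φ·(1 − e^(−λt)). A rough Python sketch is below; the gold-comparator isotope data and the flux are illustrative assumptions, not values from the abstract.

```python
import math

# Back-of-envelope induced-activity estimate of the kind the spreadsheet
# automates; all numerical inputs below are illustrative assumptions.
N_A = 6.022e23  # Avogadro's number

def activity_bq(mass_g, molar_mass, abundance, sigma_barn, flux, t_irr_s, half_life_s):
    """Induced activity A = N * sigma * phi * (1 - exp(-lambda * t_irr))."""
    n_atoms = mass_g / molar_mass * N_A * abundance   # target nuclei
    lam = math.log(2) / half_life_s                   # decay constant, 1/s
    rate = n_atoms * sigma_barn * 1e-24 * flux        # reactions per second
    return rate * (1 - math.exp(-lam * t_irr_s))      # saturation factor

# e.g. 1 mg of gold (197Au -> 198Au: sigma ~98.7 b, T1/2 ~2.70 d)
# irradiated for 1 h in an assumed 1e13 n/cm^2/s thermal flux
a = activity_bq(1e-3, 197.0, 1.0, 98.7, 1e13, 3600.0, 2.70 * 86400.0)
```

Running the same calculation across every element of a candidate standard is what lets the spreadsheet flag compositions whose gamma lines would interfere, and size masses so no single activity dominates the spectrum.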

  18. Extending Differential Fault Analysis to Dynamic S-Box Advanced Encryption Standard Implementations

    Science.gov (United States)

    2014-09-18

…number. As a result, decryption is a different function which relies on a different key to efficiently undo the work of encryption. RSA is the most… EXTENDING DIFFERENTIAL FAULT ANALYSIS TO DYNAMIC S-BOX ADVANCED ENCRYPTION STANDARD IMPLEMENTATIONS. THESIS. Bradley M. Flamm, Civilian, AFIT-ENG-T-14-S… Presented to the Faculty, Department of Electrical and Computer Engineering, Graduate School of…

  19. Cost-Effectiveness Analysis of 1-Year Treatment with Golimumab/Standard Care and Standard Care Alone for Ulcerative Colitis in Poland.

    Directory of Open Access Journals (Sweden)

    Ewa Stawowczyk

Full Text Available The objective of this study was to assess the cost-effectiveness of induction and maintenance treatment up to 1 year of ulcerative colitis with golimumab/standard care and standard care alone in Poland. A Markov model was used to estimate the expected costs and effects of golimumab/standard care and standard care alone. For each treatment option the costs and quality-adjusted life years were calculated to estimate the incremental cost-utility ratio. The analysis was performed from the perspective of the Polish public payer and society over a 30-year time horizon. The clinical parameters were derived mainly from the PURSUIT-SC and PURSUIT-M clinical trials. Different direct and indirect costs and utility values were assigned to the various model health states. The treatment of ulcerative colitis patients with golimumab/standard care instead of standard care alone resulted in 0.122 additional years of life with full health. The treatment with golimumab/standard care was found to be more expensive than treatment with standard care alone from both the public payer and the social perspective. The incremental cost-utility ratio of golimumab/standard care compared to standard care alone is estimated at 391,252 PLN/QALY gained (93,155 €/QALYG) from the public payer perspective and 374,377 PLN/QALY gained (89,137 €/QALYG) from the social perspective. The biologic treatment of ulcerative colitis patients with golimumab/standard care is more effective but also more costly compared with standard care alone.
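The incremental cost-utility ratio reported above is simply incremental cost divided by incremental effect. Using the abstract's own numbers (0.122 QALYs gained; 391,252 PLN/QALY from the public payer perspective), the implied incremental cost can be backed out; that cost figure is a derived quantity, not one the abstract reports.

```python
# ICER = incremental cost / incremental QALYs
def icer(delta_cost, delta_qaly):
    return delta_cost / delta_qaly

delta_qaly = 0.122          # QALYs gained, from the abstract
reported_icer = 391_252     # PLN/QALY, public payer perspective

# Implied incremental cost of golimumab/standard care vs standard care
# alone: ~47,733 PLN (derived, not stated in the abstract).
implied_delta_cost = reported_icer * delta_qaly
```

A decision maker then compares the ICER against a willingness-to-pay threshold per QALY to decide whether the added cost is justified.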

  20. Domestic energy-use pattern by the households: A comparison between rural and semi-urban areas of Noakhali in Bangladesh

    International Nuclear Information System (INIS)

    Miah, Md.Danesh; Foysal, Muhammad Abul; Koike, Masao; Kobayashi, Hajime

    2011-01-01

An explorative survey was carried out on rural and semi-urban households to find out the energy consumption pattern with respect to socio-demographic and geographic factors in Bangladesh, using a stratified random sampling technique. The study revealed that 100% of the households used biomass, 98% kerosene, 61% electricity, 23% LPG and 5% candles in the rural areas. In the semi-urban areas, 100% of the households used electricity, candles and natural gas, 60% kerosene and 13% petrol. Households' mean expenditure for total energy was US$ 5.34 (SE, 0.43) with total income US$ 209.84 (SE, 6.69) month⁻¹ in the rural areas, while it was US$ 6.20 (SE, 1.35) in the semi-urban areas with total income US$ 427.76 (SE, 24.19) month⁻¹. This study may be useful baseline information for energy policy makers in Bangladesh. - Highlights: → The study provides an empirical analysis of household energy consumption. → Rural households are dominated by biomass energy. → Semi-urban households are dominated by standard commercial energy (natural gas and electricity). → Monthly income, dwelling status and literacy of the households clearly influence energy use. → The major energy use in the rural households is for cooking purposes.

  1. Domestic energy-use pattern by the households: A comparison between rural and semi-urban areas of Noakhali in Bangladesh

    Energy Technology Data Exchange (ETDEWEB)

    Miah, Md.Danesh, E-mail: danesh@cu.ac.bd [Institute of Forestry and Environmental Sciences, University of Chittagong, 4331 Chittagong (Bangladesh); Forest Policy Laboratory, Shinshu University, 8304 Minamiminowa-Mura, Kami Ina Gun, 399-4598 Nagano-ken (Japan); Foysal, Muhammad Abul [Institute of Forestry and Environmental Sciences, University of Chittagong, 4331 Chittagong (Bangladesh); Koike, Masao [Forest Policy Laboratory, Shinshu University, 8304 Minamiminowa-Mura, Kami Ina Gun, 399-4598 Nagano-ken (Japan); Kobayashi, Hajime [Laboratory of Forest Environment and Ecology, Faculty of Agriculture, Shinshu University, 8304 Minamiminowa-Mura, Kami Ina Gun, 399-4598 Nagano-ken (Japan)

    2011-06-15

An explorative survey was carried out on rural and semi-urban households to find out the energy consumption pattern with respect to socio-demographic and geographic factors in Bangladesh, using a stratified random sampling technique. The study revealed that 100% of the households used biomass, 98% kerosene, 61% electricity, 23% LPG and 5% candles in the rural areas. In the semi-urban areas, 100% of the households used electricity, candles and natural gas, 60% kerosene and 13% petrol. Households' mean expenditure for total energy was US$ 5.34 (SE, 0.43) with total income US$ 209.84 (SE, 6.69) month⁻¹ in the rural areas, while it was US$ 6.20 (SE, 1.35) in the semi-urban areas with total income US$ 427.76 (SE, 24.19) month⁻¹. This study may be useful baseline information for energy policy makers in Bangladesh. - Highlights: → The study provides an empirical analysis of household energy consumption. → Rural households are dominated by biomass energy. → Semi-urban households are dominated by standard commercial energy (natural gas and electricity). → Monthly income, dwelling status and literacy of the households clearly influence energy use. → The major energy use in the rural households is for cooking purposes.

  2. Neutron activation analysis for certification of standard reference materials

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Perez Zayas, G.; Hernandez Rivero, A.; Ribeiro Guevara, S.

    1996-01-01

Neutron activation analysis is used extensively as one of the analytical techniques in the certification of standard reference materials. Characteristics of neutron activation analysis which make it valuable in this role are: accuracy, multielemental capability, the ability to assess homogeneity, high sensitivity for many elements, and an essentially non-destructive character. This paper reports the concentrations of 30 elements (major, minor and trace elements) in four Cuban samples. The samples were irradiated in a thermal neutron flux of 10¹²-10¹³ n·cm⁻²·s⁻¹. The gamma-ray spectra were measured by HPGe detectors and were analyzed using the ACTAN program, developed at the Center of Applied Studies for Nuclear Development.

  3. Historical floods in flood frequency analysis: Is this game worth the candle?

    Science.gov (United States)

    Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa

    2017-11-01

In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest historical pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of floods. This study is focused on the possible theoretical maximum gain in the accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or the two largest (XM1 and XM2) flood peak flows in a historical M-year long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with a 100-year return period. The results of the research indicate that the maximal profit of the inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of the reconstruction of historical hydrological information.
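The core of the Monte Carlo setup described above can be sketched in a few lines: N years of systematic record, plus the single largest flood XM1 from an M-year historical window, fitted jointly by maximum likelihood. The sketch below assumes a Gumbel parent and uses the standard likelihood term for the maximum of M draws, M·f(x)·F(x)^(M−1); all numeric values are illustrative assumptions, not the paper's experiment design.

```python
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import minimize

# Synthetic data: 50-year systematic record plus the largest flood of a
# 150-year historical window, drawn from an assumed Gumbel parent.
rng = np.random.default_rng(42)
N, M = 50, 150
true_loc, true_scale = 100.0, 30.0
systematic = gumbel_r.rvs(true_loc, true_scale, size=N, random_state=rng)
xm1 = gumbel_r.rvs(true_loc, true_scale, size=M, random_state=rng).max()

def neg_loglik(params):
    loc, scale = params
    if scale <= 0:
        return np.inf
    ll = gumbel_r.logpdf(systematic, loc, scale).sum()
    # density of the largest of M i.i.d. draws: M * f(x) * F(x)**(M - 1)
    ll += (np.log(M) + gumbel_r.logpdf(xm1, loc, scale)
           + (M - 1) * gumbel_r.logcdf(xm1, loc, scale))
    return -ll

res = minimize(neg_loglik, x0=[systematic.mean(), systematic.std()],
               method="Nelder-Mead")
loc_hat, scale_hat = res.x
q100 = gumbel_r.ppf(1 - 1 / 100, loc_hat, scale_hat)  # 100-year quantile
```

Repeating this fit over many synthetic samples, with and without the historical term, yields the bias and RMSE comparisons of the 100-year quantile that the study reports.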

  4. Precision analysis for standard deviation measurements of immobile single fluorescent molecule images.

    Science.gov (United States)

    DeSantis, Michael C; DeCenzo, Shawn H; Li, Je-Luen; Wang, Y M

    2010-03-29

Standard deviation measurements of intensity profiles of stationary single fluorescent molecules are useful for studying axial localization, molecular orientation, and a fluorescence imaging system's spatial resolution. Here we report on the analysis of the precision of standard deviation measurements of intensity profiles of single fluorescent molecules imaged using an EMCCD camera. We have developed an analytical expression for the standard deviation measurement error of a single image, which is a function of the total number of detected photons, the background photon noise, and the camera pixel size. The theoretical results agree well with the experimental, simulation, and numerical integration results. Using this expression, we show that single-molecule standard deviation measurements offer nanometer precision for a large range of experimental parameters.
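The precision question above can be probed with a simple simulation of the kind the authors validate against: pixelate a Gaussian spot, add Poisson photon noise and background, estimate the profile's standard deviation from the weighted second moment, and repeat. This is a generic sketch under assumed imaging parameters, not the paper's analytical expression.

```python
import numpy as np

# Monte Carlo sketch: spread of std-dev estimates of a noisy Gaussian spot.
rng = np.random.default_rng(0)
sigma = 2.0        # true PSF standard deviation, in pixels (assumed)
n_photons = 2000   # detected signal photons (assumed)
bg = 5.0           # mean background photons per pixel (assumed)

x = np.arange(-6, 7)                   # 13x13 pixel window (~+/-3 sigma)
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
expected = n_photons * psf / psf.sum() + bg   # mean counts per pixel

def one_std_estimate():
    img = rng.poisson(expected) - bg          # background-subtracted frame
    w_sum = img.sum()
    mx = (img * xx).sum() / w_sum             # intensity-weighted centroid
    return np.sqrt((img * (xx - mx) ** 2).sum() / w_sum)  # weighted std in x

estimates = np.array([one_std_estimate() for _ in range(500)])
mean_std, precision = estimates.mean(), estimates.std()
```

The spread `precision` of the repeated estimates is the quantity the paper's analytical expression predicts as a function of photon count, background noise, and pixel size.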

  5. The Nature of Science and the "Next Generation Science Standards": Analysis and Critique

    Science.gov (United States)

    McComas, William F.; Nouri, Noushin

    2016-01-01

    This paper provides a detailed analysis of the inclusion of aspects of nature of science (NOS) in the "Next Generation Science Standards" (NGSS). In this new standards document, NOS elements in eight categories are discussed in Appendix H along with illustrative statements (called exemplars). Many, but not all, of these exemplars are…

  6. Methodological Choices in the Content Analysis of Textbooks for Measuring Alignment with Standards

    Science.gov (United States)

    Polikoff, Morgan S.; Zhou, Nan; Campbell, Shauna E.

    2015-01-01

    With the recent adoption of the Common Core standards in many states, there is a need for quality information about textbook alignment to standards. While there are many existing content analysis procedures, these generally have little, if any, validity or reliability evidence. One exception is the Surveys of Enacted Curriculum (SEC), which has…

  7. Towards an integral analysis of stereotypes: Challenging the standard guarantee of impartiality

    Directory of Open Access Journals (Sweden)

    Laura Clérico

    2018-05-01

Full Text Available The use of stereotypes to the detriment of a discriminated group should imply questioning the standard guarantee of judicial impartiality. Even where the analysis of stereotypes is taken seriously, the depth of its consequences has not been sufficiently taken into account in constitutional and human rights argumentation, because it has not seriously challenged the standard way in which the guarantee of impartiality is conceived and applied in legal practice, even by the regional court for the protection of human rights. I will defend two central theses: (1) the use of stereotypes necessarily impacts the analysis of the guarantee of impartiality, and (2) impartiality must get rid of the presumption of impartiality.

  8. Experimental verification for standard analysis procedure of 241Am in food

    International Nuclear Information System (INIS)

    Liu Qingfen; Zhu Hongda; Liu Shutian; Pan Jingshun; Yang Dating

    2005-01-01

Objective: To briefly describe the experimental verification of 'determination of 241 Am in food'. Methods: The overall recovery, the MDL of the method and a decontamination experiment were determined following the standard analysis procedure. Results: The overall recovery is 76.26 ± 4.1%. The MDL is 3.4 × 10⁻⁵ Bq/g ash; the decontamination factor is higher than 10³ for Po, 10² for U, Th, Pu and 60 for 237 Np. Conclusion: The results showed that the overall recovery is high and reliable, and the MDL of the method is sufficient for checking 241 Am against the limit values in foods. The decontamination factors obtained with the recommended procedure can meet the needs of 241 Am analysis in food examination. Verification results for the procedure, obtained using a 243 Am spike and a 241 Am standard reference material, are satisfactory. (authors)

  9. An approach to standardization of urine sediment analysis via suggestion of a common manual protocol.

    Science.gov (United States)

    Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki

    2016-01-01

The results of urine sediment analysis have been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min, with an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to report the results quantitatively. This protocol may provide a means for the standardization of urine sediment analysis.
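The conversion factor mentioned above ties a manual count per high-power field (HPF) back to elements per microliter of native urine via the examined volume and the concentration factor. A minimal sketch follows; the HPF area and chamber depth are assumed example values, not the protocol's published figures.

```python
# Converting counts/HPF to elements per uL of uncentrifuged urine.
concentration_factor = 30     # ~30x, from the protocol described above
field_area_mm2 = 0.196        # assumed area of one high-power field
chamber_depth_mm = 0.1        # assumed counting-chamber depth
hpf_volume_ul = field_area_mm2 * chamber_depth_mm  # 1 mm^3 == 1 uL

def cells_per_ul(count_per_hpf):
    # divide by the concentration factor to refer back to native urine
    return count_per_hpf / (hpf_volume_ul * concentration_factor)
```

Comparing a factor derived this way against an analyzer's built-in default is the kind of check the protocol comparison above performs.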

  10. A Comparative Analysis of Three Proposed Federal Renewable Electricity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Patrick [National Renewable Energy Lab. (NREL), Golden, CO (United States); Logan, Jeffrey [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Short, Walter [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2009-05-01

    This paper analyzes potential impacts of proposed national renewable electricity standard (RES) legislation. An RES is a mandate requiring certain electricity retailers to provide a minimum share of their electricity sales from qualifying renewable power generation. The analysis focuses on draft bills introduced individually by Senator Jeff Bingaman and Representative Edward Markey, and jointly by Representative Henry Waxman and Markey. The analysis uses NREL's Regional Energy Deployment System (ReEDS) model to evaluate the impacts of the proposed RES requirements on the U.S. energy sector in four scenarios.

  11. The Phebus FP thermal-hydraulic analysis with Melcor

    Energy Technology Data Exchange (ETDEWEB)

    Akgane, Kikuo; Kiso, Yoshihiro [Nuclear Power Engineering Corporation, Tokyo (Japan); Fukahori, Takanori [Hitachi Engineering Company, Ltd., Hitachi-shi Ibaraki-ken (Japan); Yoshino, Mamoru [Nuclear Engineering Ltd., Tosabori Nishi-ku (Japan)

    1995-09-01

The severe accident analysis code MELCOR, version 1.8.2, has been applied for thermal-hydraulic pre-test analysis of the first test of the Phebus FP program (test FPT-0) to study the best test parameters and the applicability of the code. The Phebus FP program is an in-pile test program which has been planned by the French Commissariat à l'Énergie Atomique and the Commission of the European Union. The experiments are being conducted by an international collaboration to study the release and transport of fission products (FPs) under conditions assumed to be the most representative of those that would occur in a severe accident. The Phebus FP test apparatus simulates a test bundle of an in-pile section, the circuit including the steam generator U-tubes, and the containment. The FPT-0 test was designed to simulate the heat-up and subsequent fuel bundle degradation after a loss-of-coolant severe accident, using fresh fuel. Two options for fuel degradation models in MELCOR have been applied to fuel degradation behavior. The first model assumes that fuel debris will be formed immediately after the fuel support fails by cladding relocation due to the candling process. The other is the uncollapsed bare fuel pellets option, in which the fuel pellets remain standing in a columnar shape until the fuel reaches its melting point, even if the cladding has been relocated by candling. The thermal-hydraulic behaviors in the circuit and containment of Phebus FP are discussed herein. Flow velocities in the Phebus FP circuit are high in order to produce turbulent flow in a small-diameter test pipe. The MELCOR calculation has shown that the lengths of the hot leg and steam generator are adequate to attain steam temperatures of 700°C and 150°C at the respective outlets. The containment atmosphere temperature and humidity derived by a once-through integral system calculation show that the objective test conditions would be satisfied in the Phebus FP experiment.

  12. The Phebus FP thermal-hydraulic analysis with Melcor

    International Nuclear Information System (INIS)

    Akgane, Kikuo; Kiso, Yoshihiro; Fukahori, Takanori; Yoshino, Mamoru

    1995-01-01

    The severe accident analysis code MELCOR, version 1.8.2, has been applied for thermal-hydraulic pre-test analysis of the first test of the Phebus FP program (test FPT-0) to study the best test parameters and the applicability of the code. The Phebus FP program is an in-pile test program planned by the French Commissariat à l'Énergie Atomique and the Commission of the European Union. The experiments are being conducted by an international collaboration to study the release and transport of fission products (FPs) under conditions assumed to be the most representative of those that would occur in a severe accident. The Phebus FP test apparatus simulates a test bundle of an in-pile section, the circuit including the steam generator U-tubes, and the containment. The FPT-0 test was designed to simulate the heat-up and subsequent fuel bundle degradation after a loss-of-coolant severe accident, using fresh fuel. Two options for the fuel degradation models in MELCOR have been applied to fuel degradation behavior. The first model assumes that fuel debris is formed immediately after the fuel support fails by cladding relocation due to the candling process. The other is the uncollapsed bare fuel pellets option, in which the fuel pellets remain standing in a columnar shape until the fuel reaches its melting point, even if the cladding has been relocated by candling. The thermal-hydraulic behaviors in the circuit and containment of Phebus FP are discussed herein. Flow velocities in the Phebus FP circuit are high in order to produce turbulent flow in a small-diameter test pipe. The MELCOR calculation has shown that the lengths of the hot leg and steam generator are adequate to attain steam temperatures of 700 °C and 150 °C at the respective outlets. The containment atmosphere temperature and humidity derived by a once-through integral system calculation show that the objective test conditions would be satisfied in the Phebus FP experiment.

  13. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA)

    DEFF Research Database (Denmark)

    Muharemovic, O; Troelsen, A; Thomsen, M G

    2018-01-01

    INTRODUCTION: Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether implementation of personalized RSA patient protocols could increase image quality and decrease examination time...... imaging. Radiographers in the control group used a standard RSA protocol. RESULTS: At three months, radiographers in the case group significantly reduced (p .... No significant improvements were found in the control group at any time point. CONCLUSION: There is strong evidence that personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personal patient protocols as a RSA standard will contribute...

  14. A standards-based method for compositional analysis by energy dispersive X-ray spectrometry using multivariate statistical analysis: application to multicomponent alloys.

    Science.gov (United States)

    Rathi, Monika; Ahrenkiel, S P; Carapella, J J; Wanlass, M W

    2013-02-01

    Given an unknown multicomponent alloy, and a set of standard compounds or alloys of known composition, can one improve upon popular standards-based methods for energy dispersive X-ray (EDX) spectrometry to quantify the elemental composition of the unknown specimen? A method is presented here for determining elemental composition of alloys using transmission electron microscopy-based EDX with appropriate standards. The method begins with a discrete set of related reference standards of known composition, applies multivariate statistical analysis to those spectra, and evaluates the compositions with a linear matrix algebra method to relate the spectra to elemental composition. By using associated standards, only limited assumptions about the physical origins of the EDX spectra are needed. Spectral absorption corrections can be performed by providing an estimate of the foil thickness of one or more reference standards. The technique was applied to III-V multicomponent alloy thin films: composition and foil thickness were determined for various III-V alloys. The results were then validated by comparing with X-ray diffraction and photoluminescence analysis, demonstrating accuracy of approximately 1% in atomic fraction.
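    The linear-algebra core of such a standards-based approach can be sketched as an inverse problem: stack reference spectra of known composition, fit a linear map from spectral features to composition by least squares, and apply it to the unknown. The data below (reduced here to four characteristic-line intensities) are entirely hypothetical, not from the paper:

```python
import numpy as np

# Hypothetical reference set: rows are normalized EDX spectra reduced to
# four characteristic-line intensities, with known atomic fractions.
ref_spectra = np.array([
    [0.90, 0.05, 0.03, 0.02],
    [0.10, 0.80, 0.05, 0.05],
    [0.05, 0.10, 0.70, 0.15],
])
ref_comp = np.array([
    [0.50, 0.50, 0.00],
    [0.25, 0.25, 0.50],
    [0.10, 0.40, 0.50],
])

# Least-squares linear map M such that spectrum @ M ~= composition.
M, *_ = np.linalg.lstsq(ref_spectra, ref_comp, rcond=None)

def composition(spectrum):
    """Estimate atomic fractions of an unknown from its spectrum,
    renormalized so the fractions sum to one."""
    c = np.asarray(spectrum) @ M
    c = np.clip(c, 0.0, None)
    return c / c.sum()

# Sanity check: a reference spectrum should recover its own composition.
est = composition(ref_spectra[0])
```

    In practice the map would be built on multivariate-statistics scores rather than raw channels, and absorption corrections applied first, as the abstract describes.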

  15. Role of "standard" fine-group cross-section libraries in shielding analysis

    International Nuclear Information System (INIS)

    Weisbin, C.R.; Roussin, R.W.; Oblow, E.M.; Cullen, D.E.; White, J.E.; Wright, R.Q.

    1977-01-01

    The Divisions of Magnetic Fusion Energy (DMFE) and Reactor Development and Demonstration (DRDD) of the United States Energy Research and Development Administration (ERDA) have jointly sponsored the development of a 171-neutron-group, 36-gamma-ray-group, pseudo-composition-independent cross-section library based upon ENDF/B-IV. This library (named VITAMIN-C and packaged by RSIC as DLC-41) is intended to be generally applicable to fusion blanket and LMFBR core and shield analysis. The purpose of this paper is to evaluate this library as a possible candidate for specific designation as a "standard" in light of American Nuclear Society standards for fine-group cross-section data sets. The rationale and qualification procedure for such a standard are discussed. Finally, current limitations and anticipated extensions to this processed data file are described.

  16. Standard model for safety analysis report of fuel fabrication plants

    International Nuclear Information System (INIS)

    1980-09-01

    A standard model for a safety analysis report of fuel fabrication plants is established. The model specifies the presentation format, the origin, and the level of detail of the minimum information required by CNEN (Comissão Nacional de Energia Nuclear) for evaluating requests for construction permits and operation licenses made under the legislation in force. (E.G.) [pt

  17. Standard model for safety analysis report of fuel reprocessing plants

    International Nuclear Information System (INIS)

    1979-12-01

    A standard model for a safety analysis report of fuel reprocessing plants is established. The model specifies the presentation format, the origin, and the level of detail of the minimum information required by CNEN (Comissão Nacional de Energia Nuclear) for evaluating requests for construction permits and operation licenses made under the legislation in force. (E.G.) [pt

  18. Local properties of analytic functions and non-standard analysis

    International Nuclear Information System (INIS)

    O'Brian, N.R.

    1976-01-01

    This is an expository account which shows how the methods of non-standard analysis can be applied to prove the Nullstellensatz for germs of analytic functions. This method of proof was discovered originally by Abraham Robinson. The necessary concepts from model theory are described in some detail and the Nullstellensatz is proved by investigating the relation between the set of infinitesimal elements in the complex n-plane and the spectrum of the ring of germs of analytic functions. (author)

  19. Standardization of Image Quality Analysis – ISO 19264

    DEFF Research Database (Denmark)

    Wüller, Dietmar; Kejser, Ulla Bøgvad

    2016-01-01

    There are a variety of image quality analysis tools available for the archiving world, which are based on different test charts and analysis algorithms. ISO has formed a working group in 2012 to harmonize these approaches and create a standard way of analyzing the image quality for archiving...... systems. This has resulted in three documents that have been or are going to be published soon. ISO 19262 defines the terms used in the area of image capture to unify the language. ISO 19263 describes the workflow issues and provides detailed information on how the measurements are done. Last...... but not least ISO 19264 describes the measurements in detail and provides aims and tolerance levels for the different aspects. This paper will present the new ISO 19264 technical specification to analyze image quality based on a single capture of a multi-pattern test chart, and discuss the reasoning behind its...

  20. Integrated Data Collection Analysis (IDCA) Program - Statistical Analysis of RDX Standard Data Sets

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Air Force Research Lab. (AFRL), Tyndall AFB, FL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-10-30

    The Integrated Data Collection Analysis (IDCA) program is conducting a Proficiency Test for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are statistical analyses of the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard. The material was tested as a well-characterized standard several times during the proficiency study to assess differences among participants and the range of results that may arise for well-behaved explosive materials. The analyses show that there are detectable differences among the results from IDCA participants. While these differences are statistically significant, most of them can be disregarded for comparison purposes to assess potential variability when laboratories attempt to measure identical samples using methods assumed to be nominally the same. The results presented in this report include the average sensitivity results for the IDCA participants and the ranges of values obtained. The ranges represent variation about the mean values of the tests of between 26% and 42%. The magnitude of this variation is attributed to differences in operator, method, and environment as well as the use of different instruments that are also of varying age. The results appear to be a good representation of the broader safety testing community based on the range of methods, instruments, and environments included in the IDCA Proficiency Test.
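    The "variation about the mean" quoted above is the range of inter-laboratory results expressed as a percentage of their mean; a minimal sketch with hypothetical impact-sensitivity numbers:

```python
def variation_about_mean(results):
    """Range of inter-laboratory results as a percentage of their mean."""
    mean = sum(results) / len(results)
    return 100.0 * (max(results) - min(results)) / mean

# Hypothetical impact-sensitivity results (drop height, cm) from five labs.
labs = [17.0, 19.5, 21.0, 22.5, 24.0]
spread = variation_about_mean(labs)  # ~33.7 %, inside the 26-42 % band
```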

  1. [Ecological compensation standard in Dongting Lake region of returning cropland to lake based on emergy analysis].

    Science.gov (United States)

    Mao, De-Hua; Hu, Guang-Wei; Liu, Hui-Jie; Li, Zheng-Zui; Li, Zhi-Long; Tan, Zi-Fang

    2014-02-01

    The annual emergy and currency value of the main ecological services of returning cropland to lake in the Dongting Lake region from 1999 to 2010 were calculated based on emergy analysis. A calculation method for the ecological compensation standard was established by computing the annual total emergy of the increment in ecological service functions since the starting year of returning cropland to lake, and the annual ecological compensation standard and compensation area from 1999 to 2010 were analyzed. The results indicated that the ecological compensation standard from 1999 to 2010 was 40.31-86.48 yuan·m⁻², with a mean of 57.33 yuan·m⁻². The ecological compensation standard increased year by year owing to the eco-recovery effect of returning cropland to lake. After 2005 it showed swift and steady growth, mainly due to the intensive economic development of Hunan Province, suggesting that the value of natural ecological resources increases along with the development of society and the economy. Applying emergy analysis to the ecological compensation standard can reveal the dynamics of the annual compensation standard, bridge matter flows, energy flows and economic flows, and overcome the subjectivity and arbitrariness of environmental-economics methods. This empirical study of the ecological compensation standard in the Dongting Lake region showed that emergy analysis is feasible and has advantages.

  2. Standardization of Cassia spectabilis with Respect to Authenticity, Assay and Chemical Constituent Analysis

    Directory of Open Access Journals (Sweden)

    Angeline Torey

    2010-05-01

    Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the Broth Dilution Method. The extracts showed a MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GC-MS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R² = 0.8685), rugged and robust. Hence this method was suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian Traditional Medicinal plants such as C. spectabilis.

  3. Standardization of Cassia spectabilis with respect to authenticity, assay and chemical constituent analysis.

    Science.gov (United States)

    Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga

    2010-05-10

    Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the Broth Dilution Method. The extracts showed a MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GC-MS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R² = 0.8685), rugged and robust. Hence this method was suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian Traditional Medicinal plants such as C. spectabilis.
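    The broth-dilution readout described above reduces to finding the lowest concentration in a twofold dilution series that shows no visible growth; a minimal sketch, with a hypothetical growth pattern chosen to reproduce the reported 6.25 mg/mL value:

```python
def mic(dilutions_mg_per_ml, growth_observed):
    """Lowest tested concentration with no visible growth, or None.

    dilutions_mg_per_ml -- concentrations tested, in any order
    growth_observed     -- parallel booleans, True if the well grew
    """
    inhibitory = [c for c, grew in zip(dilutions_mg_per_ml, growth_observed)
                  if not grew]
    return min(inhibitory) if inhibitory else None

# Hypothetical twofold series from 50 mg/mL down; growth first appears
# below 6.25 mg/mL, so the MIC reads as 6.25 mg/mL.
series = [50.0, 25.0, 12.5, 6.25, 3.125, 1.5625]
growth = [False, False, False, False, True, True]
result = mic(series, growth)  # 6.25
```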

  4. Sustainable development induction in organizations: a convergence analysis of ISO standards management tools' parameters.

    Science.gov (United States)

    Merlin, Fabrício Kurman; Pereira, Vera Lúcia Duarte do Valle; Pacheco, Waldemar

    2012-01-01

    Organizations are part of an environment in which they are pressured to meet society's demands and to act in a sustainable way. In an attempt to meet such demands, organizations make use of various management tools, among which are ISO standards. Although there is evidence of the contributions provided by these standards, it is questionable whether their parameters converge toward a possible induction of sustainable development in organizations. This work presents a theoretical study, designed from a structuralist world view with a descriptive, deductive method, which aims to analyze the convergence of the management tool parameters in ISO standards. To support the analysis, a generic framework for possible convergence was developed, based on a systems approach, linking five ISO standards (ISO 9001, ISO 14001, OHSAS 18001, ISO 31000 and ISO 26000) to sustainable development and positioning them according to organizational level (strategic, tactical and operational). The framework was designed around the Brundtland report's concept of sustainable development, and the analysis explored it using the Nadler and Tushman model. The results suggest the standards can contribute to a possible induction of sustainable development in organizations, as long as they meet certain minimum conditions related to their strategic alignment.

  5. Melting Metal on a Playing Card

    Science.gov (United States)

    Greenslade, Thomas B., Jr.

    2016-01-01

    Many of us are familiar with the demonstration of boiling water in a paper cup held over a candle or a Bunsen burner; the ignition temperature of paper is above the temperature of 100°C at which water boils under standard conditions. A more dramatic demonstration is melting tin held in a playing card. This illustration is from Tissandier's book on…

  6. Analysis of Evidence Supporting the Educational Leadership Constituent Council 2011 Educational Leadership Program Standards

    Science.gov (United States)

    Tucker, Pamela D.; Anderson, Erin; Reynolds, Amy L.; Mawhinney, Hanne

    2016-01-01

    This document analysis provides a summary of the research from high-impact journals published between 2008 and 2013 with the explicit purpose of determining the extent to which the current empirical evidence supports the individual 2011 Educational Leadership Constituent Council Program Standards and their elements. We found that the standards are…

  7. Carbon dioxide emission standards for U.S. power plants. An efficiency analysis perspective

    Energy Technology Data Exchange (ETDEWEB)

    Hampf, Benjamin [Technische Univ. Darmstadt (Germany). Fachbereich Rechts- und Wirtschaftswissenschaften; Roedseth, Kenneth Loevold [Institute of Transport Economics, Oslo (Norway). Dept. of Economics and Logistics

    2013-07-01

    On June 25, 2013, President Obama announced his plan to introduce carbon dioxide emission standards for electricity generation. This paper proposes an efficiency analysis approach that addresses which emission rates (and standards) would be feasible if the existing generating units adopted best practices. A new efficiency measure is introduced and further decomposed to identify different sources' contributions to emission rate improvements. Estimating two Data Envelopment Analysis (DEA) models - the well-known joint production model and the new materials balance model - on a dataset of 160 bituminous-coal-fired generating units, we find that the average generating unit's electricity-to-carbon dioxide ratio is 15.3 percent below the corresponding best-practice ratio. Further examination reveals that this discrepancy can largely be attributed to non-discretionary factors rather than to managerial inefficiency. Moreover, even if the best-practice ratios could be implemented, the generating units would not be able to comply with the EPA's recently proposed carbon dioxide standard.
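    An input-oriented, constant-returns DEA efficiency score of the general kind used in such studies can be computed as a small linear program. This is a generic CCR sketch with hypothetical single-input, single-output data, not a reconstruction of the authors' joint production or materials balance models:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input (fuel, TJ) and one output (net MWh)
# for four generating units.
fuel = np.array([10.0, 8.0, 12.0, 6.0])
mwh = np.array([900.0, 850.0, 950.0, 700.0])

def ccr_efficiency(k):
    """Input-oriented CCR efficiency of unit k (1.0 = on the frontier).

    Decision variables: [theta, lambda_1, ..., lambda_n]. Minimize theta
    subject to the envelopment constraints
      sum(lambda_j * fuel_j) <= theta * fuel_k
      sum(lambda_j * mwh_j)  >= mwh_k
    """
    n = len(fuel)
    c = np.concatenate(([1.0], np.zeros(n)))
    A_ub = np.vstack([
        np.concatenate(([-fuel[k]], fuel)),   # input constraint
        np.concatenate(([0.0], -mwh)),        # output constraint, flipped
    ])
    b_ub = np.array([0.0, -mwh[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return float(res.x[0])

scores = [ccr_efficiency(k) for k in range(len(fuel))]
```

    With one input and one output under constant returns, the score equals each unit's MWh-per-fuel ratio divided by the best ratio, so the frontier unit scores exactly 1.0.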

  8. Standards for deuterium analysis requirements of heavy water plants (Preprint No. CA-1)

    Energy Technology Data Exchange (ETDEWEB)

    Rathi, B N; Gopalakrishnan, V T; Alphonse, K P; Pawar, P L; Sadhukhan, H K [Bhabha Atomic Research Centre, Bombay (India). Heavy Water Div.

    1989-04-01

    Accurate analysis of deuterium, covering the entire range, is of great importance in the production of heavy water. Most of the methods for determination of deuterium in gas or liquid samples require appropriate standards. Since the density of pure protium oxide and pure deuterium oxide has been determined very accurately by a large number of workers, and the density of mixtures of H₂O and D₂O follows a linear relation, it is possible to use accurate density determination for measurement of deuterium content. The float method for density measurements was improved further and used for the preparation of primary heavy water standards in the high and low deuterium ranges. Heavy water plant laboratories require gas standards (ammonia synthesis gas matrix), in addition to low-deuterium water standards, for calibration of mass spectrometers. SLAP (Standard Light Antarctic Precipitation, D/(D+H) = 89.02 ± 0.05 ppm) and SMOW (Standard Mean Ocean Water, D/(D+H) = 155.76 ± 0.05 ppm), available from the IAEA, Vienna, along with water practically free from deuterium, were used as standards to prepare secondary liquid standards. These secondary standards were subsequently reduced and mixed with pure nitrogen to obtain D/(D+H) standards in a syngas matrix. (author). 8 refs., 2 figs.
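    The density-based measurement rests on the linear density-composition relation mentioned above, which can be inverted to read off the deuterium content. A minimal sketch, taking commonly quoted densities of H₂O and D₂O near 25 °C as assumed illustrative inputs, not certified values:

```python
# Assumed densities near 25 degC (g/cm^3); illustrative inputs only.
RHO_H2O = 0.99705
RHO_D2O = 1.10445

def deuterium_fraction(rho):
    """Mole fraction of D2O from mixture density, assuming the linear
    mixing relation described in the abstract."""
    return (rho - RHO_H2O) / (RHO_D2O - RHO_H2O)

x_mid = deuterium_fraction(1.05075)  # midpoint density -> 0.5
```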

  9. Dark-Energy Equation-of-State parameter for high redshifts

    International Nuclear Information System (INIS)

    Montiel, Ariadna; Breton, Nora

    2011-01-01

    Since the elucidation of the nature of dark energy depends strongly on redshift observations, it is desirable to measure them over a wider range, but supernovae cannot be detected beyond redshift 1.7. Gamma-ray bursts (GRBs) offer a means of extending the analysis to redshifts of at least 6, because GRBs are visible across much larger distances than supernovae. GRBs are now known to have several light-curve and spectral properties from which the luminosity of the burst can be calculated, so GRBs may become standard candles. We have used data from 69 GRBs to study the behavior of the dark-energy equation-of-state parameter as a function of redshift.
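    Fitting an equation-of-state parameter to standard-candle data requires the luminosity distance in a given cosmology; a numerical sketch for a flat universe with constant w, under assumed fiducial parameters (H0 = 70 km/s/Mpc, Ωm = 0.3):

```python
import math

C_KM_S = 299792.458  # speed of light, km/s

def luminosity_distance(z, h0=70.0, om=0.3, w=-1.0, steps=2000):
    """Luminosity distance (Mpc) in a flat universe with constant
    dark-energy equation of state w, by trapezoidal integration of
    1/E(z) with E(z) = sqrt(om (1+z)^3 + (1-om) (1+z)^(3(1+w)))."""
    def inv_e(zp):
        return 1.0 / math.sqrt(om * (1 + zp) ** 3
                               + (1 - om) * (1 + zp) ** (3 * (1 + w)))
    dz = z / steps
    integral = 0.5 * (inv_e(0.0) + inv_e(z))
    for i in range(1, steps):
        integral += inv_e(i * dz)
    integral *= dz
    return (1 + z) * (C_KM_S / h0) * integral

def distance_modulus(z, **cosmo):
    """mu = 5 log10(d_L/Mpc) + 25, the quantity compared with GRB data."""
    return 5.0 * math.log10(luminosity_distance(z, **cosmo)) + 25.0
```

    For the fiducial parameters, d_L at z = 1 comes out near 6600 Mpc; making w more negative increases the distance, which is what the fit to GRB distance moduli constrains.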

  10. Quantitative chemical analysis for the standardization of copaiba oil by high resolution gas chromatography

    International Nuclear Information System (INIS)

    Tappin, Marcelo R.R.; Pereira, Jislaine F.G.; Lima, Lucilene A.; Siani, Antonio C.; Mazzei, Jose L.; Ramos, Monica F.S.

    2004-01-01

    Quantitative GC-FID was evaluated for the analysis of methylated copaiba oils, using trans-(-)-caryophyllene or methyl copalate as external standards. Analytical curves showed good linearity and reproducibility in terms of correlation coefficients (0.9992 and 0.996, respectively) and relative standard deviation (< 3%). Quantification of sesquiterpenes and diterpenic acids was performed with each standard separately. When compared with integrator response normalization, the standardization was statistically similar in the case of methyl copalate, but the response of trans-(-)-caryophyllene was statistically different (P < 0.05). The method proved suitable for the classification and quality control of commercial samples of the oils. (author)
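    External standardization of this kind reduces to a linear calibration curve relating peak area to standard concentration; a minimal sketch with hypothetical GC-FID areas (the concentrations and areas are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical calibration: external-standard concentrations (mg/mL)
# versus GC-FID peak areas.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
area = np.array([120.0, 245.0, 480.0, 975.0, 1950.0])

# Least-squares line: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]   # correlation coefficient

def quantify(sample_area):
    """Concentration of an unknown from its peak area via the curve."""
    return (sample_area - intercept) / slope
```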

  11. FACT. Energy spectrum of the Crab Nebula

    Energy Technology Data Exchange (ETDEWEB)

    Temme, Fabian; Einecke, Sabrina; Buss, Jens [TU Dortmund, Experimental Physics 5, Otto-Hahn-Str.4, 44221 Dortmund (Germany); Collaboration: FACT-Collaboration

    2016-07-01

    The First G-APD Cherenkov Telescope (FACT) is the first Imaging Air Cherenkov Telescope to use silicon photon detectors (G-APDs, aka SiPMs) as photosensors. With more than four years of operation, FACT has proven that SiPMs are suitable for ground-based gamma-ray astronomy. Due to its stable flux at TeV energies, the Crab Nebula is treated as a "standard candle" in Cherenkov astronomy. The analysis of its energy spectrum and the comparison with other experiments allow the performance of FACT to be evaluated. A modern analysis chain, based on data-stream handling and multivariate analysis methods, was developed in close cooperation with the department of computer science at TU Dortmund. In this talk, this analysis chain and its application are presented. In addition, results measured with FACT, including the energy spectrum of the Crab Nebula, are shown.

  12. The Trends and Prospects of Health Information Standards : Standardization Analysis and Suggestions

    International Nuclear Information System (INIS)

    Kim, Chang Soo

    2008-01-01

    Ubiquitous health care, one of the developing solution technologies of IT, BT and NT, could give us new medical environments in the future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards: DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks, and has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations (SDOs) operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions; HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for the Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and the health industry. This paper offers suggestions, including a test bed, demonstration and specification, of how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.

  13. The Trends and Prospects of Health Information Standards : Standardization Analysis and Suggestions

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chang Soo [Dept. of Radiological Science, College of Health Science, Catholic University of Pusan, Pusan (Korea, Republic of)

    2008-03-15

    Ubiquitous health care, one of the developing solution technologies of IT, BT and NT, could give us new medical environments in the future. Implementing health information systems can be complex, expensive and frustrating. Healthcare professionals seeking to acquire or upgrade systems do not have a convenient, reliable way of specifying a level of adherence to communication standards sufficient to achieve truly efficient interoperability. Great progress has been made in establishing such standards: DICOM, IHE and HL7, notably, are now highly advanced. IHE has defined a common framework to deliver the basic interoperability needed for local and regional health information networks, and has developed a foundational set of standards-based integration profiles for information exchange with three interrelated efforts. HL7 is one of several ANSI-accredited Standards Developing Organizations (SDOs) operating in the healthcare arena. Most SDOs produce standards (protocols) for a particular healthcare domain such as pharmacy, medical devices, imaging or insurance transactions; HL7's domain is clinical and administrative data. HL7 is an international community of healthcare subject matter experts and information scientists collaborating to create standards for the exchange, management and integration of electronic healthcare information. The ASTM specification for the Continuity of Care Record was developed by subcommittee E31.28 on electronic health records, which includes clinicians, provider institutions, administrators, patient advocates, vendors, and the health industry. This paper offers suggestions, including a test bed, demonstration and specification, of how standards such as IHE, HL7 and ASTM can be used to provide an integrated environment.

  14. Usage of Latent Class Analysis in Diagnostic Microbiology in the Absence of Gold Standard Test

    Directory of Open Access Journals (Sweden)

    Gul Bayram Abiha

    2016-12-01

    The evaluation of the performance of diagnostic tests in the absence of a gold standard is an important problem. Latent class analysis (LCA) is a statistical method that has been known for many years and has found a wide field of application, especially in the evaluation of diagnostic tests when no gold standard exists. During the last decade, the LCA method has been widely used for determining the sensitivity and specificity of different microbiological tests. It has been investigated in the diagnosis of Mycobacterium tuberculosis, Mycobacterium bovis, human papillomavirus, Bordetella pertussis, influenza viruses, hepatitis E virus (HEV), hepatitis C virus (HCV) and various other viral infections. Researchers have used LCA to compare several diagnostic tests for different pathogens. We aimed to evaluate the performance of the latent class analysis method as used for microbiological diagnosis of various diseases in several studies. Taking all of these results into account, we conclude that LCA is a good statistical method for assessing the performance of different tests in the absence of a gold standard. [Archives Medical Review Journal 2016; 25(4): 467-488]
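    A two-class latent class model with conditional independence, of the kind discussed here, can be fitted by expectation-maximization. The sketch below simulates three imperfect binary tests so that the true sensitivities and specificities are known; all numbers are illustrative, and real analyses would also assess model fit and identifiability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: three imperfect binary tests on 1000 subjects whose
# true disease status is latent.
n, prev = 1000, 0.3
sens = np.array([0.95, 0.85, 0.80])   # true sensitivities
spec = np.array([0.90, 0.95, 0.85])   # true specificities
truth = rng.random(n) < prev
p_pos = np.where(truth[:, None], sens, 1.0 - spec)
X = (rng.random((n, 3)) < p_pos).astype(float)

# EM for a two-class latent class model (conditional independence).
pi = 0.5                    # estimated prevalence of class 1
p = np.empty((2, 3))        # P(test positive | class)
p[0], p[1] = 0.4, 0.6       # asymmetric start to break label symmetry
for _ in range(200):
    # E-step: posterior probability that each subject is in class 1.
    l1 = pi * np.prod(p[1] ** X * (1 - p[1]) ** (1 - X), axis=1)
    l0 = (1 - pi) * np.prod(p[0] ** X * (1 - p[0]) ** (1 - X), axis=1)
    w = l1 / (l1 + l0)
    # M-step: update prevalence and conditional positivity rates.
    pi = w.mean()
    p[1] = (w[:, None] * X).sum(axis=0) / w.sum()
    p[0] = ((1 - w)[:, None] * X).sum(axis=0) / (1 - w).sum()

est_sens = p[1]          # class 1 plays the "diseased" role
est_spec = 1.0 - p[0]
```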

  15. Joint Oil Analysis Program Spectrometer Standards SCP Science (Conostan) Qualification Report for D19-0, D3-100, and D12-XXX Series Standards

    Science.gov (United States)

    2015-05-20

    Joint Oil Analysis Program Spectrometer Standards: SCP Science (Conostan) Qualification Report for D19-0, D3-100, and D12-XXX Series Standards. Tabulated results include Candidate Type D19-0 ICP-AES results, Candidate Type D12-XXX physical property results, and Candidate Type D12-XXX Rotrode-AES results.

  16. Validation of the k0 standardization method in neutron activation analysis

    International Nuclear Information System (INIS)

    Kubesova, Marie

    2009-01-01

    The goal of this work was to validate the k0 standardization method of neutron activation analysis for use by the Nuclear Physics Institute's NAA Laboratory. The precision and accuracy of the method were examined using two types of reference materials: a set of synthetic materials served to check the implementation of k0 standardization, while matrix NIST SRMs covered various different matrices. In general, good agreement was obtained between the results of this work and the certified values, giving evidence of the accuracy of our results. In addition, limits were evaluated for 61 elements.
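    The comparator-based concentration relation at the heart of k0 standardization can be sketched schematically (Høgdahl convention, with the α correction to Q0 neglected for brevity; every parameter value an analyst would supply here is an assumption for illustration):

```python
def k0_concentration(asp_a, asp_m, k0, f, q0_a, q0_m, eff_a, eff_m):
    """Analyte concentration relative to a co-irradiated comparator
    (schematic k0 relation; alpha correction to Q0 omitted).

    asp_a, asp_m -- specific count rates of analyte and comparator
    k0           -- k0 factor of the analyte line vs. the comparator
    f            -- thermal-to-epithermal flux ratio of the channel
    q0_a, q0_m   -- resonance-integral-to-thermal cross-section ratios
    eff_a, eff_m -- full-energy peak detection efficiencies
    """
    return (asp_a / asp_m) / k0 * (f + q0_m) / (f + q0_a) * (eff_m / eff_a)

# Degenerate check: identical nuclear data and efficiencies reduce the
# relation to the ratio of specific count rates.
ratio_check = k0_concentration(2.0, 1.0, 1.0, 50.0, 5.0, 5.0, 0.1, 0.1)
```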

  17. Standard model for the safety analysis report of nuclear fuel reprocessing plants

    International Nuclear Information System (INIS)

    1980-02-01

    This norm establishes the Standard Model for the Safety Analysis Report of Nuclear Fuel Reprocessing Plants, comprising the presentation format and the level of detail of the minimum information required by CNEN for evaluating requests for a Construction License or an Operation Authorization, in accordance with the legislation in force. This regulation applies to the following basic reports: the Preliminary Safety Analysis Report (PSAR), an integral part of the Construction License request; and the Final Safety Analysis Report (FSAR), an integral part of the Operation Authorization request

  18. Analysis of approaches to classification of forms of non-standard employment

    Directory of Open Access Journals (Sweden)

    N. V. Dorokhova

    2017-01-01

    Full Text Available Non-standard forms of employment are currently becoming more widespread, yet there is no settled approach to defining non-standard employment or delimiting its content. The article analyses diverse interpretations of the concept, on the basis of which the author concludes that non-standard employment is a complex and contradictory economic category. Different approaches to classifying the forms of non-standard employment are examined. Its main forms include the flexible working year, the flexible working week, flexible working hours, remote work, on-call work, shift forwarding, agency employment, self-employment, underemployment, over-employment, employment on the basis of fixed-term contracts, employment based on civil-law contracts, one-time employment, casual employment, temporary employment, secondary employment and part-time work. The author proposes an approach to classifying non-standard forms of employment based on their impact on the development of human potential. For this purpose the following classification criteria are proposed: working conditions, wages and social guarantees, the possibility of workers' participation in management, personal development, and stability of employment. Depending on the value each of these criteria takes, a given form of non-standard employment can be classified as progressive or regressive. Such a classification of non-standard forms of employment should be the basis of the state policy of employment management.

  19. The k0-based neutron activation analysis: a mono standard to standardless approach of NAA

    International Nuclear Information System (INIS)

    Acharya, R.; Nair, A.G.C.; Sudarshan, K.; Goswami, A.; Reddy, A.V.R.

    2006-01-01

    The k0-based neutron activation analysis (k0-NAA) uses neutron flux parameters, detection efficiency and nuclear constants, namely k0 and Q0, for the determination of elemental concentrations. Gold (197Au), or any other element with suitable nuclear properties, is used as an external or internal single comparator. This article describes the principle of k0-NAA, the standardization of the method by characterization of reactor irradiation sites and calibration of detector efficiency, and its applications. The method was validated using CRMs obtained from the USGS, the IAEA and NIST. Applications of the method include samples such as gemstones (ruby, beryl and emerald), sediments, manganese nodules and encrustations, cereals, and medicinal and edible leaves. Recently, a k0-based internal mono-standard INAA (IM-NAA) method using the in-situ relative efficiency has been standardized by us for the analysis of small and large samples of different shapes and sizes. The method was applied to a new meteorite sample and to large wheat samples. Non-standard sizes and shapes of nuclear cladding materials, namely zircaloy 2 and 4, stainless steels (SS 316M and D9) and 1S aluminium, were analysed. Standard-less analysis of these cladding materials was possible by a mass-balance approach, since all the major and minor elements were amenable to NAA. (author)
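    For reference, the concentration equation underlying k0-NAA with a gold comparator can be written as follows; the notation here follows the standard k0 literature rather than this abstract, so treat the exact symbols as illustrative:

    ```latex
    \rho_a \;=\;
    \frac{A_{sp,a}}{A_{sp,\mathrm{Au}}}
    \cdot \frac{1}{k_{0,\mathrm{Au}}(a)}
    \cdot \frac{f + Q_{0,\mathrm{Au}}(\alpha)}{f + Q_{0,a}(\alpha)}
    \cdot \frac{\varepsilon_{p,\mathrm{Au}}}{\varepsilon_{p,a}}
    ```

    where ρ_a is the concentration of analyte a, A_sp the specific count rates of analyte and comparator, f the thermal-to-epithermal flux ratio, α the epithermal flux shape parameter, Q_0(α) the resonance-to-thermal cross-section ratio, and ε_p the full-energy peak detection efficiencies. This is why the abstract stresses characterization of the irradiation sites (f, α) and calibration of detector efficiency (ε_p).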

  20. Using SWE Standards for Ubiquitous Environmental Sensing: A Performance Analysis

    Directory of Open Access Journals (Sweden)

    Joaquín Huerta

    2012-08-01

    Full Text Available Although smartphone applications represent the most typical data-consumer tool from the citizen's perspective in environmental applications, they can also be used for in-situ data collection and production in varied scenarios, such as the geological sciences and biodiversity. The use of standard protocols, such as SWE, to exchange information between smartphones and sensor infrastructures brings benefits such as interoperability and scalability, but their reliance on XML is a potential problem when large volumes of data are transferred, due to the limited bandwidth and processing capabilities of mobile phones. In this article we present a performance analysis of the use of SWE standards in smartphone applications to consume and produce environmental sensor data, analysing to what extent the performance problems related to XML can be alleviated by using alternative uncompressed and compressed formats.
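    The XML-overhead issue lends itself to a quick illustration: verbose, repetitive observation markup compresses very well, which is the intuition behind testing compressed alternatives. The element names below are a hypothetical O&M-style fragment, not taken from the SWE specifications.

    ```python
    import gzip

    # A repetitive SWE-like XML payload (hypothetical observation records)
    record = (
        '<om:Observation><om:phenomenonTime>2012-08-01T12:00:00Z'
        '</om:phenomenonTime><om:result uom="Cel">21.4</om:result>'
        '</om:Observation>'
    )
    payload = ('<sos:GetObservationResponse>' + record * 500 +
               '</sos:GetObservationResponse>').encode('utf-8')

    compressed = gzip.compress(payload)
    ratio = len(compressed) / len(payload)
    print(f"raw: {len(payload)} bytes, gzip: {len(compressed)} bytes, ratio: {ratio:.3f}")
    ```

    On a mobile client the trade-off is bandwidth saved versus the CPU cost of (de)compression, which is exactly what a performance analysis of this kind has to weigh.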

  1. Gamma-ray Burst Prompt Correlations: Selection and Instrumental Effects

    Science.gov (United States)

    Dainotti, M. G.; Amati, L.

    2018-05-01

    The prompt emission mechanism of gamma-ray bursts (GRBs) remains a mystery even after several decades. However, it is believed that correlations between observable GRB properties, given their huge luminosities/radiated energies and a redshift distribution extending up to at least z ≈ 9, are promising cosmological tools. They may also help to discriminate among the most plausible theoretical models. The objective nowadays is to make GRBs standard candles, similar to Type Ia supernovae (SNe Ia), through well-established and robust correlations. However, unlike SNe Ia, GRBs span several orders of magnitude in their energetics and hence cannot yet be considered standard candles. Additionally, being observed at very large distances, their measured physical properties are affected by selection biases, the so-called Malmquist bias or Eddington effect. We describe the state of the art on how GRB prompt correlations are corrected for these selection biases so that they can be employed as redshift estimators and cosmological tools. We stress that only after an appropriate evaluation of and correction for these effects can GRB correlations be used to discriminate among theoretical models of the prompt emission, to estimate the cosmological parameters and to serve as distance indicators via redshift estimation.

  2. Proposal of Strategies for a Manufacturing Company (Návrh strategií výrobního podniku)

    OpenAIRE

    Věrná, Monika

    2015-01-01

    The topic of this thesis is "Proposal of Strategies for a Manufacturing Company", prepared for Lima plus Ltd. The aim of the thesis is to propose a strategy for this company, which produces paraffin candles. Strategies were formulated on the basis of external and internal analyses, using SWOT analysis and Porter's approach to generic strategies.

  3. Neutron activation analysis of reference materials by the k0 standardization and relative methods

    Energy Technology Data Exchange (ETDEWEB)

    Freitas, M C; Martinho, E [LNETI/ICEN, Sacavem (Portugal)]

    1989-04-15

    Instrumental neutron activation analysis with the k0-standardization method was applied to eight geological, environmental and biological reference materials, including leaves, blood, fish, sediments, soils and limestone. To a first approximation, the results were normally distributed around the certified values with a standard deviation of 10%. Results obtained for IAEA CRM Soil-7 using the relative method, based on well-characterized multi-element standards, are also reported.

  4. Integrated Data Collection Analysis (IDCA) Program - RDX Standard Data Set 2

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Phillips, Jason J. [Air Force Research Lab. (AFRL), Tyndall Air Force Base, FL (United States); Shelley, Timothy J. [Applied Research Associates, Tyndall Air Force Base, FL (United States); Reyes, Jose A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-02-20

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small-Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the results for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis of the RDX Type II Class 5 standard, from the second round of testing in the Proficiency Test. Compared with the first round (Set 1), this RDX testing (Set 2) was found to have about the same impact sensitivity, more BAM friction sensitivity, less ABL friction sensitivity, similar ESD sensitivity, and the same DSC sensitivity.

  5. Use of gold and silver standards based on phenol-formaldehyde resin in assay-activation analysis of geological samples

    International Nuclear Information System (INIS)

    Aliev, A.I.; Drynkin, V.I.; Lejpunskaya, D.I.; Nedostup, T.V.

    1976-01-01

    Using standards based on phenol-formaldehyde resin for the assay-activation analysis of geological specimens for gold and silver has the advantages of a uniform distribution of Au and Ag in the specimens and the possibility of preparing tablets of practically any form or size. The validity and accuracy of these standards have been studied for the case of short irradiations, with conventional point standards used as reference standards. The experiments carried out have shown that tablet resol standards are suitable for mass assay-activation analysis for gold and silver at practically any concentration

  6. Standards for backscattering analysis

    International Nuclear Information System (INIS)

    Mitchell, I.V.; Eschbach, H.L.

    1978-01-01

    The need for backscattering standards appears to be twofold, depending on the uses and requirements of the users. The first is as a calibrated reference against which samples of a similar nature to the standard may be absolutely compared. The second is as a means of intercomparing the relative results obtained by different laboratories using, as near as possible, identical samples; this type of comparison is relative in nature, and absolute values are not necessarily required. In the present work the authors try to satisfy both needs by providing identical samples which have been absolutely calibrated to high accuracy. Very thin copper and vanadium layers were evaporated onto bismuth-implanted silicon crystals and onto glass plates under carefully controlled conditions. The mass of the deposits was determined in situ using a sensitive UHV microbalance; in addition, two quartz oscillator monitors were used. The samples have been analysed by Rutherford backscattering and the absolute quantity of bismuth determined by comparison with the known amounts of deposited material. (Auth.)

  7. Summary of component reliability data for probabilistic safety analysis of Korean standard nuclear power plant

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.

    2004-01-01

    Reliability data reflecting plant-specific characteristics are necessary for the PSA of Korean nuclear power plants. We have performed a study to develop a component reliability database and software for component reliability analysis. Based on this system, we collected component operation data and failure/repair data during plant operation up to 1998 and 2000 for YGN 3,4 and UCN 3,4, respectively. Recently, we upgraded the database by collecting additional data up to 2002 for Korean standard nuclear power plants and performed the component reliability analysis and Bayesian analysis again. In this paper, we present a summary of the component reliability data for the probabilistic safety analysis of Korean standard nuclear power plants and describe the plant-specific characteristics compared to generic data

  8. From the Analysis of Work-Processes to Designing Competence-Based Occupational Standards and Vocational Curricula

    Science.gov (United States)

    Tutlys, Vidmantas; Spöttl, Georg

    2017-01-01

    Purpose: This paper aims to explore methodological and institutional challenges in the application of the work-process analysis approach to the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…

  9. A Framework for Establishing Standard Reference Scale of Texture by Multivariate Statistical Analysis Based on Instrumental Measurement and Sensory Evaluation.

    Science.gov (United States)

    Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye

    2016-01-13

    A framework for establishing a standard reference scale of texture is proposed, using multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for the texture attribute of hardness with well-known Chinese foods. More than 100 food products in 16 categories were tested by instrumental measurement (TPA test), and the results were analyzed with cluster analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of food were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory values and the instrumentally measured values is significant (R^2 = 0.9765), fitting well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
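    Stevens's theory, referenced above, posits a power law between physical intensity and perceived sensation, ψ = k·φ^n, which is commonly fitted by linear regression in log-log space. A minimal sketch with synthetic data (not the authors' procedure or data):

    ```python
    import numpy as np

    def fit_stevens(instrumental, sensory):
        """Fit Stevens's power law S = k * I**n by linear regression in log space."""
        n, log_k = np.polyfit(np.log(instrumental), np.log(sensory), 1)
        return np.exp(log_k), n

    # synthetic example: hardness-like readings following S = 2 * I**0.6
    I = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
    S = 2.0 * I**0.6
    k, n = fit_stevens(I, S)
    ```

    The goodness of the log-log fit (the R^2 the authors report) is then the check that the instrumental scale and the sensory scale are consistent.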

  10. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving worldwide. The specificities that have to be respected in catering, relative to the food industry, stem from differences in food-serving procedures, numerous complex recipes and production technologies, staff fluctuation, and old equipment. Effective and permanent implementation of the HACCP concept requires a serious foundation; in this case, that foundation is the people handling the food. This paper presents the international ISO standards, the HACCP concept and the importance of its application in the tourism and hospitality industry. HACCP is a food safety management system based on the analysis and control of biological, chemical and physical hazards throughout the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  11. Comparing Sustainable Forest Management Certifications Standards: A Meta-analysis

    Directory of Open Access Journals (Sweden)

    Michael Rawson. Clark

    2011-03-01

    Full Text Available To solve problems caused by conventional forest management, forest certification has emerged as a driver of sustainable forest management. Several sustainable forest management certification systems exist, including the Forest Stewardship Council and those endorsed by the Programme for the Endorsement of Forest Certification, such as the Canadian Standards Association Sustainable Forest Management Standard CAN/CSA-Z809 and the Sustainable Forestry Initiative. For consumers to use certified products to meet their own sustainability goals, they must have an understanding of the effectiveness of the different certification systems. To understand the relative performance of the three systems, we determined: (1) the criteria used to compare the Forest Stewardship Council, Canadian Standards Association Sustainable Forestry Management, and Sustainable Forestry Initiative; (2) whether consensus exists regarding their ability to achieve sustainability goals; and (3) what research gaps must be filled to improve our understanding of how forest certification systems affect sustainable forest management. We conducted a qualitative meta-analysis of 26 grey literature references (books, industry and nongovernmental organization publications) and 9 primary literature references (articles in peer-reviewed academic journals) that compared at least two of the aforementioned certification systems. The Forest Stewardship Council was the highest performer on ecological-health and social sustainable forest management criteria. The Canadian Standards Association Sustainable Forestry Management and Sustainable Forestry Initiative performed best on the sustainable forest management criteria of forest productivity and the economic longevity of a firm. Sixty-two percent of the analyses were comparisons of the wording of certification system principles or criteria; 34% were surveys of foresters or consumers. An important caveat to these results is that only one comparison was based on

  12. A comparative analysis of quality management standards for contract research organisations in clinical trials.

    Science.gov (United States)

    Murray, Elizabeth; McAdam, Rodney

    2007-01-01

    This article compares and contrasts the main quality standards in the highly regulated pharmaceutical industry with specific focus on Good Clinical Practice (GCP), the standard for designing, conducting, recording and reporting clinical trials involving human participants. Comparison is made to ISO quality standards, which can be applied to all industries and types of organisation. The study is then narrowed to that of contract research organisations (CROs) involved in the conduct of clinical trials. The paper concludes that the ISO 9000 series of quality standards can act as a company-wide framework for quality management within such organisations by helping to direct quality efforts on a long-term basis without any loss of compliance. This study is valuable because comparative analysis in this domain is uncommon.

  13. Is the Einstein de Sitter model actually ruled out?

    International Nuclear Information System (INIS)

    Blanchard, A.

    2003-01-01

    The standard model of cosmology that is now strongly favored is a flat model dominated by a vacuum density term. However, the direct evidence for such a term is limited, being essentially based on the supernova probe, i.e. on a standard-candle hypothesis. Here I would like to point out that, contrary to the general belief, there is room for an Einstein-de Sitter universe. Several independent measurements, not based on a stellar reference, have pointed towards a high matter density universe, weakening the need for a cosmological constant

  14. A COMPARATIVE ANALYSIS OF THE SUPERNOVA LEGACY SURVEY SAMPLE WITH ΛCDM AND THE Rh=ct UNIVERSE

    International Nuclear Information System (INIS)

    Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio; Maier, Robert S.

    2015-01-01

    The use of Type Ia supernovae (SNe Ia) has thus far produced the most reliable measurement of the expansion history of the universe, suggesting that ΛCDM offers the best explanation for the redshift-luminosity distribution observed in these events. However, analysis of other kinds of sources, such as cosmic chronometers, gamma-ray bursts, and high-z quasars, conflicts with this conclusion, indicating instead that the constant expansion rate implied by the R_h = ct universe is a better fit to the data. The central difficulty with the use of SNe Ia as standard candles is that one must optimize three or four nuisance parameters characterizing supernova (SN) luminosities simultaneously with the parameters of an expansion model. Hence, in comparing competing models, one must reduce the data independently for each. We carry out such a comparison of ΛCDM and the R_h = ct universe using the SN Legacy Survey sample of 252 SN events, and show that each model fits its individually reduced data very well. However, since R_h = ct has only one free parameter (the Hubble constant), it follows from a standard model selection technique that it is to be preferred over ΛCDM, the minimalist version of which has three (the Hubble constant, the scaled matter density, and either the spatial curvature constant or the dark-energy equation-of-state parameter). Using the Bayes Information Criterion, we estimate that in a pairwise comparison the likelihood of R_h = ct is ∼90%, compared with only ∼10% for a minimalist form of ΛCDM, in which dark energy is simply a cosmological constant. Compared to R_h = ct, versions of the standard model with more elaborate parametrizations of dark energy are judged to be even less likely
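    The model-selection step can be sketched generically: BIC = k·ln(n) − 2·ln(L_max), and relative model likelihoods follow from BIC differences via exp(−ΔBIC/2). A ΔBIC of 2·ln 9 ≈ 4.39 yields 90%/10% weights, matching the proportions quoted above; the function names below are illustrative, not the authors' code.

    ```python
    import math

    def bic(k_params, n_data, max_log_likelihood):
        """Bayes Information Criterion: penalized badness of fit (lower is better)."""
        return k_params * math.log(n_data) - 2.0 * max_log_likelihood

    def bic_weights(bics):
        """Relative model likelihoods (Schwarz weights) from a list of BIC values."""
        b_min = min(bics)
        raw = [math.exp(-0.5 * (b - b_min)) for b in bics]
        total = sum(raw)
        return [w / total for w in raw]
    ```

    Note how the penalty term k·ln(n) is what favors the one-parameter R_h = ct model over a three-parameter ΛCDM when both fit their reduced data comparably well.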

  15. Equivalence of the Traditional and Non-Standard Definitions of Concepts from Real Analysis

    Directory of Open Access Journals (Sweden)

    John Cowles

    2014-06-01

    Full Text Available ACL2(r) is a variant of ACL2 that supports the irrational real and complex numbers. Its logical foundation is based on internal set theory (IST), an axiomatic formalization of non-standard analysis (NSA). Familiar ideas from analysis, such as continuity, differentiability, and integrability, are defined quite differently in NSA; some would argue the NSA definitions are more intuitive. In previous work, we adopted the NSA definitions in ACL2(r) and simply took for granted that they are equivalent to the traditional analysis notions, e.g., to the familiar epsilon-delta definitions. However, we argue in this paper that there are circumstances when the more traditional definitions are advantageous in the setting of ACL2(r), precisely because the traditional notions are classical, so they are unencumbered by IST limitations on inference rules such as induction or on the use of pseudo-lambda terms in functional instantiation. To address this concern, we describe a formal proof in ACL2(r) of the equivalence of the traditional and non-standard definitions of these notions.

  16. Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.

    Science.gov (United States)

    Zauber, Henrik; Schulze, Waltraud X

    2012-11-02

    The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven research and systems biology in recent years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions at the protein level. Postprocessing and combining the peptide intensities of a proteomic data set require expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries of different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis, including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, or ANOVA and t-tests for comparisons between treatments. Results are presented in editable graphic formats and in list files.
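    The abstract does not spell out cRacker's algorithms, but the kind of peptide-to-protein postprocessing it automates can be illustrated generically: normalize each sample so columns are comparable, then aggregate peptide intensities per protein. The function name, the median-scaling rule, and the sum aggregation are all assumptions of this sketch, not cRacker's documented behavior.

    ```python
    import numpy as np

    def normalize_and_aggregate(intensities, peptide_to_protein):
        """Median-normalize each sample column, then sum peptide intensities per protein.

        intensities: dict peptide -> array of intensities across samples.
        peptide_to_protein: dict peptide -> protein id.
        Returns dict protein -> aggregated intensity array.
        """
        peptides = sorted(intensities)
        mat = np.array([intensities[p] for p in peptides], dtype=float)
        # scale each sample so its median matches the global median of medians
        col_medians = np.median(mat, axis=0)
        mat = mat * (np.median(col_medians) / col_medians)
        proteins = {}
        for row, pep in zip(mat, peptides):
            prot = peptide_to_protein[pep]
            proteins.setdefault(prot, np.zeros(mat.shape[1]))
            proteins[prot] += row
        return proteins
    ```

    Doing this by hand in a spreadsheet for 100,000 peptide rows is exactly the repetitive calculation such tools exist to replace.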

  17. The Performance and Scientific Rationale for an Infrared Imaging Fourier Transform Spectrograph on a Large Space Telescope

    Science.gov (United States)

    1998-06-22

    several kiloparsecs and will map out the detailed properties of protoplanetary disks as a function of age, stellar mass, and environment in many star… detailed spectral energy distributions and yield radial disk structure, reveal gaps due to the presence of protoplanets, and determine dust composition from… measuring distant standard candles, trace the chemical evolution of galaxies, and study nearby stars and star-forming regions for signs of planetary

  18. Analysis of Potential Benefits and Costs of Adopting a Commercial Building Energy Standard in South Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Belzer, David B.; Cort, Katherine A.; Winiarski, David W.; Richman, Eric E.

    2005-03-04

    The state of South Dakota is considering adopting a commercial building energy standard. This report evaluates the potential costs and benefits to South Dakota residents of requiring compliance with the most recent edition of ANSI/ASHRAE/IESNA 90.1-2001, the Energy Standard for Buildings Except Low-Rise Residential Buildings. This standard was developed in an effort to set minimum requirements for the energy-efficient design and construction of new commercial buildings. The quantitative benefits and costs of adopting a commercial building energy code are modeled by comparing the characteristics of assumed current building practices with the most recent edition of the ASHRAE standard, 90.1-2001. Both qualitative and quantitative benefits and costs are assessed in this analysis. Energy and economic impacts are estimated using results from a detailed building simulation tool (the Building Loads Analysis and System Thermodynamics [BLAST] model) combined with a life-cycle cost (LCC) approach to assess the corresponding economic costs and benefits.
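    The LCC comparison rests on discounting: recurring annual costs are converted to present value and added to first cost, so a code-compliant building with a higher first cost can still win on lower energy bills. A minimal sketch (the single-annual-cost simplification and the function name are assumptions; the report's actual methodology is far more detailed):

    ```python
    def life_cycle_cost(first_cost, annual_cost, discount_rate, years):
        """Present-value life-cycle cost: first cost plus discounted annual costs."""
        pv_annual = sum(annual_cost / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
        return first_cost + pv_annual

    # hypothetical comparison: baseline vs. code-compliant building
    baseline = life_cycle_cost(first_cost=100.0, annual_cost=12.0,
                               discount_rate=0.05, years=30)
    compliant = life_cycle_cost(first_cost=105.0, annual_cost=10.0,
                                discount_rate=0.05, years=30)
    ```

    Whichever alternative has the lower LCC is preferred over the study period; the discount rate chosen can flip the answer, which is why such analyses report it explicitly.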

  19. Proposal for Standard Index and Analysis of Financial Performance in 2014 of Brazilian Soccer Clubs of Serie A

    Directory of Open Access Journals (Sweden)

    Rafael da Costa Jahara

    2017-02-01

    Full Text Available This study aims to develop a standard index for the analysis of the financial performance of the Brazilian soccer clubs that participated in Serie A in 2014. To prepare the standard index, economic and financial indicators of liquidity, profitability and indebtedness are used, together with a solvency analysis of the clubs using the Kanitz model. Based on studies of standard indices for comparing company performance, the balance sheets and income statements of twenty clubs were selected for analysis. This study is classified as descriptive exploratory research with a multi-case, quantitative approach based on document analysis of information extracted from the financial reports published on the websites of the respective soccer clubs. As a result, it can be seen that the clubs generally perform poorly when analyzed individually, in terms of liquidity, indebtedness, profitability and solvency indicators alike. However, such a result cannot explain the competitive performance of the teams in that league.

  20. Determination of gold and silver in geological standard samples MGI by instrument neutron activation analysis

    International Nuclear Information System (INIS)

    Lu Huijiuan; Zhou Yunlu

    1987-01-01

    Gold and silver in the geological standard samples MGI were determined by instrumental neutron activation analysis. The various nuclide interferences were considered, and correction factors for the geometry at different positions were determined. Using the geological standard sample MGM and a radiochemical-separation neutron activation method as references, the reliability of the method was proved. The gold content of the samples is 0.4-0.009 g/t and the silver content 9-0.3 g/t. The standard deviation is less than 3.5%, and the precision of the measurement is 4.8-11.6%

  1. Standard model for the safety analysis report of plants producing uranium hexafluoride from natural uranium

    International Nuclear Information System (INIS)

    1983-01-01

    The standard model for the safety analysis report of plants producing uranium hexafluoride from natural uranium is presented, showing the presentation format and the nature and degree of detail of the minimal information required by the Brazilian Nuclear Energy Commission (CNEN). (E.G.) [pt

  2. The effect of personalized versus standard patient protocols for radiostereometric analysis (RSA).

    Science.gov (United States)

    Muharemovic, O; Troelsen, A; Thomsen, M G; Kallemose, T; Gosvig, K K

    2018-05-01

    Increasing pressure in the clinic requires a more standardized approach to radiostereometric analysis (RSA) imaging. The aim of this study was to investigate whether the implementation of personalized RSA patient protocols could increase image quality and decrease examination time and the number of exposure repetitions. Forty patients undergoing primary total hip arthroplasty were randomized equally to either a case or a control group. Radiographers in the case group were assisted by personalized patient protocols containing information about each patient's post-operative RSA imaging. Radiographers in the control group used a standard RSA protocol. At three months, radiographers in the case group achieved significant reductions (p …). Personalized RSA patient protocols have a positive effect on image quality and radiation dose savings. Implementation of personalized patient protocols as an RSA standard will contribute to reducing examination time, thus ensuring a cost benefit for the department and improving patient safety. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.

  3. Certification of standard reference materials employing neutron activation analysis

    International Nuclear Information System (INIS)

    Capote Rodriguez, G.; Hernandez Rivero, A.; Molina Insfran, J.; Ribeiro Guevara, S.; Santana Encinosa, C.; Perez Zayas, G.

    1997-01-01

    Neutron activation analysis (NAA) is used extensively as one of the analytical techniques in the certification of standard reference materials (SRMs). The characteristics of NAA which make it valuable in this role are: accuracy; multielemental capability; the ability to assess homogeneity; high sensitivity for many elements; and an essentially non-destructive character. This paper reports the concentrations of thirty elements (major, minor and trace) in four Cuban SRMs. The samples were irradiated in a thermal neutron flux of 10^12-10^13 neutrons cm^-2 s^-1. The gamma-ray spectra were measured with HPGe detectors and analysed using the ACTAN program, developed at CEADEN. (author) [es

  4. WW + jet at 14 and 100 TeV

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, John [Fermilab]; Miller, David [Glasgow U.]; Robens, Tania [Dresden, Tech. U.]

    2016-11-05

    In the current LHC run, an accurate understanding of Standard Model processes is extremely important. Processes including electroweak gauge bosons serve as standard candles for SM measurements, and equally constitute important backgrounds for Beyond-the-Standard Model (BSM) searches. We present here the next-to-leading order (NLO) QCD virtual contributions to W+W- + jet in an analytic format obtained through unitarity methods. We present results for the full process using the Monte Carlo event generator MCFM, and discuss total as well as differential cross-sections for the LHC with 14 TeV center-of-mass energy, as well as a future 100 TeV proton-proton machine.

  5. DAG expression: high-throughput gene expression analysis of real-time PCR data using standard curves for relative quantification.

    Directory of Open Access Journals (Sweden)

    María Ballester

    Full Text Available BACKGROUND: Real-time quantitative PCR (qPCR) is still the gold-standard technique for gene-expression quantification. Recent technological advances of this method allow for high-throughput gene-expression analysis, without the limitations of sample space and reagents used. However, non-commercial and user-friendly software for the management and analysis of these data is not available. RESULTS: The recently developed commercial microarrays allow for the drawing of standard curves of multiple assays using the same n-fold diluted samples. Data Analysis Gene (DAG) Expression software has been developed to perform high-throughput gene-expression data analysis using standard curves for relative quantification and one or multiple reference genes for sample normalization. We discuss the application of DAG Expression in the analysis of data from an experiment performed with Fluidigm technology, in which 48 genes and 115 samples were measured. Furthermore, the quality of our analysis was tested and compared with other available methods. CONCLUSIONS: DAG Expression is freely available software that permits the automated analysis and visualization of high-throughput qPCR. A detailed manual and a demo-experiment are provided within the DAG Expression software at http://www.dagexpression.com/dage.zip.
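
    The standard-curve relative quantification described in this record can be sketched as follows. This is a minimal illustration, not code from DAG Expression; the dilution series, Ct values, and function names are invented for the example.

```python
import numpy as np

def fit_standard_curve(log10_dilutions, ct_values):
    """Fit Ct = slope * log10(quantity) + intercept by least squares."""
    slope, intercept = np.polyfit(log10_dilutions, ct_values, 1)
    return slope, intercept

def relative_quantity(ct, slope, intercept):
    """Invert the standard curve to place a sample on the dilution scale."""
    return 10 ** ((ct - intercept) / slope)

# Illustrative 4-point, 10-fold dilution series for a target and a reference gene
dils = np.log10([1.0, 0.1, 0.01, 0.001])
target_curve = fit_standard_curve(dils, np.array([20.1, 23.4, 26.8, 30.1]))
ref_curve = fit_standard_curve(dils, np.array([18.0, 21.3, 24.7, 28.0]))

# Normalize the target quantity by the reference-gene quantity of the same sample
q_target = relative_quantity(24.5, *target_curve)
q_ref = relative_quantity(22.0, *ref_curve)
normalized = q_target / q_ref
```

    A well-behaved assay gives a negative slope (Ct rises as the template is diluted); normalization by the reference gene cancels sample-to-sample loading differences.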

  6. Preparation and analysis of standardized waste samples for Controlled Ecological Life Support Systems (CELSS)

    Science.gov (United States)

    Carden, J. L.; Browner, R.

    1982-01-01

    The preparation and analysis of standardized waste samples for controlled ecological life support systems (CELSS) are considered. Analysis of samples from wet oxidation experiments, the development of ion chromatographic techniques utilizing conventional high pressure liquid chromatography (HPLC) equipment, and an investigation of techniques for interfacing an ion chromatograph (IC) with an inductively coupled plasma optical emission spectrometer (ICPOES) are discussed.

  7. Confirmatory factor analysis of science teacher leadership in the Thailand world-class standard schools

    Science.gov (United States)

    Thawinkarn, Dawruwan

    2018-01-01

    This research aims to analyze factors of science teacher leadership in the Thailand World-Class Standard Schools. The research instrument was a five-point rating-scale questionnaire with a reliability of 0.986. The sample group included 500 science teachers from World-Class Standard Schools who had been selected by using the stratified random sampling technique. Factor analysis of science teacher leadership in the Thailand World-Class Standard Schools was conducted by using Mplus for Windows. The results are as follows: The results of confirmatory factor analysis on science teacher leadership in the Thailand World-Class Standard Schools revealed that the model significantly correlated with the empirical data. The fit index values were χ2 = 105.655, df = 88, P-value = 0.086, TLI = 0.997, CFI = 0.999, RMSEA = 0.022, and SRMR = 0.019. The factor loadings of science teacher leadership were positive, with statistical significance at the 0.01 level. The loadings of the six factors were between 0.871 and 0.996. The highest factor loading was the professional learning community, followed by child-centered instruction, participation in development, the role model in teaching, transformational leaders, and self-development, with factor loadings of 0.996, 0.928, 0.911, 0.907, 0.901, and 0.871, respectively. The reliability of each factor was 99.1%, 86.0%, 83.0%, 82.2%, 81.0%, and 75.8%, respectively.

  8. Meta-analysis of warmed versus standard temperature CO2 insufflation for laparoscopic cholecystectomy.

    Science.gov (United States)

    Hakeem, Abdul R; Birks, Theodore; Azeem, Qasim; Di Franco, Filippo; Gergely, Szabolcs; Harris, Adrian M

    2016-06-01

    There is conflicting evidence for the use of warmed, humidified carbon dioxide (CO2) for creating pneumoperitoneum during laparoscopic cholecystectomy. A few studies have reported less post-operative pain and lower analgesic requirements when warmed CO2 was used. This systematic review and meta-analysis aims to analyse the literature on the use of warmed CO2 in comparison to standard temperature CO2 during laparoscopic cholecystectomy. The systematic review and meta-analysis were carried out in line with the PRISMA guidelines. Primary outcomes of interest were post-operative pain at 6 h, day 1 and day 2 following laparoscopic cholecystectomy. Secondary outcomes were analgesic usage and drop in intra-operative core body temperature. The Standardised Mean Difference (SMD) was calculated for continuous variables. Six randomised controlled trials (RCTs) met the inclusion criteria (n = 369). There was no significant difference in post-operative pain at 6 h [3 RCTs; SMD = -0.66 (-1.33, 0.02) (Z = 1.89) (P = 0.06)], day 1 [4 RCTs; SMD = -0.51 (-1.47, 0.44) (Z = 1.05) (P = 0.29)] and day 2 [2 RCTs; SMD = -0.96 (-2.30, 0.37) (Z = 1.42) (P = 0.16)] between the warmed CO2 and standard CO2 groups. There was no difference in analgesic usage between the two groups, but pooled analysis was not possible. Two RCTs reported a significant drop in intra-operative core body temperature, but there were no adverse events related to this. This review showed no difference in post-operative pain and analgesic requirements between warmed and standard CO2 insufflation during laparoscopic cholecystectomy. Currently there is not enough high-quality evidence to suggest routine usage of warmed CO2 for creating pneumoperitoneum during laparoscopic cholecystectomy. Copyright © 2015 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
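
    The SMD pooling used in meta-analyses of this kind can be sketched as follows. The trial numbers are invented for illustration; the formulas are the standard Cohen's d with a pooled SD and an inverse-variance fixed-effect combination.

```python
import math

def smd(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d: difference in means over the pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / sp

def smd_variance(d, n1, n2):
    """Large-sample approximation to the variance of the SMD."""
    return (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))

def pool_fixed_effect(effects_and_variances):
    """Inverse-variance weighted fixed-effect pooled estimate and its SE."""
    weights = [1 / v for _, v in effects_and_variances]
    pooled = sum(w * e for w, (e, _) in zip(weights, effects_and_variances)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return pooled, se

# Hypothetical pain scores (warmed vs standard CO2) from three invented trials:
# (mean1, sd1, n1, mean2, sd2, n2)
trials = [(3.1, 1.2, 30, 3.6, 1.1, 30),
          (2.8, 1.0, 25, 3.5, 1.3, 26),
          (3.0, 1.4, 40, 3.2, 1.2, 38)]
evs = []
for m1, s1, n1, m2, s2, n2 in trials:
    d = smd(m1, s1, n1, m2, s2, n2)
    evs.append((d, smd_variance(d, n1, n2)))
pooled_d, pooled_se = pool_fixed_effect(evs)
ci = (pooled_d - 1.96 * pooled_se, pooled_d + 1.96 * pooled_se)
```

    A negative pooled SMD whose 95% confidence interval crosses zero corresponds to the "no significant difference" conclusions quoted in the abstract.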

  9. Simulation-based estimation of mean and standard deviation for meta-analysis via Approximate Bayesian Computation (ABC).

    Science.gov (United States)

    Kwon, Deukwoo; Reis, Isildinha M

    2015-08-12

    When conducting a meta-analysis of a continuous outcome, estimated means and standard deviations from the selected studies are required in order to obtain an overall estimate of the mean effect and its confidence interval. If these quantities are not directly reported in the publications, they must be estimated from other reported summary statistics, such as the median, the minimum, the maximum, and quartiles. We propose a simulation-based estimation approach using the Approximate Bayesian Computation (ABC) technique for estimating mean and standard deviation based on various sets of summary statistics found in published studies. We conduct a simulation study to compare the proposed ABC method with the existing methods of Hozo et al. (2005), Bland (2015), and Wan et al. (2014). In the estimation of the standard deviation, our ABC method performs better than the other methods when data are generated from skewed or heavy-tailed distributions. The corresponding average relative error (ARE) approaches zero as sample size increases. In data generated from the normal distribution, our ABC performs well. However, the Wan et al. method is best for estimating standard deviation under normal distribution. In the estimation of the mean, our ABC method is best regardless of assumed distribution. ABC is a flexible method for estimating the study-specific mean and standard deviation for meta-analysis, especially with underlying skewed or heavy-tailed distributions. The ABC method can be applied using other reported summary statistics such as the posterior mean and 95 % credible interval when Bayesian analysis has been employed.
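
    The rejection-ABC idea behind this record can be sketched as follows: simulate samples over a prior on (mean, SD), keep the parameter draws whose summary statistics best match the reported (median, min, max), and average them. This is a generic sketch, not the authors' implementation; the priors, tolerances, and numbers are illustrative.

```python
import random
import statistics

def abc_mean_sd(observed_summary, n, n_sims=5000, keep_frac=0.02, seed=1):
    """Rejection ABC for (mu, sigma) of a normal sample of size n, given
    only the observed (median, min, max). Returns posterior-mean estimates."""
    rng = random.Random(seed)
    obs_min, obs_max = observed_summary[1], observed_summary[2]
    accepted = []
    for _ in range(n_sims):
        mu = rng.uniform(obs_min, obs_max)            # flat prior over the data range
        sigma = rng.uniform(1e-3, obs_max - obs_min)  # flat prior up to the range width
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        sim = (statistics.median(sample), min(sample), max(sample))
        dist = sum((a - b) ** 2 for a, b in zip(sim, observed_summary))
        accepted.append((dist, mu, sigma))
    accepted.sort()                                   # keep the closest draws
    keep = accepted[: max(1, int(n_sims * keep_frac))]
    est_mu = statistics.fmean(m for _, m, _ in keep)
    est_sigma = statistics.fmean(s for _, _, s in keep)
    return est_mu, est_sigma

# A study reporting only median 10, min 4, max 16 for n = 50
mu_hat, sd_hat = abc_mean_sd((10.0, 4.0, 16.0), n=50)
```

    Because ABC only requires the ability to simulate, swapping the normal for a skewed or heavy-tailed generating distribution needs no new formulas, which is the flexibility the abstract emphasizes.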

  10. A comparative analysis of Science-Technology-Society standards in elementary, middle and high school state science curriculum frameworks

    Science.gov (United States)

    Tobias, Karen Marie

    An analysis of curriculum frameworks from the fifty states to ascertain their compliance with the National Science Education Standards for integrating Science-Technology-Society (STS) themes is reported within this dissertation. Science standards for all fifty states were analyzed to determine if the STS criteria were integrated at the elementary, middle, and high school levels of education. The analysis determined the compliance level for each state, then compared the educational levels to see if compliance was similar across them. Compliance is important because research shows that using STS themes in the science classroom increases students' understanding of the concepts, increases students' problem-solving skills, increases students' self-efficacy with respect to science, and students instructed using STS themes score well on science high-stakes tests. The two hypotheses for this study are: (1) There is no significant difference in the degree of compliance to Science-Technology-Society themes (derived from National Science Education Standards) between the elementary, middle, and high school levels. (2) There is no significant difference in the degree of compliance to Science-Technology-Society themes (derived from National Science Education Standards) between the elementary, middle, and high school levels when examined individually. The Analysis of Variance F ratio was used to determine the variance between and within the three educational levels. This analysis addressed hypothesis one. The Analysis of Variance results failed to reject the null hypothesis, meaning there is no significant difference in the compliance to STS themes between the elementary, middle and high school educational levels. The Chi-Square test was the statistical analysis used to compare the educational levels for each individual criterion. This analysis addressed hypothesis two. The Chi-Square results showed that none of the states were equally compliant with each

  11. Accurate determination of arsenic in arsenobetaine standard solutions of BCR-626 and NMIJ CRM 7901-a by neutron activation analysis coupled with internal standard method.

    Science.gov (United States)

    Miura, Tsutomu; Chiba, Koichi; Kuroiwa, Takayoshi; Narukawa, Tomohiro; Hioki, Akiharu; Matsue, Hideaki

    2010-09-15

    Neutron activation analysis (NAA) coupled with an internal standard method was applied for the determination of As in the certified reference materials (CRMs) of arsenobetaine (AB) standard solutions to verify their certified values. Gold was used as an internal standard to compensate for the difference in neutron exposure within an irradiation capsule and to improve the sample-to-sample repeatability. Application of the internal standard method also significantly improved the linearity of the calibration curve up to 1 µg of As. The analytical reliability of the proposed method was evaluated by k0-standardization NAA. The analytical results for As in the AB standard solutions BCR-626 and NMIJ CRM 7901-a were (499 ± 55) mg kg^-1 (k = 2) and (10.16 ± 0.15) mg kg^-1 (k = 2), respectively. These values were found to be 15-20% higher than the certified values. The between-bottle variation of BCR-626 was much larger than the expanded uncertainty of the certified value, although that of NMIJ CRM 7901-a was almost negligible. Copyright (c) 2010 Elsevier B.V. All rights reserved.
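
    The internal-standard correction can be sketched as a comparator-style calculation: the As peak area is taken relative to the Au internal-standard peak area in the same capsule, so position-dependent differences in neutron exposure cancel. The peak areas and the 500 mg/kg calibration value below are invented for illustration.

```python
def internal_standard_conc(counts_as, counts_au_sample,
                           counts_as_std, counts_au_std, conc_as_std):
    """Concentration of As in the sample, from As/Au count-rate ratios of the
    sample and of a calibration standard irradiated in the same capsule."""
    ratio_sample = counts_as / counts_au_sample
    ratio_std = counts_as_std / counts_au_std
    return conc_as_std * ratio_sample / ratio_std

# Illustrative peak areas; calibration standard contains 500 mg/kg As
c = internal_standard_conc(counts_as=10200, counts_au_sample=5100,
                           counts_as_std=10000, counts_au_std=5000,
                           conc_as_std=500.0)
```

    Here both As/Au ratios equal 2.0, so the computed concentration equals the calibration value; a sample seeing 10% less flux would have both its As and Au peaks suppressed by the same factor, leaving the ratio, and hence the result, unchanged.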

  12. Acid Rain Analysis by Standard Addition Titration.

    Science.gov (United States)

    Ophardt, Charles E.

    1985-01-01

    The standard addition titration is a precise and rapid method for the determination of the acidity in rain or snow samples. The method requires use of a standard buret, a pH meter, and Gran's plot to determine the equivalence point. Experimental procedures used and typical results obtained are presented. (JN)
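
    For a strong acid titrated with a strong base, the Gran function G = (V0 + V) · 10^(-pH) falls linearly with added titrant volume V before the equivalence point, and extrapolating G to zero locates the equivalence volume. The following sketch generates synthetic titration data (50 mL of 1.0e-4 M acid, 1.0e-3 M base, so equivalence at 5.00 mL) and recovers it; the numbers are illustrative, not from the article.

```python
import numpy as np

def gran_equivalence(v0_ml, v_added_ml, ph):
    """Linear fit of the Gran function G = (V0 + V) * 10**(-pH) versus V;
    the x-intercept of the fit is the equivalence volume."""
    v = np.asarray(v_added_ml, dtype=float)
    g = (v0_ml + v) * 10.0 ** (-np.asarray(ph, dtype=float))
    slope, intercept = np.polyfit(v, g, 1)
    return -intercept / slope

# Synthetic pre-equivalence readings for 50 mL of 1.0e-4 M acid vs 1.0e-3 M base
v0, ca, cb = 50.0, 1.0e-4, 1.0e-3
v_pts = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
h = (ca * v0 - cb * v_pts) / (v0 + v_pts)   # remaining [H+] after each addition
ph = -np.log10(h)
ve = gran_equivalence(v0, v_pts, ph)        # expected: 5.00 mL
```

    The linearization is what makes the method robust for dilute samples such as rain: only pre-equivalence points and a pH meter are needed, with no sharp endpoint to judge by eye.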

  13. Testing of ceramic filter materials at the PCFB test facility; Keraamisten suodinmateriaalien testaus PCFB-koelaitoksessa

    Energy Technology Data Exchange (ETDEWEB)

    Kuivalainen, R; Eriksson, T; Lehtonen, P; Tiensuu, J [Foster Wheeler Energia Oy, Karhula (Finland)

    1997-10-01

    Pressurized Circulating Fluidized Bed (PCFB) combustion technology has been developed in Karhula, Finland since 1986. In 1989, a 10 MW PCFB test facility was constructed. The test facility was used for performance testing with different coal types through the years 1990-1994 to obtain data for the design and commercialization of the high-efficiency, low-emission PCFB combustion technology. The main objective of project Y53 was to evaluate advanced candle filter materials for the Hot Gas Clean-up Unit (HGCU) to be used in a commercial PCFB Demonstration Project. To achieve this goal, the selected candle materials were exposed to actual high-temperature, high-pressure coal combustion flue gases for a period of 1000-1500 h during the PCFB test runs. The test runs were carried out in three test segments in Foster Wheeler's PCFB test facility at the Karhula R and D Center. An extensive inspection and sampling program was carried out after the second test segment. Selected sample candles were analyzed by the filter supplier and the preliminary results were encouraging. The material strength had decreased only within the expected range. Slight elongation of the silicon carbide candles was observed, but at this phase the elongation cannot be attributed to creep, unlike in the candles tested in 1993-94. The third and last test segment was completed successfully in October 1996. The filter system was inspected and several sample candles were selected for material characterization. The results will be available in February-March 1997. (orig.)

  14. Investigation of black soot staining in houses

    Energy Technology Data Exchange (ETDEWEB)

    Fugler, D. [Canada Mortgage and Housing Corp., Ottawa, ON (Canada)

    2000-07-01

    Air quality investigators are frequently called upon to determine the origin of streaking, staining or soot marks in both new and old homes. Those marks display common characteristics: black marks along baseboards at interior or exterior walls, behind furniture and at doorways; black smudges on window frames and plastic cabinets; and even shadowing of studs on exterior wall drywall in a few cases. In most instances, carbon soot from a combustion source is the culprit. The combustion sources include furnaces, water heaters, fireplaces, gas dryers, gas ranges, smoking, vehicle exhaust and candle burning. Scepticism about candle soot is prevalent among callers. As a result, a study was initiated in homes where occupants burn candles regularly to investigate soot problems. Samples were collected from five homes, and included stained carpets, filters, and swab samples of black dust or soot. All the houses selected for the study had been built within a three-year period. Some samples of candles commonly burned in those homes were burnt in a laboratory. Air quality audits had been performed in the homes and had revealed other potential pollutant sources. Best practices for cost-effective clean up and control of soot were researched in industry information. The tests conducted in the laboratory found materials consistent with candle soot or residue during microscopic investigations, but no link was established with the stained material obtained from the homes. A few tips for homeowners were included concerning candle burning, and tips for builders were also offered. 1 tab.

  15. The Nature of Science and the Next Generation Science Standards: Analysis and Critique

    Science.gov (United States)

    McComas, William F.; Nouri, Noushin

    2016-08-01

    This paper provides a detailed analysis of the inclusion of aspects of nature of science (NOS) in the Next Generation Science Standards (NGSS). In this new standards document, NOS elements in eight categories are discussed in Appendix H along with illustrative statements (called exemplars). Many, but not all, of these exemplars are linked to the standards by their association with either the "practices of science" or "crosscutting concepts," but curiously not with the recommendations for science content. The study investigated all aspects of NOS in NGSS including the accuracy and inclusion of the supporting exemplar statements and the relationship of NOS in NGSS to other aspects of NOS to support teaching and learning science. We found that while 92 % of these exemplars are acceptable, only 78 % of those written actually appear with the standards. "Science as a way of knowing" is a recommended NOS category in NGSS but is not included with the standards. Also, several other NOS elements fail to be included at all grade levels thus limiting their impact. Finally, NGSS fails to include or insufficiently emphasize several frequently recommended NOS elements such as creativity and subjectivity. The paper concludes with a list of concerns and solutions to the challenges of NOS in NGSS.

  16. An analysis of violations of OSHA's (1987) occupational exposure to benzene standard.

    Science.gov (United States)

    Williams, Pamela R D

    2014-01-01

    The Occupational Safety and Health Administration (OSHA), which was formed by the Occupational Safety and Health Act of 1970 (OSH Act), establishes enforceable health and safety standards in the workplace and issues violations and penalties for non-compliance with these standards. The purpose of the current study was to evaluate the number and type of violations of the OSHA (1987) Occupational Exposure to Benzene Standard. Violations of the OSHA Hazard Communication Standard (HCS), particularly those that may pertain to specific provisions of the benzene standard, were also assessed. All analyses were based on OSHA inspection data that have been collected since the early 1970s and that are publicly available from the U.S. Department of Labor enforcement website. Analysis of these data shows that fewer than a thousand OSHA violations of the benzene standard have been issued over the last 25+ years. The results for benzene are in contrast to those for some other toxic and hazardous substances that are regulated by OSHA, such as blood-borne pathogens, lead, and asbestos, for which tens of thousands of OSHA violations have been issued. The number of benzene standard violations also varies by time period, standard provision, industry sector, and other factors. In particular, the greatest number of benzene standard violations occurred during the late 1980s to early/mid-1990s, soon after the 1987 final benzene rule was promulgated. The majority of benzene standard violations also pertain to noncompliance with specific provisions and subprovisions of the standard dealing with initial exposure monitoring requirements, the communication of hazards to employees, and medical surveillance programs. Only a small fraction of HCS violations are attributed, at least in part, to potential benzene hazards in the workplace. In addition, most benzene standard violations are associated with specific industries within the manufacturing sector where benzene or benzene

  17. Comparative analysis of success of psoriasis treatment with standard therapeutic modalities and balneotherapy.

    Science.gov (United States)

    Baros, Duka Ninković; Gajanin, Vesna S; Gajanin, Radoslav B; Zrnić, Bogdan

    2014-01-01

    Psoriasis is a chronic, inflammatory, immune-mediated skin disease. In addition to standard therapeutic modalities (antibiotics, cytostatics, phototherapy, photochemotherapy and retinoids), nonstandard methods can be used in the treatment of psoriasis. This includes balneotherapy, which is most commonly used in combination with therapeutic resources. The aim of this research was to determine the length of remission of psoriasis in patients treated with standard therapeutic modalities, balneotherapy, and combined treatment (standard therapeutic modalities and balneotherapy). The study analyzed 60 adult patients, of both sexes, with different clinical forms of psoriasis, who were divided into three groups according to the applied therapeutic modalities: the first group (treated with standard therapeutic modalities), the second group (treated with balneotherapy) and the third group (treated with combined therapy: standard therapeutic modalities and balneotherapy). The Psoriasis Area and Severity Index was determined in the first, third and sixth weeks of treatment for all patients. The following laboratory analyses were performed and monitored: C-reactive protein, iron with total iron binding capacity, unsaturated iron binding capacity and ferritin, uric acid, rheumatoid factors and antibodies to streptolysin O in the first and sixth weeks of treatment. The average length of remission in patients treated with standard therapeutic modalities and in those treated with balneotherapy was 1.77 +/- 0.951 months and 1.79 +/- 0.918 months, respectively. There was a statistically significant difference in the duration of remission between the patients treated with combination therapy and patients treated with standard therapeutic modalities (p = 0.019) and balneotherapy (p = 0.032). The best results were achieved when the combination therapy was administered.

  18. Analysis of Peach Bottom station blackout with MELCOR

    International Nuclear Information System (INIS)

    Dingman, S.E.; Cole, R.K.; Haskin, F.E.; Summers, R.M.; Webb, S.W.

    1987-01-01

    A demonstration analysis of station blackout at Peach Bottom has been performed using MELCOR and the results have been compared with those from MARCON 2.1B and the Source Term Code Package (STCP). MELCOR predicts greater in-vessel hydrogen production, earlier melting and core collapse, but later debris discharge than MARCON 2.1B. The drywell fails at vessel breach in MELCOR, but failure is delayed about an hour in MARCON 2.1B. These differences are mainly due to the MELCOR models for candling during melting, in-core axial conduction, and continued oxidation and heat transfer from core debris following lower head dryout. Three sensitivity calculations have been performed with MELCOR to address uncertainties regarding modeling of the core-concrete interactions. The timing of events and the gas and radionuclide release rates are somewhat different in the base case and the three sensitivity cases, but the final conditions and total releases are similar.

  19. Comparative Analysis of Norwegian Passive House Criteria and of Criteria related to the Concept of International Passive House Standard

    DEFF Research Database (Denmark)

    Anton, Karin; Vestergaard, Inge

    2013-01-01

    The analysis shows differences in the definition of passive house criteria. It also communicates issues of the passive house concept that are not completely transferred by the Norwegian passive house standard.

  20. Analysis of standard fracture toughness test based on digital image correlation data

    Czech Academy of Sciences Publication Activity Database

    Jandejsek, Ivan; Gajdoš, Lubomír; Šperl, Martin; Vavřík, Daniel

    2017-01-01

    Roč. 182, September (2017), s. 607-620 ISSN 0013-7944 R&D Projects: GA ČR(CZ) GA15-07210S; GA TA ČR(CZ) TE02000162 Keywords : DIC * full-field measurement * J-integral * CTOD * ASTM standard Subject RIV: JL - Materials Fatigue, Friction Mechanics OBOR OECD: Audio engineering, reliability analysis Impact factor: 2.151, year: 2016 http://www.sciencedirect.com/science/article/pii/S0013794417305799

  1. IEEE standard requirements for reliability analysis in the design and operation of safety systems for nuclear power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The purpose of this standard is to provide uniform, minimum acceptable requirements for the performance of reliability analyses for safety-related systems found in nuclear-power generating stations, but not to define the need for an analysis. The need for reliability analysis has been identified in other standards which expand the requirements of regulations (e.g., IEEE Std 379-1972 (ANSI N41.2-1972), "Guide for the Application of the Single-Failure Criterion to Nuclear Power Generating Station Protection Systems," which describes the application of the single-failure criterion). IEEE Std 352-1975, "Guide for General Principles of Reliability Analysis of Nuclear Power Generating Station Protection Systems," provides guidance in the application and use of reliability techniques referred to in this standard.

  2. Are standards effective in improving automobile fuel economy? An international panel analysis

    International Nuclear Information System (INIS)

    Clerides, Sofronis; Zachariadis, Theodoros

    2007-01-01

    Although the adoption of fuel economy standards has induced fuel savings in new motor vehicles, there are arguments against standards and in favour of fuel tax increases because the latter may have lower welfare costs. We therefore attempted to analyze the impact of standards and fuel prices on the fuel consumption of new cars with the aid of cross-section time series analysis of data from 18 countries. To our knowledge, this study is the first one that attempts to explore this issue econometrically at an international level. We built an unbalanced panel comprising 384 observations from the US, Canada, Australia, Japan, Switzerland and 13 EU countries spanning a period between 1975 and 2003. We specified a dynamic panel model of fuel economy and estimated the model for the whole sample and also for North America and Europe separately. Based on these estimates, we derived three important policy conclusions. Firstly, it seems that if there were no FE standards or voluntary targets in force, transportation energy use would increase more rapidly. Secondly, if CO2 targets are not to be tightened in Europe, retail fuel prices might have to double in order to attain the currently discussed target of 120 g CO2/km in the future. Thirdly, without higher fuel prices and/or tighter FE standards, one should not expect any marked improvements in fuel economy under 'business as usual' conditions. European policy makers might need to consider this issue carefully because some recent European studies tend to be optimistic in this respect.
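
    A dynamic panel model of this kind, fuel economy regressed on its own lag and a price-like regressor with country fixed effects, can be sketched with a within (least-squares dummy variable) estimator. The data below are simulated, and the sketch deliberately ignores the Nickell bias that a short panel with a lagged dependent variable carries, which GMM-style estimators address; nothing here is the authors' specification.

```python
import numpy as np

def fe_dynamic_panel(y, x, ids):
    """Within-estimator OLS of y_it on y_{i,t-1} and x_it, absorbing
    country fixed effects by demeaning within each panel unit."""
    y, x, ids = map(np.asarray, (y, x, ids))
    rows = []
    for i in np.unique(ids):
        yi, xi = y[ids == i], x[ids == i]
        for t in range(1, len(yi)):          # build (y_t, y_{t-1}, x_t) triples
            rows.append((yi[t], yi[t - 1], xi[t], i))
    Y = np.array([r[0] for r in rows], dtype=float)
    L = np.array([r[1] for r in rows], dtype=float)
    X = np.array([r[2] for r in rows], dtype=float)
    G = np.array([r[3] for r in rows])
    for i in np.unique(G):                   # demean within each country
        m = G == i
        Y[m] -= Y[m].mean(); L[m] -= L[m].mean(); X[m] -= X[m].mean()
    beta, *_ = np.linalg.lstsq(np.column_stack([L, X]), Y, rcond=None)
    return beta  # [persistence of fuel consumption, price coefficient]

# Simulate 18 countries x 25 years with persistence 0.6 and price effect -0.3
rng = np.random.default_rng(0)
ids, ys, xs = [], [], []
for i in range(18):
    alpha, y_prev = rng.normal(), 0.0
    for t in range(25):
        x = rng.normal()
        y = 0.6 * y_prev - 0.3 * x + alpha + 0.1 * rng.normal()
        ids.append(i); ys.append(y); xs.append(x)
        y_prev = y
beta = fe_dynamic_panel(ys, xs, ids)
```

    The estimated persistence comes out somewhat below the true 0.6 (the Nickell bias), while the price coefficient is recovered near -0.3; the sign pattern is what carries the policy reading that higher fuel prices reduce new-car fuel consumption.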

  3. Detection of Fish Bones in Cod Fillets by UV Illumination.

    Science.gov (United States)

    Wang, Sheng; Nian, Rui; Cao, Limin; Sui, Jianxin; Lin, Hong

    2015-07-01

    The presence of fish bones is now regarded as an important hazard in fishery products, and there is increasing demand for new analytical techniques to control it more effectively. Here, the fluorescent properties of cod bones under UV illumination were investigated, and the maximal wavelengths for excitation and emission were determined to be 320 nm and 515 nm, respectively, demonstrating significantly different fluorescence characteristics and much higher fluorescence intensity compared to those of fillet muscles. Based on the results, UV fluorescence-assisted candling for the detection of bones in fishery products was developed for the first time. Using cod fillets as samples, the detection ratio of this technique was calculated as 90.86%, significantly higher than that of traditional candling under daylight (76.78%). Moreover, the working efficiency of the new technique was about 26% higher than that of the traditional method. A UV fluorescence imaging framework was also developed, and a method for automatic identification of the fish bones in the cod fillets based on the linear discriminant analysis proposed by Fisher was preliminarily realized, but the detection ratio was demonstrated to be relatively poor compared to those of candling techniques. These results allow us to suggest UV-based methods as new and promising approaches for routine monitoring of bones in fishery products.
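
    The Fisher linear discriminant used for the automatic identification step can be sketched on toy two-feature pixel data. The feature values below (fluorescence intensity near 515 nm and a contrast measure) are invented, and the near-perfect separation is a property of the toy data, not the reported performance.

```python
import numpy as np

def fisher_lda(X0, X1):
    """Fisher's linear discriminant: w = Sw^{-1} (m1 - m0), with a midpoint
    threshold on the projected values."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Sw = (np.cov(X0, rowvar=False) * (len(X0) - 1)
          + np.cov(X1, rowvar=False) * (len(X1) - 1))   # within-class scatter
    w = np.linalg.solve(Sw, m1 - m0)
    threshold = w @ ((m0 + m1) / 2)
    return w, threshold

# Toy pixel features: (fluorescence intensity, local contrast)
rng = np.random.default_rng(42)
muscle = rng.normal([0.2, 0.1], 0.05, size=(200, 2))   # dim under UV
bone = rng.normal([0.8, 0.3], 0.05, size=(200, 2))     # strongly fluorescent
w, thr = fisher_lda(muscle, bone)
detection_ratio = ((bone @ w) > thr).mean()            # fraction of bone pixels flagged
```

    Projecting onto w maximizes between-class separation relative to within-class scatter, which is why a single threshold on the projection suffices once the fluorescence contrast between bone and muscle is large.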

  4. Physics potential of precision measurements of the LHC luminosity

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The uncertainty in the determination of the LHC luminosity is rapidly becoming a limiting factor for the analysis and interpretation of many important LHC processes. In this talk we first discuss the theoretical accuracy of total cross sections and examine in which cases the luminosity error is or will be dominant. We then review the impact of LHC data in PDF determinations, with emphasis on the effects of the luminosity uncertainty. We explore the requirements for the accuracy of the 2011 luminosity determination from the point of view of standard-candle cross sections and other important processes. Finally we discuss what we can learn from the accurate measurement of cross-section ratios at different center-of-mass energies for processes like W, ttbar and dijet production.

  5. JAERI thermal reactor standard code system for reactor design and analysis SRAC

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro

    1985-01-01

    SRAC, the JAERI thermal reactor standard code system for reactor design and analysis, developed at the Japan Atomic Energy Research Institute, is for all types of thermal neutron nuclear design and analysis. The code system has undergone extensive verification to confirm its functions, and has been used in core modification of the research reactor, detailed design of the multi-purpose high-temperature gas reactor, and analysis of experiments with a critical assembly. In a nuclear calculation with the code system, a multi-group lattice calculation is first made with the libraries. Then, with the resultant homogeneous equivalent group constants, the reactor core calculation is made. Described are the following: the purpose and development of the code system, the functions of the SRAC system, benchmark tests, current usage, and future development. (Mori, K.)

  6. Standard Procedure for Grid Interaction Analysis

    International Nuclear Information System (INIS)

    Svensson, Bertil; Lindahl, Sture; Karlsson, Daniel; Joensson, Jonas; Heyman, Fredrik

    2015-01-01

    Grid events, simultaneously affecting all safety-related auxiliary systems in a nuclear power plant, are critical and must be carefully addressed in the design, upgrading and operational processes. Up to now, the connecting grid has often been treated as either fully available or totally unavailable, and too little attention has been paid to specifying the grid performance criteria. This paper deals with standard procedures for grid interaction analysis, to derive tools and criteria to handle grid events challenging the safety systems of the plant. Critical external power system events are investigated and characterised with respect to severity and rate of occurrence. These critical events are then grouped with respect to impact on the safety systems when a disturbance propagates into the plant. It is then important to make sure that 1) the impact of the disturbance will never reach any critical system, 2) the impact of the disturbance will be eliminated before it can harm any critical system, or 3) the critical systems will be proven to be designed in such a way that they can withstand the impact of the disturbance, and the associated control and protection systems can withstand voltage and frequency transients associated with the disturbances. A number of representative disturbance profiles, reflecting connecting grid conditions, are therefore derived, to be used for equipment testing. (authors)

  7. THE CRITICAL ANALYSIS OF LIMITED SOUTH ASIAN CORPORATE GOVERNANCE STANDARDS AFTER FINANCIAL CRISIS

    Directory of Open Access Journals (Sweden)

    Dinh Tran Ngoc Huy

    2015-12-01

    After the recent global crisis, corporate scandals and bankruptcies in the US and Europe, there is certain evidence of weak corporate governance, risk management and audit systems. The 2009 India Code of Corporate Governance also revealed certain weaknesses during the crisis, although the corporate structure is fairly durable. Hence, this paper chooses a different analytical approach, and among its aims is to give some systematic opinions. First, it classifies a limited set of South Asian representative corporate governance (CG) standards into two (2) groups: the latest CG principles of India and Malaysia, covered in group 1, and group 2, comprising corporate governance principles from Thailand and Indonesia, the so-called relatively good CG group, while it uses ACCA, OECD and ICGN principles as references. Second, through analysis, it identifies differences and advantages between the above sets of standards, which are and have been used as reference principles for many relevant organizations. Third, it establishes a selected comparative set of standards for a South Asian representative corporate governance system in accordance with international standards. Last but not least, the paper covers some ideas and policy suggestions.

  8. A triple murder.

    Science.gov (United States)

    Vidanapathirana, M; Ruwanpura, P R; Ariyaratne, D; Karunanayake, D S K

    2015-12-01

    Three partially burnt bodies were found in a burnt out bedroom. A candle and matches were found on a partially burnt bed, suggesting accidental deaths. Careful scene analysis and forensic post-mortems demonstrated that this was a multiple murder rather than an accident. © The Author(s) 2015.

  9. Determination Of Adsorption And Paraffin Characterization Of Treatment To Adsorb Vegetable Oil

    International Nuclear Information System (INIS)

    Aminah, Neneng Siti; Mulijani, Sri; Sudirman; Ridwan

    2004-01-01

    Repeated use of vegetable oil not only degrades the quality of the food and of the oil itself, it is also harmful to human health: poisoning and carcinogenic symptoms have been found in animal experiments. Accordingly, the aim of this research is to use paraffin and candle wax to adsorb used vegetable oil and convert it into a solid sample, so that it can easily be disposed of. First, 2 g of sample was poured into the heated oil, with gentle stirring, until it cooled and hardened. Each sample and standard, before and after treatment, was characterized by FTIR, XRD, and DSC. The results show that the paraffin adsorbs 40 mL of used vegetable oil per 2 g of sample. This is lower than the standard, which can adsorb 66.67 mL of vegetable oil at the same sample weight. The difference between the paraffin and the standard is caused by the physical properties of the two materials, and can be explained by FTIR, X-ray diffraction (XRD) and differential scanning calorimetry (DSC). Based on the FTIR analysis, the standard consisted of saturated hydrocarbon compounds (alkanes) whereas the paraffin consisted of unsaturated hydrocarbon compounds (alkenes). The infrared spectrum after treatment showed changes in composition: O-H and ester groups were formed, which characterizes the adsorption process. The DSC analysis showed that the crystalline melting point of the standard is 75.3 °C and that of the paraffin is 54.17 °C. The XRD analysis showed that the standard and the paraffin are crystalline before treatment and amorphous after treatment.

  10. Ab initio modeling of primary processes in photosynthesis : protein induced activation of bacteriochlorophylls for efficient light harvesting and charge separation

    NARCIS (Netherlands)

    Wawrzyniak, Piotr K.

    2011-01-01

    Everything started in 1780 when Joseph Priestley, an English chemist, enclosed a mint plant and a burning candle in a glass jar. Surprisingly, the candle burned without interruption, even though in earlier experiments it was extinguished quickly when no plant was present in the jar. Now, 230 years

  11. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  12. Investigation of Hg volatile losses from samples and standards during neutron activation analysis

    International Nuclear Information System (INIS)

    Dubinskaya, N.; Dundua, V.; Chikhladze, N.

    1979-01-01

    The losses of Hg from phenol formaldehyde resin-bound standards and hair samples in neutron activation analysis, in the case of irradiation in a water-filled nuclear reactor channel, are studied. The mean losses of Hg during 20-30 h irradiation at (2-3)×10^18 n/cm² are 15-20%, with the Hg retained in double Al covers. The mean losses of Hg from standards heated for 5 h at 200, 250 and 300 °C are 30, 61 and 86% respectively, and do not occur at 150 °C. Losses of Hg from hair samples packed in polyethylene tubes, through the package walls, are not observed under the experimental conditions.

  13. Error analysis of isotope dilution mass spectrometry method with internal standard

    International Nuclear Information System (INIS)

    Rizhinskii, M.W.; Vitinskii, M.Y.

    1989-02-01

    The computation algorithms of the normalized isotopic ratios and element concentration by isotope dilution mass spectrometry with internal standard are presented. A procedure based on the Monte-Carlo calculation is proposed for predicting the magnitude of the errors to be expected. The estimation of systematic and random errors is carried out in the case of the certification of uranium and plutonium reference materials as well as for the use of those reference materials in the analysis of irradiated nuclear fuels. 4 refs, 11 figs, 2 tabs
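    The Monte-Carlo error-prediction procedure described above can be sketched in a few lines: perturb the measured isotope ratio with random noise and propagate it through the isotope dilution equation. This is a minimal illustration, not the authors' algorithm; the simplified single-dilution equation and all numerical values below are invented for the sketch.

    ```python
    # Monte-Carlo prediction of the random error of an isotope dilution result.
    # A simplified, hypothetical IDMS equation is used; all numbers are illustrative.
    import random
    import statistics

    random.seed(42)  # reproducible sketch

    def idms_concentration(r_blend, c_spike=10.0, m_ratio=1.0,
                           r_spike=100.0, r_sample=0.01):
        """Simplified single-dilution IDMS equation (illustrative form)."""
        return c_spike * m_ratio * (r_spike - r_blend) / (r_blend - r_sample)

    r_meas, rel_sd = 1.0, 0.005      # measured blend ratio, 0.5 % relative SD
    trials = [idms_concentration(random.gauss(r_meas, r_meas * rel_sd))
              for _ in range(10000)]
    mean_c = statistics.mean(trials)
    rel_err = statistics.stdev(trials) / mean_c
    print(f"predicted concentration = {mean_c:.1f}, relative error = {rel_err:.2%}")
    ```

    The spread of the simulated results predicts the magnitude of the error to be expected from a given measurement precision, which is the spirit of the procedure proposed in the paper.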

  14. Pharmacognostic standardization and physicochemical analysis of the leaves of Barleria montana Wight & Nees

    Directory of Open Access Journals (Sweden)

    Sriram Sridharan

    2016-03-01

    Objective: To investigate the pharmacognostic features and physicochemical properties of the leaves of Barleria montana Wight & Nees. Methods: The leaf samples were subjected to organoleptic, microscopic and macroscopic analysis. Physicochemical properties and fluorescence of the sample under UV and daylight were studied as per World Health Organization norms. Results: Microscopic analysis showed that the plant possesses dorsiventral leaves, lamina, glandular trichomes, calcium carbonate cystoliths and an adaxial epidermis. Physicochemical characteristics such as ash and moisture content, extractive values, foreign matter and fluorescence characteristics of the leaf samples were determined and reported. Conclusions: The results obtained from these studies can be used as reliable markers in the identification and standardization of this plant as a herbal remedy.

  15. Instrumental neutron activation analysis of river habitants by the k(0)-standardization method

    International Nuclear Information System (INIS)

    Momoshima, N.; Toyoshima, T.; Matsushita, R.; Fukuda, A.; Hibino, K.

    2005-01-01

    Analysis of metal concentrations in samples ordinarily relies on reference materials for determination, which means that elements not covered by the references cannot be determined. Instrumental neutron activation analysis (INAA) with the k(0)-standardization method makes it possible to determine metals without the use of reference materials, which is very attractive for environmental sample analysis. River habitants can serve as bio-indicators from which river water quality or metal contamination levels can be evaluated. We analyzed river fishes and river insects by INAA with k(0)-standardization to examine the suitability of these habitants as bio-indicators of the water system. Small fishes, Oryzias latipes and Gambusia affinis, were collected at 3 different rivers every month, and river insects of the families Heptageniidae, Baetidae, Perlidae, Hydropsychidae and Psephenidae were collected at a fixed point of the river. The dried samples were irradiated at the research reactor JRR-4 (3.5 MW), JAERI, for 10 min and 3 h. 17 elements (Na, K, Ca, Sc, Cr, Mn, Fe, Co, Zn, As, Se, Br, Rb, Sr, Ba, Ce and Sm) were determined by the NAA-k(0) method, showing the effectiveness of the present method for environmental sample analysis. Among the metals observed in the fishes, Ca was the highest and Sc the lowest, ranging from 10^5 mg/kg dry weight for Ca to 10^-2 mg/kg dry weight for Sc. The differences in metal concentrations were examined by statistical analysis with a t-test. Ca, Na and Br concentrations differ between the species Oryzias latipes and Gambusia affinis, and Fe, Sc, Co, Zn and Se concentrations differ among rivers. No difference was observed in K, Rb and Sr concentrations.

  16. Gamma-irradiation synthesis of silver nanoparticles fixing in porous ceramic for application in water treatment

    International Nuclear Information System (INIS)

    Dang Van Phu; Nguyen Quoc Hien; Nguyen Thuy Ai Trinh; Bui Duy Du

    2013-01-01

    Ag nanoparticles 10-15 nm in diameter, in polyvinylpyrrolidone solution at a concentration of 500 mg/L, were synthesized on a large scale, up to 100 L/batch, by a gamma irradiation route. Porous ceramic candle samples were functionalized by treatment with a 3-aminopropyltriethoxysilane coupling agent and then impregnated in the Ag nanoparticle solution to fix the Ag nanoparticles. The Ag nanoparticle load on the porous ceramic was about 200-250 mg/kg. The average pore size of the porous ceramic/Ag nanoparticles was about 48.2 Å. Owing to the strong bonding of silver atoms to the walls of the porous ceramic functionalized with 3-aminopropyltriethoxysilane, the silver content released from the porous ceramic/Ag nanoparticles into filtered water, tested at a flow rate of about 5 L/h, was less than 10 μg/L, far below the required standard limit (<100 μg/L) for drinking water. Thus, porous ceramic/Ag nanoparticle candles can potentially be applied for point-of-use drinking water treatment. (author)

  17. Potential effects of particulate matter from combustion during services on human health and on works of art in medieval churches in Cyprus

    International Nuclear Information System (INIS)

    Loupa, Glykeria; Karageorgos, Evangelos; Rapsomanikis, Spyridon

    2010-01-01

    Indoor and outdoor particulate matter (PM0.3-10) number concentrations were established in two medieval churches in Cyprus. In both churches incense was burnt occasionally during Mass. The highest indoor PM0.5-1 concentrations compared with outdoors (10.7 times higher) were observed in the church in which the burning of candles indoors was allowed. Peak indoor black carbon concentrations were 6.8 μg m^-3 when incense was burning and 13.4 μg m^-3 when the candles were burning (outdoor levels ranged between 0.6 and 1.3 μg m^-3). Of the water-soluble inorganic components determined in PM10, calcium prevailed in all samples, indoors and outdoors, whilst high potassium concentrations indoors were a clear marker of combustion. Indoor sources of PM were clearly identified and their emission strengths were estimated via modeling of the results. Estimated indoor PM0.3-10 mass concentrations exceeded air quality standards for human health protection and for the preservation of works of art. - Particulate matter in medieval churches of Cyprus.

  18. Potential effects of particulate matter from combustion during services on human health and on works of art in medieval churches in Cyprus.

    Science.gov (United States)

    Loupa, Glykeria; Karageorgos, Evangelos; Rapsomanikis, Spyridon

    2010-09-01

    Indoor and outdoor particulate matter (PM0.3-10) number concentrations were established in two medieval churches in Cyprus. In both churches incense was burnt occasionally during Mass. The highest indoor PM0.5-1 concentrations compared with outdoors (10.7 times higher) were observed in the church in which the burning of candles indoors was allowed. Peak indoor black carbon concentrations were 6.8 microg m(-3) when incense was burning and 13.4 microg m(-3) when the candles were burning (outdoor levels ranged between 0.6 and 1.3 microg m(-3)). Of the water-soluble inorganic components determined in PM10, calcium prevailed in all samples, indoors and outdoors, whilst high potassium concentrations indoors were a clear marker of combustion. Indoor sources of PM were clearly identified and their emission strengths were estimated via modeling of the results. Estimated indoor PM0.3-10 mass concentrations exceeded air quality standards for human health protection and for the preservation of works of art. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  19. Corrections for gravitational lensing of supernovae: better than average?

    OpenAIRE

    Gunnarsson, Christofer; Dahlen, Tomas; Goobar, Ariel; Jonsson, Jakob; Mortsell, Edvard

    2005-01-01

    We investigate the possibility of correcting for the magnification due to gravitational lensing of standard candle sources, such as Type Ia supernovae. Our method uses the observed properties of the foreground galaxies along the lines-of-sight to each source and the accuracy of the lensing correction depends on the quality and depth of these observations as well as the uncertainties in translating the observed luminosities to the matter distribution in the lensing galaxies. The current work i...

  20. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    The objective conditions of Ukraine's integration into the global business environment create the need to strengthen accounting and financial reporting. At the stage of attracting investment into the country, there is a need to prepare financial statements whose generally accepted basic principles are based on common International Financial Reporting Standards (IFRS). An assessment of the convergence of national standards with International Financial Reporting Standards is therefore relevant. However, before conducting a content analysis of the compliance of standards, it is necessary to determine methodological approaches to the selection of key indicators for the assessment of convergence. The aim of the article is to define methodological approaches to the selection and development of a list of key elements of IFRSs for the further evaluation of convergence between national and international standards. To assess convergence, 187 basic key elements measuring the level of convergence to IFRS were selected. Sampling was carried out based on the professional judgment of the author, using the key indicators of each standard and an evaluation of the usefulness of the accounting information. These indicators make it possible to calculate the specific level of convergence of international and national standards and to determine how far statements prepared under domestic standards correspond to IFRS. In other words, can one assert with some certainty that Ukraine has achieved "good practices in IFRS implementation" or not? This calculation will assess the regulatory efforts of government agencies (the Ministry of Finance) on the approximation of Ukrainian standards to IFRS.

  1. Accident analysis for aircraft crash into hazardous facilities: DOE standard

    International Nuclear Information System (INIS)

    1996-10-01

    This standard provides the user with sufficient information to evaluate and assess the significance of aircraft crash risk on facility safety without expending excessive effort where it is not required. It establishes an approach for performing a conservative analysis of the risk posed by a release of hazardous radioactive or chemical material resulting from an aircraft crash into a facility containing significant quantities of such material. This can establish whether a facility has a significant potential for an aircraft impact and whether this has the potential for producing significant offsite or onsite consequences. General implementation guidance, screening and evaluation guidelines, and methodologies for the evaluations are included

  2. Standardization of computer-assisted semen analysis using an e-learning application.

    Science.gov (United States)

    Ehlers, J; Behr, M; Bollwein, H; Beyerbach, M; Waberski, D

    2011-08-01

    Computer-assisted semen analysis (CASA) is primarily used to obtain accurate and objective kinetic sperm measurements. Additionally, AI centers use computer-assessed sperm concentration in the sample as a basis for calculating the number of insemination doses available from a given ejaculate. The reliability of data is often limited and results can vary even when the same CASA systems with identical settings are used. The objective of the present study was to develop a computer-based training module for standardized measurements with a CASA system and to evaluate its training effect on the quality of the assessment of sperm motility and concentration. A digital versatile disc (DVD) has been produced showing the standardization of sample preparation and analysis with the CASA system SpermVision™ version 3.0 (Minitube, Verona, WI, USA) in words, pictures, and videos, as well as the most probable sources of error. Eight test persons educated in spermatology, but with different levels of experience with the CASA system, prepared and assessed 10 aliquots from one prediluted bull ejaculate using the same CASA system and laboratory equipment before and after electronic learning (e-learning). After using the e-learning application, the coefficient of variation was reduced on average for the sperm concentration from 26.1% to 11.3% (P ≤ 0.01), and for motility from 5.8% to 3.1% (P ≤ 0.05). For five test persons, the difference in the coefficient of variation before and after use of the e-learning application was significant (P ≤ 0.05). Individual deviations of means from the group mean before e-learning were reduced compared with individual deviations from the group mean after e-learning. According to a survey, the e-learning application was highly accepted by users. In conclusion, e-learning presents an effective, efficient, and accepted tool for improvement of the precision of CASA measurements. This study provides a model for the standardization of other
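    The precision measure reported in the study above is the coefficient of variation, CV = SD/mean. A minimal sketch of that calculation follows; the concentration values are invented for illustration and are not the study's data.

    ```python
    # Coefficient of variation (CV = sample SD / mean, in percent), the precision
    # measure used to compare CASA results before and after e-learning.
    import statistics

    def cv_percent(values):
        """Return the coefficient of variation of a sample, in percent."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical sperm concentration readings (million/mL) by eight operators
    before = [310, 250, 420, 380, 205, 290, 360, 240]
    after  = [300, 285, 330, 315, 280, 295, 325, 290]
    print(f"CV before: {cv_percent(before):.1f}%  after: {cv_percent(after):.1f}%")
    ```

    A smaller CV after training indicates tighter agreement between operators, which is exactly the improvement the study quantifies.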

  3. Turbidity. Training Module 5.240.2.77.

    Science.gov (United States)

    Bonte, John L.; Davidson, Arnold C.

    This document is an instructional module package prepared in objective form for use by an instructor familiar with the candle turbidimeter and the nephelometric method of turbidity analysis. Included are objectives, an instructor guide, a student handout, and transparency masters. A video tape is also available from the author. This module considers use…

  4. A Meta-Analysis of Typhoid Diagnostic Accuracy Studies: A Recommendation to Adopt a Standardized Composite Reference.

    Directory of Open Access Journals (Sweden)

    Helen L Storey

    Novel typhoid diagnostics currently under development have the potential to improve clinical care, surveillance, and the disease burden estimates that support vaccine introduction. Blood culture is most often used as the reference method to evaluate the accuracy of new typhoid tests; however, it is recognized to be an imperfect gold standard. If no single gold standard test exists, use of a composite reference standard (CRS) can improve estimation of diagnostic accuracy. Numerous studies have used a CRS to evaluate new typhoid diagnostics; however, there is no consensus on an appropriate CRS. In order to evaluate existing tests for use as a reference test or inclusion in a CRS, we performed a systematic review of the typhoid literature to include all index/reference test combinations observed. We described the landscape of comparisons performed, showed results of a meta-analysis on the accuracy of the more common combinations, and evaluated sources of variability based on study quality. This wide-ranging meta-analysis suggests that no single test has sufficiently good performance but some existing diagnostics may be useful as part of a CRS. Additionally, based on findings from the meta-analysis and a constructed numerical example demonstrating the use of CRS, we proposed necessary criteria and potential components of a typhoid CRS to guide future recommendations. Agreement and adoption by all investigators of a standardized CRS is requisite, and would improve comparison of new diagnostics across independent studies, leading to the identification of a better reference test and improved confidence in prevalence estimates.

  5. The Bland-Altman analysis: Does it have a role in assessing radiation dosimeter performance relative to an established standard?

    International Nuclear Information System (INIS)

    Hill, R.F.; Tofts, P.S.; Baldock, C.

    2010-01-01

    Bland-Altman analysis is used to compare two different methods of measurement and to determine whether a new method of measurement may replace an existing accepted 'gold standard' method. In this work, Bland-Altman analysis has been applied to radiation dosimetry to compare the PTW Markus and Roos parallel plate ionisation chambers and a PTW PinPoint chamber against a Farmer type ionisation chamber which is accepted as the gold standard for radiation dosimetry in the clinic. Depth doses for low energy x-rays beams with energies of 50, 75 and 100 kVp were measured using each of the ionisation chambers. Depth doses were also calculated by interpolation of the data in the British Journal of Radiology (BJR) Report 25. From the Bland-Altman analysis, the mean dose difference between the two parallel plate chambers and the Farmer chambers was 1% over the range of depths measured. The PinPoint chamber gave significant dose differences compared to the Farmer chamber. There were also differences of up to 12% between the BJR Report 25 depth doses and the measured data. For the Bland-Altman plots, the lines representing the limits of agreement were selected to be a particular percentage agreement e.g. 1 or 2%, instead of being based on the standard deviation (σ) of the differences. The Bland-Altman statistical analysis is a powerful tool for making comparisons of ionisation chambers with an ionisation chamber that has been accepted as a 'gold standard'. Therefore we conclude that Bland-Altman analysis does have a role in assessing radiation dosimeter performance relative to an established standard.
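    In its conventional form, Bland-Altman analysis reduces to computing the mean of the paired differences (the bias) and limits of agreement at bias ± 1.96 SD of those differences (the paper instead fixed the limits at chosen percentage levels). A minimal sketch of the conventional calculation, with hypothetical depth-dose readings rather than the paper's data:

    ```python
    # Bland-Altman agreement analysis between two measurement methods:
    # bias = mean of paired differences; limits of agreement = bias ± 1.96*SD.
    import statistics

    def bland_altman(method_a, method_b):
        """Return (bias, lower_loa, upper_loa) for paired measurements."""
        diffs = [a - b for a, b in zip(method_a, method_b)]
        bias = statistics.mean(diffs)
        sd = statistics.stdev(diffs)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Hypothetical percentage depth doses: Farmer chamber vs a parallel-plate chamber
    farmer = [100.0, 86.5, 74.2, 63.8, 54.9]
    markus = [100.4, 86.1, 74.9, 63.2, 55.3]
    bias, lo, hi = bland_altman(farmer, markus)
    print(f"bias = {bias:.3f}, limits of agreement = [{lo:.3f}, {hi:.3f}]")
    ```

    If the limits of agreement fall within a clinically acceptable tolerance, the new chamber can be considered interchangeable with the gold standard for that measurement.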

  6. A Comprehensive General Chemistry Demonstration

    Science.gov (United States)

    Sweeder, Ryan D.; Jeffery, Kathleen A.

    2013-01-01

    This article describes the use of a comprehensive demonstration suitable for a high school or first-year undergraduate introductory chemistry class. The demonstration involves placing a burning candle in a container adjacent to a beaker containing a basic solution with indicator. After adding a lid, the candle will extinguish and the produced…

  7. 46 CFR 160.066-12 - Operational tests.

    Science.gov (United States)

    2010-10-01

    ... provided by a sealed plastic bag or other waterproof packaging, submersion under 25 mm (1 in.) of water for... operating instructions. The following data as observed must be recorded for each signal: (1) Burning time of the pyrotechnic candle; (2) Color; (3) Whether the pyrotechnic candle burns out above, at, or below...

  8. Standard Practice for Analysis and Interpretation of Light-Water Reactor Surveillance Results, E706(IA)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2001-01-01

    1.1 This practice covers the methodology, summarized in Annex A1, to be used in the analysis and interpretation of neutron exposure data obtained from LWR pressure vessel surveillance programs; and, based on the results of that analysis, establishes a formalism to be used to evaluate present and future condition of the pressure vessel and its support structures (1-70). 1.2 This practice relies on, and ties together, the application of several supporting ASTM standard practices, guides, and methods (see Master Matrix E 706) (1, 5, 13, 48, 49). In order to make this practice at least partially self-contained, a moderate amount of discussion is provided in areas relating to ASTM and other documents. Support subject areas that are discussed include reactor physics calculations, dosimeter selection and analysis, and exposure units. Note 1—(Figure 1 is deleted in the latest update. The user is referred to Master Matrix E 706 for the latest figure of the standards interconnectivity). 1.3 This practice is restri...

  9. CONVERGENCE OF INTERNATIONAL AUDIT STANDARDS AND AMERICAN AUDIT STANDARDS REGARDING SAMPLING

    Directory of Open Access Journals (Sweden)

    Chis Anca Oana

    2013-07-01

    Sampling is widely used in market research, scientific analysis, market analysis, opinion polls and, not least, in the financial statement audit. What actually is sampling, and how did it appear? Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Nowadays the technique is indispensable, as economic entities operate with sophisticated computer systems and large amounts of data. Economic globalization and the complexity of capital markets have made possible not only the harmonization of international accounting standards with national ones, but also the convergence of international accounting and auditing standards with the American regulations. International Standard on Auditing 530 and Statement on Auditing Standards 39 are the two main international and American normative referentials on audit sampling. This article discusses the origin of audit sampling, giving a brief history of the method and different definitions from the literature. The two standards are studied using Jaccard indicators in terms of their degree of similarity and dissimilarity on different issues. The Jaccard coefficient measures the degree of convergence of the international auditing standard (ISA 530) and the U.S. auditing standard (SAS 39). Both standards address the sampling problem and present common points with regard to accepted sampling techniques, the factors influencing the audit sample, the treatment of identified misstatements and the circumstances in which sampling is appropriate. The study shows that both standards agree on the application of statistical and non-statistical sampling in auditing and that sampling is appropriate for tests of details and of controls, the factors affecting audit sampling being audit risk, audit objectives and the population's characteristics.
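    The Jaccard coefficient used in the study is simply the size of the intersection of two sets of compared items divided by the size of their union. A minimal sketch follows; the element labels are hypothetical placeholders, not actual provisions of ISA 530 or SAS 39.

    ```python
    # Jaccard similarity between two standards, each treated as a set of key
    # elements; 1.0 means identical coverage, 0.0 means no overlap.
    def jaccard(a, b):
        """Return |A ∩ B| / |A ∪ B| for two iterables of items."""
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if a | b else 1.0

    # Hypothetical key elements (for illustration only)
    isa_530 = {"sampling_risk", "stratification", "misstatement_projection", "tolerable_error"}
    sas_39  = {"sampling_risk", "stratification", "misstatement_projection", "sample_size_tables"}
    print(f"convergence (Jaccard) = {jaccard(isa_530, sas_39):.2f}")
    ```

    With three shared elements out of five distinct ones, the coefficient is 0.60; a dissimilarity measure is simply 1 minus this value.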

  10. Utilization of Candle Banana (Musa paradisiaca) Peel Waste as an Alternative Feed for Broiler Chickens (Gallus gallus domesticus)

    Directory of Open Access Journals (Sweden)

    Ryan Hidayat

    2016-04-01

    A study on the utilization of candle banana peel waste as an alternative feed for broiler chickens (Gallus gallus domesticus) has been carried out. The research aims to evaluate candle banana peel as an alternative feed for broiler growth. The research used 1-day-old broilers in a completely randomized design (CRD) with 5 treatments and 2 repetitions, using candle banana peel extract at concentrations of 0% (control), 25%, 50%, 75%, and 100%. The data were analyzed by analysis of variance (ANOVA), followed by a Least Significant Difference (LSD) test where there was a difference between treatments, at a 95% confidence level. The results show that the best weight gain of the broilers, 289.04 g, was obtained with the 0% candle banana peel extract. Among the mixed feeds of candle banana peel and commercial feed, the highest gain was at the 25% concentration (259.20 g), followed by 50% (250.92 g) and 75% (251.65 g), while the lowest result, causing the death of broilers, was the 100% candle banana extract treatment, because of its high N content: the high total N lowers the C/N ratio, affecting the mineralization process in the feed. The conclusion is that the concentration of candle banana peel in the feed given to broilers affects their growth; feeds at 25%-75% concentrations can be consumed by the broilers to increase their weight. Keywords: Broilers, Candle banana peel, Growth, Rate of consumption. Citation: Hidayat, R., Setiawan, A., Nofyan, E. (2016). Pemanfaatan Limbah Kulit Pisang Lilin (Musa paradisiaca) Sebagai Pakan Alternatif Ayam Pedaging (Gallus galus domesticus). Jurnal Ilmu Lingkungan, 14(1), 11-17, doi:10.14710/jil.14.1.11-17

  11. Compliance with the AML/CFT International Standard; Lessons from a Cross-Country Analysis

    OpenAIRE

    Concha Verdugo Yepes

    2011-01-01

    This paper assesses countries' compliance with the Anti-Money Laundering and Combating the Financing of Terrorism (AML/CFT) international standard during the period 2004 to 2011. We find that overall compliance is low; there is an adverse impact on financial transparency created by the cumulative effects of poor implementation of standards on customer identification; and the current measurements of compliance do not take into account an analysis of ML/FT risk, thereby undermining their credib...

  12. Evaluation of the H-point standard additions method (HPSAM) and the generalized H-point standard additions method (GHPSAM) for the UV-analysis of two-component mixtures.

    Science.gov (United States)

    Hund, E; Massart, D L; Smeyers-Verbeke, J

    1999-10-01

    The H-point standard additions method (HPSAM) and two versions of the generalized H-point standard additions method (GHPSAM) are evaluated for the UV-analysis of two-component mixtures. Synthetic mixtures of anhydrous caffeine and phenazone as well as of atovaquone and proguanil hydrochloride were used. Furthermore, the method was applied to pharmaceutical formulations that contain these compounds as active drug substances. This paper shows both the difficulties that are related to the methods and the conditions by which acceptable results can be obtained.

  13. Analysis of RIA standard curve by log-logistic and cubic log-logit models

    International Nuclear Information System (INIS)

    Yamada, Hideo; Kuroda, Akira; Yatabe, Tami; Inaba, Taeko; Chiba, Kazuo

    1981-01-01

    In order to improve goodness-of-fit in RIA standard curve analysis, programs for computing log-logistic and cubic log-logit fits were written in BASIC on a P-6060 personal computer (Olivetti). An iterative least squares method based on a Taylor series expansion was applied for non-linear estimation of the logistic and log-logistic models. Here ''log-logistic'' represents Y = (a - d)/(1 + (log(X)/c)^b) + d. As weights, either 1, 1/var(Y), or 1/σ² were used in the logistic or log-logistic fits, and either Y²(1 - Y)², Y²(1 - Y)²/var(Y), or Y²(1 - Y)²/σ² were used in the quadratic or cubic log-logit fits. The term var(Y) represents the square of the pure error, and σ² represents the estimated variance calculated from the equation log(σ² + 1) = log(A) + J log(Y). As indicators of goodness-of-fit, MSL/S_e², CMD% and WRV (see text) were used. Better regression was obtained for alpha-fetoprotein by log-logistic than by logistic fitting. The cortisol standard curve was much better fitted with cubic log-logit than with quadratic log-logit. The predicted precision of the AFP standard curve was below 5% with log-logistic analysis instead of 8% with logistic analysis. The predicted precision obtained using cubic log-logit was about five times lower than that with quadratic log-logit. The importance of selecting good models in RIA data processing was stressed in conjunction with the intrinsic precision of the radioimmunoassay system, as indicated by the predicted precision. (author)
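A modern sketch of this kind of fit (the original used BASIC with hand-coded Taylor-series iterations): fitting the four-parameter log-logistic model above by weighted non-linear least squares with SciPy. The synthetic standard-curve data, noise level, and starting values are hypothetical, not values from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(x, a, b, c, d):
    """Four-parameter log-logistic: Y = (a - d)/(1 + (log(x)/c)**b) + d."""
    return (a - d) / (1.0 + (np.log(x) / c) ** b) + d

# Synthetic standard curve; doses > 1 keep log(x) positive so the power is real
rng = np.random.default_rng(0)
x = np.logspace(0.5, 3, 12)          # 12 standards from ~3 to 1000
true = (2.0, 4.0, 3.0, 0.1)          # a, b, c, d
y = log_logistic(x, *true) + rng.normal(0, 0.01, x.size)

# Weighted non-linear least squares; sigma plays the role of the 1/var(Y) weights
popt, pcov = curve_fit(log_logistic, x, y, p0=(2.0, 3.0, 2.5, 0.0),
                       sigma=np.full(x.size, 0.01), absolute_sigma=True)
print(popt)  # recovered (a, b, c, d)
```

With reasonable starting values the fit converges to parameters close to the generating ones; the diagonal of `pcov` gives the parameter variances that feed a predicted-precision profile.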

  14. Introducing mandatory standards for select household appliances in Lebanon: A cost-benefit analysis

    International Nuclear Information System (INIS)

    Ruble, Isabella; Karaki, Sami

    2013-01-01

    Lebanon's energy sector crisis leads to a lack of access to uninterrupted, basic modern electricity services that affects all sectors of the economy. Energy conservation measures are nearly nonexistent, yet they can potentially lead to substantial reductions in energy demand growth, environmental damages and public expenditures. This paper presents an analysis of the costs and benefits associated with the introduction of mandatory energy efficiency standards for four different household appliances (refrigerator/freezers, AC split units, washing machines and lighting) over the period 2013–2027. Our results show potential savings in electricity consumption reaching 2054 GWh annually in 2027, a reduction of subsidies paid to the public utility of 3.6 billion USD in 2027, and 8.9 million tons of CO2 emissions avoided over the period of analysis. Furthermore, we propose a financially attractive refrigerator/freezer replacement program for low income households. If this program covered all existing low-income households in 2013, the savings in electricity consumption would lead to a reduction in subsidies of 9 billion USD (NPV) over the period 2013–2027, while full funding for the program would cost the government 223.8 million USD. The program would thereby benefit consumers, the government and further economic development. - Highlights: ► We model the effect of mandatory appliance standards on electricity consumption. ► We present a refrigerator replacement program contributing to economic development. ► We show that economic efficiency favors the introduction of standards for appliances.

  15. Security analysis of standards-driven communication protocols for healthcare scenarios.

    Science.gov (United States)

    Masi, Massimiliano; Pugliese, Rosario; Tiezzi, Francesco

    2012-12-01

    The importance of the Electronic Health Record (EHR), that stores all healthcare-related data belonging to a patient, has been recognised in recent years by governments, institutions and industry. Initiatives like the Integrating the Healthcare Enterprise (IHE) have been developed for the definition of standard methodologies for secure and interoperable EHR exchanges among clinics and hospitals. Using the requisites specified by these initiatives, many large scale projects have been set up for enabling healthcare professionals to handle patients' EHRs. The success of applications developed in these contexts crucially depends on ensuring such security properties as confidentiality, authentication, and authorization. In this paper, we first propose a communication protocol, based on the IHE specifications, for authenticating healthcare professionals and assuring patients' safety. By means of a formal analysis carried out by using the specification language COWS and the model checker CMC, we reveal a security flaw in the protocol thus demonstrating that to simply adopt the international standards does not guarantee the absence of such type of flaws. We then propose how to emend the IHE specifications and modify the protocol accordingly. Finally, we show how to tailor our protocol for application to more critical scenarios with no assumptions on the communication channels. To demonstrate feasibility and effectiveness of our protocols we have fully implemented them.

  16. Evaluating Living Standard Indicators

    Directory of Open Access Journals (Sweden)

    Birčiaková Naďa

    2015-09-01

    Full Text Available This paper deals with the evaluation of selected available indicators of living standards, divided into three groups: economic, environmental, and social. We have selected six countries of the European Union for analysis: Bulgaria, the Czech Republic, Hungary, Luxembourg, France, and Great Britain. The aim of this paper is to evaluate indicators measuring living standards and suggest the most important factors that should be included in the final measurement. We have tried to determine which factors influence each indicator and which factors affect living standards, choosing regression analysis as our main method. From the study of these factors, we can deduce their impact on living standards, and thus on the value of living standard indicators. Indicators with a high degree of reliability include the following factors: size and density of population, health care, and spending on education. Carbon dioxide emissions into the atmosphere also contribute, though with a somewhat lower degree of reliability.

  17. The Distance Standard Deviation

    OpenAIRE

    Edelmann, Dominic; Richards, Donald; Vogel, Daniel

    2017-01-01

    The distance standard deviation, which arises in distance correlation analysis of multivariate data, is studied as a measure of spread. New representations for the distance standard deviation are obtained in terms of Gini's mean difference and in terms of the moments of spacings of order statistics. Inequalities for the distance variance are derived, proving that the distance standard deviation is bounded above by the classical standard deviation and by Gini's mean difference. Further, it is ...
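A quick numerical check of the bounds stated above. The sketch below uses the V-statistic (biased) sample versions of the distance standard deviation and Gini's mean difference applied to the empirical distribution, which is an implementation choice for illustration, not necessarily the estimators used in the paper.

```python
import numpy as np

def distance_std(x):
    """Distance standard deviation via the double-centered pairwise distance matrix."""
    x = np.asarray(x, float)
    d = np.abs(x[:, None] - x[None, :])                  # |x_i - x_j|
    A = d - d.mean(0) - d.mean(1)[:, None] + d.mean()    # double centering
    return np.sqrt((A * A).mean())

def gini_mean_difference(x):
    """Gini's mean difference E|X - X'| of the empirical distribution."""
    x = np.asarray(x, float)
    return np.abs(x[:, None] - x[None, :]).mean()

x = np.random.default_rng(1).normal(size=500)
ds, gmd = distance_std(x), gini_mean_difference(x)
print(ds, x.std(), gmd)   # distance SD sits below both bounds
```

Because these are the population functionals evaluated at the empirical distribution, the inequalities from the abstract (distance SD ≤ classical SD and ≤ Gini's mean difference) hold exactly for the sample quantities as well.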

  18. Vocational High School Effectiveness Standard ISO 9001: 2008 for Achievement Content Standards, Standard Process and Competency Standards Graduates

    Directory of Open Access Journals (Sweden)

    Yeni Ratih Pratiwi

    2014-06-01

    Full Text Available Efektivitas Sekolah Menengah Kejuruan Berstandar ISO 9001:2008 terhadap Pencapaian Standar Isi, Standar Proses dan Standar Kompetensi Lulusan. Abstract: The purpose of this study was to determine differences in the effectiveness of achieving the content standards, process standards, and graduate competency standards between vocational schools (SMK) already certified to ISO 9001:2008 and SMK not yet certified, in both public and private schools. Data were collected with a closed Likert-scale questionnaire and analyzed by one-way ANOVA in SPSS. The results showed: (1) there is a difference in effectiveness between ISO-standard public SMK and ISO-standard private SMK (P = 0.001); (2) there are differences in effectiveness between public SMK with and without the ISO standard (P = 0.000); (3) there are differences in effectiveness between ISO-standard public SMK and private SMK without the ISO standard (P = 0.000); (4) there are differences in effectiveness between ISO-standard private SMK and public SMK without the ISO standard (P = 0.015); (5) there are differences in effectiveness between ISO-standard private SMK and private SMK without the ISO standard (P = 0.000); (6) there was no difference in effectiveness between public SMK and private SMK that are both not yet ISO-certified. Key Words: vocational high school standards ISO 9001:2008, content standards, process standards, competency standards. Abstrak: The aim of this study was to determine differences in the effectiveness of achieving the content, process, and graduate competency standards between SMK with and without ISO 9001:2008 certification, in both public and private schools. Data were collected using a closed Likert-scale questionnaire and analyzed with one-way ANOVA in SPSS. The results showed: (1) there are differences

  19. Reducing matrix effect error in EDXRF: Comparative study of using standard and standard less methods for stainless steel samples

    International Nuclear Information System (INIS)

    Meor Yusoff Meor Sulaiman; Masliana Muhammad; Wilfred, P.

    2013-01-01

    Even though EDXRF analysis has major advantages for stainless steel samples, such as simultaneous determination of the minor elements, analysis without sample preparation, and non-destructive analysis, matrix effects arising from inter-element interactions can make the final quantitative result inaccurate. This paper presents a comparative quantitative analysis using standard and standardless methods in the determination of these elements. The standard method was carried out by plotting regression calibration graphs of the elements of interest using BCS certified stainless steel standards. Different calibration plots were developed based on the available certified standards; the stainless steel grades covered include low alloy steel, austenitic, ferritic and high speed. The standardless method, on the other hand, uses a mathematical model with a matrix effect correction derived from the Lucas-Tooth and Price model. The accuracy of the standardless method was further improved by including pure elements in the development of the model. Discrepancy tests were then carried out for these quantitative methods on different certified samples, and the results show that the high speed method is most reliable for determining Ni, and the standardless method for Mn. (Author)

  20. Analysis of the basic professional standards involving the work of psychologists in difficult and legally significant situations

    Directory of Open Access Journals (Sweden)

    Bogdanovich N. V.

    2016-06-01

    Full Text Available This article analyzes professional standards with respect to the scope of the psychologist's work with clients in difficult life situations and legally significant situations. The criteria chosen for the analysis were: how such situations are reflected in the professional activities, the grounds for selecting professional activities, the focus on a specific department, and the choice of a particular direction of the psychologist's activity (prevention, support, rehabilitation). It is shown that all five of the analyzed standards imply such situations, but only three of them ("Educational psychologist", "Psychologist in the social sphere", "Specialist in rehabilitative work in the social sphere") describe the activities of the psychologist, while the remaining two ("Expert of bodies of guardianship and guardianship concerning minors" and "Specialist in working with families") are more organizational in nature. The conclusion is drawn that the training programs developed by the Department of Legal Psychology and Law and Education comply with the requirements of the professional standards, and improvements to these programs are proposed.

  1. Measurement of Henry's Law Constants Using Internal Standards: A Quantitative GC Experiment for the Instrumental Analysis or Environmental Chemistry Laboratory

    Science.gov (United States)

    Ji, Chang; Boisvert, Susanne M.; Arida, Ann-Marie C.; Day, Shannon E.

    2008-01-01

    An internal standard method applicable to undergraduate instrumental analysis or environmental chemistry laboratory has been designed and tested to determine the Henry's law constants for a series of alkyl nitriles. In this method, a mixture of the analytes and an internal standard is prepared and used to make a standard solution (organic solvent)…
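The internal-standard arithmetic underlying such a GC method can be sketched as follows. The peak areas and concentrations below are hypothetical illustration values, not data from the experiment.

```python
# Internal-standard calibration: a response factor is determined from a
# standard mix, then used to quantify the analyte in an unknown sample.

def response_factor(area_analyte, area_is, conc_analyte, conc_is):
    """RF = (A_analyte / A_IS) / (C_analyte / C_IS) from a standard solution."""
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    """Analyte concentration in an unknown from the area ratio and RF."""
    return (area_analyte / area_is) * conc_is / rf

rf = response_factor(12000.0, 10000.0, 6.0, 5.0)   # hypothetical standard run
c = quantify(8400.0, 9800.0, 5.0, rf)              # hypothetical unknown run
print(rf, c)
```

Because only area *ratios* enter the calculation, injection-volume and detector-drift variations cancel, which is the point of using an internal standard.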

  2. Thermal safety analysis of a dry storage cask for the Korean standard spent fuel - 16159

    International Nuclear Information System (INIS)

    Cha, Jeonghun; Kim, S.N.; Choi, K.W.

    2009-01-01

    A conceptual dry storage facility, based on a commercial dry storage facility, was designed for Korean standard spent nuclear fuel (SNF), and a preliminary thermal safety analysis was performed in this study. A thermal analysis method consisting of two parts was proposed for this purpose. Using the method, the surface temperature of the storage canister corresponding to the SNF cladding temperature was calculated, and an adequate air duct area was determined from the result. The initial temperature of the facility was calculated, and a fire condition and a half blockage of the air ducts were analyzed. (authors)

  3. Gaia Data Release 1. Testing the parallaxes with local Cepheids and RR Lyrae stars

    OpenAIRE

    Gaia Collaboration; Clementini, G.; Eyer, L.; Ripepi, V.; Marconi, M.; Muraveva, T.; Garofalo, A.; Sarro, L. M.; Palmer, M.; Luri, X.; Molinaro, R.; Rimoldini, L.; Szabados, L.; Musella, I.; Anderson, R. I.

    2017-01-01

    Context. Parallaxes for 331 classical Cepheids, 31 Type II Cepheids, and 364 RR Lyrae stars in common between Gaia and the HIPPARCOS and Tycho-2 catalogues are published in Gaia Data Release 1 (DR1) as part of the Tycho-Gaia Astrometric Solution (TGAS). Aims. In order to test these first parallax measurements of the primary standard candles of the cosmological distance ladder, which involve astrometry collected by Gaia during the initial 14 months of science operation, we compared them with l...

  4. Gaia Data Release 1. Open cluster astrometry: performance, limitations, and future prospects

    OpenAIRE

    Gaia Collaboration; van Leeuwen, F.; Vallenari, A.; Jordi, C.; Lindegren, L.; Bastian, U.; Prusti, T.; de Bruijne, J. H. J.; Brown, A. G. A.; Babusiaux, C.; Bailer-Jones, C. A. L.; Biermann, M.; Evans, D. W.; Eyer, L.; Jansen, F.

    2017-01-01

    Context. Parallaxes for 331 classical Cepheids, 31 Type II Cepheids, and 364 RR Lyrae stars in common between Gaia and the Hipparcos and Tycho-2 catalogues are published in Gaia Data Release 1 (DR1) as part of the Tycho-Gaia Astrometric Solution (TGAS). Aims. In order to test these first parallax measurements of the primary standard candles of the cosmological distance ladder, which involve astrometry collected by Gaia during the initial 14 months of science operation, we compared them wi...

  5. A Comparative Analysis of the Supernova Legacy Survey Sample With ΛCDM and the Rh=ct Universe

    Science.gov (United States)

    Wei, Jun-Jie; Wu, Xue-Feng; Melia, Fulvio; Maier, Robert S.

    2015-03-01

    The use of Type Ia supernovae (SNe Ia) has thus far produced the most reliable measurement of the expansion history of the universe, suggesting that ΛCDM offers the best explanation for the redshift-luminosity distribution observed in these events. However, analysis of other kinds of sources, such as cosmic chronometers, gamma-ray bursts, and high-z quasars, conflicts with this conclusion, indicating instead that the constant expansion rate implied by the Rh = ct universe is a better fit to the data. The central difficulty with the use of SNe Ia as standard candles is that one must optimize three or four nuisance parameters characterizing supernova (SN) luminosities simultaneously with the parameters of an expansion model. Hence, in comparing competing models, one must reduce the data independently for each. We carry out such a comparison of ΛCDM and the Rh = ct universe using the SN Legacy Survey sample of 252 SN events, and show that each model fits its individually reduced data very well. However, since Rh = ct has only one free parameter (the Hubble constant), it follows from a standard model selection technique that it is to be preferred over ΛCDM, the minimalist version of which has three (the Hubble constant, the scaled matter density, and either the spatial curvature constant or the dark energy equation-of-state parameter). We estimate using the Bayes Information Criterion that in a pairwise comparison, the likelihood of Rh = ct is ˜90%, compared with only ˜10% for a minimalist form of ΛCDM, in which dark energy is simply a cosmological constant. Compared to Rh = ct, versions of the standard model with more elaborate parametrizations of dark energy are judged to be even less likely. This work is dedicated to the memory of Prof. Tan Lu, who sadly passed away 2014 December 3. Among his many achievements, he is considered to be one of the founders of high-energy astrophysics, and a pioneer in modern cosmology, in China.
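The pairwise BIC comparison described above can be sketched numerically. The chi-square values below are hypothetical placeholders, not the paper's fit results; only the structure of the calculation (BIC ≈ χ²_min + k ln n under a Gaussian likelihood, and a pairwise likelihood from ΔBIC) follows the standard model-selection recipe.

```python
import numpy as np

def bic(chi2_min, k, n):
    # Gaussian-likelihood approximation: BIC = chi2_min + k * ln(n)
    return chi2_min + k * np.log(n)

n = 252                                    # SNLS sample size from the abstract
bic_rhct = bic(chi2_min=240.0, k=1, n=n)   # R_h = ct: only the Hubble constant
bic_lcdm = bic(chi2_min=236.0, k=3, n=n)   # minimalist LCDM: three free parameters

delta = bic_lcdm - bic_rhct
p_rhct = 1.0 / (1.0 + np.exp(-delta / 2.0))  # pairwise relative likelihood of R_h = ct
print(bic_rhct, bic_lcdm, p_rhct)
```

Even when the more flexible model achieves a slightly lower χ², the k ln n penalty can favor the one-parameter model, which is the effect the abstract describes.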

  6. Standard Model updates and new physics analysis with the Unitarity Triangle fit

    International Nuclear Information System (INIS)

    Bevan, A.; Bona, M.; Ciuchini, M.; Derkach, D.; Franco, E.; Silvestrini, L.; Lubicz, V.; Tarantino, C.; Martinelli, G.; Parodi, F.; Schiavi, C.; Pierini, M.; Sordini, V.; Stocchi, A.; Vagnoni, V.

    2013-01-01

    We present the summer 2012 update of the Unitarity Triangle (UT) analysis performed by the UTfit Collaboration within the Standard Model (SM) and beyond. The increased accuracy of several of the fundamental constraints is now enhancing some of the tensions amongst and within the constraints themselves. In particular, the long-standing tension between exclusive and inclusive determinations of the V_ub and V_cb CKM matrix elements is now playing a major role. We then present the generalisation of the UT analysis to investigate new physics (NP) effects, updating the constraints on NP contributions to ΔF=2 processes. In the NP analysis, both CKM and NP parameters are fitted simultaneously to obtain the possible NP effects in any specific sector. Finally, based on the NP constraints, we derive upper bounds on the coefficients of the most general ΔF=2 effective Hamiltonian. These upper bounds can be translated into lower bounds on the scale of NP that contributes to these low-energy effective interactions

  7. Suitable pellets standards development for LA-ICPMS analysis of Al2O3 powders

    International Nuclear Information System (INIS)

    Ferraz, Israel Elias; Sousa, Talita Alves de; Silva, Ieda de Souza; Gomide, Ricardo Goncalves; Oliveira, Luis Claudio de

    2013-01-01

    Chemical and physical characterization of aluminium oxides is of special interest to the nuclear industry, not least because of the arduous chemical digestion process; laser ablation inductively coupled plasma mass spectrometry (LA-ICPMS) is therefore an attractive method for their analysis. However, due to the lack of suitable matrix-matched certified reference materials (CRM) for such powders and ceramic pellets, LA-ICPMS has not yet been fully applied, and establishing calibration curves for trace element quantification using external standards poses a significant problem. In this context, this work aimed to develop suitable standard pellets for calibration curves for the chemical determination of impurities in aluminium oxide powders by LA-ICPMS. Two different analytical strategies were used: (I) boric acid pressed pellets and (II) lithium tetraborate melted pellets, both spiked with high purity oxides of Si, Mg, Ca, Na, Fe, Cr and Ni. Analytical strategy (II), which presented the best analytical parameters, was selected; a certified reference material was analyzed and the results compared. The limits of detection, linearity, precision, accuracy and recovery study results are presented and discussed. (author)

  8. Using 1H and 13C NMR chemical shifts to determine cyclic peptide conformations: a combined molecular dynamics and quantum mechanics approach.

    Science.gov (United States)

    Nguyen, Q Nhu N; Schwochert, Joshua; Tantillo, Dean J; Lokey, R Scott

    2018-05-10

    Solving conformations of cyclic peptides can provide insight into structure-activity and structure-property relationships, which can help in the design of compounds with improved bioactivity and/or ADME characteristics. The most common approaches for determining the structures of cyclic peptides are based on NMR-derived distance restraints obtained from NOESY or ROESY cross-peak intensities, and 3J-based dihedral restraints using the Karplus relationship. Unfortunately, these observables are often too weak, sparse, or degenerate to provide unequivocal, high-confidence solution structures, prompting us to investigate an alternative approach that relies only on 1H and 13C chemical shifts as experimental observables. This method, which we call conformational analysis from NMR and density-functional prediction of low-energy ensembles (CANDLE), uses molecular dynamics (MD) simulations to generate conformer families and density functional theory (DFT) calculations to predict their 1H and 13C chemical shifts. Iterative conformer searches and DFT energy calculations on a cyclic peptide-peptoid hybrid yielded Boltzmann ensembles whose predicted chemical shifts matched the experimental values better than any single conformer. For these compounds, CANDLE outperformed the classic NOE- and 3J-coupling-based approach by disambiguating similar β-turn types and also enabled the structural elucidation of the minor conformer. Through the use of chemical shifts, in conjunction with DFT and MD calculations, CANDLE can help illuminate conformational ensembles of cyclic peptides in solution.
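The Boltzmann-weighting step at the heart of such ensemble shift predictions is simple to sketch. The conformer energies and predicted shifts below are hypothetical values for a single carbon atom, not data from the paper.

```python
import numpy as np

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 298.15     # temperature, K

# Hypothetical DFT relative energies (kJ/mol) and predicted 13C shifts (ppm)
# for three conformers of one carbon atom
energies = np.array([0.0, 2.5, 6.0])
shifts   = np.array([172.1, 169.8, 175.3])

w = np.exp(-energies / (R * T))
w /= w.sum()                         # Boltzmann populations
ensemble_shift = (w * shifts).sum()  # population-weighted predicted shift
print(w, ensemble_shift)
```

The ensemble average is compared against the experimental shift; in the approach described above, no single conformer matches as well as the weighted mixture.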

  9. A main sequence for quasars

    Science.gov (United States)

    Marziani, Paola; Dultzin, Deborah; Sulentic, Jack W.; Del Olmo, Ascensión; Negrete, C. A.; Martínez-Aldama, Mary L.; D'Onofrio, Mauro; Bon, Edi; Bon, Natasa; Stirpe, Giovanna M.

    2018-03-01

    The last 25 years saw a major step forward in the analysis of optical and UV spectroscopic data of large quasar samples. Multivariate statistical approaches have led to the definition of systematic trends in observational properties that are the basis of physical and dynamical modeling of quasar structure. We discuss the empirical correlates of the so-called “main sequence” associated with the quasar Eigenvector 1, its governing physical parameters and several implications on our view of the quasar structure, as well as some luminosity effects associated with the virialized component of the line emitting regions. We also briefly discuss quasars in a segment of the main sequence that includes the strongest FeII emitters. These sources show a small dispersion around a well-defined Eddington ratio value, a property which makes them potential Eddington standard candles.

  10. A Main Sequence for Quasars

    Directory of Open Access Journals (Sweden)

    Paola Marziani

    2018-03-01

    Full Text Available The last 25 years saw a major step forward in the analysis of optical and UV spectroscopic data of large quasar samples. Multivariate statistical approaches have led to the definition of systematic trends in observational properties that are the basis of physical and dynamical modeling of quasar structure. We discuss the empirical correlates of the so-called “main sequence” associated with the quasar Eigenvector 1, its governing physical parameters and several implications on our view of the quasar structure, as well as some luminosity effects associated with the virialized component of the line emitting regions. We also briefly discuss quasars in a segment of the main sequence that includes the strongest FeII emitters. These sources show a small dispersion around a well-defined Eddington ratio value, a property which makes them potential Eddington standard candles.

  11. Establishment of gold-quartz standard GQS-1

    Science.gov (United States)

    Millard, Hugh T.; Marinenko, John; McLane, John E.

    1969-01-01

    A homogeneous gold-quartz standard, GQS-1, was prepared from a heterogeneous gold-bearing quartz by chemical treatment. The concentration of gold in GQS-1 was determined by both instrumental neutron activation analysis and radioisotope dilution analysis to be 2.61±0.10 parts per million. Analysis of 10 samples of the standard by both instrumental neutron activation analysis and radioisotope dilution analysis failed to reveal heterogeneity within the standard. The precision of the analytical methods, expressed as standard error, was approximately 0.1 part per million. The analytical data were also used to estimate the average size of gold particles. The chemical treatment apparently reduced the average diameter of the gold particles by at least an order of magnitude and increased the concentration of gold grains by a factor of at least 4,000.

  12. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2002-01-01

    This paper introduces the background and development activities for the domestic standardization of a procedure and method for Human Reliability Analysis (HRA), intended to minimize the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA) as far as possible, and reviews the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used in PSAs for domestic NPPs and discuss the subjectivity shown by HRA analysts in performing an HRA. We also introduce the PSA guidelines published in the USA and review the HRA results based on them. Finally, we propose a framework for a standard HRA procedure and method to be developed

  13. The value of Standards and Labelling: an international cost-benefit analysis tool for Standards and Labelling programs with results for Central American countries

    International Nuclear Information System (INIS)

    Buskirk, Robert D. Van; McNeil, Michael A.; Letschert, Virginie E.

    2005-01-01

    We describe a cost-benefit spreadsheet analysis tool that provides an evaluation of the net impacts of an appliance standards and labelling (S&L) program. The tool is designed to provide a rough estimate at very low cost to local analysts, while allowing for a more accurate evaluation when detailed local data are available. The methodology takes a bottom-up engineering approach, beginning with appliance-specific engineering parameters relating efficiency improvement and incremental costs associated with specific design technologies. Efficiency improvement afforded by each potential policy option is combined with local appliance use patterns to estimate average annual energy consumption for each appliance. This information is combined with appliance lifetime data and local energy prices to provide a life cycle cost impact assessment at the household level. In addition to household level impacts, the analysis tool forecasts future appliance sales, in order to calculate potential energy savings, consumer financial impacts and carbon emissions reductions at the national level. In order to demonstrate the features of the policy model employed, this poster presents a regional analysis based on the most recent publicly available appliance data. In particular, a set of developing countries in Central America were chosen as an example. Taken as a whole, the Central American results demonstrate the general level of benefit which could be afforded in these countries. Comparison between the countries reveals the key parameters determining the benefit a given country can expect from a standards program
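The household-level life cycle cost step of such a tool reduces to a discounted sum of purchase price and annual energy cost. The prices, tariffs, lifetimes, and discount rate below are illustrative assumptions, not the paper's Central American inputs.

```python
def life_cycle_cost(price, annual_kwh, tariff, lifetime_yr, discount_rate):
    """Purchase price plus discounted lifetime energy cost (USD)."""
    energy_cost = sum(annual_kwh * tariff / (1 + discount_rate) ** t
                      for t in range(1, lifetime_yr + 1))
    return price + energy_cost

# Baseline vs hypothetical efficient refrigerator design
base = life_cycle_cost(price=300.0, annual_kwh=600, tariff=0.15,
                       lifetime_yr=15, discount_rate=0.05)
eff = life_cycle_cost(price=340.0, annual_kwh=420, tariff=0.15,
                      lifetime_yr=15, discount_rate=0.05)
print(base - eff)   # positive: the efficiency premium pays back over the lifetime
```

A standards program is attractive at the household level precisely when this difference is positive, i.e. the discounted energy savings exceed the incremental purchase cost.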

  14. Data Analysis and Statistics in Middle Grades: An Analysis of Content Standards

    Science.gov (United States)

    Sorto, M. Alejandra

    2011-01-01

    The purpose of the study reported herein was to identify the important aspects of statistical knowledge that students in the middle school grades in the United States are expected to learn, as well as what teachers are expected to teach. A systematic study of 49 states' standards and one set of national standards was used to identify these important…

  15. Determination of 25 elements in biological standard reference materials by neutron activation analysis

    International Nuclear Information System (INIS)

    Guzzi, G.; Pietra, R.; Sabbioni, E.

    1974-12-01

    The Standard and Certified Reference Materials programme of the JRC includes the determination of trace elements in complex biological samples delivered by the U.S. National Bureau of Standards: Bovine Liver (NBS SRM 1577), Orchard Leaves (NBS SRM 1571) and Tomato Leaves. The study was performed using neutron activation analysis. Due to the very low concentration of some elements, radiochemical group or elemental separation procedures were necessary. The paper describes the techniques used to analyse 25 elements. Computer-assisted instrumental neutron activation analysis with high resolution Ge(Li) spectrometry was considerably advantageous in the determination of Na, K, Cl, Mn, Fe, Rb and Co, and in some cases of Ca, Zn, Cs, Sc, and Cr. For low contents of Ca, Mg, Ni and Si, special chemical separation schemes followed by Cerenkov counting have been developed. Two other separation procedures allowing the determination of As, Cd, Ga, Hg, Mo, Cu, Sr, Se, Ba and P have been set up. The first, simplified procedure involves the use of high resolution Ge(Li) detectors; the second, more complete one involves a larger number of shorter measurements performed by simpler and more sensitive techniques, such as NaI(Tl) scintillation spectrometry and Cerenkov counting. The results obtained are presented and discussed

  16. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
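Of the computerized strategies mentioned, string search is the simplest to sketch: count occurrences of hedging phrases in report text. The term lexicon and the report below are hypothetical illustrations, not the author's standardized vocabulary.

```python
import re
from collections import Counter

# Hypothetical lexicon of hedging phrases one might search for in report text
UNCERTAINTY_TERMS = ["possible", "probable", "cannot exclude", "suspicious for",
                     "likely", "may represent", "equivocal"]

def uncertainty_profile(report: str) -> Counter:
    """Count occurrences of each uncertainty phrase (case-insensitive string search)."""
    text = report.lower()
    return Counter({t: len(re.findall(re.escape(t), text)) for t in UNCERTAINTY_TERMS})

report = ("Findings are suspicious for pneumonia. A small effusion is possible; "
          "cannot exclude early abscess. Follow-up is likely warranted.")
profile = uncertainty_profile(report)
print(sum(profile.values()))  # a crude per-report uncertainty score
```

Such counts could then feed the histogram analysis or outcome correlation the abstract envisions; more robust pipelines would use NLP rather than raw substring matching.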

  17. Agreement between gastrointestinal panel testing and standard microbiology methods for detecting pathogens in suspected infectious gastroenteritis: Test evaluation and meta-analysis in the absence of a reference standard.

    Science.gov (United States)

    Freeman, Karoline; Tsertsvadze, Alexander; Taylor-Phillips, Sian; McCarthy, Noel; Mistry, Hema; Manuel, Rohini; Mason, James

    2017-01-01

    Multiplex gastrointestinal pathogen panel (GPP) tests simultaneously identify bacterial, viral and parasitic pathogens from the stool samples of patients with suspected infectious gastroenteritis presenting in hospital or the community. We undertook a systematic review to compare the accuracy of GPP tests with standard microbiology techniques. Searches in Medline, Embase, Web of Science and the Cochrane library were undertaken from inception to January 2016. Eligible studies compared GPP tests with standard microbiology techniques in patients with suspected gastroenteritis. Quality assessment of included studies used tailored QUADAS-2. In the absence of a reference standard we analysed test performance taking GPP tests and standard microbiology techniques in turn as the benchmark test, using random effects meta-analysis of proportions. No study provided an adequate reference standard with which to compare the test accuracy of GPP and conventional tests. Ten studies informed a meta-analysis of positive and negative agreement. Positive agreement across all pathogens was 0.93 (95% CI 0.90 to 0.96) when conventional methods were the benchmark and 0.68 (95% CI: 0.58 to 0.77) when GPP provided the benchmark. Negative agreement was high in both instances due to the high proportion of negative cases. GPP testing produced a greater number of pathogen-positive findings than conventional testing. It is unclear whether these additional 'positives' are clinically important. GPP testing has the potential to simplify testing and accelerate reporting when compared to conventional microbiology methods. However the impact of GPP testing upon the management, treatment and outcome of patients is poorly understood and further studies are needed to evaluate the health economic impact of GPP testing compared with standard methods. The review protocol is registered with PROSPERO as CRD42016033320.
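    As a hedged sketch of the agreement measures used above (the 2×2 counts below are invented, chosen only to echo the reported 0.93 positive agreement): when neither test is a reference standard, positive and negative percent agreement are computed against whichever test is taken as the benchmark.

```python
def agreement(both_pos, bench_pos_only, other_pos_only, both_neg):
    """Positive/negative percent agreement with a chosen benchmark test.

    both_pos       -- both tests positive
    bench_pos_only -- benchmark positive, other test negative
    other_pos_only -- benchmark negative, other test positive
    both_neg       -- both tests negative
    """
    ppa = both_pos / (both_pos + bench_pos_only)  # positive agreement
    npa = both_neg / (both_neg + other_pos_only)  # negative agreement
    return ppa, npa

# Illustrative counts with conventional microbiology as the benchmark:
ppa, npa = agreement(both_pos=93, bench_pos_only=7,
                     other_pos_only=40, both_neg=860)
```

Swapping the roles of the two tests (GPP as benchmark) changes which off-diagonal cell enters each denominator, which is why the paper reports two different positive-agreement figures.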

  18. Toward the Standardization of Biochar Analysis: The COST Action TD1107 Interlaboratory Comparison.

    Science.gov (United States)

    Bachmann, Hans Jörg; Bucheli, Thomas D; Dieguez-Alonso, Alba; Fabbri, Daniele; Knicker, Heike; Schmidt, Hans-Peter; Ulbricht, Axel; Becker, Roland; Buscaroli, Alessandro; Buerge, Diane; Cross, Andrew; Dickinson, Dane; Enders, Akio; Esteves, Valdemar I; Evangelou, Michael W H; Fellet, Guido; Friedrich, Kevin; Gasco Guerrero, Gabriel; Glaser, Bruno; Hanke, Ulrich M; Hanley, Kelly; Hilber, Isabel; Kalderis, Dimitrios; Leifeld, Jens; Masek, Ondrej; Mumme, Jan; Carmona, Marina Paneque; Calvelo Pereira, Roberto; Rees, Frederic; Rombolà, Alessandro G; de la Rosa, José Maria; Sakrabani, Ruben; Sohi, Saran; Soja, Gerhard; Valagussa, Massimo; Verheijen, Frank; Zehetner, Franz

    2016-01-20

    Biochar produced by pyrolysis of organic residues is increasingly used for soil amendment and many other applications. However, analytical methods for its physical and chemical characterization are as yet far from being specifically adapted, optimized, and standardized. Therefore, COST Action TD1107 conducted an interlaboratory comparison in which 22 laboratories from 12 countries analyzed three different types of biochar for 38 physical-chemical parameters (macro- and microelements, heavy metals, polycyclic aromatic hydrocarbons, pH, electrical conductivity, and specific surface area) with their preferred methods. The data were evaluated in detail using professional interlaboratory testing software. Whereas intralaboratory repeatability was generally good or at least acceptable, interlaboratory reproducibility mostly was not (20% < mean reproducibility standard deviation < 460%). This paper contributes to better comparability of already published biochar data and provides recommendations to improve and harmonize specific methods for biochar analysis in the future.
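    A simplified sketch of the repeatability/reproducibility distinction reported above (the replicate values are invented; a full ISO 5725-style analysis combines within- and between-lab variance components rather than using the SD of lab means alone):

```python
import statistics

def repeatability_sd(lab_replicates):
    """Pooled within-lab standard deviation (repeatability)."""
    variances = [statistics.variance(reps) for reps in lab_replicates]
    return statistics.mean(variances) ** 0.5

def reproducibility_sd(lab_replicates):
    """SD of the lab means -- a simplified between-lab spread."""
    means = [statistics.mean(reps) for reps in lab_replicates]
    return statistics.stdev(means)

# Three hypothetical labs, three replicates each, one parameter:
labs = [[10.0, 10.2, 9.8], [14.0, 14.1, 13.9], [7.0, 7.2, 6.8]]
within = repeatability_sd(labs)   # small: each lab is internally consistent
between = reproducibility_sd(labs)  # large: labs disagree with each other
```

The pattern in this toy data mirrors the paper's finding: good intralaboratory repeatability can coexist with poor interlaboratory reproducibility.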

  19. A Content Analysis of Immigration in Traditional, New, and Non-Gateway State Standards for U.S. History and Civics

    Science.gov (United States)

    Hilburn, Jeremy; Journell, Wayne; Buchanan, Lisa Brown

    2016-01-01

    In this content analysis of state U.S. History and Civics standards, we compared the treatment of immigration across three types of states with differing immigration demographics. Analyzing standards from 18 states from a critical race methodology perspective, our findings indicated three sets of tensions: a unified American story versus local…

  20. Instrumental charged-particle activation analysis of several selected elements in biological materials using the internal standard method

    International Nuclear Information System (INIS)

    Yagi, M.; Masumoto, K.

    1987-01-01

    In order to study instrumental charged-particle activation analysis using the internal standard method, simultaneous determinations of several selected elements, such as Ca, Ti, V, Fe, Zn, As, Sr, Zr and Mo, in oyster tissue, brewer's yeast and mussel were carried out using the respective (p, n) reactions and a personal computer-based gamma-ray spectrometer equipped with a micro-robot for sample changing. In the determination, constant amounts of Y and La were added to the sample and the comparative standard as exotic internal standards. As a result, it was demonstrated that concentrations of the above elements could be determined accurately and precisely. (author)
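    The internal-standard principle described above can be sketched as follows (the element choice and peak areas below are invented for illustration): each analyte's gamma-ray peak area is normalized to the peak area of the added internal standard (Y or La), cancelling beam-current and geometry differences between sample and comparator standard.

```python
def conc_internal_standard(a_x_sample, a_is_sample,
                           a_x_std, a_is_std, c_x_std):
    """Analyte concentration in the sample relative to a comparator
    standard, assuming equal internal-standard additions to both.

    a_*  -- measured gamma-ray peak areas (counts)
    c_x_std -- known analyte concentration in the comparator standard
    """
    ratio_sample = a_x_sample / a_is_sample  # analyte/IS ratio in sample
    ratio_std = a_x_std / a_is_std           # analyte/IS ratio in standard
    return ratio_sample / ratio_std * c_x_std

# Illustrative numbers only (e.g. Zn normalized to the Y internal standard):
c_zn = conc_internal_standard(a_x_sample=5200, a_is_sample=10400,
                              a_x_std=4800, a_is_std=9600, c_x_std=120.0)
```

Because only ratios enter the result, fluctuations that scale both peak areas equally (beam intensity, counting geometry) drop out.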

  1. [Implementation of the National Expert Standard Prophylaxis of Pressure Ulcers in nurse practise - a cost-benefit analysis].

    Science.gov (United States)

    Wolke, R; Hennings, D; Scheu, P

    2007-06-01

    By developing evidence-based national Expert Standards, agreed upon by an association of nursing professionals, German care science participates in the international discussion. Up to now, five national Expert Standards on relevant care-related topics have been developed and widely implemented in care practice. However, thorough evaluations of these Expert Standards, especially from an economic perspective, are still lacking. The following paper addresses this topic by performing a cost-benefit analysis of the National Expert Standard Prophylaxis of Pressure Ulcers. The authors demonstrate which costs the implementation of this National Expert Standard causes for a residential care agency providing services. The benefit of implementing the Expert Standard is then compared with its cost over a period of three years. The evaluation concludes that, in consideration of opportunity costs, the introduction of the National Expert Standard Prophylaxis of Pressure Ulcers appears economically viable for the residential care agency only if the rate of pressure ulcers in the reference agency can be lowered by at least 26.48%. In this case, when exclusively considering direct benefits and direct costs, a positive impact of the implementation will be achieved.
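    A minimal sketch of the break-even logic (all monetary figures below are hypothetical, chosen only so that the threshold reproduces the 26.48% reduction reported above): implementation pays off once the avoided pressure-ulcer costs over the period exceed the implementation and opportunity costs.

```python
def required_reduction(impl_cost, cost_per_ulcer, baseline_ulcers):
    """Minimum relative reduction in pressure-ulcer incidence needed
    for avoided treatment costs to cover implementation costs."""
    return impl_cost / (cost_per_ulcer * baseline_ulcers)

# Hypothetical agency: 3-year implementation cost 52,960, average cost
# per pressure ulcer 4,000, and 50 expected ulcers over the 3 years.
r = required_reduction(impl_cost=52_960, cost_per_ulcer=4_000,
                       baseline_ulcers=50)
# r is the break-even reduction rate (here ~0.2648, i.e. 26.48%)
```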

  2. Improvement of a precision spectrophotometry method with internal standardization and its use in plutonium solution analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometry method with internal standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric analysis procedure were improved to reduce the random component of the method's relative error. The influence of U and Np impurities and of corrosion products on the systematic component of the error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are studied

  3. Standard deviation analysis of the mastoid fossa temperature differential reading: a potential model for objective chiropractic assessment.

    Science.gov (United States)

    Hart, John

    2011-03-01

    This study describes a model for statistically analyzing follow-up numeric-based chiropractic spinal assessments for an individual patient based on his or her own baseline. Ten mastoid fossa temperature differential readings (MFTD) obtained from a chiropractic patient were used in the study. The first eight readings served as baseline and were compared to post-adjustment readings. One of the two post-adjustment MFTD readings fell outside two standard deviations of the baseline mean and therefore theoretically represents improvement according to pattern analysis theory. This study showed how standard deviation analysis may be used to identify future outliers for an individual patient based on his or her own baseline data. Copyright © 2011 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.
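    The pattern-analysis rule described above can be sketched in a few lines (the readings below are invented): a follow-up value is flagged when it falls outside two standard deviations of the patient's own baseline.

```python
import statistics

def outside_two_sd(baseline, reading):
    """Flag a reading outside mean +/- 2 SD of the baseline series."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)  # sample SD of the baseline
    return abs(reading - mean) > 2 * sd

baseline = [0.4, 0.5, 0.3, 0.45, 0.5, 0.35, 0.4, 0.55]  # eight MFTD readings
post = [0.42, 1.1]                                       # two post-adjustment readings
flags = [outside_two_sd(baseline, r) for r in post]
# Only the second post-adjustment reading is an outlier here.
```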

  4. Quality assessment of Chrysanthemum indicum Flower by simultaneous quantification of six major ingredients using a single reference standard combined with HPLC fingerprint analysis

    Directory of Open Access Journals (Sweden)

    Jiao He

    2016-04-01

    Chrysanthemum indicum Flower is usually consumed as a functional food. This paper describes an improved total quality assessment method for Chrysanthemum indicum Flower: simultaneous quantitation using a single standard to determine multiple components, combined with high performance liquid chromatography fingerprint analysis. Six main components of Chrysanthemum indicum Flower, including two flavonoids and four phenolic acids, were simultaneously quantified using linarin as the internal reference standard. The method was fully validated with respect to linearity, precision, accuracy, robustness and stability. The validated method was successfully applied to the analysis of thirty-three batches of Chrysanthemum indicum Flower samples. Under the same chromatographic conditions, fingerprint analysis in combination with similarity analysis and principal component analysis was performed to identify the samples from different regions. In general, this effective assessment, using a single standard to determine multiple components combined with fingerprint analysis, makes reliable qualitative and quantitative analysis of Chrysanthemum indicum Flower available.

  5. Uranium Hydrogeochemical and Stream Sediment Reconnaissance data from the area of the Teller, Bendeleben, Candle, and Kateel River Quadrangles, Seward Peninsula and vicinity, Alaska

    International Nuclear Information System (INIS)

    Sharp, R.R. Jr.; Hill, D.E.

    1978-05-01

    During July-August 1976, 2026 natural waters and 2085 bottom sediments were collected from 2209 sample locations (at a nominal density of one location per 23 km²) on streams and small lakes throughout the Teller, Bendeleben, Candle, and western one-third of the Kateel River NTMS quadrangles, Alaska. Total uranium was measured in the waters by fluorometry and in the sediments and a few waters by delayed-neutron counting. The uranium content of the waters ranged from below the detection limit of 0.02 parts per billion (ppB) to a high of 14.50 ppB, averaging 0.44 ppB, and that of the sediments ranged from a low of 0.2 parts per million (ppM) to a high of 107.4 ppM, averaging 3.93 ppM. The uranium data for water and sediment are presented separately, as computer listings that include pertinent field measurements from each location, as graphically portrayed concentration overlays at 1:250,000 scale for each quadrangle, and as reduced figures showing contours drawn at various concentration levels for each quadrangle, and their areal distributions are compared and correlated with the known features and uranium showings. A test of increasingly detailed methods of data evaluation shows that the more extensive the evaluation, the more useful the reconnaissance uranium data are likely to be. The validity and potential usefulness of the HSSR uranium data are conclusively substantiated by the fact that evidence of all 23 of the reported uranium showings in the 50,000 km² study area can be discerned. Several new locations of interest for further field investigation are identified in each of the quadrangles, most notably in the Bendeleben Mountains. However, the data presented would appear equally useful in guiding field investigation around the uranium occurrences already known, as noteworthy samples often come from close by but on tributary drainages adjacent, opposite, or above them.

  6. Problems concerning product quality enhancement

    Directory of Open Access Journals (Sweden)

    Marek Krynke

    2016-03-01

    In the article, an analysis of discrepancies in the production process for selected products in a candle-manufacturing company was carried out. Using the Pareto-Lorenz diagram and the FMEA method, the areas with the greatest influence on candle production were identified. Apart from factors connected with the manufacturing side of the process, factors of labour organization and requirements concerning material quality were also noted. Appropriate quality of equipment constitutes one of the essential conditions for the functioning of the production process, and this directly influences the manufacturing capabilities of the enterprise. A synthesis of immaterial factors that influence the production of the enterprise, taking into consideration the conditions under which the production system functions, was also carried out. The set of factors selected for description was based on the fourteenth Toyota management principle. Respondents were asked to provide answers which could bring the best improvements.

  7. Technical support document: Energy conservation standards for consumer products: Dishwashers, clothes washers, and clothes dryers including: Environmental impacts; regulatory impact analysis

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-01

    The Energy Policy and Conservation Act as amended (P.L. 94-163) establishes energy conservation standards for 12 of the 13 types of consumer products specifically covered by the Act. The legislation requires the Department of Energy (DOE) to consider new or amended standards for these and other types of products at specified times. This Technical Support Document presents the methodology, data and results from the analysis of the energy and economic impacts of standards on dishwashers, clothes washers, and clothes dryers. The economic impact analysis is performed in several major areas: An Engineering Analysis, which establishes technical feasibility and product attributes, including costs of design options to improve appliance efficiency. A Consumer Analysis at two levels: national aggregate impacts, and impacts on individuals. The national aggregate impacts include forecasts of appliance sales, efficiencies, energy use, and consumer expenditures. The individual impacts are analyzed by Life-Cycle Cost (LCC), Payback Period, and Cost of Conserved Energy (CCE), which evaluate the savings in operating expenses relative to increases in purchase price. A Manufacturer Analysis, which provides an estimate of manufacturers' response to the proposed standards; their response is quantified by changes in several measures of financial performance for a firm. An Industry Impact Analysis, which shows financial and competitive impacts on the appliance industry. A Utility Analysis, which measures the impacts of the altered energy-consumption patterns on electric utilities. An Environmental Effects Analysis, which estimates changes in emissions of carbon dioxide, sulfur oxides, and nitrogen oxides due to reduced energy consumption in the home and at the power plant. A Regulatory Impact Analysis, which collects the results of all the analyses into the net benefits and costs from a national perspective. 47 figs., 171 tabs. (JF)
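    The three consumer-level measures named above (LCC, Payback Period, CCE) can be sketched as follows; the appliance figures are invented, and real DOE analyses apply more detailed discounting and usage assumptions.

```python
def life_cycle_cost(price, annual_op_cost, life_years, discount_rate):
    """Purchase price plus discounted operating costs over the lifetime."""
    pv = sum(annual_op_cost / (1 + discount_rate) ** t
             for t in range(1, life_years + 1))
    return price + pv

def payback_years(extra_price, annual_savings):
    """Years of operating-cost savings needed to recoup the price increase."""
    return extra_price / annual_savings

def cost_of_conserved_energy(extra_price, annual_kwh_saved,
                             life_years, discount_rate):
    """Annualized extra cost per kWh saved, via a capital recovery factor."""
    crf = discount_rate / (1 - (1 + discount_rate) ** -life_years)
    return extra_price * crf / annual_kwh_saved

# Hypothetical efficient-design option: 50 extra purchase cost,
# 10/year operating savings, 200 kWh/year saved over a 10-year life.
pb = payback_years(extra_price=50.0, annual_savings=10.0)
cce = cost_of_conserved_energy(extra_price=50.0, annual_kwh_saved=200.0,
                               life_years=10, discount_rate=0.05)
```

A design option is typically attractive when its CCE falls below the consumer's price of energy.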

  8. Implementation of a management system in accordance with IAEA GS-R-3 Standard. A gap analysis

    International Nuclear Information System (INIS)

    Dicianu, I.; Oprea, M.

    2009-01-01

    Full text: The design and implementation of an Integrated Management System at SNN SA Headquarters has become necessary, as the CNCAN norms are already under revision to comply with the IAEA GS-R-3 standard. The purpose of this analysis is to draft a project for the transition from a Quality Management System (QMS) to an Integrated Management System (IMS) complying with GS-R-3 requirements. Four steps were identified for developing this project: STEP 1 - Justify the necessity of IMS implementation to meet the commitments of SNN SA Headquarters top management. The requirements for implementing an IMS are analyzed and a comprehensive document is issued to (and possibly discussed with) the SNN General Director in order to obtain top management adherence/commitment to the project implementation. The document will show the strong and weak points which should be considered in developing the project. The references for the project are: - IAEA Safety Standard GS-R-3, 'The Management System for Facilities and Activities'; - ISO 14001:2004, 'Environmental Management Systems - Requirements'; - OHSAS 18001:2007, 'Occupational Health and Safety Management Systems - Requirements'. Also considered are: - IAEA Safety Guide GS-G-3.1, 'Application of the Management System for Facilities and Activities'; - IAEA Draft Safety Guide DS-349, 'Application of the Management System for Nuclear Facilities'; - the CNCAN norms (as they will be revised). STEP 2 - Perform a comparative analysis of the requirements of GS-R-3, ISO 14001 and OHSAS 18001 versus the provisions of the QMS already implemented in SNN. This analysis is shown as a comparative table. STEP 3 - Identify the IMS processes. An overall analysis of the current processes described in the SNN QMS Manual is performed and, based on this, the additional processes that have to be documented for the proper implementation of an IMS are identified

  9. Design of Standards and Labeling programs in Chile: Techno-Economic Analysis for Refrigerators

    Energy Technology Data Exchange (ETDEWEB)

    Letschert, Virginie E. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; McNeil, Michael A. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; Pavon, Mariana [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.; Lutz, Wolfgang F. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division.

    2013-05-01

    Lawrence Berkeley National Laboratory is a global leader in the study of energy efficiency and its effective implementation through government policy. The Energy Analysis and Environmental Impacts Department of LBNL's Environmental Energy Technologies Division provides technical assistance to help federal, state and local government agencies in the United States, and throughout the world, develop long-term strategies, policy, and programs to encourage energy efficiency in all sectors and industries. In the past, LBNL has assisted staff of various countries' government agencies and their contractors by providing methodologies to analyze the cost-effectiveness of regulations and assess overall national impacts of efficiency programs. The paper presents the work done in collaboration with the Ministry of Energy (MoE) in Chile and the Collaborative Labeling and Appliance Standards Program (CLASP) on designing Minimum Energy Performance Standards (MEPS) and extending the current labeling program for refrigerators.

  10. The Standard Model

    International Nuclear Information System (INIS)

    Sutton, Christine

    1994-01-01

    The initial evidence from Fermilab for the long awaited sixth ('top') quark puts another rivet in the already firm structure of today's Standard Model of physics. Analysis of the Fermilab CDF data gives a top mass of 174 GeV with an error of ten per cent either way. This falls within the mass band predicted by the sum total of world Standard Model data and underlines our understanding of physics in terms of six quarks and six leptons. In this specially commissioned overview, physics writer Christine Sutton explains the Standard Model

  11. Potential effects of particulate matter from combustion during services on human health and on works of art in medieval churches in Cyprus

    Energy Technology Data Exchange (ETDEWEB)

    Loupa, Glykeria, E-mail: gloupa@env.duth.g [Laboratory of Atmospheric Pollution and Pollution Control Engineering, Faculty of Engineering, Department of Environmental Engineering, Democritus University of Thrace, P.O. Box 447, Xanthi 67100 (Greece); Karageorgos, Evangelos, E-mail: vkarageo@env.duth.g [Laboratory of Atmospheric Pollution and Pollution Control Engineering, Faculty of Engineering, Department of Environmental Engineering, Democritus University of Thrace, P.O. Box 447, Xanthi 67100 (Greece); Rapsomanikis, Spyridon, E-mail: rapso@env.duth.g [Laboratory of Atmospheric Pollution and Pollution Control Engineering, Faculty of Engineering, Department of Environmental Engineering, Democritus University of Thrace, P.O. Box 447, Xanthi 67100 (Greece)

    2010-09-15

    Indoor and outdoor particulate matter (PM0.3-10) number concentrations were established in two medieval churches in Cyprus. In both churches incense was burnt occasionally during Mass. The highest indoor PM0.5-1 concentrations relative to outdoors (10.7 times higher) were observed in the church in which burning of candles indoors was allowed. Peak indoor black carbon concentration was 6.8 μg m⁻³ when incense was burning and 13.4 μg m⁻³ when the candles were burning (outdoor levels ranged between 0.6 and 1.3 μg m⁻³). Of the water-soluble inorganic components determined in PM10, calcium prevailed in all samples indoors and outdoors, whilst high indoor potassium concentrations were a clear marker of combustion. Indoor sources of PM were clearly identified and their emission strengths were estimated by modeling the results. Estimated indoor PM0.3-10 mass concentrations exceeded air quality standards for human health protection and for the preservation of works of art.

  12. The 3rd ATLAS Domestic Standard Problem for Improvement of Safety Analysis Technology

    International Nuclear Information System (INIS)

    Choi, Ki-Yong; Kang, Kyoung-Ho; Park, Yusun; Kim, Jongrok; Bae, Byoung-Uhn; Choi, Nam-Hyun

    2014-01-01

    The third ATLAS DSP (domestic standard problem exercise) was launched at the end of 2012 in response to the strong need for continuation of the ATLAS DSP. A guillotine break of a main steam line without LOOP at a zero-power condition was selected as the target scenario, and the exercise was successfully completed at the beginning of 2014. In the 3rd ATLAS DSP, comprehensive utilization of the integral effect test data was made by dividing the analysis into three topics: (1) scale-up, in which extrapolation of ATLAS IET data was investigated; (2) 3D analysis, in which the improvement obtainable with 3D modeling was studied; and (3) 1D sensitivity analysis, in which the key phenomena affecting the SLB simulation were identified and a best-modeling guideline was established. Through such DSP exercises, it has been possible to effectively utilize high-quality ATLAS experimental data to enhance thermal-hydraulic understanding and to validate the safety analysis codes. A strong human network and technical expertise sharing among the various nuclear experts are also important outcomes of this program

  13. Standard test methods for chemical and spectrochemical analysis of nuclear-grade silver-indium-cadmium alloys

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    1990-01-01

    1.1 These test methods cover procedures for the chemical and spectrochemical analysis of nuclear grade silver-indium-cadmium (Ag-In-Cd) alloys to determine compliance with specifications. 1.2 The analytical procedures appear in the following order: Silver, Indium, and Cadmium by a Titration Method, Sections 7-15; Trace Impurities by a Carrier-Distillation Spectrochemical Method, Sections 16-22. 1.3 The values stated in SI units are to be regarded as the standard. 1.4 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use. For specific hazard and precautionary statements, see Section 5 and Practices E50. 7.1 This test method is applicable to the determination of silver, indium, and cadmium in alloys of approximately 80 % silver, 15 % indium, and 5 % cadmium used in nuclear reactor control r...

  14. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of the disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for image and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open source library that implements conversion of the data stored in commonly used research formats into the standard DICOM representation. dcmqi source code is distributed under a BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 American Association for Cancer Research.

  15. Canonical integration and analysis of periodic maps using non-standard analysis and Lie methods

    Energy Technology Data Exchange (ETDEWEB)

    Forest, E.; Berz, M.

    1988-06-01

    We describe a method and a way of thinking which is ideally suited for the study of systems represented by canonical integrators. Starting with the continuous description provided by the Hamiltonians, we replace it by a succession of preferably canonical maps. The power series representation of these maps can be extracted with a computer implementation of the tools of non-standard analysis and analyzed by the same tools. For a nearly integrable system, we can define a Floquet ring in a way consistent with our needs. Using the finite-time maps, the Floquet ring is defined only at the locations s_i where one perturbs or observes the phase space. At most, the total number of locations is equal to the total number of steps of our integrator. We can also produce pseudo-Hamiltonians which describe the motion induced by these maps. 15 refs., 1 fig.

  16. Is It Working? Distractor Analysis Results from the Test Of Astronomy STandards (TOAST) Assessment Instrument

    Science.gov (United States)

    Slater, Stephanie

    2009-05-01

    The Test Of Astronomy STandards (TOAST) assessment instrument is a multiple-choice survey tightly aligned to the consensus learning goals stated by the American Astronomical Society Chair's Conference on ASTRO 101, the American Association for the Advancement of Science's Project 2061 Benchmarks, and the National Research Council's National Science Education Standards. Researchers from the Cognition in Astronomy, Physics and Earth sciences Research (CAPER) Team at the University of Wyoming's Science and Math Teaching Center (UWYO SMTC) have been conducting a question-by-question distractor analysis procedure to determine the sensitivity and effectiveness of each item. In brief, the frequency of each possible answer choice, known as a foil or distractor on a multiple-choice test, is determined and compared to the existing literature on the teaching and learning of astronomy. In addition to having acceptable statistical difficulty and discrimination values, a well-functioning assessment item will show students selecting distractors in proportions consistent with known misconceptions and reasoning difficulties. Our distractor analysis suggests that all items are functioning as expected. These results add weight to the validity of the TOAST instrument, which is designed to help instructors and researchers measure the impact of course-length instructional strategies for undergraduate science survey courses with learning goals tightly aligned to the consensus goals of the astronomy education community.
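    A minimal sketch of the distractor-frequency tabulation described above (the responses are invented): for one item, compute the proportion of students selecting each answer choice, which can then be compared against expectations from the misconceptions literature.

```python
from collections import Counter

def distractor_proportions(responses):
    """Proportion of respondents choosing each answer option for one item."""
    counts = Counter(responses)
    n = len(responses)
    return {choice: counts[choice] / n for choice in sorted(counts)}

responses = list("AABBBBCDDD")  # ten students' answers to one item
props = distractor_proportions(responses)
# e.g. a well-functioning item might show the documented misconception
# ('D', say) drawing a substantial share of the incorrect responses.
```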

  17. Standard practice for analysis and interpretation of physics dosimetry results for test reactors

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    This practice describes the methodology, summarized in Annex A1, to be used in the analysis and interpretation of physics-dosimetry results from test reactors. This practice relies on, and ties together, the application of several supporting ASTM standard practices, guides, and methods that are in various stages of completion (see Fig. 1). Supporting subject areas that are discussed include reactor physics calculations, dosimeter selection and analysis, exposure units, and neutron spectrum adjustment methods. This practice is directed towards the development and application of physics-dosimetry-metallurgical data obtained from test reactor irradiation experiments that are performed in support of the operation, licensing, and regulation of LWR nuclear power plants. It specifically addresses the physics-dosimetry aspects of the problem. Procedures related to the analysis, interpretation, and application of both test and power reactor physics-dosimetry-metallurgy results are addressed in Practice E 853, Practice E 560, Matrix E 706(IE), Practice E 185, Matrix E 706(IG), Guide E 900, and Method E 646

  18. Potential Bone to Implant Contact Area of Short Versus Standard Implants: An In Vitro Micro-Computed Tomography Analysis.

    Science.gov (United States)

    Quaranta, Alessandro; DʼIsidoro, Orlando; Bambini, Fabrizio; Putignano, Angelo

    2016-02-01

    To compare the available potential bone-implant contact (PBIC) area of standard and short dental implants by micro-computed tomography (μCT) assessment. Three short implants with different diameters (4.5 × 6 mm, 4.1 × 7 mm, and 4.1 × 6 mm) and 2 standard implants (3.5 × 10 mm and 3.3 × 9 mm) with diverse design and surface features were scanned with μCT. Cross-sectional images were obtained. Image data were manually processed to find the plane that corresponds to the most coronal contact point between the crestal bone and implant. The available PBIC was calculated for each sample. Later on, the cross-sectional slices were processed with 3-dimensional (3D) software, and 3D images of each sample were used for descriptive analysis and to display the microtopography and macrotopography. The wide-diameter short implant (4.5 × 6 mm) showed the highest PBIC value (210.89 mm²), followed by the standard implants (178.07 mm² and 185.37 mm²) and the other short implants (130.70 mm² and 110.70 mm²). Wide-diameter short implants show a surface area comparable with standard implants. Micro-CT analysis is a promising technique to evaluate surface area in dental implants with different macrodesigns, microdesigns, and surface features.

  19. De-standardization of family-life trajectories of young adults: a cross-national comparison using sequence analysis

    NARCIS (Netherlands)

    Elzinga, C.; Liefbroer, A.C.

    2007-01-01

    We introduce a number of new methods based on sequence analysis to test hypotheses on the de-standardization of family-life trajectories in early adulthood, using Fertility and Family Survey data on 19 countries. Across cohorts, family-life trajectories of young adults have not become more

  20. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis

    International Nuclear Information System (INIS)

    2016-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  1. Status analysis of Chinese standards on enclosure equipment and proposed countermeasures

    International Nuclear Information System (INIS)

    Wu Luping

    1998-12-01

    Enclosure equipment, such as glove boxes, tong boxes, etc., is an important kind of equipment for the nuclear industry and nuclear scientific research. The status of the establishment and implementation of the Chinese standards on enclosure equipment is briefly described. Some problems and deficiencies in these standards are pointed out. The ISO standard projects on containment enclosures, as well as their present progress, are introduced. Measures for updating the Chinese standards on enclosure equipment in accordance with the principle of adopting international standards are recommended. Some issues which should be taken into account in adopting the ISO standards on containment enclosures are also discussed

  2. Depth profile analysis of thin TiOxNy films using standard ion beam analysis techniques and HERDA

    International Nuclear Information System (INIS)

    Markwitz, A.; Dytlewski, N.; Cohen, D.

    1999-01-01

    Ion beam assisted deposition is used at Industrial Research to fabricate thin titanium oxynitride (TiOxNy) films with a typical film thickness of 100 nm. At the Institute of Geological and Nuclear Sciences, the thin films are analysed using non-destructive standard ion beam analysis (IBA) techniques. High-resolution titanium depth profiles are measured with RBS using 1.5 MeV 4He+ ions. Non-resonant nuclear reaction analysis (NRA) is performed to determine the amounts of O and N in the deposited films, using the reactions 16O(d,p)17O at 920 keV and 14N(d,α)12C at 1.4 MeV. Combining these nuclear techniques reveals both the stoichiometry and the thickness of the layers. However, when oxygen and nitrogen depth profiles are required to investigate stoichiometric changes in the films, additional nuclear analysis techniques such as heavy ion elastic recoil detection (HERDA) have to be applied. With HERDA, depth profiles of N, O, and Ti are measured simultaneously. In this paper, comparative IBA measurements of TiOxNy films with different compositions are presented and discussed

  3. Matrix effect and correction by standard addition in quantitative liquid chromatographic-mass spectrometric analysis of diarrhetic shellfish poisoning toxins.

    Science.gov (United States)

    Ito, Shinya; Tsukada, Katsuo

    2002-01-11

    An evaluation of the feasibility of liquid chromatography-mass spectrometry (LC-MS) with atmospheric pressure ionization was made for quantitation of four diarrhetic shellfish poisoning toxins, okadaic acid, dinophysistoxin-1, pectenotoxin-6 and yessotoxin, in scallops. When LC-MS was applied to the analysis of scallop extracts, large signal suppressions were observed due to coeluting substances from the column. To compensate for this matrix signal suppression, the standard addition method was applied: first the sample was analyzed, and then the same sample spiked with calibration standards was analyzed. Although this method requires two LC-MS runs per analysis, it was found to correct quantitative errors effectively.
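    The two-run correction described above can be sketched as single-point standard addition, assuming a linear detector response and equal matrix suppression in both runs; the peak areas and spike level below are illustrative, not values from the study.

```python
# Single-point standard addition, as used to correct for matrix ion
# suppression in LC-MS: run the sample, then the same sample spiked with
# a known amount of calibration standard, and back-calculate.
# All numbers below are invented for illustration.

def standard_addition(signal_sample, signal_spiked, added_conc):
    """Estimate analyte concentration from one unspiked and one spiked run.

    Assumes the response stays linear and the matrix suppresses both runs
    equally:  S = k*C  and  S' = k*(C + added_conc).
    """
    if signal_spiked <= signal_sample:
        raise ValueError("spiked signal must exceed the sample signal")
    k = (signal_spiked - signal_sample) / added_conc  # matrix-corrected response factor
    return signal_sample / k

# Example: hypothetical okadaic acid peak areas, 10 ng/g spike.
conc = standard_addition(signal_sample=1500.0, signal_spiked=4500.0, added_conc=10.0)
print(conc)  # 5.0 (ng/g)
```

    Because the response factor k is measured in the same suppressed matrix as the sample itself, the coeluting interferences cancel out of the final concentration.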

  4. Human milk fortifier with high versus standard protein content for promoting growth of preterm infants: A meta-analysis.

    Science.gov (United States)

    Liu, Tian-Tian; Dang, Dan; Lv, Xiao-Ming; Wang, Teng-Fei; Du, Jin-Feng; Wu, Hui

    2015-06-01

    To compare the growth of preterm infants fed standard protein-fortified human milk with that of infants fed human milk containing human milk fortifier (HMF) with a higher-than-standard protein content. Published articles reporting randomized controlled trials and prospective observational intervention studies listed on the PubMed®, Embase®, CINAHL and Cochrane Library databases were searched using the keywords 'fortifier', 'human milk', 'breastfeeding', 'breast milk' and 'human milk fortifier'. The mean difference with 95% confidence intervals was used to compare the effect of HMF with a higher-than-standard protein content on infant growth characteristics. Five studies with 352 infants with birth weight ≤ 1750 g and a gestational age ≤ 34 weeks who were fed human milk were included in this meta-analysis. Infants in the experimental groups, given human milk with the higher-than-standard protein fortifier, achieved significantly greater weight and length at the end of the study, and greater weight gain, length gain, and head circumference gain, compared with control groups fed human milk with the standard HMF. HMF with a higher-than-standard protein content can improve preterm infant growth compared with standard HMF. © The Author(s) 2015.

  5. Economic analysis of passive houses and low-energy houses compared with standard houses

    International Nuclear Information System (INIS)

    Audenaert, A.; Cleyn, S.H. de; Vankerckhove, B.

    2008-01-01

    As the energy used for space heating accounts for 78% of EU15 household delivered energy consumption, significant reductions in energy demand can be achieved by promoting low-energy buildings. Our study investigates three building types: the standard house, the low-energy house and the passive house. As more far-reaching energy-saving measures usually require higher investments, the aim of our study is to perform an economic analysis in order to determine the economic viability of the three building types

  6. An Analysis of the Impact of Federated Search Products on Library Instruction Using the ACRL Standards

    Science.gov (United States)

    Cox, Christopher

    2006-01-01

    Federated search products are becoming more and more prevalent in academic libraries. What are the implications of this phenomenon for instruction librarians? An analysis of federated search products using the "Information Literacy Competency Standards for Higher Education" and a thorough review of the literature offer insight concerning whether…

  7. Update of the NNLO PDFs in the 3-, 4- and 5-flavour schemes

    International Nuclear Information System (INIS)

    Alekhin, Sergey; Bluemlein, Johannes; Moch, Sven-Olaf

    2010-07-01

    We report on an update of the next-to-next-to-leading order (NNLO) ABKM09 parton distribution functions. They are obtained using the combined HERA collider Run I inclusive deep-inelastic scattering (DIS) data, with the partial NNLO corrections to heavy quark electro-production taken into account. The value of the strong coupling constant α_s^NNLO(M_Z) = 0.1147(12) is obtained. The standard candle cross sections for the Tevatron collider and the LHC estimated with the updated PDFs are provided. (orig.)

  8. Antibacterial Efficiency of Benzalkonium Chloride Base Disinfectant According To European Standard 13727, Chemical Analysis and Validation Studies

    OpenAIRE

    Yıldırım, Çinel; Çelenk, Veysel

    2018-01-01

    This study aimed to present the principles of the chemical analyses, antibacterial efficiency testing, and validation procedures for benzalkonium chloride (BAC) based disinfectant, one of the most commonly used biocides. A disinfectant comprising a 20% BAC concentration was used as a prototype product, and the active substance was verified by chemical analysis...

  9. Assessment of hygiene standards and Hazard Analysis Critical Control Points implementation on passenger ships.

    Science.gov (United States)

    Mouchtouri, Varavara; Malissiova, Eleni; Zisis, Panagiotis; Paparizou, Evina; Hadjichristodoulou, Christos

    2013-01-01

    The level of hygiene on ferries can have an impact on travellers' health. The aim of this study was to assess the hygiene standards of ferries in Greece and to investigate whether Hazard Analysis Critical Control Points (HACCP) implementation contributes to the hygiene status, and particularly food safety, aboard passenger ships. Hygiene inspections on 17 ferries in Greece were performed using a standardized inspection form with a 135-point scale. Thirty-four water and 17 food samples were collected and analysed. About 65% (11/17) of the ferries scored >100 points. Ferries with HACCP received higher scores during inspection compared to those without HACCP (p value < 0.05). Of the food samples, only one was found positive for Salmonella spp. Implementation of management systems including HACCP principles can help to raise the level of hygiene aboard passenger ships.

  10. The Influence of Adaptation and Standardization of the Marketing Mix on Performance: a Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Vinícius Andrade Brei

    2011-07-01

    This article analyzes the relationship between strategies of standardization and adaptation of the marketing mix and performance in an international context. We carried out a meta-analysis on a sample of 23 studies published between 1992 and 2010. The sample was analyzed based on measures of the effect size (ES), or the strength of the relation (Wolf, 1986), between standardization/adaptation and performance. The results suggest a relationship of medium strength (ES ranging from .133 to .209). The results support the existence of a positive impact of both marketing mix adaptation and standardization on performance. However, our results suggest that companies should slightly emphasize marketing mix adaptation (ES mean = .168) over standardization (ES mean = .134) when entering a new international market. Results also indicate that, among the adaptation choices, price (ES = .209) should be the first element of the marketing mix to be adapted, followed by promotion (ES = .155), product (ES = .154), and distribution (ES = .141). Finally, we suggest some new research paths, such as the use of quantitative methods to compare degrees of adaptation across different segments, regions, and sectors, among other suggestions.
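    The kind of pooling behind such ES comparisons can be sketched as follows, assuming correlation-type effect sizes combined via Fisher's r-to-z transform (a common choice in this tradition, cf. Wolf, 1986); the per-study sample sizes below are invented for illustration, while the r values echo the ES figures quoted in the abstract.

```python
import math

def mean_effect_size(r_values, n_values):
    """Sample-size-weighted mean correlation via Fisher's r-to-z transform:
    transform each r to z, average with weights n - 3 (inverse variance of z),
    then back-transform the pooled z to r."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in r_values]
    weights = [n - 3 for n in n_values]           # Var(z) = 1 / (n - 3)
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return (math.exp(2 * z_bar) - 1) / (math.exp(2 * z_bar) + 1)

# Hypothetical study-level effect sizes; the sample sizes are invented.
adaptation = mean_effect_size([0.209, 0.155, 0.154, 0.141], [120, 80, 150, 100])
standardization = mean_effect_size([0.134, 0.134], [90, 110])
print(round(adaptation, 3), round(standardization, 3))
```

    For small r the z transform changes little, but it keeps the averaging unbiased when effect sizes from studies of very different sizes are pooled.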

  11. An analysis of combined standard uncertainty for radiochemical measurements of environmental samples

    International Nuclear Information System (INIS)

    Berne, A.

    1996-01-01

    It is anticipated that future data acquisitions intended for use in radiological risk assessments will require the incorporation of uncertainty analysis. Often, only one aliquot of the sample is taken and a single determination is made. Under these circumstances, the total uncertainty is calculated using the "propagation of errors" approach. However, there is no agreement in the radioanalytical community as to the exact equations to use. The Quality Assurance/Metrology Division of the Environmental Measurements Laboratory has developed a systematic process to compute uncertainties in constituent components of the analytical procedure, as well as the combined standard uncertainty (CSU). The equations for computation are presented here, with examples of their use. They have also been incorporated into a code for use in the spreadsheet application QuattroPro™. Using the spreadsheet with appropriate inputs permits an analysis of the variations in the CSU as a function of several different variables. The relative importance of the "counting uncertainty" can also be ascertained
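    A minimal propagation-of-errors sketch for a single determination, using the generic quadrature form for independent relative uncertainty components; the component names and values below are invented, and the laboratory's exact equations may well differ.

```python
import math

def combined_standard_uncertainty(value, components):
    """Combine independent relative standard uncertainties in quadrature
    (the generic "propagation of errors" form) and scale by the result value.
    `components` maps component name -> relative standard uncertainty."""
    rel_csu = math.sqrt(sum(u ** 2 for u in components.values()))
    return rel_csu * value

# Illustrative single-determination gamma measurement (all numbers invented):
activity = 12.4  # Bq/kg
components = {
    "counting": 0.050,     # Poisson counting statistics
    "efficiency": 0.030,   # detector calibration
    "yield": 0.020,        # chemical recovery
    "aliquot_mass": 0.005,
}
u_c = combined_standard_uncertainty(activity, components)
print(round(u_c, 3))  # 0.767 (Bq/kg)
```

    Because the components add in quadrature, the largest one dominates: here the counting term alone accounts for most of the CSU, which is exactly the kind of sensitivity question the spreadsheet analysis above addresses.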

  12. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  13. Standardizing economic analysis in prevention will require substantial effort.

    Science.gov (United States)

    Guyll, Max

    2014-12-01

    It is exceedingly difficult to compare results of economic analyses across studies due to variations in assumptions, methodology, and outcome measures, a fact which surely decreases the impact and usefulness of prevention-related economic research. Therefore, Crowley et al. (Prevention Science, 2013) are precisely correct in their call for increased standardization and have usefully highlighted the issues that must be addressed. However, having made the need clear, the questions become what form the solution should take and how it should be implemented. The present discussion outlines the rudiments of a comprehensive framework for promoting standardized methodology in the estimation of economic outcomes, as encouraged by Crowley et al. In short, a single, standard, reference-case approach should be clearly articulated, all economic research should be encouraged to apply that standard approach, and results from compliant analyses should be reported in a central archive. Properly done, this process would increase the ability of those without specialized training to contribute to the body of economic research pertaining to prevention, and the most difficult tasks of predicting and monetizing distal outcomes would be readily completed through predetermined models. These recommendations might be viewed as somewhat forcible, insomuch as they advocate prescribing the details of a standard methodology and establishing a means of verifying compliance. However, it is unclear that the best practices proposed by Crowley et al. will be widely adopted in the absence of a strong and determined approach.

  14. Quality of Standard Reference Materials for Short Time Activation Analysis

    International Nuclear Information System (INIS)

    Ismail, S.S.; Oberleitner, W.

    2003-01-01

    Some environmental reference materials (CFA-1633b, IAEA-SL-1, SARM-1, BCR-176, Coal-1635, IAEA-SL-3, BCR-146, and SARM-5) were analysed by short-time activation analysis. The results show that these materials can be classified into three groups according to their activities after irradiation. The obtained results were compared in order to create a quality index for the determination of short-lived nuclides at high count rates. It was found that CFA is not a suitable standard for determining very short-lived nuclides (half-lives < 1 min) because the activity it produces is 15-fold higher than that of SL-3. Biological reference materials, such as SRM-1571, SRM-1573, SRM-1575, SRM-1577, IAEA-392, and IAEA-393, were also investigated with a higher counting efficiency system. The quality of this system and its well-type detector for investigating short-lived nuclides is discussed

  15. Analysis of a gamma-ray spectrum by using a standard spectrum

    International Nuclear Information System (INIS)

    Tasaka, Kanji

    1975-06-01

    The standard spectrum method has been extended to take into account the energy dependence of the standard spectrum. The method analyses the observed gamma-ray spectrum by least squares, using an interpolated standard spectrum to express the line shape and a linear function for the background continuum. The interpolated standard spectrum is defined for each fitting interval by interpolating several standard spectra, which are derived directly from observed spectra of single photopeaks, each corresponding to incident monochromatic gamma-rays, by subtracting the background and smoothing the data. (author)
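    The fitting step described above can be sketched as an ordinary linear least-squares problem: a scaled standard line shape plus a linear background. In this sketch a synthetic Gaussian stands in for the interpolated standard spectrum, which in the actual method comes from measured monochromatic photopeaks.

```python
import numpy as np

def fit_peak(channels, observed, standard_shape):
    """Least-squares fit of observed = a*standard_shape + b + c*channels,
    i.e. a scaled standard line shape on a linear background continuum.
    Returns the coefficients (a, b, c)."""
    design = np.column_stack([standard_shape, np.ones_like(channels), channels])
    coeffs, *_ = np.linalg.lstsq(design, observed, rcond=None)
    return coeffs

# Synthetic demonstration: a Gaussian stands in for the interpolated
# standard spectrum; true parameters are a=800, b=20, c=0.1.
rng = np.random.default_rng(0)
ch = np.arange(100, dtype=float)
shape = np.exp(-0.5 * ((ch - 50.0) / 4.0) ** 2)
obs = 800.0 * shape + 20.0 + 0.1 * ch + rng.normal(0.0, 2.0, ch.size)
a, b, c = fit_peak(ch, obs, shape)
print(round(a), round(b, 1), round(c, 2))  # close to the true 800, 20.0, 0.1
```

    Because the model is linear in (a, b, c), no iterative fitting is needed; the peak area then follows from a times the known area of the standard shape.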

  16. Cross-Referencing National Standards in Personal Finance for Business Education with National Standards in Personal Finance Education

    Science.gov (United States)

    Gayton, Jorge

    2005-01-01

    The purpose of this study was to determine the extent to which National Standards in Personal Finance for Business Education correlate with National Standards in Personal Finance Education. A content analysis revealed that the National Standards in Personal Finance for Business Education, established by the National Business Education Association…

  17. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for PSA and risk-informed applications. We have performed a project to develop a component reliability DB and to calculate component reliability measures such as failure rate and unavailability. We collected component operation data and failure/repair data for Korean standard NPPs, and analyzed the failure data using a data analysis method developed to suit the domestic data situation. We then compared the reliability results with generic data for foreign NPPs

  18. Bipolar sealer not superior to standard electrocautery in primary total hip arthroplasty: a meta-analysis.

    Science.gov (United States)

    Yang, Yang; Zhang, Li-Chao; Xu, Fei; Li, Jia; Lv, Yong-Ming

    2014-10-10

    To assess whether a bipolar sealer has advantages over standard electrocautery in primary total hip arthroplasty (THA). All studies published through November 2013 were systematically searched in PubMed, Embase, ScienceDirect, The Cochrane Library, and other databases. Relevant journals and conference proceedings were searched manually. Only randomized controlled trials were included. Two independent reviewers identified and assessed the literature. The mean difference in blood loss, and the risk ratios of transfusion rates and of complication rates, in the bipolar sealer group versus the standard electrocautery group were calculated. The meta-analysis was conducted using RevMan 5.1 software. Five studies were included, with a total sample size of 559 patients. The use of a bipolar sealer did not significantly reduce intraoperative blood loss, hemoglobin drop, hospital stay, or operative time. There were no significant differences in need for transfusion or in the incidence of infection between the study groups. The available evidence suggests that the bipolar sealer is not superior to standard electrocautery in patients undergoing primary THA, and its use in primary THA is not recommended.
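    The pooled mean difference such a review computes can be sketched with fixed-effect inverse-variance weighting, one of the standard pooling models in RevMan-style meta-analysis; the per-study blood-loss differences and standard errors below are invented for illustration, not the review's data.

```python
import math

def pooled_mean_difference(studies):
    """Fixed-effect inverse-variance pooling of study-level mean differences.
    `studies` is a list of (mean_difference, standard_error) tuples.
    Returns (pooled MD, 95% CI lower bound, 95% CI upper bound)."""
    weights = [1.0 / se ** 2 for _, se in studies]
    md = sum(w * d for w, (d, _) in zip(weights, studies)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return md, md - 1.96 * se, md + 1.96 * se

# Invented per-study blood-loss differences in mL (sealer minus electrocautery):
studies = [(-30.0, 40.0), (10.0, 25.0), (-15.0, 35.0), (5.0, 30.0), (-20.0, 50.0)]
md, ci_lo, ci_hi = pooled_mean_difference(studies)
print(round(md, 1), round(ci_lo, 1), round(ci_hi, 1))
# A 95% CI that crosses zero mirrors a "no significant difference" finding.
```

    Each study is weighted by the inverse of its variance, so precise trials dominate the pooled estimate while imprecise ones contribute little.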

  19. The tsunami probabilistic risk assessment (PRA). Example of accident sequence analysis of tsunami PRA according to the standard for procedure of tsunami PRA for nuclear power plants

    International Nuclear Information System (INIS)

    Ohara, Norihiro; Hasegawa, Keiko; Kuroiwa, Katsuya

    2013-01-01

    After the Fukushima Daiichi nuclear power plant (NPP) accident, a standard for the procedure of tsunami PRA for NPPs was established by the Standardization Committee of the AESJ. An industry group has been conducting a tsunami PRA for PWRs based on the standard, in cooperation with the electric utilities. This article introduces an overview of the standard and examples of the accident sequence analysis of tsunami PRA studied by the industry group according to the standard. The standard consists of (1) investigation of the NPP's composition, characteristics and site information, (2) selection of components relevant to tsunami PRA and of initiating events, and identification of accident sequences, (3) evaluation of tsunami hazards, (4) fragility evaluation of buildings and components, and (5) evaluation of accident sequences. Based on the evaluation, countermeasures for further improvement of safety against tsunamis can be identified through sensitivity analysis. (T. Tanaka)

  20. The reinterpretation of standard deviation concept

    OpenAIRE

    Ye, Xiaoming

    2017-01-01

    Existing mathematical theory interprets the concept of standard deviation as a degree of dispersion. Therefore, in measurement theory, both the uncertainty concept and the precision concept, which are expressed as a standard deviation or a multiple of it, are also defined as the dispersion of the measurement result, so that the conceptual logic is tangled. Through comparative analysis of the standard deviation concept and a re-interpretation of the measurement error evaluation principle, this paper points o...

  1. Analysis and evaluation of the Electronic Health Record standard in China: a comparison with the American national standard ASTM E 1384.

    Science.gov (United States)

    Xu, Wei; Guan, Zhiyu; Cao, Hongxin; Zhang, Haiyan; Lu, Min; Li, Tiejun

    2011-08-01

    To analyze and evaluate the newly issued Electronic Health Record (EHR) Architecture and Data Standard of China (Chinese EHR Standard) and identify areas of improvement for future revisions. We compared the Chinese EHR Standard with the standard of the American Society for Testing and Materials Standard Practice for Content and Structure of Electronic Health Records in the United States (ASTM E 1384 Standard). The comparison comprised two steps: (1) comparing the conformance of the two standards to the international standard: Health Informatics-Requirements for an Electronic Health Record Architecture (ISO/TS 18308), and showing how the architectures of the two standards satisfy or deviate from the ISO requirements and (2) comparing the detailed data structures between the two standards. Of the 124 requirement items in ISO/TS 18308, the Chinese EHR Standard and the ASTM E 1384 Standard conformed to 77 (62.1%) and 111 (89.5%), respectively. The Chinese EHR Standard conformed to 34 of 50 Structure requirements (68.0%), 22 of 24 Process requirements (91.7%), and 21 of 50 Other requirements (42.0%). The ASTM E 1384 Standard conformed to 49 of 50 Structure requirements (98.0%), 23 of 24 Process requirements (95.8%), and 39 of 50 Other requirements (78.0%). Further development of the Chinese EHR Standard should focus on supporting privacy and security mechanism, diverse data types, more generic and extensible lower level data structures, and relational attributes for data elements. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  2. Complete analysis of a nuclear building to nuclear safety standards

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, T A

    1975-01-01

    The nuclear standards impose on the designer the necessity of examining the loads, stresses, and strains in a nuclear building even under extreme loading conditions, due both to plant malfunctions and to environmental accidents. It is then necessary to generate, combine, and examine a tremendous amount of data; indeed, the lack of symmetry, the general complexity of the structures, and the large number of loading combinations make automated analysis a necessity. A largely automated procedure is presented that solves the problem with a series of computer programs linked together. After the seismic analysis has been performed by the SADE code, these data, together with data from thermal specifications, weights, accident descriptions, etc., are fed into a finite element computer code (SAP4) for analysis. They are processed and combined by a computer code (COMBIN) according to the loading conditions (the usual list in Italy is given and briefly discussed), so that for each point (or each selected zone) under each loading condition the applied loads are listed. These data are fed to another computer code (DTP), which determines the amount of reinforcing bars necessary to accommodate the most severe of the loading conditions. The ACI 318/71 and Italian regulatory procedures are followed; the characteristics of the program are briefly described and discussed. Some particular problems are discussed, e.g. the thermal stresses due to normal and accident conditions; the inelastic behavior of some frame elements (due to concrete cracking) is considered by means of an 'ad hoc' code. Typical examples are presented and the results are discussed, showing a relatively large benefit in considering this inelastic effect.

  3. Determination of trace elements in standard reference materials by the k0-standardization method

    International Nuclear Information System (INIS)

    Smodis, B.; Jacimovic, R.; Stegnar, P.; Jovanovic, S.

    1990-01-01

    The k0-standardization method is suitable for routine multielement determinations by reactor neutron activation analysis (NAA). Investigation of the NIST standard reference materials SRM 1571 Orchard Leaves, SRM 1572 Citrus Leaves, and SRM 1573 Tomato Leaves showed the systematic error of the 12 certified elements determined to be less than 8%. Thirty-four elements were determined in the NIST proposed SRM 1515 Apple Leaves

  4. Development of spatial data guidelines and standards: spatial data set documentation to support hydrologic analysis in the U.S. Geological Survey

    Science.gov (United States)

    Fulton, James L.

    1992-01-01

    Spatial data analysis has become an integral component in many surface and sub-surface hydrologic investigations within the U.S. Geological Survey (USGS). Currently, one of the largest costs in applying spatial data analysis is the cost of developing the needed spatial data. Therefore, guidelines and standards are required for the development of spatial data in order to allow for data sharing and reuse and to eliminate costly redevelopment. In order to attain this goal, the USGS is expanding efforts to identify guidelines and standards for the development of spatial data for hydrologic analysis. Because of the variety of project and database needs, the USGS has concentrated on developing standards for documenting spatial data sets to aid in the assessment of data set quality and the compatibility of different data sets. An interim data set documentation standard (1990) has been developed that provides a mechanism for associating a wide variety of information with a data set, including data about source material, data automation and editing procedures used, projection parameters, data statistics, descriptions of features and feature attributes, information on organizational contacts, lists of operations performed on the data, and free-form comments and notes about the data, made at various times in the evolution of the data set. The interim data set documentation standard has been automated using a commercial geographic information system (GIS) and data set documentation software developed by the USGS. Where possible, USGS-developed software is used to enter data into the data set documentation file automatically. The GIS software closely associates a data set with its data set documentation file; the documentation file is retained with the data set whenever it is modified, copied, or transferred to another computer system.
The Water Resources Division of the USGS is continuing to develop spatial data and data processing standards, with emphasis on standards needed to support

  5. BRAZILIAN AND INTERNATIONAL ACCOUNTING STANDARDS APPLIED TO THE PUBLIC SECTOR AND THE CHALLENGE OF CONVERGENCE: A COMPARATIVE ANALYSIS - IPSAS AND NBCTSP

    Directory of Open Access Journals (Sweden)

    José Francisco Ribeiro Filho (in memoriam

    2012-11-01

    The aim of this study is to analyze the current stage of conceptual convergence between the Brazilian accounting standards applied to the public sector (NBCTSP) and the International Public Sector Accounting Standards (IPSAS). The complexity and range of transactions between public and private sector entities, as a result of market internationalization, demand continuous and dynamic assessment of the events that promote quantitative or qualitative equity changes. For this evaluation process, observing accounting principles and standards is important to guarantee, among other information characteristics, understandability and comparability, thus reducing costs for investors and users in general, in view of the barriers raised by diverse languages, cultures, and tax and economic policies. For the convergence analysis, the standards' contents were subjected to a comparative study, based on a descriptive analysis, with a view to verifying the adherence between the Brazilian and international standards applied to the public sector. The results highlight that different aspects still have to be discussed for an actual convergence with the international standards; the current convergence is partial. The high-quality conceptual exposure of the NBCTSPs is nevertheless observed, while the contents of the IPSAS are more focused on operating procedures

  6. The hidden X-ray breaks in afterglow light curves

    International Nuclear Information System (INIS)

    Curran, P. A.; Wijers, R. A. M. J.; Horst, A. J. van der; Starling, R. L. C.

    2008-01-01

    Gamma-Ray Burst (GRB) afterglow observations in the Swift era have a perceived lack of achromatic jet breaks compared to the BeppoSAX, or pre-Swift, era. Specifically, relatively few breaks consistent with jet breaks are observed in the X-ray light curves of these bursts. If these breaks are truly missing, it has serious consequences for the interpretation of GRB jet collimation and energy requirements, and for the use of GRBs as standard candles. Here we address the issue of X-ray breaks which are possibly 'hidden' and hence the light curves are misinterpreted as being single power laws. We show how a number of precedents, including GRB 990510 and GRB 060206, exist for such hidden breaks and how, even with the well sampled light curves of the Swift era, these breaks may be misidentified. We do so by synthesising X-ray light curves and finding general trends via Monte Carlo analysis. Furthermore, in light of these simulations, we discuss how best to identify achromatic breaks in afterglow light curves via multi-wavelength analysis

  7. A two-point diagnostic for the H II galaxy Hubble diagram

    Science.gov (United States)

    Leaf, Kyle; Melia, Fulvio

    2018-03-01

    A previous analysis of starburst-dominated H II galaxies and H II regions has demonstrated a statistically significant preference for the Friedmann-Robertson-Walker cosmology with zero active mass, known as the Rh = ct universe, over Λcold dark matter (ΛCDM) and its related dark-matter parametrizations. In this paper, we employ a two-point diagnostic with these data to present a complementary statistical comparison of Rh = ct with Planck ΛCDM. Our two-point diagnostic compares, in a pairwise fashion, the difference between the distance modulus measured at two redshifts with that predicted by each cosmology. Our results support the conclusion drawn by a previous comparative analysis demonstrating that Rh = ct is statistically preferred over Planck ΛCDM. But we also find that the reported errors in the H II measurements may not be purely Gaussian, perhaps due to a partial contamination by non-Gaussian systematic effects. The use of H II galaxies and H II regions as standard candles may be improved even further with a better handling of the systematics in these sources.
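    The pairwise two-point diagnostic described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the helper names (`two_point_diagnostic`, `mu_rh_ct`) are made up for the example. Each pair of measurements contributes one statistic: the observed distance-modulus difference minus the model prediction, in units of the combined measurement uncertainty.

    ```python
    import numpy as np

    def two_point_diagnostic(z, mu, mu_err, mu_model):
        """For every redshift pair (i, j), compare the observed distance-modulus
        difference mu[i] - mu[j] with the model prediction, normalised by the
        combined measurement uncertainty.  Returns one statistic per pair."""
        stats = []
        for i in range(len(z)):
            for j in range(i):
                d_obs = mu[i] - mu[j]
                d_mod = mu_model(z[i]) - mu_model(z[j])
                sigma = np.hypot(mu_err[i], mu_err[j])  # quadrature sum
                stats.append((d_obs - d_mod) / sigma)
        return np.array(stats)

    def mu_rh_ct(z, H0=70.0):
        """Distance modulus in the R_h = ct universe, where the luminosity
        distance is d_L = (c/H0) (1 + z) ln(1 + z)."""
        c = 299792.458                             # km/s
        d_l = (c / H0) * (1 + z) * np.log(1 + z)   # in Mpc
        return 5 * np.log10(d_l) + 25              # Mpc -> distance modulus
    ```

    If the data really are drawn from the assumed cosmology with Gaussian errors, the returned statistics should be approximately standard normal; heavy tails would hint at the kind of non-Gaussian systematics discussed above.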

  8. A Comparison of AOP Classification Based on Difficulty, Importance, and Frequency by Cluster Analysis and Standardized Mean

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Jung, Wondea

    2014-01-01

    In Korea, there are plants that have more than one hundred kinds of abnormal operation procedures (AOPs). Operators have therefore started to recognize the importance of classifying the AOPs: they should pay particular attention to those AOPs that require emergency measures against an abnormal status that has a more serious effect on plant safety and/or occurs more often. We previously suggested a measure for prioritizing AOPs for training purposes based on difficulty, importance, and frequency. A DIF analysis, based on how difficult a task is, how important it is, and how frequently it occurs, is a well-known method of assessing performance and prioritizing training needs and planning. In the previous paper we used an SDIF-mean (Standardized DIF-mean) to prioritize AOPs, standardizing the three kinds of data respectively. The results of this research will be utilized not only to understand AOP characteristics at the job-analysis level but also to develop an effective AOP training program. The purpose of this paper is to perform a cluster analysis for AOP classification and to compare its results with those obtained from the standardized mean based on difficulty, importance, and frequency. In this paper, we categorized AOPs into three groups by a cluster analysis based on D, I, and F; clustering is the classification of similar objects into groups such that each group shares some common characteristics. In addition, we compared the result of the cluster analysis in this paper with the classification result by the SDIF-mean in the previous paper. From the comparison, we found that a reevaluation may be required to assign a training interval for the AOPs of group C' in the previous paper, which have lower SDIF-means. The reason is that some of the AOPs of group C' have quite high D and I values while they have the lowest frequencies. From an educational point of view, AOPs in the group which have the highest difficulty and importance, but

  9. A Comparison of AOP Classification Based on Difficulty, Importance, and Frequency by Cluster Analysis and Standardized Mean

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sun Yeong; Jung, Wondea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In Korea, there are plants that have more than one hundred kinds of abnormal operation procedures (AOPs). Operators have therefore started to recognize the importance of classifying the AOPs: they should pay particular attention to those AOPs that require emergency measures against an abnormal status that has a more serious effect on plant safety and/or occurs more often. We previously suggested a measure for prioritizing AOPs for training purposes based on difficulty, importance, and frequency. A DIF analysis, based on how difficult a task is, how important it is, and how frequently it occurs, is a well-known method of assessing performance and prioritizing training needs and planning. In the previous paper we used an SDIF-mean (Standardized DIF-mean) to prioritize AOPs, standardizing the three kinds of data respectively. The results of this research will be utilized not only to understand AOP characteristics at the job-analysis level but also to develop an effective AOP training program. The purpose of this paper is to perform a cluster analysis for AOP classification and to compare its results with those obtained from the standardized mean based on difficulty, importance, and frequency. In this paper, we categorized AOPs into three groups by a cluster analysis based on D, I, and F; clustering is the classification of similar objects into groups such that each group shares some common characteristics. In addition, we compared the result of the cluster analysis in this paper with the classification result by the SDIF-mean in the previous paper. From the comparison, we found that a reevaluation may be required to assign a training interval for the AOPs of group C' in the previous paper, which have lower SDIF-means. The reason is that some of the AOPs of group C' have quite high D and I values while they have the lowest frequencies. From an educational point of view, AOPs in the group which have the highest difficulty and importance, but
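    The SDIF-mean standardization described above can be sketched as follows: each of the three ratings is z-scored across all procedures so their scales are comparable, then averaged per procedure. Variable names are illustrative and any weighting the paper may apply beyond a plain average is not reproduced here.

    ```python
    import numpy as np

    def sdif_mean(difficulty, importance, frequency):
        """Standardize each of D, I and F to mean 0 / variance 1 across all
        procedures, then average the three z-scores per procedure."""
        m = np.column_stack([difficulty, importance, frequency]).astype(float)
        z = (m - m.mean(axis=0)) / m.std(axis=0)   # z-score each column
        return z.mean(axis=1)                      # one SDIF-mean per AOP
    ```

    A procedure that ranks high on all three dimensions gets the highest SDIF-mean, regardless of the raw scales on which the three ratings were collected.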

  10. Solid energy calibration standards for P K-edge XANES: electronic structure analysis of PPh4Br.

    Science.gov (United States)

    Blake, Anastasia V; Wei, Haochuan; Donahue, Courtney M; Lee, Kyounghoon; Keith, Jason M; Daly, Scott R

    2018-03-01

    P K-edge X-ray absorption near-edge structure (XANES) spectroscopy is a powerful method for analyzing the electronic structure of organic and inorganic phosphorus compounds. Like all XANES experiments, P K-edge XANES requires well defined and readily accessible calibration standards for energy referencing so that spectra collected at different beamlines or under different conditions can be compared. This is especially true for ligand K-edge X-ray absorption spectroscopy, which has well established energy calibration standards for Cl (Cs 2 CuCl 4 ) and S (Na 2 S 2 O 3 ·5H 2 O), but not neighboring P. This paper presents a review of common P K-edge XANES energy calibration standards and analysis of PPh 4 Br as a potential alternative. The P K-edge XANES region of commercially available PPh 4 Br revealed a single, highly resolved pre-edge feature with a maximum at 2146.96 eV. PPh 4 Br also showed no evidence of photodecomposition when repeatedly scanned over the course of several days. In contrast, we found that PPh 3 rapidly decomposes under identical conditions. Density functional theory calculations performed on PPh 3 and PPh 4 + revealed large differences in the molecular orbital energies that were ascribed to differences in the phosphorus oxidation state (III versus V) and molecular charge (neutral versus +1). Time-dependent density functional theory calculations corroborated the experimental data and allowed the spectral features to be assigned. The first pre-edge feature in the P K-edge XANES spectrum of PPh 4 Br was assigned to P 1s → P-C π* transitions, whereas those at higher energy were P 1s → P-C σ*. Overall, the analysis suggests that PPh 4 Br is an excellent alternative to other solid energy calibration standards commonly used in P K-edge XANES experiments.

  11. EMPLACEMENT GANTRY ITS STANDARDS IDENTIFICATION STUDY

    International Nuclear Information System (INIS)

    Voegele, M.

    2005-01-01

    To date, the project has established ITS performance requirements for SSCs based on the identification and categorization of event sequences that may result in a radiological release. These performance requirements are defined within the NSDB. Further, SSCs credited with performing safety functions are classified as ITS. In turn, performance confirmation for these SSCs is sought through the use of consensus codes and standards. The purpose of this study is to identify applicable codes and standards for the WP Emplacement Gantry ITS SSCs. Further, this study will form the basis for the selection and the extent of applicability of each code and standard. This study is based on the design development completed for LA only. Accordingly, identification of ITS SSCs beyond those defined within the NSDB is based on designs that may be subject to further development during detail design. Furthermore, several design alternatives may still be under consideration to satisfy certain safety functions, and the final selection will not be made until further design development has occurred. Therefore, for completeness, alternative designs currently under consideration are discussed throughout this study. The results of this study will be subject to evaluation as part of a follow-on GAP analysis study. Based on the results of this study, the GAP analysis will evaluate each code and standard to ensure that each ITS performance requirement is fully satisfied. When a performance requirement is not fully satisfied, a ''gap'' is highlighted. Thereafter, the study will identify supplemental requirements to augment the code or standard to meet the performance requirements. Further, the GAP analysis will identify non-standard areas of the design that will be subject to a Development Plan. Non-standard components and non-standard design configurations are defined as areas of the design that do not follow standard industry practices or codes and standards, whereby performance confirmation cannot be

  12. Validation of the Welch Allyn SureBP (inflation) and StepBP (deflation) algorithms by AAMI standard testing and BHS data analysis.

    Science.gov (United States)

    Alpert, Bruce S

    2011-04-01

    We evaluated two new Welch Allyn automated blood pressure (BP) algorithms. The first, SureBP, estimates BP during cuff inflation; the second, StepBP, does so during deflation. We followed the American National Standards Institute/Association for the Advancement of Medical Instrumentation (ANSI/AAMI) SP10:2006 standard for testing and data analysis. The data were also analyzed using the British Hypertension Society (BHS) analysis strategy. We tested children, adolescents, and adults. The requirements of the ANSI/AAMI SP10:2006 standard were fulfilled with respect to BP levels, arm sizes, and ages, and AAMI SP10 Method 1 data analysis was used. The mean ± standard deviation of the device readings relative to auscultation by paired, trained, blinded observers in the SureBP mode was -2.14±7.44 mmHg for systolic BP (SBP) and -0.55±5.98 mmHg for diastolic BP (DBP). In the StepBP mode, the differences were -3.61±6.30 mmHg for SBP and -2.03±5.30 mmHg for DBP. Both algorithms achieved an A grade for both SBP and DBP by BHS analysis. The SureBP inflation-based algorithm will be available in many new-generation Welch Allyn monitors. Its use will reduce the time it takes to estimate BP in critical patient care circumstances, and the device will not need to inflate to excessive suprasystolic pressures to obtain the SBP values. Deflation is rapid once SBP has been determined, reducing the total time of cuff inflation and patient discomfort. If SureBP fails to obtain a BP value, the StepBP algorithm is activated to estimate BP by traditional deflation methodology.
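    The British Hypertension Society grading referred to above assigns grades from the cumulative percentage of device-observer differences falling within 5, 10, and 15 mmHg. A minimal sketch follows; the thresholds are the commonly cited ones and should be verified against the protocol itself before being relied on.

    ```python
    import numpy as np

    # Commonly cited BHS cumulative-percentage criteria: the fraction of
    # device-observer differences within 5, 10 and 15 mmHg.  Verify against
    # the protocol before relying on these numbers.
    BHS_CRITERIA = {"A": (60, 85, 95), "B": (50, 75, 90), "C": (40, 65, 85)}

    def bhs_grade(device, reference):
        """Return the BHS grade (A best; D means the device fails grade C)."""
        diff = np.abs(np.asarray(device, float) - np.asarray(reference, float))
        pct = [100.0 * np.mean(diff <= t) for t in (5, 10, 15)]
        for grade in ("A", "B", "C"):
            if all(p >= t for p, t in zip(pct, BHS_CRITERIA[grade])):
                return grade
        return "D"
    ```

    All three cumulative percentages must meet a grade's thresholds simultaneously, which is why a device with a small mean error can still grade poorly if its standard deviation is large.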

  13. Analysis of Factors Influencing Fur Quality in Minks of Standard, Pastel, Platinum and White Hedlunda Colour Strains

    OpenAIRE

    Stanisław Socha; Dorota Kołodziejczyk; Ewa Kondraciuk; Danuta Wójcik; Aldona Gontarz

    2010-01-01

    The work aimed at analysing the factors that influence conformation traits, including animal size and fur quality traits, in four colour types of mink: standard, pastel, platinum, and white Hedlunda. The data concern the evaluation of animal conformation traits over a period of three years. The analysis of variance of particular traits indicates a statistically significant effect of the year of birth, colour type, and animal sex on the majority of the analysed traits. Higher means of license eva...

  14. Post flight analysis of NASA standard star trackers recovered from the solar maximum mission

    Science.gov (United States)

    Newman, P.

    1985-01-01

    The flight hardware returned after the Solar Maximum Mission Repair Mission was analyzed to determine the effects of 4 years in space. The NASA Standard Star Tracker was a good candidate for such analysis because it is moderately complex and had a very elaborate calibration during the acceptance procedure. However, the recovery process extensively damaged the cathode of the image-dissector detector, making proper operation of the tracker and a comparison with preflight characteristics impossible. Otherwise, the tracker functioned nominally during testing.

  15. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Directory of Open Access Journals (Sweden)

    Federica Villanova

    Full Text Available Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  16. Integration of lyoplate based flow cytometry and computational analysis for standardized immunological biomarker discovery.

    Science.gov (United States)

    Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R; Nestle, Frank O

    2013-01-01

    Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases.

  17. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording and measuring processes. Monitoring of river water quality aims not only to recognize water quality dynamics but also to evaluate the data in order to create river management and water pollution policies, so as to maintain human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the significant water quality parameters. This research aimed to identify those significant parameters by using two transformation/standardization methods on the water quality data: the river Water Quality Index (WQI; Indeks Kualitas Air Sungai, IKAs) method and standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis (PCA; Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on the water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined with the result from the Excel-based Biplot Add-In software. The results showed that only 18 of the total 35 water quality parameters had passable data quality. The two standardization methods yielded different types and numbers of significant parameters. With the mean-0, variance-1 standardization, the significant water quality parameters were dynamic with respect to the mean concentration of each parameter: TDS, SO4, EC, TSS, NO3N, COD, BOD5, grease and oil, and NH3N. With the river WQI standardization, the water quality significant parameter showed the level of
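    The mean-0, variance-1 standardization feeding the PCA can be sketched as below. This is an illustrative NumPy reconstruction rather than the Scilab code used in the study: each parameter column is z-scored so that parameters measured on different scales contribute comparably, after which the principal components are obtained from an SVD.

    ```python
    import numpy as np

    def standardize(x):
        """Transform each parameter column to mean 0 and variance 1."""
        x = np.asarray(x, float)
        return (x - x.mean(axis=0)) / x.std(axis=0)

    def pca(x, k=2):
        """Principal components of the standardized data via SVD.
        Returns the per-parameter loadings of the first k components and the
        fraction of total variance each component explains."""
        z = standardize(x)
        u, s, vt = np.linalg.svd(z, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        return vt[:k].T, explained[:k]
    ```

    Parameters with large absolute loadings on the leading components are the "significant" ones in the sense used above.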

  18. Preliminary analysis of the efficiency of non-standard divertor configurations in DEMO

    Directory of Open Access Journals (Sweden)

    F. Subba

    2017-08-01

    Full Text Available The standard Single Null (SN) divertor is currently expected to be installed in DEMO. However, a number of alternative configurations are being evaluated in parallel as backup solutions, in case the standard divertor does not extrapolate successfully from ITER to a fusion power plant. We used the SOLPS code to produce a preliminary analysis of two such configurations, the X-Divertor (XD) and the Super X-Divertor (SXD), and compared them to the SN solution. Considering the nominal power flowing into the SOL (P_SOL = 150 MW), we estimated the amplitude of the acceptable DEMO operational space. The acceptability criterion was chosen as a plasma temperature at the target lower than 5 eV, providing low sputtering and at least partial detachment, while the operational space was defined in terms of the electron density at the outboard mid-plane separatrix and of the seeded-impurity (Ar only in the present study) concentration. It was found that both the XD and the SXD extend the DEMO operational space, although the advantages detected so far are not dramatic. The most promising configuration seems to be the XD, which can produce acceptable target temperatures at a moderate outboard mid-plane electron density (n_omp = 4.5×10¹⁹ m⁻³) and Z_eff = 1.3.

  19. Vehicle Codes and Standards: Overview and Gap Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Blake, C.; Buttner, W.; Rivkin, C.

    2010-02-01

    This report identifies gaps in vehicle codes and standards and recommends ways to fill the gaps, focusing on six alternative fuels: biodiesel, natural gas, electricity, ethanol, hydrogen, and propane.

  20. Text localization using standard deviation analysis of structure elements and support vector machines

    Directory of Open Access Journals (Sweden)

    Zagoris Konstantinos

    2011-01-01

    Full Text Available A text localization technique is required to successfully exploit document images such as technical articles and letters. The proposed method detects and extracts text areas from document images. Initially, a connected-components analysis technique detects blocks of foreground objects. Then, a descriptor that consists of a set of suitable document structure elements is extracted from the blocks. This is achieved by incorporating an algorithm called Standard Deviation Analysis of Structure Elements (SDASE), which maximizes the separability between the blocks. Another feature of the SDASE is that its length adapts according to the requirements of the application. Finally, the descriptor of each block is used as input to a trained support vector machine that classifies the block as text or not. The proposed technique is also capable of adjusting to the text structure of the documents. Experimental results on benchmarking databases demonstrate the effectiveness of the proposed method.

  1. Retrospective Analysis of the Benefits and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-06

    This analysis is the first-ever comprehensive assessment of the benefits and impacts of state renewable portfolio standards (RPSs). This joint National Renewable Energy Laboratory-Lawrence Berkeley National Laboratory project provides a retrospective analysis of RPS program benefits and impacts, including greenhouse gas emissions reductions, air pollution emission reductions, water use reductions, gross jobs and economic development impacts, wholesale electricity price reduction impacts, and natural gas price reduction impacts. Wherever possible, benefits and impacts are quantified in monetary terms. The paper will inform state policymakers, RPS program administrators, industry, and others about the costs and benefits of state RPS programs. In particular, the work seeks to inform decision-making surrounding ongoing legislative proposals to scale back, freeze, or expand existing RPS programs, as well as future discussions about increasing RPS targets or otherwise increasing renewable energy associated with Clean Power Plan compliance or other emission-reduction goals.

  2. Multielement analysis of human hair and kidney stones by instrumental neutron activation analysis with the k0-standardization method

    International Nuclear Information System (INIS)

    Abugassa, I.; Sarmani, S.B.; Samat, S.B.

    1999-01-01

    This paper focuses on the evaluation of the k0 method of instrumental neutron activation analysis for biological materials. The method has been applied to multielement analysis of human hair standard reference materials from the IAEA (No. 085 and No. 086) and from the NIES (National Institute for Environmental Studies, No. 5). Hair samples from people resident in different parts of Malaysia, in addition to a sample from Japan, were analyzed. In addition, human kidney stones from members of the Malaysian population were analyzed for minor and trace elements. More than 25 elements have been determined. The samples were irradiated in the rotary rack (Lazy Susan) of the TRIGA Mark II reactor at the Malaysian Institute for Nuclear Technology and Research (MINT). The accuracy of the method was ascertained by analysis of other reference materials, including 1573 (tomato leaves) and 1572 (citrus leaves). In this method, the deviation of the epithermal neutron flux distribution from the 1/E law (i.e. a 1/E^(1+α) distribution), the peak-to-total (P/T) ratio for true-coincidence effects of the γ-ray cascades, and the HPGe detector efficiency were determined and corrected for.
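    For context, the k0 standardization determines element concentrations against a single co-irradiated comparator. A commonly quoted form of the concentration equation is given below; notation varies between authors, so treat this as a sketch rather than the exact expression used in the paper:

    ```latex
    \rho_a \;=\;
    \frac{\left( N_p / (t_m\, S\, D\, C\, W) \right)_a}
         {\left( N_p / (t_m\, S\, D\, C\, w) \right)_m}
    \cdot \frac{1}{k_0}
    \cdot \frac{f + Q_{0,m}(\alpha)}{f + Q_{0,a}(\alpha)}
    \cdot \frac{\varepsilon_{p,m}}{\varepsilon_{p,a}}
    ```

    Here N_p is the net peak area, t_m the counting time, S, D and C the saturation, decay and counting factors, W (w) the sample (comparator) mass, f the thermal-to-epithermal flux ratio, Q_0(α) the resonance-integral-to-cross-section ratio corrected for the 1/E^(1+α) deviation, and ε_p the full-energy peak efficiency. The quantities the abstract says were "determined and corrected for" (α, the P/T ratio, and the detector efficiency) enter directly through the last two factors.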

  3. A system dynamics analysis determining willingness to wait and pay for the implementation of data standards in clinical research.

    Science.gov (United States)

    Cofiel, Luciana; Zammar, Guilherme R; Zaveri, Amrapali J; Shah, Jatin Y; Carvalho, Elias; Nahm, Meredith; Kesselring, Gustavo; Pietrobon, Ricardo

    2010-12-31

    Industry standards provide rigorous descriptions of required data presentation, with the aim of ensuring compatibility across different clinical studies. However despite their crucial importance, these standards are often not used as expected in the development of clinical research. The reasons for this lack of compliance could be related to the high cost and time-intensive nature of the process of data standards implementation. The objective of this study was to evaluate the value of the extra time and cost required for different levels of data standardisation and the likelihood of researchers to comply with these levels. Since we believe that the cost and time necessary for the implementation of data standards can change over time, System Dynamics (SD) analysis was used to investigate how these variables interact and influence the adoption of data standards by clinical researchers. Three levels of data standards implementation were defined through focus group discussion involving four clinical research investigators. Ten Brazilian and eighteen American investigators responded to an online questionnaire which presented possible standards implementation scenarios, with respondents asked to choose one of two options available in each scenario. A random effects ordered probit model was used to estimate the effect of cost and time on investigators' willingness to adhere to data standards. The SD model was used to demonstrate the relationship between degrees of data standardisation and subsequent variation in cost and time required to start the associated study. A preference for low cost and rapid implementation times was observed, with investigators more likely to incur costs than to accept a time delay in project start-up. SD analysis indicated that although initially extra time and cost are necessary for clinical study standardisation, there is a decrease in both over time. Future studies should explore ways of creating mechanisms which decrease the time and cost

  4. A system dynamics analysis determining willingness to wait and pay for the implementation of data standards in clinical research

    Science.gov (United States)

    2010-01-01

    Background Industry standards provide rigorous descriptions of required data presentation, with the aim of ensuring compatibility across different clinical studies. However despite their crucial importance, these standards are often not used as expected in the development of clinical research. The reasons for this lack of compliance could be related to the high cost and time-intensive nature of the process of data standards implementation. The objective of this study was to evaluate the value of the extra time and cost required for different levels of data standardisation and the likelihood of researchers to comply with these levels. Since we believe that the cost and time necessary for the implementation of data standards can change over time, System Dynamics (SD) analysis was used to investigate how these variables interact and influence the adoption of data standards by clinical researchers. Methods Three levels of data standards implementation were defined through focus group discussion involving four clinical research investigators. Ten Brazilian and eighteen American investigators responded to an online questionnaire which presented possible standards implementation scenarios, with respondents asked to choose one of two options available in each scenario. A random effects ordered probit model was used to estimate the effect of cost and time on investigators' willingness to adhere to data standards. The SD model was used to demonstrate the relationship between degrees of data standardisation and subsequent variation in cost and time required to start the associated study. Results A preference for low cost and rapid implementation times was observed, with investigators more likely to incur costs than to accept a time delay in project start-up. SD analysis indicated that although initially extra time and cost are necessary for clinical study standardisation, there is a decrease in both over time. Conclusions Future studies should explore ways of creating

  5. Robotic and endoscopic transaxillary thyroidectomies may be cost prohibitive when compared to standard cervical thyroidectomy: a cost analysis.

    Science.gov (United States)

    Cabot, Jennifer C; Lee, Cho Rok; Brunaud, Laurent; Kleiman, David A; Chung, Woong Youn; Fahey, Thomas J; Zarnegar, Rasa

    2012-12-01

    This study presents a cost analysis of the standard cervical, gasless transaxillary endoscopic, and gasless transaxillary robotic thyroidectomy approaches based on medical costs in the United States. A retrospective review of 140 patients who underwent standard cervical, transaxillary endoscopic, or transaxillary robotic thyroidectomy at 2 tertiary centers was conducted. The cost model included operating room charges, anesthesia fees, consumables cost, equipment depreciation, and maintenance cost. Sensitivity analyses assessed individual cost variables. The mean operative times for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were 121 ± 18.9, 185 ± 26.0, and 166 ± 29.4 minutes, respectively. The total costs for the standard cervical, transaxillary endoscopic, and transaxillary robotic approaches were $9,028 ± $891, $12,505 ± $1,222, and $13,670 ± $1,384, respectively. Both transaxillary approaches were significantly more expensive than the standard cervical technique. Cost equivalence would be reached if transaxillary endoscopic operative time decreased to 111 minutes and transaxillary robotic operative time decreased to 68 minutes. Increasing the case load did not resolve the cost difference. Transaxillary endoscopic and transaxillary robotic thyroidectomies are significantly more expensive than the standard cervical approach. Decreasing operative times reduces this cost difference. The greater expense may be prohibitive in countries with a flat reimbursement schedule. Copyright © 2012 Mosby, Inc. All rights reserved.

  6. ANALYSIS OF THE TEACHERS’ INVOLVEMENT IN THE DISCUSSION OF THE APPLICATION OF THE FEDERAL STATE EDUCATIONAL STANDARDS VIA ONLINE RESOURCES

    Directory of Open Access Journals (Sweden)

    С Н Вачкова

    2017-12-01

    Full Text Available This article presents the results of research into the extent of teachers’ involvement in current problems emerging in educational activities. The paper discusses the concept of involvement, its functions, and scientific approaches to its analysis; suggests an original definition and structure for this concept; and describes the chosen methodology of analysis, the research database, the nature of the sample, and the analysis tools. The basis of the present research was the Internet portal “Public expertise of normative documents in education”. A detailed description of the quantitative results is given: the indicators of teachers’ participation in discussing problems of education in relation to the normative educational documents of the Federal state educational standards of primary, basic, and secondary general education. The results showed the indicators of teachers’ activity and the problems expressed in applying the Federal state educational standards.

  7. No slip gravity

    Science.gov (United States)

    Linder, Eric V.

    2018-03-01

    A subclass of the Horndeski modified gravity theory we call No Slip Gravity has particularly interesting properties: 1) a speed of gravitational wave propagation equal to the speed of light, 2) equality between the effective gravitational coupling strengths to matter and light, Gmatter and Glight, hence no slip between the metric potentials, yet difference from Newton's constant, and 3) suppressed growth to give better agreement with galaxy clustering observations. We explore the characteristics and implications of this theory, and project observational constraints. We also give a simple expression for the ratio of the gravitational wave standard siren distance to the photon standard candle distance, in this theory and others, and enable a direct comparison of modified gravity in structure growth and in gravitational waves, an important crosscheck.

  8. A Prospective Analysis of the Costs, Benefits, and Impacts of U.S. Renewable Portfolio Standards

    Energy Technology Data Exchange (ETDEWEB)

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heeter, Jenny [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Krishnan, Venkat [National Renewable Energy Lab. (NREL), Golden, CO (United States); Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Millstein, Dev [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-12-01

    This report evaluates the future costs, benefits, and other impacts of renewable energy used to meet current state renewable portfolio standards (RPSs). It also examines a future scenario where RPSs are expanded. The analysis examines changes in electric system costs and retail electricity prices, which include all fixed and operating costs, including capital costs for all renewable, non-renewable, and supporting (e.g., transmission and storage) electric sector infrastructure; fossil fuel, uranium, and biomass fuel costs; and plant operations and maintenance expenditures. The analysis evaluates three specific benefits: air pollution, greenhouse gas emissions, and water use. It also analyzes two other impacts: renewable energy workforce and economic development, and natural gas price suppression. This analysis finds that the benefits of renewable energy used to meet RPS policies exceed the costs, even when considering the highest cost and lowest benefit outcomes.

  9. Principal Components Analysis on the spectral Bidirectional Reflectance Distribution Function of ceramic colour standards.

    Science.gov (United States)

    Ferrero, A; Campos, J; Rabal, A M; Pons, A; Hernanz, M L; Corróns, A

    2011-09-26

    The Bidirectional Reflectance Distribution Function (BRDF) is essential to characterize an object's reflectance properties. This function depends both on the various illumination-observation geometries and on the wavelength. As a result, the comprehensive interpretation of the data becomes rather complex. In this work we assess the use of the multivariate analysis technique of Principal Components Analysis (PCA) applied to the experimental BRDF data of a ceramic colour standard. It will be shown that the result may be linked to the various reflection processes occurring on the surface, assuming that the incoming spectral distribution is affected by each one of these processes in a specific manner. Moreover, this procedure facilitates the task of interpolating a series of BRDF measurements obtained for a particular sample. © 2011 Optical Society of America
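The PCA decomposition described above can be sketched with synthetic data standing in for the measured BRDF matrix. The layout (one row per illumination-observation geometry, one column per wavelength) and the interpretation of components as reflection processes are assumptions for illustration, not the paper's exact pipeline:

```python
import numpy as np

# Synthetic stand-in for spectral BRDF measurements of a colour standard:
# rows = illumination-observation geometries, columns = wavelengths.
rng = np.random.default_rng(0)
n_geometries, n_wavelengths = 50, 36
spectra = rng.random((n_geometries, n_wavelengths))

# Center each wavelength channel, then diagonalize via SVD (classic PCA).
mean_spectrum = spectra.mean(axis=0)
X = spectra - mean_spectrum
U, s, Vt = np.linalg.svd(X, full_matrices=False)

explained = s**2 / np.sum(s**2)   # fraction of variance per component
scores = U * s                    # geometry-dependent weights
components = Vt                   # spectral shape of each component

# A rank-k reconstruction approximates the full data set; interpolating
# the scores between measured geometries yields estimated BRDFs at
# unmeasured geometries, as suggested in the abstract.
k = 3
approx = scores[:, :k] @ components[:k] + mean_spectrum
```

With real BRDF data the first few components typically carry most of the variance, which is what makes the geometry interpolation practical.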

  10. Standardization of sample collection, isolation and analysis methods in extracellular vesicle research

    Directory of Open Access Journals (Sweden)

    Kenneth W. Witwer

    2013-05-01

    Full Text Available The emergence of publications on extracellular RNA (exRNA and extracellular vesicles (EV has highlighted the potential of these molecules and vehicles as biomarkers of disease and therapeutic targets. These findings have created a paradigm shift, most prominently in the field of oncology, prompting expanded interest in the field and dedication of funds for EV research. At the same time, understanding of EV subtypes, biogenesis, cargo and mechanisms of shuttling remains incomplete. The techniques that can be harnessed to address the many gaps in our current knowledge were the subject of a special workshop of the International Society for Extracellular Vesicles (ISEV in New York City in October 2012. As part of the “ISEV Research Seminar: Analysis and Function of RNA in Extracellular Vesicles (evRNA”, 6 round-table discussions were held to provide an evidence-based framework for isolation and analysis of EV, purification and analysis of associated RNA molecules, and molecular engineering of EV for therapeutic intervention. This article arises from the discussion of EV isolation and analysis at that meeting. The conclusions of the round table are supplemented with a review of published materials and our experience. Controversies and outstanding questions are identified that may inform future research and funding priorities. While we emphasize the need for standardization of specimen handling, appropriate normative controls, and isolation and analysis techniques to facilitate comparison of results, we also recognize that continual development and evaluation of techniques will be necessary as new knowledge is amassed. On many points, consensus has not yet been achieved and must be built through the reporting of well-controlled experiments.

  11. Synthesised standards in natural matrices

    International Nuclear Information System (INIS)

    Olsen, D.G.

    1980-01-01

    The problem of securing the most reliable standards for the accurate analysis of radionuclides is discussed in the paper and in the comment on the paper. It is contended in the paper that the best standards can be created by quantitative addition of accurately known spiking solutions into carefully selected natural matrices. On the other hand it is argued that many natural materials can be successfully standardized for numerous trace constituents. Both points of view are supported with examples. (U.K.)

  12. The Assessment of a Tutoring Program to Meet CAS Standards Using a SWOT Analysis and Action Plan

    Science.gov (United States)

    Fullmer, Patricia

    2009-01-01

    This article summarizes the use of SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis and subsequent action planning as a tool of self-assessment to meet CAS (Council for the Advancement of Standards in Higher Education) requirements for systematic assessment. The use of the evaluation results to devise improvements to increase the…

  13. Platinum stable isotope analysis of geological standard reference materials by double-spike MC-ICPMS.

    Science.gov (United States)

    Creech, J B; Baker, J A; Handler, M R; Bizzarro, M

    2014-01-10

    We report a method for the chemical purification of Pt from geological materials by ion-exchange chromatography for subsequent Pt stable isotope analysis by multiple-collector inductively coupled plasma mass spectrometry (MC-ICPMS), using a 196Pt-198Pt double-spike to correct for instrumental mass bias. Double-spiking of samples was carried out prior to digestion and chemical separation to correct for any mass-dependent fractionation that may occur due to incomplete recovery of Pt. Samples were digested using a NiS fire assay method, which pre-concentrates Pt into a metallic bead that is readily dissolved in acid in preparation for anion-exchange chemistry. Pt was recovered from anion-exchange resin in concentrated HNO3 acid after elution of matrix elements, including the other platinum group elements (PGE), in dilute HCl and HNO3 acids. The separation method has been calibrated using a precious metal standard solution doped with a range of synthetic matrices and results in Pt yields of ≥90% with purity of ≥95%. Using this chemical separation technique, we have separated Pt from 11 international geological standard reference materials comprising PGE ores, mantle rocks, igneous rocks and one sample from the Cretaceous-Paleogene boundary layer. Pt concentrations in these samples range from ca. 5 ng g⁻¹ to 4 μg g⁻¹. This analytical method has been shown to have an external reproducibility on δ198Pt (permil difference in the 198Pt/194Pt ratio from the IRMM-010 standard) of ±0.040 (2 sd) on Pt solution standards (Creech et al., 2013, J. Anal. At. Spectrom. 28, 853-865). The reproducibility in natural samples is evaluated by processing multiple replicates of four standard reference materials, and is conservatively taken to be ca. ±0.088 (2 sd). Pt stable isotope data for the full set of reference materials have a range of δ198Pt values with offsets of up to 0.4‰ from the IRMM-010 standard, which are readily resolved with this technique. These

  14. Gasoline taxes or efficiency standards? A heterogeneous household demand analysis

    International Nuclear Information System (INIS)

    Liu, Weiwei

    2015-01-01

    Using detailed consumer expenditure survey data and a flexible semiparametric dynamic demand model, this paper estimates the price elasticity and fuel efficiency elasticity of gasoline demand at the household level. The goal is to assess the effectiveness of gasoline taxes and vehicle fuel efficiency standards on fuel consumption. The results reveal substantial interaction between vehicle fuel efficiency and the price elasticity of gasoline demand: the improvement of vehicle fuel efficiency leads to lower price elasticity and weakens consumers’ sensitivity to gasoline price changes. The offsetting effect also differs across households due to demographic heterogeneity. These findings imply that when gasoline taxes are in place, tightening efficiency standards will partially offset the strength of taxes on reducing fuel consumption. - Highlights: • Model household gasoline demand using a semiparametric approach. • Estimate heterogeneous price elasticity and fuel efficiency elasticity. • Assess the effectiveness of gasoline taxes and efficiency standards. • Efficiency standards offset the impact of gasoline taxes on fuel consumption. • The offsetting effect differs by household demographics
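The offsetting effect described above can be illustrated with a deliberately simplified model (not the paper's semiparametric demand system): assume the price elasticity of gasoline demand shrinks as fleet fuel efficiency rises, so a given tax-induced price increase removes fewer gallons from a more efficient fleet. The functional form and all numbers are hypothetical:

```python
# Toy illustration of the efficiency-elasticity interaction.
# Assumption (hypothetical): elasticity scales with base_mpg / mpg,
# i.e. doubling fuel efficiency halves price sensitivity.

def gallons_after_tax(base_gallons, price_increase_pct, mpg,
                      base_mpg=25.0, base_elasticity=-0.30):
    """Percent change in gallons = elasticity * percent change in price."""
    elasticity = base_elasticity * (base_mpg / mpg)
    return base_gallons * (1 + elasticity * price_increase_pct / 100)

# A 10% tax-driven price rise on 1,000 baseline gallons:
low_eff = gallons_after_tax(1000, 10, mpg=25)   # elasticity -0.30
high_eff = gallons_after_tax(1000, 10, mpg=50)  # elasticity -0.15
```

Under these assumed numbers, the more efficient fleet gives up fewer gallons in response to the same tax, which is the sense in which tighter efficiency standards partially offset a gasoline tax.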

  15. Fiscal 1994 research report. Demonstration test for establishing technology for peakload shaving with dispersed small residential PV systems (Evaluation of weatherability of house-use solar light electric power generating system equipment); 1994 nendo jutakuyo taiyoko hatsuden fuka heijunka gijutsu nado kakuritsu jissho shiken ni kansuru kaiseki hyoka no uchi jutakuyo taiyoko hatsuden system kiki no taikosei hyoka

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-03-01

    Weatherability tests, such as exposure to the atmosphere, are conducted to collect and accumulate information on the degradation of house-use photovoltaic equipment expected to come into use in areas with hostile weather conditions. Based on the collected data, analysis and evaluation of corrosion and degradation mechanisms are conducted in order to establish design and evaluation methods for maintaining weatherability. Since fiscal 1994 is the first year of the project, preparations are made for the tests to be conducted. To carry out the research activities, a Weatherability Evaluation Committee for House-Use Solar Light Electric Power Generating System Equipment is organized, which discusses implementation of the project. The shape and structural materials of the exposure rack are discussed; an aluminum alloy with a clear sulfate-alumite coating is selected, and a rack is built. Specifications of a sea salt particle collector are discussed, and preparations are made for collecting and quantifying sea salt particles under the Japanese Industrial Standard (JIS) dry gauze method and the International Organization for Standardization (ISO) wet candle method. Solar cell module initial characteristics are measured. (NEDO)

  16. Photon activation analysis using internal standards: some studies of the analysis of environmental materials

    Energy Technology Data Exchange (ETDEWEB)

    Masumoto, K; Yagi, M

    1986-01-01

    The authors report the application of the internal standard method to the simultaneous determination of trace elements in environmental reference materials. The standard soil material used was IAEA CRM Soil-5. The power plant fly ash reference used was NBS SRM-1633a. Fifteen target elements, including As, Ba and Ce, were determined. Internal standards were supplied by six elements, including Na and Mg. Although there were several interfering elements, their effect could be eliminated by utilizing more than one gamma-ray peak and carrying out appropriate corrections. The values determined for most of the target elements were well within the certified range. Measured concentrations were of the orders of 10 to 1000 ..mu..g/g. 6 references, 2 figures, 5 tables.
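The internal standard method described above normalizes an analyte's gamma-ray peak to that of an element added in known amount, cancelling flux and geometry differences between sample and comparator. The sketch below is a generic internal-standard quantification, not necessarily the authors' exact correction procedure, and all peak areas and masses are illustrative:

```python
# Generic internal-standard quantification sketch.
# Inputs are gamma-ray peak areas (counts) and masses; all values
# passed in the example are hypothetical.

def analyte_mass(peak_sample, is_peak_sample, is_mass_sample,
                 peak_std, is_peak_std, is_mass_std, analyte_mass_std):
    """Mass of analyte in the sample via the internal-standard ratio.

    k is the relative sensitivity of the analyte versus the internal
    standard, calibrated from the comparator standard.
    """
    k = (peak_std / analyte_mass_std) / (is_peak_std / is_mass_std)
    return (peak_sample / (is_peak_sample / is_mass_sample)) / k

# Sanity check: a sample with the same peak ratios as the standard
# must return the standard's analyte mass.
m = analyte_mass(100.0, 50.0, 2.0, 100.0, 50.0, 2.0, 1.5)
```

Interference corrections, as noted in the abstract, would be applied to the peak areas before this ratio step.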

  17. Light's labour's lost - policies for energy-efficient lighting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-06-29

    When William Shakespeare wrote Love's Labour's Lost he would have used light from tallow candles at a cost (today) of 12,000 British pounds per million-lumen hours. The same amount of light from electric lamps now costs only 2 pounds! But today's low-cost illumination still has a dark side. Globally, lighting consumes more electricity than is produced by either hydro or nuclear power and results in CO2 emissions equivalent to two thirds of the world's cars. A standard incandescent lamp may be much more efficient than a tallow candle, but it is far less efficient than a high-pressure sodium lamp. Were inefficient light sources to be replaced by the equivalent efficient ones, global lighting energy demand would be up to 40% less at a lower overall cost. Larger savings still could be realised through the intelligent use of controls, lighting levels and daylight. But achieving efficient lighting is not just a question of technology; it requires policies to transform current practice. This book documents the broad range of policy measures to stimulate efficient lighting that have already been implemented around the world and suggests new ways these could be strengthened to prevent light's labour's from being lost.

  19. Light's labour's lost - policies for energy-efficient lighting

    International Nuclear Information System (INIS)

    2006-01-01

    When William Shakespeare wrote Love's Labour's Lost he would have used light from tallow candles at a cost (today) of 12,000 British pounds per million-lumen hours. The same amount of light from electric lamps now costs only 2 pounds. But today's low-cost illumination still has a dark side. Globally, lighting consumes more electricity than is produced by either hydro or nuclear power and results in CO2 emissions equivalent to two thirds of the world's cars. A standard incandescent lamp may be much more efficient than a tallow candle, but it is far less efficient than a high-pressure sodium lamp. Were inefficient light sources to be replaced by the equivalent efficient ones, global lighting energy demand would be up to 40% less at a lower overall cost. Larger savings still could be realised through the intelligent use of controls, lighting levels and daylight. But achieving efficient lighting is not just a question of technology; it requires policies to transform current practice. This book documents the broad range of policy measures to stimulate efficient lighting that have already been implemented around the world and suggests new ways these could be strengthened to prevent light's labour's from being lost

  20. METHODOLOGY COMPARATIVE EVALUATION OF PROFESSIONAL STANDARDS AND EDUCATION STANDARDS WITH THE USE OF NON-NUMERIC DATA PROCESSING METHODS

    Directory of Open Access Journals (Sweden)

    Gennady V. Abramov

    2016-01-01

    Full Text Available The article discusses the development of a technique that allows a comparative assessment of the requirements of professional standards against federal state educational standards. The results can be used by universities to adjust the learning process, analyzing their curricula for better compliance with professional standards

  1. China's High-technology Standards Development

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    There are several major technology standards, including audio video coding (AVS), automotive electronics, third generation (3G) mobile phones, mobile television, wireless networks and digital terrestrial television broadcasting, that have been released or are currently under development in China. This article offers a detailed analysis of each standard and studies their impact on China's high-technology industry.

  2. 76 FR 75782 - Revising Standards Referenced in the Acetylene Standard

    Science.gov (United States)

    2011-12-05

    ... Determinations A. Legal Considerations B. Final Economic Analysis and Regulatory Flexibility Act Certification C... within the meaning of Section 652(8) when a significant risk of material harm exists in the workplace and the standard would substantially reduce or eliminate that workplace risk. This DFR will not reduce the...

  3. Highly Accreting Quasars at High Redshift

    Science.gov (United States)

    Martínez-Aldama, Mary L.; Del Olmo, Ascensión; Marziani, Paola; Sulentic, Jack W.; Negrete, C. Alenka; Dultzin, Deborah; Perea, Jaime; D'Onofrio, Mauro

    2017-12-01

    We present preliminary results of a spectroscopic analysis for a sample of type 1 highly accreting quasars (L/LEdd > 0.2) at high redshift, z ≈ 2-3. The quasars were observed with the OSIRIS spectrograph on the GTC 10.4 m telescope located at the Observatorio del Roque de los Muchachos in La Palma. The highly accreting quasars were identified using the 4D Eigenvector 1 formalism, which is able to organize type 1 quasars over a broad range of redshift and luminosity. The kinematic and physical properties of the broad line region have been derived by fitting the profiles of strong UV emission lines such as AlIII, SiIII and CIII. The majority of our sources show strong blueshifts in the high-ionization lines and high Eddington ratios, which are related to the production of outflows. The importance of highly accreting quasars goes beyond a detailed understanding of their physics: their extreme Eddington ratio makes them candidate standard candles for cosmological studies.

  4. Towards Uniform Accelerometry Analysis: A Standardization Methodology to Minimize Measurement Bias Due to Systematic Accelerometer Wear-Time Variation

    Directory of Open Access Journals (Sweden)

    Tarun R. Katapally, Nazeem Muhajarine

    2014-06-01

    Full Text Available Accelerometers are predominantly used to objectively measure the entire range of activity intensities – sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within 'valid' data, there could be systematic wear-time variation. To explore this variation, accelerometer data from the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, in which case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time's influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and
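The standardization step described above (controlling observed wear-time to an analyst-specified period, then rescaling activity minutes) can be sketched as follows. The linear scaling and the function shape are simplifying assumptions, not the authors' exact interpolation procedure:

```python
# Hypothetical sketch of wear-time standardization: rescale each
# participant-day's SED/LPA/MVPA minutes from observed wear-time to a
# common analyst-chosen wear-time, so days with different wear become
# comparable. Linear scaling stands in for the paper's interpolation.

def standardize_day(activity_minutes, observed_wear_hours,
                    target_wear_hours=13.0):
    """Rescale a day's activity minutes to the target wear-time."""
    factor = target_wear_hours / observed_wear_hours
    return {name: minutes * factor
            for name, minutes in activity_minutes.items()}

# A day with 11.5 h of wear (480 + 180 + 30 = 690 minutes):
day = {"SED": 480.0, "LPA": 180.0, "MVPA": 30.0}
standardized = standardize_day(day, observed_wear_hours=11.5,
                               target_wear_hours=13.0)
```

After standardization every day totals the same wear-time, so weekday/weekend comparisons are no longer confounded by systematically longer weekday wear.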

  5. Global Value Chains, Labor Organization and Private Social Standards

    DEFF Research Database (Denmark)

    Riisgaard, Lone

    2009-01-01

    This article examines the opportunities and challenges that private social standards pose for labor organizations. It explores different labor responses to private social standards in East African cut flower industries. The analysis incorporates the concept of labor agency in global value chain analysis and reveals how retailer-driven chains offer more room for labor organizations to exercise their agency than the traditional cut flower value chains. Labor organizations have been able to influence social standard setting and implementation, and to use standards to further labor representation at production sites. However, labor organizations' ability to seriously challenge the prevailing governance structure of the cut flower value chain appears extremely limited.

  6. The Analysis of the Psychological Tests Using In Educational Institutions According To the Testing Standards

    Directory of Open Access Journals (Sweden)

    Ezgi MOR DİRLİK

    2017-12-01

    Full Text Available The purpose of this research is to analyze four psychological tests which are frequently used in the Guidance and Research Centers and in the guidance services of schools, according to the standards for educational and psychological testing of the APA (American Psychological Association) and the test adaptation standards of the ITC (International Testing Commission). The tests were selected with goal-oriented sampling from among the psychological tests most frequently used in Guidance and Research Centers and school guidance centers. These tests are: Scale of Academic Self-Concept (Akademik Benlik Kavramı Ölçeği-ABKÖ), Evaluation of Early Childhood Development Tool (Gazi Erken Çocukluk Gelişimi Değerlendirme Aracı-GEÇDA), Primary Mental Abilities 7-11 (TKT 7-11), and Wechsler Intelligence Scale for Children Revised Form (WISC-R). In this research, the chapters related to validity, reliability, and test development and revision of the “Standards for Educational and Psychological Testing” (APA, 1999) and the adaptation standards developed by the ITC were translated into Turkish, and a checklist was created using these documents. The checklist has a short and a long form. The tests were analyzed by the researcher according to the short form of the checklist. In order to examine the reliability of these analyses, the analyses were repeated after three weeks. Data from these analyses were exported to the Statistical Package for the Social Sciences (SPSS 20.0) and descriptive analysis was performed. As a result of this research, the extent to which the psychological tests meet the test standards in the checklist, and the features of the tests that should be improved with respect to validity, reliability, test development and revision, and test adaptation, were determined. In conclusion, the standards analyzed have not been met satisfactorily by ABKÖ and GEÇDA, and according to the analyses of the reliability

  7. Planetary nebulae: understanding the physical and chemical evolution of dying stars.

    Science.gov (United States)

    Weinberger, R; Kerber, F

    1997-05-30

    Planetary nebulae are one of the few classes of celestial objects that are active in every part of the electromagnetic spectrum. These fluorescing and often dusty expanding gaseous envelopes were recently found to be quite complex in their dynamics and morphology, but refined theoretical models can account for these discoveries. Great progress was also made in understanding the mechanisms that shape the nebulae and the spectra of their central stars. In addition, applications for planetary nebulae have been worked out; for example, they have been used as standard candles for long-range distances and as tracers of the enigmatic dark matter.

  8. Numerical analysis of standard and modified osteosynthesis in long bone fractures treatment.

    Science.gov (United States)

    Sisljagić, Vladimir; Jovanović, Savo; Mrcela, Tomislav; Radić, Radivoje; Selthofer, Robert; Mrcela, Milanka

    2010-03-01

    The fundamental problem in osteoporotic fracture treatment is a significant decrease in bone mass and bone tissue density, resulting in decreased firmness and elasticity of osteoporotic bone. Application of standard implants and standard surgical techniques in osteoporotic bone fracture treatment makes it almost impossible to achieve osteosynthesis stable enough for early mobility, verticalization and load. Taking into account the form and size of the contact surface, as well as the distribution of forces between the osteosynthetic materials and the bone tissue, numerical analysis showed the advantages of modified osteosynthesis with bone cement filling in the screw bed. The applied numerical model consisted of three sub-models: a 3D model of solid elements, a 3D cross-section of the contact between the plate and the bone, and part of a 3D cross-section of the screw head and body. We conclude that modified osteosynthesis with bone cement results in weaker strain in the part of the plate above the fracture fissure; more even strain on the screws, plate and bone; more even strain distribution along all the screws' bodies; significantly greater strain in the part of the screw head opposite the fracture fissure; a firm connection of the screw head and neck with the plate hole; and more even bone strain around the screw.

  9. THE LEGAL FRAMEWORK FOR ENSURING THE STANDARDS OF THE LIVING STANDARDS OF THE POPULATION IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Elena Levanda

    2017-11-01

    Full Text Available The purpose of the paper is to examine the legal basis of the system for ensuring living standards of the population of Ukraine. Methodology. Normative legal documents on the basic standard of living of different population groups were analyzed. The legislative field was investigated through the official web portal of the Verkhovna Rada of Ukraine and the State Statistics Service of Ukraine, covering the period from 1991 to the present. Results. Laws still in force from the last century are outdated and inconsistent with the goals of social policy and the contemporary economy. It is important to modernize the laws concerning basic living standards in line with the country's foreign policy, according to the EU methodology; to apply the state social standard as a tool for poverty reduction; and, prospectively, to use a starter package with a standard of living guaranteed by the government to its citizens. Practical implications. At different stages of development of the economy of independent Ukraine, the foundations of the legislative framework of normative documents concerning social protection of the population were laid. The country's legal framework contains a set of laws belonging to the last century alongside policy and regulatory documents that comply with EU standards. In turn, the regulatory framework tends toward modernization of the laws that establish guaranteed state social standards and guarantees for every citizen. Value/originality. Analysis of the legislative base revealed the ineffectiveness of the laws guaranteeing a basic social standard to citizens, and a need to modernize the relatively large share of laws adopted in the last century.

  10. Preliminary results of standard quantitative analysis by ED-XRF

    International Nuclear Information System (INIS)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A.

    2013-01-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced: lead oxide (two samples), magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed to form tablets. The samples were irradiated at three points, according to an orientation. The measurements were performed at the Radiological Laboratory of UTFPR using the Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were 05 μA current at 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine and lead. Data from this analysis were compared with the results expected from stoichiometric calculations. The data provided by the program showed a convergence of results, indicating homogeneity of the samples. Compared to the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the sample. Finally, we propose continuing the work with an auxiliary calculation to be developed in the next step

  11. Preliminary results of standard quantitative analysis by ED-XRF

    Energy Technology Data Exchange (ETDEWEB)

    Lara, Alessandro L. de; Franca, Alana C.; Neto, Manoel R.M.; Paschuk, Sergei A., E-mail: alellara@hotmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Dept. de Fisica; Denyak, Valeriy, E-mail: denyak@gmail.com [Instituto de Pesquisa Pele Pequeno Principe (IPPP), Curitiba, PR (Brazil)

    2013-07-01

    A comparison was performed between the elemental concentrations proposed by the XRS-FP software, using data obtained by the EDXRF technique, and those obtained by stoichiometric calculation. For this purpose, five standard samples of known compounds were produced: two of lead oxide, plus magnesium chloride and iodine in controlled amounts. The compounds were subsequently mixed and compressed into tablets. Each sample was irradiated at three points, according to a fixed orientation. The measurements were performed at the Radiological Laboratory of UTFPR using an Amptek X-123SDD detector and an X-ray tube with a silver target from the same manufacturer. The tube operating conditions were a 5 μA current at a 40 kV voltage. Finally, the 15 spectra were analyzed with the software to determine the concentrations of chlorine, iodine, and lead, and the results were compared with those expected from the stoichiometric calculations. The data provided by the program converged, indicating homogeneity of the samples. Compared with the stoichiometric calculation, a considerable discrepancy was found, which may be the result of a misconfiguration or contamination of the samples. Finally, a proposal was made to continue the work using an auxiliary calculation to be developed in the next step.

  12. Comparisons of ANSI standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-11-01

    This report provides the results of comparisons of the cited and latest versions of ANSI standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.

  13. Comparisons of ASTM standards cited in the NRC standard review plan, NUREG-0800 and related documents

    International Nuclear Information System (INIS)

    Ankrum, A.R.; Bohlander, K.L.; Gilbert, E.R.; Pawlowski, R.A.; Spiesman, J.B.

    1995-10-01

    This report provides the results of comparisons of the cited and latest versions of ASTM standards cited in the NRC Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants (NUREG-0800) and related documents. The comparisons were performed by Battelle Pacific Northwest Laboratories in support of the NRC's Standard Review Plan Update and Development Program. Significant changes to the standards, from the cited version to the latest version, are described and discussed in a tabular format for each standard. Recommendations for updating each citation in the Standard Review Plan are presented. Technical considerations and suggested changes are included for related regulatory documents (i.e., Regulatory Guides and the Code of Federal Regulations) citing the standard. The results and recommendations presented in this document have not been subjected to NRC staff review.

  14. Measurement standards and the general problem of reference points in chemical analysis

    International Nuclear Information System (INIS)

    Richter, W.; Dube, G.

    2002-01-01

    Besides the measurement standards available in general metrology in the form of realisations of the units of measurement, measurement standards of chemical composition are needed for the vast field of chemical measurement (measurement of chemical composition), because the main aim of such measurements is to quantify non-isolated substances, often in complicated matrices, to which the 'classical' measurement standards and their lower-level derivatives are not directly applicable. At present, material artefacts as well as standard measurement devices serve as chemical measurement standards. These are measurement standards in the full metrological sense, however, only if they are firmly linked to the SI unit in which the composition represented by the standard is expressed. This requirement has the consequence that only a very restricted number of truly reliable chemical measurement standards exists at present. Since it is very difficult and time consuming to increase this number substantially and, on the other hand, reliable reference points are increasingly needed for all kinds of chemical measurements, primary methods of measurement and high-level reference measurements will play an increasingly important role in the establishment of worldwide comparability and hence mutual acceptance of chemical measurement results. (author)

  15. Analysis of Minimum Efficiency Performance Standards for Residential General Service Lighting in Chile

    Energy Technology Data Exchange (ETDEWEB)

    Letschert, Virginie E.; McNeil, Michael A.; Leiva Ibanez, Francisco Humberto; Ruiz, Ana Maria; Pavon, Mariana; Hall, Stephen

    2011-06-01

    Minimum Efficiency Performance Standards (MEPS) have been chosen as part of Chile's national energy efficiency action plan. As a first MEPS, the Ministry of Energy has decided to focus on a regulation for lighting that would ban the sale of inefficient bulbs, effectively phasing out the use of incandescent lamps. Following major economies such as the US (EISA, 2007), the EU (Ecodesign, 2009), and Australia (AS/NZS, 2008), which planned phase-outs based on minimum efficacy requirements, the Ministry of Energy has undertaken an impact analysis of a MEPS on the residential lighting sector. Fundacion Chile (FC) and Lawrence Berkeley National Laboratory (LBNL) collaborated with the Ministry of Energy and the National Energy Efficiency Program (Programa Pais de Eficiencia Energetica, or PPEE) to produce a techno-economic analysis of this future policy measure. LBNL has developed for CLASP (CLASP, 2007) a spreadsheet tool called the Policy Analysis Modeling System (PAMS) that allows for evaluation of costs and benefits at the consumer level as well as a wide range of impacts at the national level, such as energy savings, net present value of savings, greenhouse gas (CO2) emission reductions, and avoided generation capacity due to a specific policy. Because Chile has historically followed European schemes in energy efficiency programs (test procedures, labelling program definitions), we take Ecodesign Commission Regulation No 244/2009 as the starting point when defining our phase-out program, which means a tiered phase-out based on minimum efficacy per lumen category. The following data were collected in order to perform the techno-economic analysis: (1) retail prices, efficiency, and wattage category in the current market; (2) usage data (hours of lamp use per day); and (3) stock data and penetration of efficient lamps in the market. Using these data, PAMS calculates the costs and benefits of efficiency standards from two distinct but related perspectives: (1) The
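
PAMS itself is a spreadsheet model; as a rough illustration of the consumer-level arithmetic such a tool performs, the sketch below computes the per-lamp net present value of switching to an efficient bulb. The function name and all input numbers are hypothetical, not taken from the Chilean analysis.

```python
# Simplified sketch of the cost-benefit arithmetic behind a lighting MEPS
# analysis (hypothetical inputs; the real tool models the full national
# lamp stock, efficacy tiers, and emission factors).

def npv_consumer_savings(incr_cost, watts_base, watts_eff, hours_per_day,
                         tariff_kwh, life_years, discount_rate):
    """Net present value, per lamp, of switching to the efficient option."""
    annual_kwh_saved = (watts_base - watts_eff) / 1000.0 * hours_per_day * 365
    annual_savings = annual_kwh_saved * tariff_kwh
    # Discount each year's savings back to the present.
    pv = sum(annual_savings / (1 + discount_rate) ** t
             for t in range(1, life_years + 1))
    return pv - incr_cost

# Illustrative inputs: 60 W incandescent vs. 14 W CFL, 3 h/day of use,
# 0.15 USD/kWh tariff, 6-year lamp life, 5% discount rate, 2 USD extra cost.
npv = npv_consumer_savings(2.0, 60, 14, 3.0, 0.15, 6, 0.05)
```

With these assumed inputs the efficient lamp pays back its extra cost several times over, which is the kind of consumer-perspective result PAMS aggregates to the national level.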

  16. Application of a spectrum standardization method for carbon analysis in coal using laser-induced breakdown spectroscopy (LIBS).

    Science.gov (United States)

    Li, Xiongwei; Wang, Zhe; Fu, Yangting; Li, Zheng; Liu, Jianmin; Ni, Weidou

    2014-01-01

    Measurement of coal carbon content using laser-induced breakdown spectroscopy (LIBS) is limited by its low precision and accuracy. A modified spectrum standardization method was proposed to achieve both reproducible and accurate results for the quantitative analysis of carbon content in coal using LIBS. The proposed method used the molecular emissions of diatomic carbon (C2) and cyanide (CN) to compensate for the diminution of atomic carbon emissions in high volatile content coal samples caused by matrix effect. The compensated carbon line intensities were further converted into an assumed standard state with standard plasma temperature, electron number density, and total number density of carbon, under which the carbon line intensity is proportional to its concentration in the coal samples. To obtain better compensation for fluctuations of total carbon number density, the segmental spectral area was used and an iterative algorithm was applied that is different from our previous spectrum standardization calculations. The modified spectrum standardization model was applied to the measurement of carbon content in 24 bituminous coal samples. The results demonstrate that the proposed method has superior performance over the generally applied normalization methods. The average relative standard deviation was 3.21%, the coefficient of determination was 0.90, the root mean square error of prediction was 2.24%, and the average maximum relative error for the modified model was 12.18%, showing an overall improvement over the corresponding values for the normalization with segmental spectrum area, 6.00%, 0.75, 3.77%, and 15.40%, respectively.
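
As an illustration of the normalization idea the paper builds on (not the authors' algorithm), the sketch below normalizes a carbon line intensity by a segmental spectral area and computes the relative standard deviation (RSD) across replicate shots, the precision metric quoted in the abstract. The intensities and segment values are invented.

```python
# Illustrative sketch: normalizing a spectral line by the integrated area
# of its surrounding segment suppresses shot-to-shot fluctuation, lowering
# the RSD relative to the raw line intensities. All data are hypothetical.
import statistics

def normalized_intensity(line_intensity, spectrum_segment):
    """Normalize one line by the integrated area of its spectral segment."""
    return line_intensity / sum(spectrum_segment)

# Hypothetical replicate shots: (carbon line intensity, segment intensities).
shots = [
    (1250.0, [40.0, 55.0, 1250.0, 60.0, 42.0]),
    (1190.0, [38.0, 52.0, 1190.0, 57.0, 40.0]),
    (1320.0, [43.0, 58.0, 1320.0, 63.0, 45.0]),
]

norm = [normalized_intensity(i, seg) for i, seg in shots]
rsd_percent = statistics.stdev(norm) / statistics.mean(norm) * 100

raw = [i for i, _ in shots]
raw_rsd = statistics.stdev(raw) / statistics.mean(raw) * 100
```

In this toy case the raw intensities scatter by several per cent while the segment-normalized values agree to well under one per cent, which is the effect the normalization is meant to achieve.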

  17. Qualification standard for photovoltaic concentrator modules

    Energy Technology Data Exchange (ETDEWEB)

    McConnell, R.; Kurtz, S.; Bottenberg, W. R.; Hammond, R.; Jochums, S. W.; McDanal, A. J.; Roubideaux, D.; Whitaker, C.; Wohlgemuth, J.

    2000-05-05

    The paper describes a proposed qualification standard for photovoltaic concentrator modules. The standard's purpose is to provide stress tests and procedures to identify any component weakness in photovoltaic concentrator modules intended for power generation applications. If no weaknesses are identified during qualification, both the manufacturer and the customer can expect a more reliable product. The qualification test program for the standard includes thermal cycles, humidity-freeze cycles, water spray, off-axis beam damage, hail impact, and hot-spot endurance, as well as electrical tests for performance, ground continuity, isolation, wet insulation resistance, and bypass diodes. Because concentrator module performance cannot be verified using the solar simulator and reference cell procedures suitable for flat-plate modules, the standard specifies an outdoor I-V test analysis allowing a performance comparison before and after a test procedure. Two alternatives to this complex analysis are the use of a reference concentrator module for side-by-side outdoor comparison with modules undergoing the various tests, and a dark I-V performance check.
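
The before/after I-V comparison can be illustrated with a toy calculation: take the maximum power point from sweeps recorded before and after a stress test and report the relative change. The I-V points below are invented for illustration, not measured data.

```python
# Hedged sketch of the outdoor I-V comparison idea: compare the maximum
# power point of a module before and after a stress test. Data invented.

def max_power(iv_points):
    """Maximum power (W) from a list of (voltage, current) pairs."""
    return max(v * i for v, i in iv_points)

before = [(0.0, 5.0), (10.0, 4.9), (20.0, 4.6), (28.0, 4.0), (33.0, 0.0)]
after  = [(0.0, 4.9), (10.0, 4.8), (20.0, 4.5), (28.0, 3.8), (33.0, 0.0)]

p0, p1 = max_power(before), max_power(after)
power_change_pct = (p1 - p0) / p0 * 100  # negative means degradation
```

In practice the sweeps would also be translated to common irradiance and temperature conditions before comparison, which is what makes the outdoor analysis complex.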

  18. THE TREND AND DYNAMICS DISTRIBUTION OF THE JAKARTA STOCK EXCHANGE (JSX) COMPOSITE

    Directory of Open Access Journals (Sweden)

    Edi Cahyono

    2012-07-01

    Full Text Available In this paper we discuss the dynamics of the Jakarta Stock Exchange (JSX) Composite. The dynamics is a performance indicator of several industries in Indonesia. The data is presented as a time series. Predicting the dynamics from the data, however, is still difficult; in general, it is almost impossible to predict such dynamics in the case of high-frequency data. Hence, we do not predict the dynamics. Rather, we seek the trend and the probability density function (pdf). For a 'small' period of time, the pdf is based on the assumption that the dynamics is normally distributed. Mathematically speaking, this is a time averaging of the data, and in some cases the data is presented in the form of candlesticks. The trend is approximated by a higher-order polynomial function obtained by applying a least-squares method. The probability density function of the data within each candlestick, in turn, is obtained by computing the standard deviation of the data with respect to the trend in that candlestick.
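
A minimal pure-Python sketch of the approach described above: fit a least-squares polynomial trend to a price series, then measure the standard deviation of the data around the trend within each candle (fixed time window). The price series is synthetic, and the window size and polynomial degree are arbitrary choices for illustration.

```python
# Sketch: polynomial trend by least squares (normal equations solved with
# Gaussian elimination), then per-candle dispersion of the residuals.
import random
import statistics

def polyfit(xs, ys, deg):
    """Least-squares polynomial fit; returns coefficients c[0] + c[1]*x + ..."""
    n = deg + 1
    a = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n                          # back substitution
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(a[r][c] * coef[c] for c in range(r + 1, n))
        coef[r] = s / a[r][r]
    return coef

rng = random.Random(0)
t = list(range(100))
price = [500 + 2.0 * x - 0.015 * x * x + rng.gauss(0, 3) for x in t]

coef = polyfit(t, price, deg=2)
trend = [sum(c * x ** i for i, c in enumerate(coef)) for x in t]

# Per-candle dispersion: std dev of residuals in windows of 10 points.
residual = [p - tr for p, tr in zip(price, trend)]
candle_sigma = [statistics.pstdev(residual[k:k + 10]) for k in range(0, 100, 10)]
```

Each entry of `candle_sigma` plays the role of the per-candlestick standard deviation used in the paper to build the empirical density around the trend.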

  19. Analysis of cryptocurrencies as standard financial instruments

    OpenAIRE

    Bartoš, Jakub

    2014-01-01

    This paper analyzes cryptocurrencies as financial instruments. First, we introduce the main features of cryptocurrencies and summarize their brief history. We find that the price of the most famous cryptocurrency, Bitcoin, follows the efficient market hypothesis and reacts immediately to publicly announced information. Furthermore, Bitcoin can be seen as a standard economic good priced by the interaction of supply and demand on the market. These factors can be driven by macro financia...

  20. ARM Data File Standards Version: 1.0

    Energy Technology Data Exchange (ETDEWEB)

    Kehoe, Kenneth [University of Oklahoma; Beus, Sherman [Pacific Northwest National Laboratory; Cialella, Alice [Brookhaven National Laboratory; Collis, Scott [Argonne National Laboratory; Ermold, Brian [Pacific Northwest National Laboratory; Perez, Robin [State University of New York, Albany; Shamblin, Stefanie [Oak Ridge National Laboratory; Sivaraman, Chitra [Pacific Northwest National Laboratory; Jensen, Mike [Brookhaven National Laboratory; McCord, Raymond [Oak Ridge National Laboratory; McCoy, Renata [Sandia National Laboratories; Moore, Sean [Alliant Techsystems, Inc.; Monroe, Justin [University of Oklahoma; Perkins, Brad [Los Alamos National Laboratory; Shippert, Tim [Pacific Northwest National Laboratory

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth's atmosphere in diverse climate regimes. The result is a diverse set of data products containing observational and derived data, currently accumulating at a rate of 30 TB and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable the development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for the development of automated monitoring and data health status tools, and facilitate future capabilities for delivering data on demand, tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.
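
As a sketch of what such file standards make possible, the snippet below checks a file's global metadata against a required attribute set, the kind of automated health check the document motivates. The attribute names here are invented stand-ins, not the actual ARM standard's terms.

```python
# Hypothetical metadata-compliance check: report which required global
# attributes are missing from a file's metadata. Attribute names invented.

REQUIRED_ATTRS = {"site_id", "facility_id", "data_level", "datastream"}

def missing_attributes(file_attrs):
    """Return the required attributes absent from a file's metadata dict."""
    return sorted(REQUIRED_ATTRS - set(file_attrs))

example_file = {
    "site_id": "sgp",
    "facility_id": "C1",
    "datastream": "sgpmetE13.b1",
}
missing = missing_attributes(example_file)  # -> ["data_level"]
```

A real implementation would read the attributes from a netCDF file header; the point is that a fixed, required attribute set is what makes such a check automatable across 150,000 files a month.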

  1. An Open Standard for Camera Trap Data

    Directory of Open Access Journals (Sweden)

    Tavis Forrester

    2016-12-01

    Full Text Available Camera traps that capture photos of animals are a valuable tool for monitoring biodiversity. The use of camera traps is rapidly increasing and there is an urgent need for standardization to facilitate data management, reporting, and data sharing. Here we offer the Camera Trap Metadata Standard as an open data standard for storing and sharing camera trap data, developed by experts from a variety of organizations. The standard captures information necessary to share data between projects and offers a foundation for collecting the more detailed data needed for advanced analysis. The data standard captures information about study design, the type of camera used, and the location and species names for all detections in a standardized way. This information is critical for accurately assessing results from individual camera trapping projects and for combining data from multiple studies for meta-analysis. This data standard is an important step in aligning camera trapping surveys with best practices in data-intensive science. Ecology is moving rapidly into the realm of big data, and central data repositories, now emerging for camera trap data, are becoming a critical tool. This data standard will help researchers standardize data terms, align past data to new repositories, and provide a framework for utilizing data across repositories and research projects to advance animal ecology and conservation.
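
A hypothetical detection record in the spirit of such a standard might look like the following; the field names are simplified stand-ins for illustration, not the standard's actual vocabulary.

```python
# Illustrative structured detection record: study design, camera, location,
# and species captured in fixed, typed fields. Field names are invented.
from dataclasses import dataclass, asdict

@dataclass
class Detection:
    project_id: str
    deployment_id: str
    camera_model: str
    latitude: float
    longitude: float
    timestamp: str        # ISO 8601
    scientific_name: str
    count: int

d = Detection("proj-01", "dep-12", "GenericCam X", -1.95, 30.06,
              "2016-08-14T06:32:00Z", "Panthera pardus", 1)
record = asdict(d)  # plain dict, ready to serialize for a repository
```

Fixing the field names and types is what allows detections from different projects to be pooled for meta-analysis without per-project data wrangling.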

  2. Carbon finance and pro-poor co-benefits: The Gold Standard and Climate, Community and Biodiversity Standards

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Rachel

    2011-04-15

    This paper assesses the practical contribution of the Gold Standard (GS) and Climate Community and Biodiversity (CCB) Standards to local development through the identification of high quality carbon offset projects and ensuring high standards of consultation with local communities during project development and implementation. It is based on desk research, involving analysis of the GS and CCB Standards' project databases, project design documents, and secondary literature. In addition, over 20 representatives of the two standards systems, project developers, NGO representatives, and researchers were interviewed. The paper concludes that both standard systems successfully reward high quality projects which have a demonstrated commitment to local consultations and sustainable development benefits. Moreover, they serve to give well-meaning project developers frameworks with which to ensure that a wide range of criteria are considered in planning and implementing projects. As voluntary standards, it is unrealistic to expect either the GS or CCB Standards to improve poor-quality or unsustainable projects.

  4. 1991 annual book of ASTM standards

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    This book contains nuclear technology standards and also features test methods and practices for solar and geothermal energy. In the nuclear category, the primary emphasis of this volume is on analysis, dosimetry, and radiation effects in materials. Over eighty standards, primarily test methods and practices, are featured in this category.

  5. Analysis of ultrafiltration failure in peritoneal dialysis patients by means of standard peritoneal permeability analysis.

    Science.gov (United States)

    Ho-dac-Pannekeet, M M; Atasever, B; Struijk, D G; Krediet, R T

    1997-01-01

    Ultrafiltration failure (UFF) is a complication of peritoneal dialysis (PD) treatment that occurs especially in long-term patients. Etiological factors include a large effective peritoneal surface area [measured as a high mass transfer area coefficient (MTAC) of creatinine], a high effective lymphatic absorption rate (ELAR), a large residual volume, or combinations of these. The prevalence and etiology of UFF were studied and the contribution of transcellular water transport (TCWT) was analyzed; a new definition of UFF and guidelines for the analysis of its etiology were derived from the results. Setting: the peritoneal dialysis unit of the Academic Medical Center in Amsterdam. Design: cross-sectional study of standard peritoneal permeability analyses (4-hr dwells, dextran 70 as volume marker) with 1.36% glucose in 68 PD patients. Patients with negative net UF (change in intraperitoneal volume, dIPV < 0) had a lower residual volume (p = 0.03) and a lower transcapillary ultrafiltration rate (TCUFR, p = 0.01). Ultrafiltration failure was associated with a high MTAC creatinine in 3 patients, a high ELAR in 4 patients, and a combination of factors in one. As an additional possible cause, TCWT was studied using the sodium gradient in the first hour of the dwell, corrected for diffusion (dNa). Five patients had dNa > 5 mmol/L, indicating normal TCWT. The 3 patients with dNa < 5 mmol/L had a lower TCUFR (p = 0.04). A smaller difference was found between dIPV with 3.86% and with 1.36% glucose (p = 0.04) compared to the dNa > 5 mmol/L group, but no differences were present for MTAC creatinine, ELAR, residual volume, or glucose absorption. In addition to the known factors, impairment of TCWT can be a cause of UFF. A standardized dwell with 1.36% glucose overestimates UFF; therefore, 3.86% glucose should be used for the identification of patients with UFF, especially because it provides additional information on TCWT. Ultrafiltration failure can then be defined as a low net UF in a standardized 3.86% glucose exchange.

  6. Fulfillment of GMP standard, halal standard, and applying HACCP for production process of beef floss (Case study: Ksatria enterprise)

    Science.gov (United States)

    A'diat, Arkan Addien Al; Liquiddanu, Eko; Laksono, Pringgo Widyo; Sutopo, Wahyudi; Suletra, I. Wayan

    2018-02-01

    Along with the increasing number of modern retail businesses in Indonesia comes an opportunity for small and medium enterprises (SMEs) to sell their products through modern retailers. SMEs face several obstacles, one of which concerns product standards: the standards an SME must hold are the GMP standard and the halal standard. This research was conducted to assess how a beef floss enterprise in Jagalan fulfils the GMP and halal standards. In addition, a Hazard Analysis and Critical Control Points (HACCP) system was applied to analyze the production process. The HACCP approach used in this research was based on the seven principles in SNI (Indonesian National Standard) 01-4852-1998: hazard analysis, critical control point (CCP) determination, critical limit establishment, CCP monitoring system establishment, corrective action establishment, verification, and documentation, all of which must be applied in preparing a HACCP plan. Based on this case study, it is concluded that there are 5 CCPs: the boiling process, the roasting process, the frying process, the beef floss draining process, and the packaging process.
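
The CCP-determination step of the seven principles can be sketched as a simplified, Codex-style decision function; the questions paraphrase the generic decision tree, and the example inputs are illustrative, not data from the case study.

```python
# Hedged sketch of CCP determination (simplified decision tree, not the
# full Codex questionnaire and not the case study's actual assessment).

def is_ccp(hazard_likely, control_measure_exists, step_eliminates_or_reduces,
           later_step_controls):
    """A step is a CCP when a significant hazard has a control measure at
    this step and no subsequent step will bring the hazard under control."""
    if not hazard_likely or not control_measure_exists:
        return False
    if step_eliminates_or_reduces and not later_step_controls:
        return True
    return False

# Example: frying reduces the microbial hazard and no later kill step exists,
# so it comes out as a CCP, consistent with the study's list of 5 CCPs.
frying_is_ccp = is_ccp(True, True, True, False)
```

The same function applied with `later_step_controls=True` returns `False`, reflecting the rule that a hazard fully controlled downstream does not make the earlier step a CCP.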

  7. Standards for reference reactor physics measurements

    International Nuclear Information System (INIS)

    Harris, D.R.; Cokinos, D.M.; Uotinen, V.

    1990-01-01

    Reactor physics analysis methods require experimental testing and confirmation over the range of practical reactor configurations and states. This range is somewhat limited by practical fuel types, such as actinide oxides or carbides enclosed in metal cladding. On the other hand, it continues to broaden because of the trend toward higher enrichment, even if only slightly higher, in electric utility fuel. The need for experimental testing of reactor physics analysis methods arises in part from this continual broadening of the range of core designs, and in part from the nature of the analysis methods themselves. Reactor physics analyses are directed primarily at the determination of core reactivities and reaction rates, the former largely for reasons of reactor control, and the latter largely to ensure that material limitations are not violated. Errors in these analyses can be regarded as arising from numerics, from the data base, and from human factors. For numerical, data-base, and human-factor reasons, then, it is prudent and customary to qualify reactor physics analysis methods against experiments. These experiments can be treated as being at low power or at high power, and each of these types is subject to an American National Standards Institute standard. The purpose of these standards is to aid in improving and maintaining adequate quality in reactor physics methods, and it is from this point of view that the standards are examined here.

  8. Vitamin D and mortality: Individual participant data meta-analysis of standardized 25-hydroxyvitamin D in 26916 individuals from a European consortium.

    Directory of Open Access Journals (Sweden)

    Martin Gaksch

    Full Text Available Vitamin D deficiency may be a risk factor for mortality, but previous meta-analyses lacked standardization of laboratory methods for 25-hydroxyvitamin D (25[OH]D) concentrations and used aggregate data instead of individual participant data (IPD). We therefore performed an IPD meta-analysis on the association between standardized serum 25(OH)D and mortality. In a European consortium of eight prospective studies, including seven general population cohorts, we used the Vitamin D Standardization Program (VDSP) protocols to standardize 25(OH)D data. Meta-analyses using a one-step procedure on IPD were performed to study associations of 25(OH)D with all-cause mortality as the primary outcome, and with cardiovascular and cancer mortality as secondary outcomes. This meta-analysis is registered at ClinicalTrials.gov, number NCT02438488. We analysed 26916 study participants (median age 61.6 years, 58% females) with a median 25(OH)D concentration of 53.8 nmol/L. During a median follow-up time of 10.5 years, 6802 persons died. Compared to participants with 25(OH)D concentrations of 75 to 99.99 nmol/L, the adjusted hazard ratios (with 95% confidence intervals) for mortality in the 25(OH)D groups with 40 to 49.99, 30 to 39.99, and <30 nmol/L were 1.15 (1.00-1.29), 1.33 (1.16-1.51), and 1.67 (1.44-1.89), respectively. We observed similar results for cardiovascular mortality, but there was no significant linear association between 25(OH)D and cancer mortality. There was also no significantly increased mortality risk at high 25(OH)D levels up to 125 nmol/L. In this first IPD meta-analysis using standardized measurements of 25(OH)D, we observed an association between low 25(OH)D and increased risk of all-cause mortality. It is of public health interest to evaluate whether treatment of vitamin D deficiency prevents premature deaths.

  9. A complete analysis of a nuclear building to nuclear safety standards

    International Nuclear Information System (INIS)

    Bergeretto, G.; Giuliano, V.; Lazzeri, L.

    1975-01-01

    The nuclear standards impose on the designer the necessity of examining the loads, stresses, and strains in a nuclear building even under extreme loading conditions, both from plant malfunctions and from environmental accidents. It is then necessary to generate, combine, and examine a tremendous amount of data; indeed, the lack of symmetry and general complexity of the structures and the large number of loading combinations make an automated analysis quite necessary. A largely automated procedure for solving the problem is presented, consisting of a series of computer programs linked together as follows. After the seismic analysis has been performed by the SADE code, these data, together with the data coming from thermal specifications, weights, accident descriptions, etc., are fed into a finite element computer code (SAP4) for analysis. The results are processed and combined by a computer code (COMBIN) according to the loading conditions (the usual list in Italy is given and briefly discussed), so that for each point (or each selected zone) under each loading condition the applied loads are listed. These data are fed to another computer code (DTP), which determines the amount of reinforcing bars necessary to accommodate the most severe of the loading conditions. The ACI 318-71 and Italian regulation procedures are followed; the characteristics of the program are briefly described and discussed. Some particular problems are also discussed, e.g., the thermal stresses due to normal and accident conditions; the inelastic behavior of some frame elements (due to concrete cracking) is considered by means of an 'ad hoc' code. Typical examples are presented and the results are discussed, showing a relatively large benefit from considering this inelastic effect.

  10. Thermal analysis of Brazilian standards proposals for social residential buildings; Avaliacao termica das propostas de normas brasileiras para edificacoes de interesse social

    Energy Technology Data Exchange (ETDEWEB)

    Dilkin, Pedro [Universidade Regional do Noroeste do Estado do Rio Grande do Sul, Ijui, RSA (Brazil). Dept. de Fisica, Estatistica e Matematica]. E-mail: dilkin@main.unijui.tche.br; Schneider, Paulo Smith [Rio Grande do Sul Univ., Porto Alegre, RS (Brazil). Dept. de Engenharia Mecanica]. E-mail: pss@mecanica.ufrgs.br

    2000-07-01

    This work presents a critical analysis of national standard proposals for the thermal performance of simple residential buildings. A review of some international standards is performed together with a description of the national proposals, and a prototype house is assembled following each of the texts. Results are displayed and the prototypes are simulated in the TRNSYS environment for winter and summer periods in Porto Alegre. Finally, the national proposal that achieved the best performance is improved by means of new simulations. (author)

  11. Standard test methods for chemical, mass spectrometric, spectrochemical, nuclear, and radiochemical analysis of uranium hexafluoride

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 These test methods cover procedures for subsampling and for chemical, mass spectrometric, spectrochemical, nuclear, and radiochemical analysis of uranium hexafluoride (UF6). Most of these test methods are in routine use to determine conformance to UF6 specifications in the enrichment and conversion facilities. 1.2 The analytical procedures in this document appear in the following order (Note 1—Subcommittee C26.05 will confer with C26.02 concerning the renumbered sections in Test Methods C761 to determine how concerns with renumbering these sections, as analytical methods are replaced with stand-alone analytical methods, are best addressed in subsequent publications): Subsampling of Uranium Hexafluoride, Sections 7-10; Gravimetric Determination of Uranium, Sections 11-19; Titrimetric Determination of Uranium, Section 20; Preparation of High-Purity U3O8, Section 21; Isotopic Analysis, Section 22; Isotopic Analysis by Double-Standard Mass-Spectrometer Method, Sections 23-29; Determination of Hydrocarbons, Chlorocarbons, and Partially Substitut...

  12. Neutron activation analysis of Standard Materials of Comparison IAEA- the corn and soya flour

    International Nuclear Information System (INIS)

    Dadakhanov, J.A.; Sadykov, I.I.; Salimov, M.I.

    2005-01-01

    It is known that assurance of the quality of results is a key problem in neutron activation analysis (NAA), as in other analytical methods; above all, it is necessary to ensure the correctness of the results. The most reliable way of revealing and eliminating systematic errors is to analyse standard samples of comparison (SSC) with the developed techniques and to compare the results obtained with the certified values. The analysis and certification of various SSC is therefore one of the most topical tasks of modern analytical chemistry. One of the few organizations producing SSC for NAA is the IAEA, which organized a project on the certification of samples of comparison: corn and soya flour. Among many laboratories worldwide, the Laboratory of the Activation Analysis of Pure Materials of the Institute of Nuclear Physics, Academy of Sciences of the Republic of Uzbekistan, also participated in this project. We carried out a series of analyses of samples of corn and soya flour, candidates for standard samples of comparison, by the method of instrumental NAA. Sample preparation followed the technique described in the technical project for these materials. Before analysis, the samples were dried at 80 °C for 24 h, cooled, weighed, and irradiated in the vertical channel of the VVR-SM nuclear reactor of the Institute of Nuclear Physics for 0.5-1 h (depending on the elements determined) with a neutron flux density of 1 × 10^14 neutrons/(cm²·s). Cooling times ranged from 10 min to 10 days, and measurement times from 100 s to 3000 s. Measurements were carried out on a gamma spectrometer consisting of a GC1518 HPGe detector and a DSA-1000 digital multichannel analyzer ('Canberra', USA); the spectrometric data were processed with the Genie-2000 software package. As a result of the analyses, we determined the contents of 21 elements in

  13. Standard practice for extreme value analysis of nonmetallic inclusions in steel and other microstructural features

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice describes a methodology to statistically characterize the distribution of the largest indigenous nonmetallic inclusions in steel specimens based upon quantitative metallographic measurements. The practice is not suitable for assessing exogenous inclusions. 1.2 Based upon the statistical analysis, the nonmetallic content of different lots of steels can be compared. 1.3 This practice deals only with the recommended test methods and nothing in it should be construed as defining or establishing limits of acceptability. 1.4 The measured values are stated in SI units. For measurements obtained from light microscopy, linear feature parameters shall be reported as micrometers, and feature areas shall be reported as square micrometers. 1.5 The methodology can be extended to other materials and to other microstructural features. 1.6 This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish app...
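
    The extreme-value approach described above can be sketched in a few lines: the largest inclusion measured in each specimen is fitted to a Gumbel (largest-extreme-value) distribution, which is then extrapolated to a larger inspected area. This is an illustrative sketch only (a method-of-moments fit with hypothetical data), not the full procedure defined by the practice:

```python
import math

# Method-of-moments fit of a Gumbel (largest-extreme-value) distribution
# to per-specimen maximum inclusion sizes.
def fit_gumbel(maxima):
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi       # delta
    loc = mean - 0.5772156649 * scale            # lambda (Euler-Mascheroni constant)
    return loc, scale

def expected_maximum(loc, scale, n_areas):
    # Gumbel quantile at probability 1 - 1/n_areas: the largest inclusion
    # expected if n_areas control areas were examined.
    p = 1.0 - 1.0 / n_areas
    return loc - scale * math.log(-math.log(p))

# Hypothetical maximum inclusion lengths (micrometers), one per specimen
maxima = [18.2, 22.5, 25.1, 19.8, 30.4, 27.3, 21.0, 24.6]
loc, scale = fit_gumbel(maxima)
print(round(loc, 2), round(scale, 2))
print(round(expected_maximum(loc, scale, 1000), 1))
```

    The extrapolation step is what makes the method useful for comparing lots: a few dozen measured fields predict the largest inclusion expected over a much larger inspected area.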

  14. Head-To-Head Comparison Between High- and Standard-b-Value DWI for Detecting Prostate Cancer: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Woo, Sungmin; Suh, Chong Hyun; Kim, Sang Youn; Cho, Jeong Yeon; Kim, Seung Hyup

    2018-01-01

    The purpose of this study was to perform a head-to-head comparison between high-b-value (> 1000 s/mm²) and standard-b-value (800-1000 s/mm²) DWI regarding diagnostic performance in the detection of prostate cancer. The MEDLINE and EMBASE databases were searched up to April 1, 2017. The analysis included diagnostic accuracy studies in which high- and standard-b-value DWI were used for prostate cancer detection with histopathologic examination as the reference standard. Methodologic quality was assessed with the revised Quality Assessment of Diagnostic Accuracy Studies tool. Sensitivity and specificity of all studies were calculated and were pooled and plotted in a hierarchic summary ROC plot. Meta-regression and multiple-subgroup analyses were performed to compare the diagnostic performances of high- and standard-b-value DWI. Eleven studies (789 patients) were included. High-b-value DWI had greater pooled sensitivity (0.80 [95% CI, 0.70-0.87]) (p = 0.03) and specificity (0.92 [95% CI, 0.87-0.95]) (p = 0.01) than standard-b-value DWI (sensitivity, 0.78 [95% CI, 0.66-0.86]; specificity, 0.87 [95% CI, 0.77-0.93]). Specificity was superior for high-b-value DWI across subgroups (p ≤ 0.05). Sensitivity was significantly higher for high- than for standard-b-value DWI only in the following subgroups: peripheral zone only, transition zone only, multiparametric protocol (DWI and T2-weighted imaging), visual assessment of DW images, and per-lesion analysis (p ≤ 0.04). In a head-to-head comparison, high-b-value DWI had significantly better sensitivity and specificity for detection of prostate cancer than did standard-b-value DWI. Multiple-subgroup analyses showed that specificity was consistently superior for high-b-value DWI.
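
    The pooling step behind such summary estimates can be illustrated with a much simpler fixed-effect model than the hierarchic summary ROC model the study actually used; the per-study sensitivities and case counts below are hypothetical:

```python
import math

# Fixed-effect inverse-variance pooling on the logit scale -- a deliberate
# simplification of the hierarchic summary ROC model used in the paper.
def pool_logit(proportions, denominators):
    weights, logits = [], []
    for p, n in zip(proportions, denominators):
        successes = p * n
        var = 1.0 / successes + 1.0 / (n - successes)   # variance of logit(p)
        logits.append(math.log(p / (1.0 - p)))
        weights.append(1.0 / var)
    pooled = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
    return 1.0 / (1.0 + math.exp(-pooled))              # back-transform

# Hypothetical per-study sensitivities and numbers of diseased cases
sens = [0.82, 0.75, 0.85, 0.78]
n_diseased = [60, 45, 80, 50]
print(round(pool_logit(sens, n_diseased), 3))
```

    Pooling on the logit scale keeps the estimate inside [0, 1] and weights each study by its precision; the hierarchic model additionally accounts for between-study heterogeneity and the sensitivity-specificity trade-off.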

  15. Kurumsal Yönetim Standartları Ağırlıklandırılmış Kalite Analizi (Corporate Governance Standards Weighted Quality Analysis)

    Directory of Open Access Journals (Sweden)

    İbrahim H. KAYRAL

    2016-03-01

    Providing health and health services plays an important role in maintaining human life in a quality manner and protecting its continuity. The most important factors in achieving quality in health services are the structure of hospital management, institutionalization, and the success of the strategic management process. Managing hospital services effectively within their complex structure is closely associated with a hospital's level of institutionalization. This study carries out a weighted quality analysis of the Corporate Services and Support Services dimensions of the Quality Standards in Health Hospital Set (SKS-Hospital) using a weighted Multidimensional Quality Model. For this purpose, the standards and their scores were analyzed with the model by determining their distribution across structure, process, and outcome, and their technical and functional quality was evaluated. As a result, SKS-Hospital was found to have strengths in terms of institutionalization and weaknesses in terms of ÇBKM; it is proposed that SKS-Hospital strengthen its outcome-oriented standards for institutionalization and improve the efficiency of hospital management.

  16. The standardisation of trace elements in international biological standard reference materials with neutron activation analysis and atomic absorption spectrophotometry

    International Nuclear Information System (INIS)

    Pieterse, H.

    1981-12-01

    An investigation was undertaken into the analytical procedures, and the identification of problem areas, for the certification of a new biological standard reference material supplied by the International Atomic Energy Agency, namely a human hair sample designated HH-I. The analyses comprised the determination of the elements As, Cd, Co, Cr, Cu, Fe, Hg, Mn, Ni, Sb, Se, and Zn in the hair sample using two analytical techniques, namely Instrumental Neutron Activation Analysis and Atomic Absorption. Three other certified biological reference materials, namely Orchard Leaves (ORCH-L), Sea Plant Material (SPM-I) and Copepod (MAA-I), were used as control standards. The moisture content of the samples was determined under varying conditions of drying, and the necessary corrections were applied to all analytical results so that the final elemental values relate to the dry weight of the samples. Attention was also given to the possible loss of specific elements during ashing of the samples prior to the actual instrumental analysis. The results obtained for the hair sample by the two techniques were in good agreement for the elements Co, Fe, Mn, and Zn, but did not agree for the elements Cr and Sb. As, Hg and Se could only be determined with Instrumental Neutron Activation Analysis, and Cd, Cu and Ni only with Atomic Absorption. Most of the results obtained for the three control standard reference materials were within the ranges specified for the individual elements in each sample. The analytical procedures used for determining Cd, Cr, Cu, Ni and Sb with Instrumental Neutron Activation Analysis, and As, Cr, Sb and Se with Atomic Absorption, need further investigation. The measurement of the moisture content and the ashing of samples also require further investigation with a view to improving accuracy.

  17. Internal standard method for determination of gallium and some trace elements in bauxite by neutron activation analysis

    International Nuclear Information System (INIS)

    Chen, S.G.; Tsai, H.T.

    1983-01-01

    A method is described for the determination of gallium and other trace elements, such as Ce, Cr, Hf, Lu and Th, in bauxite by the technique of neutron activation analysis using gold as an internal standard. Isopropyl ether was used as the organic extractant to separate radioactive gallium from the sample. The method yields very good accuracy, with a relative error of ±3%. (author)

  18. A trial fabrication of activity standard surface sources and positional standard surface sources for an imaging plate system

    International Nuclear Information System (INIS)

    Sato, Yasushi; Hino, Yoshio; Yamada, Takahiro; Matsumoto, Mikio

    2003-01-01

    An imaging plate system can detect low-level activity, but quantitative analysis is difficult because there are no adequate standard surface sources. A new fabrication method was developed for standard surface sources by printing on a sheet of paper using an ink-jet printer with inks in which a radioactive material was mixed. The fabricated standard surface sources had high uniformity, high positional resolution, arbitrary shapes, and a broad intensity range. As an application, the standard sources were used for the measurement of surface activity. (H. Yokoo)

  19. Gamma-ray spectral map of standard pottery. Pt. 1

    International Nuclear Information System (INIS)

    Yellin, J.

    1984-01-01

    The gamma-ray spectrum of a neutron-activated Standard Pottery is analyzed completely by means of spectral line-shape fitting. A detailed spectral map of the standard is presented as it is typically used in pottery analysis. The spectrum, obtained with a planar-geometry Ge(Li) detector, covers the energy range 11 to 409 keV. The map is intended to serve as a guide for the uninitiated user of Standard Pottery as well as a basis of comparison with other standards employed in pottery provenience work. It is shown that the process of calibrating detectors for spectral line interference can be greatly aided by a general approach to spectrum analysis, and that much useful information can be obtained from a general approach to pottery spectrum analysis. (orig.)

  20. Implementation of the INEEL safety analyst training standard

    International Nuclear Information System (INIS)

    Hochhalter, E. E.

    2000-01-01

    The Idaho Nuclear Technology and Engineering Center (INTEC) safety analysis units at the Idaho National Engineering and Environmental Laboratory (INEEL) are in the process of implementing the recently issued INEEL Safety Analyst Training Standard (STD-1107). Safety analyst training and qualifications are integral to the development and maintenance of core safety analysis capabilities. The INEEL Safety Analyst Training Standard (STD-1107) was developed directly from the EFCOG Training Subgroup's draft safety analyst training plan template, but has been adapted to the needs and requirements of the INEEL safety analysis community. The implementation of this Safety Analyst Training Standard is part of the Integrated Safety Management System (ISMS) Phase II Implementation currently underway at the INEEL. The objective of this paper is to discuss (1) the INEEL Safety Analyst Training Standard, (2) the development of the safety analyst individual training plans, (3) the implementation issues encountered during this initial phase of implementation, (4) the solutions developed, and (5) the implementation activities remaining to be completed.

  1. On Picturing a Candle: The Prehistory of Imagery Science.

    Science.gov (United States)

    MacKisack, Matthew; Aldworth, Susan; Macpherson, Fiona; Onians, John; Winlove, Crawford; Zeman, Adam

    2016-01-01

    The past 25 years have seen a rapid growth of knowledge about brain mechanisms involved in visual mental imagery. These advances have largely been made independently of the long history of philosophical - and even psychological - reckoning with imagery and its parent concept 'imagination'. We suggest that the view from these empirical findings can be widened by an appreciation of imagination's intellectual history, and we seek to show how that history both created the conditions for - and presents challenges to - the scientific endeavor. We focus on the neuroscientific literature's most commonly used task - imagining a concrete object - and, after sketching what is known of the neurobiological mechanisms involved, we examine the same basic act of imagining from the perspective of several key positions in the history of philosophy and psychology. We present positions that, firstly, contextualize and inform the neuroscientific account, and secondly, pose conceptual and methodological challenges to the scientific analysis of imagery. We conclude by reflecting on the intellectual history of visualization in the light of contemporary science, and the extent to which such science may resolve long-standing theoretical debates.

  2. A multisite validation of whole slide imaging for primary diagnosis using standardized data collection and analysis.

    Science.gov (United States)

    Wack, Katy; Drogowski, Laura; Treloar, Murray; Evans, Andrew; Ho, Jonhan; Parwani, Anil; Montalto, Michael C

    2016-01-01

    Text-based reporting and manual arbitration for whole slide imaging (WSI) validation studies are labor intensive and do not allow for consistent, scalable, and repeatable data collection or analysis. The objective of this study was to establish a method of data capture and analysis using standardized codified checklists and predetermined synoptic discordance tables and to use these methods in a pilot multisite validation study. Fifteen case report form checklists were generated from the College of American Pathology cancer protocols. Prior to data collection, all hypothetical pairwise comparisons were generated, and a level of harm was determined for each possible discordance. Four sites with four pathologists each generated 264 independent reads of 33 cases. Preestablished discordance tables were applied to determine site-by-site and pooled accuracy, intra-reader/intra-modality, and inter-reader/intra-modality error rates. Over 10,000 hypothetical pairwise comparisons were evaluated and assigned harm in discordance tables. The average difference in error rates between WSI and glass, as compared to ground truth, was 0.75% with an upper bound of 3.23% (95% confidence interval). Major discordances occurred on challenging cases, regardless of modality. The average inter-reader agreement across sites for glass was 76.5% (weighted kappa of 0.68) and for digital it was 79.1% (weighted kappa of 0.72). These results demonstrate the feasibility and utility of employing standardized synoptic checklists and predetermined discordance tables to gather consistent, comprehensive diagnostic data for WSI validation studies. This method of data capture and analysis can be applied in large-scale multisite WSI validations.
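
    Weighted kappa values like those quoted above can be computed directly from a two-rater agreement table. The three-category table below is hypothetical, and the choice of linear weights is an assumption (quadratic weights are also common):

```python
def weighted_kappa(matrix, weights="linear"):
    """Weighted Cohen's kappa for two raters over ordered categories.
    matrix[i][j] counts items rater A put in category i and rater B in j."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    row_tot = [sum(matrix[i][j] for j in range(k)) for i in range(k)]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]

    def w(i, j):  # disagreement weight, 0 on the diagonal
        d = abs(i - j)
        return d / (k - 1) if weights == "linear" else (d / (k - 1)) ** 2

    observed = sum(w(i, j) * matrix[i][j]
                   for i in range(k) for j in range(k)) / n
    expected = sum(w(i, j) * row_tot[i] * col_tot[j] / (n * n)
                   for i in range(k) for j in range(k))
    return 1.0 - observed / expected

# Hypothetical 3-category agreement table between two pathologists
table = [[30, 5, 1],
         [4, 25, 3],
         [0, 2, 10]]
print(round(weighted_kappa(table), 3))
```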

  3. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    Directory of Open Access Journals (Sweden)

    Mateusz G Adamski

    Over the past decade rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard qPCR and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments--regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
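
    A minimal sketch of the input-quantity idea, assuming perfect doubling per cycle: the efficiency-corrected quantity is scaled by the number of input cells and normalized to a universal reference sample. Function names and numbers are illustrative, not the authors' published algorithm:

```python
def copies_per_cell(cq, efficiency, n_cells):
    """Efficiency-corrected transcript quantity per input cell.
    efficiency is the amplification base (2.0 = perfect doubling per cycle)."""
    return efficiency ** (-cq) / n_cells

def normalized_expression(sample_cq, sample_cells, ref_cq, ref_cells,
                          efficiency=2.0):
    """Expression relative to a universal reference cDNA sample, making
    results comparable across batches and instruments."""
    return (copies_per_cell(sample_cq, efficiency, sample_cells)
            / copies_per_cell(ref_cq, efficiency, ref_cells))

# Hypothetical run: 10,000 cells at Cq 24 vs. a reference of 10,000 cells
# at Cq 26 -- two cycles earlier means fourfold more template.
print(round(normalized_expression(24.0, 1e4, 26.0, 1e4), 2))  # -> 4.0
```

    Because both sample and reference are expressed per cell and corrected by the same efficiency, control-gene normalization drops out entirely, which is the method's central point.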

  4. Providers and Patients Caught Between Standardization and Individualization: Individualized Standardization as a Solution

    Science.gov (United States)

    Ansmann, Lena; Pfaff, Holger

    2018-01-01

    In their 2017 article, Mannion and Exworthy provide a thoughtful and theory-based analysis of two parallel trends in modern healthcare systems and their competing and conflicting logics: standardization and customization. This commentary further discusses the challenge of treatment decision-making in times of evidence-based medicine (EBM), shared decision-making and personalized medicine. From the perspective of systems theory, we propose the concept of individualized standardization as a solution to the problem. According to this concept, standardization is conceptualized as a guiding framework leaving room for individualization in the patient-physician interaction. The theoretical background is the concept of context management according to systems theory. Moreover, the comment suggests multidisciplinary teams as a possible solution for the integration of standardization and individualization, using the example of multidisciplinary tumor conferences and highlighting its limitations. The comment also supports the authors’ statement of the patient as co-producer and introduces the idea that the competing logics of standardization and individualization are a matter of perspective on macro, meso and micro levels. PMID:29626403

  5. System 80+™ Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80+™ Standard Design. This Volume 18 provides Appendix B, Probabilistic Risk Assessment.

  6. System 80+™ Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80+™ Standard Design. This Volume 9 discusses Electric Power and Auxiliary Systems.

  7. System 80+™ Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80+™ Standard Design. This Volume 8 provides a description of instrumentation and controls.

  8. A practical approach to estimate emission rates of indoor air pollutants due to the use of personal combustible products based on small-chamber studies.

    Science.gov (United States)

    Szulejko, Jan E; Kim, Ki-Hyun

    2016-02-01

    As emission rates of airborne pollutants are commonly measured from combusting substances placed inside small chambers, those values need to be re-evaluated for their possible significance under practical conditions. Here, a simple numerical procedure is investigated to extrapolate the chamber-based emission rates of formaldehyde that can be released from various combustible sources, including e-cigarettes, conventional cigarettes, or scented candles, to their concentration levels in a small room with relatively poor ventilation. This simple procedure relies on a mass-balance approach, considering the masses of pollutant emitted from the source and lost through ventilation, under the assumption that mixing occurs instantaneously in the room without chemical reactions or surface sorption. The results of our study provide valuable insights into the re-evaluation of chamber data, allowing comparison between extrapolated and recommended values to judge the safe use of various combustible products in confined spaces. If two scented candles with a formaldehyde emission rate of 310 µg/h each were lit for 4 h in a small 20 m³ room with an air change rate of 0.5 h⁻¹, then the 4-h (candle lit) and 8-h (up to 8 h after candle lighting) time-weighted average (TWA) formaldehyde concentrations were determined to be 28.5 and 23.5 ppb, respectively. This is clearly above the 8-h NIOSH recommended exposure limit (REL), a time-weighted average of 16 ppb. Copyright © 2015 Elsevier Ltd. All rights reserved.
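
    The mass-balance procedure in the abstract can be reproduced approximately in a few lines. With the stated inputs (two candles at 310 µg/h, a 20 m³ room, 0.5 h⁻¹ air change, 4 h of burning), this sketch yields roughly 28.7 and 23.8 ppb for the 4-h and 8-h TWAs, close to the reported 28.5 and 23.5 ppb; the small differences come from the unit-conversion constants assumed here:

```python
import math

# Well-mixed single-zone mass balance, dC/dt = E/V - k*C, assuming instant
# mixing and no reactions or surface sorption (the abstract's assumptions).
E = 2 * 310.0     # formaldehyde emission rate, ug/h (two candles)
V = 20.0          # room volume, m^3
k = 0.5           # air change rate, 1/h
T_BURN = 4.0      # candles lit for 4 h

C_SS = E / (V * k)   # steady-state concentration while lit, ug/m^3

def conc(t):
    """Concentration (ug/m^3) at time t (h): build-up while lit, then decay."""
    if t <= T_BURN:
        return C_SS * (1.0 - math.exp(-k * t))
    return conc(T_BURN) * math.exp(-k * (t - T_BURN))

def twa(t_end, steps=100_000):
    """Time-weighted average concentration over [0, t_end] (midpoint rule)."""
    dt = t_end / steps
    return sum(conc((i + 0.5) * dt) for i in range(steps)) * dt / t_end

# Assumed conversion: molar volume at 25 C (24.45 L/mol) / molar mass of HCHO
UG_M3_TO_PPB = 24.45 / 30.03

print(round(twa(4.0) * UG_M3_TO_PPB, 1))   # 4-h TWA, ppb
print(round(twa(8.0) * UG_M3_TO_PPB, 1))   # 8-h TWA, ppb
```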

  9. Memorial Candles: Beauty as Consolation

    Directory of Open Access Journals (Sweden)

    Mindy Weisel

    2011-01-01

    Do we ever get used to the feelings of loss? Time supposedly heals all wounds. Does it really? Or do we take that time and take that loss and turn it into something else, something that takes the shape and the form of our loss. Is this perhaps the source of the deepest art? Is it the art that actually gives our lives meaning? There are clearly feelings that are beyond comprehension. It is these feelings that are put into the music, poetry, painting, photography, prose, and theater that enrich our lives, and that are addressed in this book. The women in “Daughters of Absence” all have one thing in common: as daughters of Holocaust survivors they have found a strong voice through their work. For these creative women, their work has been both life force and life saver.

  10. The PREP Pipeline: Standardized preprocessing for large-scale EEG analysis

    Directory of Open Access Journals (Sweden)

    Nima Bigdely-Shamlo

    2015-06-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode/.

  11. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
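
    The effect of a robust reference can be illustrated with a toy stand-in: re-referencing to the channel median rather than the mean keeps a single bad channel from contaminating the reference. This is not the actual PREP algorithm (which detects and interpolates bad channels before averaging), and the data are hypothetical:

```python
from statistics import median

def robust_reference(eeg):
    """Re-reference each sample to the median across channels -- a toy
    stand-in for PREP's robust reference, which detects and interpolates
    bad channels before computing an average reference."""
    n_samples = len(eeg[0])
    refs = [median(ch[t] for ch in eeg) for t in range(n_samples)]
    return [[ch[t] - refs[t] for t in range(n_samples)] for ch in eeg]

# Three channels, one with gross artifacts; the median reference ignores it,
# whereas a plain mean reference would smear the artifact into every channel.
eeg = [[1.0, 2.0, 3.0],
       [1.1, 2.1, 2.9],
       [50.0, -40.0, 80.0]]   # noisy channel
clean = robust_reference(eeg)
print([round(v, 2) for v in clean[0]])  # -> [-0.1, 0.0, 0.0]
```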

  12. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, such as system layout, diagnostics, testing and repair. Standards like the German DIN demand no quantitative analysis (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990); the analysis according to these standards is based on expert opinion and qualitative analysis techniques. Newer standards like IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform it. Earlier publications by the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D.W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York, 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows, by means of a case study, that different (quantitative) analysis techniques may lead to different results. The consequence is that applying the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.
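
    The paper's point that technique choice changes the numbers can be illustrated with one of the simplest quantitative measures: a simplified single-channel (1oo1) average probability of failure on demand, of the kind associated with IEC 61508. The failure rate and test interval below are hypothetical, and other techniques (Markov models, fault trees) would generally give different values for the same system:

```python
# Simplified IEC 61508-style PFDavg approximation for a single-channel
# (1oo1) safety function. One of several competing approximations; more
# detailed techniques (Markov, fault tree) give different numbers.
def pfd_avg(lambda_du, proof_test_interval_h, mttr_h=8.0):
    # lambda_du: dangerous-undetected failure rate per hour (assumed constant)
    # proof_test_interval_h: time between full proof tests, hours
    return lambda_du * (proof_test_interval_h / 2.0 + mttr_h)

# Hypothetical sensor: lambda_DU = 2e-6 /h, proof-tested yearly (8760 h)
print(f"PFDavg = {pfd_avg(2e-6, 8760):.2e}")
```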

  13. Internal standardization in atomic-emission spectrometry using inductively coupled plasma

    International Nuclear Information System (INIS)

    Moore, G.L.

    1985-01-01

    The principle of internal standardization has been used in quantitative analytical emission spectroscopy since 1925 to minimize the errors arising from fluctuations in sample preparation, excitation-source conditions, and detection parameters. Although modern spectroscopic excitation sources are far more stable and electronic detection methods more precise than before, the sample-introduction system in spectrometric analysis using inductively coupled plasma (ICP) introduces significant errors, and internal standardization can still play a useful role in improving the overall precision of the analytical results. The criteria for selecting the elements to be used as internal standards in arc and spark spectrographic analysis apply to a much lesser extent in ICP spectrometric analysis. Internal standardization is recommended for use in routine simultaneous ICP spectrometric analysis to improve its accuracy and precision and to provide a monitor for the reassurance of the analyst. However, the selection of an unsuitable reference element can result in misuse of the principle of internal standardization, and although internal standardization can be applied when a sequential monochromator is used, the main sources of error will not then be minimized.
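
    The principle itself reduces to ratioing the analyte signal against a simultaneously measured reference-element signal, so that fluctuations common to both channels cancel. A minimal sketch with hypothetical counts:

```python
# Ratioing the analyte signal to a simultaneously measured internal-standard
# signal cancels fluctuations common to both channels (e.g. sample-
# introduction drift in an ICP), which is the core of internal standardization.
def internal_standard_ratio(analyte_counts, internal_std_counts):
    return analyte_counts / internal_std_counts

# Hypothetical replicate runs: the second run suffers a 10% correlated drift
# in both channels, yet the normalized ratios agree.
runs = [(1000.0, 5000.0), (900.0, 4500.0)]   # (analyte, internal standard)
ratios = [internal_standard_ratio(a, s) for a, s in runs]
print(ratios)  # -> [0.2, 0.2]
```

    The cancellation only works if the drift really is common to both channels, which is why the choice of reference element matters: an unsuitable one re-introduces exactly the errors the ratio was meant to remove.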

  14. ARM Data File Standards Version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Palanisamy, Giri [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-05-01

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing data volumes. They will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities of delivering data on demand tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  15. The Standard Deviation of Launch Vehicle Environments

    Science.gov (United States)

    Yunis, Isam

    2005-01-01

    Statistical analysis is used in the development of the launch vehicle environments of acoustics, vibrations, and shock. The standard deviation of these environments is critical to accurate statistical extrema. However, often very little data exists to define the standard deviation, and it is better to use a typical standard deviation than one derived from a few measurements. This paper uses Space Shuttle and expendable launch vehicle flight data to define a typical standard deviation for acoustics and vibrations. The results suggest that 3 dB is a conservative and reasonable standard deviation for the source environment and the payload environment.

  16. Achieving the 30% Goal: Energy and Cost Savings Analysis of ASHRAE Standard 90.1-2010

    Energy Technology Data Exchange (ETDEWEB)

    Thornton, Brian A.; Rosenberg, Michael I.; Richman, Eric E.; Wang, Weimin; Xie, YuLong; Zhang, Jian; Cho, Heejin; Mendon, Vrushali V.; Athalye, Rahul A.; Liu, Bing

    2011-05-24

    This Technical Support Document presents the energy and cost savings analysis that PNNL conducted to measure the potential energy savings of Standard 90.1-2010 relative to 90.1-2004. PNNL conducted this analysis with inputs from many other contributors and sources of information. In particular, guidance and direction were provided by the Simulation Working Group under the auspices of the SSPC90.1. This report documents the approach and methodologies that PNNL developed to evaluate the energy savings achieved from use of ASHRAE/IES Standard 90.1-2010. Specifically, this report provides PNNL’s Progress Indicator process and methodology, the EnergyPlus simulation framework, and the prototype model descriptions. The report covers the combined upgrades from 90.1-2004 to 90.1-2010, a total of 153 addenda, all of which PNNL reviewed and considered for quantitative analysis in the Progress Indicator process; 53 of them are included in the quantitative analysis. The report provides the categorization of all of the addenda, a summary of their content, and a deeper explanation of the impact and modeling of the 53 addenda with quantified savings.

  17. [Analysis of varieties and standards of Scrophulariaceae plants used in Tibetan medicine].

    Science.gov (United States)

    Cao, Lan; Mu, Ze-jing; Zhong, Wei-hong; Zhong, Wei-jin; He, Jun-wei; Du, Xiao-lang; Zhong, Guo-yue

    2015-12-01

    In this paper, the common commercial varieties and quality standards of Scrophulariaceae plants used in Tibetan medicine were analyzed. The results showed that 11 genera and 99 species (including varieties) of Scrophulariaceae plants, corresponding to 28 medicinal material varieties, are recorded in the relevant literature. The relevant Tibetan standards and literature differ greatly in the varieties, sources, medicinal parts, and efficacies of these plants. Among them, about 41.4% (41 species) are endemic plants, and only about 15.2% (15 species) of the original plants have legally recorded medicinal standards; apart from the medicinal materials of Scrophularia ningpoensis, Lagotis brevituba, Picrorhiza scrophulariiflora, and Veronica eriogyne, most varieties lack complete quality standards. Consequently, it is necessary to reinforce herbal textual research, the investigation of resources and their present use, studies of the material basis of the effects and the biological activity of these species resources, and quality standards; to standardize the medical terminology for these plants; and to promote the standardization of Tibetan medicinal variety-terminology-source relationships and the quality standard system, so as to enrich the varieties of Tibetan medicinal materials and Chinese medicinal resources.

  18. Electromagnetic Scattering Analysis of Coated Conductors With Edges Using the Method of Auxiliary Sources (MAS) in Conjunction With the Standard Impedance Boundary Condition (SIBC)

    DEFF Research Database (Denmark)

    Anastassiu, H.T.; Kaklamani, D.I.; Economou, D.P.

    2002-01-01

    A novel combination of the method of auxiliary sources (MAS) and the standard impedance boundary condition (SIBC) is employed in the analysis of transverse magnetic (TM) plane wave scattering from infinite, coated, perfectly conducting cylinders with square cross sections. The scatterer is initia...

  19. Halogens determination in vegetable NBS standard reference materials

    International Nuclear Information System (INIS)

    Stella, R.; Genova, N.; Di Casa, M.

    1977-01-01

    Levels of all four halogens in the Orchard Leaves, Pine Needles and Tomato Leaves NBS reference standards were determined. For fluorine, a spiking isotope dilution method was used, followed by HF absorption on glass beads. Instrumental nuclear activation analysis was adopted for the chlorine and bromine determinations. For iodine, radiochemical separation by a distillation procedure was necessary after irradiation. Activation parameters of Cl, Br and I are reported, together with the results of five determinations for each halogen in the Orchard Leaves, Pine Needles and Tomato Leaves NBS standard materials and the standard deviations of the mean. (T.I.)

  20. Binary trading relations and the limits of EDI standards

    DEFF Research Database (Denmark)

    Damsgaard, Jan; Truex, D.

    2000-01-01

    This paper provides a critical examination of electronic data interchange (EDI) standards and their application in different types of trading relationships. It argues that EDI standards are not directly comparable to more stable sets of technical standards in that they are dynamically tested and negotiated in use with each trading exchange. It takes the position that EDI standards are an emergent language form and must mean different things at the institutional and local levels. Using the lens of emergent linguistic analysis, it shows how the institutional and local levels must always be distinct and yet can coexist. EDI standards can never represent the creation of an 'Esperanto of institutional communication'. Instead, we believe that standards must be developed such that they support and accommodate general basic grammatical forms that can be customised to individual needs.

  1. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (Spanish Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  2. SRAC: JAERI thermal reactor standard code system for reactor design and analysis

    International Nuclear Information System (INIS)

    Tsuchihashi, Keichiro; Takano, Hideki; Horikami, Kunihiko; Ishiguro, Yukio; Kaneko, Kunio; Hara, Toshiharu.

    1983-01-01

    The SRAC (Standard Reactor Analysis Code) is a code system for nuclear reactor analysis and design. It is composed of neutron cross section libraries and auxiliary processing codes, neutron spectrum routines, a variety of transport and 1-, 2- and 3-D diffusion routines, and dynamic parameter and cell burn-up routines. By making the best use of the individual code functions in the SRAC system, the user can select either an exact method for an accurate estimate of reactor characteristics or an economical method aimed at shorter computing time, depending on the purpose of the study. The user can select cell or core calculation; a fixed source or eigenvalue problem; and transport (collision probability or Sn) theory or diffusion theory. Moreover, smearing and collapsing of macroscopic cross sections are done separately at the user's selection, and special attention is paid to double heterogeneity. Various techniques are employed to access the data storage and to optimize internal data transfer. Benchmark calculations using the SRAC system have been made extensively for the Keff values of various types of critical assemblies (light water, heavy water and graphite moderated systems, and fast reactor systems). The calculated results show good prediction of the experimental Keff values. (author)

  3. Manual of Standard Operating Procedures for Veterinary Drug Residue Analysis (French Edition)

    International Nuclear Information System (INIS)

    2017-01-01

    Laboratories are crucial to national veterinary drug residue monitoring programmes. However, one of the main challenges laboratories encounter is obtaining access to relevant methods of analysis. Thus, in addition to training, providing technical advice and transferring technology, the Joint FAO/IAEA Division of Nuclear Techniques in Food and Agriculture has resolved to develop clear and practical manuals to support Member State laboratories. The Coordinated Research Project (CRP) on Development of Radiometric and Allied Analytical Methods to Strengthen Residue Control Programs for Antibiotic and Anthelmintic Veterinary Drug Residues has developed a number of analytical methods as standard operating procedures (SOPs), which are now compiled here. This publication contains SOPs on chromatographic and spectrometric techniques, as well as radioimmunoassay and associated screening techniques, for various anthelmintic and antimicrobial veterinary drug residue analysis. Some analytical method validation protocols are also included. The publication is primarily aimed at food and environmental safety laboratories involved in testing veterinary drug residues, including under organized national residue monitoring programmes. It is expected to enhance laboratory capacity building and competence through the use of radiometric and complementary tools and techniques. The publication is also relevant for applied research on residues of veterinary drugs in food and environmental samples

  4. Standard test method for analysis of uranium and thorium in soils by energy dispersive X-Ray fluorescence spectroscopy

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2011-01-01

    1.1 This test method covers the energy dispersive X-ray fluorescence (EDXRF) spectrochemical analysis of trace levels of uranium and thorium in soils. Any sample matrix that differs from the general ground soil composition used for calibration (that is, fertilizer or a sample of mostly rock) would have to be calibrated separately to determine the effect of the different matrix composition. 1.2 The analysis is performed after an initial drying and grinding of the sample, and the results are reported on a dry basis. The sample preparation technique used incorporates into the sample any rocks and organic material present in the soil. This test method of sample preparation differs from other techniques that involve tumbling and sieving the sample. 1.3 Linear calibration is performed over a concentration range from 20 to 1000 μg per gram for uranium and thorium. 1.4 The values stated in SI units are to be regarded as the standard. The inch-pound units in parentheses are for information only. 1.5 This standard...
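
    The linear calibration described in 1.3 can be sketched as an ordinary least-squares fit of net fluorescence intensity against standard concentration, inverted to report an unknown. The data values, function names, and the assumption that intensity responds linearly to concentration over 20 to 1000 μg/g are illustrative, not taken from the standard.

```python
# Illustrative EDXRF linear calibration (hypothetical data, not from ASTM).
# Known standards are fitted with a least-squares line I = m*C + b, and the
# line is inverted to estimate the concentration of an unknown sample.

def fit_calibration(concentrations, intensities):
    """Return (slope, intercept) of the least-squares line I = m*C + b."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_i = sum(intensities) / n
    cov = sum((c - mean_c) * (i - mean_i)
              for c, i in zip(concentrations, intensities))
    var = sum((c - mean_c) ** 2 for c in concentrations)
    slope = cov / var
    intercept = mean_i - slope * mean_c
    return slope, intercept

def concentration_from_intensity(intensity, slope, intercept):
    """Invert the calibration line to estimate concentration (ug/g)."""
    return (intensity - intercept) / slope

# Hypothetical uranium standards spanning the 20-1000 ug/g calibration range.
standards = [20.0, 100.0, 250.0, 500.0, 1000.0]   # concentration, ug/g
counts = [41.0, 205.0, 498.0, 1003.0, 2001.0]     # net X-ray counts (made up)

m, b = fit_calibration(standards, counts)
unknown = concentration_from_intensity(600.0, m, b)
```

    In practice a matrix that differs from the calibration soils (see 1.1) would need its own fitted line.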

  5. Non-standard work schedules, gender, and parental stress

    Directory of Open Access Journals (Sweden)

    Mariona Lozano

    2016-02-01

    Background: Working non-standard hours changes the temporal structure of family life, constraining the time that family members spend with one another and threatening individuals' well-being. However, the empirical research on the link between stress and non-standard schedules has provided mixed results. Some studies have indicated that working non-standard hours is harmful whereas others have suggested that working atypical hours might facilitate the balance between family and work. Moreover, there is some evidence that the association between stress and non-standard employment has different implications for men and women. Objective: This paper examines the association between non-standard work schedules and stress among dual-earner couples with children. Two research questions are addressed. First, do predictability of the schedule and time flexibility moderate the link between non-standard work hours and stress? Second, do non-standard schedules affect men's and women's perceptions of stress differently? Methods: We use a sample of 1,932 working parents from the Canadian 2010 General Social Survey, which includes a time-use diary. A sequential logit regression analysis stratified by gender is employed to model two types of result. First, we estimate the odds of being stressed versus not being stressed. Second, for all respondents feeling stressed, we estimate the odds of experiencing high levels versus moderate levels of stress. Results: Our analysis shows that the link between non-standard working hours and perceived stress differs between mothers and fathers. First, fathers with non-standard schedules appear more likely to experience stress than those working standard hours, although the results are not significant. Among mothers, having a non-standard schedule is associated with a significantly lower risk of experiencing stress. Second, the analysis focusing on the mediating role of flexibility and predictability indicates that

  6. System 80+trademark Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80 + trademark Standard Design. This volume 11 discusses Radiation Protection, Conduct of Operations, and the Initial Test Program

  7. System 80+trademark Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80 + trademark Standard Design. This volume 10 discusses the Steam and Power Conversion System and Radioactive Waste Management

  8. System 80+trademark Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report - Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80+trademark Standard Design. This Volume 16 details the application of Human Factors Engineering in the design process

  9. The quantitative analysis of Bowen's kale by PIXE using the internal standard

    International Nuclear Information System (INIS)

    Navarrete, V.R.; Izawa, G.; Shiokawa, T.; Kamiya, M.; Morita, S.

    1978-01-01

    The internal standard method was used for non-destructive quantitative determination of trace elements by PIXE. A uniform distribution of the internal standard element in the Bowen's kale powder sample was obtained by a homogenization technique. Eleven elements were determined quantitatively; samples prepared as self-supporting targets had lower relative standard deviations than non-self-supporting targets. (author)

  10. Rural electrification program with renewable energy sources: An analysis of China’s Township Electrification Program

    International Nuclear Information System (INIS)

    Shyu, Chian-Woei

    2012-01-01

    Given that 1.4 billion people, over 20% of the world’s population, lack access to electricity, rural electrification remains a common challenge for many developing countries. The ‘Township Electrification Program’ launched by the Chinese government in 2002 is known as the world’s largest renewable energy-based rural electrification program, in terms of investment volume, ever carried out by a country. This study gives an in-depth examination of the program as implemented in two selected townships in remote rural areas of western China. The results showed that the implementation of the program had a technical orientation (e.g., construction of stations, installation of systems) and underestimated the financial implications (e.g., electricity tariff, households’ ability to pay electricity fees, financial management) as well as the human resources available (e.g., training for operators, household participation) and institutional capacity building (e.g., good governance, regulatory framework) at the local level. Even though electricity was provided by the solar PV power stations, households still relied on traditional energy sources, such as candles and dry cell batteries, because the electricity service was unreliable and the supply was not sufficient for households’ needs. - Highlights: ► China’s electrification rate has reached the level of OECD countries. ► The Township Electrification Program is the world’s largest electrification program. ► The program possessed a technical orientation and underestimated other aspects. ► Households still relied on traditional energy, such as candles and batteries. ► Having electricity access did not mean that electricity was actually used.

  11. Standardization of depression measurement

    DEFF Research Database (Denmark)

    Wahl, Inka; Löwe, Bernd; Bjørner, Jakob

    2014-01-01

    OBJECTIVES: To provide a standardized metric for the assessment of depression severity to enable comparability among results of established depression measures. STUDY DESIGN AND SETTING: A common metric for 11 depression questionnaires was developed applying item response theory (IRT) methods. Data of 33,844 adults were used for secondary analysis, including routine assessments of 23,817 in- and outpatients with mental and/or medical conditions (46% with depressive disorders) and a general population sample of 10,027 randomly selected participants from three representative German household surveys. RESULTS: A standardized metric for depression severity was defined by 143 items, and scores were normed to a general population mean of 50 (standard deviation = 10) for easy interpretability. It covers the entire range of depression severity assessed by established instruments.
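
    Norming scores to a population mean of 50 with a standard deviation of 10, as this record describes, is the familiar T-score transformation. A minimal sketch, with hypothetical latent (theta) values standing in for the IRT severity estimates:

```python
# Illustrative T-score norming: a latent severity estimate on a z-like scale
# (population mean 0, SD 1) is rescaled to mean 50, SD 10.

def to_t_score(theta, pop_mean=0.0, pop_sd=1.0):
    """Map a latent score to the T-score metric (mean 50, SD 10)."""
    z = (theta - pop_mean) / pop_sd
    return 50.0 + 10.0 * z

# Hypothetical examples: an average respondent, and one 1.5 SD above the mean.
average = to_t_score(0.0)
elevated = to_t_score(1.5)
```

    On this metric a score of 50 is the general-population average and each 10 points is one population standard deviation, which is what makes scores from different questionnaires directly comparable.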

  12. Evaluation of spectrographic standards for the carrier-distillation analysis of PuO2

    International Nuclear Information System (INIS)

    Martell, C.J.; Myers, W.M.

    1976-05-01

    Three plutonium metals whose impurity contents have been accurately determined are used to evaluate spectrographic standards. Best results are obtained when (1) highly impure samples are diluted, (2) the internal standard, cobalt, is used, (3) a linear curve is fitted to the standard data that bracket the impurity concentration, and (4) plutonium standards containing 22 impurities are used

  13. System 80+trademark Standard Design: CESSAR design certification

    International Nuclear Information System (INIS)

    1990-01-01

    This report, entitled Combustion Engineering Standard Safety Analysis Report -- Design Certification (CESSAR-DC), has been prepared in support of the industry effort to standardize nuclear plant designs. These volumes describe the Combustion Engineering, Inc. System 80+trademark Standard Design. This Volume 17 provides Appendix A of this report, closure of Unresolved and Generic Safety Issues

  14. Standardization of dosimetry and damage analysis work for U.S. LWR, FBR, and MFR development program

    International Nuclear Information System (INIS)

    McElroy, W.N.; Doran, D.G.; Gold, R.; Morgan, W.C.; Grundl, J.A.; McGarry, E.D.; Kam, F.B.K.; Swank, J.H.; Odette, G.R.

    1978-01-01

    The accuracy requirements for various measured/calculated exposure and correlation parameters associated with current dosimetry and damage analysis procedures and practices depend on the accuracy needs of reactor development efforts in testing, design, safety, operations, and surveillance programs. Present state-of-the-art accuracies are estimated to be in the range of ±2 to 30 percent (1 sigma), depending on the particular parameter. There now appears to be international agreement, at least for the long term, that most reactor fuels and materials programs will not be able to accept an uncertainty greater than about ±5 percent (1 sigma). The current status of dosimetry and damage analysis standardization work within the U.S. for LWR, FBR and MFR programs is reviewed in this paper

  15. Extended substitution-diffusion based image cipher using chaotic standard map

    Science.gov (United States)

    Kumar, Anil; Ghose, M. K.

    2011-01-01

    This paper proposes an extended substitution-diffusion based image cipher using the chaotic standard map [1] and a linear feedback shift register, adding nonlinearity to overcome a weakness of the previous technique. The first stage consists of row and column rotation and permutation, controlled by pseudo-random sequences generated by the standard chaotic map and the linear feedback shift register; in the second stage, further diffusion and confusion are obtained in the horizontal and vertical pixels by mixing the properties of the horizontally and vertically adjacent pixels, respectively, with the help of the chaotic standard map. The number of rounds in both stages is controlled by a combination of the pseudo-random sequence and the original image. Performance is evaluated by various types of analysis: entropy analysis, difference analysis, statistical analysis, key sensitivity analysis, key space analysis and speed analysis. The experimental results illustrate that the cipher is highly secure and fast.
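
    The pseudo-random source named in this record, the chaotic standard (Chirikov) map, can be sketched as follows. The paper's exact keystream construction is not given in the abstract, so the quantization to bytes, the function name, and the key values here are illustrative assumptions only.

```python
import math

# Sketch of a pseudo-random byte generator driven by the chaotic standard
# (Chirikov) map: p' = (p + k*sin(theta)) mod 2*pi, theta' = (theta + p') mod 2*pi.
# This only illustrates iterating the map and quantizing its state to bytes;
# it is not the cipher's actual keystream derivation.

TWO_PI = 2.0 * math.pi

def standard_map_bytes(theta, p, k, n):
    """Iterate the standard map n times and emit one byte per iteration."""
    out = []
    for _ in range(n):
        p = (p + k * math.sin(theta)) % TWO_PI
        theta = (theta + p) % TWO_PI
        out.append(int(theta / TWO_PI * 256) % 256)  # quantize angle to 0..255
    return out

# The key is the initial condition (theta, p) and parameter k; values made up.
stream = standard_map_bytes(theta=1.2345, p=2.3456, k=18.9, n=16)
```

    Sensitivity to the initial condition is what makes such a map attractive for the key sensitivity property the abstract evaluates: a tiny change in theta, p, or k yields an entirely different byte sequence after a few iterations.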

  16. A multisite validation of whole slide imaging for primary diagnosis using standardized data collection and analysis

    Directory of Open Access Journals (Sweden)

    Katy Wack

    2016-01-01

    Context: Text-based reporting and manual arbitration for whole slide imaging (WSI) validation studies are labor intensive and do not allow for consistent, scalable, and repeatable data collection or analysis. Objective: The objective of this study was to establish a method of data capture and analysis using standardized codified checklists and predetermined synoptic discordance tables, and to use these methods in a pilot multisite validation study. Methods and Study Design: Fifteen case report form checklists were generated from the College of American Pathologists cancer protocols. Prior to data collection, all hypothetical pairwise comparisons were generated, and a level of harm was determined for each possible discordance. Four sites with four pathologists each generated 264 independent reads of 33 cases. Preestablished discordance tables were applied to determine site-by-site and pooled accuracy, intrareader/intramodality, and interreader/intramodality error rates. Results: Over 10,000 hypothetical pairwise comparisons were evaluated and assigned harm in discordance tables. The average difference in error rates between WSI and glass, as compared to ground truth, was 0.75%, with a lower bound of 3.23% (95% confidence interval). Major discordances occurred on challenging cases, regardless of modality. The average interreader agreement across sites was 76.5% for glass (weighted kappa of 0.68) and 79.1% for digital (weighted kappa of 0.72). Conclusion: These results demonstrate the feasibility and utility of employing standardized synoptic checklists and predetermined discordance tables to gather consistent, comprehensive diagnostic data for WSI validation studies. This method of data capture and analysis can be applied in large-scale multisite WSI validations.
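
    The weighted kappa statistic this record reports can be computed as sketched below. The rating data, function name, and the choice of linear weights are hypothetical; the study does not state its weighting scheme in the abstract.

```python
# Illustrative linearly weighted Cohen's kappa for two raters grading the
# same cases into ordered integer categories 0..k-1 (hypothetical data).

def weighted_kappa(rater_a, rater_b, n_categories):
    """Linear-weighted Cohen's kappa: 1 - sum(w*O) / sum(w*E)."""
    n = len(rater_a)
    # Observed joint distribution of the two raters' category assignments.
    observed = [[0.0] * n_categories for _ in range(n_categories)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1.0 / n
    # Marginal distributions give the chance-expected joint distribution.
    marg_a = [sum(row) for row in observed]
    marg_b = [sum(observed[i][j] for i in range(n_categories))
              for j in range(n_categories)]
    num = den = 0.0
    for i in range(n_categories):
        for j in range(n_categories):
            w = abs(i - j) / (n_categories - 1)  # linear disagreement weight
            num += w * observed[i][j]
            den += w * marg_a[i] * marg_b[j]
    return 1.0 - num / den

# Two hypothetical pathologists grading 8 cases into 3 categories.
a = [0, 1, 2, 1, 0, 2, 1, 1]
b = [0, 1, 2, 0, 0, 2, 1, 2]
kappa = weighted_kappa(a, b, n_categories=3)
```

    A kappa of 1 indicates perfect agreement and 0 indicates chance-level agreement, so the study's values of 0.68 and 0.72 sit in the range conventionally read as substantial agreement.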

  17. Determination analysis of energy conservation standards for distribution transformers

    Energy Technology Data Exchange (ETDEWEB)

    Barnes, P.R.; Van Dyke, J.W.; McConnell, B.W.; Das, S.

    1996-07-01

    This report contains information for the US DOE to use in making a determination on proposing energy conservation standards for distribution transformers, as required by the Energy Policy Act of 1992. The potential for saving energy with more efficient liquid-immersed and dry-type distribution transformers could be significant, because these transformers account for an estimated 140 billion kWh of the annual energy lost in the delivery of electricity. The objective was to determine whether energy conservation standards for distribution transformers would have the potential for significant energy savings, be technically feasible, and be economically justified from a national perspective. It was found that energy conservation standards for distribution transformers would be technically and economically feasible. Based on the energy conservation options analyzed, 3.6-13.7 quads of energy could be saved from 2000 to 2030.

  18. ['Gold standard', not 'golden standard'

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same

  19. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  20. Standard guide for precision electroformed wet sieve analysis of nonplastic ceramic powders

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2009-01-01

    1.1 This guide covers the determination of the particle size distribution of pulverized alumina and quartz for particle sizes from 45 to 5 μm by wet sieving. 1.2 The values stated in SI units are to be regarded as standard. No other units of measurement are included in this standard. 1.2.1 The only exception is in Section 5, Apparatus, 5.1, where there is no relevant SI equivalent. 1.3 This standard does not purport to address the safety concerns associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.