WorldWideScience

Sample records for sample volume part-per-billion

  1. Methods for preparation of mixtures of gases in air at the parts-per-billion to parts-per-million concentration range for calibration of monitors

    International Nuclear Information System (INIS)

    Karpas, Z.; Melloul, S.; Pollevoy, Y.; Matmor, A.

    1992-05-01

    Static and dynamic methods for generating mixtures of gases and vapors in air at the parts-per-billion (ppb) to parts-per-million (ppm) concentration range were surveyed. The dynamic methods include: a dynamic flow and mixing system; injection of samples into large volumes of air; exponential dilution; permeation and diffusion tubes; and generation of the target gas by chemical reaction or electrolysis. The static methods include preparation of mixtures by weighing the components, by volumetric mixing and by the partial pressure method. The principles governing the utilization of these methods for the appropriate applications were discussed, and examples in which they were used to calibrate an ion mobility spectrometer (IMS) were given. (authors)
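
    For orientation only (not part of the record above), the exponential-dilution method it mentions follows a simple first-order relation: a well-mixed vessel of volume V flushed with clean diluent at flow rate Q carries an initial concentration C0 down as C(t) = C0·exp(−Qt/V). A minimal sketch with hypothetical numbers:

```python
import math

def exponential_dilution(c0_ppb: float, flow_lpm: float, volume_l: float, t_min: float) -> float:
    """Concentration (ppb) leaving a well-mixed vessel of volume V (litres)
    after t minutes of flushing with clean diluent at Q litres per minute:
    C(t) = C0 * exp(-Q * t / V)."""
    return c0_ppb * math.exp(-flow_lpm * t_min / volume_l)

# Hypothetical example: a 1000 ppb (1 ppm) mixture in a 2 L vessel flushed at
# 0.5 L/min reaches the sub-ppb range after about 30 minutes.
print(round(exponential_dilution(1000.0, 0.5, 2.0, 30.0), 3))  # ~0.553 ppb
```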

  2. A parts-per-billion measurement of the antiproton magnetic moment

    CERN Document Server

    Smorra, C; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-01-01

    Precise comparisons of the fundamental properties of matter–antimatter conjugates provide sensitive tests of charge–parity–time (CPT) invariance [1], which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons [2], leptons [3,4] and baryons [5,6] have compared different properties of matter–antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level [7,8]: the magnetic moment of the antiproton, μp̄. The extraordinary difficulty in measuring μp̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron [3]. Here we report a high-precision measurement of μp̄ in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic ...

  3. A parts-per-billion measurement of the antiproton magnetic moment.

    Science.gov (United States)

    Smorra, C; Sellner, S; Borchert, M J; Harrington, J A; Higuchi, T; Nagahama, H; Tanaka, T; Mooser, A; Schneider, G; Bohman, M; Blaum, K; Matsuda, Y; Ospelkaus, C; Quint, W; Walz, J; Yamazaki, Y; Ulmer, S

    2017-10-18

    Precise comparisons of the fundamental properties of matter-antimatter conjugates provide sensitive tests of charge-parity-time (CPT) invariance, which is an important symmetry that rests on basic assumptions of the standard model of particle physics. Experiments on mesons, leptons and baryons have compared different properties of matter-antimatter conjugates with fractional uncertainties at the parts-per-billion level or better. One specific quantity, however, has so far only been known to a fractional uncertainty at the parts-per-million level: the magnetic moment of the antiproton, μp̄. The extraordinary difficulty in measuring μp̄ with high precision is caused by its intrinsic smallness; for example, it is 660 times smaller than the magnetic moment of the positron. Here we report a high-precision measurement of μp̄ in units of the nuclear magneton μN with a fractional precision of 1.5 parts per billion (68% confidence level). We use a two-particle spectroscopy method in an advanced cryogenic multi-Penning trap system. Our result, μp̄ = -2.7928473441(42)μN (where the number in parentheses represents the 68% confidence interval on the last digits of the value), improves the precision of the previous best measurement by a factor of approximately 350. The measured value is consistent with the proton magnetic moment, μp = 2.792847350(9)μN, and is in agreement with CPT invariance. Consequently, this measurement constrains the magnitude of certain CPT-violating effects to below 1.8 × 10⁻²⁴ gigaelectronvolts, and a possible splitting of the proton-antiproton magnetic moments by CPT-odd dimension-five interactions to below 6 × 10⁻¹² Bohr magnetons.
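
    As a quick numerical illustration (not part of the record), the two values quoted above can be compared directly; the fractional difference between |μp̄| and μp is smaller than the combined uncertainty, which is the sense in which the result agrees with CPT invariance.

```python
# Consistency check using the values quoted in the abstract above.
mu_pbar = -2.7928473441    # antiproton moment, in nuclear magnetons
sig_pbar = 42e-10          # 68% confidence interval on the last two digits
mu_p = 2.792847350         # proton moment quoted in the abstract
sig_p = 9e-9

frac_diff = (abs(mu_pbar) - mu_p) / mu_p            # fractional asymmetry
combined = (sig_pbar**2 + sig_p**2) ** 0.5 / mu_p   # combined fractional 1-sigma

print(f"fractional difference: {frac_diff:.1e}")    # ~ -2.1e-09
print(f"combined uncertainty:  {combined:.1e}")     # ~  3.6e-09
```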

  4. White Light Demonstration of One Hundred Parts per Billion Irradiance Suppression in Air by New Starshade Occulters

    Science.gov (United States)

    Levinton, Douglas B.; Cash, Webster C.; Gleason, Brian; Kaiser, Michael J.; Levine, Sara A.; Lo, Amy S.; Schindhelm, Eric; Shipley, Ann F.

    2007-01-01

    A new mission concept for the direct imaging of exo-solar planets, called the New Worlds Observer (NWO), has been proposed. The concept involves flying a meter-class space telescope in formation with a newly-conceived, specially-shaped, deployable star-occulting shade several meters across, at a separation of some tens of thousands of kilometers. The telescope would make its observations from behind the starshade, in a volume of high suppression of the incident irradiance from the star around which the planets orbit. The required level of irradiance suppression created by the starshade for an efficacious mission is of order 0.1 to 10 parts per billion in broadband light. This paper discusses the experimental setup developed to accurately measure, to these levels, the suppression ratio of irradiance produced at the null position behind candidate starshade forms. It also presents results of broadband measurements which demonstrated suppression levels of just under 100 parts per billion in air using the Sun as a light source. Analytical modeling of spatial irradiance distributions surrounding the null is presented and compared with photographs of irradiance captured in situ behind candidate starshades.

  5. Electron capture detection of sulphur gases in carbon dioxide at the parts-per-billion level

    International Nuclear Information System (INIS)

    Pick, M.E.

    1979-01-01

    A gas chromatograph with an electron capture detector has been used to determine sulphur gases in CO2 at the parts-per-billion level, with particular application to the analysis of coolant from CO2-cooled nuclear reactors. For COS, CS2, CH3SH, H2S and (CH3)2S2 the detector has a sensitivity comparable with the more commonly used flame photometric detector, but it is much less sensitive towards (CH3)2S and thiophene. In addition, the paper describes a simple method for trapping sulphur gases which might enable detection of sub-parts-per-billion levels of sulphur compounds. (Auth.)

  6. Analysis of precious metals at parts-per-billion levels in industrial applications

    International Nuclear Information System (INIS)

    Tickner, James; O'Dwyer, Joel; Roach, Greg; Smith, Michael; Van Haarlem, Yves

    2015-01-01

    Precious metals, including gold and the platinum group metals (notably Pt, Pd and Rh), are mined commercially at concentrations of a few parts-per-million and below. Mining and processing operations demand sensitive and rapid analysis at concentrations down to about 100 parts-per-billion (ppb). In this paper, we discuss two technologies being developed to meet this challenge: X-ray fluorescence (XRF) and gamma-activation analysis (GAA). We have designed on-stream XRF analysers capable of measuring targeted elements in slurries with precisions in the 35–70 ppb range. For the past two years, two on-stream analysers have been in continuous operation at a precious metals concentrator plant. The simultaneous measurement of feed and waste stream grades provides real-time information on metal recovery, allowing changes in operating conditions and plant upsets to be detected and corrected more rapidly. Separately, we have been developing GAA for the measurement of gold as a replacement for the traditional laboratory fire-assay process. High-energy Bremsstrahlung X-rays are used to excite gold via the 197Au(γ,γ′)197mAu reaction, and the gamma-rays released in the decay of the metastable state are then counted. We report on work to significantly improve accuracy and detection limits. - Highlights: • X-ray fluorescence analysis at sub-parts-per-million concentration in bulk materials. • Gamma activation analysis of gold at high accuracy and low concentrations. • Use of advanced Monte Carlo techniques to optimise radiation-based analysers. • Industrial application of XRF and GAA technologies for minerals processing.

  7. Development of multicomponent parts-per-billion-level gas standards of volatile toxic organic compounds

    International Nuclear Information System (INIS)

    Rhoderick, G.C.; Zielinski, W.L. Jr.

    1990-01-01

    This paper reports that the demand for stable, low-concentration multicomponent standards of volatile toxic organic compounds for quantifying national and state measurements of ambient air quality and hazardous waste incineration emissions has markedly increased in recent years. In response to this demand, a microgravimetric technique was developed and validated for preparing such standards; these standards ranged in concentration from several parts per million (ppm) down to one part per billion (ppb) and in complexity from a single organic up to 17. Studies using the gravimetric procedure to prepare mixtures of different groups of organics, including multicomponent mixtures in the 5 to 20 ppb range, revealed very low imprecision. This procedure is based on the separate gravimetric introduction of individual organics into an evacuated gas cylinder, followed by the pressurized addition of a precalculated amount of pure nitrogen. Additional studies confirmed the long-term stability of these mixtures. The uncertainty of the concentrations of the individual organics at the 95% confidence level ranged from less than 1% relative at 1 ppm to less than 10% relative at 1 ppb. Over 100 primary gravimetric standards have been developed, validated, and used for certifying the concentrations of a variety of mixtures for monitoring studies.
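
    As a worked illustration (hypothetical masses, not taken from the record), the mole-fraction concentration of a gravimetrically prepared mixture follows directly from the weighed mass of the organic and the mass of the nitrogen balance gas:

```python
# Hypothetical gravimetric ppb calculation: benzene in nitrogen.
M_BENZENE = 78.11   # g/mol
M_N2 = 28.013       # g/mol

def mole_fraction_ppb(mass_analyte_g: float, mass_n2_g: float,
                      m_analyte: float, m_diluent: float = M_N2) -> float:
    """Analyte mole fraction expressed in parts per billion."""
    n_analyte = mass_analyte_g / m_analyte
    n_diluent = mass_n2_g / m_diluent
    return 1e9 * n_analyte / (n_analyte + n_diluent)

# 0.85 mg of benzene in a cylinder pressurized with 1.20 kg of N2
# gives a mixture in the mid-ppb range.
print(round(mole_fraction_ppb(0.85e-3, 1200.0, M_BENZENE), 1))  # ~254.0 ppb
```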

  8. Rapid analysis of perchlorate in drinking water at parts per billion levels using microchip electrophoresis.

    Science.gov (United States)

    Gertsch, Jana C; Noblitt, Scott D; Cropek, Donald M; Henry, Charles S

    2010-05-01

    A microchip capillary electrophoresis (MCE) system has been developed for the determination of perchlorate in drinking water. The United States Environmental Protection Agency (USEPA) recently proposed a health advisory limit for perchlorate in drinking water of 15 parts per billion (ppb), a level requiring large, sophisticated instrumentation, such as ion chromatography coupled with mass spectrometry (IC-MS), for detection. An inexpensive, portable system is desired for routine online monitoring applications of perchlorate in drinking water. Here, we present an MCE method using contact conductivity detection for perchlorate determination. The method has several advantages, including reduced analysis times relative to IC, inherent portability, high selectivity, and minimal sample pretreatment. Resolution of perchlorate from more abundant ions was achieved using zwitterionic, sulfobetaine surfactants, N-hexadecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (HDAPS) and N-tetradecyl-N,N-dimethyl-3-ammonio-1-propane sulfonate (TDAPS). The system performance and the optimization of the separation chemistry, including the use of these surfactants to resolve perchlorate from other anions, are discussed in this work. The system is capable of detection limits of 3.4 +/- 1.8 ppb (n = 6) in standards and 5.6 +/- 1.7 ppb (n = 6) in drinking water.

  9. Double-trap measurement of the proton magnetic moment at 0.3 parts per billion precision.

    Science.gov (United States)

    Schneider, Georg; Mooser, Andreas; Bohman, Matthew; Schön, Natalie; Harrington, James; Higuchi, Takashi; Nagahama, Hiroki; Sellner, Stefan; Smorra, Christian; Blaum, Klaus; Matsuda, Yasuyuki; Quint, Wolfgang; Walz, Jochen; Ulmer, Stefan

    2017-11-24

    Precise knowledge of the fundamental properties of the proton is essential for our understanding of atomic structure as well as for precise tests of fundamental symmetries. We report on a direct high-precision measurement of the magnetic moment μp of the proton in units of the nuclear magneton μN. The result, μp = 2.79284734462(±0.00000000082) μN, has a fractional precision of 0.3 parts per billion, improves the previous best measurement by a factor of 11, and is consistent with the currently accepted value. This was achieved with the use of an optimized double-Penning trap technique. Provided a similar measurement of the antiproton magnetic moment can be performed, this result will enable a test of the fundamental symmetry between matter and antimatter in the baryonic sector at the 10⁻¹⁰ level. Copyright © 2017, American Association for the Advancement of Science.

  10. An efficient probe for rapid detection of cyanide in water at parts per billion levels and naked-eye detection of endogenous cyanide.

    Science.gov (United States)

    Kumari, Namita; Jha, Satadru; Bhattacharya, Santanu

    2014-03-01

    A new molecular probe based on an oxidized bis-indolyl skeleton has been developed for rapid and sensitive visual detection of cyanide ions in water and also for the detection of endogenously bound cyanide. The probe allows the "naked-eye" detection of cyanide ions in water with a visual color change from red to yellow (Δλmax = 80 nm) immediately upon addition of the probe. It shows high selectivity towards the cyanide ion without any interference from other anions. The detection of cyanide by the probe is ratiometric, thus making the detection quantitative. A Michael-type addition reaction of the probe with the cyanide ion takes place during this chemodosimetric process. In water, the detection limit was found to be at the parts-per-million level; it improved drastically when a neutral micellar medium was employed, giving parts-per-billion-level detection, 25-fold lower than the permitted limit for cyanide in water. The probe could also efficiently detect the endogenously bound cyanide in cassava (a staple food) with a clear visual color change, without requiring any sample pretreatment and/or any special reaction conditions such as pH or temperature. Thus the probe could serve as a practical naked-eye probe for "in-field" experiments without requiring any sophisticated instruments. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. 6.6-hour inhalation of ozone concentrations from 60 to 87 parts per billion in healthy humans.

    Science.gov (United States)

    Schelegle, Edward S; Morales, Christopher A; Walby, William F; Marion, Susan; Allen, Roblee P

    2009-08-01

    Identification of the minimal ozone (O3) concentration and/or dose that induces measurable lung function decrements in humans is considered in the risk assessment leading to establishing an appropriate National Ambient Air Quality Standard for O3 that protects public health. To identify and/or predict the minimal mean O3 concentration that produces a decrement in FEV1 and symptoms in healthy individuals completing 6.6-hour exposure protocols. Pulmonary function and subjective symptoms were measured in 31 healthy adults (18-25 yr, male and female, nonsmokers) who completed five 6.6-hour chamber exposures: filtered air and four variable hourly patterns with mean O3 concentrations of 60, 70, 80, and 87 parts per billion (ppb). Compared with filtered air, statistically significant decrements in FEV1 and increases in total subjective symptom scores (P < 0.05) were measured after exposure to mean concentrations of 70, 80, and 87 ppb O3. The mean percent change in FEV1 (± standard error) at the end of each protocol was 0.80 ± 0.90, -2.72 ± 1.48, -5.34 ± 1.42, -7.02 ± 1.60, and -11.42 ± 2.20% for exposure to filtered air and 60, 70, 80, and 87 ppb O3, respectively. Inhalation of 70 ppb O3 for 6.6 hours, a concentration below the current 8-hour National Ambient Air Quality Standard of 75 ppb, is sufficient to induce statistically significant decrements in FEV1 in healthy young adults.

  12. Multistage open-tube trap for enrichment of part-per-trillion trace components of low-pressure (below 27-kPa) air samples

    Science.gov (United States)

    Ohara, D.; Vo, T.; Vedder, J. F.

    1985-01-01

    A multistage open-tube trap for cryogenic collection of trace components in low-pressure air samples is described. The open-tube design allows higher volumetric flow rates than the densely packed glass-bead traps commonly reported and is suitable for air samples at pressures below 27 kPa with liquid nitrogen as the cryogen. Gas blends containing 200 to 2500 parts per trillion by volume each of ethane and ethene were sampled, and the hydrocarbons were enriched with 100 ± 4 percent trap efficiency. The multistage design is more efficient than equal-length open-tube traps under the conditions of the measurements.

  13. Elimination of N,O-bis(trimethylsilyl)trifluoroacetamide interference by base treatment in derivatization gas chromatography mass spectrometry determination of parts per billion of alcohols in a food additive.

    Science.gov (United States)

    Zhu, Koudi; Gu, Binghe; Kerry, Michael; Mintert, Markus; Luong, Jim; Pursch, Matthias

    2017-03-24

    A novel base treatment followed by liquid-liquid extraction was developed to remove the interference of the excess derivatization reagent BSTFA [N,O-bis(trimethylsilyl)trifluoroacetamide] and its byproducts in the trace determination of 1-chloro-2-propanol and 2-chloro-1-propanol in a food additive. The corresponding trimethylsilyl derivatives were analyzed by gas chromatography mass spectrometry (GC/MS) detection in selected ion monitoring mode. Because a large-volume splitless injection was needed to achieve the required sensitivity, excess BSTFA in the derivatized sample solution interfered with the trimethylsilyl derivatives of the analytes of interest, making their quantitation unattainable. Efforts were made to decompose BSTFA while keeping the trimethylsilyl derivatives intact. Water or aqueous sulfuric acid treatment converted BSTFA mainly into N-trimethylsilyltrifluoroacetamide, which partitions between the aqueous and organic layers. In contrast, aqueous sodium hydroxide decomposed BSTFA into trifluoroacetic acid, which went entirely into the aqueous layer. No BSTFA or its byproducts N-trimethylsilyltrifluoroacetamide or trifluoroacetamide were found in the organic layer where the derivatized alcohols resided, which completely eliminated their interference and enabled accurate and precise determination of parts per billion of the short-chain alcohols in the food additive. Contrary to the conventional wisdom that a trimethylsilyl derivative is susceptible to hydrolysis, the derivatized short-chain alcohols were found to be stable even in the presence of 0.17 N aqueous sodium hydroxide. The improved GC/MS method was validated successfully, with satisfactory linearity in the concentration range of 10-400 ng/g (regression coefficient greater than 0.999), good method precision (<4%), good recovery (90-98%), and excellent limits of detection (3 ng/g) and quantitation (10 ng/g). Copyright © 2017 Elsevier B.V. All rights reserved.

  14. 2016 Billion-Ton Report: Environmental Sustainability Effects of Select Scenarios from Volume 1 (Volume 2)

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R. A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, M. H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, K. E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Stokes, B. J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-13

    On behalf of all the authors and contributors, it is a great privilege to present the 2016 Billion-Ton Report (BT16), volume 2: Environmental Sustainability Effects of Select Scenarios from volume 1. This report represents the culmination of several years of collaborative effort among national laboratories, government agencies, academic institutions, and industry. BT16 was developed to support the U.S. Department of Energy’s efforts towards national goals of energy security and associated quality of life.

  15. Quantum cascade laser-based analyzer for hydrogen sulfide detection at sub-parts-per-million levels

    Science.gov (United States)

    Nikodem, Michal; Krzempek, Karol; Stachowiak, Dorota; Wysocki, Gerard

    2018-01-01

    Due to its high toxicity, monitoring of hydrogen sulfide (H2S) concentration is essential at many industrial sites (such as natural gas extraction sites, petroleum refineries, geothermal power plants, or waste water treatment facilities), in applications that require sub-parts-per-million sensitivity. We report on a quantum cascade laser-based spectroscopic system for detection of H2S in the mid-infrared at ~7.2 μm. We present a sensor design utilizing a Herriott multipass cell and wavelength modulation spectroscopy to achieve a detection limit of 140 parts per billion for 1-s integration time.

  16. Mining survival in parts per billion

    International Nuclear Information System (INIS)

    Christensen, J.C.

    1992-01-01

    The paper discusses the economic situation in the coal industry of Utah. Coal prices are down for the tenth year in a row, Utah is isolated from major markets and freight rates are high, and the state legislature has not dropped the issue of a coal severance tax. The author believes the only potential for increased use of Utah coal is the Pacific Rim countries. Environmental issues are also discussed

  17. LOSS/GAIN OF VOCS FROM TEDLAR BAGS AND OTHER SAMPLING EQUIPMENT

    Science.gov (United States)

    Soil gas samples are collected to evaluate human health risk from vapor intrusion into homes and other buildings. In order to meet risk assessment goals, the analytical reporting limits for many compounds of concern are down to part-per-billion ranges. The appropriate sampling t...

  18. Strong sales growth in 2006: + 21 per cent

    International Nuclear Information System (INIS)

    2007-01-01

    Paris, February 14, 2007 - The Gaz de France group today reported record consolidated sales of euro 27,642 million in 2006, up 21 per cent compared with 2005. Under average weather conditions and comparable accounting methods, sales increased by 24 per cent versus 2005. This growth results primarily from an overall increase in European energy prices notwithstanding the slight decrease in prices towards the end of the year. The group also benefited from an increase in volumes and from the integration of new operations. After a colder first half of the year compared to that of the previous year, the autumn of 2006 was particularly warm. This had a negative impact on sales growth (there was a 12 billion kWh decrease between 2005 and 2006). The sales generated by the group's international activities increased by 33 per cent to a total of euro 10,839 million in 2006 and now account for almost 40 per cent of the group's overall sales. In this context, the group confirmed at the board meeting held on January 23, 2007 that it would reach the targets set for 2006, namely: - Growth in EBITDA above 20 per cent, i.e. in excess of euro 5 billion, - Net income of more than euro 2.2 billion

  19. Strong sales growth in 2006: + 21 per cent

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-01

    Paris, February 14, 2007 - The Gaz de France group today reported record consolidated sales of euro 27,642 million in 2006, up 21 per cent compared with 2005. Under average weather conditions and comparable accounting methods, sales increased by 24 per cent versus 2005. This growth results primarily from an overall increase in European energy prices notwithstanding the slight decrease in prices towards the end of the year. The group also benefited from an increase in volumes and from the integration of new operations. After a colder first half of the year compared to that of the previous year, the autumn of 2006 was particularly warm. This had a negative impact on sales growth (there was a 12 billion kWh decrease between 2005 and 2006). The sales generated by the group's international activities increased by 33 per cent to a total of euro 10,839 million in 2006 and now account for almost 40 per cent of the group's overall sales. In this context, the group confirmed at the board meeting held on January 23, 2007 that it would reach the targets set for 2006, namely: - Growth in EBITDA above 20 per cent, i.e. in excess of euro 5 billion, - Net income of more than euro 2.2 billion.

  20. Portable field water sample filtration unit

    International Nuclear Information System (INIS)

    Hebert, A.J.; Young, G.G.

    1977-01-01

    A lightweight back-packable field-tested filtration unit is described. The unit is easily cleaned without cross contamination at the part-per-billion level and allows rapid filtration of boiling hot and sometimes muddy water. The filtration results in samples that are free of bacteria and particulates and which resist algae growth even after storage for months. 3 figures

  1. Sampling soils for 137Cs using various field-sampling volumes

    International Nuclear Information System (INIS)

    Nyhan, J.W.; Schofield, T.G.; White, G.C.; Trujillo, G.

    1981-10-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2 to 4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
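
    A minimal sketch (with invented variance components, not the study's data) of how the variance of the mean concentration combines the spatial, aliquoting, and counting terms described above for a balanced nested design, and why adding field locations helps more than assaying extra aliquots when spatial variance dominates:

```python
# Hypothetical nested design: n_s locations, n_a aliquots each, n_c counts per aliquot.
def var_of_mean(var_spatial: float, var_aliquot: float, var_count: float,
                n_s: int, n_a: int, n_c: int) -> float:
    """Variance of the mean concentration for a balanced nested sampling design."""
    return (var_spatial / n_s
            + var_aliquot / (n_s * n_a)
            + var_count / (n_s * n_a * n_c))

base          = var_of_mean(0.50, 0.05, 0.02, n_s=10, n_a=2,  n_c=1)
more_aliquots = var_of_mean(0.50, 0.05, 0.02, n_s=10, n_a=30, n_c=1)
more_sites    = var_of_mean(0.50, 0.05, 0.02, n_s=30, n_a=2,  n_c=1)
print(base, more_aliquots, more_sites)  # ~0.0535, ~0.0502, ~0.0178
```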

  2. Countdown to Six Billion Teaching Kit.

    Science.gov (United States)

    Zero Population Growth, Inc., Washington, DC.

    This teaching kit features six activities focused on helping students understand the significance of the world population reaching six billion for our society and our environment. Featured activities include: (1) History of the World: Part Six Billion; (2) A Woman's Place; (3) Baby-O-Matic; (4) Earth: The Apple of Our Eye; (5) Needs vs. Wants; and…

  3. Does Core Length Taken per cc of Prostate Volume in Prostate Biopsy Affect the Diagnosis of Prostate Cancer?

    Science.gov (United States)

    Deliktas, Hasan; Sahin, Hayrettin; Cetinkaya, Mehmet; Dere, Yelda; Erdogan, Omer; Baldemir, Ercan

    2016-08-01

    The aim of this study was to determine the minimal core length to be taken per cc of prostate volume for an effective prostate biopsy. A retrospective analysis was performed on the records of 379 patients who underwent a first prostate biopsy with 12 to 16 cores under transrectal ultrasound guidance between September 2012 and April 2015. For each patient, the core length per cc of the prostate and the percentage of sampled prostate volume were calculated, and these values were compared between the patients with and without prostate cancer. A total of 348 patients were included in the study. Cancer was determined in 26.4% of patients. The mean core length taken per cc of prostate and the percentage of sampled prostate volume were determined to be 3.40 ± 0.15 mm/cc (0.26%; range, 0.08-0.63 cc) in patients with cancer and 2.75 ± 0.08 mm/cc (0.20%; range, 0.04-0.66 cc) in patients without cancer (P = .000 and P = .000), respectively. Core length taken per cc of prostate of > 3.31 mm/cc was found to be related to an increase in the rates of prostate cancer diagnosis (odds ratio, 2.84; 95% confidence interval, 1.68-4.78). The rate of cancer determination for a core length taken per cc of prostate above 3.31 mm/cc was 41.1%. Core length taken per cc of prostate and the percentage of sampled prostate volume are important morphometric parameters in the determination of prostate cancer. The results of the study suggest a core length per cc of the prostate of > 3.31 mm/cc as a cutoff value for quality assurance. Copyright © 2016 Elsevier Inc. All rights reserved.
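
    A brief hypothetical illustration (invented numbers, not from the study) of the morphometric ratio used above, total sampled core length divided by prostate volume:

```python
# Hypothetical 12-core biopsy: each core ~15 mm long, prostate volume 45 cc.
core_lengths_mm = [15.0] * 12
prostate_volume_cc = 45.0

core_length_per_cc = sum(core_lengths_mm) / prostate_volume_cc
print(round(core_length_per_cc, 2))  # 4.0 mm/cc, above the suggested 3.31 mm/cc cutoff
```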

  4. Trends in laboratory test volumes for Medicare Part B reimbursements, 2000-2010.

    Science.gov (United States)

    Shahangian, Shahram; Alspach, Todd D; Astles, J Rex; Yesupriya, Ajay; Dettwyler, William K

    2014-02-01

    Changes in reimbursements for clinical laboratory testing may help us assess the effect of various variables, such as testing recommendations, market forces, changes in testing technology, and changes in clinical or laboratory practices, and provide information that can influence health care and public health policy decisions. To date, however, there has been no report, to our knowledge, of longitudinal trends in national laboratory test use. To evaluate Medicare Part B-reimbursed volumes of selected laboratory tests per 10,000 enrollees from 2000 through 2010. Laboratory test reimbursement volumes per 10,000 enrollees in Medicare Part B were obtained from the Centers for Medicare & Medicaid Services (Baltimore, Maryland). The ratio of the most recent (2010) reimbursed test volume per 10,000 Medicare enrollees, divided by the oldest data (usually 2000) during this decade, called the volume ratio, was used to measure trends in test reimbursement. Laboratory tests with a reimbursement claim frequency of at least 10 per 10,000 Medicare enrollees in 2010 were selected, provided there was more than a 50% change in test reimbursement volume during the 2000-2010 decade. We combined the reimbursed test volumes for the few tests that were listed under more than one code in the Current Procedural Terminology (American Medical Association, Chicago, Illinois). A 2-sided Poisson regression, adjusted for potential overdispersion, was used to determine P values for the trend, with trends considered significant below a prespecified P value. Tests with decreasing reimbursement volumes were electrolytes, digoxin, carbamazepine, phenytoin, and lithium, with volume ratios ranging from 0.27 to 0.64; tests with increasing reimbursement volumes were meprobamate, opiates, methadone, phencyclidine, amphetamines, cocaine, and vitamin D, with volume ratios ranging from 83 to 1510. Although reimbursement volumes increased for most of the selected tests, other tests exhibited statistically significant downward trends in annual reimbursement volumes. The observed
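
    For readers who want to reproduce this kind of trend analysis, here is a minimal sketch on synthetic counts; the statsmodels GLM call is an assumption about tooling, not the authors' code, and the overdispersion adjustment is approximated with a quasi-Poisson (Pearson chi-square) scale:

```python
# Sketch: overdispersion-adjusted Poisson trend test on synthetic annual volumes.
import numpy as np
import statsmodels.api as sm

years = np.arange(2000, 2011)
volumes = np.array([120, 118, 115, 110, 104, 99, 95, 90, 84, 80, 76])  # per 10,000 enrollees (synthetic)

X = sm.add_constant(years - years[0])              # intercept + linear year term
res = sm.GLM(volumes, X, family=sm.families.Poisson()).fit(scale="X2")

print(res.params[1])              # log rate of change per year (negative = downward trend)
print(res.pvalues[1])             # 2-sided P value for the trend
print(volumes[-1] / volumes[0])   # the "volume ratio": 2010 volume divided by 2000 volume
```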

  5. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    NARCIS (Netherlands)

    Breddels, M. A.

    2016-01-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second

  6. Interactive (statistical) visualisation and exploration of a billion objects with vaex

    Science.gov (United States)

    Breddels, M. A.

    2017-06-01

    With new catalogues arriving such as the Gaia DR1, containing more than a billion objects, new methods of handling and visualizing these data volumes are needed. We show that by calculating statistics on a regular (N-dimensional) grid, visualizations of a billion objects can be done within a second on a modern desktop computer. This is achieved using memory mapping of hdf5 files together with a simple binning algorithm, which are part of a Python library called vaex. This enables efficient interactive exploration of large datasets, making science exploration of large catalogues feasible. Vaex is a Python library and an application, which allows for interactive exploration and visualization. The motivation for developing vaex is the catalogue of the Gaia satellite; however, vaex can also be used on SPH or N-body simulations, on any other (future) catalogues such as SDSS, Pan-STARRS, LSST, etc., or on other tabular data. The homepage for vaex is http://vaex.astro.rug.nl.
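
    A minimal sketch of the core idea above, computing statistics on a regular grid over memory-mapped columns, written with plain NumPy rather than the vaex internals (file names, dtype, and column layout are hypothetical):

```python
# Sketch: chunked 2-D binning of a large memory-mapped catalogue with NumPy.
import numpy as np

N = 1_000_000_000                      # one billion rows (hypothetical files)
x = np.memmap("catalogue_x.f4", dtype=np.float32, mode="r", shape=(N,))
y = np.memmap("catalogue_y.f4", dtype=np.float32, mode="r", shape=(N,))

bins, limits = 256, [[0.0, 360.0], [-90.0, 90.0]]
grid = np.zeros((bins, bins), dtype=np.int64)

chunk = 50_000_000                     # stream the columns 50 million rows at a time
for start in range(0, N, chunk):
    sl = slice(start, min(start + chunk, N))
    h, _, _ = np.histogram2d(x[sl], y[sl], bins=bins, range=limits)
    grid += h.astype(np.int64)

# 'grid' now holds counts per cell and can be rendered (e.g. log-scaled)
# as an image to visualize the density field of the full catalogue.
```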

  7. Areva excellent business volume: backlog as of december 31, 2008: + 21.1% to 48.2 billion euros. 2008 revenue: + 10.4% to 13.2 billion euros

    International Nuclear Information System (INIS)

    2009-01-01

    AREVA's backlog stood at 48.2 billion euros as of December 31, 2008, for 21.1% growth year-on-year, including 21.8% growth in Nuclear and 16.5% growth in Transmission and Distribution. The Nuclear backlog came to 42.5 billion euros at December 31, 2008. The Transmission and Distribution backlog came to 5.7 billion euros at year-end. The group recognized revenue of 13.2 billion euros in 2008, for year-on-year growth of 10.4% (+9.8% like-for-like). Revenue outside France was up 10.5% to 9.5 billion euros, representing 72% of total revenue. Revenue was up 6.5% in the Nuclear businesses (up 6.3% LFL), with strong performance in the Reactors and Services division (+10.9% LFL) and the Front End division (+7.2% LFL). The Transmission and Distribution division recorded growth of 17% (+15.8% LFL). Revenue for the fourth quarter of 2008 rose to 4.1 billion euros, up 5.2% (+1.6% LFL) from that of the fourth quarter of 2007. Revenue for the Front End division rose to 3.363 billion euros in 2008, up 7.1% over 2007 (+7.2% LFL). Foreign exchange (currency translations) had a negative impact of 53 million euros. Revenue for the Reactors and Services division rose to 3.037 billion euros, up 11.8% over 2007 (+10.9% LFL). Foreign exchange (currency translations) had a negative impact of 47 million euros. Revenue for the Back End division came to 1.692 billion euros, a drop of 2.7% (-2.5% LFL). Foreign exchange (currency translations) had a negative impact of 3.5 million euros. Revenue for the Transmission and Distribution division rose to 5.065 billion euros in 2008, up 17.0% (+15.8% LFL)

  8. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-07-06

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified the broad biophysical potential of biomass nationally, and BT2 elucidated the potential economic availability of these resources. These reports clearly established the potential availability of up to one billion tons of biomass resources nationally. However, many questions remain, including but not limited to crop yields, climate change impacts, logistical operations, and systems integration across production, harvest, and conversion. The present report aims to address many of these questions through empirically modeled energy crop yields, scenario analysis of resources delivered to biorefineries, and the addition of new feedstocks. Volume 2 of the 2016 Billion-Ton Report is expected to be released by the end of 2016. It seeks to evaluate environmental sustainability indicators of select scenarios from volume 1 and potential climate change impacts on future supplies.

  9. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    Science.gov (United States)

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.

  10. An Elevated Reservoir of Air Pollutants over the Mid-Atlantic States During the 2011 DISCOVER-AQ Campaign: Airborne Measurements and Numerical Simulations

    Science.gov (United States)

    He, Hao; Loughner, Christopher P.; Stehr, Jeffrey W.; Arkinson, Heather L.; Brent, Lacey C.; Follette-Cook, Melanie B.; Tzortziou, Maria A.; Pickering, Kenneth E.; Thompson, Anne M.; Martins, Douglas K.

    2013-01-01

    During a classic heat wave with record high temperatures and poor air quality from July 18 to 23, 2011, an elevated reservoir of air pollutants was observed over and downwind of Baltimore, MD, with relatively clean conditions near the surface. Aircraft and ozonesonde measurements detected approximately 120 parts per billion by volume ozone at 800 meters altitude, but approximately 80 parts per billion by volume ozone near the surface. High concentrations of other pollutants were also observed around the ozone peak: approximately 300 parts per billion by volume CO at 1200 meters, approximately 2 parts per billion by volume NO2 at 800 meters, approximately 5 parts per billion by volume SO2 at 600 meters, and strong aerosol optical scattering (about 2 × 10⁻⁴ per meter) at 600 meters. These results suggest that the elevated reservoir is a mixture of automobile exhaust (high concentrations of O3, CO, and NO2) and power plant emissions (high SO2 and aerosols). Back trajectory calculations show a local stagnation event before the formation of this elevated reservoir. Forward trajectories suggest an influence on downwind air quality, supported by surface ozone observations on the next day over the downwind PA, NJ and NY area. Meteorological observations from aircraft and ozonesondes show a dramatic veering of wind direction from south to north within the lowest 5000 meters, implying that the development of the elevated reservoir was caused in part by the Chesapeake Bay breeze. Based on in situ observations, Community Multiscale Air Quality (CMAQ) model forecast simulations with 12 kilometers resolution overestimated surface ozone concentrations and failed to predict this elevated reservoir; however, CMAQ research simulations with 4 kilometers and 1.33 kilometers resolution more successfully reproduced this event. These results show that high resolution is essential for resolving coastal effects and predicting air quality for cities near major bodies of water such as

  11. Connecting the last billion

    OpenAIRE

    Ben David, Yahel

    2015-01-01

    The last billion people to join the online world are likely to face at least one of two obstacles. Part I: Rural Internet Access. Rural, sparsely populated areas make conventional infrastructure investments unfeasible: big corporations attempt to address this challenge via the launch of Low-Earth-Orbiting (LEO) satellite constellations, fleets of high-altitude balloons, and giant solar-powered drones; although these grandiose initiatives hold potential, they are costly and risky. At the same time...

  12. Oncology pharma costs to exceed $150 billion by 2020.

    Science.gov (United States)

    2016-10-01

    Worldwide costs of oncology drugs will rise above $150 billion by 2020, according to a report by the IMS Institute for Healthcare Informatics. Many factors are in play, according to IMS, including the new wave of expensive immunotherapies. Pembrolizumab (Keytruda), priced at $150,000 per year per patient, and nivolumab (Opdivo), priced at $165,000, may be harbingers of the market for cancer immunotherapies.

  13. Sampling of high amounts of bioaerosols using a high-volume electrostatic field sampler

    DEFF Research Database (Denmark)

    Madsen, A. M.; Sharma, Anoop Kumar

    2008-01-01

    For studies of the biological effects of bioaerosols, large samples are necessary. To be able to sample enough material and to cover the variations in aerosol content during and between working days, a long sampling time is necessary. Recently, a high-volume transportable electrostatic field... and 315 mg dust (net recovery of the lyophilized dust) was sampled during a period of 7 days, respectively. The sampling rates of the electrostatic field samplers were between 1.34 and 1.96 mg dust per hour, the value for the Gravikon was between 0.083 and 0.108 mg dust per hour and the values for the GSP... samplers were between 0.0031 and 0.032 mg dust per hour. The standard deviations of replica samplings and the following microbial analysis using the electrostatic field sampler and GSP samplers were at the same levels. The exposure to dust in the straw storage was 7.7 mg m⁻³ when measured...

  14. Trends in lumber processing in the western United States. Part I: board foot Scribner volume per cubic foot of timber

    Science.gov (United States)

    Charles E. Keegan; Todd A. Morgan; Keith A. Blatner; Jean M. Daniels

    2010-01-01

    This article describes trends in board foot Scribner volume per cubic foot of timber for logs processed by sawmills in the western United States. Board foot to cubic foot (BF/CF) ratios for the period from 2000 through 2006 ranged from 3.70 in Montana to 5.71 in the Four Corners Region (Arizona, Colorado, New Mexico, and Utah). Sawmills in the Four Corners Region,...

  15. FY97 nuclear-related budgets total 493 billion yen (4.4 billion dollars)

    International Nuclear Information System (INIS)

    Anon.

    1996-01-01

    On September 13, the Atomic Energy Commission of Japan announced the estimated nuclear-related budget requests for FY1997 (April 1997 - March 1998), giving the breakdowns for eight ministries and agencies. The total amount requested by the government bodies was 493.3 billion yen, a 0.8% increase compared with FY96. This figure includes the budget requests of the Science and Technology Agency (STA), the Ministry of International Trade and Industry (MITI), the Ministry of Foreign Affairs, the Ministry of Transport, the Ministry of Agriculture, Forestry and Fisheries, the Okinawa Development Agency, and the Ministry of Home Affairs, but excludes the budget request made by the Ministry of Education. The budget requests of STA and MITI are 360 billion yen and 126 billion yen, respectively. On August 29, STA released its estimated FY97 budget request. The nuclear-related 360.4 billion yen is 0.9% more than in the year before. Of this sum, 199.9 billion yen is in the general account, and 160.6 billion yen is in the special account for power source development. The details of the nuclear-related amounts are explained. On August 26, MITI released its estimated budget request for FY97, and of the nuclear-related 125.7 billion yen (0.1% increase from FY96), 200 million yen is in the general account, and 98.9 billion yen and 26.6 billion yen are in the special accounts for power resource development and power source diversification, respectively. (K.I.)

  16. It is possible to increase by over thirty per cent the Nile Water availability

    International Nuclear Information System (INIS)

    Lemperiere, F.

    2011-01-01

    The population of the Nile catchment is presently 250 million and will probably reach 400 million in 2040. The catchment includes two parts of about the same population but with very different climates: the upstream rainy part (most of this area is in Ethiopia, Uganda and South Sudan) and the downstream dry part, i.e. North Sudan and Egypt. The available water from the Nile runoff is evaluated at an average of 72 billion m3/year; it comes almost entirely from the upstream part and is used in the downstream part. For their development, the upstream populations (including also parts of Tanzania, Kenya, Congo, Rwanda and Burundi) now require a significant share of the runoff generated from local rains, while Egypt and North Sudan claim historic rights on the Nile waters. The best way to avoid conflicts is to increase the water availability, keeping for Egypt and North Sudan at least the water volume presently used and allowing the upstream countries the water resources necessary for their development, possibly in the range of 100 m3/year/capita in 2030 or 2040. The average total runoff of the Nile is in fact close to 140 billion m3/year, but over 40 billion evaporate in the South Sudan swamps and 15 billion in the reservoirs of Aswan and Northern Sudan. A solution for reducing these two main losses by half is presented in this paper: it is based upon concrete knowledge of the very specific local data and upon successful experience with adapted technical solutions.

  17. It is possible to increase by over thirty per cent the Nile Water availability

    Energy Technology Data Exchange (ETDEWEB)

    Lemperiere, F.

    2011-01-15

    The population of the Nile catchment is presently 250 million and will probably reach 400 million in 2040. The catchment includes two parts of about the same population but with very different climates: the upstream rainy part (most of this area is in Ethiopia, Uganda and South Sudan) and the downstream dry part, i.e. North Sudan and Egypt. The available water from the Nile runoff is evaluated at an average of 72 billion m3/year; it comes almost entirely from the upstream part and is used in the downstream part. For their development, the upstream populations (including also parts of Tanzania, Kenya, Congo, Rwanda and Burundi) now require a significant share of the runoff generated from local rains, while Egypt and North Sudan claim historic rights on the Nile waters. The best way to avoid conflicts is to increase the water availability, keeping for Egypt and North Sudan at least the water volume presently used and allowing the upstream countries the water resources necessary for their development, possibly in the range of 100 m3/year/capita in 2030 or 2040. The average total runoff of the Nile is in fact close to 140 billion m3/year, but over 40 billion evaporate in the South Sudan swamps and 15 billion in the reservoirs of Aswan and Northern Sudan. A solution for reducing these two main losses by half is presented in this paper: it is based upon concrete knowledge of the very specific local data and upon successful experience with adapted technical solutions.

  18. The relationship between limit of Dysphagia and average volume per swallow in patients with Parkinson's disease.

    Science.gov (United States)

    Belo, Luciana Rodrigues; Gomes, Nathália Angelina Costa; Coriolano, Maria das Graças Wanderley de Sales; de Souza, Elizabete Santos; Moura, Danielle Albuquerque Alves; Asano, Amdore Guescel; Lins, Otávio Gomes

    2014-08-01

    The goal of this study was to obtain the limit of dysphagia and the average volume per swallow in patients with mild to moderate Parkinson's disease (PD) but without swallowing complaints and in normal subjects, and to investigate the relationship between them. We hypothesize there is a direct relationship between these two measurements. The study included 10 patients with idiopathic PD and 10 age-matched normal controls. Surface electromyography was recorded over the suprahyoid muscle group. The limit of dysphagia was obtained by offering increasing volumes of water until piecemeal deglutition occurred. The average volume per swallow was calculated by dividing the volume of water ingested (100 ml) by the number of swallows used to drink it. The PD group showed a significantly lower dysphagia limit and lower average volume per swallow. There was a moderate, statistically significant direct correlation and association between the two measurements. About half of the PD patients had an abnormally low dysphagia limit and average volume per swallow, although none had spontaneously reported swallowing problems. Both measurements may be used as a quick objective screening test for the early identification of swallowing alterations that may lead to dysphagia in PD patients, but the determination of the average volume per swallow is much quicker and simpler.
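
    As a purely illustrative calculation (invented numbers), the average volume per swallow is simply the 100 ml test volume divided by the number of swallows needed to drink it:

```python
# Hypothetical example of the average-volume-per-swallow measure.
test_volume_ml = 100.0
swallows_used = 8                          # swallows needed to finish the 100 ml
print(test_volume_ml / swallows_used)      # 12.5 ml per swallow
```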

  19. Nuclear business worth billions begins

    International Nuclear Information System (INIS)

    Beer, G.; Marcan, P.; Slovak, K.

    2005-01-01

    specific data regarding the direct costs of decommissioning. Preliminary estimates state 50 billion Slovak crowns (1.28 billion EUR), but the actual costs will mainly depend on the volume of nuclear waste to be disposed of. (authors)

  20. Known volume air sampling pump. Final summary report Jun 1975--Nov 1976

    International Nuclear Information System (INIS)

    McCullough, J.E.; Peterson, A.

    1976-11-01

    The purpose of this development program was to design and develop a known volume air sampling pump for use in measuring the amount of radioactive material in the atmosphere of an underground uranium mine. The principal nuclear radiation hazard to underground uranium miners comes from the mine atmosphere. Daughter products of radon-222 are inhaled by the miner, resulting in a relatively high lung cancer rate among these workers. Current exposure control practice employs spot sampling in working areas to measure working level values. Currently available personal air sampling pumps fail to deliver known volumes of air under widely changing differential pressures. A unique type of gas pump known as the scroll compressor, developed by Arthur D. Little, Inc., that has no valves and few moving parts, is expected to provide a practical, efficient, and dependable air pump for use in dosimeters. The three deliverable known volume air sampling pumps resulting from this work incorporate a scroll pump, drive motor, speed control electronics, and battery pack in a container suitable for attachment to a miner's belt.

  1. Effective interventions for unintentional injuries: a systematic review and mortality impact assessment among the poorest billion

    Directory of Open Access Journals (Sweden)

    Andres I Vecino-Ortiz, PhD

    2018-05-01

    lessons for children younger than 14 years (>25 000 lives saved per year) and the use of crèches to supervise younger children (younger than 5 years; >10 000 lives saved per year). We did not find sufficient evidence on interventions for other causes of unintentional injuries (poisoning, burns, and falls) to run similar simulations. Interpretation: Based on the little available evidence, key interventions have been identified to prevent lives lost from unintentional injuries among the poorest billion. This Article provides guidance to national authorities on evidence-based priority interventions that can reduce the burden of injuries among the most vulnerable members of the population. We also identify an important gap in knowledge on the effectiveness and the mortality impacts of injury interventions. Funding: Partly supported by the Fogarty International Center of the US National Institutes of Health (Chronic Consequences of Trauma, Injuries, Disability Across the Lifespan: Uganda; #D43TW009284).

  2. 2016 Billion-ton report: Advancing domestic resources for a thriving bioeconomy, Volume 1: Economic availability of feedstock

    Science.gov (United States)

    M.H. Langholtz; B.J. Stokes; L.M. Eaton

    2016-01-01

    This product builds on previous efforts, namely the 2005 Billion-Ton Study (BTS) and the 2011 U.S. Billion-Ton Update (BT2). With each report, greater perspective is gained on the potential of biomass resources to contribute to a national energy strategy. Similarly, each successive report introduces new questions regarding commercialization challenges. BTS quantified...

  3. Shining light on human breath analysis with quantum cascade laser spectroscopy

    NARCIS (Netherlands)

    Reyes Reyes, A.

    2017-01-01

    In the search for new non-invasive diagnostic methods, healthcare researchers have turned their attention to exhaled human breath. Breath consists of thousands of molecular compounds in very low concentrations, in the order of parts per million by volume (ppmv), parts per billion by

  4. BUILDING A BILLION SPATIO-TEMPORAL OBJECT SEARCH AND VISUALIZATION PLATFORM

    Directory of Open Access Journals (Sweden)

    D. Kakkar

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.

  5. Building a Billion Spatio-Temporal Object Search and Visualization Platform

    Science.gov (United States)

    Kakkar, D.; Lewis, B.

    2017-10-01

    With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
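
    To make the harvest-enrich-index path concrete, here is a minimal sketch of the kind of loop described above; the topic name, field names, Solr collection, and the kafka-python/pysolr clients are assumptions for illustration, not the BOP's actual code:

```python
# Sketch: consume geo-tweets from Kafka, enrich them, and index them into Solr.
import json
import pysolr
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "geo-tweets",                               # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
solr = pysolr.Solr("http://localhost:8983/solr/bop", always_commit=False)

def enrich(tweet: dict) -> dict:
    """Attach sentiment and an admin-boundary code (placeholder logic here)."""
    tweet["sentiment"] = 0.0                    # real system: classifier score
    tweet["admin_code"] = "US-MA"               # real system: point-in-polygon lookup
    return tweet

batch = []
for msg in consumer:
    batch.append(enrich(msg.value))
    if len(batch) >= 1000:                      # index in batches for throughput
        solr.add(batch)
        batch.clear()
```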

  6. Ear recognition from one sample per person.

    Directory of Open Access Journals (Sweden)

    Long Chen

    Biometrics has the advantages of efficiency and convenience in identity authentication. As one of the most promising biometric-based methods, ear recognition has received broad attention and research. Previous studies have achieved remarkable performance with multiple samples per person (MSPP) in the gallery. However, most conventional methods are insufficient when there is only one sample per person (OSPP) available in the gallery. To solve the OSPP problem by maximizing the use of a single sample, this paper proposes a hybrid multi-keypoint descriptor sparse representation-based classification (MKD-SRC) ear recognition approach based on 2D and 3D information. Because most 3D sensors capture 3D data accessorizing the corresponding 2D data, it is sensible to use both types of information. First, the ear region is extracted from the profile. Second, keypoints are detected and described for both the 2D texture image and 3D range image. Then, the hybrid MKD-SRC algorithm is used to complete the recognition with only OSPP in the gallery. Experimental results on a benchmark dataset have demonstrated the feasibility and effectiveness of the proposed method in resolving the OSPP problem. A Rank-one recognition rate of 96.4% is achieved for a gallery of 415 subjects, and the time involved in the computation is satisfactory compared to conventional methods.

  7. Ear recognition from one sample per person.

    Science.gov (United States)

    Chen, Long; Mu, Zhichun; Zhang, Baoqing; Zhang, Yi

    2015-01-01

    Biometrics has the advantages of efficiency and convenience in identity authentication. As one of the most promising biometric-based methods, ear recognition has received broad attention and research. Previous studies have achieved remarkable performance with multiple samples per person (MSPP) in the gallery. However, most conventional methods are insufficient when there is only one sample per person (OSPP) available in the gallery. To solve the OSPP problem by maximizing the use of a single sample, this paper proposes a hybrid multi-keypoint descriptor sparse representation-based classification (MKD-SRC) ear recognition approach based on 2D and 3D information. Because most 3D sensors capture 3D data accessorizing the corresponding 2D data, it is sensible to use both types of information. First, the ear region is extracted from the profile. Second, keypoints are detected and described for both the 2D texture image and 3D range image. Then, the hybrid MKD-SRC algorithm is used to complete the recognition with only OSPP in the gallery. Experimental results on a benchmark dataset have demonstrated the feasibility and effectiveness of the proposed method in resolving the OSPP problem. A Rank-one recognition rate of 96.4% is achieved for a gallery of 415 subjects, and the time involved in the computation is satisfactory compared to conventional methods.
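
    The classification step can be made concrete with a small sketch of sparse-representation voting over keypoint descriptors. This is not the authors' implementation: synthetic descriptors stand in for the 2D/3D ear keypoints, and scikit-learn's orthogonal matching pursuit is used as the sparse coder; only the overall MKD-SRC idea (code each probe descriptor over the gallery dictionary, vote by per-class reconstruction residual) is retained.

      # Sketch of multi-keypoint-descriptor SRC-style voting on synthetic data.
      import numpy as np
      from sklearn.linear_model import orthogonal_mp
      from sklearn.preprocessing import normalize

      rng = np.random.default_rng(0)
      n_classes, per_class, dim = 5, 40, 64

      # Gallery dictionary: columns are unit-norm keypoint descriptors; labels
      # record which subject each descriptor came from (one gallery image per class).
      D = normalize(rng.normal(size=(n_classes * per_class, dim))).T   # shape (dim, N)
      labels = np.repeat(np.arange(n_classes), per_class)

      def classify(probe_descriptors, n_nonzero=10):
          votes = np.zeros(n_classes)
          for y in normalize(probe_descriptors):            # each probe keypoint
              coef = orthogonal_mp(D, y, n_nonzero_coefs=n_nonzero)
              residuals = [np.linalg.norm(y - D[:, labels == c] @ coef[labels == c])
                           for c in range(n_classes)]
              votes[int(np.argmin(residuals))] += 1         # keypoint-level vote
          return int(np.argmax(votes))                      # subject with most votes wins

      # A probe built from noisy copies of subject 2's descriptors should map to 2.
      probe = D.T[labels == 2][:15] + 0.05 * rng.normal(size=(15, dim))
      print("predicted subject:", classify(probe))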

  8. Effective interventions for unintentional injuries: a systematic review and mortality impact assessment among the poorest billion.

    Science.gov (United States)

    Vecino-Ortiz, Andres I; Jafri, Aisha; Hyder, Adnan A

    2018-05-01

    ...saved per year) and the use of crèches to supervise younger children (younger than 5 years; >10 000 lives saved per year). We did not find sufficient evidence on interventions for other causes of unintentional injuries (poisoning, burns, and falls) to run similar simulations. Based on the little available evidence, key interventions have been identified to prevent lives lost from unintentional injuries among the poorest billion. This Article provides guidance to national authorities on evidence-based priority interventions that can reduce the burden of injuries among the most vulnerable members of the population. We also identify an important gap in knowledge on the effectiveness and the mortality impacts of injury interventions. Partly supported by the Fogarty International Center of the US National Institutes of Health (Chronic Consequences of Trauma, Injuries, Disability Across the Lifespan: Uganda; #D43TW009284). Copyright © 2018 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY-NC-ND 4.0 license. All rights reserved.

  9. Direct detection of benzene, toluene, and ethylbenzene at trace levels in ambient air by atmospheric pressure chemical ionization using a handheld mass spectrometer.

    Science.gov (United States)

    Huang, Guangming; Gao, Liang; Duncan, Jason; Harper, Jason D; Sanders, Nathaniel L; Ouyang, Zheng; Cooks, R Graham

    2010-01-01

    The capabilities of a portable mass spectrometer for real-time monitoring of trace levels of benzene, toluene, and ethylbenzene in air are illustrated. An atmospheric pressure interface was built to implement atmospheric pressure chemical ionization for direct analysis of gas-phase samples on a previously described miniature mass spectrometer (Gao et al. Anal. Chem. 2006, 78, 5994-6002). Linear dynamic ranges, limits of detection and other analytical figures of merit were evaluated: for benzene, a limit of detection of 0.2 parts-per-billion was achieved for air samples without any sample preconcentration. The corresponding limits of detection for toluene and ethylbenzene were 0.5 parts-per-billion and 0.7 parts-per-billion, respectively. These detection limits are well below the compounds' permissible exposure levels, even in the presence of added complex mixtures of organics at levels exceeding the parts-per-million level. The linear dynamic ranges of benzene, toluene, and ethylbenzene are limited to approximately two orders of magnitude by saturation of the detection electronics. 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
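
    To put the quoted limits of detection in context, the margin to workplace limits can be computed directly; the permissible exposure levels used below are approximate OSHA 8-hour values included for illustration only (an assumption, not taken from the article).

      # Margin between the reported LODs and assumed permissible exposure levels (PELs).
      # PEL values are approximate, for orientation only; consult current regulations.
      limits = {
          #               LOD (ppb)  assumed PEL (ppb)
          "benzene":      (0.2,      1 * 1000),     # ~1 ppm
          "toluene":      (0.5,      200 * 1000),   # ~200 ppm
          "ethylbenzene": (0.7,      100 * 1000),   # ~100 ppm
      }
      for compound, (lod_ppb, pel_ppb) in limits.items():
          print(f"{compound:12s} LOD ~{pel_ppb / lod_ppb:,.0f}x below the assumed PEL")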

  10. More practical critical height sampling.

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2015-01-01

    Critical Height Sampling (CHS) (Kitamura 1964) can be used to predict cubic volumes per acre without using volume tables or equations. The critical height is defined as the height at which the tree stem appears to be in borderline condition using the point-sampling angle gauge (e.g. prism). An estimate of cubic volume per acre can be obtained from multiplication of the...
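
    The estimator alluded to in the truncated final sentence is usually written as follows (a sketch in our own notation, not quoted from the article): with point-sampling basal area factor F, m sample points, and critical height h_{c,ij} for tree j tallied at point i,

      \hat{V} \;=\; \frac{F}{m} \sum_{i=1}^{m} \sum_{j=1}^{n_i} h_{c,ij} \qquad \text{(cubic volume per acre)}

    so the cubic volume per acre follows from the angle-gauge factor and the critical heights alone, with no volume table or equation required.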

  11. Inductively coupled plasma emission spectroscopy. Part II: applications and fundamentals. Volume 2

    International Nuclear Information System (INIS)

    Boumans, P.W.J.M.

    1987-01-01

    This is the second part of the two-volume treatise by this well-known and respected author. This volume reviews applications of inductively coupled plasma atomic emission spectroscopy (ICP-AES), summarizes fundamental studies, and compares ICP-AES methods with other methods of analysis. The first six chapters are devoted to specific fields of application, including the following: metals and other industrial materials, geology, the environment, agriculture and food, biology and clinical analysis, and organic materials. The chapter on the analysis of organic materials also covers the special instrumental considerations required when organic solvents are introduced into an inductively coupled plasma. A chapter on the direct analysis of solids completes the first part of this volume. Each of the applications chapters begins with a summary of the types of samples that are encountered in that field, and the kinds of problems that an elemental analysis can help to solve. This is followed by a tutorial approach covering applicability, advantages, and limitations of the methods. The coverage is thorough, including sample handling, storage, and preparation, acid, and fusion dissolution, avoiding contamination, methods of preconcentration, the types of interferences that can be expected and ways to reduce them, and the types of ICP plasmas that are used. The second half of the volume covers fundamental studies of ICP-AES: basic processes of aerosol generation, plasma modeling and computer simulation, spectroscopic diagnostics, excitation mechanisms, and discharge characteristics. This section introduces the experimental and modeling methods that have been used to obtain fundamental information about ICPs

  12. What Per Cent Cruise?

    Science.gov (United States)

    George M. Furnival

    1953-01-01

    Cruising timber is ordinarily a job of sampling, in which the quantity of timber on a tract is estimated from the quantity on a part of the tract. The difficulty is to determine what part (per cent) of the tract should be sampled to attain a given level of accuracy. This article gives a rule-of-thumb that can be applied with fair reliability to most Southern forests....
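
    The record stops short of the rule-of-thumb itself; what such rules approximate is the standard sample-size relation (our notation, an assumption rather than a quotation), in which the required number of plots n depends on the coefficient of variation CV of volume among plots, the allowable sampling error E (both in per cent) and Student's t:

      n \;=\; \left( \frac{t \, CV}{E} \right)^{2}, \qquad \text{per cent cruise} \;=\; 100 \, \frac{n \, a}{A},

    where a is the plot (or strip) size and A the tract area.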

  13. U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry

    Energy Technology Data Exchange (ETDEWEB)

    Downing, Mark [ORNL; Eaton, Laurence M [ORNL; Graham, Robin Lambert [ORNL; Langholtz, Matthew H [ORNL; Perlack, Robert D [ORNL; Turhollow Jr, Anthony F [ORNL; Stokes, Bryce [Navarro Research & Engineering; Brandt, Craig C [ORNL

    2011-08-01

    The report, Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply (generally referred to as the Billion-Ton Study or 2005 BTS), was an estimate of 'potential' biomass based on numerous assumptions about current and future inventory, production capacity, availability, and technology. The analysis was made to determine if conterminous U.S. agriculture and forestry resources had the capability to produce at least one billion dry tons of sustainable biomass annually to displace 30% or more of the nation's present petroleum consumption. An effort was made to use conservative estimates to assure confidence in having sufficient supply to reach the goal. The potential biomass was projected to be reasonably available around mid-century when large-scale biorefineries are likely to exist. The study emphasized primary sources of forest- and agriculture-derived biomass, such as logging residues, fuel treatment thinnings, crop residues, and perennially grown grasses and trees. These primary sources have the greatest potential to supply large, reliable, and sustainable quantities of biomass. While the primary sources were emphasized, estimates of secondary residue and tertiary waste resources of biomass were also provided. The original Billion-Ton Resource Assessment, published in 2005, was divided into two parts: forest-derived resources and agriculture-derived resources. The forest resources included residues produced during the harvesting of merchantable timber, forest residues, and small-diameter trees that could become available through initiatives to reduce fire hazards and improve forest health; forest residues from land conversion; fuelwood extracted from forests; residues generated at primary forest product processing mills; and urban wood wastes, municipal solid wastes (MSW), and construction and demolition (C&D) debris. For these forest resources, only residues, wastes, and small...

  14. Sample container and storage for paclobutrazol monitoring in irrigation water

    Science.gov (United States)

    Paclobutrazol is a plant growth retardant commonly used on greenhouse crops. Residues from paclobutrazol applications can accumulate in recirculated irrigation water. Given that paclobutrazol has a long half-life and potential biological activity in parts per billion concentrations, it would be de...

  15. Report on the environmental isotopic investigations in Kedah and Perlis Area, Malaysia (Part 1)

    International Nuclear Information System (INIS)

    Daud bin Mohamad.

    1982-11-01

    A preliminary study of the isotope hydrology of the Kedah and Perlis area was undertaken under the RCA programme. This project is an attempt at elucidating the mechanism of recharge, origin, area of recharge and dating of the groundwater system in the area. The results show that all groundwater samples in the area vary within a narrow range for ¹⁸O (−7.58 to −5.06‰) while ²H ranges from −50.3 to −35.1‰. The mean isotopic composition of precipitation collected at the Alor Star meteorological station falls within the range of variation of the Kedah/Perlis groundwaters. In the southern part of the study site, the isotopic results indicate the occurrence of two types of water: in the first, recharge is from the highlands, where more negative ¹⁸O values and low tritium were observed, and the second type is of local recharge, where high tritium and less negative ¹⁸O values were observed. In the northern part of the basin, on the other hand, the interpretation of the stable isotopic results is difficult at this stage; there was no correlation between tritium and ¹⁸O. Results of the tritium assay show that some of the groundwater samples are pre-nuclear in age, as indicated by low tritium content. Consequently, a Carbon-14 investigation was carried out at a few selected sites and the ages were found to be in the range of about 3000 to 5000 years. (author)

  16. Using data on resistance prevalence per sample in the surveillance of antimicrobial resistance

    DEFF Research Database (Denmark)

    Vieira, Antonio; Shuyu, Wu; Jensen, Lars Bogø

    2008-01-01

    Objectives: In most existing antimicrobial resistance monitoring programmes, one single bacterial colony from each collected sample is susceptibility tested against a panel of antimicrobials. Detecting the proportion of colonies resistant to different antimicrobials in each sample can provide quantitative data on antimicrobial resistance (resistance prevalence per sample). Methods: In this study, a total of 98 faecal samples from slaughter pigs were tested for tetracycline and sulphonamide resistance in Escherichia coli using the single colony method, and these results were compared with the results obtained using the resistance prevalence per sample method. Results: The results obtained by the resistance prevalence per sample method showed a lower occurrence of resistance. Tetracycline resistance in E. coli was found in 36.7% of the samples using the single colony method, while the mean...
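
    The distinction between the two summaries can be made concrete with a toy computation on synthetic data (illustrative only; the colony counts and rates below are made up, not taken from the study).

      # Each sample is a list of colonies scored 1 (resistant) or 0 (susceptible).
      import numpy as np

      rng = np.random.default_rng(1)
      # 98 synthetic samples, 10 colonies each; within-sample resistance varies.
      samples = [rng.binomial(1, rng.beta(0.6, 1.4), size=10) for _ in range(98)]

      # Single-colony method: test one colony per sample, report % of positive samples.
      single_colony = np.mean([s[rng.integers(len(s))] for s in samples]) * 100

      # Resistance prevalence per sample: mean within-sample proportion of
      # resistant colonies across all samples.
      prevalence_per_sample = np.mean([s.mean() for s in samples]) * 100

      print(f"single colony method:  {single_colony:.1f}% of samples scored resistant")
      print(f"prevalence per sample: {prevalence_per_sample:.1f}% mean resistant colonies")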

  17. A billion-dollar bonanza

    International Nuclear Information System (INIS)

    Isaacs, J.

    1993-01-01

    In late May -- only weeks after Congress had rejected the president's economic stimulus package because it would add to the federal deficit -- the House of Representatives generously allocated an extra $1.2 billion to the Pentagon. This article discusses some of the rationalizations House members gave for the gift and describes the attempts of a bipartisan group to defeat this request for funds propounded by Pennsylvania Democrat John Murtha. The gist of the arguments for and against the $1.2 billion and the results of votes on the bill are presented.

  18. Sampling strategies in antimicrobial resistance monitoring: evaluating how precision and sensitivity vary with the number of animals sampled per farm.

    Directory of Open Access Journals (Sweden)

    Takehisa Yamamoto

    Because antimicrobial resistance in food-producing animals is a major public health concern, many countries have implemented antimicrobial monitoring systems at a national level. When designing a sampling scheme for antimicrobial resistance monitoring, it is necessary to consider both cost effectiveness and statistical plausibility. In this study, we examined how sampling scheme precision and sensitivity can vary with the number of animals sampled from each farm, while keeping the overall sample size constant to avoid additional sampling costs. Five sampling strategies were investigated. These employed 1, 2, 3, 4 or 6 animal samples per farm, with a total of 12 animals sampled in each strategy. A total of 1,500 Escherichia coli isolates from 300 fattening pigs on 30 farms were tested for resistance against 12 antimicrobials. The performance of each sampling strategy was evaluated by bootstrap resampling from the observational data. In the bootstrapping procedure, farms, animals, and isolates were selected randomly with replacement, and a total of 10,000 replications were conducted. For each antimicrobial, we observed that the standard deviation and 2.5-97.5 percentile interval of resistance prevalence were smallest in the sampling strategy that employed 1 animal per farm. The proportion of bootstrap samples that included at least 1 isolate with resistance was also evaluated as an indicator of the sensitivity of the sampling strategy to previously unidentified antimicrobial resistance. The proportion was greatest with 1 sample per farm and decreased with larger samples per farm. We concluded that when the total number of samples is pre-specified, the most precise and sensitive sampling strategy involves collecting 1 sample per farm.
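
    The bootstrap comparison described above can be sketched as follows; the data are synthetic placeholders and the isolate level is collapsed to one isolate per animal, so this only illustrates the farm/animal resampling logic, not the study's actual dataset of 1,500 isolates.

      # Compare strategies that keep the total at 12 isolates:
      # 12 farms x 1 animal, 6 x 2, 4 x 3, 3 x 4, 2 x 6.
      import numpy as np

      rng = np.random.default_rng(2)
      n_farms, animals_per_farm = 30, 10
      farm_effect = rng.beta(2, 8, size=n_farms)            # between-farm variation
      # resistance[farm, animal] = 1 if the tested isolate is resistant
      resistance = rng.binomial(1, farm_effect[:, None], size=(n_farms, animals_per_farm))

      def bootstrap_prevalence(farms_sampled, animals_each, n_boot=5000):
          estimates = np.empty(n_boot)
          for b in range(n_boot):
              farms = rng.integers(n_farms, size=farms_sampled)          # farms with replacement
              animals = rng.integers(animals_per_farm, size=(farms_sampled, animals_each))
              estimates[b] = resistance[farms[:, None], animals].mean()
          return estimates

      for farms_sampled, animals_each in [(12, 1), (6, 2), (4, 3), (3, 4), (2, 6)]:
          est = bootstrap_prevalence(farms_sampled, animals_each)
          lo, hi = np.percentile(est, [2.5, 97.5])
          print(f"{animals_each} animal(s)/farm: sd={est.std():.3f}, 2.5-97.5% width={hi - lo:.3f}")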

  19. TRU Waste Sampling Program: Volume I. Waste characterization

    International Nuclear Information System (INIS)

    Clements, T.L. Jr.; Kudera, D.E.

    1985-09-01

    Volume I of the TRU Waste Sampling Program report presents the waste characterization information obtained from sampling and characterizing various aged transuranic waste retrieved from storage at the Idaho National Engineering Laboratory and the Los Alamos National Laboratory. The data contained in this report include the results of gas sampling and gas generation, radiographic examinations, waste visual examination results, and waste compliance with the Waste Isolation Pilot Plant-Waste Acceptance Criteria (WIPP-WAC). A separate report, Volume II, contains data from the gas generation studies

  20. Summary and Comparison of the 2016 Billion-Ton Report with the 2011 U.S. Billion-Ton Update

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    In terms of the magnitude of the resource potential, the results of the 2016 Billion-Ton Report (BT16) are consistent with the original 2005 Billion-Ton Study (BTS) and the 2011 report, U.S. Billion-Ton Update: Biomass Supply for a Bioenergy and Bioproducts Industry (BT2). An effort was made to reevaluate the potential forestland, agricultural, and waste resources at the roadside, and then to extend the analysis for the major resource fractions by adding the cost of transportation to a biorefinery under specified logistics assumptions.

  1. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small-volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of a single-polarity pulsed electrospray. This method significantly boosts sample economy, so that several minutes of MS signal can be obtained from a sample of merely picoliter volume. The elongated MS signal duration enables collection of abundant MS² information on components of interest in a small-volume sample for systematic analysis. The method was successfully applied to single-cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS² data) from single plant and mammalian cells, covering 1034 components and 656 components for Allium cepa and HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small-volume samples.

  2. Information architecture. Volume 2, Part 1: Baseline analysis summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-12-01

    The Department of Energy (DOE) Information Architecture, Volume 2, Baseline Analysis, is a collaborative and logical next-step effort in the processes required to produce a Departmentwide information architecture. The baseline analysis serves a diverse audience of program management and technical personnel and provides an organized way to examine the Department's existing or de facto information architecture. A companion document to Volume 1, The Foundations, it furnishes the rationale for establishing a Departmentwide information architecture. This volume, consisting of the Baseline Analysis Summary (part 1), Baseline Analysis (part 2), and Reference Data (part 3), is of interest to readers who wish to understand how the Department's current information architecture technologies are employed. The analysis identifies how and where current technologies support business areas, programs, sites, and corporate systems.

  3. Use of passive sampling devices to determine soil contaminant concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, K.A. [Clemson Univ., Pendleton, SC (United States)]|[Washington State Univ., Richland, WA (United States); Hooper, M.J. [Clemson Univ., Pendleton, SC (United States); Weisskopf, C.P. [Washington State Univ., Richland, WA (United States)

    1996-12-31

    The effective remediation of contaminated sites requires accurate identification of chemical distributions. A rapid sampling method using passive sampling devices (PSDs) can provide a thorough site assessment. We have been pursuing their application in terrestrial systems and have found that they increase the ease and speed of analysis, decrease solvent usage and overall cost, and minimize the transport of contaminated soils. Time and cost savings allow a higher sampling frequency than is generally the case using traditional methods. PSDs have been used in the field in soils of varying physical properties and have been successful in estimating soil concentrations ranging from 1 µg/kg (parts per billion) to greater than 200 mg/kg (parts per million). They were also helpful in identifying hot spots within the sites. Passive sampling devices show extreme promise as an analytical tool to rapidly characterize contaminant distributions in soil. There are substantial time and cost savings in laboratory personnel and supplies. By selectively excluding common interferences that require sample cleanup, PSDs can be retrieved from the field and processed rapidly (one technician can process approximately 90 PSDs in an 8-h work day). The results of our studies indicate that PSDs can be used to accurately estimate soil contaminant concentrations and provide lower detection limits. Further, time and cost savings will allow a more thorough and detailed characterization of contaminant distributions. 13 refs., 4 figs., 2 tabs.

  4. Core sampling system spare parts assessment

    International Nuclear Information System (INIS)

    Walter, E.J.

    1995-01-01

    Soon, there will be 4 independent core sampling systems obtaining samples from the underground tanks. It is desirable that these systems be available for sampling during the next 2 years. This assessment was prepared to evaluate the adequacy of the spare parts identified for the core sampling system and to provide recommendations that may remediate overages or inadequacies of spare parts

  5. LLNL's Big Science Capabilities Help Spur Over $796 Billion in U.S. Economic Activity Sequencing the Human Genome

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Jeffrey S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-28

    LLNL’s successful history of taking on big science projects spans beyond national security and has helped create billions of dollars per year in new economic activity. One example is LLNL’s role in helping sequence the human genome. Over $796 billion in new economic activity in over half a dozen fields has been documented since LLNL successfully completed this Grand Challenge.

  6. Criterion 6, indicator 28 : total and per capita consumption of wood and wood products in round wood equivalents

    Science.gov (United States)

    James L. Howard; Rebecca Westby; Kenneth E. Skog

    2010-01-01

    Total consumption of wood and paper products and fuelwood, in roundwood equivalents, increased between 1965 and 1988 from 13.2 to 18.9 billion cubic feet. Since 1988, it has been about 20 billion cubic feet per year. Total per capita consumption increased between 1965 and 1987, from 68 to 83 ft³ per year. From 1987 through 2006, per capita...

  7. Effectiveness of 131I nor-cholesterol uptake per unit volume of adrenal adenoma in the diagnosis of aldosteronoma

    International Nuclear Information System (INIS)

    Kita, Tamotsu; Tomita, Hiroko; Sakaguchi, Chiharu

    2010-01-01

    Diagnosis of adrenal adenomas in patients with primary aldosteronism is sometimes difficult when referring only to the visualization pattern in adrenocortical scintigraphy, whether standard scintigraphy or suppression scintigraphy with dexamethasone. We studied whether quantitative evaluation of the standard scintigraphy without dexamethasone suppression can be useful to diagnose aldosteronomas. Twenty-nine patients who had undergone adrenalectomy with different clinical manifestations (16 patients with primary aldosteronism, 6 patients with Cushing's syndrome and 7 patients without hormonal abnormality) were included in the study. The volume of the adrenocortical adenomas, the ¹³¹I nor-cholesterol uptake of the adenomas, and the ¹³¹I nor-cholesterol uptake per unit volume of the adenomas were compared between the 3 groups. The volume of the adrenocortical adenomas in the patients with primary aldosteronism was significantly lower than in the other two groups. The ¹³¹I nor-cholesterol uptake per unit volume of the adenomas was significantly higher in the patients with primary aldosteronism than in the other two groups, indicating that the ¹³¹I nor-cholesterol uptake per unit volume of adenoma obtained from adrenocortical scintigraphy without dexamethasone suppression can be useful in the diagnosis of aldosteronoma. (author)

  8. A low-volume cavity ring-down spectrometer for sample-limited applications

    Science.gov (United States)

    Stowasser, C.; Farinas, A. D.; Ware, J.; Wistisen, D. W.; Rella, C.; Wahl, E.; Crosson, E.; Blunier, T.

    2014-08-01

    In atmospheric and environmental sciences, optical spectrometers are used for the measurements of greenhouse gas mole fractions and the isotopic composition of water vapor or greenhouse gases. The large sample cell volumes (tens of milliliters to several liters) in commercially available spectrometers constrain the usefulness of such instruments for applications that are limited in sample size and/or need to track fast variations in the sample stream. In an effort to make spectrometers more suitable for sample-limited applications, we developed a low-volume analyzer capable of measuring mole fractions of methane and carbon monoxide based on a commercial cavity ring-down spectrometer. The instrument has a small sample cell (9.6 ml) and can selectively be operated at a sample cell pressure of 140, 45, or 20 Torr (effective internal volume of 1.8, 0.57, and 0.25 ml). We present the new sample cell design and the flow path configuration, which are optimized for small sample sizes. To quantify the spectrometer's usefulness for sample-limited applications, we determine the renewal rate of sample molecules within the low-volume spectrometer. Furthermore, we show that the performance of the low-volume spectrometer matches the performance of the standard commercial analyzers by investigating linearity, precision, and instrumental drift.
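
    The quoted effective internal volumes are simply the 9.6 ml cell volume referred to ambient pressure (taken here as 760 Torr, an assumption); a quick check:

      # Effective internal volume of the gas sample: V_eff = V_cell * P_cell / P_ambient
      V_CELL_ML, P_AMBIENT_TORR = 9.6, 760.0
      for p_cell in (140, 45, 20):                    # operating pressures in Torr
          v_eff = V_CELL_ML * p_cell / P_AMBIENT_TORR
          print(f"{p_cell:3d} Torr -> {v_eff:.2f} ml effective volume")
      # prints 1.77, 0.57 and 0.25 ml, matching (to rounding) the 1.8, 0.57 and
      # 0.25 ml quoted in the abstract.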

  9. Sneak Peek to the 2016 Billion-Ton Report

    Energy Technology Data Exchange (ETDEWEB)

    None

    2016-06-01

    The 2005 Billion-Ton Study became a landmark resource for bioenergy stakeholders, detailing for the first time the potential to produce at least one billion dry tons of biomass annually in a sustainable manner from U.S. agriculture and forest resources. The 2011 U.S. Billion-Ton Update expanded and updated the analysis, and in 2016, the U.S. Department of Energy’s Bioenergy Technologies Office plans to release the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy.

  10. A measurement of the absorption of liquid argon scintillation light by dissolved nitrogen at the part-per-million level

    International Nuclear Information System (INIS)

    Jones, B J P; Chiu, C S; Conrad, J M; Ignarra, C M; Katori, T; Toups, M

    2013-01-01

    We report on a measurement of the absorption length of scintillation light in liquid argon due to dissolved nitrogen at the part-per-million (ppm) level. We inject controlled quantities of nitrogen into a high purity volume of liquid argon and monitor the light yield from an alpha source. The source is placed at different distances from a cryogenic photomultiplier tube assembly. By comparing the light yield from each position we extract the absorption cross section of nitrogen. We find that nitrogen absorbs argon scintillation light with strength of (1.51±0.15) × 10⁻⁴ cm⁻¹ ppm⁻¹, corresponding to an absorption cross section of (4.99±0.51) × 10⁻²¹ cm² molecule⁻¹. We obtain the relationship between absorption length and nitrogen concentration over the 0 to 50 ppm range and discuss the implications for the design and data analysis of future large liquid argon time projection chamber (LArTPC) detectors. Our results indicate that for a current-generation LArTPC, where a concentration of 2 parts per million of nitrogen is expected, the attenuation length due to nitrogen will be 30±3 meters
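
    The quoted attenuation length at 2 ppm follows directly from the measured absorption strength k (symbols here are ours, not the paper's):

      \lambda(c) \;=\; \frac{1}{k \, c}, \qquad \lambda(2\,\mathrm{ppm}) \;=\; \frac{1}{(1.51 \times 10^{-4}\,\mathrm{cm^{-1}\,ppm^{-1}})(2\,\mathrm{ppm})} \;\approx\; 33\ \mathrm{m},

    consistent with the 30±3 m quoted above for a current-generation LArTPC.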

  11. Galaxy growth in a massive halo in the first billion years of cosmic history

    Science.gov (United States)

    Marrone, D. P.; Spilker, J. S.; Hayward, C. C.; Vieira, J. D.; Aravena, M.; Ashby, M. L. N.; Bayliss, M. B.; Béthermin, M.; Brodwin, M.; Bothwell, M. S.; Carlstrom, J. E.; Chapman, S. C.; Chen, Chian-Chou; Crawford, T. M.; Cunningham, D. J. M.; De Breuck, C.; Fassnacht, C. D.; Gonzalez, A. H.; Greve, T. R.; Hezaveh, Y. D.; Lacaille, K.; Litke, K. C.; Lower, S.; Ma, J.; Malkan, M.; Miller, T. B.; Morningstar, W. R.; Murphy, E. J.; Narayanan, D.; Phadke, K. A.; Rotermund, K. M.; Sreevani, J.; Stalder, B.; Stark, A. A.; Strandet, M. L.; Tang, M.; Weiß, A.

    2018-01-01

    According to the current understanding of cosmic structure formation, the precursors of the most massive structures in the Universe began to form shortly after the Big Bang, in regions corresponding to the largest fluctuations in the cosmic density field. Observing these structures during their period of active growth and assembly—the first few hundred million years of the Universe—is challenging because it requires surveys that are sensitive enough to detect the distant galaxies that act as signposts for these structures and wide enough to capture the rarest objects. As a result, very few such objects have been detected so far. Here we report observations of a far-infrared-luminous object at redshift 6.900 (less than 800 million years after the Big Bang) that was discovered in a wide-field survey. High-resolution imaging shows it to be a pair of extremely massive star-forming galaxies. The larger is forming stars at a rate of 2,900 solar masses per year, contains 270 billion solar masses of gas and 2.5 billion solar masses of dust, and is more massive than any other known object at a redshift of more than 6. Its rapid star formation is probably triggered by its companion galaxy at a projected separation of 8 kiloparsecs. This merging companion hosts 35 billion solar masses of stars and has a star-formation rate of 540 solar masses per year, but has an order of magnitude less gas and dust than its neighbour and physical conditions akin to those observed in lower-metallicity galaxies in the nearby Universe. These objects suggest the presence of a dark-matter halo with a mass of more than 100 billion solar masses, making it among the rarest dark-matter haloes that should exist in the Universe at this epoch.

  12. A new model for volume recombination in plane-parallel chambers in pulsed fields of high dose-per-pulse.

    Science.gov (United States)

    Gotz, M; Karsch, L; Pawelke, J

    2017-11-01

    In order to describe the volume recombination in a pulsed radiation field of high dose-per-pulse, this study presents a numerical solution of a 1D transport model of the liberated charges in a plane-parallel ionization chamber. In addition, measurements were performed on an Advanced Markus ionization chamber in a pulsed electron beam to obtain suitable data to test the calculation. The experiment used radiation pulses of 4 μs duration and variable dose-per-pulse values up to about 1 Gy, as well as pulses of variable duration up to 308 μs at constant dose-per-pulse values between 85 mGy and 400 mGy. Those experimental data were compared to the developed numerical model and existing descriptions of volume recombination. At low collection voltages the observed dose-per-pulse dependence of volume recombination can be approximated by the existing theory using effective parameters. However, at high collection voltages large discrepancies are observed. The developed numerical model shows much better agreement with the observations and is able to replicate the observed behavior over the entire range of dose-per-pulse values and collection voltages. Using the developed numerical model, the differences between observation and existing theory are shown to be the result of a large fraction of the charge being collected as free electrons and the resultant distortion of the electric field inside the chamber. Furthermore, the numerical solution is able to calculate recombination losses for arbitrary pulse durations in good agreement with the experimental data, an aspect not covered by current theory. Overall, the presented numerical solution of the charge transport model should provide a more flexible tool to describe volume recombination for high dose-per-pulse values as well as for arbitrary pulse durations and repetition rates.

  13. The Complete Local Volume Groups Sample - I. Sample selection and X-ray properties of the high-richness subsample

    Science.gov (United States)

    O'Sullivan, Ewan; Ponman, Trevor J.; Kolokythas, Konstantinos; Raychaudhury, Somak; Babul, Arif; Vrtilek, Jan M.; David, Laurence P.; Giacintucci, Simona; Gitti, Myriam; Haines, Chris P.

    2017-12-01

    We present the Complete Local-Volume Groups Sample (CLoGS), a statistically complete optically selected sample of 53 groups within 80 Mpc. Our goal is to combine X-ray, radio and optical data to investigate the relationship between member galaxies, their active nuclei and the hot intra-group medium (IGM). We describe sample selection, define a 26-group high-richness subsample of groups containing at least four optically bright (log LB ≥ 10.2 LB⊙) galaxies, and report the results of XMM-Newton and Chandra observations of these systems. We find that 14 of the 26 groups are X-ray bright, possessing a group-scale IGM extending at least 65 kpc and with luminosity >10⁴¹ erg s⁻¹, while a further three groups host smaller galaxy-scale gas haloes. The X-ray bright groups have masses in the range M500 ≃ 0.5-5 × 10¹³ M⊙, based on system temperatures of 0.4-1.4 keV, and X-ray luminosities in the range 2-200 × 10⁴¹ erg s⁻¹. We find that ∼53-65 per cent of the X-ray bright groups have cool cores, a somewhat lower fraction than found by previous archival surveys. Approximately 30 per cent of the X-ray bright groups show evidence of recent dynamical interactions (mergers or sloshing), and ∼35 per cent of their dominant early-type galaxies host active galactic nuclei with radio jets. We find no groups with unusually high central entropies, as predicted by some simulations, and confirm that CLoGS is in principle capable of detecting such systems. We identify three previously unrecognized groups, and find that they are either faint (LX, R500 < 10⁴² erg s⁻¹) with no concentrated cool core, or highly disturbed. This leads us to suggest that ∼20 per cent of X-ray bright groups in the local universe may still be unidentified.

  14. Out-of-Pocket Expenditures on Complementary Health Approaches Associated with Painful Health Conditions in a Nationally Representative Adult Sample

    Science.gov (United States)

    Nahin, Richard L.; Stussman, Barbara J.; Herman, Patricia M.

    2015-01-01

    National surveys suggest that millions of adults in the United States use complementary health approaches such as acupuncture, chiropractic manipulation, and herbal medicines to manage painful conditions such as arthritis, back pain and fibromyalgia. Yet, national and per person out-of-pocket (OOP) costs attributable to this condition-specific use are unknown. In the 2007 National Health Interview Survey, use of complementary health approaches, reasons for this use, and associated OOP costs were captured in a nationally representative sample of 5,467 adults. Ordinary least square regression models that controlled for co-morbid conditions were used to estimate aggregate and per person OOP costs associated with 14 painful health conditions. Individuals using complementary approaches spent a total of $14.9 billion (S.E. $0.9 billion) OOP on these approaches to manage these painful conditions. Total OOP expenditures seen in those using complementary approaches for their back pain ($8.7 billion, S.E. $0.8 billion) far outstripped that of any other condition, with the majority of these costs ($4.7 billion, S.E. $0.4 billion) resulting from visits to complementary providers. Annual condition-specific per-person OOP costs varied from a low of $568 (SE $144) for regular headaches, to a high of $895 (SE $163) for fibromyalgia. PMID:26320946

  15. Instrumental measurement of iridium abundances in the part-per-trillion range following neutron activation

    International Nuclear Information System (INIS)

    Alvarez, L.W.; Asaro, F.; Goulding, F.S.; Landis, D.A.; Madden, N.W.; Malone, D.F.

    1988-01-01

    An automated gamma-ray coincidence spectrometer has been constructed which, following neutron activation, can measure iridium (Ir) abundances of the order of 25 parts-per-trillion (ppt) in rock samples 500 times more rapidly than previously possible by instrumental techniques used at the Lawrence Berkeley Laboratory. Twin intrinsic Ge gamma-ray detectors count coincidences between the 316.5 and 468.1 keV gamma rays of ¹⁹²Ir, and together with a mineral-oil-based Compton suppression shield provide a sensitivity of 50 ppt Ir in 7 minute measurements of 100 mg limestone samples subsequent to irradiation in the University of Missouri reactor. Over 3000 samples have been measured, and in collaboration with many geologists and paleontologists from around the world, anomalous amounts of Ir have been detected in rocks with approximate ages of 12, 39, 67, 91, 150 and 3500 million years. Modifications are nearly complete to measure ten other elements very important to geochemical studies simultaneously (in the singles rather than the coincidence mode) with the Ir measurements

  16. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached into a cassette); the airflow through the filter is 800 m³/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1–10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too

  17. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Ling-Yu Duan

    2010-01-01

    Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.

  18. Per-Sample Multiple Kernel Approach for Visual Concept Learning

    Directory of Open Access Journals (Sweden)

    Tian Yonghong

    2010-01-01

    Learning visual concepts from images is an important yet challenging problem in computer vision and multimedia research areas. Multiple kernel learning (MKL) methods have shown great advantages in visual concept learning. As a visual concept often exhibits great appearance variance, a canonical MKL approach may not generate satisfactory results when a uniform kernel combination is applied over the input space. In this paper, we propose a per-sample multiple kernel learning (PS-MKL) approach to take into account intraclass diversity for improving discrimination. PS-MKL determines sample-wise kernel weights according to kernel functions and training samples. Kernel weights as well as kernel-based classifiers are jointly learned. For efficient learning, PS-MKL employs a sample selection strategy. Extensive experiments are carried out over three benchmarking datasets of different characteristics including Caltech101, WikipediaMM, and Pascal VOC'07. PS-MKL has achieved encouraging performance, comparable to the state of the art, which has outperformed a canonical MKL.
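
    The core idea, a kernel combination whose mixing weights vary per sample rather than being shared across the whole input space, can be sketched as below. This is not the PS-MKL solver from the paper: the per-sample weights here are a fixed heuristic placeholder (PS-MKL learns them jointly with the classifier), and a precomputed-kernel SVM stands in for the kernel-based classifier.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.metrics.pairwise import linear_kernel, rbf_kernel
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=300, n_features=20, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      center = X_tr.mean(axis=0)
      scale = np.linalg.norm(X_tr - center, axis=1).max()

      def per_sample_weights(A):
          # Heuristic placeholder: weight the RBF kernel more for samples far
          # from the training mean; PS-MKL would learn these weights instead.
          w_rbf = np.clip(np.linalg.norm(A - center, axis=1) / scale, 0.0, 1.0)
          B = np.stack([1.0 - w_rbf, w_rbf], axis=1)          # (n_samples, n_kernels)
          return B / np.linalg.norm(B, axis=1, keepdims=True)

      def combined_kernel(A, B_pts, wA, wB):
          # K(i, j) = sum_m wA[i, m] * wB[j, m] * K_m(i, j): a per-sample mixture.
          Ks = [linear_kernel(A, B_pts), rbf_kernel(A, B_pts, gamma=0.05)]
          return sum(np.outer(wA[:, m], wB[:, m]) * Ks[m] for m in range(len(Ks)))

      wa, wb = per_sample_weights(X_tr), per_sample_weights(X_te)
      clf = SVC(kernel="precomputed").fit(combined_kernel(X_tr, X_tr, wa, wa), y_tr)
      print("test accuracy:", clf.score(combined_kernel(X_te, X_tr, wb, wa), y_te))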

  19. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.

    2009-11-01

    Direct volume rendering and isosurfacing are ubiquitous rendering techniques in scientific visualization, commonly employed in imaging 3D data from simulation and scan sources. Conventionally, these methods have been treated as separate modalities, necessitating different sampling strategies and rendering algorithms. In reality, an isosurface is a special case of a transfer function, namely a Dirac impulse at a given isovalue. However, artifact-free rendering of discrete isosurfaces in a volume rendering framework is an elusive goal, requiring either infinite sampling or smoothing of the transfer function. While preintegration approaches solve the most obvious deficiencies in handling sharp transfer functions, artifacts can still result, limiting classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches the frequency of the image plane, resulting in fewer artifacts near the eye and better overall performance. These techniques exhibit clear advantages over standard uniform ray casting with and without preintegration, and allow for high-quality interactive volume rendering with sharp C0 transfer functions. © 2009 IEEE.
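
    The peak-finding idea, solving explicitly for the isovalue crossing between samples instead of hoping a sample lands on a Dirac-like transfer function, can be illustrated along a single ray (a toy sketch, not the paper's renderer):

      # One-ray sketch: between consecutive samples that bracket the isovalue,
      # solve for the crossing (linear model here) and shade the isosurface there.
      import numpy as np

      def shade(t, value):                     # stand-in for lighting/classification
          return np.array([value, 0.2, 1.0 - value]) * np.exp(-0.5 * t)

      def raycast_with_peaks(field, t0, t1, n_samples, isovalue):
          ts = np.linspace(t0, t1, n_samples)
          vs = field(ts)
          color = np.zeros(3)
          for (ta, va), (tb, vb) in zip(zip(ts, vs), zip(ts[1:], vs[1:])):
              if (va - isovalue) * (vb - isovalue) <= 0 and va != vb:
                  # Solve for the isovalue crossing inside the interval.
                  t_hit = ta + (isovalue - va) / (vb - va) * (tb - ta)
                  return shade(t_hit, isovalue)          # first hit wins (opaque surface)
              color += shade(ta, va) * (tb - ta) * 0.1   # ordinary emission/absorption term
          return color

      field = lambda t: np.sin(3.0 * t) * np.exp(-0.3 * t)   # toy scalar field along the ray
      print(raycast_with_peaks(field, 0.0, 6.0, n_samples=16, isovalue=0.35))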

  20. Brand Medications and Medicare Part D: How Eye Care Providers' Prescribing Patterns Influence Costs.

    Science.gov (United States)

    Newman-Casey, Paula Anne; Woodward, Maria A; Niziol, Leslie M; Lee, Paul P; De Lott, Lindsey B

    2018-03-01

    To quantify costs of eye care providers' Medicare Part D prescribing patterns for ophthalmic medications and to estimate the potential savings of generic or therapeutic drug substitutions and price negotiation. Retrospective cross-sectional study. Eye care providers prescribing medications through Medicare Part D in 2013. Medicare Part D 2013 prescriber public use file and summary file were used to calculate medication costs by physician specialty and drug. Savings from generic or therapeutic drug substitutions were estimated for brand drugs. The potential savings from price negotiation was estimated using drug prices negotiated by the United States Veterans Administration (USVA). Total cost of brand and generic medications prescribed by eye care providers. Eye care providers accounted for $2.4 billion in total Medicare Part D prescription drug costs and generated the highest percentage of brand name medication claims compared with all other providers. Brand medications accounted for a significantly higher proportion of monthly supplies by volume, and therefore also of total cost, for eye care providers compared with all other providers (38% vs. 23% by volume). The total cost attributable to eye care providers is driven by glaucoma medications, accounting for $1.2 billion (54% of total cost; 72% of total volume). The second costliest category, dry eye medications, was attributable mostly to a single medication, cyclosporine ophthalmic emulsion (Restasis, Allergan, Irvine, CA), which has no generic alternative, accounting for $371 million (17% of total cost; 4% of total volume). If generic medications were substituted for brand medications when available, $148 million would be saved (7% savings); if generic and therapeutic substitutions were made, $882 million would be saved (42% savings). If Medicare negotiated the prices for ophthalmic medications at USVA rates, $1.09 billion would be saved (53% savings). Eye care providers prescribe more brand...

  1. Results of laboratory testing for diphacinone in seawater, fish, invertebrates, and soil following aerial application of rodenticide on Lehua Island, Kauai County, Hawaii, January 2009

    Science.gov (United States)

    Orazio, Carl E.; Tanner, Michael J.; Swenson, Chris; Herod, Jeffrey J.; Dunlevy, Peter; Gale, Robert W.

    2009-01-01

    In January 2009, rodenticide bait (Ramik Green pellets) containing the active ingredient diphacinone was aerially applied to Lehua Island. Reported herein are the results of laboratory analyses to determine diphacinone concentrations in samples of seawater, fillet of fish, soft tissue of limpets (opihi), whole-body crabs, and soil collected from Lehua Island, Kauai County, Hawaii, after aerial application of the rodenticide bait. Diphacinone was specifically chosen because of its low toxicity to nontarget organisms. Its use on Lehua Island is the second time it has ever been used for an aerial application to eradicate rodents. Testing of the Lehua Island samples for diphacinone utilized high-performance liquid chromatography with photodiode array detection. No detectable concentrations of diphacinone were found in any of the samples from Lehua Island. The limits of detection for diphacinone were 0.4 nanograms per milliliter (parts per billion) seawater, 15 nanograms per gram (dry weight) soil, 20 nanograms per gram (parts per billion) fish fillet, 13 nanograms per gram whole crab, and 34 nanograms per gram soft tissue limpet.

  2. Capillary ion chromatography with on-column focusing for ultra-trace analysis of methanesulfonate and inorganic anions in limited volume Antarctic ice core samples.

    Science.gov (United States)

    Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett

    2015-08-28

    Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Studies have until now focused on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of all of the major ionic species in ice core samples is typically carried out by ion chromatography. Former methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography-based analytical method has been developed for quantitative analysis of limited-volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300 μL per analysis, allowing for triplicate sample analysis. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high-resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Detector Sampling of Optical/IR Spectra: How Many Pixels per FWHM?

    Science.gov (United States)

    Robertson, J. Gordon

    2017-08-01

    Most optical and IR spectra are now acquired using detectors with finite-width pixels in a square array. Each pixel records the received intensity integrated over its own area, and pixels are separated by the array pitch. This paper examines the effects of such pixellation, using computed simulations to illustrate the effects which most concern the astronomer end-user. It is shown that coarse sampling increases the random noise errors in wavelength by typically 10-20 % at 2 pixels per Full Width at Half Maximum, but with wide variation depending on the functional form of the instrumental Line Spread Function (i.e. the instrumental response to a monochromatic input) and on the pixel phase. If line widths are determined, they are even more strongly affected at low sampling frequencies. However, the noise in fitted peak amplitudes is minimally affected by pixellation, with increases less than about 5%. Pixellation has a substantial but complex effect on the ability to see a relative minimum between two closely spaced peaks (or relative maximum between two absorption lines). The consistent scale of resolving power presented by Robertson to overcome the inadequacy of the Full Width at Half Maximum as a resolution measure is here extended to cover pixellated spectra. The systematic bias errors in wavelength introduced by pixellation, independent of signal/noise ratio, are examined. While they may be negligible for smooth well-sampled symmetric Line Spread Functions, they are very sensitive to asymmetry and high spatial frequency sub-structure. The Modulation Transfer Function for sampled data is shown to give a useful indication of the extent of improperly sampled signal in an Line Spread Function. The common maxim that 2 pixels per Full Width at Half Maximum is the Nyquist limit is incorrect and most Line Spread Functions will exhibit some aliasing at this sample frequency. While 2 pixels per Full Width at Half Maximum is nevertheless often an acceptable minimum for
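
    The systematic (noise-independent) position bias discussed above can be reproduced with a toy experiment: a Gaussian Line Spread Function is integrated over finite pixels at 2 pixels per Full Width at Half Maximum and its position is recovered with a 3-point parabolic peak fit, for several pixel phases. The choice of a parabolic estimator is ours, for illustration only.

      import numpy as np
      from scipy.special import erf

      FWHM_PIX = 2.0
      sigma = FWHM_PIX / (2.0 * np.sqrt(2.0 * np.log(2.0)))

      def pixel_integrated_gaussian(centers, line_pos):
          # Integral of a unit-area Gaussian over each pixel [c - 0.5, c + 0.5].
          a = (centers - 0.5 - line_pos) / (np.sqrt(2) * sigma)
          b = (centers + 0.5 - line_pos) / (np.sqrt(2) * sigma)
          return 0.5 * (erf(b) - erf(a))

      centers = np.arange(-8, 9, dtype=float)            # pixel grid
      for phase in np.linspace(-0.5, 0.5, 5):            # line position within a pixel
          flux = pixel_integrated_gaussian(centers, phase)
          k = int(np.argmax(flux))
          ym, y0, yp = flux[k - 1], flux[k], flux[k + 1]
          est = centers[k] + 0.5 * (ym - yp) / (ym - 2.0 * y0 + yp)   # parabolic peak
          print(f"phase {phase:+.2f} px -> systematic error {est - phase:+.4f} px")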

  4. Tendances Carbone no. 75 'The CDM: let's not discard a tool that raised over US$ 200 billion'

    International Nuclear Information System (INIS)

    Shishlov, Igor

    2012-01-01

    Among the publications of CDC Climat Research, 'Tendances Carbone' bulletin specifically studies the developments of the European market for CO₂ allowances. This issue addresses the following points: Everyone wonders which miraculous instrument will enable the Green Climate Fund to mobilize the pledged US$100 billion per year in climate finance by 2020. Developing countries are now asking for interim targets to quench their mounting skepticism that this level of commitment can be reached. In the meantime, paradoxically, the Clean Development Mechanism (CDM) - a tool that managed to leverage over US$200 billion of mostly private investment for climate change mitigation - is left dying without much regret

  5. The billion-mulmod-per-second PC

    NARCIS (Netherlands)

    Bernstein, D.J.; Chen, H.-C.; Chen, M.S.; Cheng, C.M.; Hsiao, C.H.; Lange, T.; Lin, Z.C.; Yang, B.Y.

    2009-01-01

    This paper sets new speed records for ECM, the elliptic-curve method of factorization, on several different hardware platforms: GPUs (specifically the NVIDIA GTX), x86 CPUs with SSE2 (specifically the Intel Core 2 and the AMD Phenom), and the Cell (specifically the PlayStation 3 and the PowerXCell

  6. Uranium in Canada: Billion-dollar industry

    International Nuclear Information System (INIS)

    Whillans, R.T.

    1989-01-01

    In 1988, Canada maintained its position as the world's leading producer and exporter of uranium; five primary uranium producers reported concentrate output containing 12,400 MT of uranium, or about one-third of Western production. Uranium shipments made by these producers in 1988 exceeded 13,200 MT, worth Canadian $1.1 billion. Because domestic requirements represent only 15% of current Canadian output, most of Canada's uranium production is available for export. Despite continued market uncertainty in 1988, Canada's uranium producers signed new sales contracts for some 14,000 MT, twice the 1987 level. About 90% of this new volume is with the US, now Canada's major uranium customer. The recent implementation of the Canada/US Free Trade agreement brings benefits to both countries; the uranium industries in each can now develop in an orderly, free market. Canada's uranium industry was restructured and consolidated in 1988 through merger and acquisition; three new uranium projects advanced significantly. Canada's new policy on nonresident ownership in the uranium mining sector, designed to encourage both Canadian and foreign investment, should greatly improve efforts to finance the development of recent Canadian uranium discoveries

  7. Optimized Cost per Click in Taobao Display Advertising

    OpenAIRE

    Zhu, Han; Jin, Junqi; Tan, Chang; Pan, Fei; Zeng, Yifan; Li, Han; Gai, Kun

    2017-01-01

    Taobao, as the largest online retail platform in the world, provides billions of online display advertising impressions for millions of advertisers every day. For commercial purposes, the advertisers bid for specific spots and target crowds to compete for business traffic. The platform chooses the most suitable ads to display in tens of milliseconds. Common pricing methods include cost per mille (CPM) and cost per click (CPC). Traditional advertising systems target certain traits of users and...

  8. Effects of biofertilizers and different water volume per irrigation on vegetative characteristics and seed yield of sesame (Sesamum indicum L.

    Directory of Open Access Journals (Sweden)

    S. Khorramdel

    2016-05-01

    In order to study the effects of biofertilizers and different water volumes per irrigation on vegetative characteristics and seed yield of sesame (Sesamum indicum L.), an experiment was conducted at the Research Greenhouse of the Faculty of Agriculture, Ferdowsi University of Mashhad, during 2009. The experiment was conducted as a factorial based on a randomized complete block design with three replications. The first and second factors were biofertilizer (Nitragin, Nitroxin, bio-phosphorus and control) and water volume per irrigation (100, 200 and 300 ml), respectively. The results showed that the simple effects of biofertilizer and irrigation volume were significant (p≤0.05) on plant height, first internode length, number and dry weight of leaves, dry weight of stem, chlorophyll content and relative water content (RWC) of sesame. The interaction between biofertilizer and water volume per irrigation was also significant (p≤0.05) for plant height and RWC. The maximum and minimum sesame seed yields were observed for Nitragin and the control, with 204.4 and 100.0 kg m⁻², respectively. The highest seed yield was observed with 100 ml (202.1 kg m⁻²) and the lowest with 300 ml (170.1 kg m⁻²) of irrigation water. Application of biofertilizers enhanced root development and hence the availability of moisture and nutrients, particularly nitrogen and phosphorus; these fertilizers also promote growth regulators that underpin the growth and photosynthesis of sesame. With increasing irrigation volume from 100 to 300 ml, the growth of sesame decreased. Therefore, the application of biofertilizers to sesame could improve its vegetative characteristics in dry and semi-arid regions.

  9. A Volume-Limited Sample of L and T Dwarfs Defined by Parallaxes

    Science.gov (United States)

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene; Dupuy, Trent

    2018-01-01

    Volume-limited samples are the gold standard for stellar population studies, as they enable unbiased measurements of space densities and luminosity functions. Parallaxes are the most direct measures of distance and are therefore essential for defining high-confidence volume-limited samples. Previous efforts to model the local brown dwarf population were hampered by samples based on a small number of parallaxes. We are using UKIRT/WFCAM to conduct the largest near-infrared program to date to measure parallaxes and proper motions of L and T dwarfs. For the past 3+ years we have monitored over 350 targets, ≈90% of which are too faint to be observed by Gaia. We present preliminary results from our observations. Our program more than doubles the number of known L and T dwarf parallaxes, defining a volume-limited sample of ≈400 L0-T6 dwarfs out to 25 parsecs, the first L and T dwarf sample of this size and depth based entirely on parallaxes. Our sample will combine with the upcoming stellar census from Gaia DR2 parallaxes to form a complete volume-limited sample of nearby stars and brown dwarfs.
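
    Because the sample above is defined by parallaxes, the selection reduces to the standard relation d[pc] = 1/parallax[arcsec]. The sketch below applies a 25-parsec cut to a few made-up parallax values; it is illustrative only and uses none of the survey's data.

```python
# Minimal sketch: build a volume-limited subsample from parallaxes.
# The distance-parallax relation d[pc] = 1/parallax[arcsec] is standard;
# the target names and parallax values below are invented for illustration.

def parallax_to_distance_pc(parallax_mas: float) -> float:
    """Convert a parallax in milliarcseconds to a distance in parsecs."""
    return 1000.0 / parallax_mas

targets = {"dwarf_A": 52.1, "dwarf_B": 31.7, "dwarf_C": 88.4}   # parallaxes in mas

volume_limited = {
    name: parallax_to_distance_pc(plx)
    for name, plx in targets.items()
    if parallax_to_distance_pc(plx) <= 25.0    # keep objects within 25 pc
}
print(volume_limited)   # dwarf_A (~19 pc) and dwarf_C (~11 pc) survive the cut
```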

  10. Determination of sub-parts per billion levels of copper in complex matrices by adsorptive stripping voltammetry on a mercury electrode

    International Nuclear Information System (INIS)

    Shahbaazi, H.R.; Shahbaazi, H.R.; Safavi, A.; Maleki, N.

    2008-01-01

    The voltammetric characteristics of the Cu(II)-SSA complex at the mercury electrode were investigated. An analytical method based on the adsorptive accumulation of the Cu(II)-SSA complex followed by reduction of the complexed copper was developed for the determination of copper in complex matrices in the presence of large amounts of foreign ions. Adsorptive voltammetric measurements showed that the Cu(II)-SSA complex behaves irreversibly, exchanging two electrons on the hanging mercury drop electrode (HMDE). Factors affecting the complexation, accumulation and stripping steps were studied and a procedure was developed. The instrumental parameters in the measurement step were also tested. The optimum pH, SSA concentration, accumulation potential and accumulation time were studied. Under optimal conditions (pH 12.9 glycine buffer, 7 x 10^-3 M SSA and accumulation potential -100 mV vs. Ag/AgCl), a linear calibration graph in the range 1.25 to 42.5 μg L^-1 and a detection limit of 0.8 μg L^-1 were obtained. The average relative standard deviations (RSD) for seven determinations were 7%, 5.5% and 3% for concentrations of 3, 15 and 23 μg L^-1, respectively. The effect of foreign ions and surfactants on the peak height of the Cu(II)-SSA complex was evaluated. The method was applied to the determination of copper in different real samples such as crude oil, crude oil tank bottom sludge, waste water and tap water. The accuracy of the results was checked by ICP. (author)
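
    The figures of merit quoted in this record (linear range, detection limit, RSD) follow from a standard linear-calibration treatment. The sketch below shows the common 3σ/slope detection-limit estimate; the calibration points and blank noise are invented, not the authors' measurements.

```python
# Sketch of a linear calibration and a 3*sigma/slope detection-limit estimate,
# as commonly used in stripping voltammetry. All data points are invented.
import numpy as np

conc = np.array([1.25, 5.0, 10.0, 20.0, 42.5])   # Cu(II), micrograms per litre
peak = np.array([0.9, 3.6, 7.1, 14.3, 30.2])     # peak current, arbitrary units

slope, intercept = np.polyfit(conc, peak, 1)     # least-squares calibration line
sigma_blank = 0.2                                # std. dev. of the blank signal (assumed)
lod = 3.0 * sigma_blank / slope                  # detection limit, micrograms per litre

print(f"slope = {slope:.3f}, intercept = {intercept:.3f}, LOD ~ {lod:.2f} ug/L")
```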

  11. Molecular dynamics beyond the limits: Massive scaling on 72 racks of a BlueGene/P and supercooled glass dynamics of a 1-billion-particle system

    KAUST Repository

    Allsopp, Nicholas

    2012-04-01

    We report scaling results on the world's largest supercomputer of our recently developed Billions-Body Molecular Dynamics (BBMD) package, which was especially designed for massively parallel simulations of the short-range atomic dynamics in structural glasses and amorphous materials. The code was able to scale up to 72 racks of an IBM BlueGene/P, with a measured 89% efficiency for a system with 100 billion particles. The code speed, 0.13 s per iteration in the case of 1 billion particles, paves the way to the study of billion-body structural glasses with a resolution increase of two orders of magnitude with respect to the largest simulation ever reported. We demonstrate the effectiveness of our code by studying the liquid-glass transition of an exceptionally large system made of a binary mixture of 1 billion particles. © 2012.
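
    The 89% figure is a strong-scaling efficiency: measured speedup relative to a reference partition divided by the ideal speedup. A minimal sketch of that calculation, using placeholder timings rather than the paper's measurements, follows.

```python
# How a strong-scaling efficiency figure like "89%" is typically computed:
# speedup relative to a reference run, divided by the core-count ratio.
# The timings and core counts below are placeholders, not the paper's data.

def parallel_efficiency(t_ref: float, cores_ref: int, t_n: float, cores_n: int) -> float:
    speedup = t_ref / t_n                 # how much faster the larger run is
    ideal_speedup = cores_n / cores_ref   # what perfect scaling would give
    return speedup / ideal_speedup

eff = parallel_efficiency(t_ref=10.0, cores_ref=4096, t_n=0.70, cores_n=65536)
print(f"efficiency = {eff:.0%}")          # ~89% with these made-up timings
```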

  12. Amplification volume reduction on DNA database samples using FTA™ Classic Cards.

    Science.gov (United States)

    Wong, Hang Yee; Lim, Eng Seng Simon; Tan-Siew, Wai Fun

    2012-03-01

    The DNA forensic community always strives towards improvements in aspects such as sensitivity, robustness, and efficacy balanced with cost efficiency. Our laboratory therefore decided to study the feasibility of reducing the PCR amplification volume for DNA entrapped in FTA™ Classic Cards and thereby bring cost savings to the laboratory. There were a few concerns the laboratory needed to address. First, the kinetics of the amplification reaction could be significantly altered. Second, an increase in sensitivity might affect interpretation due to increased stochastic effects, even though these were pristine samples. Third, static might cause FTA punches to jump out of their allocated wells into others, causing sample-to-sample contamination. Fourth, the size of the punches might be too small for visual inspection. Last, there would be a limit to the extent of volume reduction due to evaporation and the possible need to re-inject samples for capillary electrophoresis. The laboratory successfully optimized a reduced amplification volume of 10 μL for FTA samples. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  13. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy

    2014-09-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint-satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.
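
    The core idea of sampling directly inside the region a linkage can reach, rather than rejecting invalid guesses, can be illustrated on the simplest possible case: a planar two-link arm whose end effector reaches an annulus with radii |l1 - l2| and l1 + l2. The toy sketch below samples that annulus directly; it only illustrates the concept and is not the authors' reachable-volume algorithm.

```python
# Toy illustration of "reachable volume" sampling for a planar 2-link arm.
# The end effector can reach an annulus with inner radius |l1 - l2| and
# outer radius l1 + l2; sampling that annulus directly avoids rejection.
# Conceptual sketch only, not the method from the paper.
import math
import random

def sample_reachable_point(l1: float, l2: float) -> tuple:
    r_min, r_max = abs(l1 - l2), l1 + l2
    # Sample the radius with density proportional to r so points are uniform in area.
    r = math.sqrt(random.uniform(r_min ** 2, r_max ** 2))
    theta = random.uniform(0.0, 2.0 * math.pi)
    return (r * math.cos(theta), r * math.sin(theta))

random.seed(0)
points = [sample_reachable_point(1.0, 0.4) for _ in range(5)]
print(points)   # every point is guaranteed to be reachable by the arm
```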

  14. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint-satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  15. Winglets Save Billions of Dollars in Fuel Costs

    Science.gov (United States)

    2010-01-01

    The upturned ends now featured on many airplane wings are saving airlines billions of dollars in fuel costs. Called winglets, the drag-reducing technology was advanced through the research of Langley Research Center engineer Richard Whitcomb and through flight tests conducted at Dryden Flight Research Center. Seattle-based Aviation Partners Boeing -- a partnership between Aviation Partners Inc., of Seattle, and The Boeing Company, of Chicago -- manufactures Blended Winglets, a unique design featured on Boeing aircraft around the world. These winglets have saved more than 2 billion gallons of jet fuel to date, representing a cost savings of more than $4 billion and a reduction of almost 21.5 million tons in carbon dioxide emissions.

  16. Proton collider breaks the six-billion-dollar barrier

    CERN Multimedia

    Vaughan, C

    1990-01-01

    The SSC will cost at least $1 billion more than its estimated final price of $5.9 billion. Critics in Congress believe the final bill could be double that figure. The director of the SSC blames most of the increase in cost on technical problems with developing the superconducting magnets for the SSC (1/2 page).

  17. Chemical thermodynamics of iron - Part 1 - Chemical thermodynamics volume 13a

    International Nuclear Information System (INIS)

    Lemire, Robert J.; Berner, Urs; Musikas, Claude; Palmer, Donald A.; Taylor, Peter; Tochiyama, Osamu; Perrone, Jane

    2013-01-01

    Volume 13a of the 'Chemical Thermodynamics' (TDB) series, is the first of two volumes describing the selection of chemical thermodynamic data for species of iron. Because of the voluminous information in the literature, it has been more efficient to prepare the review in two (unequal) parts. This larger first part contains assessments of data for the metal, simple ions, aqueous hydroxido, chlorido, sulfido, sulfato and carbonato complexes, and for solid oxides and hydroxides, halides, sulfates, carbonates and simple silicates. The second part will provide assessments of data for other aqueous halido species, sulfide solids, and solid and solution species with nitrate, phosphate and arsenate, as well as some aspects of solid solutions in iron-oxide and iron-sulfide systems. The database system developed at the OECD/NEA Data Bank ensures consistency not only within the recommended data sets of iron, but also among all the data sets published in the series. This volume will be of particular interest to scientists carrying out performance assessments of deep geological disposal sites for radioactive waste

  18. Origins fourteen billion years of cosmic evolution

    CERN Document Server

    Tyson, Neil deGrasse

    2004-01-01

    Origins explores cosmic science's stunning new insights into the formation and evolution of our universe--of the cosmos, of galaxies and galaxy clusters, of stars within galaxies, of planets that orbit those stars, and of different forms of life that take us back to the first three seconds and forward through three billion years of life on Earth to today's search for life on other planets. Drawing on the current cross-pollination of geology, biology and astrophysics, Origins explains the thrilling daily breakthroughs in our knowledge of the universe from dark energy to life on Mars to the mysteries of space and time. Distilling complex science in clear and lively prose, co-authors Neil deGrasse Tyson and Donald Goldsmith conduct a galvanising tour of the cosmos revealing what the universe has been up to while turning part of itself into us.

  19. Sample to moderator volume ratio effects in neutron yield from a PGNAA setup

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, A.A. [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia)]. E-mail: aanaqvi@kfupm.edu.sa; Fazal-ur-Rehman [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia); Nagadi, M.M. [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia); Khateeb-ur-Rehman [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia)

    2007-02-15

    The performance of a prompt gamma ray neutron activation analysis (PGNAA) setup depends upon the thermal neutron yield at the PGNAA sample location. For a moderator that encloses a sample, the thermal neutron intensity depends upon the effective moderator volume, excluding the void volume occupied by the sample. A rectangular moderator assembly has been designed for the King Fahd University of Petroleum and Minerals (KFUPM) PGNAA setup. The thermal and fast neutron yields have been measured inside the sample cavity as a function of its front moderator thickness using the alpha particle track density and recoil proton track density inside CR-39 nuclear track detectors (NTDs). The thermal/fast neutron yield ratio, obtained from the ratio of alpha particle track density to proton track density in the NTDs, shows an inverse correlation with the sample-to-moderator volume ratio. Comparison of the present results with previously published results for smaller moderators of the KFUPM PGNAA setup confirms this observation.

  20. Sub-parts per million NO2 chemi-transistor sensors based on composite porous silicon/gold nanostructures prepared by metal-assisted etching.

    Science.gov (United States)

    Sainato, Michela; Strambini, Lucanos Marsilio; Rella, Simona; Mazzotta, Elisabetta; Barillaro, Giuseppe

    2015-04-08

    Surface doping of nano/mesostructured materials with metal nanoparticles to promote and optimize chemi-transistor sensing performance represents the most advanced research trend in the field of solid-state chemical sensing. In spite of the promising results emerging from metal-doping of a number of nanostructured semiconductors, its applicability to silicon-based chemi-transistor sensors has been hindered so far by the difficulties in integrating the composite metal-silicon nanostructures using the complementary metal-oxide-semiconductor (CMOS) technology. Here we propose a facile and effective top-down method for the high-yield fabrication of chemi-transistor sensors making use of composite porous silicon/gold nanostructures (cSiAuNs) acting as sensing gate. In particular, we investigate the integration of cSiAuNs synthesized by metal-assisted etching (MAE), using gold nanoparticles (NPs) as catalyst, in solid-state junction-field-effect transistors (JFETs), aimed at the detection of NO2 down to 100 parts per billion (ppb). The chemi-transistor sensors, namely cSiAuJFETs, are CMOS compatible, operate at room temperature, and are reliable, sensitive, and fully recoverable for the detection of NO2 at concentrations between 100 and 500 ppb, up to 48 h of continuous operation.

  1. Lean Production Control at a High-Variety, Low-Volume Parts Manufacturer

    NARCIS (Netherlands)

    Bokhorst, Jos A. C.; Slomp, Jannes

    2010-01-01

    Eaton Electric General Supplies, a parts manufacturing unit that supplies parts for Eaton's electrical business unit, implemented several lean control elements in its high-variety, low-volume production units. These control elements include a constant work-in-process mechanism to limit and control

  2. Per tree estimates with n-tree distance sampling: an application to increment core data

    Science.gov (United States)

    Thomas B. Lynch; Robert F. Wittwer

    2002-01-01

    Per tree estimates using the n trees nearest a point can be obtained by using a ratio of per unit area estimates from n-tree distance sampling. This ratio was used to estimate average age by d.b.h. classes for cottonwood trees (Populus deltoides Bartr. ex Marsh.) on the Cimarron National Grassland. Increment...

  3. Should measurement of maximum urinary flow rate and residual urine volume be a part of a "minimal care" assessment programme in female incontinence?

    DEFF Research Database (Denmark)

    Sander, Pia; Mouritsen, L; Andersen, J Thorup

    2002-01-01

    OBJECTIVE: The aim of this study was to evaluate the value of routine measurements of urinary flow rate and residual urine volume as a part of a "minimal care" assessment programme for women with urinary incontinence in detecting clinically significant bladder emptying problems. MATERIAL AND METHODS....... Twenty-six per cent had a maximum flow rate less than 15 ml/s, but only 4% at a voided volume ≥200 ml. Residual urine more than 149 ml was found in 6%. Two women had chronic retention with overflow incontinence. Both had typical symptoms with continuous leakage, stranguria and chronic cystitis...

  4. Rapid emergence of subaerial landmasses and onset of a modern hydrologic cycle 2.5 billion years ago.

    Science.gov (United States)

    Bindeman, I N; Zakharov, D O; Palandri, J; Greber, N D; Dauphas, N; Retallack, G J; Hofmann, A; Lackey, J S; Bekker, A

    2018-05-01

    The history of the growth of continental crust is uncertain, and several different models that involve a gradual, decelerating, or stepwise process have been proposed (refs 1-4). Even more uncertain is the timing and the secular trend of the emergence of most landmasses above the sea (subaerial landmasses), with estimates ranging from about one billion to three billion years ago (refs 5-7). The area of emerged crust influences global climate feedbacks and the supply of nutrients to the oceans (ref. 8), and therefore connects Earth's crustal evolution to surface environmental conditions (refs 9-11). Here we use the triple-oxygen-isotope composition of shales from all continents, spanning 3.7 billion years, to provide constraints on the emergence of continents over time. Our measurements show a stepwise total decrease of 0.08 per mille in the average triple-oxygen-isotope value of shales across the Archaean-Proterozoic boundary. We suggest that our data are best explained by a shift in the nature of water-rock interactions, from near-coastal in the Archaean era to predominantly continental in the Proterozoic, accompanied by a decrease in average surface temperatures. We propose that this shift may have coincided with the onset of a modern hydrological cycle owing to the rapid emergence of continental crust with near-modern average elevation and aerial extent roughly 2.5 billion years ago.

  5. Galaxy evolution. Evidence for mature bulges and an inside-out quenching phase 3 billion years after the Big Bang.

    Science.gov (United States)

    Tacchella, S; Carollo, C M; Renzini, A; Förster Schreiber, N M; Lang, P; Wuyts, S; Cresci, G; Dekel, A; Genzel, R; Lilly, S J; Mancini, C; Newman, S; Onodera, M; Shapley, A; Tacconi, L; Woo, J; Zamorani, G

    2015-04-17

    Most present-day galaxies with stellar masses ≥10^11 solar masses show no ongoing star formation and are dense spheroids. Ten billion years ago, similarly massive galaxies were typically forming stars at rates of hundreds of solar masses per year. It is debated how star formation ceased, on which time scales, and how this "quenching" relates to the emergence of dense spheroids. We measured stellar mass and star-formation rate surface density distributions in star-forming galaxies at redshift 2.2 with ~1-kiloparsec resolution. We find that, in the most massive galaxies, star formation is quenched from the inside out, on time scales less than 1 billion years in the inner regions, up to a few billion years in the outer disks. These galaxies sustain high star-formation activity at large radii, while hosting fully grown and already quenched bulges in their cores. Copyright © 2015, American Association for the Advancement of Science.

  6. The Origin of Amino Acids in Lunar Regolith Samples

    Science.gov (United States)

    Cook, Jamie E.; Callahan, Michael P.; Dworkin, Jason P.; Glavin, Daniel P.; McLain, Hannah L.; Noble, Sarah K.; Gibson, Everett K., Jr.

    2016-01-01

    We analyzed the amino acid content of seven lunar regolith samples returned by the Apollo 16 and Apollo 17 missions and stored under NASA curation since collection, using ultrahigh-performance liquid chromatography with fluorescence detection and time-of-flight mass spectrometry. Consistent with results from initial analyses shortly after collection in the 1970s, we observed amino acids at low concentrations in all of the curated samples, ranging from 0.2 parts-per-billion (ppb) to 42.7 ppb in hot-water extracts and 14.5 ppb to 651.1 ppb in 6M HCl acid-vapor-hydrolyzed, hot-water extracts. Amino acids identified in the Apollo soil extracts include glycine, D- and L-alanine, D- and L-aspartic acid, D- and L-glutamic acid, D- and L-serine, L-threonine, and L-valine, all of which had previously been detected in lunar samples, as well as several compounds not previously identified in lunar regoliths: α-aminoisobutyric acid (AIB), D- and L-β-amino-n-butyric acid (β-ABA), DL-α-amino-n-butyric acid, γ-amino-n-butyric acid, β-alanine, and ε-amino-n-caproic acid. We observed an excess of the L enantiomer in most of the detected proteinogenic amino acids, but racemic alanine and racemic β-ABA were present in some samples.

  7. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-11

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agricultural and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that intend to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or other

  8. Saving billions of dollars--and physicians' time--by streamlining billing practices.

    Science.gov (United States)

    Blanchfield, Bonnie B; Heffernan, James L; Osgood, Bradford; Sheehan, Rosemary R; Meyer, Gregg S

    2010-06-01

    The U.S. system of billing third parties for health care services is complex, expensive, and inefficient. Physicians end up using nearly 12 percent of their net patient service revenue to cover the costs of excessive administrative complexity. A single transparent set of payment rules for multiple payers, a single claim form, and standard rules of submission, among other innovations, would reduce the burden on the billing offices of physician organizations. On a national scale, our hypothetical modeling of these changes would translate into $7 billion of savings annually for physician and clinical services. Four hours of professional time per physician and five hours of practice support staff time could be saved each week.

  9. Plasma volume changes during hypoglycaemia: the effect of arterial blood sampling

    DEFF Research Database (Denmark)

    Hilsted, J; Bendtsen, F; Christensen, N J

    1990-01-01

    To investigate whether previously reported changes in venous blood volume and composition induced by acute hypoglycaemia in humans are representative of the entire body we measured erythrocyte 51Cr content, haematocrit, plasma volume, intravascular albumin content and transcapillary escape rate...... hypoglycaemia. The magnitude of the changes in arterial and venous blood was not significantly different. These results indicate that the above changes in blood volume and composition are whole-body phenomena: furthermore, the major part of the changes are likely to occur in tissues other than upper extremity...

  10. Analytical evaluation of BEA zeolite for the pre-concentration of polycyclic aromatic hydrocarbons and their subsequent chromatographic analysis in water samples.

    Science.gov (United States)

    Wilson, Walter B; Costa, Andréia A; Wang, Huiyong; Dias, José A; Dias, Sílvia C L; Campiglia, Andres D

    2012-07-06

    The analytical performance of BEA - a commercial zeolite - is evaluated for the pre-concentration of fifteen Environmental Protection Agency polycyclic aromatic hydrocarbons and their subsequent HPLC analysis in tap and lake water samples. The pre-concentration factors obtained with BEA have led to a method with excellent analytical figures of merit. One milliliter aliquots were sufficient to obtain excellent precision of measurements at the parts-per-trillion concentration level with relative standard deviations varying from 4.1% (dibenzo[a,h]anthracene) to 13.4% (pyrene). The limits of detection were excellent as well and varied between 1.1 ng L(-1) (anthracene) and 49.9 ng L(-1) (indeno[1,2,3-cd]pyrene). The recovery values of all the studied compounds meet the criterion for regulated polycyclic aromatic hydrocarbons, which mandates relative standard deviations equal to or lower than 25%. The small volume of organic solvents (100 μL per sample) and amount of BEA (2 mg per sample) make sample pre-concentration environmentally friendly and cost effective. The extraction procedure is well suited for numerous samples as the small working volume (1 mL) facilitates the implementation of simultaneous sample extraction. These are attractive features when routine monitoring of numerous samples is contemplated. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.; Hijazi, Y.; Westerteiger, R.; Schott, M.; Hansen, C.; Hagen, H.

    2009-01-01

    classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches
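
    A bare-bones version of solving for an isovalue along a ray is a marcher that watches for a sign change of f(t) - isovalue between successive samples and refines the hit by linear interpolation. The sketch below uses a stand-in scalar field; the paper's peak-finding scheme is more elaborate.

```python
# Minimal ray-marching sketch: look for an isovalue crossing between successive
# samples along a ray and refine the hit by linear interpolation. This only
# illustrates the idea of solving for isovalues inside the ray integral.
from typing import Optional

def sample_field(t: float) -> float:
    """Stand-in scalar field along the ray (a smooth bump centered at t = 2)."""
    return max(0.0, 1.0 - (t - 2.0) ** 2)

def find_isovalue_crossing(isovalue: float, t_max: float, dt: float) -> Optional[float]:
    """March along the ray; return the refined t of the first isovalue crossing."""
    t_prev, f_prev = 0.0, sample_field(0.0)
    t = dt
    while t <= t_max:
        f = sample_field(t)
        if (f_prev - isovalue) * (f - isovalue) < 0.0:   # sign change between samples
            w = (isovalue - f_prev) / (f - f_prev)       # linear refinement in the interval
            return t_prev + w * (t - t_prev)
        t_prev, f_prev = t, f
        t += dt
    return None

print(find_isovalue_crossing(isovalue=0.5, t_max=5.0, dt=0.25))   # ~1.29
```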

  12. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.

    Science.gov (United States)

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A

    2003-02-01

    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions that were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  13. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

    Full Text Available Texture-based volume rendering is a memory-intensive algorithm. Its performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory and result in incoherent memory access patterns, causing low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g. a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects the texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. Also, a pipelined color blending approach is introduced and the power of warp-level GPU operations is leveraged to improve the efficiency of parallel executions on the GPU. In addition, the rendering performance of the Warp Marching is view-independent, and it outperforms existing empty space skipping techniques in scenarios that need to render large dynamic volumes in a low-resolution image. Through a series of micro-benchmarking and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.
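
    The cache argument above hinges on which samples the threads of a warp take at the same instant. The sketch below contrasts, in plain Python, a ray-major ordering (each ray marched to completion) with a step-major, warp-style ordering (all rays in a tile advance one step together); in the latter, samples issued together lie close in volume space, which is the access pattern a texture cache favors. The orderings are conceptual only and involve no GPU code.

```python
# Illustrative comparison of two sample orderings for a small tile of rays.
# Ray-major: each ray is marched to completion before the next ray starts.
# Step-major (warp-style): all rays in the tile take step k together, so the
# samples issued at the same time are spatially close to one another.

RAYS_PER_TILE = 4    # stand-in for a warp of threads
STEPS_PER_RAY = 3

def ray_major():
    return [(ray, step) for ray in range(RAYS_PER_TILE)
                        for step in range(STEPS_PER_RAY)]

def step_major():
    return [(ray, step) for step in range(STEPS_PER_RAY)
                        for ray in range(RAYS_PER_TILE)]

print("ray-major :", ray_major())
print("step-major:", step_major())
```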

  14. Three Mile Island: a report to the commissioners and to the public. Volume II, Part 1

    International Nuclear Information System (INIS)

    1979-01-01

    This is part one of three parts of the second volume of the Special Inquiry Group's report to the Nuclear Regulatory Commission on the accident at Three Mile Island. The first volume contained a narrative description of the accident and a discussion of the major conclusions and recommendations. This second volume is divided into three parts. Part 1 of Volume II focuses on the pre-accident licensing and regulatory background. This part includes an examination of the overall licensing and regulatory system for nuclear powerplants viewed from different perspectives: the system as it is set forth in statutes and regulations, as described in Congressional testimony, and an overview of the system as it really works. In addition, Part 1 includes the licensing, operating, and inspection history of Three Mile Island Unit 2, discussions of relevant regulatory matters, a discussion of specific precursor events related to the accident, a case study of the pressurizer design issue, and an analysis of incentives to declare commercial operation

  15. 43 CFR Appendix A to Part 10 - Sample Summary

    Science.gov (United States)

    2010-10-01

    Appendix A to Part 10—Sample Summary (43 CFR, Public Lands: Interior, Repatriation Regulations, Pt. 10, App. A). The following is a generic sample and should be used as a guideline for preparation of summaries, tailoring the information to the...

  16. Volume holographic memory

    Directory of Open Access Journals (Sweden)

    Cornelia Denz

    2000-05-01

    Full Text Available Volume holography represents a promising alternative to existing storage technologies. Its parallel data storage leads to high capacities combined with short access times and high transfer rates. The design and realization of a compact volume holographic storage demonstrator are presented. The phase-coded multiplexing technique, implemented to superimpose many data pages in a single location, makes it possible to store up to 480 holograms per storage location without any moving parts. Results of analog and digital data storage are shown and real-time optical image processing is demonstrated.

  17. New complete sample of identified radio sources. Part 2. Statistical study

    International Nuclear Information System (INIS)

    Soltan, A.

    1978-01-01

    The complete sample of radio sources with known redshifts selected in Paper I is studied. Source counts in the sample and the luminosity-volume test show that both quasars and galaxies are subject to evolution. Luminosity functions for different ranges of redshifts are obtained. Due to many uncertainties, only simplified models of the evolution are tested. An exponential decline of the luminosity with time for all the bright sources is in good agreement both with the luminosity-volume test and the N(S) relation in the entire range of observed flux densities. It is shown that sources in the sample are randomly distributed on scales greater than about 17 Mpc. (author)

  18. 12 billion DM for Germany

    International Nuclear Information System (INIS)

    Anon.

    1975-01-01

    The German atomic industry has achieved its breakthrough onto the world market: Brazil orders eight nuclear electricity generating plants from the Siemens-AEG daughter Kraftwerk-Union. US concerns attacked the twelve billion DM deal, the biggest export order in the history of German industry. To no avail - the contract is to be signed in Bonn this week. (orig./LH) [de

  19. The Stanford-U.S. Geological Survey SHRIMP ion microprobe--a tool for micro-scale chemical and isotopic analysis

    Science.gov (United States)

    Bacon, Charles R.; Grove, Marty; Vazquez, Jorge A.; Coble, Matthew A.

    2012-01-01

    Answers to many questions in Earth science require chemical analysis of minute volumes of minerals, volcanic glass, or biological materials. Secondary Ion Mass Spectrometry (SIMS) is an extremely sensitive analytical method in which a 5–30 micrometer diameter "primary" beam of charged particles (ions) is focused on a region of a solid specimen to sputter secondary ions from 1–5 nanograms of the sample under high vacuum. The elemental abundances and isotopic ratios of these secondary ions are determined with a mass spectrometer. These results can be used for geochronology to determine the age of a region within a crystal thousands to billions of years old or to precisely measure trace abundances of chemical elements at concentrations as low as parts per billion. A partnership of the U.S. Geological Survey and the Stanford University School of Earth Sciences operates a large SIMS instrument, the Sensitive High-Resolution Ion Microprobe with Reverse Geometry (SHRIMP–RG) on the Stanford campus.

  20. Recent changes in atmospheric carbon monoxide

    Energy Technology Data Exchange (ETDEWEB)

    Novelli, P.C.; Masarie, K.A. (Univ. of Colorado, Boulder, CO (United States)); Tans, P.P.; Lang, P.M. (National Oceanic and Atmospheric Administration, Boulder, CO (United States))

    1994-03-18

    Measurements of carbon monoxide (CO) in air samples collected from 27 locations between 71°N and 41°S show that atmospheric levels of this gas have decreased worldwide over the past 2 to 5 years. During this period, CO decreased at nearly a constant rate in the high northern latitudes. In contrast, in the tropics an abrupt decrease occurred beginning at the end of 1991. In the Northern Hemisphere, CO decreased at a spatially and temporally averaged rate of 7.3 (±0.9) parts per billion per year (6.1 percent per year) from June 1990 to June 1993, whereas in the Southern Hemisphere, CO decreased 4.2 (±0.5) parts per billion per year (7.0 percent per year). This recent change is opposite a long-term trend of a 1 to 2 percent per year increase inferred from measurements made in the Northern Hemisphere during the past 30 years.

  1. Delivering migrant workers' remittances

    OpenAIRE

    Ballard, Roger

    2004-01-01

    As globalization has led to ever higher levels of labour mobility, so the volume of funds remitted to their families by workers employed in countries far distant from their homes has increased by leaps and bounds. The total volume of such transfers currently amounts to over $100 billion per annum, the greater part of which flows from economically advanced regions in the West and North to developing countries in the East and South. Delivering those funds swiftly, reliably and cheaply to relati...

  2. Global Poverty and the New Bottom Billion: Three-Quarters of the World?s Poor Live in Middle-Income Countries

    OpenAIRE

    Andy Sumner

    2011-01-01

    In 1990, 93 per cent of the world's poor people lived in poor countries - that is, low-income countries (LICs). For 2007-2008, our estimates suggest three things. First, three-quarters of the world's poor, or almost 1 billion poor people, now live in middle-income countries (MICs). Second, just a quarter of the world's poor live in 39 LICs. Third, in contrast to earlier estimates that a third of the poor live in fragile states, our estimate is about 23 per cent if one takes the broadest definit...

  3. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. From the DVHs, quantities such as the conformity index COIN and COIN integrals are derived. This is achieved by using piecewise uniformly distributed sampling points whose density in each region is obtained from a survey of the gradients or the variance of the dose distribution in that region. The shape of the sampling regions is adapted to the patient anatomy and to the shape and size of the implant. The application of this method requires a single preprocessing step, which takes only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, and also COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points.
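
    A cumulative DVH is just the fraction of structure volume receiving at least each dose level; stratification means sampling high-gradient regions more densely and weighting each sample by the inverse of its sampling density. The sketch below illustrates that bookkeeping with invented dose values; it is not the authors' implementation.

```python
# Sketch: cumulative dose-volume histogram (DVH) from sampled dose values,
# with a crude stratification that samples an assumed high-gradient region
# 4x more densely and weights samples by the inverse of their density.
import numpy as np

rng = np.random.default_rng(0)

dose_low  = rng.normal(20.0, 1.0, size=1000)    # fairly uniform dose region
dose_high = rng.normal(40.0, 10.0, size=4000)   # steep dose falloff region (sampled 4x denser)

# Each sample represents a volume inversely proportional to its sampling density.
weights = np.concatenate([np.full(1000, 4.0), np.full(4000, 1.0)])
doses   = np.concatenate([dose_low, dose_high])

def cumulative_dvh(dose_axis):
    """Fraction of structure volume receiving at least each dose level."""
    total = weights.sum()
    return [weights[doses >= d].sum() / total for d in dose_axis]

axis = np.linspace(0, 70, 8)
for d, v in zip(axis, cumulative_dvh(axis)):
    print(f"V(dose >= {d:5.1f}) = {v:.2f}")
```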

  4. Evaluation of Low versus High Volume per Minute Displacement CO₂ Methods of Euthanasia in the Induction and Duration of Panic-Associated Behavior and Physiology.

    Science.gov (United States)

    Hickman, Debra L; Fitz, Stephanie D; Bernabe, Cristian S; Caliman, Izabela F; Haulcomb, Melissa M; Federici, Lauren M; Shekhar, Anantha; Johnson, Philip L

    2016-08-02

    Current recommendations for the use of CO₂ as a euthanasia agent for rats require the use of gradual fill protocols (such as 10% to 30% volume displacement per minute) in order to render the animal insensible prior to exposure to levels of CO₂ that are associated with pain. However, exposing rats to CO₂ concentrations as low as 7% is reported to cause distress, and 10%-20% CO₂ induces panic-associated behavior and physiology, but loss of consciousness does not occur until CO₂ concentrations are at least 40%. This suggests that the use of the currently recommended low flow volume per minute displacement rates creates a situation where rats are exposed to concentrations of CO₂ that induce anxiety, panic, and distress for prolonged periods of time. This study first characterized the response of male rats exposed to normoxic 20% CO₂ for a prolonged period of time as compared to room air controls. It demonstrated that rats exposed to this experimental condition displayed clinical signs consistent with significantly increased panic-associated behavior and physiology during CO₂ exposure. When atmospheric air was then again delivered, there was a robust increase in respiration rate that coincided with rats moving to the air intake. The rats exposed to CO₂ also displayed behaviors consistent with increased anxiety in the behavioral testing that followed the exposure. Next, this study assessed the behavioral and physiologic responses of rats that were euthanized with 100% CO₂ infused at 10%, 30%, or 100% volume per minute displacement rates. Analysis of the concentrations of CO₂ and oxygen in the euthanasia chamber and the behavioral responses of the rats suggest that the use of the very low flow volume per minute displacement rate (10%) may prolong the duration of panicogenic ranges of ambient CO₂, while the use of the higher flow volume per minute displacement rate (100%) increases agitation. Therefore, of the volume
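
    A displacement rate quoted as a percentage of chamber volume per minute translates directly into a flowmeter setting: flow (L/min) = displacement fraction x chamber volume (L). The sketch below shows the conversion for an assumed 12 L chamber; the chamber size is an example value, not one from the study.

```python
# Convert a "percent chamber volume per minute" displacement rate into a
# flowmeter setting. The 12 L chamber volume is an assumed example value.

def co2_flow_lpm(chamber_volume_l: float, displacement_per_min: float) -> float:
    """Required CO2 flow in litres per minute."""
    return chamber_volume_l * displacement_per_min

chamber = 12.0                          # litres
for rate in (0.10, 0.30, 1.00):         # 10%, 30%, 100% volume per minute
    print(f"{rate:.0%} displacement -> {co2_flow_lpm(chamber, rate):.1f} L/min")
```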

  5. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    International Nuclear Information System (INIS)

    Cole, Z.; Roos, P.A.; Berg, T.; Kaylor, B.; Merkel, K.D.; Babbitt, W.R.; Reibel, R.R.

    2007-01-01

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier

  6. Unambiguous range-Doppler LADAR processing using 2 giga-sample-per-second noise waveforms

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Z. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)]. E-mail: cole@s2corporation.com; Roos, P.A. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Berg, T. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Kaylor, B. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Merkel, K.D. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States); Babbitt, W.R. [Spectrum Lab, Montana State University, P.O. Box 173510, Bozeman, MT 59717 (United States); Reibel, R.R. [S2 Corporation, 2310 University Way 4-1, Bozeman, MT 59715 (United States)

    2007-11-15

    We demonstrate sub-nanosecond range and unambiguous sub-50-Hz Doppler resolved laser radar (LADAR) measurements using spectral holographic processing in rare-earth ion doped crystals. The demonstration utilizes pseudo-random-noise 2 giga-sample-per-second baseband waveforms modulated onto an optical carrier.

  7. Plasma diagnostics package. Volume 2: Spacelab 2 section. Part B: Thesis projects

    Science.gov (United States)

    Pickett, Jolene S. (Compiler); Frank, L. A. (Compiler); Kurth, W. S. (Compiler)

    1988-01-01

    This volume (2), which consists of two parts (A and B), of the Plasma Diagnostics Package (PDP) Final Science Report contains a summary of all of the data reduction and scientific analyses which were performed using PDP data obtained on STS-51F as a part of the Spacelab 2 (SL-2) payload. This work was performed during the period of launch, July 29, 1985, through June 30, 1988. During this period the primary data reduction effort consisted of processing summary plots of the data received by 12 of the 14 instruments located on the PDP and submitting these data to the National Space Science Data Center (NSSDC). Three Master's and three Ph.D. theses were written using PDP instrumentation data. These theses are listed in Volume 2, Part B.

  8. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...
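
    The estimator referred to above is simply the horizontal point sample basal area factor multiplied by the sum of the critical heights of the tallied trees. A minimal sketch with an invented tally:

```python
# Minimal sketch of the critical height sampling estimator described in the
# abstract: volume per unit area = basal area factor (BAF) times the sum of
# the critical heights of the trees tallied at a horizontal point sample.
# The tally and BAF below are invented example values.

def chs_volume_per_area(baf: float, critical_heights_m: list) -> float:
    """Cubic volume per unit area (m^3/ha if BAF is in m^2/ha and heights in m)."""
    return baf * sum(critical_heights_m)

tally = [12.4, 9.8, 15.1, 7.6]    # critical heights (m) of the tallied trees
print(chs_volume_per_area(baf=2.0, critical_heights_m=tally), "m^3/ha")
```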

  9. Subseabed disposal program annual report, January-December 1980. Volume II. Appendices (principal investigator progress reports). Part 1

    International Nuclear Information System (INIS)

    Hinga, K.R.

    1981-07-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-Q; Part 2 contains Appendices R-MM. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base

  10. Subseabed disposal program annual report, January-December 1980. Volume II. Appendices (principal investigator progress reports). Part 1

    Energy Technology Data Exchange (ETDEWEB)

    Hinga, K.R. (ed.)

    1981-07-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-Q; Part 2 contains Appendices R-MM. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base.

  11. Remedial Investigation Badger Army Ammunition Plant, Baraboo, Wisconsin. Volume 1. Text Sections 1 Through 12

    Science.gov (United States)

    1993-04-01

    VOCs (acetone [ACET], trichlorofluoromethane [CCL3F], methyl ethyl ketone [MEK]) sporadically detected at very low concentrations (< 1 part per billion...associated with the site includes red pine (Pinus resinosa), hickories, cedar (Thuja occidentalis), and American elm (Ulmus americana). Grasses and weedy...cd)pyrene ICDPYR, iron FE, lead PB, magnesium MG, manganese MN, mercury HG, methylene chloride CH2CL2, methyl ethyl ketone or 2-butanone MEK

  12. Comparison of uncertainties related to standardization of urine samples with volume and creatinine concentration

    DEFF Research Database (Denmark)

    Garde, Anne Helene; Hansen, Ase Marie; Kristiansen, Jesper

    2004-01-01

    When measuring biomarkers in urine, volume (and time) or concentration of creatinine are both accepted methods of standardization for diuresis. Both types of standardization contribute uncertainty to the final result. The aim of the present paper was to compare the uncertainty introduced when usi...... increase in convenience for the participants, when collecting small volumes rather than complete 24 h samples....... the two types of standardization on 24 h samples from healthy individuals. Estimates of uncertainties were based on results from the literature supplemented with data from our own studies. Only the difference in uncertainty related to the two standardization methods was evaluated. It was found...... that the uncertainty associated with creatinine standardization (19-35%) was higher than the uncertainty related to volume standardization (up to 10%, when not correcting for deviations from 24 h) for 24 h urine samples. However, volume standardization introduced an average bias of 4% due to missed volumes...
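
    The two standardizations being compared amount to expressing the biomarker either as an excretion per collection period (volume and time) or as a ratio to the creatinine concentration. The sketch below shows both calculations with invented numbers.

```python
# Two common ways of standardizing a urinary biomarker for diuresis.
# All sample values are invented for illustration.

analyte_ug = 45.0            # total analyte excreted over the collection (micrograms)
urine_volume_l = 1.6         # urine volume over 24 h (litres)
creatinine_g_per_l = 1.1     # creatinine concentration in the same sample

# Volume (and time) standardization: excretion per 24 h.
excretion_per_24h = analyte_ug                                   # micrograms / 24 h

# Creatinine standardization: concentration ratio, independent of urine volume.
analyte_conc_ug_per_l = analyte_ug / urine_volume_l
per_g_creatinine = analyte_conc_ug_per_l / creatinine_g_per_l    # micrograms / g creatinine

print(f"volume-standardized    : {excretion_per_24h:.1f} ug / 24 h")
print(f"creatinine-standardized: {per_g_creatinine:.1f} ug / g creatinine")
```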

  13. Plasma volume changes during hypoglycaemia: the effect of arterial blood sampling

    DEFF Research Database (Denmark)

    Hilsted, J; Bendtsen, Flemming; Christensen, N J

    1990-01-01

    To investigate whether previously reported changes in venous blood volume and composition induced by acute hypoglycaemia in humans are representative of the entire body we measured erythrocyte 51Cr content, haematocrit, plasma volume, intravascular albumin content and transcapillary escape rate...... hypoglycaemia. The magnitude of the changes in arterial and venous blood was not significantly different. These results indicate that the above changes in blood volume and composition are whole-body phenomena: furthermore, the major part of the changes are likely to occur in tissues other than upper extremity...... of albumin in arterial and venous blood in seven healthy subjects before and during insulin-induced hypoglycaemia. In both vascular sites blood 51Cr content and the haematocrit increased, plasma volume and intravascular albumin content decreased and the transcapillary escape rate of albumin increased during...

  14. Day and night profiles of tropospheric nitrous oxide

    Science.gov (United States)

    Cofer, Wesley R., III; Connors, Vickie S.; Levine, Joel S.; Edahl, Robert A., Jr.

    1986-01-01

    Daytime and nighttime vertical profiles of the tropospheric trace gas N2O were determined from grab sample collections off the Atlantic and Gulf coasts of Florida. The grab samples were collected during the week of October 7-13, 1984, from a Lear jet during descent spirals over an altitude range of 12.5-0.3 km in approximately 1.2-km intervals. During this period there were two distinct airflow regimes sampled: (1) the surface boundary layer (less than 2 km), in which the wind direction was typically easterly; and (2) the regime above the boundary layer, which was predominantly characterized by westerly flow. N2O mixing ratios, normalized to dry air, were determined from 148 daytime and nighttime samplings. N2O was found to be uniformly mixed at all altitudes at 301.9 ± 2.4 parts per billion by volume.

  15. The Boring Billion, a slingshot for Complex Life on Earth.

    Science.gov (United States)

    Mukherjee, Indrani; Large, Ross R; Corkrey, Ross; Danyushevsky, Leonid V

    2018-03-13

    The period 1800 to 800 Ma ("Boring Billion") is believed to mark a delay in the evolution of complex life, primarily due to low levels of oxygen in the atmosphere. Earlier studies highlight the remarkably flat C and Cr isotope trends and low trace element trends during the so-called stasis, caused by prolonged nutrient, climatic, atmospheric and tectonic stability. In contrast, we suggest a first-order variability of bio-essential trace element availability in the oceans by combining systematic sampling of the Proterozoic rock record with sensitive geochemical analyses of marine pyrite by the LA-ICP-MS technique. We also recall that several critical biological evolutionary events, such as the appearance of eukaryotes, the origin of multicellularity and sexual reproduction, and the first major diversification of eukaryotes (crown group), occurred during this period. Therefore, it appears possible that the period of low nutrient trace elements (1800-1400 Ma) caused evolutionary pressures which became an essential trigger for promoting biological innovations in the eukaryotic domain. Later periods of stress-free conditions, with relatively high nutrient trace element concentrations, facilitated diversification. We propose that the "Boring Billion" was a period of sequential stepwise evolution and diversification of complex eukaryotes, triggering evolutionary pathways that made possible the later rise of micro-metazoans and their macroscopic counterparts.

  16. Three Mile Island: a report to the commissioners and to the public. Volume II, Part 3

    International Nuclear Information System (INIS)

    1979-01-01

    This is the third and final part of the second volume of a study of the Three Mile Island accident. Part 3 of Volume II contains descriptions and assessments of responses to the accident by the utility and by the NRC and other government agencies

  17. Physical correlates of radiologic heart volume

    International Nuclear Information System (INIS)

    Christie, D.

    1978-01-01

    Radiologic heart volume was calculated on a 10 per cent random sample of subjects examined in the London Civil Service Health Survey. Data were available for 1 188 men over the age of 40, and the importance of correcting radiologic heart volume for body size, age and heart rate was demonstrated. After these variables were taken into account, the most important association found was with blood pressure. Radiologic heart volume has potential value in cardiovascular screening programmes. (Auth.)

  18. Potential effects of the next 100 billion hamburgers sold by McDonald's.

    Science.gov (United States)

    Spencer, Elsa H; Frank, Erica; McIntosh, Nichole F

    2005-05-01

    McDonald's has sold >100 billion beef-based hamburgers worldwide with a potentially considerable health impact. This paper explores whether there would be any advantages if the next 100 billion burgers were instead plant-based burgers. Nutrient composition of the beef hamburger patty and the McVeggie burger patty were obtained from the McDonald's website; sales data were obtained from the McDonald's customer service. Consuming 100 billion McDonald's beef burgers versus the same company's McVeggie burgers would provide, approximately, on average, an additional 550 million pounds of saturated fat and 1.2 billion total pounds of fat, as well as 1 billion fewer pounds of fiber, 660 million fewer pounds of protein, and no difference in calories. These data suggest that the McDonald's new McVeggie burger represents a less harmful fast-food choice than the beef burger.

  19. Efficiency of dioxin recovery from fly-ash samples during extraction and cleanup process, March 1989. Final report, 1 August 1987-30 September 1988

    International Nuclear Information System (INIS)

    Finkel, J.M.; James, R.H.; Baughman, K.W.

    1989-03-01

    The work supported the Environmental Monitoring Systems Laboratory, U.S. Environmental Protection Agency, in its effort to monitor the hazardous composition, if any, of fly ash from various types of incinerators using different types of combustible materials. The analytical determination of dioxins in environmental samples at the parts per billion, trillion, and quadrillion levels requires meticulous, time-consuming, and very complex sample preparation and analysis procedures. A major part of the task was devoted to the evaluation of various extraction techniques for fly ash and cleanup of sample extracts by column chromatography. Several chromatographic media and eluting solvents were investigated. Each step in the sample preparation was evaluated by using 14C-radiolabeled 2,3,7,8-tetrachlorodibenzo-p-dioxin and octachlorodibenzo-p-dioxin as tracers. Radiolabeled dioxin allows the analyst to stop and evaluate each step of the procedure, each extract, and each column eluate fraction by liquid-scintillation counting. To validate the radiometric assay, dioxin was confirmed by gas chromatography/mass spectrometry. The report contains recovery data for spiked 2,3,7,8-tetrachlorodibenzo-p-dioxin and octachlorodibenzo-p-dioxin in carbon-free fly ash and fly ash containing from 0.1% to 10% carbon

  20. Evaluation of Low versus High Volume per Minute Displacement CO2 Methods of Euthanasia in the Induction and Duration of Panic-Associated Behavior and Physiology

    Directory of Open Access Journals (Sweden)

    Debra L. Hickman

    2016-08-01

    Full Text Available Current recommendations for the use of CO2 as a euthanasia agent for rats require the use of gradual fill protocols (such as 10% to 30% volume displacement per minute) in order to render the animal insensible prior to exposure to levels of CO2 that are associated with pain. However, exposing rats to CO2 concentrations as low as 7% is reported to cause distress, and 10%–20% CO2 induces panic-associated behavior and physiology, but loss of consciousness does not occur until CO2 concentrations are at least 40%. This suggests that the use of the currently recommended low flow volume per minute displacement rates creates a situation where rats are exposed to concentrations of CO2 that induce anxiety, panic, and distress for prolonged periods of time. This study first characterized the response of male rats exposed to normoxic 20% CO2 for a prolonged period of time as compared to room air controls. It demonstrated that rats exposed to this experimental condition displayed clinical signs consistent with significantly increased panic-associated behavior and physiology during CO2 exposure. When atmospheric air was then again delivered, there was a robust increase in respiration rate that coincided with rats moving to the air intake. The rats exposed to CO2 also displayed behaviors consistent with increased anxiety in the behavioral testing that followed the exposure. Next, this study assessed the behavioral and physiologic responses of rats that were euthanized with 100% CO2 infused at 10%, 30%, or 100% volume per minute displacement rates. Analysis of the concentrations of CO2 and oxygen in the euthanasia chamber and the behavioral responses of the rats suggest that the use of the very low flow volume per minute displacement rate (10%) may prolong the duration of panicogenic ranges of ambient CO2, while the use of the higher flow volume per minute displacement rate (100%) increases agitation. Therefore, of the volume displacement per

  1. Seven Billion People: Fostering Productive Struggle

    Science.gov (United States)

    Murawska, Jaclyn M.

    2018-01-01

    How can a cognitively demanding real-world task such as the Seven Billion People problem promote productive struggle "and" help shape students' mathematical dispositions? Driving home from school one evening, Jaclyn Murawska heard a commentator on the radio announce three statements: (1) experts had determined that the world population…

  2. Arsenic in detergents: Possible danger and pollution hazard

    Science.gov (United States)

    Angino, E.E.; Magnuson, L.M.; Waugh, T.C.; Galle, O.K.; Bredfeldt, J.

    1970-01-01

    Arsenic at a concentration of 10 to 70 parts per million has been detected in several common presoaks and household detergents. Arsenic values of 2 to 8 parts per billion have been measured in the Kansas River. These concentrations are close to the amount (10 parts per billion) recommended by the United States Public Health Service as a drinking-water standard.

  3. ARPA-E Impacts: A Sampling of Project Outcomes, Volume II

    Energy Technology Data Exchange (ETDEWEB)

    Rohlfing, Eric [Dept. of Energy (DOE), Washington DC (United States). Advanced Research Projects Agency-Energy (ARPA-E)

    2017-02-27

    The Advanced Research Projects Agency-Energy (ARPA-E) is demonstrating that a collaborative model has the power to deliver real value. The Agency’s first compilation booklet of impact sheets, published in 2016, began to tell the story of how ARPA-E has already made an impact in just seven years—funding a diverse and sophisticated research portfolio on advanced energy technologies that enable the United States to tackle our most pressing energy challenges. One year later our research investments continue to pay off, with a number of current and alumni project teams successfully commercializing their technologies and advancing the state of the art in transformative areas of energy science and engineering. There is no single measure that can fully illustrate ARPA-E’s success to date, but several statistics viewed collectively begin to reveal the Agency’s impact. Since 2009, ARPA-E has provided more than $1.5 billion in funding for 36 focused programs and three open funding solicitations, totaling over 580 projects. Of those, 263 are now alumni projects. Many teams have successfully leveraged ARPA-E’s investment: 56 have formed new companies, 68 have partnered with other government agencies to continue their technology development, and 74 teams have together raised more than $1.8 billion in reported funding from the private sector to bring their technologies to market. However, even when viewed together, those measures do not capture ARPA-E’s full impact. To best understand the Agency’s success, the specific scientific and engineering challenges that ARPA-E project teams have overcome must be understood. This booklet provides concrete examples of those successes, ranging from innovations that will bear fruit in the future to ones that are beginning to penetrate the market as products today. Importantly, half of the projects highlighted in this volume stem from OPEN solicitations, which the agency has run in 2009, 2012, and 2015. ARPA-E’s OPEN programs

  4. Practical and highly sensitive elemental analysis for aqueous samples containing metal impurities employing electrodeposition on indium-tin oxide film samples and laser-induced shock wave plasma in low-pressure helium gas.

    Science.gov (United States)

    Kurniawan, Koo Hendrik; Pardede, Marincan; Hedwig, Rinda; Abdulmadjid, Syahrun Nur; Lahna, Kurnia; Idris, Nasrullah; Jobiliong, Eric; Suyanto, Hery; Suliyanti, Maria Margaretha; Tjia, May On; Lie, Tjung Jie; Lie, Zener Sukra; Kurniawan, Davy Putra; Kagawa, Kiichiro

    2015-09-01

    We have conducted an experimental study exploring the possible application of laser-induced breakdown spectroscopy (LIBS) for practical and highly sensitive detection of metal impurities in water. The spectrochemical measurements were carried out by means of a 355 nm Nd:YAG laser in N2 and He ambient gas at pressures of up to 2 kPa. The aqueous samples were prepared as thin films deposited on indium-tin oxide (ITO) glass by an electrolysis process. The resulting emission spectra suggest that detection at parts-per-billion levels may be achieved for a variety of metal impurities, making the technique potentially feasible for rapid inspection of water quality in the semiconductor and pharmaceutical industries, as well as for cooling-water inspection for possible leakage of radioactivity in nuclear power plants. In view of its relative simplicity, this LIBS equipment offers a practical and less costly alternative to the standard use of inductively coupled plasma-mass spectrometry (ICP-MS) for water samples, and it has further potential for in situ and mobile applications.
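
    Quantification in LIBS work of this kind typically proceeds through an external calibration curve relating background-corrected emission-line intensity to analyte concentration. The following generic sketch uses invented calibration points, not data from this study:

      import numpy as np

      # Hypothetical calibration standards: concentration (ppb) versus
      # background-corrected peak intensity of one emission line.
      conc_ppb  = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
      intensity = np.array([2.0, 55.0, 130.0, 262.0, 521.0])

      slope, intercept = np.polyfit(conc_ppb, intensity, 1)   # linear calibration

      def concentration(sample_intensity):
          """Invert the calibration line to estimate concentration in ppb."""
          return (sample_intensity - intercept) / slope

      print(f"Estimated concentration: {concentration(180.0):.1f} ppb")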

  5. Regional Feedstock Partnership Summary Report: Enabling the Billion-Ton Vision

    Energy Technology Data Exchange (ETDEWEB)

    Owens, Vance N. [South Dakota State Univ., Brookings, SD (United States). North Central Sun Grant Center; Karlen, Douglas L. [Dept. of Agriculture Agricultural Research Service, Ames, IA (United States). National Lab. for Agriculture and the Environment; Lacey, Jeffrey A. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Process Science and Technology Division

    2016-07-12

    The U.S. Department of Energy (DOE) and the Sun Grant Initiative established the Regional Feedstock Partnership (referred to as the Partnership) to address information gaps associated with enabling the vision of a sustainable, reliable, billion-ton U.S. bioenergy industry by the year 2030 (i.e., the Billion-Ton Vision). Over the past 7 years (2008–2014), the Partnership has been successful at advancing the biomass feedstock production industry in the United States, with notable accomplishments. The Billion-Ton Study identifies the technical potential to expand domestic biomass production to offset up to 30% of U.S. petroleum consumption, while continuing to meet demands for food, feed, fiber, and export. This study verifies for the biofuels and chemical industries that a real and substantial resource base could justify the significant investment needed to develop robust conversion technologies and commercial-scale facilities. DOE and the Sun Grant Initiative established the Partnership to demonstrate and validate the assumptions underpinning the Billion-Ton Vision to supply a sustainable and reliable source of lignocellulosic feedstock to a large-scale bioenergy industry. This report discusses the accomplishments of the Partnership, with references to accompanying scientific publications. These accomplishments include advances in sustainable feedstock production, feedstock yield, yield stability and stand persistence, energy crop commercialization readiness, information transfer, assessment of the economic impacts of achieving the Billion-Ton Vision, and the impact of feedstock species and environmental conditions on feedstock quality characteristics.

  6. THE INFLUENCE OF FIRST WORT PART AND AFTERWORTS ON SACCHARIFICATION OF WORT

    Directory of Open Access Journals (Sweden)

    Miriam Líšková

    2011-02-01

    Full Text Available Wort is the basic product of mashing, the first intermediate in beer production, and it constitutes the basis of the beer's final value. The qualitative value of wort is influenced most strongly by the grist per brew, which describes the raw materials that bring extract to the brew and determine its volume and concentration. The main component of the grist per brew for light and dark beers is stored pale malt, possibly with a smaller proportion of adjuncts. The aim of our work was to assess the qualitative parameters of malt in terms of extract content and its impact on the amount of first wort and afterworts produced and on their qualitative values, expressed as % saccharification and volumes. We measured 3 types of malt with extract contents of 75.2%, 76.1% and 77.2% in the original sample, which mainly determined the saccharification reached in the first wort and in afterworts one and two. In terms of the saccharification attained, sparging of the spent grains at afterwort number two should use only an amount of water that does not affect the total saccharification of the wort and its qualitative parameters. doi:10.5219/114

  7. The economic impact of Medicare Part D on congestive heart failure.

    Science.gov (United States)

    Dall, Timothy M; Blanchard, Tericke D; Gallo, Paul D; Semilla, April P

    2013-05-01

    Medicare Part D has had important implications for patient outcomes and treatment costs among beneficiaries with congestive heart failure (CHF). This study finds that improved medication adherence associated with expansion of drug coverage under Part D led to nearly $2.6 billion in reductions in medical expenditures annually among beneficiaries diagnosed with CHF and without prior comprehensive drug coverage, of which over $2.3 billion was savings to Medicare. Further improvements in adherence could potentially save Medicare another $1.9 billion annually, generating upwards of $22.4 billion in federal savings over 10 years.

  8. Critical length sampling: a method to estimate the volume of downed coarse woody debris

    Science.gov (United States)

    Göran Ståhl; Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey

    2010-01-01

    In this paper, critical length sampling for estimating the volume of downed coarse woody debris is presented. Using this method, the volume of downed wood in a stand can be estimated by summing the critical lengths of down logs included in a sample obtained using a relascope or wedge prism; typically, the instrument should be tilted 90° from its usual...

  9. Nuclear budget for FY1991 up 3.6% to 409.7 billion yen

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    A total of ¥409.7 billion was approved for the Government's draft nuclear energy budget for fiscal 1991 on December 28, as the Cabinet gave its approval. The total, the highest ever, was divided into ¥182.6 billion for the general account and ¥227.1 billion for the special account for power resources development, representing a 3.6% increase over the ongoing fiscal year's level of ¥395.5 billion. The draft budget will be examined for approval by the Diet by the end of March. The nuclear energy budget devoted to research and development projects governed by the Science and Technology Agency amounts to ¥306.4 billion, up 3.5%, exceeding ¥300 billion for the first time. The nuclear budget for the Ministry of International Trade and Industry is ¥98.1 billion, up 3.5%. For the other ministries, including the Ministry of Foreign Affairs, ¥5.1 billion was allotted to nuclear energy-related projects. The Government had decided to raise the unit cost of the power plant siting promotion subsidies in the special account for power resources development by 25% --- from ¥600/kW to ¥750/kW --- in order to support the siting of plants. Consequently, the power resources siting account of the special accounts for both STA and MITI showed high growth rates: 6.3% and 7.5%, respectively. (N.K.)

  10. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet. Despite recent progress with mobile technology diffusion, more than four billion people worldwide are unconnected and have limited access to global communication infrastructure. The cost of implementing connectivity infrastructure in underserved ...

  11. Sample volume and alignment analysis for an optical particle counter sizer, and other applications

    International Nuclear Information System (INIS)

    Holve, D.J.; Davis, G.W.

    1985-01-01

    Optical methods for particle size distribution measurements in practical high temperature environments are approaching feasibility and offer significant advantages over conventional sampling methods. A key requirement of single particle counting techniques is the need to know features of the sample volume intensity distribution which in general are a function of the particle scattering properties and optical system geometry. In addition, the sample volume intensity distribution is sensitive to system alignment and thus calculations of alignment sensitivity are required for assessment of practical alignment tolerances. To this end, an analysis of sample volume characteristics for single particle counters in general has been developed. Results from the theory are compared with experimental measurements and shown to be in good agreement. A parametric sensitivity analysis is performed and a criterion for allowable optical misalignment is derived for conditions where beam steering caused by fluctuating refractive-index gradients is significant

  12. Micro- and nano-volume samples by electrothermal, near-torch vaporization sample introduction using removable, interchangeable and portable rhenium coiled-filament assemblies and axially-viewed inductively coupled plasma-atomic emission spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Badiei, Hamid R.; Lai, Bryant; Karanassios, Vassili

    2012-11-15

    An electrothermal, near-torch vaporization (NTV) sample introduction for micro- or nano-volume samples is described. Samples were pipetted onto coiled-filament assemblies that were purposely developed to be removable and interchangeable and were dried and vaporized into a small-volume vaporization chamber that clips onto any ICP torch with a ball joint. Interchangeable assemblies were also constructed to be small (e.g., less than 3 cm long with a maximum diameter of 0.65 cm) and light-weight (1.4 g) so that they can be portable. Interchangeable assemblies with volume capacities in three ranges (i.e., < 1 µL, 1-10 µL and 10-100 µL) were fabricated and used. The horizontally-operated NTV sample introduction was interfaced to an axially-viewed ICP-AES (inductively coupled plasma-atomic emission spectrometry) system and NTV was optimized using ICP-AES and 8 elements (Pb, Cd, Zn, V, Ba, Mg, Be and Ca). Precision was 1.0-2.3% (peak height) and 1.1-2.4% (peak area). Detection limits (obtained using 5 µL volumes) expressed in absolute amounts ranged between 4 pg for Pb and 0.3 fg (≈ 5 million atoms) for Ca. Detection limits expressed in concentration units (obtained using 100 µL volumes of diluted, single-element standard solutions) were: 50 pg/mL for Pb; 10 pg/mL for Cd; 9 pg/mL for Zn; 1 pg/mL for V; 0.9 pg/mL for Ba; 0.5 pg/mL for Mg; 50 fg/mL for Be; and 3 fg/mL for Ca. Analytical capability and utility was demonstrated using the determination of Pb at pg/mL levels in a diluted natural water Certified Reference Material (CRM) and the determination of Zn in 80 nL volumes of the liquid extracted from an individual vesicle. It is shown that portable and interchangeable assemblies with dried sample residues on them can be transported without analyte loss (for the concentrations tested), thus opening up the possibility for 'taking part of the lab to the sample' applications, such as testing for Cu concentration-compliance with the lead
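
    The two ways the detection limits are quoted above (absolute amount versus concentration) are linked by the volume pipetted onto the assembly. A small unit-conversion sketch with illustrative numbers (it does not attempt to reproduce the reported limits) is:

      def concentration_from_absolute(limit_pg, volume_uL):
          """Convert an absolute detection limit (pg) and a pipetted sample volume
          (µL) to a concentration detection limit in pg/mL; 1 mL = 1000 µL."""
          return limit_pg / (volume_uL / 1000.0)

      # e.g., an element with a 4 pg absolute limit measured from a 100 µL aliquot
      print(concentration_from_absolute(4.0, 100.0), "pg/mL")   # -> 40.0 pg/mL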

  13. Determination of Picloram in Soil and Water by Reversed-Phase Liquid Chromatography

    Science.gov (United States)

    M.J.M. Wells; J.L. Michael; D.G. Neary

    1984-01-01

    A reversed-phase liquid chromatographic method is presented for the determination of picloram in the parts-per-billion (ppb) range in soil, soil solution, and stream samples. Quantification is effected by UV absorption at 254 nm. Derivatization is not necessary. The method permits 92% ± 7.1 recovery from water samples and 61.8% ± 11.1 recovery from soil samples....

  14. Four billion people facing severe water scarcity.

    Science.gov (United States)

    Mekonnen, Mesfin M; Hoekstra, Arjen Y

    2016-02-01

    Freshwater scarcity is increasingly perceived as a global systemic risk. Previous global water scarcity assessments, measuring water scarcity annually, have underestimated experienced water scarcity by failing to capture the seasonal fluctuations in water consumption and availability. We assess blue water scarcity globally at a high spatial resolution on a monthly basis. We find that two-thirds of the global population (4.0 billion people) live under conditions of severe water scarcity at least 1 month of the year. Nearly half of those people live in India and China. Half a billion people in the world face severe water scarcity all year round. Putting caps to water consumption by river basin, increasing water-use efficiencies, and better sharing of the limited freshwater resources will be key in reducing the threat posed by water scarcity on biodiversity and human welfare.

  15. Atmospheric ammonia mixing ratios at an open-air cattle feeding facility.

    Science.gov (United States)

    Hiranuma, Naruki; Brooks, Sarah D; Thornton, Daniel C O; Auvermann, Brent W

    2010-02-01

    Mixing ratios of total and gaseous ammonia were measured at an open-air cattle feeding facility in the Texas Panhandle in the summers of 2007 and 2008. Samples were collected at the nominally upwind and downwind edges of the facility. In 2008, a series of far-field samples was also collected 3.5 km north of the facility. Ammonium concentrations were determined by two complementary laboratory methods, a novel application of visible spectrophotometry and standard ion chromatography (IC). Results of the two techniques agreed very well, and spectrophotometry is faster, easier, and cheaper than chromatography. Ammonia mixing ratios measured at the immediate downwind site were drastically higher (approximately 2900 parts per billion by volume [ppbv]) than those measured at the upwind site, indicating substantial ammonia emissions from open-air animal feeding operations, especially under the hot and dry conditions present during these measurements.
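
    A mixing ratio in ppbv can be put on a mass-concentration basis using the ideal-gas molar volume; the rough conversion below assumes 25 °C and 1 atm, which may differ from the field conditions of the study:

      def ppbv_to_ug_m3(ppbv, molar_mass_g_mol, molar_volume_L=24.45):
          """Convert a gas mixing ratio in ppbv to µg/m3 assuming ideal-gas
          behaviour at 25 °C and 1 atm (molar volume ~24.45 L/mol)."""
          return ppbv * molar_mass_g_mol / molar_volume_L

      # Ammonia (NH3, ~17.03 g/mol) at the ~2900 ppbv reported downwind:
      print(f"{ppbv_to_ug_m3(2900.0, 17.03):.0f} ug/m3")   # roughly 2000 ug/m3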

  16. Measuring Blood Glucose Concentrations in Photometric Glucometers Requiring Very Small Sample Volumes.

    Science.gov (United States)

    Demitri, Nevine; Zoubir, Abdelhak M

    2017-01-01

    Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. Based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples, while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load, while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, thereby decreasing the measurement time and thus improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art, with sufficient accuracy according to the most recent ISO standards, and reduce measurement time significantly compared to state-of-the-art methods.
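
    As a generic illustration of iterative mode seeking on pixel intensities (not the authors' specific segmentation algorithm), a one-dimensional mean-shift update with a Gaussian kernel can be sketched as follows:

      import numpy as np

      def mean_shift_mode(intensities, start, bandwidth=5.0, tol=1e-3, max_iter=100):
          """Seek the nearest intensity mode from a starting value using 1-D mean
          shift with a Gaussian kernel. Generic sketch, not the glucometer framework."""
          x = float(start)
          for _ in range(max_iter):
              w = np.exp(-0.5 * ((intensities - x) / bandwidth) ** 2)
              x_new = np.sum(w * intensities) / np.sum(w)
              if abs(x_new - x) < tol:
                  break
              x = x_new
          return x

      # Synthetic pixel intensities: background near 200, reaction region near 120.
      rng = np.random.default_rng(0)
      pixels = np.concatenate([rng.normal(200, 4, 500), rng.normal(120, 6, 200)])
      print(mean_shift_mode(pixels, start=130.0))   # converges near the 120 mode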

  17. Transportable aerosol sampling station with fixed volume (15 l) DMPA-15

    International Nuclear Information System (INIS)

    Giolu, G.; Guta, V.

    1999-01-01

    The mobile installation is used for air-sampling operations with fixed intake volumes, to be analysed by routine environmental air monitoring laboratories. The station consists of several units installed on a two-wheel mobile carriage-type platform: a double-diaphragm pump (ensuring oil separation) that provides air intake and its evacuation to the air analysers. The sampling and control unit has the following functions: - intake, ensured by the pump, which aspirates fixed volumes of air from the ambient atmosphere and feeds it into an inflatable rubber chamber; air intake is automatically stopped once the cushion is completely filled, and a separation clamp is provided to seal the cushion; - exhaust, which allows the residual air to be evacuated from the cushion, ensuring its 'self-cleaning'; - shut down, manually operated; - analyse, in which the aerosol-containing sample is aspirated from the inflatable rubber chamber and evacuated through a flow regulator to the analyser; - stop, cancelling any previous commands. A relay unit controls the pneumatic lines and a pressure relay provides automatic stopping of the air intake process. The following technical features are given: fixed air volume in the chamber, 15 l; air flow at the exit from the flow-meter, 0 - 15 l/min; power requirements, 220 V/ 50 Hz; power consumption, max. 1.5 kW; overall dimensions, 460 x 500 x 820 mm; weight, 53 kg. (authors)

  18. Behind the ink : multi-billion dollar oil deal between Venezuela and China not necessarily bad for either the United States or Canada

    International Nuclear Information System (INIS)

    Bentein, J.

    2009-01-01

    Although China has recently signed large contracts with Venezuela for crude oil, experts are convinced that Chinese refiners will want more Canadian crude oil because they have greater confidence in Canada as a reliable provider. Production difficulties with nationalized oil industries in Venezuela and Mexico will create further opportunities for Canadian oil producers. The recent recession has caused delays in plans to increase pipeline capacity. China may invest $US 8 billion in Venezuelan oil in order to increase exports to China to over a million barrels per day by 2015. The Chinese government granted the Venezuelan government a $4 billion loan last year. The Venezuelan government has increased the maximum royalty rates paid by foreign oil companies. The Venezuelan-owned Citgo operates more than 7000 gasoline retail outlets in the United States. The country is capable of processing 1.3 million barrels per day. Studies have suggested that the Chinese see their investments as a business opportunity, and not as a means of antagonizing western countries. It was concluded that economists are predicting that Venezuela's currency will be devalued this year. 3 figs

  19. How do you interpret a billion primary care records?

    Directory of Open Access Journals (Sweden)

    Martin Heaven

    2017-04-01

    To establish this, we explored just over 1 billion unique Read-coded records generated in the time period 1999 to 2015 by GP practices participating in the provision of anonymised records to SAIL, aligning, filtering and summarising the data in a series of observational exercises to generate hypotheses related to the capture and recording of the data. Results: A fascinating journey through 1 billion GP-practice-generated pieces of information, embarked upon to aid interpretation of our Supporting People results, and providing insights into the patterns of recording within GP data.

  20. Subseabed disposal program annual report, January-December 1979. Volume II. Appendices (principal investigator progress reports). Part 1 of 2

    International Nuclear Information System (INIS)

    Talbert, D.M.

    1981-04-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-O; Part 2 contains Appendices P-FF. Separate abstracts have been prepared of each Appendix for inclusion in the Energy Data Base

  1. Subseabed disposal program annual report, January-December 1979. Volume II. Appendices (principal investigator progress reports). Part 2 of 2

    International Nuclear Information System (INIS)

    Talbert, D.M.

    1981-04-01

    Volume II of the sixth annual report describing the progress and evaluating the status of the Subseabed Disposal Program contains the appendices referred to in Volume I, Summary and Status. Because of the length of Volume II, it has been split into two parts for publication purposes. Part 1 contains Appendices A-O; Part 2 contains Appendices P-FF. Separate abstracts have been prepared for each appendix for inclusion in the Energy Data Base

  2. Critical point relascope sampling for unbiased volume estimation of downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey; Mark J. Ducey

    2005-01-01

    Critical point relascope sampling is developed and shown to be design-unbiased for the estimation of log volume when used with point relascope sampling for downed coarse woody debris. The method is closely related to critical height sampling for standing trees when trees are first sampled with a wedge prism. Three alternative protocols for determining the critical...

  3. Some final conclusions and supporting experiments related to the search for organic compounds on the surface of Mars

    International Nuclear Information System (INIS)

    Biemann, K.; Lavoie, J.M. Jr

    1979-01-01

    The Viking molecular analysis experiment has demonstrated the absence (within the detection limits, which range from levels of parts per million to below parts per billion) of organic substances in the Martian surface soil at the two Viking landing sites. Laboratory experiments with sterile and nonsterile antarctic samples further demonstrate the capability and reliability of the instrument. The circumstances under which organic components could have escaped detection, such as inaccessibility or extreme thermal stability of organic polymers, are discussed but are found to be unlikely. The inability of the instrument to detect free oxygen evolved from soil samples is pointed out

  4. Escalating energy costs threaten health care for critically ill and homebound seniors: home care nurses, aides and therapists drive 4.8 billion miles per year to reach shut-in patients.

    Science.gov (United States)

    2008-08-01

    The rapidly rising cost of fuel has had a profound impact on the home care and hospice industry. In an effort to quantify the increased burden, the National Association for Home Care & Hospice's (NAHC) Foundation for Hospice and Home Care conducted a study showing that home care and hospice providers drive over 5 billion miles per year to deliver services -- about two-and-a-half times the number driven by United Parcel Service, the international delivery service. The findings garnered significant interest from the media and elected officials. Reprinted in this issue of CARING Magazine are the press release that NAHC issued regarding the study, as well as a graphic representation of the study's findings that was circulated to the National Conference of State Legislatures at its most recent meeting in July. Also represented on these pages is a reprint from the Congressional Record of July 11, 2008, in which Senator Debbie Stabenow (D-MI), one of the highest-ranking Democrats in the U.S. Senate, entered into the record an article from the front page of the New York Times of July 5, 2008, that covered the mileage study.

  5. Evaluation of CNS activities of aerial parts of Cynodon dactylon Pers. in mice.

    Science.gov (United States)

    Pal, Dilipkumar

    2008-01-01

    The dried extracts of aerial parts of Cynodon dactylon Pers. (Graminae) were evaluated for CNS activities in mice. The ethanol extract of aerial parts of C. dactylon (EECD) was found to cause significant depression in general behavioral profiles in mice. EECD significantly potentiated the sleeping time in mice induced by standard hypnotics, viz. pentobarbitone sodium, diazepam, and meprobamate, in a dose-dependent manner. EECD showed significant analgesic properties as evidenced by the significant reduction in the number of writhes and stretches induced in mice by 1.2% acetic acid solution. It also potentiated analgesia induced by morphine and pethidine in mice. EECD inhibited the onset and the incidence of convulsion in a dose-dependent manner against pentylenetetrazole (PTZ)-induced convulsion. The present study indicates that EECD has significant CNS depressant activities.

  6. Cosmic rays and the biosphere over 4 billion years

    DEFF Research Database (Denmark)

    Svensmark, Henrik

    2006-01-01

    Variations in the flux of cosmic rays (CR) at Earth during the last 4.6 billion years are constructed from information about the star formation rate in the Milky Way and the evolution of solar activity. The constructed CR signal is compared with variations in the Earth's biological productivity as recorded in the isotope delta C-13, which spans more than 3 billion years. CR and fluctuations in biological productivity show a remarkable correlation and indicate that the evolution of climate and the biosphere on the Earth is closely linked to the evolution of the Milky Way.

  7. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  8. $17 billion needed by year 2000.

    Science.gov (United States)

    Finger, W R

    1995-09-01

    The United Nations Population Fund (UNFPA) estimates that US$17 billion will be needed to fund reproductive health care in developing countries by the year 2000. About US$10 billion of this would go to family planning; currently, the amount spent on family planning is about US$5 billion. Donors are focusing on fewer countries because of limited resources. The United States Agency for International Development (USAID) is planning to phase out support for family planning in Jamaica and Brazil because the programs there have advanced sufficiently. Resources will be shifted to countries with more pressing needs. Dr. Richard Osborn, senior technical officer for UNFPA, states that UNFPA works with national program managers in allocating resources at the macro level (commodities, training). Currently, two-thirds of family planning funds spent worldwide come from developing country governments (mainly China, India, Indonesia, Mexico, South Africa, Turkey, and Bangladesh). Sustaining programs, much less adding new services, will be difficult. User fees and public-private partnerships are being considered; worldwide, consumers currently provide about 14% of family planning funds (the portion is higher in most Latin American countries). In a few countries, insurance, social security, and other public-private arrangements contribute. Social marketing programs are being considered that would remove constraints on prescriptions and prices and improve the quality of services so that clients would be more willing to pay for contraceptives. Although governments are attempting to fit family planning into their health care budgets, estimates at the national level are difficult to make. Standards are needed to make expenditure estimates quickly and at low cost, according to Dr. Barbara Janowitz of FHI, which is developing guidelines. Studies in Bangladesh, Ecuador, Ghana, Mexico, and the Philippines are being conducted, with the assistance of The Evaluation Project at the Population

  9. Los Alamos Scientific Laboratory approach to hydrogeochemical and stream sediment reconnaissance for uranium in the United States

    International Nuclear Information System (INIS)

    Bolivar, S.L.

    1981-01-01

    The Los Alamos Scientific Laboratory of the United States is conducting a geochemical survey for uranium in the Rocky Mountain states of New Mexico, Colorado, Wyoming, and Montana and in Alaska. This survey is part of a national hydrogeochemical and stream sediment reconnaissance in which four Department of Energy laboratories will study the uranium resources of the United States to provide data for the National Uranium Resource Evaluation program. The reconnaissance will identify areas having higher than background concentrations of uranium in ground waters, surface waters, and water-transported sediments. Water and sediment samples are collected at a nominal density of one sample location per 10 km², except for lake areas of Alaska where the density is one sample location per 23 km². Water samples are analyzed for uranium by fluorometry, which has a 0.02 parts per billion lower limit of detection. Concentrations of 12 additional elements in water are determined by plasma-source emission spectrography. All sediments are analyzed for uranium by delayed-neutron counting with a 20 parts per billion lower limit of detection, which is well below the range of uranium concentrations in natural sediment samples. Elemental concentrations in sediments are also determined by neutron activation analysis for 31 elements, by x-ray fluorescence for 9 elements, and by arc-source emission spectrography for 2 elements. The multielement analyses provide valuable data for studies concerning pathfinder elements, environmental pollution, elemental distributions, dispersion halos, and economic ore deposits other than uranium. To date, all of the four Rocky Mountain states and about 80% of Alaska have been sampled. About 220,000 samples have been collected from an area of nearly 2,500,000 km²
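
    The reported sample count is broadly consistent with the nominal sampling density; a quick arithmetic check, treating the quoted figures as approximate, is:

      # Rough consistency check of sampling density versus total sample count.
      area_km2 = 2_500_000              # approximate surveyed area
      km2_per_sample = 10               # nominal density: one location per 10 km2
      print(area_km2 / km2_per_sample)  # 250,000 expected locations vs ~220,000 collected
      # The gap is plausible given the coarser 1-per-23-km2 density used for
      # Alaskan lake areas and the ~20% of Alaska not yet sampled.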

  10. Fragmentation of nitrogen-14 nuclei at 2.1 GeV per nucleon.

    Science.gov (United States)

    Heckman, H. H.; Greiner, D. E.; Lindstrom, P. J.; Bieser, F. S.

    1971-01-01

    An experiment has been carried out at the Bevatron on the nuclear fragmentation of nitrogen-14 ions at an energy of 2.1 billion electron volts (GeV) per nucleon. Because of the near equality of the velocities of the nitrogen-14 beam and the fragmentation products at an angle of 0 deg, we find it possible to identify the nuclear fragments isotopically.

  11. The ATLAS3D project - I. A volume-limited sample of 260 nearby early-type galaxies: science goals and selection criteria

    Science.gov (United States)

    Cappellari, Michele; Emsellem, Eric; Krajnović, Davor; McDermid, Richard M.; Scott, Nicholas; Verdoes Kleijn, G. A.; Young, Lisa M.; Alatalo, Katherine; Bacon, R.; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, M.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Khochfar, Sadegh; Kuntschner, Harald; Lablanche, Pierre-Yves; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Serra, Paolo; Weijmans, Anne-Marie

    2011-05-01

    The ATLAS3D project is a multiwavelength survey combined with a theoretical modelling effort. The observations span from the radio to the millimetre and optical, and provide multicolour imaging, two-dimensional kinematics of the atomic (H I), molecular (CO) and ionized gas (Hβ, [O III] and [N I]), together with the kinematics and population of the stars (Hβ, Fe5015 and Mg b), for a carefully selected, volume-limited (1.16 × 10⁵ Mpc³) sample of 260 early-type (elliptical E and lenticular S0) galaxies (ETGs). The models include semi-analytic, N-body binary mergers and cosmological simulations of galaxy formation. Here we present the science goals for the project and introduce the galaxy sample and the selection criteria. The sample consists of nearby (D 15°) morphologically selected ETGs extracted from a parent sample of 871 galaxies (8 per cent E, 22 per cent S0 and 70 per cent spirals) brighter than MK statistically representative of the nearby galaxy population. We present the size-luminosity relation for the spirals and ETGs and show that the ETGs in the ATLAS3D sample define a tight red sequence in a colour-magnitude diagram, with few objects in the transition from the blue cloud. We describe the strategy of the SAURON integral-field observations and the extraction of the stellar kinematics with the pPXF method. We find typical 1σ errors of ΔV ≈ 6 km s⁻¹, Δσ ≈ 7 km s⁻¹, Δh3 ≈ Δh4 ≈ 0.03 in the mean velocity, the velocity dispersion and Gauss-Hermite (GH) moments for galaxies with effective dispersion σe ≳ 120 km s⁻¹. For galaxies with lower σe (≈ 40 per cent of the sample) the GH moments are gradually penalized by pPXF towards zero to suppress the noise produced by the spectral undersampling and only V and σ can be measured. We give an overview of the characteristics of the other main data sets already available for our sample and of the ongoing modelling projects.

  12. Measurement of radioactivity in the environment - Soil - Part 2: Guidance for the selection of the sampling strategy, sampling and pre-treatment of samples

    International Nuclear Information System (INIS)

    2007-01-01

    This part of ISO 18589 specifies the general requirements, based on ISO 11074 and ISO/IEC 17025, for all steps in the planning (desk study and area reconnaissance) of the sampling and the preparation of samples for testing. It includes the selection of the sampling strategy, the outline of the sampling plan, the presentation of general sampling methods and equipment, as well as the methodology of the pre-treatment of samples adapted to the measurements of the activity of radionuclides in soil. This part of ISO 18589 is addressed to the people responsible for determining the radioactivity present in soil for the purpose of radiation protection. It is applicable to soil from gardens, farmland, urban or industrial sites, as well as soil not affected by human activities. This part of ISO 18589 is applicable to all laboratories regardless of the number of personnel or the range of the testing performed. When a laboratory does not undertake one or more of the activities covered by this part of ISO 18589, such as planning, sampling or testing, the corresponding requirements do not apply. Information is provided on scope, normative references, terms and definitions and symbols, principle, sampling strategy, sampling plan, sampling process, pre-treatment of samples and recorded information. Five annexes inform about selection of the sampling strategy according to the objectives and the radiological characterization of the site and sampling areas, diagram of the evolution of the sample characteristics from the sampling site to the laboratory, example of sampling plan for a site divided in three sampling areas, example of a sampling record for a single/composite sample and example for a sample record for a soil profile with soil description. A bibliography is provided

  13. Novel system using microliter order sample volume for measuring arterial radioactivity concentrations in whole blood and plasma for mouse PET dynamic study.

    Science.gov (United States)

    Kimura, Yuichi; Seki, Chie; Hashizume, Nobuya; Yamada, Takashi; Wakizaka, Hidekatsu; Nishimoto, Takahiro; Hatano, Kentaro; Kitamura, Keishi; Toyama, Hiroshi; Kanno, Iwao

    2013-11-21

    This study aimed to develop a new system, named CD-Well, for mouse PET dynamic study. CD-Well allows the determination of time-activity curves (TACs) for arterial whole blood and plasma using 2-3 µL of blood per sample; the minute sample size is ideal for studies in small animals. The system has the following merits: (1) measures volume and radioactivity of whole blood and plasma separately; (2) allows measurements at 10 s intervals to capture initial rapid changes in the TAC; and (3) is compact and easy to handle, minimizes blood loss from sampling, and delay and dispersion of the TAC. CD-Well has 36 U-shaped channels. A drop of blood is sampled into the opening of the channel and stored there. After serial sampling is completed, CD-Well is centrifuged and scanned using a flatbed scanner to define the regions of plasma and blood cells. The length measured is converted to volume because the channels have a precise and uniform cross section. Then, CD-Well is exposed to an imaging plate to measure radioactivity. Finally, radioactivity concentrations are computed. We evaluated the performance of CD-Well in in vitro measurement and in vivo ¹⁸F-fluorodeoxyglucose and [¹¹C]2-carbomethoxy-3β-(4-fluorophenyl)tropane studies. In in vitro evaluation, per cent differences (mean±SE) from manual measurement were 4.4±3.6% for whole blood and 4.0±3.5% for plasma across the typical range of radioactivity measured in mouse dynamic study. In in vivo studies, reasonable TACs were obtained. The peaks were captured well, and the time courses coincided well with the TAC derived from PET imaging of the heart chamber. The total blood loss was less than 200 µL, which had no physiological effect on the mice. CD-Well demonstrates satisfactory performance, and is useful for mouse PET dynamic study.
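
    The length-to-volume conversion that CD-Well relies on amounts to multiplying the measured column length by the uniform channel cross-section (1 mm³ equals 1 µL) and dividing the imaging-plate activity by that volume. A minimal sketch with hypothetical channel dimensions, not the device's actual specifications, is:

      def concentration_bq_per_uL(activity_bq, length_mm, cross_section_mm2):
          """Activity concentration (Bq/µL) from a measured column length in a
          channel of known, uniform cross-section; 1 mm^3 equals 1 µL."""
          volume_uL = length_mm * cross_section_mm2   # mm^3 == µL
          return activity_bq / volume_uL

      # Hypothetical example: 150 Bq measured over a 12 mm plasma column in a
      # 0.20 mm^2 channel (dimensions are illustrative, not CD-Well's specification).
      print(f"{concentration_bq_per_uL(150.0, 12.0, 0.20):.1f} Bq/uL")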

  14. Biomass as Feedstock for a Bioenergy and Bioproducts Industry: The Technical Feasibility of a Billion-Ton Annual Supply, April 2005

    Energy Technology Data Exchange (ETDEWEB)

    None

    2005-04-01

    The purpose of this report is to determine whether the land resources of the United States are capable of producing a sustainable supply of biomass sufficient to displace 30 percent or more of the country’s present petroleum consumption – the goal set by the Biomass R&D Technical Advisory Committee in their vision for biomass technologies. Accomplishing this goal would require approximately 1 billion dry tons of biomass feedstock per year.

  15. Absolute activity determinations on large volume geological samples independent of self-absorption effects

    International Nuclear Information System (INIS)

    Wilson, W.E.

    1980-01-01

    This paper describes a method for measuring the absolute activity of large volume samples by γ-spectroscopy independent of self-absorption effects using Ge detectors. The method yields accurate matrix independent results at the expense of replicative counting of the unknown sample. (orig./HP)

  16. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    International Nuclear Information System (INIS)

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-01-01

    effectively remove inorganic chloride from the activated carbon adsorption tubes. With the TOX sample preparation equipment and TOX analyzers at WSCF, the nitrate wash recommended by EPA SW-846 method 9020B was found to be inadequate to remove inorganic chloride interference. Increasing the nitrate wash concentration from 10 grams per liter (g/L) to 100 g/L potassium nitrate and increasing the nitrate wash volume from 3 milliliters (mL) to 10 mL effectively removed the inorganic chloride up to at least 100 ppm chloride in the sample matrix. Excessive purging of the adsorption tubes during sample preparation was eliminated. These changes in sample preparation have been incorporated in the analytical procedure. The results using the revised sample preparation procedure show better agreement of TOX values both for replicate analyses of single samples and for the analysis of replicate samples acquired from the same groundwater well. Furthermore, less apparent column breakthrough now occurs with the revised procedure. One additional modification made to sample preparation was to discontinue the treatment of groundwater samples with sodium bisulfite. Sodium bisulfite is used to remove inorganic chlorine from the sample; inorganic chlorine is not expected to be a constituent in these groundwater samples. Several other factors were also investigated as possible sources of anomalous TOX results: (1) Instrument instability: examination of the history of results for TOX laboratory control samples and initial calibration verification standards indicate good long-term precision for the method and instrument. Determination of a method detection limit of 2.3 ppb in a deionized water matrix indicates the method and instrumentation have good stability and repeatability. (2) Non-linear instrument response: the instrument is shown to have good linear response from zero to 200 parts per billion (ppb) TOX. This concentration range encompasses the majority of samples received at WSCF for TOX

  17. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING AND CHARACTERIZATION FACILITY

    International Nuclear Information System (INIS)

    Douglas, J.G.; Meznarich, H.K.; Olsen, J.R.; Ross, G.A.; Stauffer, M.

    2009-01-01

    remove inorganic chloride from the activated-carbon adsorption tubes. With the TOX sample preparation equipment and TOX analyzers at WSCF, the nitrate wash recommended by EPA SW-846 method 9020B was found to be inadequate to remove inorganic chloride interference. Increasing the nitrate wash concentration from 10 grams per liter (g/L) to 100 g/L potassium nitrate and increasing the nitrate wash volume from 3 milliliters (mL) to 10 mL effectively removed the inorganic chloride up to at least 100 ppm chloride in the sample matrix. Excessive purging of the adsorption tubes during sample preparation was eliminated. These changes in sample preparation have been incorporated in the analytical procedure. The results using the revised sample preparation procedure show better agreement of TOX values both for replicate analyses of single samples and for the analysis of replicate samples acquired from the same groundwater well. Furthermore, less apparent adsorption tube breakthrough now occurs with the revised procedure. One additional modification made to sample preparation was to discontinue the treatment of groundwater samples with sodium bisulfite. Sodium bisulfite is used to remove inorganic chlorine from the sample; inorganic chlorine is not expected to be a constituent in these groundwater samples. Several other factors were also investigated as possible sources of anomalous TOX results: (1) Instrument instability: examination of the history of results for TOX laboratory control samples and initial calibration verification standards indicate good long-term precision for the method and instrument. Determination of a method detection limit of 2.3 ppb in a deionized water matrix indicates the method and instrumentation have good stability and repeatability. (2) Non-linear instrument response: the instrument is shown to have good linear response from zero to 200 parts per billion (ppb) TOX. This concentration range encompasses the majority of samples received at WSCF for TOX

  18. Groundwater sampling with well-points

    International Nuclear Information System (INIS)

    Laubacher, R.C.; Bailey, W.M.

    1992-01-01

    This paper reports that BP Oil Company and Engineering-Science (ES) conducted a groundwater investigation at a BP Oil Distribution facility in the coastal plain of south central Alabama. The predominant lithologies include unconsolidated Quaternary-aged gravels, sands, silts and clay. Wellpoints were used to determine the vertical and horizontal extent of volatile hydrocarbons in the water table aquifer. To determine the vertical extent of contaminant migration, the hollow-stem augers were advanced approximately 10 feet into the aquifer near a suspected source. The drill stem and bit were removed very slowly to prevent sand heaving. The well-point was again driven ahead of the augers and four volumes (18 liters) of groundwater were purged. A sample was collected and the headspace vapor was analyzed as before. Groundwater from a total of seven borings was analyzed using these techniques. Permanent monitoring wells were installed at four boring locations which had volatile concentrations less than 1 part per million. Later groundwater sampling and laboratory analysis confirmed the wells had been installed near or beyond both the horizontal and vertical plume boundaries

  19. Comparison of Statistically Modeled Contaminated Soil Volume Estimates and Actual Excavation Volumes at the Maywood FUSRAP Site - 13555

    Energy Technology Data Exchange (ETDEWEB)

    Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)

    2013-07-01

    As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best guess results. Where contamination was un-characterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)

  20. Statistical Power and Optimum Sample Allocation Ratio for Treatment and Control Having Unequal Costs Per Unit of Randomization

    Science.gov (United States)

    Liu, Xiaofeng

    2003-01-01

    This article considers optimal sample allocation between the treatment and control condition in multilevel designs when the costs per sampling unit vary due to treatment assignment. Optimal unequal allocation may reduce the cost from that of a balanced design without sacrificing any power. The optimum sample allocation ratio depends only on the…
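
    For reference, in the simplest two-group case with equal outcome variances, the cost-minimizing allocation sets the treatment-to-control sample-size ratio to the square root of the inverse cost ratio; the article's multilevel result generalizes this idea. A sketch of that textbook case (not the article's exact formula) is:

      import math

      def optimal_allocation_ratio(cost_treatment, cost_control):
          """Treatment-to-control sample size ratio n_t / n_c that minimizes the
          variance of a mean difference for a fixed budget, assuming equal outcome
          variances in both arms (textbook case, not the multilevel formula)."""
          return math.sqrt(cost_control / cost_treatment)

      # If each treated unit costs 4x a control unit, assign half as many treated units.
      print(optimal_allocation_ratio(cost_treatment=4.0, cost_control=1.0))  # 0.5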

  1. Hanford Site background: Part 1, Soil background for nonradioactive analytes. Revision 1, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-04-01

    Volume two contains the following appendices: Description of soil sampling sites; sampling narrative; raw data soil background; background data analysis; sitewide background soil sampling plan; and use of soil background data for the detection of contamination at waste management units on the Hanford Site.

  2. A large neutral fraction of cosmic hydrogen a billion years after the Big Bang.

    Science.gov (United States)

    Wyithe, J Stuart B; Loeb, Abraham

    2004-02-26

    The fraction of ionized hydrogen left over from the Big Bang provides evidence for the time of formation of the first stars and quasar black holes in the early Universe; such objects provide the high-energy photons necessary to ionize hydrogen. Spectra of the two most distant known quasars show nearly complete absorption of photons with wavelengths shorter than the Lyman alpha transition of neutral hydrogen, indicating that hydrogen in the intergalactic medium (IGM) had not been completely ionized at a redshift of z approximately 6.3, about one billion years after the Big Bang. Here we show that the IGM surrounding these quasars had a neutral hydrogen fraction of tens of per cent before the quasar activity started, much higher than the previous lower limits of approximately 0.1 per cent. Our results, when combined with the recent inference of a large cumulative optical depth to electron scattering after cosmological recombination therefore suggest the presence of a second peak in the mean ionization history of the Universe.

  3. A field like today's? The strength of the geomagnetic field 1.1 billion years ago

    Science.gov (United States)

    Sprain, Courtney J.; Swanson-Hysell, Nicholas L.; Fairchild, Luke M.; Gaastra, Kevin

    2018-06-01

    Palaeomagnetic data from ancient rocks are one of the few types of observational data that can be brought to bear on the long-term evolution of Earth's core. A recent compilation of palaeointensity estimates from throughout Earth history has been interpreted to indicate that Earth's magnetic field strength increased in the Mesoproterozoic (between 1.5 and 1.0 billion years ago), with this increase taken to mark the onset of inner core nucleation. However, much of the data within the Precambrian palaeointensity database are from Thellier-style experiments with non-ideal behaviour that manifests in results such as double-slope Arai plots. Choices made when interpreting these data may significantly change conclusions about long-term trends in the intensity of Earth's geomagnetic field. In this study, we present new palaeointensity results from volcanics of the ~1.1-billion-year-old North American Midcontinent Rift. While most of the results exhibit non-ideal double-slope or sagging behaviour in Arai plots, some flows have more ideal single-slope behaviour leading to palaeointensity estimates that may be some of the best constraints on the strength of Earth's field for this time. Taken together, new and previously published palaeointensity data from the Midcontinent Rift yield a median field strength estimate of 56.0 ZAm²—very similar to the median for the past 300 Myr. These field strength estimates are distinctly higher than those for the preceding billion years (Ga) after excluding ca. 1.3 Ga data that may be biased by non-ideal behaviour—consistent with an increase in field strength in the late Mesoproterozoic. However, given that ~90 per cent of palaeointensity estimates from 1.1 to 0.5 Ga come from the Midcontinent Rift, it is difficult to evaluate whether these high values relative to those estimated for the preceding billion years are the result of a stepwise, sustained increase in dipole moment. Regardless, palaeointensity estimates from the Midcontinent

  4. Reprocessing of the spent nuclear fuel, I-VIII, Part IV, Engineering drawings, C - Sampling equipment; Prerada isluzenog nuklearnog goriva, I-VIII, IV Deo, Konstruktivni crtezi, C - Uredjaj za uzimanje uzoraka

    Energy Technology Data Exchange (ETDEWEB)

    Gal, I [Institute of Nuclear Sciences Boris Kidric, Laboratorija za hemiju visoke aktivnosti, Vinca, Beograd (Serbia and Montenegro)

    1963-02-15

    This volume includes the engineering drawings of the sampling equipment which is part of the pilot device for extracting uranium, plutonium and fission products from the fuel irradiated in the reactor.

  5. In-Situ Systematic Error Correction for Digital Volume Correlation Using a Reference Sample

    KAUST Repository

    Wang, B.

    2017-11-27

    The self-heating effect of a laboratory X-ray computed tomography (CT) scanner causes slight change in its imaging geometry, which induces translation and dilatation (i.e., artificial displacement and strain) in reconstructed volume images recorded at different times. To realize high-accuracy internal full-field deformation measurements using digital volume correlation (DVC), these artificial displacements and strains associated with unstable CT imaging must be eliminated. In this work, an effective and easily implemented reference sample compensation (RSC) method is proposed for in-situ systematic error correction in DVC. The proposed method utilizes a stationary reference sample, which is placed beside the test sample to record the artificial displacement fields caused by the self-heating effect of CT scanners. The detected displacement fields are then fitted by a parametric polynomial model, which is used to remove the unwanted artificial deformations in the test sample. Rescan tests of a stationary sample and real uniaxial compression tests performed on copper foam specimens demonstrate the accuracy, efficacy, and practicality of the presented RSC method.
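
    The correction idea reduces to fitting a smooth parametric model to the displacements measured on the stationary reference sample and subtracting the model's prediction from the test-sample field. The one-dimensional sketch below uses a low-order polynomial on synthetic data; the published method operates on full 3-D DVC displacement fields:

      import numpy as np

      # 1-D illustration of reference-sample compensation: fit a low-order
      # polynomial to the artificial displacement measured along the stationary
      # reference sample, then subtract its prediction from the test-sample field.
      # All values are synthetic.

      z = np.linspace(0.0, 10.0, 50)                 # position along the axis (mm)
      artificial = 0.8 + 0.05 * z                    # drift seen in the reference
      rng = np.random.default_rng(1)
      reference_disp = artificial + rng.normal(0, 0.01, z.size)
      test_disp = artificial + 0.2 * np.sin(z)       # true deformation + drift

      coeffs = np.polyfit(z, reference_disp, deg=2)  # parametric drift model
      corrected = test_disp - np.polyval(coeffs, z)  # remove the artificial part

      print(f"residual drift after correction: {np.mean(corrected - 0.2*np.sin(z)):.4f}")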

  7. Zooplankton species identities, zooplankton species number per sample, and zooplankton abundance collected using zooplankton net as part of the California Cooperative Fisheries Investigation (CALCOFI) project, for 1994-03-01 (NODC Accession 9700104)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Zooplankton species identities, zooplankton species number per sample, and zooplankton abundance were collected from March 1, 1994 to March 1, 1994. Data were...

  8. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  9. Estimating the cost-per-result of a national reflexed Cryptococcal antigenaemia screening program: Forecasting the impact of potential HIV guideline changes and treatment goals.

    Science.gov (United States)

    Cassim, Naseem; Coetzee, Lindi Marie; Schnippel, Kathryn; Glencross, Deborah Kim

    2017-01-01

    During 2016, the National Health Laboratory Service (NHLS) introduced laboratory-based reflexed Cryptococcal antigen (CrAg) screening to detect early Cryptococcal disease in immunosuppressed HIV+ patients with a confirmed CD4 count of 100 cells/μl or less. The aim of this study was to assess cost-per-result of a national screening program across different tiers of laboratory service, with variable daily CrAg test volumes. The impact of potential ART treatment guideline and treatment target changes on CrAg volumes, platform choice and laboratory workflow is considered. CD4 data (with counts of 100 cells/μl or less) were analysed, and cost-per-result was calculated for four scenarios, including the existing service status quo (Scenario-I), and three other settings (as Scenarios II-IV) which were based on information from recent antiretroviral (ART) guidelines, District Health Information System (DHIS) data and UNAIDS 90/90/90 HIV/AIDS treatment targets. Scenario-II forecast CD4 testing offered only to new ART initiates recorded at DHIS. Scenario-III projected all patients notified as HIV+, but not yet on ART (recorded at DHIS) and Scenario-IV forecast CrAg screening in 90% of estimated HIV+ patients across South Africa (also DHIS). Stata was used to assess daily CrAg volumes at the 5th, 10th, 25th, 50th, 75th, 90th and 95th percentiles across 52 CD4-laboratories. Daily volumes were used to determine technical effort/operator staff costs (% full time equivalent) and cost-per-result for all scenarios. Daily volumes ranged between 3 and 64 samples for Scenario-I at the 5th and 95th percentile. Similarly, daily volume ranges of 1-12, 2-45 and 5-100 CrAg-directed samples were noted for Scenarios II, III and IV respectively. A cut-off of 30 CrAg tests per day defined use of either LFA or EIA platform. LFA cost-per-result ranged from $8.24 to $5.44 and EIA cost-per-result between $5.58 and $4.88 across the range of test volumes. The technical effort across scenarios ranged from 3.2-27.6% depending on test volumes and
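
    As a rough illustration of the volume-driven platform assignment described above, the short Python sketch below computes percentiles of daily CrAg test volumes across laboratories and applies the 30-tests-per-day cut-off to choose between the LFA and EIA platforms. The synthetic volume distribution and variable names are assumptions, and the study's actual per-test cost model is not reproduced.

      import numpy as np

      # Synthetic daily CrAg volumes for 52 laboratories (assumed distribution);
      # the study derived the real volumes from NHLS CD4 data.
      rng = np.random.default_rng(1)
      daily_volumes = rng.gamma(shape=2.0, scale=12.0, size=52)

      percentiles = [5, 10, 25, 50, 75, 90, 95]
      cutoff = 30  # tests/day above which an EIA platform is assumed instead of LFA

      for p, v in zip(percentiles, np.percentile(daily_volumes, percentiles)):
          platform = "EIA" if v >= cutoff else "LFA"
          print(f"{p:>2}th percentile: {v:5.1f} tests/day -> {platform}")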

  10. Lead in drinking water: sampling in primary schools and preschools in south central Kansas.

    Science.gov (United States)

    Massey, Anne R; Steele, Janet E

    2012-03-01

    Studies in Philadelphia, New York City, Houston, Washington, DC, and Greenville, North Carolina, have revealed high lead levels in drinking water. Unlike urban areas, lead levels in drinking water in suburban and rural areas have not been adequately studied. In the study described in this article, drinking water in primary schools and preschools in five suburban and rural south central Kansas towns was sampled to determine if any exceeded the U.S. Environmental Protection Agency (U.S. EPA) guidance level for schools and child care facilities of 20 parts per billion (ppb). The results showed a total of 32.1% of the samples had detectable lead levels and 3.6% exceeded the U.S. EPA guidance level for schools and child care providers of 20 ppb. These results indicate that about one-third of the drinking water consumed by children age six and under in the five suburban and rural south central Kansas towns studied has some lead contamination, exposing these children to both short-term and long-term health risks. The authors suggest a need for increased surveillance of children's drinking water in these facilities.

  11. Satellite Power Systems (SPS) concept definition study, exhibit C. Volume 2, part 1: System engineering

    Science.gov (United States)

    Hanley, G. M.

    1979-01-01

    Volume 2, Part 1, of a seven volume report is presented. Part 1 encompasses Satellite Power Systems (SPS) systems engineering aspects and is divided into three sections. The first section presents descriptions of the various candidate concepts considered and conclusions and recommendations for a preferred concept. The second section presents a summary of results of the various trade studies and analysis conducted during the course of the study. The third section describes the Photovoltaic Satellite Based Satellite Power System (SPS) Point Design as it was defined through studies performed during the period January 1977 through March 1979.

  12. Available forest biomass for new energetic and industrial prospects. Part 1: analysis and synthesis of existing studies compiled at the international level. Part 2: volume calculations. Part 3: economic part. Final report

    International Nuclear Information System (INIS)

    2007-01-01

    Motivated by new energy constraints and the growing interest in biomass, the authors report a bibliographical survey of studies concerning the evaluation of the available forest biomass. They comment on the geographical and temporal distribution of the identified and compiled studies and analyse their different topics. They then discuss the various field hypotheses and the various resource assessment methodologies. They describe the resource that the French forest represents and present a synthesis of the available resource at the regional level according to the different studies, together with a review of some technical-economic aspects (costs, energy cost, price evolution, improvement of wood-energy mobilization). The second part proposes a set of volume calculations for different forest types (clusters or plantations of trees, copses, sawmill products), for industry and household consumption, and discusses the available volumes with respect to accessibility, additional available volumes, and possible improvements. The third part analyses and discusses the wood market, energetic uses of wood, and the possible supply curves for wood energy uses by 2016.

  13. $35 billion habit: will nuclear cost overruns bankrupt the utilities

    International Nuclear Information System (INIS)

    Morgan, R.E.

    1980-01-01

    The Nuclear Regulatory Commission (NRC) has proposed some 150 modifications in the design and operation of nuclear power plants as a result of the accident at Three Mile Island. The Atomic Industrial Forum estimates the total cost of the NRC's proposed rule changes at $35.5 billion ($3.5 billion in capital costs for the entire industry, and $32 billion in outage and construction-delay costs to the utilities) for existing facilities and for those with construction well underway. The changes range from improved training for reactor workers to a major overhaul of the reactor-containment design. The nuclear industry is asking the NRC to modify the proposals citing excessive costs (like the $100 million changes needed for a plant that cost $17 million to build) and safety (some of the complex regulations may interfere with safety). Financing the changes has become a major problem for the utilities. If the regulators allow all the costs to be passed along to the consumer, the author feels electricity will be too expensive for the consumer

  14. 16 CFR Appendix L to Part 305 - Sample Labels

    Science.gov (United States)

    2010-01-01

    Appendix L to 16 CFR Part 305 (disclosures regarding energy consumption and water use of certain home appliances and other products) consists of label graphics: Prototype Labels 1-4 and Sample Labels 1, 2, … (Federal Register graphics ER29AU07.122 through ER29AU07.127 and onward).

  15. Optimal repairable spare-parts procurement policy under total business volume discount environment

    International Nuclear Information System (INIS)

    Pascual, Rodrigo; Santelices, Gabriel; Lüer-Villagra, Armin; Vera, Jorge; Cawley, Alejandro Mac

    2017-01-01

    In asset intensive fields, where components are expensive and high system availability is required, spare parts procurement is often a critical issue. To gain competitiveness and market share is common for vendors to offer Total Business Volume Discounts (TBVD). Accordingly, companies must define the procurement and stocking policy of their spare parts in order to reduce procurement costs and increase asset availability. In response to those needs, this work presents an optimization model that maximizes the availability of the equipment under a TBVD environment, subject to a budget constraint. The model uses a single-echelon structure where parts can be repaired. It determines the optimal number of repairable spare parts to be stocked, giving emphasis on asset availability, procurement costs and service levels as the main decision criteria. A heuristic procedure that achieves high quality solutions in a fast and time-consistent way was implemented to improve the time required to obtain the model solution. Results show that using an optimal procurement policy of spare parts and accounting for TBVD produces better overall results and yields a better availability performance. - Highlights: • We propose a model for procurement of repairable components in single-echelon and business volume discount environments. • We used a mathematical model to develop a competitive heuristic that provides high quality solutions in very short times. • Our model places emphasis on using system availability, procurement costs and service levels as leading decision criteria. • The model can be used as an engine for a multi-criteria Decision Support System.
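
    To make the flavour of such a procurement model concrete, the following Python sketch implements a simple greedy heuristic under assumptions made here for illustration (a Poisson fill-rate availability model, a single total-business-volume discount tier, and made-up item data); it is not the optimization model or heuristic of the paper, only a sketch of trading marginal availability gain per discounted dollar against a budget constraint.

      import math

      items = [  # (name, unit price, expected demand during repair turnaround) -- invented data
          ("pump", 12000.0, 1.2),
          ("valve", 3000.0, 2.5),
          ("controller", 8000.0, 0.8),
      ]
      budget = 60000.0
      discount_threshold, discount = 40000.0, 0.10  # 10% off once total spend exceeds threshold

      def poisson_cdf(k, lam):
          return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

      def log_availability(stock):
          # System availability approximated as the product of per-item fill rates.
          return sum(math.log(poisson_cdf(s, lam)) for (_, _, lam), s in zip(items, stock))

      stock, spend = [0] * len(items), 0.0
      while True:
          best = None
          for i, (_, price, _) in enumerate(items):
              unit = price * ((1 - discount) if spend + price > discount_threshold else 1.0)
              if spend + unit > budget:
                  continue
              trial = stock.copy()
              trial[i] += 1
              gain = (log_availability(trial) - log_availability(stock)) / unit
              if best is None or gain > best[0]:
                  best = (gain, i, unit)
          if best is None:
              break
          _, i, unit = best
          stock[i] += 1
          spend += unit

      print("stock levels:", stock, "spend:", round(spend, 2),
            "availability:", round(math.exp(log_availability(stock)), 4))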

  16. Backlog at December 31, 2007: euro 39.8 billion, up by 55% from year-end 2006. 2007 sales revenue: euro 11.9 billion, up by 9.8% (+10.4% like-for-like)

    International Nuclear Information System (INIS)

    2008-01-01

    The AREVA group's backlog reached a record level of euro 39.834 billion as of December 31, 2007, up by 55% from that of year-end 2006. In Nuclear, the backlog was euro 34.927 billion at year-end 2007 (+58%), due in particular to the signature of a contract in a record amount with the Chinese utility CGNPC. The series of agreements concluded provide among other things for the construction of two new-generation EPR nuclear islands and the supply of all of the materials and services needed for their operation through 2027. CGNPC also bought 35% of the production of UraMin, the mining company acquired by AREVA in August 2007. Industrial cooperation in the Back End of the cycle was launched with the signature of an agreement between China and France. In addition, the group signed several long-term contracts in significant amounts, particularly with KHNP of South Korea, EDF and Japanese utilities. The Transmission and Distribution division won several major contracts in Libya and Qatar at the end of the year approaching a total of euro 750 million. For the entire year, new orders grew by 34% to euro 5.816 billion. The backlog, meanwhile, grew by 40% to euro 4.906 billion at year-end. The group cleared sales revenue of euro 11.923 billion in 2007, up by 9.8% (+10.4% like-for-like) in relation to 2006 sales of euro 10.863 billion. Sales revenue for the 4. quarter of 2007 rose to euro 3.858 billion, for growth of 16.7% (+18.8% like-for-like) over one year. Sales revenue for the year was marked by: - Growth of 7.6% (+10.6% like-for-like) in Front End sales revenue, which rose to euro 3.140 billion. The division's Enrichment operations posted strong growth. - Sales were up by 17.5% (+15.2% like-for-like) to euro 2.717 billion in the Reactors and Services division. Sales revenue was driven in particular by the growth of Services operations, after weak demand in 2006, by progress on OL3 construction, and by the start of Flamanville 3, the second EPR. For the Back End division

  17. Monitoring oil persistence on beaches : SCAT versus stratified random sampling designs

    International Nuclear Information System (INIS)

    Short, J.W.; Lindeberg, M.R.; Harris, P.M.; Maselko, J.M.; Pella, J.J.; Rice, S.D.

    2003-01-01

    In the event of a coastal oil spill, shoreline clean-up assessment teams (SCAT) commonly rely on visual inspection of the entire affected area to monitor the persistence of the oil on beaches. Occasionally, pits are excavated to evaluate the persistence of subsurface oil. This approach is practical for directing clean-up efforts directly following a spill. However, sampling of the 1989 Exxon Valdez oil spill in Prince William Sound 12 years later has shown that visual inspection combined with pit excavation does not offer estimates of contaminated beach area or stranded oil volumes. This information is needed to statistically evaluate the significance of change with time. Assumptions regarding the correlation of visually-evident surface oil and cryptic subsurface oil are usually not evaluated as part of the SCAT mandate. Stratified random sampling can avoid such problems and could produce precise estimates of oiled area and volume that allow for statistical assessment of major temporal trends and the extent of the impact. The 2001 sampling of the shoreline of Prince William Sound showed that 15 per cent of surface oil occurrences were associated with subsurface oil. This study demonstrates the usefulness of the stratified random sampling method and shows how sampling design parameters impact statistical outcome. Power analysis based on the study results indicates that optimum power is derived when unnecessary stratification is avoided. It was emphasized that sampling effort should be balanced between choosing sufficient beaches for sampling and the intensity of sampling

  18. Plasma diagnostics package. Volume 2: Spacelab 2 section. Part B: Thesis projects. Final science report

    International Nuclear Information System (INIS)

    Pickett, J.S.; Frank, L.A.; Kurth, W.S.

    1988-06-01

    This volume (2), which consists of two parts (A and B), of the Plasma Diagnostics Package (PDP) Final Science Report contains a summary of all of the data reduction and scientific analyses which were performed using PDP data obtained on STS-51F as a part of the Spacelab 2 (SL-2) payload. This work was performed during the period of launch, July 29, 1985, through June 30, 1988. During this period the primary data reduction effort consisted of processing summary plots of the data received by 12 of the 14 instruments located on the PDP and submitting these data to the National Space Science Data Center (NSSDC). Three Master's and three Ph.D. theses were written using PDP instrumentation data. These theses are listed in Volume 2, Part B

  19. 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy, Volume 2: Environmental Sustainability Effects of Select Scenarios from Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Kristen [Dept. of Energy (DOE), Washington DC (United States); Stokes, Bryce [Allegheny Science & Technology, LLC, Bridgeport, WV (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hellwinckel, Chad [Univ. of Tennessee, Knoxville, TN (United States); Kline, Keith L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Dunn, Jennifer [Argonne National Lab. (ANL), Argonne, IL (United States); Canter, Christina E. [Argonne National Lab. (ANL), Argonne, IL (United States); Qin, Zhangcai [Argonne National Lab. (ANL), Argonne, IL (United States); Cai, Hao [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Michael [Argonne National Lab. (ANL), Argonne, IL (United States); Scott, D. Andrew [USDA Forest Service, Normal, AL (United States); Jager, Henrietta I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wu, May [Argonne National Lab. (ANL), Argonne, IL (United States); Ha, Miae [Argonne National Lab. (ANL), Argonne, IL (United States); Baskaran, Latha Malar [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kreig, Jasmine A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Rau, Benjamin [USDA Forest Service, Aiken, SC (United States); Muwamba, Augustine [Univ. of Georgia, Athens, GA (United States); Trettin, Carl [USDA Forest Service, Aiken, SC (United States); Panda, Sudhanshu [Univ. of North Georgia, Oakwood, GA (United States); Amatya, Devendra M. [USDA Forest Service, Aiken, SC (United States); Tollner, Ernest W. [USDA Forest Service, Aiken, SC (United States); Sun, Ge [USDA Forest Service, Aiken, SC (United States); Zhang, Liangxia [USDA Forest Service, Aiken, SC (United States); Duan, Kai [North Carolina State Univ., Raleigh, NC (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhang, Yimin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Inman, Daniel [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eberle, Annika [National Renewable Energy Lab. (NREL), Golden, CO (United States); Carpenter, Alberta [National Renewable Energy Lab. (NREL), Golden, CO (United States); Heath, Garvin [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hettinger, Dylan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Wang, Gangsheng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sutton, Nathan J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Busch, Ingrid Karin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Donner, Deahn M. [USDA Forest Service, Aiken, SC (United States); Wigley, T. Bently [National Council for Air and Stream Improvement (NCASI), Research Triangle Park, NC (United States); Miller, Darren A. [Weyerhaeuser Company, Federal Way, WA (United States); Coleman, Andre [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wigmosta, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pattullo, Molly [Univ. of Tennessee, Knoxville, TN (United States); Mayes, Melanie [Oak Ridge National Lab. 
(ORNL), Oak Ridge, TN (United States); Daly, Christopher [Oregon State Univ., Corvallis, OR (United States); Halbleib, Mike [Oregon State Univ., Corvallis, OR (United States); Negri, Cristina [Argonne National Lab. (ANL), Argonne, IL (United States); Turhollow, Anthony F. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bonner, Ian [Monsanto Company, Twin Falls, ID (United States); Dale, Virginia H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    With the goal of understanding environmental effects of a growing bioeconomy, the U.S. Department of Energy (DOE), national laboratories, and U.S. Forest Service research laboratories, together with academic and industry collaborators, undertook a study to estimate environmental effects of potential biomass production scenarios in the United States, with an emphasis on agricultural and forest biomass. Potential effects investigated include changes in soil organic carbon (SOC), greenhouse gas (GHG) emissions, water quality and quantity, air emissions, and biodiversity. Effects of altered land-management regimes were analyzed based on select county-level biomass-production scenarios for 2017 and 2040 taken from the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16), volume 1, which assumes that the land bases for agricultural and forestry would not change over time. The scenarios reflect constraints on biomass supply (e.g., excluded areas; implementation of management practices; and consideration of food, feed, forage, and fiber demands and exports) that intend to address sustainability concerns. Nonetheless, both beneficial and adverse environmental effects might be expected. To characterize these potential effects, this research sought to estimate where and under what modeled scenarios or conditions positive and negative environmental effects could occur nationwide. The report also includes a discussion of land-use change (LUC) (i.e., land management change) assumptions associated with the scenario transitions (but not including analysis of indirect LUC [ILUC]), analyses of climate sensitivity of feedstock productivity under a set of potential scenarios, and a qualitative environmental effects analysis of algae production under carbon dioxide (CO2) co-location scenarios. Because BT16 biomass supplies are simulated independent of a defined end use, most analyses do not include benefits from displacing fossil fuels or

  20. A two-billion-year history for the lunar dynamo.

    Science.gov (United States)

    Tikoo, Sonia M; Weiss, Benjamin P; Shuster, David L; Suavet, Clément; Wang, Huapei; Grove, Timothy L

    2017-08-01

    Magnetic studies of lunar rocks indicate that the Moon generated a core dynamo with surface field intensities of ~20 to 110 μT between at least 4.25 and 3.56 billion years ago (Ga). The field subsequently declined to a much weaker state, and analyses of younger lunar samples indicate that a weak dynamo persisted, extending the known lifetime of the lunar dynamo by at least 1 billion years. Such a protracted history requires an extraordinarily long-lived power source like core crystallization or precession. No single dynamo mechanism proposed thus far can explain the strong fields inferred for the period before 3.56 Ga while also allowing the dynamo to persist in such a weakened state beyond ~2.5 Ga. Therefore, our results suggest that the dynamo was powered by at least two distinct mechanisms operating during early and late lunar history.

  1. Los Alamos Scientific Laboratory approach to hydrogeochemical and stream sediment reconnaissance for uranium in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Bolivar, S.L.

    1981-01-01

    The Los Alamos Scientific Laboratory of the United States is conducting a geochemical survey for uranium in the Rocky Mountain states of New Mexico, Colorado, Wyoming, and Montana and in Alaska. This survey is part of a national hydrogeochemical and stream sediment reconnaissance in which four Department of Energy laboratories will study the uranium resources of the United States to provide data for the National Uranium Resource Evaluation program. The reconnaissance will identify areas having higher than background concentrations of uranium in ground waters, surface waters, and water-transported sediments. Water and sediment samples are collected at a nominal density of one sample location per 10 km², except for lake areas of Alaska where the density is one sample location per 23 km². Water samples are analyzed for uranium by fluorometry, which has a 0.02 parts per billion lower limit of detection. Concentrations of 12 additional elements in water are determined by plasma-source emission spectrography. All sediments are analyzed for uranium by delayed-neutron counting with a 20 parts per billion lower limit of detection, which is well below the range of uranium concentrations in natural sediment samples. Elemental concentrations in sediments are also determined by neutron activation analysis for 31 elements, by x-ray fluorescence for 9 elements, and by arc-source emission spectrography for 2 elements. The multielement analyses provide valuable data for studies concerning pathfinder elements, environmental pollution, elemental distributions, dispersion halos, and economic ore deposits other than uranium. To date, all four Rocky Mountain states and about 80% of Alaska have been sampled. About 220,000 samples have been collected from an area of nearly 2,500,000 km².

  2. Application of the Streamflow Prediction Tool to Estimate Sediment Dredging Volumes in Texas Coastal Waterways

    Science.gov (United States)

    Yeates, E.; Dreaper, G.; Afshari, S.; Tavakoly, A. A.

    2017-12-01

    Over the past six fiscal years, the United States Army Corps of Engineers (USACE) has contracted an average of about a billion dollars per year for navigation channel dredging. To execute these funds effectively, USACE Districts must determine which navigation channels need to be dredged in a given year. Improving this prioritization process results in more efficient waterway maintenance. This study uses the Streamflow Prediction Tool, a runoff routing model based on global weather forecast ensembles, to estimate dredged volumes. This study establishes regional linear relationships between cumulative flow and dredged volumes over a long-term simulation covering 30 years (1985-2015), using drainage area and shoaling parameters. The study framework integrates the National Hydrography Dataset (NHDPlus Dataset) with parameters from the Corps Shoaling Analysis Tool (CSAT) and dredging record data from USACE District records. Results in the test cases of the Houston Ship Channel and the Sabine and Port Arthur Harbor waterways in Texas indicate positive correlation between the simulated streamflows and actual dredging records.
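
    The core of such a regional relationship can be sketched in a few lines of Python; the numbers below are invented placeholders for cumulative simulated flow and recorded dredged volume at a single waterway, and the study's additional drainage-area and shoaling parameters are not reproduced.

      import numpy as np

      # Hypothetical per-dredging-cycle records for one waterway.
      cumulative_flow = np.array([1.2e9, 1.8e9, 2.1e9, 2.9e9, 3.4e9])      # m^3 of routed flow
      dredged_volume = np.array([0.41e6, 0.55e6, 0.63e6, 0.85e6, 0.97e6])  # cubic yards dredged

      # Fit the linear relationship volume = slope * flow + intercept.
      slope, intercept = np.polyfit(cumulative_flow, dredged_volume, deg=1)

      # Use the fitted line to estimate the volume implied by a forecast cumulative flow.
      predicted = slope * 2.5e9 + intercept
      print(f"volume = {slope:.3e} * flow + {intercept:.3e}; predicted at 2.5e9 m^3: {predicted:.2e} yd^3")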

  3. Characterizing the zenithal night sky brightness in large territories: how many samples per square kilometre are needed?

    Science.gov (United States)

    Bará, Salvador

    2018-01-01

    A recurring question arises when trying to characterize, by means of measurements or theoretical calculations, the zenithal night sky brightness throughout a large territory: how many samples per square kilometre are needed? The optimum sampling distance should allow reconstructing, with sufficient accuracy, the continuous zenithal brightness map across the whole region, whilst at the same time avoiding unnecessary and redundant oversampling. This paper attempts to provide some tentative answers to this issue, using two complementary tools: the luminance structure function and the Nyquist-Shannon spatial sampling theorem. The analysis of several regions of the world, based on the data from the New world atlas of artificial night sky brightness, suggests that, as a rule of thumb, about one measurement per square kilometre could be sufficient for determining the zenithal night sky brightness of artificial origin at any point in a region to within ±0.1 magV arcsec-2 (in the root-mean-square sense) of its true value in the Johnson-Cousins V band. The exact reconstruction of the zenithal night sky brightness maps from samples taken at the Nyquist rate seems to be considerably more demanding.
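
    As a compact statement of the criterion underlying this rule of thumb, write $\nu_{\max}$ for the highest significant spatial frequency of the zenithal brightness map (a quantity the paper estimates from the luminance structure function; the symbol and the km-based units are notational choices made here). The Nyquist-Shannon condition on the grid spacing $\Delta x$ then reads:

      \[
        \Delta x \;\le\; \frac{1}{2\,\nu_{\max}},
        \qquad\text{so a 1 km grid (one sample per km}^2\text{) suffices whenever } \nu_{\max} \le 0.5\ \text{cycles km}^{-1}.
      \]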

  4. Radioactive elements on Mercury's surface from MESSENGER: implications for the planet's formation and evolution.

    Science.gov (United States)

    Peplowski, Patrick N; Evans, Larry G; Hauck, Steven A; McCoy, Timothy J; Boynton, William V; Gillis-Davis, Jeffery J; Ebel, Denton S; Goldsten, John O; Hamara, David K; Lawrence, David J; McNutt, Ralph L; Nittler, Larry R; Solomon, Sean C; Rhodes, Edgar A; Sprague, Ann L; Starr, Richard D; Stockstill-Cahill, Karen R

    2011-09-30

    The MESSENGER Gamma-Ray Spectrometer measured the average surface abundances of the radioactive elements potassium (K, 1150 ± 220 parts per million), thorium (Th, 220 ± 60 parts per billion), and uranium (U, 90 ± 20 parts per billion) in Mercury's northern hemisphere. The abundance of the moderately volatile element K, relative to Th and U, is inconsistent with physical models for the formation of Mercury requiring extreme heating of the planet or its precursor materials, and supports formation from volatile-containing material comparable to chondritic meteorites. Abundances of K, Th, and U indicate that internal heat production has declined substantially since Mercury's formation, consistent with widespread volcanism shortly after the end of late heavy bombardment 3.8 billion years ago and limited, isolated volcanic activity since.

  5. Interband cascade laser-based ppbv-level mid-infrared methane detection using two digital lock-in amplifier schemes

    Science.gov (United States)

    Song, Fang; Zheng, Chuantao; Yu, Di; Zhou, Yanwen; Yan, Wanhong; Ye, Weilin; Zhang, Yu; Wang, Yiding; Tittel, Frank K.

    2018-03-01

    A parts-per-billion in volume (ppbv) level mid-infrared methane (CH4) sensor system was demonstrated using second-harmonic wavelength modulation spectroscopy (2f-WMS). A 3291 nm interband cascade laser (ICL) and a multi-pass gas cell (MPGC) with a 16 m optical path length were adopted in the reported sensor system. Two digital lock-in amplifier (DLIA) schemes, a digital signal processor (DSP)-based DLIA and a LabVIEW-based DLIA, were used for harmonic signal extraction. A limit of detection (LoD) of 13.07 ppbv with an averaging time of 2 s was achieved using the DSP-based DLIA and a LoD of 5.84 ppbv was obtained using the LabVIEW-based DLIA with the same averaging time. Rise (0→2 parts-per-million in volume, ppmv) and fall (2→0 ppmv) response times were also measured. Outdoor atmospheric CH4 concentration measurements were carried out to evaluate the sensor performance using the two DLIA schemes.
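
    The harmonic-extraction step performed by either DLIA can be illustrated with a minimal digital lock-in sketch in Python, assuming made-up signal parameters; it is not the DSP or LabVIEW implementation of the paper. The detector signal is mixed with in-phase and quadrature references at twice the modulation frequency and low-pass filtered, and the resulting 2f magnitude is the quantity proportional to the CH4 concentration.

      import numpy as np

      fs, f_mod, n = 100_000, 5_000, 100_000           # sample rate (Hz), modulation frequency (Hz), samples
      t = np.arange(n) / fs
      detector = (0.02 * np.cos(2 * np.pi * 2 * f_mod * t + 0.3)   # 2f component of interest
                  + 0.5 * np.cos(2 * np.pi * f_mod * t)            # residual 1f background
                  + 0.01 * np.random.default_rng(2).standard_normal(n))  # detector noise

      def lowpass(x):
          # Simple moving-average low-pass filter (20 ms window).
          return np.convolve(x, np.ones(2000) / 2000, mode="valid")

      ref_i = np.cos(2 * np.pi * 2 * f_mod * t)        # in-phase reference at 2f
      ref_q = np.sin(2 * np.pi * 2 * f_mod * t)        # quadrature reference at 2f

      X = 2 * lowpass(detector * ref_i)
      Y = 2 * lowpass(detector * ref_q)
      R = np.sqrt(X**2 + Y**2)                         # 2f magnitude
      print(f"recovered 2f amplitude ~= {R.mean():.4f} (true value 0.02)")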

  6. Determination of pesticide residues in leafy vegetables at parts per billion levels by a chemometric study using GC-ECD in Cameron Highlands, Malaysia.

    Science.gov (United States)

    Farina, Yang; Abdullah, Md Pauzi; Bibi, Nusrat; Khalik, Wan Mohd Afiq Wan Mohd

    2017-06-01

    A simple and sensitive analytical method has been developed employing gas chromatography coupled with electron capture detector (GC-ECD), and validated for screening and quantification of 15 pesticide residues at trace levels in cabbage, broccoli, cauliflower, lettuce, celery, spinach, and mustard. The method consists of two steps: first, the significance of each factor was determined by Pareto chart, followed by optimization of the significant factors using central composite design (CCD). Minitab statistical software was used for these multivariate experiments for the generation of the 2^(4-1) fractional factorial design and CCD matrices. The method evaluation was done by external standard calibration with a linearity range between 0.5 and 3 mg/kg, with correlation coefficient 0.99, limit of detection (LOD) ranging between 0.02 and 4.5 ng/g, and limit of quantification (LOQ) ranging between 0.2 and 45 ng/g. The average recovery was between 60% and 128%, with RSD 0.2-19.8%. The method was applied on real vegetable samples from Cameron Highlands. Copyright © 2016 Elsevier Ltd. All rights reserved.
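
    For illustration, calibration-based figures of merit of the kind quoted above can be computed as follows; the 3.3·σ/S and 10·σ/S criteria and all numbers are assumptions made here, since the abstract does not state the exact estimator used.

      import numpy as np

      # Hypothetical external-calibration data (concentration vs. detector response).
      conc = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])            # mg/kg
      response = np.array([120., 250., 362., 498., 610., 745.])  # detector counts

      slope, intercept = np.polyfit(conc, response, deg=1)
      residuals = response - (slope * conc + intercept)
      sigma = residuals.std(ddof=2)          # standard deviation of regression residuals

      lod = 3.3 * sigma / slope              # common calibration-based LOD estimate
      loq = 10.0 * sigma / slope             # corresponding LOQ estimate
      r = np.corrcoef(conc, response)[0, 1]
      print(f"r = {r:.4f}, LOD = {lod:.3f} mg/kg, LOQ = {loq:.3f} mg/kg")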

  7. Operational Efficiencies and Simulated Performance of Big Data Analytics Platform over Billions of Patient Records of a Hospital System

    Directory of Open Access Journals (Sweden)

    Dillon Chrimes

    2017-01-01

    Big Data Analytics (BDA) is important to utilize data from hospital systems to reduce healthcare costs. BDA enables queries of large volumes of patient data in an interactively dynamic way for healthcare. The study objective was the high-performance establishment of an interactive BDA platform for a hospital system. A Hadoop/MapReduce framework was established at the University of Victoria (UVic) with Compute Canada/Westgrid to form a Healthcare BDA (HBDA) platform with HBase (NoSQL database) using hospital-specific metadata and file ingestion. Patient data profiles and clinical workflow were derived from the Vancouver Island Health Authority (VIHA, Victoria, BC, Canada). The proof-of-concept implementation tested patient data representative of the entire provincial hospital system. We cross-referenced all data profiles and metadata with real patient data used in clinical reporting. Query performance tested Apache tools in Hadoop's ecosystem. At optimized iteration, Hadoop Distributed File System (HDFS) ingestion required three seconds, but HBase required four to twelve hours to complete the Reducer of MapReduce. HBase bulkloads took a week for one billion records (10 TB) and over two months for three billion (30 TB). Simple and complex query results showed about two seconds for one and three billion records, respectively. Apache Drill outperformed Apache Spark; however, it was restricted to running more simplified queries with poor usability for healthcare. Jupyter on Spark offered high performance and customization to run all queries simultaneously with high usability. The BDA platform of HBase distributed over Hadoop was established successfully; however, some inconsistencies of MapReduce limited operational efficiencies. The importance of Hadoop/MapReduce to the platform's performance is discussed.
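
    The interactive queries benchmarked in the study can be pictured with a minimal PySpark sketch; the schema, table contents and application name below are hypothetical, not VIHA's actual metadata or the platform's production configuration.

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("hbda-sketch").getOrCreate()

      # Hypothetical encounter records standing in for the ingested patient data.
      encounters = spark.createDataFrame(
          [("P001", "2015-03-02", "J18.9"), ("P002", "2015-03-02", "I21.0"),
           ("P001", "2016-07-11", "J18.9")],
          ["patient_id", "admit_date", "diagnosis_code"],
      )

      # Simple aggregate query of the kind benchmarked in the study (counts by diagnosis).
      (encounters
       .groupBy("diagnosis_code")
       .agg(F.count("*").alias("admissions"))
       .orderBy(F.desc("admissions"))
       .show())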

  8. A test of alternative estimators for volume at time 1 from remeasured point samples

    Science.gov (United States)

    Francis A. Roesch; Edwin J. Green; Charles T. Scott

    1993-01-01

    Two estimators for volume at time 1 for use with permanent horizontal point samples are evaluated. One estimator, used traditionally, uses only the trees sampled at time 1, while the second estimator, originally presented by Roesch and coauthors (F.A. Roesch, Jr., E.J. Green, and C.T. Scott. 1989. For. Sci. 35(2):281-293). takes advantage of additional sample...

  9. Neutrinos and our Sun - Part 2

    Indian Academy of Sciences (India)

    The article discusses the total energy radiated by the sun during its lifetime of four and a half billion years, the balance that determines the energy radiated per unit time (the luminosity), and the nuclear reactions that are operative in all stars during the bulk of their life, drawing on solar neutrino data collected over several years of hard work.

  10. Twenty-year trends of authorship and sampling in applied biomechanics research.

    Science.gov (United States)

    Knudson, Duane

    2012-02-01

    This study documented the trends in authorship and sampling in applied biomechanics research published in the Journal of Applied Biomechanics and ISBS Proceedings. Original research articles of the 1989, 1994, 1999, 2004, and 2009 volumes of these serials were reviewed, excluding reviews, modeling papers, technical notes, and editorials. Compared to 1989 volumes, the mean number of authors per paper significantly increased (35 and 100%, respectively) in the 2009 volumes, along with increased rates of hyperauthorship, and a decline in rates of single authorship. Sample sizes varied widely across papers and did not appear to change since 1989.

  11. Fiscal 1988 draft budget for nuclear energy up 1.9% to yen 369 billion

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    At the cabinet meeting held on December 28, the government approved the fiscal 1988 draft budget, with a general account of yen 56.6 trillion. The nuclear energy related budget is yen 181.124 billion from the general account and yen 186.098 billion from the special account for power sources development, totalling yen 367.222 billion, up 1.9% on the previous year. The largest appropriation goes to the Science and Technology Agency (STA), totalling yen 271 billion. The STA is promoting safety studies and R and D for extensive nuclear energy utilization, but the budget shows a 0.7% decrease from the previous year, reflecting completion of the construction of JT-60, which is one of the Agency's major projects. MITI, with its budget of yen 91 billion, will carry on policies related to the promotion of the commercial nuclear power program as well as support for the industrialization program of the nuclear fuel cycle. The nuclear related budget of the Ministry of Foreign Affairs is yen 2.8 billion, consisting mainly of IAEA subscriptions and contributions and OECD/NEA subscriptions. Besides these three government agencies, a large sum of yen 1.2 billion is allocated to the Okinawa Development Agency for the prevention and elimination of melon-flies in Kume Island and islands around Okinawa main island. The draft government budget will be submitted to the ordinary session of the Diet when it resumes towards the end of January. After deliberation in the Budget Committees of the House of Representatives and the House of Councilors, the draft budget will be put to the vote in the plenary session. Assuming that all proceeds smoothly, the budget is expected to be approved by the end of March without any major revision. (author)

  12. Working conditions of female part-time and full-time teachers in relation to health status.

    Science.gov (United States)

    Seibt, Reingard; Matz, Annerose; Hegewald, Janice; Spitzer, Silvia

    2012-08-01

    Teachers' volume of employment and health status are controversially discussed in the current literature. This study focused on female teachers with part-time versus full-time jobs in association with working conditions and health status depending on age. A sample of 263 part-time and 367 full-time female teachers (average age 46.7 ± 7.8 vs. 46.0 ± 6.3) participated in an occupational health screening. Specific work conditions, stressors (job history questionnaire) and the effort-reward-imbalance ratio (ERI-Q) were measured and their relationships to mental and physical health were analysed. Health status was quantified by complaints (BFB questionnaire), general mental health status (GHQ-12) and cardiovascular risk factors. On average, teachers in part-time positions reported 36 and those in full-time positions 42 working hours per week. The effort-reward ratios were significantly associated with the volume of employment; teachers in part-time jobs had only a slightly lower ERI ratio. There were no differences between full-time and part-time teachers regarding health status. Eighteen percent of both groups reported impaired mental health (GHQ ≥ 5), 48% of part-time teachers and 53% of full-time teachers suffered from high blood pressure. Low physical fitness was observed in 12% of part-time and 6% of full-time teachers. In this study, neither the volume of employment nor working conditions were found to be significantly correlated with health status. Part-time and full-time employment status did not appear to influence health in the teaching profession. Although there are differences in quantitative working demands, health status does not differ between the two teacher groups.

  13. Fast and effective determination of strontium-90 in high volumes water samples

    International Nuclear Information System (INIS)

    Basarabova, B.; Dulanska, S.

    2014-01-01

    A simple and fast method was developed for the determination of 90Sr in large volumes of water samples from the vicinity of nuclear power facilities. Samples were taken from the environment near the nuclear power plants in Jaslovske Bohunice and Mochovce in Slovakia. 90Sr was determined by solid phase extraction using the commercial sorbent Analig® Sr-01 from IBC Advanced Technologies, Inc. The determination was performed in dilute HNO3 solution (1.5-2 M) and was also tested in basic medium with NaOH. 90Sr was eluted with EDTA at pH 8-9. To achieve a fast determination, automation was applied, which brings a significant reduction of the separation time. Concentration of the water samples by evaporation was not necessary; separation was performed immediately after filtration of the analyzed samples. The aim of this study was the development of a less expensive, less time-demanding and energy-saving method for the determination of 90Sr in comparison with conventional methods. The separation time for a fast-flow run with a volume of 10 dm3 of water was 3.5 hours (flow rate approximately 3.2 dm3 per hour). The radiochemical strontium yield was traced using the radionuclide 85Sr. Samples were measured with an HPGe detector (high-purity germanium detector) at the energy Eγ = 514 keV. Using Analig® Sr-01, yields in the range 72-96% were achieved. Separation based on solid phase extraction with Analig® Sr-01 and automation offers a new, fast and effective method for the determination of 90Sr in a water matrix. After ingrowth of yttrium, samples were measured with a Packard Tricarb 2900 TR liquid scintillation spectrometer with QuantaSmart software. (authors)
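
    The role of the 85Sr tracer can be illustrated with a short chemical-yield correction in Python; all activities, count rates and efficiencies below are invented numbers, not results from this work.

      # Illustrative chemical-yield correction with an 85Sr tracer.
      added_sr85_bq = 50.0        # 85Sr activity spiked into the sample before separation
      recovered_sr85_bq = 42.5    # 85Sr activity measured by HPGe gamma spectrometry (514 keV)
      net_sr90_cpm = 18.4         # net 90Sr count rate from liquid scintillation counting
      counting_efficiency = 0.95  # assumed LSC counting efficiency
      sample_volume_dm3 = 10.0

      chemical_yield = recovered_sr85_bq / added_sr85_bq                  # fraction of Sr recovered
      activity_bq = net_sr90_cpm / 60.0 / counting_efficiency / chemical_yield
      print(f"yield = {chemical_yield:.2%}, 90Sr = {activity_bq / sample_volume_dm3:.4f} Bq/dm3")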

  14. National Low-Level Waste Management Program Radionuclide Report Series. Volume 10, Nickel-63

    International Nuclear Information System (INIS)

    Carboneau, M.L.; Adams, J.P.

    1995-02-01

    This report outlines the basic radiological, chemical, and physical characteristics of nickel-63 (63Ni) and examines how these characteristics affect the behavior of 63Ni in various environmental media, such as soils, groundwater, plants, animals, the atmosphere, and the human body. Discussions also include methods of 63Ni production, waste types, and waste forms that contain 63Ni. The primary source of 63Ni in the environment has been low-level radioactive waste material generated as a result of neutron activation of stable 62Ni that is present in the structural components of nuclear reactor vessels. 63Ni enters the environment from the dismantling activities associated with nuclear reactor decommissioning. However, small amounts of 63Ni have been detected in the environment following the testing of thermonuclear weapons in the South Pacific. Concentrations as high as 2.7 Bq per gram of sample (or equivalently 0.0022 parts per billion) were observed on Bikini Atoll (May 1954). 63Ni was not created as a fission product species (e.g., from 235U or 239Pu fissions), but instead was produced as a result of neutron capture in 62Ni, a common nickel isotope present in the stainless steel components of nuclear weapons (e.g., stainless-304 contains ∼9% total Ni or ∼0.3% 62Ni)

  15. Future fuels: Canada's coast-to-coast network of refineries is emerging from a $3-billion-plus spending binge to take the lead in producing low sulphur gasoline

    International Nuclear Information System (INIS)

    Lunan, D.

    2004-01-01

    A series of investments to convert Canada's 22 operating refineries to produce low-sulphur gasoline are discussed. The investment involves more than $3-billion that will transform Canada's portfolio of aging refineries into one of the most efficient in the western world, and in the process reduce sulphur content in Canadian gasoline to 30 ppm. In some cases the refitting will be completed years ahead of the required 2005 deadline. Total refining capacity in Canada is about 2.5 million barrels per day of crude oil, which includes 580,000 barrels per day of capacity that is dedicated to upgrading bitumen into synthetic crude oil. The initiative to update the refineries was led by Irving Oil, which launched a one billion dollar refit of its 250,000 barrels per day Saint John refinery in the year 2000. Irving Oil's efforts were driven by the company's marketing program in the United States where regional fuel quality standards are higher than national standards either in Canada or the United States. Shell Canada and Imperial Oil are also on track to meet the 30 ppm sulphur level ahead of schedule. For example, Shell Canada is cooperating with Suncor Energy Products in the construction of a hydrotreater at Suncor's Sarnia refinery which will be used to reduce sulphur content of diesel from both the Shell and Suncor refineries, while Imperial Oil is investing over $520 million to refit its refineries in Alberta, Ontario and Nova Scotia. Petro-Canada too, has embarked on a $450 million capital program late in 2003 to introduce low sulphur gasoline; this was in addition to the $1.2 billion program to integrate its bitumen production, upgrading and refining operations. Ultramar launched its $300 million desulphurization program in late 2002; the project is now nearing completion. Refit of Ultramar's Jean Gaulin refinery on Quebec's South Shore will also include a 30,000 barrels per day continuous regeneration platformer to provide a second hydrogen source for the

  16. The effect of duty hour regulation on resident surgical case volume in otolaryngology.

    Science.gov (United States)

    Curtis, Stuart H; Miller, Robert H; Weng, Cindy; Gurgel, Richard K

    2014-10-01

    Evaluate the effect of duty hour regulation on graduating otolaryngology resident surgical case volume and analyze trends in surgical case volume for Accreditation Council for Graduate Medical Education (ACGME) key indicator cases from 1996 to 2011. Time-trend analysis of surgical case volume. Nationwide sample of otolaryngology residency programs. Operative logs from the American Board of Otolaryngology and ACGME for otolaryngology residents graduating in the years 1996 to 2011. Key indicator volumes and grouped domain volumes before and after resident duty hour regulations (2003) were calculated and compared. Independent t test was performed to evaluate overall difference in operative volume. Wilcoxon rank sum test evaluated differences between procedures per time period. Linear regression evaluated trend. The average total number of key indicator cases per graduating resident was 440.8 in 1996-2003 compared to 500.4 cases in 2004-2011, and overall average per number of key indicators was 31.5 and 36.2, respectively (P = .067). Four key indicator cases showed statistically significant increases in volume. Duty hour regulation does not appear to have reduced overall operative case volume for graduating otolaryngology residents. The overall trend in operative volume is increasing for several specific key indicators. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2014.

  17. Improvement of 137Cs analysis in small volume seawater samples using the Ogoya underground facility

    International Nuclear Information System (INIS)

    Hirose, K.; Komura, K.; Kanazawa University, Ishikawa; Aoyama, M.; Igarashi, Y.

    2008-01-01

    137Cs in seawater is one of the most powerful tracers of water motion. Large volumes of samples have been required for the determination of 137Cs in seawater. This paper describes improvements to the separation and purification processes for 137Cs in seawater, which include purification of 137Cs using hexachloroplatinic acid in addition to ammonium phosphomolybdate (AMP) precipitation. As a result, we succeeded in determining 137Cs in seawater with a smaller sample volume of 10 liters by using ultra-low background gamma-spectrometry in the Ogoya underground facility. The 137Cs detection limit was about 0.1 mBq (counting time: 10^6 s). This method is applied to determine 137Cs in small samples of South Pacific deep waters. (author)

  18. Economic impacts of Alberta's oil sands, volume 1

    International Nuclear Information System (INIS)

    Timilsina, G.R.; LeBlanc, N.; Walden, T.

    2005-01-01

    In 2004, the international media recognized Alberta's oil sands as part of the global oil reserves, thereby establishing Canada as second only to Saudi Arabia among potential oil producing nations. The economic impacts of Alberta's oil sands industry were assessed at the regional, provincial and international levels for the 2000 to 2020 period. A customized input-output model was used to assess economic impacts, which were measured in terms of changes in gross domestic product; employment and labour income; and government revenues. Cumulative impacts on employment by sector and by jurisdiction were also presented. An investment of $100 billion is expected through 2020, resulting in production of crude bitumen and synthetic crude oil outputs valued at about $531 billion. The impact of the oil sands industry on local employment was also evaluated. It was shown that activities in the oil sands industry will lead to significant economic impact in Alberta, Ontario, Quebec and the rest of Canada. Alberta's local economy would be the main beneficiary of oil sands activities, with nearly 3.6 million person-years of employment created in Alberta during the 2000 to 2020 period. Another 3 million person-years of employment would be created in other Canadian provinces and outside Canada during the same time period. A sensitivity analysis on the responsiveness to oil prices and the removal of various constraints incorporated in the main analysis was also presented. The federal government will be the largest recipient of revenues generated by oil sands activities. The results of the study were compared with those of the National Task Force on Oil Sands Strategies. This first volume revealed the results of the study while the second volume includes the data and detailed results. 48 refs., 57 tabs., 28 figs

  19. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
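
    The post-contact regression step for approach curves can be illustrated with a short fit of a Hertzian sphere-on-half-space model in Python; the tip radius, Poisson ratio and synthetic data are assumptions, and this is a sketch of the general approach rather than the authors' processing pipeline.

      import numpy as np
      from scipy.optimize import curve_fit

      R_tip = 20e-9       # tip radius (m), assumed
      nu = 0.5            # Poisson ratio of the soft sample, assumed

      def hertz(delta, E):
          # Force (N) vs. indentation delta (m) for a sphere indenting a half-space.
          return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R_tip) * delta**1.5

      # Synthetic "measured" post-contact curve: a 50 kPa sample plus detector noise.
      delta = np.linspace(0, 200e-9, 100)
      force = hertz(delta, 5e4) + 1e-12 * np.random.default_rng(3).standard_normal(delta.size)

      # Regression of the model onto the indentation region of the approach curve.
      E_fit, _ = curve_fit(hertz, delta, force, p0=[1e4])
      print(f"fitted Young's modulus ~ {E_fit[0] / 1e3:.1f} kPa")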

  20. Mars methane detection and variability at Gale crater

    Science.gov (United States)

    Webster, Christopher R.; Mahaffy, Paul R.; Atreya, Sushil K.; Flesch, Gregory J.; Mischna, Michael A.; Meslin, Pierre-Yves; Farley, Kenneth A.; Conrad, Pamela G.; Christensen, Lance E.; Pavlov, Alexander A.; Martín-Torres, Javier; Zorzano, María-Paz; McConnochie, Timothy H.; Owen, Tobias; Eigenbrode, Jennifer L.; Glavin, Daniel P.; Steele, Andrew; Malespin, Charles A.; Archer, P. Douglas; Sutter, Brad; Coll, Patrice; Freissinet, Caroline; McKay, Christopher P.; Moores, John E.; Schwenzer, Susanne P.; Bridges, John C.; Navarro-Gonzalez, Rafael; Gellert, Ralf; Lemmon, Mark T.; MSL Science Team; Abbey, William; Achilles, Cherie; Agard, Christophe; Alexandre Alves Verdasca, José; Anderson, Dana; Anderson, Robert C.; Anderson, Ryan B.; Appel, Jan Kristoffer; Archer, Paul Douglas; Arevalo, Ricardo; Armiens-Aparicio, Carlos; Arvidson, Raymond; Atlaskin, Evgeny; Atreya, Andrew Sushil; Azeez, Aubrey Sherif; Baker, Burt; Baker, Michael; Balic-Zunic, Tonci; Baratoux, David; Baroukh, Julien; Barraclough, Bruce; Battalio, Michael; Beach, Michael; Bean, Keri; Beck, Pierre; Becker, Richard; Beegle, Luther; Behar, Alberto; Belgacem, Inès; Bell, James F., III; Bender, Steven; Benna, Mehdi; Bentz, Jennifer; Berger, Jeffrey; Berger, Thomas; Berlanga, Genesis; Berman, Daniel; Bish, David; Blacksberg, Jordana; Blake, David F.; José Blanco, Juan; Blaney, Ávalos Diana; Blank, Jennifer; Blau, Hannah; Bleacher, Lora; Boehm, Eckart; Bonnet, Jean-Yves; Botta, Oliver; Böttcher, Stephan; Boucher, Thomas; Bower, Hannah; Boyd, Nick; Boynton, William; Braswell, Shaneen; Breves, Elly; Bridges, John C.; Bridges, Nathan; Brinckerhoff, William; Brinza, David; Bristow, Thomas; Brunet, Claude; Brunner, Anna; Brunner, Will; Buch, Arnaud; Bullock, Mark; Burmeister, Sönke; Burton, John; Buz, Jennifer; Cabane, Michel; Calef, Fred; Cameron, James; Campbell, John L.; Cantor, Bruce; Caplinger, Michael; Clifton, Carey, Jr.; Caride Rodríguez, Javier; Carmosino, Marco; Carrasco Blázquez, Isaías; Cavanagh, Patrick; Charpentier, Antoine; Chipera, Steve; Choi, David; Christensen, Lance; Clark, Benton; Clegg, Sam; Cleghorn, Timothy; Cloutis, Ed; Cody, George; Coll, Patrice; Coman, Ecaterina I.; Conrad, Pamela; Coscia, David; Cousin, Agnès; Cremers, David; Crisp, Joy A.; Cropper, Kevin; Cros, Alain; Cucinotta, Francis; d'Uston, Claude; Davis, Scott; Day, Mackenzie; Daydou, Yves; DeFlores, Lauren; Dehouck, Erwin; Delapp, Dorothea; DeMarines, Julia; Dequaire, Tristan; Des Marais, David; Desrousseaux, Roch; Dietrich, William; Dingler, Robert; Domagal-Goldman, Shawn; Donny, Christophe; Downs, Robert; Drake, Darrell; Dromart, Gilles; Dupont, Audrey; Duston, Brian; Dworkin, Jason P.; Dyar, M. 
Darby; Edgar, Lauren; Edgett, Kenneth; Edwards, Christopher S.; Edwards, Laurence; Edwards, Peter; Ehlmann, Bethany; Ehresmann, Bent; Eigenbrode, Jennifer; Elliott, Beverley; Elliott, Harvey; Ewing, Ryan; Fabre, Cécile; Fairén, Alberto; Fairén, Alberto; Farley, Kenneth; Farmer, Jack; Fassett, Caleb; Favot, Laurent; Fay, Donald; Fedosov, Fedor; Feldman, Jason; Fendrich, Kim; Fischer, Erik; Fisk, Martin; Fitzgibbon, Mike; Flesch, Gregory; Floyd, Melissa; Flückiger, Lorenzo; Forni, Olivier; Fox, Valerie; Fraeman, Abigail; Francis, Raymond; François, Pascaline; Franz, Heather; Freissinet, Caroline; French, Katherine Louise; Frydenvang, Jens; Garvin, James; Gasnault, Olivier; Geffroy, Claude; Gellert, Ralf; Genzer, Maria; Getty, Stephanie; Glavin, Daniel; Godber, Austin; Goesmann, Fred; Goetz, Walter; Golovin, Dmitry; Gómez Gómez, Felipe; Gómez-Elvira, Javier; Gondet, Brigitte; Gordon, Suzanne; Gorevan, Stephen; Graham, Heather; Grant, John; Grinspoon, David; Grotzinger, John; Guillemot, Philippe; Guo, Jingnan; Gupta, Sanjeev; Guzewich, Scott; Haberle, Robert; Halleaux, Douglas; Hallet, Bernard; Hamilton, Victoria; Hand, Kevin; Hardgrove, Craig; Hardy, Keian; Harker, David; Harpold, Daniel; Harri, Ari-Matti; Harshman, Karl; Hassler, Donald; Haukka, Harri; Hayes, Alexander; Herkenhoff, Kenneth; Herrera, Paul; Hettrich, Sebastian; Heydari, Ezat; Hipkin, Victoria; Hoehler, Tori; Hollingsworth, Jeff; Hudgins, Judy; Huntress, Wesley; Hurowitz, Joel; Hviid, Stubbe; Iagnemma, Karl; Indyk, Stephen; Israël, Guy; Jackson, Ryan Steele; Jacob, Samantha; Jakosky, Bruce; Jean-Rigaud, Laurent; Jensen, Elsa; Kløvgaard Jensen, Jaqueline; Johnson, Jeffrey R.; Johnson, Micah; Johnstone, Stephen; Jones, Andrea; Jones, John H.; Joseph, Jonathan; Joulin, Mélissa; Jun, Insoo; Kah, Linda C.; Kahanpää, Henrik; Kahre, Melinda; Kaplan, Hannah; Karpushkina, Natalya; Kashyap, Srishti; Kauhanen, Janne; Keely, Leslie; Kelley, Simon; Kempe, Fabian; Kemppinen, Osku; Kennedy, Megan R.; Keymeulen, Didier; Kharytonov, Alexander; Kim, Myung-Hee; Kinch, Kjartan; King, Penelope; Kirk, Randolph; Kirkland, Laurel; Kloos, Jacob; Kocurek, Gary; Koefoed, Asmus; Köhler, Jan; Kortmann, Onno; Kotrc, Benjamin; Kozyrev, Alexander; Krau, Johannes; Krezoski, ß. Gillian; Kronyak, Rachel; Krysak, Daniel; Kuzmin, Ruslan; Lacour, Jean-Luc; Lafaille, Vivian; Langevin, Yves; Lanza, Nina; Lapôtre, Mathieu; Larif, Marie-France; Lasue, Jérémie; Le Deit, Laetitia; Le Mouélic, Stéphane; Lee, Ella Mae; Lee, Qiu-Mei; Lee, Rebekka; Lees, David; Lefavor, Matthew; Lemmon, Mark; Lepinette, Alain; Lepore, Malvitte Kate; Leshin, Laurie; Léveillé, Richard; Lewin, Éric; Lewis, Kevin; Li, Shuai; Lichtenberg, Kimberly; Lipkaman, Leslie; Lisov, Denis; Little, Cynthia; Litvak, Maxim; Liu, Lu; Lohf, Henning; Lorigny, Eric; Lugmair, Günter; Lundberg, Angela; Lyness, Eric; Madsen, Morten Bo; Magee, Angela; Mahaffy, Paul; Maki, Justin; Mäkinen, Teemu; Malakhov, Alexey; Malespin, Charles; Malin, Michael; Mangold, Nicolas; Manhes, Gerard; Manning, Heidi; Marchand, Geneviève; Marín Jiménez, Mercedes; Martín García, César; Martin, David K.; Martin, Mildred; Martin, Peter; Martínez Martínez, Germán; Martínez-Frías, Jesús; Martín-Sauceda, Jaime; Martín-Soler, Martín Javier; Martín-Torres, F. 
Javier; Mason, Emily; Matthews, Tristan; Matthiä, Daniel; Mauchien, Patrick; Maurice, Sylvestre; McAdam, Amy; McBride, Marie; McCartney, Elaina; McConnochie, Timothy; McCullough, Emily; McEwan, Ian; McKay, Christopher; McLain, Hannah; McLennan, Scott; McNair, Sean; Melikechi, Noureddine; Mendaza de Cal, Teresa; Merikallio, Sini; Merritt, Sean; Meslin, Pierre-Yves; Meyer, Michael; Mezzacappa, Alissa; Milkovich, Sarah; Millan, Maëva; Miller, Hayden; Miller, Kristen; Milliken, Ralph; Ming, Douglas; Minitti, Michelle; Mischna, Michael; Mitchell, Julie; Mitrofanov, Igor; Moersch, Jeffrey; Mokrousov, Maxim; Molina, Antonio; Moore, Jurado Casey; Moores, John E.; Mora-Sotomayor, Luis; Moreno, Gines; Morookian, John Michael; Morris, Richard V.; Morrison, Shaunna; Mousset, Valérie; Mrigakshi, Alankrita; Mueller-Mellin, Reinhold; Muller, Jan-Peter; Muñoz Caro, Guillermo; Nachon, Marion; Nastan, Abbey; Navarro López, Sara; Navarro González, Rafael; Nealson, Kenneth; Nefian, Ara; Nelson, Tony; Newcombe, Megan; Newman, Claire; Newsom, Horton; Nikiforov, Sergey; Nikitczuk, Matthew; Niles, Paul; Nixon, Brian; Noblet, Audrey; Noe, Eldar; Nolan, Dobrea Thomas; Oehler, Dorothy; Ollila, Ann; Olson, Timothy; Orthen, Tobias; Owen, Tobias; Ozanne, Marie; de Pablo Hernández, Miguel Ángel; Pagel, Hannah; Paillet, Alexis; Pallier, Etienne; Palucis, Marisa; Parker, Timothy; Parot, Yann; Parra, Alex; Patel, Kiran; Paton, Mark; Paulsen, Gale; Pavlov, Alexander; Pavri, Betina; Peinado-González, Verónica; Pepin, Robert; Peret, Laurent; Pérez, René; Perrett, Glynis; Peterson, Joseph; Pilorget, Cedric; Pinet, Patrick; Pinnick, Veronica; Pla-García, Jorge; Plante, Ianik; Poitrasson, Franck; Polkko, Jouni; Popa, Radu; Posiolova, Liliya; Posner, Arik; Pradler, Irina; Prats, Benito; Prokhorov, Vasily; Raaen, Eric; Radziemski, Leon; Rafkin, Scot; Ramos, Miguel; Rampe, Elizabeth; Rapin, William; Raulin, François; Ravine, Michael; Reitz, Günther; Ren, Jun; Rennó, Nilton; Rice, Melissa; Richardson, Mark; Ritter, Birgit; Rivera-Hernández, Frances; Robert, François; Robertson, Kevin; Rodriguez Manfredi, José Antonio; José Romeral-Planelló, Julio; Rowland, Scott; Rubin, David; Saccoccio, Muriel; Said, David; Salamon, Andrew; Sanin, Anton; Sans Fuentes, Sara Alejandra; Saper, Lee; Sarrazin, Philippe; Sautter, Violaine; Savijärvi, Hannu; Schieber, Juergen; Schmidt, Mariek; Schmidt, Walter; Scholes, Daniel; Schoppers, Marcel; Schröder, Susanne; Schwenzer, Susanne P.; Sciascia Borlina, Cauê; Scodary, Anthony; Sebastián Martínez, Eduardo; Sengstacken, Aaron; Shechet, Jennifer Griffes; Shterts, Ruslan; Siebach, Kirsten; Siili, Tero; Simmonds, John J.; Sirven, Jean-Baptiste; Slavney, Susan; Sletten, Ronald; Smith, Michael D.; Sobron Sanchez, Pablo; Spanovich, Nicole; Spray, John; Spring, Justin; Squyres, Steven; Stack, Katie; Stalport, Fabien; Starr, Richard; Stein, Andrew Steele Thomas; Stern, Jennifer; Stewart, Noel; Stewart, Wayne; Stipp, Svane Susan Louise; Stoiber, Kevin; Stolper, Edward; Sucharski, Robert; Sullivan, Robert; Summons, Roger; Sumner, Dawn Y.; Sun, Vivian; Supulver, Kimberley; Sutter, Brad; Szopa, Cyril; Tan, Florence; Tate, Christopher; Teinturier, Samuel; ten Kate, Inge Loes; Thomas, Alicia; Thomas, Peter; Thompson, Lucy; Thuillier, Franck; Thulliez, Emmanual; Tokar, Robert; Toplis, Michael; de la Torre Juárez, Manuel; Torres Redondo, Josefina; Trainer, Melissa; Treiman, Allan; Tretyakov, Vladislav; Ullán-Nieto, Aurora; Urqui-O'Callaghan, Roser; Valentín-Serrano, Patricia; Van Beek, Jason; Van Beek, Tessa; VanBommel, 
Scott; Vaniman, David; Varenikov, Alexey; Vasavada, Ashwin R.; Vasconcelos, Paulo; de Vicente-Retortillo Rubalcaba, Álvaro; Vicenzi, Edward; Vostrukhin, Andrey; Voytek, Mary; Wadhwa, Meenakshi; Ward, Jennifer; Watkins, Jessica; Webster, Christopher R.; Weigle, Gerald; Wellington, Danika; Westall, Frances; Wiens, Roger; Wilhelm, Mary Beth; Williams, Amy; Williams, Joshua; Williams, Rebecca; Williams, Richard B.; Williford, Kenneth; Wilson, Michael A.; Wilson, Sharon A.; Wimmer-Schweingruber, Robert; Wolff, Michael; Wong, Michael; Wray, James; Yana, Charles; Yen, Albert; Yingst, Aileen; Zeitlin, Cary; Zimdar, Robert; Zorzano Mier, María-Paz

    2015-01-01

    Reports of plumes or patches of methane in the martian atmosphere that vary over monthly time scales have defied explanation to date. From in situ measurements made over a 20-month period by the tunable laser spectrometer of the Sample Analysis at Mars instrument suite on Curiosity at Gale crater, we report detection of background levels of atmospheric methane of mean value 0.69 ± 0.25 parts per billion by volume (ppbv) at the 95% confidence interval (CI). This abundance is lower than model estimates of ultraviolet degradation of accreted interplanetary dust particles or carbonaceous chondrite material. Additionally, in four sequential measurements spanning a 60-sol period (where 1 sol is a martian day), we observed elevated levels of methane of 7.2 ± 2.1 ppbv (95% CI), implying that Mars is episodically producing methane from an additional unknown source.

  1. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    Science.gov (United States)

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the
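
    The efficiency-curve fitting and bias comparison described above can be illustrated with a short sketch. The sigmoid form, the data values and the reference parameters below are assumptions for illustration only, not the study's actual curve or data.

        # Minimal sketch (illustrative data, not the study's): fit a three-parameter
        # sigmoid E(d) = 1 / (1 + (d/d50)**s)**g to sampling-efficiency points, where
        # d is the aerodynamic diameter (um), d50 a cut-point, s a slope and g a shape
        # parameter, then express bias against a hypothetical reference curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid_efficiency(d, d50, s, g):
            return 1.0 / (1.0 + (d / d50) ** s) ** g

        diam = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9], dtype=float)   # um MMAD
        eff = np.array([0.97, 0.93, 0.82, 0.55, 0.30, 0.15, 0.07, 0.03, 0.01])

        popt, _ = curve_fit(sigmoid_efficiency, diam, eff, p0=[4.0, 3.0, 1.0])
        print("fitted d50=%.2f um, slope=%.2f, shape=%.2f" % tuple(popt))

        # Percent bias at each test size relative to a hypothetical reference curve:
        ref_eff = sigmoid_efficiency(diam, 4.25, 3.0, 1.0)
        bias = (sigmoid_efficiency(diam, *popt) - ref_eff) / ref_eff
        print(np.round(bias * 100, 1))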

  2. Generation of sub-part-per-billion gaseous volatile organic compounds at ambient temperature by headspace diffusion of aqueous standards through decoupling between ideal and nonideal Henry's law behavior.

    Science.gov (United States)

    Kim, Yong-Hyun; Kim, Ki-Hyun

    2013-05-21

    In the analysis of volatile organic compounds in air, the preparation of their gaseous standards at low (sub-ppb) concentration levels with high reliability is quite difficult. In this study, a simple dynamic headspace-based approach was evaluated as a means of generating vapor-phase volatile organic compounds from a liquid standard in an impinger at ambient temperature (25 °C). For a given sampling time, volatile organic compound vapor formed in the headspace was swept by bypassing the sweep gas through the impinger and collected four times in quick succession in separate sorbent tubes. In each experiment, a fresh liquid sample was used for each of the four sampling times (5, 10, 20, and 30 min) at a steady flow rate of 50 mL min(-1). The air-water partitioning at the most dynamic (earliest) sweeping stage was established initially in accord with ideal Henry's law, which was then followed by considerably reduced partitioning in a steady-state equilibrium (non-ideal Henry's law). The concentrations of gaseous volatile organic compounds, collected after the steady-state equilibrium, reached fairly constant values: for instance, the mole fraction of toluene measured at a sweeping interval of 10 and 30 min averaged 1.10 and 0.99 nmol mol(-1), respectively (after the initial 10 min sampling). In the second stage of our experiment, the effect of increasing the concentrations of liquid spiking standard was also examined by collecting sweep gas samples from two consecutive 10 min runs. The volatile organic compounds, collected in the first and second 10 min sweep gas samples, exhibited ideal and nonideal Henry's law behavior, respectively. From this observation, we established numerical relationships to predict the mole fraction (or mixing ratio) of each volatile organic compound in steady-state equilibrium in relation to the concentration of standard spiked into the system. This experimental approach can thus be used to produce sub-ppb levels of gaseous volatile organic
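
    As a rough illustration of the ideal Henry's-law stage described above, the sketch below predicts the headspace mole fraction of toluene from an assumed aqueous concentration and an approximate dimensionless Henry constant; both numbers are assumptions, not values from the study.

        # Minimal sketch (assumed values): ideal Henry's-law partitioning
        # C_gas = H_cc * C_aq with a dimensionless (gas/aqueous) constant H_cc,
        # converted to a mole fraction in nmol per mol (ppb) of headspace gas.
        R = 8.314            # J mol-1 K-1
        T = 298.15           # K (25 degC)

        C_aq_ug_per_L = 0.015    # assumed aqueous toluene concentration, ug L-1
        H_cc = 0.27              # approximate dimensionless Henry constant of toluene at 25 degC
        M = 92.14                # g mol-1, toluene

        C_gas_ug_per_L = H_cc * C_aq_ug_per_L           # ug per litre of headspace gas
        mol_voc_per_L_gas = C_gas_ug_per_L * 1e-6 / M   # mol VOC per litre of gas
        mol_air_per_L = 101325.0 / (R * T) / 1000.0     # mol of air per litre at 1 atm
        ppb = mol_voc_per_L_gas / mol_air_per_L * 1e9
        print("predicted mole fraction: %.2f nmol mol-1" % ppb)   # roughly 1 ppb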

  3. Areva excellent business volume: backlog as of december 31, 2008: + 21.1% to 48.2 billion euros. 2008 revenue: + 10.4% to 13.2 billion euros; Areva excellent niveau d'activite: carnet de commandes au 31/12/2008: + 21,1% a 48,2 Mds d'euros. Chiffre d'affaires de l'exercice 2008: + 10,4% a 13,2 Mds d'euros

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-07-01

    AREVA's backlog stood at 48.2 billion euros as of December 31, 2008, for 21.1% growth year-on-year, including 21.8% growth in Nuclear and 16.5% growth in Transmission and Distribution. The Nuclear backlog came to 42.5 billion euros at December 31, 2008. The Transmission and Distribution backlog came to 5.7 billion euros at year-end. The group recognized revenue of 13.2 billion euros in 2008, for year-on-year growth of 10.4% (+9.8% like-for-like). Revenue outside France was up 10.5% to 9.5 billion euros, representing 72% of total revenue. Revenue was up 6.5% in the Nuclear businesses (up 6.3% LFL), with strong performance in the Reactors and Services division (+10.9% LFL) and the Front End division (+7.2% LFL). The Transmission and Distribution division recorded growth of 17% (+15.8% LFL). Revenue for the fourth quarter of 2008 rose to 4.1 billion euros, up 5.2% (+1.6% LFL) from that of the fourth quarter of 2007. Revenue for the Front End division rose to 3.363 billion euros in 2008, up 7.1% over 2007 (+7.2% LFL). Foreign exchange (currency translations) had a negative impact of 53 million euros. Revenue for the Reactors and Services division rose to 3.037 billion euros, up 11.8% over 2007 (+10.9% LFL). Foreign exchange (currency translations) had a negative impact of 47 million euros. Revenue for the Back End division came to 1.692 billion euros, a drop of 2.7% (-2.5% LFL). Foreign exchange (currency translations) had a negative impact of 3.5 million euros. Revenue for the Transmission and Distribution division rose to 5.065 billion euros in 2008, up 17.0% (+15.8% LFL)

  4. National comparison on volume sample activity measurement methods

    International Nuclear Information System (INIS)

    Sahagia, M.; Grigorescu, E.L.; Popescu, C.; Razdolescu, C.

    1992-01-01

    A national comparison of volume sample activity measurement methods may be regarded as a step toward accomplishing the traceability of environmental and food chain activity measurements to national standards. For this purpose, the Radionuclide Metrology Laboratory has distributed 137Cs and 134Cs water-equivalent solid standard sources to 24 laboratories having responsibilities in this matter. Every laboratory has to measure the activity of the received source(s) using its own standards, equipment and methods and report the results to the organizer. The 'measured activities' will then be compared with the 'true activities'. A final report will be issued to evaluate the national level of precision of such measurements and to give suggestions for improvement. (Author)

  5. Impact of blood sampling in very preterm infants

    DEFF Research Database (Denmark)

    Madsen, L P; Rasmussen, M K; Bjerregaard, L L

    2000-01-01

    ; the groups were then subdivided into critically ill or not. Diagnostic blood sampling and blood transfusion events were recorded. In total, 1905 blood samples (5,253 analyses) were performed, corresponding to 0.7 samples (1.9 analyses) per day per infant. The highest frequencies were found during the first....../kg. For the extremely preterm infants a significant correlation between sampled and transfused blood volume was found (mean 37.1 and 33.3 ml/kg, respectively, r = + 0.71, p = 0.0003). The most frequently requested analyses were glucose, sodium and potassium. Few blood gas analyses were requested (1.9/infant). No blood...... losses attributable to excessively generous sampling were detected. The results show an acceptably low frequency of sampling and transfusion events for infants of GA 28-32 weeks. The study emphasizes the necessity of thorough reflection and monitoring of blood losses when ordering blood sampling...

  6. 40 CFR Appendix II to Part 600 - Sample Fuel Economy Calculations

    Science.gov (United States)

    2010-07-01

    Excerpt from 40 CFR Part 600 (Fuel Economy and Carbon-Related Exhaust Emissions of Motor Vehicles), Appendix II, Sample Fuel Economy Calculations: "(a) This sample fuel economy calculation is applicable to..."

  7. SAMPLING IN EXTERNAL AUDIT - THE MONETARY UNIT SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    E. Dascalu

    2016-12-01

    This article addresses the general issue of reducing the space of evidence to be investigated in audit activities by means of sampling techniques, given that, when the data volume is significant, an exhaustive examination of the assessed population is not possible and/or effective. The presentation deals with sampling risk, in essence the risk that a selected sample may not be representative of the overall population, in correlation with the audit risk model and its component parts (inherent risk, control risk and non-detection risk), and highlights the interdependencies between these two models.
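
    The audit risk model and the monetary unit sampling method named in the title can be made concrete with a small numerical sketch; all risk levels and monetary amounts below are assumed for illustration.

        # Minimal sketch (assumed figures): the multiplicative audit risk model
        # AR = IR * CR * DR solved for the acceptable detection risk, and a simple
        # monetary unit sampling (MUS) interval derived from it with a Poisson
        # reliability factor for zero expected errors.
        import math

        audit_risk = 0.05      # overall acceptable audit risk
        inherent = 0.80        # assessed inherent risk
        control = 0.50         # assessed control risk

        detection_risk = audit_risk / (inherent * control)
        print("acceptable detection risk: %.3f" % detection_risk)      # 0.125

        population_value = 4_000_000.0     # monetary units in the population
        tolerable_misstatement = 200_000.0
        reliability_factor = -math.log(detection_risk)
        interval = tolerable_misstatement / reliability_factor
        sample_size = math.ceil(population_value / interval)
        print("sampling interval: %.0f, sample size: %d" % (interval, sample_size))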

  8. Radiological status of the Saint-Pierre (Cantal) uranium mine and of its environment. Volume 1 + volume 2 + volume 3

    International Nuclear Information System (INIS)

    2006-01-01

    The first volume reports detailed investigations performed by the CRIIRAD (sampling and radiological controls of sediments and, more particularly, of waters and aquatic plants) in different areas around the Saint-Pierre uranium mine in France (Cantal department). Measurements were performed by gamma spectrometry. The second volume concerns external exposure and the radiological characterization of soils. The third volume reports on the presence of radon in outdoor and indoor air. For each of these aspects, methodological information is given along with a large number of measurement results.

  9. 12 CFR Appendix B to Part 216 - Sample Clauses

    Science.gov (United States)

    2010-01-01

    Excerpt from 12 CFR Part 216 (Privacy of Consumer Financial Information, Regulation P), Appendix B, Sample Clauses: sample privacy-notice wording, including directions such as "call the following toll-free number: (insert number)" and clause A-7, Confidentiality and security.

  10. No association between Centers for Medicare and Medicaid services payments and volume of Medicare beneficiaries or per-capita health care costs for each state.

    Science.gov (United States)

    Harewood, Gavin C; Alsaffar, Omar

    2015-03-01

    The Centers for Medicare and Medicaid Services recently published data on Medicare payments to physicians for 2012. We investigated regional variations in payments to gastroenterologists and evaluated whether payments correlated with the number of Medicare patients in each state. We found that the mean payment per gastroenterologist in each state ranged from $35,293 in Minnesota to $175,028 in Mississippi. Adjusted per-physician payments ranged from $11 per patient in Hawaii to $62 per patient in Washington, DC. There was no correlation between the mean per-physician payment and the mean number of Medicare patients per physician (r = 0.09); there was also no correlation between the mean per-physician payment and the overall mean per-capita health care costs for each state (r = -0.22). There was a 5.6-fold difference between the states with the lowest and highest adjusted Medicare payments to gastroenterologists. Therefore, the Centers for Medicare and Medicaid Services payments do not appear to be associated with the volume of Medicare beneficiaries or overall per-capita health care costs for each state. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
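
    The correlation check reported above amounts to a Pearson coefficient computed across states; the sketch below shows the calculation on made-up state-level figures (only the two quoted extremes are real).

        # Minimal sketch (mostly made-up figures): Pearson correlation between mean
        # Medicare payment per gastroenterologist and mean number of Medicare
        # patients per physician across states.
        import numpy as np

        payment_per_physician = np.array([35293, 61000, 88000, 120000, 175028], float)
        patients_per_physician = np.array([410, 650, 380, 520, 470], float)

        r = np.corrcoef(payment_per_physician, patients_per_physician)[0, 1]
        print("Pearson r = %.2f" % r)   # a value near zero indicates no association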

  11. Thermal modeling of core sampling in flammable gas waste tanks. Part 1: Push-mode sampling

    International Nuclear Information System (INIS)

    Unal, C.; Stroh, K.; Pasamehmetoglu, K.O.

    1997-01-01

    The radioactive waste stored in underground storage tanks at the Hanford site is routinely sampled for waste characterization purposes. Push- and rotary-mode core sampling are among the sampling methods employed. The waste includes mixtures of sodium nitrate and sodium nitrite with organic compounds that can produce violent exothermic reactions if heated above 160 C during core sampling. A self-propagating waste reaction would produce very high temperatures that eventually result in failure of the tank and releases of radioactive material to the environment. A two-dimensional thermal model based on a lumped finite volume analysis method is developed. The enthalpy of each node is calculated from the first law of thermodynamics. A flash temperature and an effective contact area concept were introduced to account for the interface temperature rise. No maximum temperature rise exceeding the critical value of 60 C was found in the cases studied for normal operating conditions. Several accident conditions were also examined. In these cases it was found that the maximum drill bit temperature remained below the critical reaction temperature as long as a 30 scfm purge flow was provided to the drill bit during sampling in rotary mode. Failure to provide purge flow resulted in exceeding the limiting temperatures in a relatively short time.
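
    The lumped finite-volume idea described above can be sketched as a single-node energy balance integrated in time; the geometry, heat input and conductance below are assumptions, not the authors' model parameters.

        # Minimal sketch (assumed parameters, not the authors' model): explicit
        # lumped-node energy balance dT/dt = (q_in - UA*(T - T_waste)) / (m*cp)
        # for a drill-bit node, flagging when the 60 C temperature-rise criterion
        # quoted above would be reached.
        m, cp = 0.5, 500.0      # node mass (kg) and specific heat (J kg-1 K-1)
        UA = 2.0                # effective conductance to the surrounding waste, W K-1
        q_friction = 150.0      # frictional heat input, W
        T_waste = 30.0          # waste temperature, C
        T = T_waste             # initial bit temperature, C
        dt, t = 0.1, 0.0        # time step and clock, s

        while t < 600.0:
            dTdt = (q_friction - UA * (T - T_waste)) / (m * cp)
            T += dTdt * dt
            t += dt
            if T - T_waste >= 60.0:
                print("60 C rise reached at t = %.1f s" % t)
                break
        else:
            print("temperature rise after 600 s: %.1f C" % (T - T_waste))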

  12. Suitability of the line intersect method for sampling hardwood logging residues

    Science.gov (United States)

    A. Jeff Martin

    1976-01-01

    The line intersect method of sampling logging residues was tested in Appalachian hardwoods and was found to provide unbiased estimates of the volume of residue in cubic feet per acre. Thirty-two chains of sample line were established on each of sixteen 1-acre plots on cutover areas in a variety of conditions. Estimates from these samples were then compared to actual...
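
    The estimator behind the method is commonly written as volume per unit area = pi^2 * sum(d_i^2) / (8 * L), with d_i the piece diameters at the intersection points and L the total length of sample line; the sketch below applies it to invented diameters on the 32 chains of line mentioned above.

        # Minimal sketch of the usual line-intersect estimator (Van Wagner form);
        # the intersected-piece diameters are invented for illustration.
        import math

        L_ft = 32 * 66.0                                  # 32 chains of sample line, in feet
        diameters_in = [4.0, 6.5, 3.0, 8.0, 5.5, 2.5]     # piece diameters at intersections, inches
        d_ft = [d / 12.0 for d in diameters_in]

        volume_cuft_per_sqft = math.pi ** 2 / (8.0 * L_ft) * sum(d ** 2 for d in d_ft)
        volume_cuft_per_acre = volume_cuft_per_sqft * 43560.0
        print("residue volume: %.1f cubic feet per acre" % volume_cuft_per_acre)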

  13. Working Paper 5: Beyond Collier's Bottom Billion | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-16

    The heart of the narrative presented in the book is that a group of almost 60 countries, with a population of about a billion people, are caught in four main traps. Their prospects for escaping the traps are poor, and they need a set of actions from the international community to achieve the rapid rates of growth ...

  14. Congress OKs $2 Billion Boost for the NIH.

    Science.gov (United States)

    2017-07-01

    President Donald Trump last week signed a $1.1 trillion spending bill for fiscal year 2017, including a welcome $2 billion boost for the NIH that will support former Vice President Joe Biden's Cancer Moonshot initiative, among other priorities. However, researchers who rely heavily on NIH grant funding remain concerned about proposed cuts for 2018. ©2017 American Association for Cancer Research.

  15. 12 CFR Appendix B to Part 573 - Sample Clauses

    Science.gov (United States)

    2010-01-01

    Excerpt from 12 CFR Part 573 (Privacy of Consumer Financial Information), Appendix B, Sample Clauses (link to an amendment published at 74 FR...): sample privacy-notice wording, including directions such as "call the following toll-free number: (insert number)" and clause A-7, Confidentiality and security.

  16. Guidelines for the safety system of a plant for photovoltaic solar cell applications fed with the poisonous gas H{sub 2}Se; Linee guida per la realizzazione di un sistema di sicurezza per un impianto di selenizzazione impiegante gas tossico H{sub 2}Se

    Energy Technology Data Exchange (ETDEWEB)

    Pellegrino, M; Agati, A [ENEA, Centro Ricerche Casaccia, Portici. Naples (Italy). Dip. Energia

    1996-12-01

    The report describes the safety system designed and installed at the ENEA (Italian Agency for New Technologies, Energy and the Environment) Research Center in Portici for the plant used to form copper indium diselenide and/or disulfide, CuIn(Se,S){sub 2}, for photovoltaic solar cell applications. The plant is a diffusion furnace that must be fed with gases such as the poisonous H{sub 2}S and the very toxic H{sub 2}Se, whose Threshold Limit Values are 10 ppm (parts per million) and 50 ppb (parts per billion), respectively, as well as with flammable and explosive H{sub 2}. The TLV is the maximum concentration of a gas to which a worker can be exposed during a working shift, i.e. eight hours a day, five days a week, for thirty-five years, without adverse effects on health. A description of the scientific results obtained with this facility is beyond the scope of the report; nor is the report intended as a handbook for safe toxic gas handling, examples of which can be found in the specialized literature. Its purpose is only to provide guidelines for the realization of such a system.
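
    For reference, the threshold limit values quoted above can be converted from volume mixing ratios to mass concentrations with the standard relation mg/m3 = ppm x molecular weight / 24.45 (molar volume of an ideal gas at 25 deg C and 1 atm); the short sketch below applies it to H{sub 2}S and H{sub 2}Se.

        # Minimal sketch: ppm (or ppb expressed as ppm) to mg/m3 at 25 C and 1 atm.
        MOLAR_VOLUME_L = 24.45

        def ppm_to_mg_per_m3(ppm, molecular_weight):
            return ppm * molecular_weight / MOLAR_VOLUME_L

        print("H2S  TLV 10 ppm -> %.1f mg/m3" % ppm_to_mg_per_m3(10.0, 34.08))
        print("H2Se TLV 50 ppb -> %.3f mg/m3" % ppm_to_mg_per_m3(0.050, 80.98))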

  17. A VOLUME-LIMITED PHOTOMETRIC SURVEY OF 114 γ DORADUS CANDIDATES

    International Nuclear Information System (INIS)

    Henry, Gregory W.; Fekel, Francis C.; Henry, Stephen M.

    2011-01-01

    We have carried out a photometric survey of a complete, volume-limited sample of γ Doradus candidates. The sample was extracted from the Hipparcos catalog and consists of 114 stars with colors and absolute magnitudes within the range of known γ Doradus stars and that also lie within a specified volume of 266,600 pc³. We devoted one year of observing time with our T12 0.8 m automatic photometric telescope to acquire nightly observations of the complete sample of stars. From these survey observations, we identify 37 stars with intrinsic variability of 0.002 mag or more. Of these 37 variables, 8 have already been confirmed as γ Doradus stars in our earlier papers; we scheduled the remaining 29 variables on our T3 0.4 m automatic telescope to acquire more intensive observations over the next two years. As promising new γ Doradus candidates were identified from the photometry, we obtained complementary spectroscopic observations of each candidate with the Kitt Peak coude feed telescope. Analysis of our new photometric and spectroscopic data reveals 15 new γ Doradus variables (and confirms two others), 8 new δ Scuti variables (and confirms one other), and 3 new variables with unresolved periodicity. Therefore, of the 114 γ Doradus candidates in our volume-limited sample, we find 25 stars that are new or previously known γ Doradus variables. This results in an incidence of 22% for γ Doradus variability among candidate field stars for this volume of the solar neighborhood. The corresponding space density of γ Doradus stars in this volume of space is 0.094 stars per 10³ pc³ or 94 stars per 10⁶ pc³. We provide an updated list of 86 bright, confirmed, γ Doradus field stars.
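
    The incidence and space-density figures quoted above follow directly from the sample counts and the survey volume; the sketch below simply reproduces that arithmetic.

        # Minimal sketch reproducing the incidence and space density quoted above.
        n_candidates = 114
        n_gamma_dor = 25
        survey_volume_pc3 = 266_600.0

        incidence = n_gamma_dor / n_candidates
        density_per_1e3_pc3 = n_gamma_dor / survey_volume_pc3 * 1e3
        density_per_1e6_pc3 = n_gamma_dor / survey_volume_pc3 * 1e6

        print("incidence: %.0f%%" % (incidence * 100))             # ~22%
        print("%.3f stars per 10^3 pc^3" % density_per_1e3_pc3)    # ~0.094
        print("%.0f stars per 10^6 pc^3" % density_per_1e6_pc3)    # ~94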

  18. Nuclear fuel technology - Tank calibration and volume determination for nuclear materials accountancy - Part 2: Data standardization for tank calibration

    International Nuclear Information System (INIS)

    2007-01-01

    Measurements of the volume and height of liquid in a process accountancy tank are often made in order to estimate or verify the tank's calibration or volume measurement equation. The calibration equation relates the response of the tank's measurement system to some independent measure of tank volume. The ultimate purpose of the calibration exercise is to estimate the tank's volume measurement equation (the inverse of the calibration equation), which relates tank volume to measurement system response. In this part of ISO 18213, it is assumed that the primary measurement-system response variable is liquid height and that the primary measure of liquid content is volume. This part of ISO 18213 presents procedures for standardizing a set of calibration data to a fixed set of reference conditions so as to minimize the effect of variations in ambient conditions that occur during the measurement process. The procedures presented herein apply generally to measurements of liquid height and volume obtained for the purpose of calibrating a tank (i.e. calibrating a tank's measurement system). When used in connection with other parts of ISO 18213, these procedures apply specifically to tanks equipped with bubbler probe systems for measuring liquid content. The standardization algorithms presented herein can be profitably applied when only estimates of ambient conditions, such as temperature, are available. However, the most reliable results are obtained when relevant ambient conditions are measured for each measurement of volume and liquid height in a set of calibration data. Information is provided on scope, physical principles, data required, calibration data, dimensional changes in the tank, multiple calibration runs and results on standardized calibration data. Four annexes inform about density of water, buoyancy corrections for mass determination, determination of tank heel volume and statistical method for aligning data from several calibration runs. A bibliography is
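
    The relationship between the calibration equation and the volume measurement equation can be illustrated with a simple sketch: fit liquid height against an independent volume measure, after standardizing the data to a reference temperature, then invert the fit. The linear form, the thermal expansion coefficient and all data values below are assumptions for illustration; this is not the ISO 18213 procedure itself.

        # Minimal sketch (illustrative, not the ISO 18213 procedure): fit a linear
        # calibration equation height = a + b * volume from temperature-standardized
        # run data, then invert it to get the volume measurement equation.
        import numpy as np

        alpha = 3.5e-5          # assumed volumetric expansion coefficient of the tank, 1/K
        T_ref = 25.0            # reference temperature, C

        volumes_L = np.array([100.0, 200.0, 300.0, 400.0, 500.0])      # independent volume measure
        heights_mm = np.array([82.1, 161.9, 242.3, 322.0, 402.2])      # bubbler-probe liquid heights
        temps_C = np.array([21.0, 22.5, 24.0, 26.0, 27.5])             # tank temperature per point

        # Standardize the measured volumes to the reference temperature.
        volumes_std = volumes_L * (1.0 + alpha * (T_ref - temps_C))

        b, a = np.polyfit(volumes_std, heights_mm, 1)    # calibration equation h = a + b*V
        print("calibration: h = %.3f + %.5f * V" % (a, b))

        def volume_from_height(h_mm):
            """Inverse (volume measurement) equation."""
            return (h_mm - a) / b

        print("estimated volume at h = 250 mm: %.1f L" % volume_from_height(250.0))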

  19. Mars’ First Billion Years: Key Findings, Key Unsolved Paradoxes, and Future Exploration

    Science.gov (United States)

    Ehlmann, Bethany

    2017-10-01

    situ analyses and/or sample return can answer questions about the response of the atmosphere -- and thus Mars climate -- to endogenous and exogenous planetary processes active during the first billion years.

  20. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part B, Dismantlement, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  1. Determination of air-loop volume and radon partition coefficient for measuring radon in water sample.

    Science.gov (United States)

    Lee, Kil Yong; Burnett, William C

    A simple method for the direct determination of the air-loop volume in a RAD7 system as well as the radon partition coefficient was developed allowing for an accurate measurement of the radon activity in any type of water. The air-loop volume may be measured directly using an external radon source and an empty bottle with a precisely measured volume. The partition coefficient and activity of radon in the water sample may then be determined via the RAD7 using the determined air-loop volume. Activity ratios instead of absolute activities were used to measure the air-loop volume and the radon partition coefficient. In order to verify this approach, we measured the radon partition coefficient in deionized water in the temperature range of 10-30 °C and compared the values to those calculated from the well-known Weigel equation. The results were within 5 % variance throughout the temperature range. We also applied the approach for measurement of the radon partition coefficient in synthetic saline water (0-75 ppt salinity) as well as tap water. The radon activity of the tap water sample was determined by this method as well as the standard RAD-H2O and BigBottle RAD-H2O. The results have shown good agreement between this method and the standard methods.
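
    The Weigel equation referred to above is usually written k(T) = 0.105 + 0.405 exp(-0.0502 T), with T in deg C; the sketch below evaluates it over the 10-30 deg C range used in the comparison and checks a hypothetical measured value against it (the measured value is an assumption, not a result from the paper).

        # Minimal sketch: radon partition (Ostwald) coefficient from the Weigel
        # equation, plus a relative-difference check against an assumed measurement.
        import math

        def weigel_partition_coefficient(T_celsius):
            return 0.105 + 0.405 * math.exp(-0.0502 * T_celsius)

        for T in (10, 15, 20, 25, 30):
            print("T = %2d C  k = %.3f" % (T, weigel_partition_coefficient(T)))

        k_measured = 0.26     # hypothetical measured partition coefficient at 20 C
        k_weigel = weigel_partition_coefficient(20.0)
        print("relative difference: %.1f %%" % (abs(k_measured - k_weigel) / k_weigel * 100))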

  2. Determination of air-loop volume and radon partition coefficient for measuring radon in water sample

    International Nuclear Information System (INIS)

    Kil Yong Lee; Burnett, W.C.

    2013-01-01

    A simple method for the direct determination of the air-loop volume in a RAD7 system as well as the radon partition coefficient was developed allowing for an accurate measurement of the radon activity in any type of water. The air-loop volume may be measured directly using an external radon source and an empty bottle with a precisely measured volume. The partition coefficient and activity of radon in the water sample may then be determined via the RAD7 using the determined air-loop volume. Activity ratios instead of absolute activities were used to measure the air-loop volume and the radon partition coefficient. In order to verify this approach, we measured the radon partition coefficient in deionized water in the temperature range of 10-30 deg C and compared the values to those calculated from the well-known Weigel equation. The results were within 5 % variance throughout the temperature range. We also applied the approach for measurement of the radon partition coefficient in synthetic saline water (0-75 ppt salinity) as well as tap water. The radon activity of the tap water sample was determined by this method as well as the standard RAD-H2O and BigBottle RAD-H2O. The results have shown good agreement between this method and the standard methods. (author)

  3. Oak Ridge National Laboratory Technology Logic Diagram. Volume 1, Technology Evaluation: Part B, Remedial Action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Part A of Vols. 1 and 2 focuses on D&D. Part B of Vols. 1 and 2 focuses on RA of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the ranking of remedial technologies. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. The focus of Vol. 1, Pt. B, is RA, and it has been divided into six chapters. The first chapter is an introduction, which defines problems specific to the ER Program for ORNL. Chapter 2 provides a general overview of the TLD. Chapters 3 through 5 are organized into necessary subelement categories: RA, characterization, and robotics and automation. The final chapter contains regulatory compliance information concerning RA.

  4. 1.6 billion euros for nuclear research through the 'Horizon 2020' program

    International Nuclear Information System (INIS)

    Anon.

    2014-01-01

    The European Union Council has approved the budget for the future European program for research and innovation called 'Horizon 2020'. A global funding of 77 billion euros has been allocated to 'Horizon 2020' for the 2014 to 2020 years. The share for nuclear sciences will reach 1.6 billion euros and will break down as follows: 316 million euros for fundamental research on fission, 728 million euros for fundamental research on fusion (ITER not included) and 560 million euros for the research projects of the European Joint Research Center (JRC). (A.C.)

  5. Rampa hidràulica per a bolcatge

    OpenAIRE

    Priego Minguell, Isaac

    2004-01-01

    The project is based on the development of a hydraulic tilting ramp for the assessment of rollover risks (RHB), which includes a mechanical part and a hydraulic part. The sole purpose of designing this hydraulic ramp has been to carry out a rollover safety assessment for industrial vehicles (large tractors).

  6. Per- and Polyfluoroalkyl Substances (PFAS): Sampling ...

    Science.gov (United States)

    Per- and polyfluoroalkyl substances (PFAS) are a large group of manufactured compounds used in a variety of industries, such as aerospace, automotive, textiles, and electronics, and are used in some food packaging and firefighting materials. For example, they may be used to make products more resistant to stains, grease and water. In the environment, some PFAS break down very slowly, if at all, allowing bioaccumulation (concentration) to occur in humans and wildlife. Some have been found to be toxic to laboratory animals, producing reproductive, developmental, and systemic effects in laboratory tests. EPA's methods for analyzing PFAS in environmental media are in various stages of development. This technical brief summarizes the work being done to develop robust analytical methods for groundwater, surface water, wastewater, and solids, including soils, sediments, and biosolids.

  7. 12 CFR Appendix B to Part 716 - SAMPLE CLAUSES

    Science.gov (United States)

    2010-01-01

    Excerpt from 12 CFR Part 716 (Privacy of Consumer Financial Information), Appendix B, Sample Clauses: sample privacy-notice wording on disclosing nonpublic personal information to types of third parties, such as financial companies ("...dealers, and insurance agents") and non-financial companies [with illustrative examples to be provided].

  8. 1991 SOLAR WORLD CONGRESS - VOLUME 1, PART I

    Science.gov (United States)

    The four-volume proceedings document the 1991 Solar World Congress (the biennial congress of the International Solar Energy Society) in Denver, CO, August 19-23, 1991. Volume 1 is dedicated to solar electricity, biofuels, and renewable resources. Volume 2 contains papers on activ...

  9. The ENEA calibration service for ionising radiations. Part 1: Photons; Il centro di taratura per la radiazioni ionizzanti di Bologna. Parte 1: Fotoni

    Energy Technology Data Exchange (ETDEWEB)

    Monteventi, F.; Sermenghi, I. [ENEA Centro Ricerche Ezio Clementel, Bologna (Italy). Dipt. Ambiente

    1999-07-01

    The ENEA (National Agency for New Technologies, Energy and the Environment) calibration service for ionizing radiations has been active for 40 years within the secondary standard dosimetry laboratory network. In 1985 it was the first centre to be recognized by the Italian calibration service (SIT) for the two photon quantities: exposure and air kerma. Since the ENEA Institute for Radiation Protection moved to the new Montecuccolino site (Bologna, Italy) in 1995, the whole laboratory has been renovated, and all irradiation rooms, radiation sources and equipment have been reorganized according to the metrology requirements for X, gamma, beta and neutron fields. The aim of this report, the first part of a series describing all facilities available at the service, is to give a detailed description of all equipment qualified for photon field metrology, including the secondary standards and the calibration procedures performed for radiation monitoring devices and dosemeters. [Italian] The ENEA calibration centre in Bologna has operated in the field of secondary metrology for almost 40 years and, in 1985, was the first centre to be recognized by the SIT (Italian Calibration Service) for the quantities exposure and air kerma. With the relocation of the whole Institute for Radiation Protection to the new Montecuccolino site, all irradiation rooms and all radiation facilities available for X, gamma, beta and neutron metrology have also been rebuilt and reorganized. The purpose of this first report is to describe the equipment qualified for photon metrology, the measurement standards, and the procedures adopted for the calibration of instruments and dosimeters.

  10. Modeling Per Capita State Health Expenditure Variat...

    Data.gov (United States)

    U.S. Department of Health & Human Services — Modeling Per Capita State Health Expenditure Variation State-Level Characteristics Matter, published in Volume 3, Issue 4, of the Medicare and Medicaid Research...

  11. 17 CFR Appendix B to Part 160 - Sample Clauses

    Science.gov (United States)

    2010-04-01

    Excerpt from 17 CFR Part 160 (Privacy of Consumer Financial Information), Appendix B, Sample Clauses: sample privacy-notice wording with illustrative examples (such as "mortgage bankers") and non-financial companies. This appendix only applies to privacy notices provided before January 1, 2011. Financial institutions, including a group of financial...

  12. Ocean Thermal Energy Converstion (OTEC) test facilities study program. Final report. Volume II. Part B

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-01-17

    Results are presented of an 8-month study to develop alternative non-site-specific OTEC facilities/platform requirements for an integrated OTEC test program which may include land and floating test facilities. Volume II--Appendixes is bound in three parts (A, B, and C) which together comprise a compendium of the most significant detailed data developed during the study. Part B provides an annotated test list and describes component tests and system tests.

  13. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    Science.gov (United States)

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

    The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories, therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. For this, the present study aims to evaluate the conventional sampling procedures and introduce the adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that the density, the spatial distribution and the zero-plots affect the consistency of the estimators, and that the adaptive cluster sampling allows to obtain more accurate volumetric estimation. We use data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or higher than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, whereby the accuracy of the volumetric estimation and presence of zero-plots were evaluated. The sampling procedures applied to species were affected by the low density of trees and the large number of zero-plots, wherein the adaptive clusters allowed concentrating the sampling effort in plots with trees and, thus, agglutinating more representative samples to estimate the commercial volume.
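
    The adaptive cluster idea, and the way it concentrates effort on plots that contain trees, can be sketched on a small grid; the grid values, the condition (volume greater than zero) and the initial sample size below are all invented, and the estimator shown is the usual modified Hansen-Hurwitz form rather than anything taken from the study.

        # Minimal sketch (invented data): adaptive cluster sampling on a plot grid.
        # Conceptually, the four neighbours of any sampled plot meeting the condition
        # are added until no new plot qualifies; network() returns the qualifying
        # plots such an expansion would reach, and the modified Hansen-Hurwitz
        # estimate averages the mean volume of the network behind each initial plot.
        import random

        random.seed(1)
        ROWS, COLS = 10, 10
        volume = [[0.0] * COLS for _ in range(ROWS)]          # sparse, clustered volumes (m3)
        for r, c, v in [(2, 2, 12.0), (2, 3, 8.0), (3, 2, 5.0), (7, 7, 20.0), (7, 8, 9.0)]:
            volume[r][c] = v

        def neighbours(r, c):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                if 0 <= r + dr < ROWS and 0 <= c + dc < COLS:
                    yield r + dr, c + dc

        def network(plot):
            """Qualifying plots reachable from `plot` through qualifying plots."""
            seen, stack = set(), [plot]
            while stack:
                r, c = stack.pop()
                if (r, c) in seen or volume[r][c] <= 0.0:
                    continue
                seen.add((r, c))
                stack.extend(neighbours(r, c))
            return seen or {plot}      # a non-qualifying plot is its own network

        n_initial = 15
        initial = random.sample([(r, c) for r in range(ROWS) for c in range(COLS)], n_initial)

        network_means = []
        for plot in initial:
            net = network(plot)
            network_means.append(sum(volume[r][c] for r, c in net) / len(net))
        estimate = sum(network_means) / n_initial
        print("estimated mean commercial volume: %.2f m3 per plot" % estimate)
        print("true mean: %.2f m3 per plot" % (sum(map(sum, volume)) / (ROWS * COLS)))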

  14. National Low-Level Waste Management Program Radionuclide Report Series. Volume 10, Nickel-63

    Energy Technology Data Exchange (ETDEWEB)

    Carboneau, M.L.; Adams, J.P.

    1995-02-01

    This report outlines the basic radiological, chemical, and physical characteristics of nickel-63 ({sup 63}Ni) and examines how these characteristics affect the behavior of {sup 63}Ni in various environmental media, such as soils, groundwater, plants, animals, the atmosphere, and the human body. Discussions also include methods of {sup 63}Ni production, waste types, and waste forms that contain {sup 63}Ni. The primary source of {sup 63}Ni in the environment has been low-level radioactive waste material generated as a result of neutron activation of stable {sup 62}Ni that is present in the structural components of nuclear reactor vessels. {sup 63}Ni enters the environment from the dismantling activities associated with nuclear reactor decommissioning. However, small amounts of {sup 63}Ni have been detected in the environment following the testing of thermonuclear weapons in the South Pacific. Concentrations as high as 2.7 Bq per gram of sample (or equivalently 0.0022 parts per billion) were observed on Bikini Atoll (May 1954). {sup 63}Ni was not created as a fission product species (e.g., from {sup 235}U or {sup 239}Pu fissions), but instead was produced as a result of neutron capture in {sup 62}Ni, a common nickel isotope present in the stainless steel components of nuclear weapons (e.g., stainless-304 contains {approximately}9% total Ni or {approximately}0.3% {sup 62}Ni).

  15. Troglitazone treatment increases bone marrow adipose tissue volume but does not affect trabecular bone volume in mice

    DEFF Research Database (Denmark)

    Erikstrup, Lise Tornvig; Mosekilde, Leif; Justesen, J

    2001-01-01

    proliferator activated receptor-gamma (PPARgamma). Histomorphometric analysis of proximal tibia was performed in order to quantitate the amount of trabecular bone volume per total volume (BV/TV %), adipose tissue volume per total volume (AV/TV %), and hematopoietic marrow volume per total volume (HV......Aging is associated with decreased trabecular bone mass and increased adipocyte formation in bone marrow. As osteoblasts and adipocytes share common precursor cells present in the bone marrow stroma, it has been proposed that an inverse relationship exists between adipocyte and osteoblast....../TV %) using the point-counting technique. Bone size did not differ between the two groups. In troglitazone-treated mice, AV/TV was significantly higher than in control mice (4.7+/-2.1% vs. 0.2+/-0.3%, respectively, mean +/- SD, P

  16. FEE-SCHEDULE INCREASES IN CANADA: IMPLICATION FOR SERVICE VOLUMES AMONG FAMILY AND SPECIALIST PHYSICIANS.

    Science.gov (United States)

    Ariste, Ruolz

    2015-01-01

    Physician spending has substantially increased over the last few years in Canada to reach $27.4 billion in 2010. Total clinical payment to physicians has grown at an average annual rate of 7.6% from 2004 to 2010. The key policy question is whether or not this additional money has bought more physician services. So, the purpose of this study is to understand if we are paying more for the same amount of medical services in Canada or we are getting more bang for our buck. At the same time, the paper attempts to find out whether or not there is a productivity difference between family physician services and surgical procedures. Using the Baumol theory and data from the National Physician Database for the period 2004-2010, the paper breaks down growth in physician remuneration into growth in unit cost and number of services, both from the physician and the payer perspectives. After removing general inflation and population growth from the 7.6% growth in total clinical payment, we found that real payment per service and volume of services per capita grew at an average annual rate of 3.2% and 1.4% respectively, suggesting that payment per service was the main cost driver of physician remuneration at the national level. Taking the payer perspective, it was found that, for the fee-for-service (FFS) scheme, volume of services per physician decreased at an average annual rate of -0.6%, which is a crude indicator that labour productivity of physicians on FFS has fallen during the period. However, the situation differs for the surgical procedures. Results also vary by province. Overall, our finding is consistent with the Baumol theory, which hypothesizes higher productivity growth in technology-driven sectors.
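
    The growth decomposition described above follows a simple multiplicative identity; the sketch below reproduces it with the 7.6% total and 3.2% real-price figures from the abstract and with assumed inflation and population growth rates, which is why the implied volume growth comes out near, but not exactly at, the reported 1.4%.

        # Minimal sketch (inflation and population growth assumed): decompose nominal
        # payment growth into inflation, population growth, real price and volume.
        total_growth = 0.076     # average annual growth in total clinical payments (abstract)
        inflation = 0.018        # assumed general inflation, per year
        population = 0.011       # assumed population growth, per year
        price_real = 0.032       # real payment per service (abstract)

        volume_per_capita = (1 + total_growth) / ((1 + inflation) * (1 + population) * (1 + price_real)) - 1
        print("implied growth in services per capita: %.1f%%" % (volume_per_capita * 100))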

  17. Heteronuclear Micro-Helmholtz Coil Facilitates µm-Range Spatial and Sub-Hz Spectral Resolution NMR of nL-Volume Samples on Customisable Microfluidic Chips.

    Directory of Open Access Journals (Sweden)

    Nils Spengler

    We present a completely revised generation of a modular micro-NMR detector, featuring an active sample volume of ∼ 100 nL, and an improvement of 87% in probe efficiency. The detector is capable of rapidly screening different samples using exchangeable, application-specific, MEMS-fabricated, microfluidic sample containers. In contrast to our previous design, the sample holder chips can be simply sealed with adhesive tape, with excellent adhesion due to the smooth surfaces surrounding the fluidic ports, and so withstand pressures of ∼2.5 bar, while simultaneously enabling high spectral resolution up to 0.62 Hz for H2O, due to its optimised geometry. We have additionally reworked the coil design and fabrication processes, replacing liquid photoresists by dry film stock, whose final thickness does not depend on accurate volume dispensing or precise levelling during curing. We further introduced mechanical alignment structures to avoid time-intensive optical alignment of the chip stacks during assembly, while we exchanged the laser-cut, PMMA spacers by diced glass spacers, which are not susceptible to melting during cutting. Doing so led to an overall simplification of the entire fabrication chain, while simultaneously increasing the yield, due to an improved uniformity of thickness of the individual layers, and in addition, due to more accurate vertical positioning of the wirebonded coils, now delimited by a post base plateau. We demonstrate the capability of the design by acquiring a 1H spectrum of ∼ 11 nmol sucrose dissolved in D2O, where we achieved a linewidth of 1.25 Hz for the TSP reference peak. Chemical shift imaging experiments were further recorded from voxel volumes of only ∼ 1.5 nL, which corresponded to amounts of just 1.5 nmol per voxel for a 1 M concentration. To extend the micro-detector to other nuclei of interest, we have implemented a trap circuit, enabling heteronuclear spectroscopy, demonstrated by two 1H/13C 2D HSQC
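
    The amount-per-voxel figure quoted above is just concentration times volume; the sketch below reproduces it, and also back-calculates the sucrose concentration that an ~11 nmol amount in the ~100 nL active volume would imply (that concentration is an inference, not a value stated in the abstract).

        # Minimal sketch: moles = concentration * volume (mol/L * nL gives nmol).
        def nanomoles(concentration_mol_per_L, volume_nL):
            return concentration_mol_per_L * volume_nL

        print("per 1.5 nL voxel at 1 M: %.1f nmol" % nanomoles(1.0, 1.5))    # ~1.5 nmol
        print("implied sucrose concentration: %.2f M" % (11.0 / 100.0))      # ~0.11 M for 11 nmol in 100 nL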

  18. Direct determination of tellurium in soil and plant samples by sector-field ICP-MS for the study of soil-plant transfer of radioactive tellurium subsequent to the Fukushima Daiichi Nuclear Power Plant accident

    International Nuclear Information System (INIS)

    Yang, Guosheng; Zheng, Jian; Tagami, Keiko; Uchida, Shigeo

    2013-01-01

    The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident caused the release of large amounts of radioactive Te into the environment. Stable Te, as an analogue, is considered to be useful for the estimation of the soil-plant transfer of radioactive Te. It is necessary to estimate the radiation dose of Te that would result from food ingestion. However, due to the extremely low concentrations of Te in the environment, reported transfer factor values for Te are considerably limited. We report a sensitive analytical method for direct determination of trace Te in soil and plant samples using sector-field inductively coupled plasma mass spectrometry (SF-ICP-MS). The developed analytical method is characterized by a very low detection limit at the sub-parts per billion (ng g(-1)) level in soil and plant samples, and it has been applied to the study of soil-plant transfer to collect transfer factor data in Japan. (author)
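
    A generic way to express such a detection limit is LOD = 3 x SD(blank signals) / calibration slope, scaled to the solid sample through the digested mass and dilution volume; the sketch below uses invented numbers and is not necessarily the authors' procedure.

        # Minimal sketch (invented numbers): generic 3-sigma detection limit,
        # converted from solution (ng mL-1) to sample (ng g-1) units.
        import statistics

        blank_counts = [120, 135, 128, 142, 119, 131, 126]    # blank ICP-MS signals
        slope_counts_per_ng_mL = 5200.0                       # calibration sensitivity

        lod_ng_per_mL = 3 * statistics.stdev(blank_counts) / slope_counts_per_ng_mL

        sample_mass_g = 0.5       # digested sample mass
        final_volume_mL = 50.0    # dilution volume
        lod_ng_per_g = lod_ng_per_mL * final_volume_mL / sample_mass_g
        print("solution LOD: %.4f ng mL-1, method LOD: %.3f ng g-1" % (lod_ng_per_mL, lod_ng_per_g))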

  19. Empowering billions with food safety and food security

    International Nuclear Information System (INIS)

    Pillai, Suresh D.

    2009-01-01

    Full text: Millions of people die needlessly every year due to contaminated water and food. Many millions more are starving due to an inadequate supply of food. Billions of pounds of food are unnecessarily wasted due to insect and other damage. Deaths and illness due to contaminated or inadequate food are at catastrophic levels in many regions of the world. The majority of food- and water-borne illnesses and deaths are preventable. They can be prevented by improved food production methods, improved food processing technologies, improved food distribution systems and improved personal hygiene. Food irradiation technology is over 100 years old. Yet this technology is poorly understood by governments and corporate decision makers around the world, and many consumers are unfortunately misinformed about it. There is an urgent need for nations and people around the world to empower themselves with the knowledge and the expertise to harness this powerful technology. Widespread and sensible adoption of this technology can empower billions around the world with clean and abundant food supplies. It is unconscionable in the 21st century for governments to allow people to die or go hungry when the technology to prevent this is readily available.

  20. Social assessment of wind power. Part 4: International position and development conditions of the Danish wind turbine industry

    International Nuclear Information System (INIS)

    Karnoee, P.; Joergensen, U.

    1995-12-01

    Today, the Danish wind turbine industry is positioned as a global market leader on a fast-growing international market. The conclusions and observations of this report are: 1. The international market for wind power is likely to grow fast over the next 5 years (a self-reinforcing development), and the Danish wind turbine industry has the potential to increase exports from the present DKK 2 billion per year to DKK 4 billion per year. 2. The leading market position was achieved after the collapse of the California market in 1987. The last four years have seen export-driven growth, together with the development of complementary assets to support a strong product technology. 3. The Danish position might be threatened by new competitors, but it is our view that it will take them 2-3 years to acquire equivalent competences, and since the Danish companies are presently also mobilizing resources and competences it is not likely that they will be easily outperformed. 4. Continued installation of wind power in Denmark is important for sustaining the international position in the coming years. However, it is not the volume of the domestic market, but rather its stability, that serves as a home base for the industry; not only for the reference and testing of new wind turbines, but also for the knowledge networks between manufacturers and suppliers and between manufacturers and the Test Station for Windmills at Risoe. (EG) 51 refs

  1. Gaz de France first quarter 2007 sales: an 11 per cent drop due to an exceptionally warm winter: a 1.3 per cent increase on an average-climate basis. Non-audited IFRS data

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-07-01

    Paris, May 14 2007 - For the first quarter 2007, Gaz de France Group posted euro 9,053 million in consolidated sales. This 11 per cent decrease on the same period in 2006 is a direct result of the extremely warm weather conditions in France and Europe this winter. In contrast, under average climate conditions sales improved by 1.3 per cent. In France, where the winter of 2006/2007 was the warmest in fifty years, sales were impacted by 18 billion kWh in the quarter compared to a quarter with average-climate conditions and 32 billion kWh compared to the first quarter 2006 which, in contrast, was colder than normal. The weather had a similar impact on sales outside France. The highly unusual weather conditions also had an indirect impact on the market and, consequently, on both gas production and the arbitrage activities. Notwithstanding these effects, the Group continued to consolidate its position in foreign markets, with sales outside France reaching euro 3,341 million. The share of sales outside France increased by 3 points in the first quarter of 2007 versus first quarter 2006 to 37 per cent as at end March 2007. The group reiterates the 2007 financial objective as presented at the full year 2006 results: '2007 will be a year of consolidation and the EBITDA should be in line with that of 2006'.

  2. Gaz de France first quarter 2007 sales: an 11 per cent drop due to an exceptionally warm winter: a 1.3 per cent increase on an average-climate basis. Non-audited IFRS data

    International Nuclear Information System (INIS)

    2007-01-01

    Paris, May 14 2007 - For the first quarter 2007, Gaz de France Group posted euro 9,053 million in consolidated sales. This 11 per cent decrease on the same period in 2006 is a direct result of the extremely warm weather conditions in France and Europe this winter. In contrast, under average climate conditions sales improved by 1.3 per cent. In France, where the winter of 2006/2007 was the warmest in fifty years, sales were impacted by 18 billion kWh in the quarter compared to a quarter with average-climate conditions and 32 billion kWh compared to the first quarter 2006 which, in contrast, was colder than normal. The weather had a similar impact on sales outside France. The highly unusual weather conditions also had an indirect impact on the market and, consequently, on both gas production and the arbitrage activities. Notwithstanding these effects, the Group continued to consolidate its position in foreign markets, with sales outside France reaching euro 3,341 million. The share of sales outside France increased by 3 points in the first quarter of 2007 versus first quarter 2006 to 37 per cent as at end March 2007. The group reiterates the 2007 financial objective as presented at the full year 2006 results: '2007 will be a year of consolidation and the EBITDA should be in line with that of 2006'.

  3. Process for removing mercury from aqueous solutions

    Science.gov (United States)

    Googin, John M.; Napier, John M.; Makarewicz, Mark A.; Meredith, Paul F.

    1986-01-01

    A process for removing mercury from water to a level not greater than two parts per billion, wherein an anion exchange material that is insoluble in water is contacted first with a sulfide-containing compound and second with a compound containing a bivalent metal ion, forming an insoluble metal sulfide. Water containing mercury is then contacted with this treated exchange material. The water, containing not more than two parts per billion of mercury, is separated from the exchange material.

  4. Marble Canyon spring sampling investigation

    International Nuclear Information System (INIS)

    McCulley, B.

    1985-10-01

    The Mississippian Leadville Limestone is the most permeable formation in the lower hydrostratigraphic unit underlying the salt beds of the Paradox Formation in Gibson Dome, Paradox Basin, Utah, which is being considered as a potential nuclear waste repository site. The closest downgradient outcrop of the Mississippian limestone is along the Colorado River in Marble Canyon, Arizona. This report describes the sampling and interpretation of springs in that area to assess the relative contribution of Gibson Dome-type Leadville Limestone ground water to that spring discharge. The high-volume (hundreds of liters per second or thousands of gallons per minute) springs discharging from fault zones in Marble Canyon are mixtures of water recharged west of the Colorado River on the Kaibab Plateau and east of the river in the Kaiparowits basin. No component of Gibson Dome-type Leadville Limestone ground water is evident in major and trace element chemistry or isotopic composition of the Marble Canyon Springs. A low-volume (0.3 liters per second or 5 gallons per minute) spring with some chemical and isotopic characteristics of Gibson Dome-type Leadville Limestone water diluted by Kaiparowits basin-type water issues from a travertine mound in the Bright Angel Shale on the Little Colorado River. However, the stable isotopic composition and bromide levels of that spring discharge, in addition to probable ground-water flow paths, contradict the dilution hypothesis

  5. Urban area and green space: volume estimation using medium resolution satellite imagery

    Science.gov (United States)

    Handayani, H. H.

    2017-12-01

    The latest revision of the UN World Urbanization Prospects predicts that the world's urban population will increase by 1.4 billion between 2010 and 2030, and that 60% of the population will live in cities. Consequently, this expansion affects the existence of ecosystem services in the context of environmental sustainability. Green space is a focal point of the ecological system and is affected by the urbanization process. Because green space cleans water, adjusts the microclimate, eliminates noise, and beautifies the surroundings, both its quantity and its quality are vital. Urban expansion also pushes growth into vertical development, so the third dimension, urban volume, is introduced as an indicator of vertical development. This study therefore estimates urban and green volume using medium-resolution remote sensing. Surabaya is used as a case study, since the city has grown significantly in both population and capital investment over the past decade. Urban and green volume are investigated with ALOS datasets, with "urban" referring to built-up areas. We also examine areas with low and high green volume by performing hot and cold spot analysis. The average built-up volume reaches 173.05 m3/pixel, represented by single-house residential buildings less than 7 m high. The average green volume is 14.74 m3/pixel, contributed by vegetation generally 0.6 to 1 m high, frequently planted in house backyards. However, the ratio of green volume to built-up volume is small, around 8.52%. From the hot and cold spot analysis we evaluate 5 areas identified as cold spots, i.e. lacking green volume. Two of the cold spot locations are in the northern part of the city and another is in the southern part. Those areas have a high built-up volume, particularly the sub-CBD area. We emphasize that an improvement in green quantity is needed.

  6. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part C, Robotics/automation, Waste management

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA, and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  7. Price of next big thing in physics: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    The price of exploring inner space went up Thursday. The machine, discussed at a news conference in Beijing, will be 20 miles long and would cost about $6.7 billion and 13,000 person-years of labor to build. (1.5 pages)

  8. National Estimate of Per- and Polyfluoroalkyl Substance (PFAS) Release to U.S. Municipal Landfill Leachate.

    Science.gov (United States)

    Lang, Johnsie R; Allred, B McKay; Field, Jennifer A; Levis, James W; Barlaz, Morton A

    2017-02-21

    Landfills are the final stage in the life cycle of many products containing per- and polyfluoroalkyl substances (PFASs), and their presence has been reported in landfill leachate. The concentrations of 70 PFASs in 95 samples of leachate were measured in a survey of U.S. landfills of varying climates and waste ages. National release of PFASs was estimated by coupling measured concentrations for the 19 PFASs for which more than 50% of samples had quantifiable concentrations with climate-specific estimates of annual leachate volumes. For 2013, the total volume of leachate generated in the U.S. was estimated to be 61.1 million m3, with 79% of this volume coming from landfills in wet climates (>75 cm/yr precipitation) that contain 47% of U.S. solid waste. The mass of measured PFASs from U.S. landfill leachate to wastewater treatment plants was estimated to be between 563 and 638 kg for 2013. In the majority of landfill leachate samples, 5:3 fluorotelomer carboxylic acid (FTCA) was dominant, and variations in concentrations with waste age affected the total estimated mass. Six PFASs demonstrated significantly higher concentrations in leachate from younger waste compared to older waste, and six PFASs demonstrated significant variation with climate.
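
    As a rough illustration of the scale-up described above (not the study's actual data), the sketch below couples hypothetical climate-specific leachate concentrations with hypothetical annual leachate volumes to produce a national mass estimate; every number in it is a placeholder.

```python
# Sketch of the scale-up described above: a national PFAS mass release is the
# sum over climate categories of (leachate concentration) x (annual leachate
# volume). Every number below is a hypothetical placeholder, not study data.

# hypothetical total-PFAS leachate concentrations, in micrograms per litre
concentration_ug_per_L = {"wet": 10.0, "temperate": 8.0, "dry": 5.0}

# hypothetical annual leachate volumes, in millions of cubic metres
leachate_volume_Mm3 = {"wet": 48.0, "temperate": 10.0, "dry": 3.0}

total_kg = 0.0
for climate, conc in concentration_ug_per_L.items():
    volume_L = leachate_volume_Mm3[climate] * 1e6 * 1000.0  # Mm3 -> litres
    mass_kg = conc * volume_L * 1e-9                        # ug -> kg
    total_kg += mass_kg
    print(f"{climate:>9s}: {mass_kg:7.1f} kg/yr")

print(f"national estimate: {total_kg:.0f} kg/yr")
```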

  9. Areva revenue growth in the first quarter of 2010: 8.4% like-for-like, i.e. 1.936 billion euros

    International Nuclear Information System (INIS)

    2010-01-01

    The group's first quarter 2010 consolidated revenue rose 6.5% to 1.936 billion euros (+8.4% LFL) compared with the first quarter of 2009. Growth was driven by the Reactors and Services Business Group (+18.0% LFL). Revenue from exports was up 16.6% to 1.089 billion euros, representing 56.2% of total revenue. Foreign exchange had a negative impact of 26 million euros. Changes in consolidation scope were negligible during the period. The group's backlog of 43.6 billion euros at March 31, 2010 was stable in relation to December 31, 2009. Growth in the backlog of the Reactors and Services Business Group helped offset the drawdown of the backlog in the Back End Business Group as contracts were completed. For the full year of 2010, the group confirms its outlook for significant backlog and revenue growth, rising operating income, and a strong increase in net income attributable to owners of the parent. Mining/Front End Business Group: The Mining/Front End BG reported first quarter 2010 revenue of 674 million euros, which was stable on a reported basis and up 3.5% LFL1. Foreign exchange had a negative impact of 16 million euros. - In Mining, quarterly revenue was driven by volume growth due to a favorable delivery schedule. - In Enrichment and Fuel, volumes were down compared with the first quarter of 2009, particularly due to time-lag in customer deliveries. Reactors and Services Business Group: Revenue for the Reactors and Services BG was up 16.4% in the first quarter of 2010 (up 18.0% LFL1), to 775 million euros. Foreign exchange had a negative impact of 10 million euros. - The New Builds Business reported strong growth due to significant progress on major reactor construction projects, particularly Taishan in China. - Installed Base Business was also up due to buoyant engineering operations, particularly in Germany, and to the more favorable seasonality of unit outage campaigns than in the first quarter of 2009. Back End Business Group: First quarter 2010 revenue for

  10. New experimental procedure for measuring volume magnetostriction on powder samples

    International Nuclear Information System (INIS)

    Rivero, G.; Multigner, M.; Valdes, J.; Crespo, P.; Martinez, A.; Hernando, A.

    2005-01-01

    Conventional techniques used for volume magnetostriction measurements, such as the strain gauge or cantilever methods, are very useful for ribbons or thin films but cannot be applied when the samples are in powder form. To overcome this problem, a new experimental procedure has been developed. In this work, the experimental set-up is described, together with the results obtained in amorphous FeCuZr powders, which exhibit a strong dependence of the magnetization on the strength of the applied magnetic field. The magnetostriction measurements presented in this work point out that this dependence is related to a magnetovolume effect

  11. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy

    2014-05-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.

  12. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.
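
    A minimal sketch of the Minkowski-sum idea, under the simplifying assumption that every joint is spherical so each link's reachable volume about its proximal joint is a spherical shell; the interval arithmetic below is a standard simplification for that special case, not the paper's general implementation.

```python
# Minimal sketch of reachable volumes via Minkowski sums, assuming every
# joint is spherical so each link's reachable volume (relative to its
# proximal joint) is a spherical shell [r_min, r_max]. The reachable volume
# of the whole chain is then the Minkowski sum of those shells, which for
# origin-centred shells is again a shell.

def minkowski_sum_shells(shell_a, shell_b):
    """Minkowski sum of two origin-centred spherical shells (inner, outer)."""
    a_in, a_out = shell_a
    b_in, b_out = shell_b
    outer = a_out + b_out
    # the origin is reachable unless one shell is too "thick-walled"
    inner = max(0.0, a_in - b_out, b_in - a_out)
    return (inner, outer)

def chain_reachable_volume(link_lengths):
    """Reachable shell of the end effector for a chain of rigid links."""
    # a rigid link of length L reaches exactly the sphere surface of radius L,
    # i.e. the degenerate shell (L, L)
    reach = (0.0, 0.0)
    for length in link_lengths:
        reach = minkowski_sum_shells(reach, (length, length))
    return reach

if __name__ == "__main__":
    # 3-link chain with lengths 2, 1 and 0.5:
    # outer radius 3.5, inner radius max(0, 2 - 1.5) = 0.5
    print(chain_reachable_volume([2.0, 1.0, 0.5]))
```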

  13. Monte Carlo Study of Four-Dimensional Self-avoiding Walks of up to One Billion Steps

    Science.gov (United States)

    Clisby, Nathan

    2018-04-01

    We study self-avoiding walks on the four-dimensional hypercubic lattice via Monte Carlo simulations of walks with up to one billion steps. We study the expected logarithmic corrections to scaling, and find convincing evidence in support of the scaling form predicted by the renormalization group, with an estimate for the power of the logarithmic factor of 0.2516(14), which is consistent with the predicted value of 1/4. We also characterize the behaviour of the pivot algorithm for sampling four-dimensional self-avoiding walks, and conjecture that the probability of a pivot move being successful for an N-step walk is O((log N)^{-1/4}).
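
    For illustration only, the following is a naive pivot-algorithm sketch on the 2D square lattice with an O(N) hash-set self-avoidance check; the study itself works in four dimensions with far more sophisticated data structures, so this is not the authors' implementation.

```python
import random

# Toy pivot algorithm for self-avoiding walks on the 2D square lattice.
# A pivot move picks a site on the walk, applies a random lattice symmetry
# to the tail about that site, and accepts the move if the result is still
# self-avoiding.

# the eight lattice symmetries of Z^2: rotations and reflections
SYMMETRIES = [
    lambda x, y: (x, y), lambda x, y: (-y, x), lambda x, y: (-x, -y),
    lambda x, y: (y, -x), lambda x, y: (x, -y), lambda x, y: (-x, y),
    lambda x, y: (y, x), lambda x, y: (-y, -x),
]

def pivot_step(walk):
    """Attempt one pivot move; return the new walk, or the old one if rejected."""
    n = len(walk)
    k = random.randrange(1, n - 1)           # pivot site (not an endpoint)
    g = random.choice(SYMMETRIES)            # random lattice symmetry
    px, py = walk[k]
    head = walk[:k + 1]
    tail = []
    for x, y in walk[k + 1:]:                # transform the tail about the pivot
        dx, dy = g(x - px, y - py)
        tail.append((px + dx, py + dy))
    occupied = set(head)
    for site in tail:                        # O(N) self-avoidance check
        if site in occupied:
            return walk                      # self-intersection: reject
        occupied.add(site)
    return head + tail                       # accept

if __name__ == "__main__":
    n_steps = 200
    walk = [(i, 0) for i in range(n_steps + 1)]   # start from a straight rod
    accepted = 0
    for _ in range(10_000):
        new_walk = pivot_step(walk)
        accepted += new_walk is not walk
        walk = new_walk
    print("acceptance fraction:", accepted / 10_000)
```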

  14. The inpatient economic and mortality impact of hepatocellular carcinoma from 2005 to 2009: analysis of the US nationwide inpatient sample.

    Science.gov (United States)

    Mishra, Alita; Otgonsuren, Munkhzul; Venkatesan, Chapy; Afendy, Mariam; Erario, Madeline; Younossi, Zobair M

    2013-09-01

    Hepatocellular carcinoma (HCC) is an important complication of cirrhosis. Our aim was to assess the inpatient economic and mortality impact of HCC in the USA. METHODS: Five cycles of the Nationwide Inpatient Sample (NIS) conducted from 2005 to 2009 were used. Demographics, inpatient mortality, severity of illness, payer type, length of stay (LoS) and charges were available. Changes and associated factors related to inpatient HCC were assessed using simple linear regression. Odds ratios and 95% CIs for hospital mortality were analysed using a log-linked regression model. To estimate the sampling variances for complex survey data, we used the Taylor series approach. SAS(®) v.9.3 was used for statistical analysis. From 2005 to 2009, 32,697,993 inpatient cases were reported to the NIS. During these 5 years, primary diagnoses of HCC numbered 4401 (2005), 4170 (2006), 5065 (2007), 6540 (2008), and 6364 (2009). HCC as any diagnosis increased from 68 per 100,000 discharges (2005) to 99 per 100,000 (2009). However, inpatient mortality associated with HCC decreased from 12% (2005) to 10% (2009) (P < 0.046) and LoS remained stable. Median inflation-adjusted charges at the time of discharge increased from $29,466 per case (2005) to $31,656 per case (2009). Total national HCC charges rose from $1.0 billion (2005) to $2.0 billion (2009). In multivariate analysis, hospital characteristics were independently associated with decreasing in-hospital mortality (all P < 0.05). Liver transplantation for HCC was the main contributor to high inpatient charges. Longer LoS and other procedures also contributed to higher inpatient charges. There is an increase in the number of inpatient cases of HCC. Although inpatient mortality is decreasing and the LoS is stable, the inpatient charges associated with HCC continue to increase. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Oak Ridge National Laboratory Technology Logic Diagram. Volume 2, Technology Logic Diagram: Part B, Remedial Action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Part A of Vols. 1 and 2 focuses on D&D. Part B of Vols. 1 and 2 focuses on the RA of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. Remedial action is the focus of Vol. 2, Pt. B, which has been divided into the three necessary subelements of the RA: characterization, RA, and robotics and automation. Each of these sections addresses general ORNL problems, which are then broken down by problem area/constituents and linked to potential remedial technologies. The diagrams also contain summary information about a technology's status, its science and technology needs, and its implementation needs.

  16. Nuclear fuel technology - Tank calibration and volume determination for nuclear materials accountancy - Part 1: Procedural overview

    International Nuclear Information System (INIS)

    2007-01-01

    Accurate determinations of volume are a fundamental component of any measurement-based system of control and accountability in a facility that processes or stores nuclear materials in liquid form. Volume determinations are typically made with the aid of a calibration or volume measurement equation that relates the response of the tank's measurement system to some independent measure of tank volume. The ultimate purpose of the calibration exercise is to estimate the tank's volume measurement equation (the inverse of the calibration equation), which relates tank volume to measurement system response. The steps carried out to acquire data for estimating the tank's calibration or volume measurement equation are collectively described as the process of tank calibration. This part of ISO 18213 describes procedures for tank calibration and volume determination for nuclear process tanks equipped with pressure-measurement systems for determining liquid content. Specifically, overall guidance is provided for planning a calibration exercise undertaken to obtain the data required for the measurement equation to estimate a tank's volume. The key steps in the procedure are also presented for subsequently using the estimated volume-measurement equation to determine tank liquid volumes. The procedures presented apply specifically to tanks equipped with bubbler probe systems for measuring liquid content. Moreover, these procedures produce reliable results only for clear (i.e. without suspended solids), homogeneous liquids that are at both thermal and static equilibrium. The paper elaborates on scope, physical principles involved, the calibration model, equipment required, a typical tank calibration procedure, calibration planning and pre-calibration activities, and volume determination. A bibliography is provided
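
    A minimal sketch of the calibration-then-volume-determination workflow, assuming the bubbler-probe response has already been reduced to a single pressure reading recorded at a series of known incremental volumes; the data and the low-order polynomial fit below are synthetic placeholders, not an ISO 18213 procedure.

```python
import numpy as np

# Sketch of the workflow described above, with synthetic data: estimate the
# volume measurement equation V = f(response) from a calibration run, then
# use it to convert later measurement-system readings to tank volumes.

# synthetic calibration run: known volumes (litres) and measured responses (kPa)
known_volume_L = np.array([0, 200, 400, 600, 800, 1000, 1200], dtype=float)
response_kPa   = np.array([0.00, 1.31, 2.60, 3.92, 5.21, 6.49, 7.80])

# low-order polynomial fit of volume against response (the inverse of the
# calibration equation)
coeffs = np.polyfit(response_kPa, known_volume_L, deg=2)
volume_equation = np.poly1d(coeffs)

# later, during accountancy measurements, convert a new reading to a volume
new_reading_kPa = 4.55
estimated_volume = volume_equation(new_reading_kPa)
print(f"estimated tank volume: {estimated_volume:.1f} L")

# residuals of the fit give a first look at calibration quality
residuals = known_volume_L - volume_equation(response_kPa)
print("max absolute residual:", np.abs(residuals).max(), "L")
```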

  17. Treatment of uranium-containing effluent in the process of metallic uranium parts

    International Nuclear Information System (INIS)

    Yuan Guoqi

    1993-01-01

    The anion exchange method used for the treatment of uranium-containing effluent from the processing of metallic uranium parts is the subject of this paper. The results of the experiments show that the uranium concentration in the treated water remains less than 10 μg/l after the waste water has flowed through 10,000 column volumes. A small facility with a column volume of 150 litres was installed, and 1500 m3 of waste water can be cleaned per year. (1 tab.)

  18. Dynamics of acoustically levitated disk samples.

    Science.gov (United States)

    Xie, W J; Wei, B

    2004-10-01

    The acoustic levitation force on disk samples and the dynamics of large water drops in a planar standing wave are studied by solving the acoustic scattering problem with the boundary element method. The dependence of the levitation force amplitude on the equivalent radius R of the disks deviates seriously from the R3 law predicted by King's theory, and a larger force can be obtained for thin disks. When the disk aspect ratio gamma is larger than a critical value gamma(*) (approximately 1.9) and the disk radius a is smaller than the critical value a(*)(gamma), the levitation force per unit volume of the sample will increase with the enlargement of the disk. For thin-disk samples, a way to obtain an acoustic field for stable levitation of a large water drop is to adjust the reflector-emitter interval H slightly above the resonant interval H(n). The simulation shows that the drop is flattened and the central parts of its top and bottom surface become concave with the increase of sound pressure level, which agrees with the experimental observation. The main frequencies of the shape oscillation under different sound pressures are slightly larger than the Rayleigh frequency because of the large shape deformation. The simulated translational frequencies of the vertical vibration under normal gravity conditions agree with the theoretical analysis.

  19. The effect of hospital volume on mortality in patients admitted with severe sepsis.

    Directory of Open Access Journals (Sweden)

    Sajid Shahul

    Full Text Available IMPORTANCE: The association between hospital volume and inpatient mortality for severe sepsis is unclear. OBJECTIVE: To assess the effect of severe sepsis case volume on inpatient mortality. DESIGN, SETTING, AND PARTICIPANTS: Retrospective cohort study of 646,988 patient discharges with severe sepsis from 3,487 hospitals in the Nationwide Inpatient Sample from 2002 to 2011. EXPOSURES: The exposure of interest was the mean yearly sepsis case volume per hospital divided into tertiles. MAIN OUTCOMES AND MEASURES: Inpatient mortality. RESULTS: Compared with the highest tertile of severe sepsis volume (>60 cases per year), the odds ratio for inpatient mortality among persons admitted to hospitals in the lowest tertile (≤10 severe sepsis cases per year) was 1.188 (95% CI: 1.074-1.315), while the odds ratio was 1.090 (95% CI: 1.031-1.152) for patients admitted to hospitals in the middle tertile. Similarly, improved survival was seen across the tertiles, with an adjusted inpatient mortality incidence of 35.81 (95% CI: 33.64-38.03) for hospitals with the lowest volume of severe sepsis cases, dropping to 32.07 (95% CI: 31.51-32.64) for hospitals with the highest volume. CONCLUSIONS AND RELEVANCE: We demonstrate an association between a higher severe sepsis case volume and decreased mortality. The need for a systems-based approach for improved outcomes may require a high volume of severely septic patients.

  20. Working Paper 5: Beyond Collier's Bottom Billion | CRDI ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    16 Dec. 2010 ... Paul Collier's book, The Bottom Billion, has attracted great interest in the development field. It rests on the thesis that a group of nearly 60 countries, with a total population of close to one billion people, are caught in four main traps.

  1. Large-volume injection of sample diluents not miscible with the mobile phase as an alternative approach in sample preparation for bioanalysis: an application for fenspiride bioequivalence.

    Science.gov (United States)

    Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor

    2011-09-01

    Liquid-liquid extraction of target compounds from biological matrices followed by the injection of a large volume from the organic layer into the chromatographic column operated under reversed-phase (RP) conditions would successfully combine the selectivity and the straightforward character of the procedure in order to enhance sensitivity, compared with the usual approach of involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced in chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples in 1-octanol. A volume of 75 µl from the octanol layer was directly injected on a Zorbax SB C18 Rapid Resolution, 50 mm length × 4.6 mm internal diameter × 1.8 µm particle size column, with the RP separation being carried out under gradient elution conditions. Detection was made through positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess bioequivalence of a modified release pharmaceutical formulation containing 80 mg fenspiride hydrochloride during two different studies carried out as single-dose administration under fasting and fed conditions (four arms), and multiple doses administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has an increased potential for application in bioanalysis.

  2. Semiconductive materials and associated uses thereof

    Science.gov (United States)

    Lynn, Kelvin; Jones, Kelly; Ciampi, Guido

    2012-10-09

    High rate radiation detectors are disclosed herein. The detectors include a detector material disposed inside the container, the detector material containing cadmium, tellurium, and zinc, a first dopant containing at least one of aluminum, chlorine, and indium, and a second dopant containing a rare earth metal. The first dopant has a concentration of about 500 to about 20,000 atomic parts per billion, and the second dopant has a concentration of about 200 to about 20,000 atomic parts per billion.

  3. Characterization of Tank 16H Annulus Samples Part II: Leaching Results

    International Nuclear Information System (INIS)

    Hay, M.; Reboul, S.

    2012-01-01

    The closure of Tank 16H will require removal of material from the annulus of the tank. Samples from the Tank 16H annulus were characterized and tested to provide information to evaluate various alternatives for removing the annulus waste. The analysis found all four annulus samples to be composed mainly of Si, Na, and Al and lesser amounts of other elements. The XRD data indicate quartz (SiO2) and sodium aluminum nitrate silicate hydrate (Na8(Al6Si6O24)(NO3)2·4H2O) as the predominant crystalline mineral phases in the samples. The XRD data also indicate the presence of crystalline sodium nitrate (NaNO3), sodium nitrite (NaNO2), gibbsite (Al(OH)3), hydrated sodium bicarbonate (Na3H(CO3)2·2H2O), and muscovite (KAl2(AlSi3O10)(OH)2). Based on the weight of solids remaining at the end of the test, the water leaching test results indicate 20-35% of the solids dissolved after three contacts with an approximately 3:1 volume of water at 45 C. The chemical analysis of the leachates and the XRD results of the remaining solids indicate sodium salts of nitrate, nitrite, sulfate, and possibly carbonate/bicarbonate make up the majority of the dissolved material. The majority of these salts were dissolved in the first water contact and simply diluted with each subsequent water contact. The water leaching removed large amounts of the uranium in two of the samples and approximately 1/3 of the 99Tc from all four samples. Most of the other radionuclides analyzed showed low solubility in the water leaching test. The oxalic acid leaching test results indicate approximately 34-47% of the solids in the four annulus samples will dissolve after three contacts with an approximately 3:1 volume of acid to solids at 45 C. The same sodium salts found in the water leaching test comprise the majority of dissolved material in the oxalic acid leaching test. However, the oxalic acid was somewhat more effective in dissolving radionuclides than the water leach. In

  4. Increasing Polymer Solar Cell Fill Factor by Trap-Filling with F4-TCNQ at Parts Per Thousand Concentration.

    Science.gov (United States)

    Yan, Han; Manion, Joseph G; Yuan, Mingjian; García de Arquer, F Pelayo; McKeown, George R; Beaupré, Serge; Leclerc, Mario; Sargent, Edward H; Seferos, Dwight S

    2016-08-01

    Intrinsic traps in organic semiconductors can be eliminated by trap-filling with F4-TCNQ. Photovoltaic tests show that devices with F4-TCNQ at parts per thousand concentration outperform control devices due to an improved fill factor. Further studies confirm the trap-filling pathway and demonstrate the general nature of this finding. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Quantifying Uncertainty in Soil Volume Estimates

    International Nuclear Information System (INIS)

    Roos, A.D.; Hays, D.C.; Johnson, R.L.; Durham, L.A.; Winters, M.

    2009-01-01

    Proper planning and design for remediating contaminated environmental media require an adequate understanding of the types of contaminants and the lateral and vertical extent of contamination. In the case of contaminated soils, this generally takes the form of volume estimates that are prepared as part of a Feasibility Study for Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites and/or as part of the remedial design. These estimates are typically single values representing what is believed to be the most likely volume of contaminated soil present at the site. These single-value estimates, however, do not convey the level of confidence associated with the estimates. Unfortunately, the experience has been that pre-remediation soil volume estimates often significantly underestimate the actual volume of contaminated soils that are encountered during the course of remediation. This underestimation has significant implications, both technically (e.g., inappropriate remedial designs) and programmatically (e.g., establishing technically defensible budget and schedule baselines). Argonne National Laboratory (Argonne) has developed a joint Bayesian/geostatistical methodology for estimating contaminated soil volumes based on sampling results, that also provides upper and lower probabilistic bounds on those volumes. This paper evaluates the performance of this method in a retrospective study that compares volume estimates derived using this technique with actual excavated soil volumes for select Formerly Utilized Sites Remedial Action Program (FUSRAP) Maywood properties that have completed remedial action by the U.S. Army Corps of Engineers (USACE) New York District. (authors)
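
    The sketch below only illustrates the general idea of attaching probabilistic bounds to a volume estimate derived from sampling results; it uses a simple Beta-Binomial model that ignores spatial correlation, is not the Argonne joint Bayesian/geostatistical methodology, and all of its numbers are hypothetical.

```python
import numpy as np
from scipy import stats

# Illustration of reporting a contaminated-soil volume as an interval rather
# than a single value. NOT the Argonne method: a simple Beta-Binomial model
# on the contaminated fraction, ignoring spatial correlation. All numbers
# are hypothetical.

site_volume_m3 = 50_000.0      # total soil volume screened (assumed)
n_samples = 60                 # borings analysed (assumed)
n_contaminated = 21            # borings above the cleanup criterion (assumed)

# Beta(1, 1) prior on the contaminated fraction, updated with the sample counts
posterior = stats.beta(1 + n_contaminated, 1 + n_samples - n_contaminated)

point_estimate = posterior.mean() * site_volume_m3
lower, upper = posterior.ppf([0.05, 0.95]) * site_volume_m3

print(f"contaminated volume estimate: {point_estimate:,.0f} m3")
print(f"90% credible interval: {lower:,.0f} - {upper:,.0f} m3")
```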

  6. 40 CFR Table 2 to Subpart Ffff of... - Model Rule-Emission Limitations

    Science.gov (United States)

    2010-07-01

    ... Compliance Times for Other Solid Waste Incineration Units That Commenced Construction On or Before December 9... micrograms per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Method 29 of appendix A of this part. 2. Carbon monoxide 40 parts per million by dry volume 3-run average (1 hour...

  7. $158 per Quart: The Value of a Volume of Coins

    Science.gov (United States)

    Kcenich, Stephen; Boss'e, Michael J.

    2008-01-01

    The ubiquitous change jar (or any other container) is the focus of this investigation. Using random pocket change, a distribution is determined and statistical tools are employed to calculate the value of given volumes of coins. This brief investigation begins by considering money, which piques the interest of most students, and uses this…

  8. Evaluation of Ultra-Violet Photocatalytic Oxidation (UVPCO) forIndoor Air Applications: Conversion of Volatile Organic Compounds at LowPart-per-Billion Concentrations

    Energy Technology Data Exchange (ETDEWEB)

    Hodgson, Alfred T.; Sullivan, Douglas P.; Fisk, William J.

    2005-09-30

    Efficient removal of indoor generated airborne particles and volatile organic compounds (VOCs) in office buildings and other large buildings may allow for a reduction in outdoor air supply rates with concomitant energy savings while still maintaining acceptable indoor air quality in these buildings. Ultra-Violet Photocatalytic Oxidation (UVPCO) air cleaners have the potential to achieve the necessary reductions in indoor VOC concentrations at relatively low cost. In this study, laboratory experiments were conducted with a scaled, prototype UVPCO device designed for use in a duct system. The experimental UVPCO contained two 30 by 30-cm honeycomb monoliths coated with titanium dioxide and 3% by weight tungsten oxide. The monoliths were irradiated with 12 UVC lamps arranged in four banks. The UVPCO was challenged with four mixtures of VOCs typical of mixtures encountered in indoor air. A synthetic office mixture contained 27 VOCs commonly measured in office buildings. A cleaning product mixture contained three cleaning products with high market shares. A building product mixture was created by combining sources including painted wallboard, composite wood products, carpet systems, and vinyl flooring. A fourth mixture contained formaldehyde and acetaldehyde. Steady-state concentrations were produced in a classroom laboratory or a 20-m3 environmental chamber. Air was drawn through the UVPCO, and single pass conversion efficiencies were measured from replicate air samples collected upstream and downstream of the reactor section. Concentrations of the mixtures were manipulated, with concentrations of individual VOCs mostly maintained below 10 ppb. Device flow rates were varied between 165 and 580 m3/h. Production of formaldehyde, acetaldehyde, acetone, formic acid, and acetic acid as reaction products was investigated. Conversion efficiency data were generated for 48 individual VOCs or groups of closely related compounds. Alcohols and glycol ethers were the

  9. Gender moderates the association between dorsal medial prefrontal cortex volume and depressive symptoms in a subclinical sample.

    Science.gov (United States)

    Carlson, Joshua M; Depetro, Emily; Maxwell, Joshua; Harmon-Jones, Eddie; Hajcak, Greg

    2015-08-30

    Major depressive disorder is associated with lower medial prefrontal cortex volumes. The role that gender might play in moderating this relationship and what particular medial prefrontal cortex subregion(s) might be implicated is unclear. Magnetic resonance imaging was used to assess dorsal, ventral, and anterior cingulate regions of the medial prefrontal cortex in a normative sample of male and female adults. The Depression, Anxiety, and Stress Scale (DASS) was used to measure these three variables. Voxel-based morphometry was used to test for correlations between medial prefrontal gray matter volume and depressive traits. The dorsal medial frontal cortex was correlated with greater levels of depression, but not anxiety and stress. Gender moderates this effect: in males greater levels of depression were associated with lower dorsal medial prefrontal volumes, but in females no relationship was observed. The results indicate that even within a non-clinical sample, male participants with higher levels of depressive traits tend to have lower levels of gray matter volume in the dorsal medial prefrontal cortex. Our finding is consistent with low dorsal medial prefrontal volume contributing to the development of depression in males. Future longitudinal work is needed to substantiate this possibility. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. International collaboration in SSC (or any $4 billion scientific project)

    International Nuclear Information System (INIS)

    Lederman, L.M.

    1988-01-01

    In this paper, the author discusses the superconducting supercollider. This is a project that costs U.S. $4.4 billion. The author spends a short time giving the motivation (which is a scientific motivation) and also giving the idea of how it is possible, with U.S. deficits

  11. Critical Propulsion Components. Volume 1; Summary, Introduction, and Propulsion Systems Studies

    Science.gov (United States)

    2005-01-01

    Several studies have concluded that a supersonic aircraft, if environmentally acceptable and economically viable, could successfully compete in the 21st century marketplace. However, before industry can commit to what is estimated as a 15 to 20 billion dollar investment, several barrier issues must be resolved. In an effort to address these barrier issues, NASA and Industry teamed to form the High-Speed Research (HSR) program. As part of this program, the Critical Propulsion Components (CPC) element was created and assigned the task of developing those propulsion component technologies necessary to: (1) reduce cruise emissions by a factor of 10 and (2) meet the ever-increasing airport noise restrictions with an economically viable propulsion system. The CPC-identified critical components were ultra-low emission combustors, low-noise/high-performance exhaust nozzles, low-noise fans, and stable/high-performance inlets. Propulsion cycle studies (coordinated with NASA Langley Research Center sponsored airplane studies) were conducted throughout this CPC program to help evaluate candidate components and select the best concepts for the more complex and larger scale research efforts. The propulsion cycle and components ultimately selected were a mixed-flow turbofan (MFTF) engine employing a lean, premixed, prevaporized (LPP) combustor coupled to a two-dimensional mixed compression inlet and a two-dimensional mixer/ejector nozzle. Due to the large amount of material presented in this report, it was prepared in four volumes; Volume 1: Summary, Introduction, and Propulsion System Studies, Volume 2: Combustor, Volume 3: Exhaust Nozzle, and Volume 4: Inlet and Fan/ Inlet Acoustic Team.

  12. Defining a Hospital Volume Threshold for Minimally Invasive Pancreaticoduodenectomy in the United States

    Science.gov (United States)

    Adam, Mohamed Abdelgadir; Thomas, Samantha; Youngwirth, Linda; Pappas, Theodore; Roman, Sanziana A.

    2016-01-01

    Importance There is increasing interest in expanding use of minimally invasive pancreaticoduodenectomy (MIPD). This procedure is complex, with data suggesting a significant association between hospital volume and outcomes. Objective To determine whether there is an MIPD hospital volume threshold for which patient outcomes could be optimized. Design, Setting, and Participants Adult patients undergoing MIPD were identified from the Healthcare Cost and Utilization Project National Inpatient Sample from 2000 to 2012. Multivariable models with restricted cubic splines were used to identify a hospital volume threshold by plotting annual hospital volume against the adjusted odds of postoperative complications. The current analysis was conducted on August 16, 2016. Main Outcomes and Measures Incidence of any complication. Results Of the 865 patients who underwent MIPD, 474 (55%) were male and the median patient age was 67 years (interquartile range, 59-74 years). Among the patients, 747 (86%) had cancer and 91 (11%) had benign conditions/pancreatitis. Overall, 410 patients (47%) had postoperative complications and 31 (4%) died in-hospital. After adjustment for demographic and clinical characteristics, increasing hospital volume was associated with reduced complications (overall association P < .001); the likelihood of experiencing a complication declined as hospital volume increased up to 22 cases per year (95% CI, 21-23). Median hospital volume was 6 cases per year (range, 1-60). Most patients (n = 717; 83%) underwent the procedure at low-volume (≤22 cases per year) hospitals. After adjustment for patient mix, undergoing MIPD at low- vs high-volume hospitals was significantly associated with increased odds for postoperative complications (odds ratio, 1.74; 95% CI, 1.03-2.94; P = .04). Conclusions and Relevance Hospital volume is significantly associated with improved outcomes from MIPD, with a threshold of 22 cases per year. Most patients undergo MIPD at low-volume
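
    A hedged sketch of the threshold-finding idea: fit a logistic model with a restricted (natural) cubic spline of annual hospital volume and look for the volume beyond which the adjusted risk curve flattens. The data are synthetic and the model is far simpler than the authors' NIS analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration of a volume-threshold analysis: model complication
# probability as a smooth function of annual hospital volume with a natural
# cubic spline (patsy's cr() basis), then locate where the risk curve flattens.

rng = np.random.default_rng(0)
n = 5000
volume = rng.integers(1, 61, size=n)               # annual hospital case volume
age = rng.normal(67, 8, size=n)
# true risk declines with volume up to ~22 cases/year, then levels off
risk = 0.55 - 0.01 * np.minimum(volume, 22) + 0.002 * (age - 67)
complication = rng.uniform(size=n) < risk

df = pd.DataFrame({"complication": complication.astype(int),
                   "volume": volume, "age": age})

# logistic regression with a natural cubic regression spline of volume
model = smf.logit("complication ~ cr(volume, df=4) + age", data=df).fit(disp=False)

# predicted risk across the volume range (age held at its mean)
grid = pd.DataFrame({"volume": np.arange(1, 61), "age": df["age"].mean()})
grid["predicted_risk"] = model.predict(grid)

# crude threshold estimate: smallest volume within 1% of the minimum risk
floor = grid["predicted_risk"].min()
threshold = grid.loc[grid["predicted_risk"] <= floor * 1.01, "volume"].iloc[0]
print("estimated volume threshold:", threshold, "cases/year")
```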

  13. Planck/SDSS Cluster Mass and Gas Scaling Relations for a Volume-Complete redMaPPer Sample

    Science.gov (United States)

    Jimeno, Pablo; Diego, Jose M.; Broadhurst, Tom; De Martino, I.; Lazkoz, Ruth

    2018-04-01

    Using Planck satellite data, we construct Sunyaev-Zel'dovich (SZ) gas pressure profiles for a large, volume-complete sample of optically selected clusters. We have defined a sample of over 8,000 redMaPPer clusters from the Sloan Digital Sky Survey (SDSS), within a volume-complete redshift region beginning at z = 0.100. The stacked pressure profiles show a trend towards larger break radius with increasing cluster mass. Our SZ-based masses fall ~16% below the mass-richness relations from weak lensing, in a similar fashion to the "hydrostatic bias" associated with X-ray derived masses. Finally, we derive a tight Y500-M500 relation over a wide range of cluster mass, with a power law slope equal to 1.70 ± 0.07, that agrees well with the independent slope obtained by the Planck team with an SZ-selected cluster sample, but extends to lower masses with higher precision.

  14. SU-F-T-501: Dosimetric Comparison of Single Arc-Per-Beam and Two Arc-Per-Beam VMAT Optimization in the Monaco Treatment Planning System

    Energy Technology Data Exchange (ETDEWEB)

    Kalet, A; Cao, N; Meyer, J; Dempsey, C [University of Washington Medical Center, Seattle, WA (United States); Seattle Cancer Care Alliance, Seattle, WA (United States); Richardson, H [Seattle Cancer Care Alliance, Seattle, WA (United States)

    2016-06-15

    Purpose: The purpose of this study was to evaluate the dosimetric and practical effects of the Monaco treatment planning system “max arcs-per-beam” optimization parameter in pelvic radiotherapy treatments. Methods: A total of 17 previously treated patients were selected for this study with a range of pelvic disease sites including prostate (9), bladder (1), uterus (3), rectum (3), and cervix (1). For each patient, two plans were generated, one using an arcs-per-beam setting of ‘1’ and another with a setting of ‘2’. The setting allows the optimizer to add a gantry direction change, creating multiple arc passes per beam sequence. Volumes and constraints established from the initial clinical treatments were used for planning. All constraints and dose coverage objects were kept the same between plans, and all plans were normalized to 99.7% to ensure 100% of the PTV received 95% of the prescription dose. We evaluated the PTV conformity index, homogeneity index, total monitor units, number of control points, and various dose volume histogram (DVH) points for statistical comparison (alpha=0.05). Results: We found for the 10 complex shaped target volumes (small central volumes with extending bilateral ‘arms’ to cover nodal regions) that the use of 2 arcs-per-beam achieved significantly lower average DVH values for the bladder V20 (p=0.036) and rectum V30 (p=0.001) while still meeting the high dose target constraints. DVH values for the simpler, more spherical PTVs were not found significantly different. Additionally, we found a beam delivery time reduction of approximately 25%. Conclusion: In summary, the dosimetric benefit, while moderate, was improved over a 1 arc-per-beam setting for complex PTVs, and equivalent in other cases. The overall reduced delivery time suggests that the use of multiple arcs-per-beam could lead to reduced patient on-table time, increased clinical throughput, and reduced medical physics quality assurance effort.

  15. Satellite Power Systems (SPS) concept definition study, exhibit C. Volume 2, part 2: System engineering, cost and programmatics, appendixes

    Science.gov (United States)

    Hanley, G. M.

    1979-01-01

    Appendixes for Volume 2 (Part 2) of a seven-volume Satellite Power Systems (SPS) report are presented. The document contains two appendixes. The first is an SPS work breakdown structure dictionary. The second gives SPS cost estimating relationships and contains the cost analyses and a description of cost elements that comprise the SPS program.

  16. Cost of solving mysteries of universe: $6.7 billion

    CERN Multimedia

    Overbye, Dennis

    2007-01-01

    "An international consortium of physicists on Thursday released the first detailed design of what they believe will be the next big thing in physics. The machine, 20 miles long, will slam together electrons and their opposites, positrons, to produce fireballs of energy re-creating conditions when the universe was only a trillionth of a second old. It would cost about $6.7 billion." (1 page)

  17. 10 CFR Appendix to Part 474 - Sample Petroleum-Equivalent Fuel Economy Calculations

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Sample Petroleum-Equivalent Fuel Economy Calculations..., DEVELOPMENT, AND DEMONSTRATION PROGRAM; PETROLEUM-EQUIVALENT FUEL ECONOMY CALCULATION Pt. 474, App. Appendix to Part 474—Sample Petroleum-Equivalent Fuel Economy Calculations Example 1: An electric vehicle is...

  18. Preparation and calibration by liquid scintillation of a sample of Cl 36

    International Nuclear Information System (INIS)

    Grau Malonda, A.; Los Arcos, J.M.; Rodriguez Barquero, L.; Suarez, C.

    1989-01-01

    A procedure to prepare a sample of chlorine-36, as Li36Cl, suitable for measurement by liquid scintillation counting, is described. The sample is chemically stable, with no variation of the quenching parameter up to 4 mg of LiCl per 15 ml of scintillator; it keeps the counting efficiency constant for concentrations higher than 40 μg of Li36Cl in that volume, and shows no deterioration over a 3-week period. The Li36Cl solution has been standardized using the free parameter method with different volumes of toluene, PCS and Instagel, to an uncertainty of 0.3% (Author)

  19. Redefining agricultural yields: from tonnes to people nourished per hectare

    International Nuclear Information System (INIS)

    Cassidy, Emily S; West, Paul C; Gerber, James S; Foley, Jonathan A

    2013-01-01

    Worldwide demand for crops is increasing rapidly due to global population growth, increased biofuel production, and changing dietary preferences. Meeting these growing demands will be a substantial challenge that will tax the capability of our food system and prompt calls to dramatically boost global crop production. However, to increase food availability, we may also consider how the world’s crops are allocated to different uses and whether it is possible to feed more people with current levels of crop production. Of particular interest are the uses of crops as animal feed and as biofuel feedstocks. Currently, 36% of the calories produced by the world’s crops are being used for animal feed, and only 12% of those feed calories ultimately contribute to the human diet (as meat and other animal products). Additionally, human-edible calories used for biofuel production increased fourfold between the years 2000 and 2010, from 1% to 4%, representing a net reduction of available food globally. In this study, we re-examine agricultural productivity, going from using the standard definition of yield (in tonnes per hectare, or similar units) to using the number of people actually fed per hectare of cropland. We find that, given the current mix of crop uses, growing food exclusively for direct human consumption could, in principle, increase available food calories by as much as 70%, which could feed an additional 4 billion people (more than the projected 2–3 billion people arriving through population growth). Even small shifts in our allocation of crops to animal feed and biofuels could significantly increase global food availability, and could be an instrumental tool in meeting the challenges of ensuring global food security. (letter)

  20. Redefining agricultural yields: from tonnes to people nourished per hectare

    Science.gov (United States)

    Cassidy, Emily S.; West, Paul C.; Gerber, James S.; Foley, Jonathan A.

    2013-09-01

    Worldwide demand for crops is increasing rapidly due to global population growth, increased biofuel production, and changing dietary preferences. Meeting these growing demands will be a substantial challenge that will tax the capability of our food system and prompt calls to dramatically boost global crop production. However, to increase food availability, we may also consider how the world’s crops are allocated to different uses and whether it is possible to feed more people with current levels of crop production. Of particular interest are the uses of crops as animal feed and as biofuel feedstocks. Currently, 36% of the calories produced by the world’s crops are being used for animal feed, and only 12% of those feed calories ultimately contribute to the human diet (as meat and other animal products). Additionally, human-edible calories used for biofuel production increased fourfold between the years 2000 and 2010, from 1% to 4%, representing a net reduction of available food globally. In this study, we re-examine agricultural productivity, going from using the standard definition of yield (in tonnes per hectare, or similar units) to using the number of people actually fed per hectare of cropland. We find that, given the current mix of crop uses, growing food exclusively for direct human consumption could, in principle, increase available food calories by as much as 70%, which could feed an additional 4 billion people (more than the projected 2-3 billion people arriving through population growth). Even small shifts in our allocation of crops to animal feed and biofuels could significantly increase global food availability, and could be an instrumental tool in meeting the challenges of ensuring global food security.
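
    A toy version of the calorie accounting above: the 36% feed share and the ~12% feed-to-food conversion come from the abstract, while the per-hectare calorie yield, the dietary requirement and the direct-food share are placeholder assumptions, so the printed numbers are only illustrative rather than the paper's results.

```python
# Toy "people nourished per hectare" accounting. The 36% animal-feed share and
# the 12% feed-to-food calorie conversion come from the abstract; the calorie
# yield, dietary requirement and direct-food share are placeholder assumptions.

CAL_PER_HA_YEAR = 6.0e6            # assumed crop calories produced per hectare
CAL_PER_PERSON_YEAR = 2700 * 365   # assumed dietary requirement per person

def people_fed_per_hectare(direct_share, feed_share, feed_to_food=0.12):
    """Calories reaching human diets per hectare, expressed as people fed."""
    delivered = CAL_PER_HA_YEAR * (direct_share + feed_share * feed_to_food)
    return delivered / CAL_PER_PERSON_YEAR

# current mix: assumed 55% eaten directly, 36% fed to animals, the remainder
# (biofuel and other non-food uses) delivering no food calories
current = people_fed_per_hectare(direct_share=0.55, feed_share=0.36)

# hypothetical mix: every crop calorie grown for direct human consumption
all_food = people_fed_per_hectare(direct_share=1.0, feed_share=0.0)

print(f"current mix:     {current:.2f} people fed per hectare")
print(f"all direct food: {all_food:.2f} people fed per hectare")
print(f"potential gain:  {100 * (all_food / current - 1):.0f}%")
```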

  1. Development of modelling and forecasting in geology. (Volume 1)

    International Nuclear Information System (INIS)

    Afzali, H.; Fourniguet, J.; Peaudecerf, P.

    1990-01-01

    To assess the long-term safety of radioactive waste disposal systems, validation of predictive models is essential and large efforts should be devoted to barriers, particularly geologic barriers. This work appears in the form of four volumes; the subject of the first volume is described below. Soil and rock erosion depends upon climate, relief, lithology and human activities (deforestation). Worldwide, mechanical erosion is evaluated at 5 to 8 cm per 1000 years (mean denudation rate). Rock weathering solubilizes chemical elements into running water, and rock fracturing proceeds more easily under the effects of erosion. The alteration front progresses at 0.3-3 cm per 1000 years in temperate zones and 4-7 cm per 1000 years in tropical zones. 5 figs., 14 tabs., 80 refs

  2. Areva - First quarter 2009 revenue climbs 8.5% to 3.003 billion euros

    International Nuclear Information System (INIS)

    2009-04-01

    First quarter 2009 revenue was up 8.5% compared with the same period last year, to 3.003 billion euros. At constant exchange rates and consolidation scope, growth came to 3.9%. Currency translation had a positive impact of 57 million euros over the quarter. Changes in the consolidation scope had an impact of 66 million euros, primarily due to the consolidation of acquisitions made in 2008 in Transmission and Distribution and in Renewable Energies. The growth engines for first quarter revenue were the Reactors and Services division and the Transmission and Distribution division, with growth of 9.2% and 16.1% respectively. Outside France, revenue rose to 2.032 billion euros, compared with 1.857 billion euros in the first quarter of 2008, and represents 68% of total revenue. Orders were steady in the first quarter, particularly in the Front End, which posted several significant contracts with US and Asian utilities, and in Transmission and Distribution, with orders up sharply in Asia and South America. As of March 31, 2009, the group's backlog reached 49.5 billion euros, for 28.3% growth year-on-year, including 31.3% growth in Nuclear and 10.2% in Transmission and Distribution. For the year as a whole, the group confirms its outlook for backlog and revenue growth as well as rising operating income. It should be noted that revenue may vary significantly from one quarter to the next in nuclear operations. Accordingly, quarterly data cannot be viewed as a reliable indicator of annual trends

  3. On the design of a real-time volume rendering engine

    NARCIS (Netherlands)

    Smit, Jaap; Wessels, H.L.F.; van der Horst, A.; Bentum, Marinus Jan

    1992-01-01

    An architecture for a Real-Time Volume Rendering Engine (RT-VRE) is given, capable of computing 750 × 750 × 512 samples from a 3D dataset at a rate of 25 images per second. The RT-VRE uses for this purpose 64 dedicated rendering chips, cooperating with 16 RISC-processors. A plane interpolator

  4. On the design of a real-time volume rendering engine

    NARCIS (Netherlands)

    Smit, Jaap; Wessels, H.J.; van der Horst, A.; Bentum, Marinus Jan

    1995-01-01

    An architecture for a Real-Time Volume Rendering Engine (RT-VRE) is given, capable of computing 750 × 750 × 512 samples from a 3D dataset at a rate of 25 images per second. The RT-VRE uses for this purpose 64 dedicated rendering chips, cooperating with 16 RISC-processors. A plane interpolator

  5. Relationship Between LIBS Ablation and Pit Volume for Geologic Samples: Applications for the In Situ Absolute Geochronology

    Science.gov (United States)

    Devismes, Damien; Cohen, Barbara; Miller, J.-S.; Gillot, P.-Y.; Lefevre, J.-C.; Boukari, C.

    2014-01-01

    These first results demonstrate that LIBS spectra can be an interesting tool for estimating the ablated volume. When the ablated volume is larger than 9x10^6 cubic micrometers, the method has less than 10% uncertainty, which is sufficient for direct implementation in the KArLE experiment protocol. Nevertheless, depending on the samples and their mean grain size, the difficulty of obtaining homogeneous spectra will increase with the ablated volume. Several K-Ar dating studies based on this approach will be implemented; the results will then be shown and discussed.

  6. Using Mobile Device Samples to Estimate Traffic Volumes

    Science.gov (United States)

    2017-12-01

    In this project, TTI worked with StreetLight Data to evaluate a beta version of its traffic volume estimates derived from global positioning system (GPS)-based mobile devices. TTI evaluated the accuracy of average annual daily traffic (AADT) volume :...

  7. Water pollution screening by large-volume injection of aqueous samples and application to GC/MS analysis of a river Elbe sample

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, S.; Efer, J.; Engewald, W. [Leipzig Univ. (Germany). Inst. fuer Analytische Chemie

    1997-03-01

    The large-volume sampling of aqueous samples in a programmed temperature vaporizer (PTV) injector was used successfully for the target and non-target analysis of real samples. In this still rarely applied method, e.g., 1 mL of the water sample to be analyzed is slowly injected directly into the PTV. The vaporized water is eliminated through the split vent. The analytes are concentrated onto an adsorbent inside the insert and subsequently thermally desorbed. The capability of the method is demonstrated using a sample from the river Elbe. By coupling this method with a mass-selective detector in SIM mode (target analysis), pollutants can be determined at concentrations down to 0.01 µg/L. Furthermore, PTV enrichment is an effective and time-saving method for non-target analysis in SCAN mode. In a sample from the river Elbe over 20 compounds were identified. (orig.) With 3 figs., 2 tabs.

  8. Real-time interpolation for true 3-dimensional ultrasound image volumes.

    Science.gov (United States)

    Ji, Songbai; Roberts, David W; Hartov, Alex; Paulsen, Keith D

    2011-02-01

    We compared trilinear interpolation to voxel nearest neighbor and distance-weighted algorithms for fast and accurate processing of true 3-dimensional ultrasound (3DUS) image volumes. In this study, the computational efficiency and interpolation accuracy of the 3 methods were compared on the basis of a simulated 3DUS image volume, 34 clinical 3DUS image volumes from 5 patients, and 2 experimental phantom image volumes. We show that trilinear interpolation improves interpolation accuracy over both the voxel nearest neighbor and distance-weighted algorithms yet achieves real-time computational performance that is comparable to the voxel nearest neighbor algorithm (1-2 orders of magnitude faster than the distance-weighted algorithm) as well as the fastest pixel-based algorithms for processing tracked 2-dimensional ultrasound images (0.035 seconds per 2-dimensional cross-sectional image [76,800 pixels interpolated, or 0.46 ms/1000 pixels] and 1.05 seconds per full volume with a 1-mm³ voxel size [4.6 million voxels interpolated, or 0.23 ms/1000 voxels]). On the basis of these results, trilinear interpolation is recommended as a fast and accurate interpolation method for rectilinear sampling of 3DUS image acquisitions, which is required to facilitate subsequent processing and display during operating room procedures such as image-guided neurosurgery.
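
    The accuracy advantage described above comes from the standard trilinear formula, which blends the eight voxels surrounding a continuous sample position. A minimal Python sketch is given below; the array layout and coordinate convention are assumptions for illustration and are not taken from the paper.

        import numpy as np

        def trilinear(volume, x, y, z):
            """Trilinearly interpolate volume[i, j, k] at continuous voxel coordinates (x, y, z).

            Coordinates are in voxel units; the caller must keep them at least one
            voxel away from the upper edges of the grid.
            """
            i, j, k = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
            dx, dy, dz = x - i, y - j, z - k

            # Blend along x, then y, then z, using the 8 surrounding voxels.
            c00 = volume[i, j, k]         * (1 - dx) + volume[i + 1, j, k]         * dx
            c10 = volume[i, j + 1, k]     * (1 - dx) + volume[i + 1, j + 1, k]     * dx
            c01 = volume[i, j, k + 1]     * (1 - dx) + volume[i + 1, j, k + 1]     * dx
            c11 = volume[i, j + 1, k + 1] * (1 - dx) + volume[i + 1, j + 1, k + 1] * dx
            c0 = c00 * (1 - dy) + c10 * dy
            c1 = c01 * (1 - dy) + c11 * dy
            return c0 * (1 - dz) + c1 * dz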

  9. 40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations

    Science.gov (United States)

    2010-07-01

    ... Compliance Times for Commercial and Industrial Solid Waste Incineration Units that Commenced Construction On... averaging time And determining compliance using this method Cadmium 0.004 milligrams per dry standard cubic... this part) Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample...

  10. Computational methods and modeling. 1. Sampling a Position Uniformly in a Trilinear Hexahedral Volume

    International Nuclear Information System (INIS)

    Urbatsch, Todd J.; Evans, Thomas M.; Hughes, H. Grady

    2001-01-01

    Monte Carlo particle transport plays an important role in some multi-physics simulations. These simulations, which may additionally involve deterministic calculations, typically use a hexahedral or tetrahedral mesh. Trilinear hexahedrons are attractive for physics calculations because faces between cells are uniquely defined, distance-to-boundary calculations are deterministic, and hexahedral meshes tend to require fewer cells than tetrahedral meshes. We discuss one aspect of Monte Carlo transport: sampling a position in a tri-linear hexahedron, which is made up of eight control points, or nodes, and six bilinear faces, where each face is defined by four non-coplanar nodes in three-dimensional Cartesian space. We derive, code, and verify the exact sampling method and propose an approximation to it. Our proposed approximate method uses about one-third the memory and can be twice as fast as the exact sampling method, but we find that its inaccuracy limits its use to well-behaved hexahedrons. Daunted by the expense of the exact method, we propose an alternate approximate sampling method. First, calculate beforehand an approximate volume for each corner of the hexahedron by taking one-eighth of the volume of an imaginary parallelepiped defined by the corner node and the three nodes to which it is directly connected. For the sampling, assume separability in the parameters, and sample each parameter, in turn, from a linear pdf defined by the sum of the four corner volumes at each limit (-1 and 1) of the parameter. This method ignores the quadratic portion of the pdf, but it requires less storage, has simpler sampling, and needs no extra, on-the-fly calculations. We simplify verification by designing tests that consist of one or more cells that entirely fill a unit cube. Uniformly sampling complicated cells that fill a unit cube will result in uniformly sampling the unit cube. Unit cubes are easily analyzed. The first problem has four wedges (or tents, or A frames) whose
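
    A minimal Python sketch of the approximate sampling scheme described in this abstract follows: corner volumes are taken as one-eighth of the parallelepiped spanned by each corner's three edges, each natural coordinate is drawn from a linear pdf weighted by the summed corner volumes at its two limits, and the point is mapped to Cartesian space with trilinear shape functions. The node ordering and shape-function convention are standard finite-element assumptions, not taken verbatim from the paper.

        import numpy as np

        # Natural-coordinate signs and edge-connected neighbours for a standard
        # 8-node hexahedron ordering (an assumption; the paper does not fix one).
        SIGNS = np.array([[-1, -1, -1], [1, -1, -1], [1, 1, -1], [-1, 1, -1],
                          [-1, -1, 1], [1, -1, 1], [1, 1, 1], [-1, 1, 1]], dtype=float)
        NEIGHBOURS = [(1, 3, 4), (0, 2, 5), (1, 3, 6), (0, 2, 7),
                      (0, 5, 7), (1, 4, 6), (2, 5, 7), (3, 4, 6)]

        def corner_volumes(nodes):
            """One-eighth of the parallelepiped volume spanned by each corner's three edges."""
            vols = np.empty(8)
            for n, (a, b, c) in enumerate(NEIGHBOURS):
                edges = np.stack([nodes[a] - nodes[n], nodes[b] - nodes[n], nodes[c] - nodes[n]])
                vols[n] = abs(np.linalg.det(edges)) / 8.0
            return vols

        def sample_linear(a, b, rng):
            """Sample t in [-1, 1] from a linear pdf with weight a at -1 and b at +1."""
            r = rng.random()
            x = r if np.isclose(a, b) else (np.sqrt(a * a * (1 - r) + b * b * r) - a) / (b - a)
            return 2.0 * x - 1.0

        def sample_position(nodes, rng=None):
            """Approximately uniform position inside a trilinear hexahedron (nodes: 8x3 array)."""
            rng = rng or np.random.default_rng()
            vols = corner_volumes(nodes)
            uvw = np.empty(3)
            for axis in range(3):
                lo = vols[SIGNS[:, axis] < 0].sum()   # summed corner volumes at the -1 limit
                hi = vols[SIGNS[:, axis] > 0].sum()   # summed corner volumes at the +1 limit
                uvw[axis] = sample_linear(lo, hi, rng)
            weights = np.prod(1.0 + SIGNS * uvw, axis=1) / 8.0   # trilinear shape functions
            return weights @ nodes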

  11. 40 CFR Appendix III to Part 600 - Sample Fuel Economy Label Calculation

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Sample Fuel Economy Label Calculation...) ENERGY POLICY FUEL ECONOMY AND CARBON-RELATED EXHAUST EMISSIONS OF MOTOR VEHICLES Pt. 600, App. III Appendix III to Part 600—Sample Fuel Economy Label Calculation Suppose that a manufacturer called Mizer...

  12. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part A, Remedial action

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part A of Volume 3 and contains the Remedial Action section

  13. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part A, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part A of Volume 3 and contains the Remedial Action section.

  14. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method, and the concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be minor. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Uranium disequilibrium investigation of the Las Cruces East Mesa Geothermal Field

    International Nuclear Information System (INIS)

    Gross, J.; Cochran, J.; Icerman, L.

    1985-03-01

    The concentration of dissolved uranium in 33 thermal and nonthermal groundwaters was found to vary from less than 1 part per billion to 285 parts per billion. The uranium-234 to uranium-238 alpha activity ratio of the 33 samples varied from 0.8 to 4.6. Young waters in the recharge area of the Jornada del Muerto Basin are characterized by low uranium concentrations and high activity ratios. Uranium concentrations of groundwaters increase down hydraulic gradient. Concentrations and activity ratios of dissolved uranium in Mesilla Valley groundwater exhibit wide variation and appear to be related to both short-term and long-term removal of groundwater from storage. Geothermal waters exhibit low uranium concentrations and activity ratios. The water produced from New Mexico State University geothermal wells appears to be a mixture of deep upwelling geothermal water and shallow Jornada del Muerto Basin water. The low activity ratio of water from an 800 meter geothermal well may be the result of thermally-induced isotopic equilibration. Isotopic equilibration suggests that higher temperatures may be found deeper within the reservoir

  16. SyPRID sampler: A large-volume, high-resolution, autonomous, deep-ocean precision plankton sampling system

    Science.gov (United States)

    Billings, Andrew; Kaiser, Carl; Young, Craig M.; Hiebert, Laurel S.; Cole, Eli; Wagner, Jamie K. S.; Van Dover, Cindy Lee

    2017-03-01

    The current standard for large-volume (thousands of cubic meters) zooplankton sampling in the deep sea is the MOCNESS, a system of multiple opening-closing nets, typically lowered to within 50 m of the seabed and towed obliquely to the surface to obtain low-spatial-resolution samples that integrate across tens of meters of water depth. The SyPRID (Sentry Precision Robotic Impeller Driven) sampler is an innovative, deep-rated (6000 m) plankton sampler that partners with the Sentry Autonomous Underwater Vehicle (AUV) to obtain paired, large-volume plankton samples at specified depths and survey lines to within 1.5 m of the seabed and with simultaneous collection of sensor data. SyPRID uses a perforated Ultra-High-Molecular-Weight (UHMW) plastic tube to support a fine mesh net within an outer carbon composite tube (tube-within-a-tube design), with an axial flow pump located aft of the capture filter. The pump facilitates flow through the system and reduces or possibly eliminates the bow wave at the mouth opening. The cod end, a hollow truncated cone, is also made of UHMW plastic and includes a collection volume designed to provide an area where zooplankton can collect, out of the high flow region. SyPRID attaches as a saddle-pack to the Sentry vehicle. Sentry itself is configured with a flight control system that enables autonomous survey paths to low altitudes. In its verification deployment at the Blake Ridge Seep (2160 m) on the US Atlantic Margin, SyPRID was operated for 6 h at an altitude of 5 m. It recovered plankton samples, including delicate living larvae, from the near-bottom stratum that is seldom sampled by a typical MOCNESS tow. The prototype SyPRID and its next generations will enable studies of plankton or other particulate distributions associated with localized physico-chemical strata in the water column or above patchy habitats on the seafloor.

  17. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    Science.gov (United States)

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  18. National Trends in Prostate Biopsy and Radical Prostatectomy Volumes Following the US Preventive Services Task Force Guidelines Against Prostate-Specific Antigen Screening.

    Science.gov (United States)

    Halpern, Joshua A; Shoag, Jonathan E; Artis, Amanda S; Ballman, Karla V; Sedrakyan, Art; Hershman, Dawn L; Wright, Jason D; Shih, Ya Chen Tina; Hu, Jim C

    2017-02-01

    Studies demonstrate that use of prostate-specific antigen screening decreased significantly following the US Preventive Services Task Force (USPSTF) recommendation against prostate-specific antigen screening in 2012. To determine downstream effects on practice patterns in prostate cancer diagnosis and treatment following the 2012 USPSTF recommendation. Procedural volumes of certifying and recertifying urologists from 2009 through 2016 were evaluated for variation in prostate biopsy and radical prostatectomy (RP) volume. Trends were confirmed using the New York Statewide Planning and Research Cooperative System and Nationwide Inpatient Sample. The study included a representative sample of urologists across practice settings and a nationally representative sample of all RP discharges. We obtained operative case logs from the American Board of Urology and identified urologists performing at least 1 prostate biopsy (n = 5173) or RP (n = 3748), respectively. The 2012 USPSTF recommendation against routine population-wide prostate-specific antigen screening. Change in median biopsy and RP volume per urologist and national procedural volume. Following the USPSTF recommendation, median biopsy volume per urologist decreased from 29 to 21 (interquartile range [IQR], 12-34). Overall, prostate biopsy and RP volumes decreased significantly. A panoramic vantage point is needed to evaluate the long-term consequences of the 2012 USPSTF recommendation.

  19. Final Report, Volume 2, The Development of Qualification Standards for Cast Duplex Stainless Steel

    Energy Technology Data Exchange (ETDEWEB)

    Russell, Steven, W.; Lundin, Carl, D.

    2005-09-30

    The scope of testing cast Duplex Stainless Steel (DSS) required testing to several ASTM specifications, while formulating and conducting industry round robin tests to verify and study the reproducibility of the results. ASTM E562 (Standard Test Method for Determining Volume Fraction by Systematic manual Point Count) and ASTM A923 (Standard Test Methods for Detecting Detrimental Intermetallic Phase in Wrought Duplex Austenitic/Ferritic Stainless Steels) were the specifications utilized in conducting this work. An ASTM E562 industry round robin, ASTM A923 applicability study, ASTM A923 industry round robin, and an ASTM A923 study of the effectiveness of existing foundry solution annealing procedures for producing cast DSS without intermetallic phases were implemented. In the ASTM E562 study, 5 samples were extracted from various cast austenitic and DSS in order to have varying amounts of ferrite. Each sample was metallographically prepared by UT and sent to each of 8 participants for volume fraction of ferrite measurements. Volume fraction of ferrite was measured using manual point count per ASTM E562. FN was measured from the Feritescope® and converted to volume fraction of ferrite. Results indicate that ASTM E562 is applicable to DSS and the results have excellent lab-to-lab reproducibility. Also, volume fraction of ferrite conversions from the FN measured by the Feritescope® were similar to volume fraction of ferrite measured per ASTM E562. In the ASTM A923 applicability to cast DSS study, 8 different heat treatments were performed on 3 lots of ASTM A890-4A (CD3MN) castings and 1 lot of 2205 wrought DSS. The heat treatments were selected to produce a wide range of cooling rates and hold times in order to study the suitability of ASTM A923 to the response of varying amounts of intermetallic phases [117]. The test parameters were identical to those used to develop ASTM A923 for wrought DSS. Charpy V-notch impact samples were extracted from the

  20. Final Report, Volume 2, The Development of Qualification Standards for Cast Duplex Stainless Steel

    Energy Technology Data Exchange (ETDEWEB)

    Russell, Steven, W.; Lundin, Carl, W.

    2005-09-30

    The scope of testing cast Duplex Stainless Steel (DSS) required testing to several ASTM specifications, while formulating and conducting industry round robin tests to verify and study the reproducibility of the results. ASTM E562 (Standard Test Method for Determining Volume Fraction by Systematic manual Point Count) and ASTM A923 (Standard Test Methods for Detecting Detrimental Intermetallic Phase in Wrought Duplex Austenitic/Ferritic Stainless Steels) were the specifications utilized in conducting this work. An ASTM E562 industry round robin, ASTM A923 applicability study, ASTM A923 industry round robin, and an ASTM A923 study of the effectiveness of existing foundry solution annealing procedures for producing cast DSS without intermetallic phases were implemented. In the ASTM E562 study, 5 samples were extracted from various cast austenitic and DSS in order to have varying amounts of ferrite. Each sample was metallographically prepared by UT and sent to each of 8 participants for volume fraction of ferrite measurements. Volume fraction of ferrite was measured using manual point count per ASTM E562. FN was measured from the Feritescope® and converted to volume fraction of ferrite. Results indicate that ASTM E562 is applicable to DSS and the results have excellent lab-to-lab reproducibility. Also, volume fraction of ferrite conversions from the FN measured by the Feritescope® were similar to volume fraction of ferrite measured per ASTM E562. In the ASTM A923 applicability to cast DSS study, 8 different heat treatments were performed on 3 lots of ASTM A890-4A (CD3MN) castings and 1 lot of 2205 wrought DSS. The heat treatments were selected to produce a wide range of cooling rates and hold times in order to study the suitability of ASTM A923 to the response of varying amounts of intermetallic phases [117]. The test parameters were identical to those used to develop ASTM A923 for wrought DSS. Charpy V-notch impact samples were extracted from the castings and wrought

  1. Mars atmosphere. Mars methane detection and variability at Gale crater.

    Science.gov (United States)

    Webster, Christopher R; Mahaffy, Paul R; Atreya, Sushil K; Flesch, Gregory J; Mischna, Michael A; Meslin, Pierre-Yves; Farley, Kenneth A; Conrad, Pamela G; Christensen, Lance E; Pavlov, Alexander A; Martín-Torres, Javier; Zorzano, María-Paz; McConnochie, Timothy H; Owen, Tobias; Eigenbrode, Jennifer L; Glavin, Daniel P; Steele, Andrew; Malespin, Charles A; Archer, P Douglas; Sutter, Brad; Coll, Patrice; Freissinet, Caroline; McKay, Christopher P; Moores, John E; Schwenzer, Susanne P; Bridges, John C; Navarro-Gonzalez, Rafael; Gellert, Ralf; Lemmon, Mark T

    2015-01-23

    Reports of plumes or patches of methane in the martian atmosphere that vary over monthly time scales have defied explanation to date. From in situ measurements made over a 20-month period by the tunable laser spectrometer of the Sample Analysis at Mars instrument suite on Curiosity at Gale crater, we report detection of background levels of atmospheric methane of mean value 0.69 ± 0.25 parts per billion by volume (ppbv) at the 95% confidence interval (CI). This abundance is lower than model estimates of ultraviolet degradation of accreted interplanetary dust particles or carbonaceous chondrite material. Additionally, in four sequential measurements spanning a 60-sol period (where 1 sol is a martian day), we observed elevated levels of methane of 7.2 ± 2.1 ppbv (95% CI), implying that Mars is episodically producing methane from an additional unknown source. Copyright © 2015, American Association for the Advancement of Science.

  2. Global atmospheric concentrations and source strength of ethane

    Science.gov (United States)

    Blake, D. R.; Rowland, F. S.

    1986-01-01

    A study of the variation in ethane (C2H6) concentration between northern and southern latitudes over three years is presented together with a new estimate of its source strength. Ethane concentrations vary from 0.07 to 2 p.p.b.v. (parts per billion by volume) in air samples collected in remote surface locations in the Pacific (latitude 71° N-47° S) in all four seasons between September 1984 and June 1985. The variations are consistent with southerly transport from sources located chiefly in the Northern Hemisphere, further modified by seasonal variations in the strength of the reaction of C2H6 with OH radicals. These global data can be combined with concurrent data for CH4 and the laboratory reaction rates of each with OH to provide an estimate of three months as the average atmospheric lifetime for C2H6 and 13 ± 3 Mtons for its annual atmospheric release.
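
    The quoted lifetime follows from the first-order loss of ethane against OH, τ = 1/(k·[OH]). A worked example with rounded, illustrative values (neither number is taken from the paper) lands in the same few-month range:

        # Illustrative ethane lifetime against OH; the rate constant and global-mean
        # OH density are rough assumed values, not those used in the 1986 study.
        k_c2h6_oh = 1.8e-13       # cm^3 molecule^-1 s^-1, assumed effective tropospheric value
        oh_density = 8.0e5        # molecules cm^-3, assumed global mean

        tau_seconds = 1.0 / (k_c2h6_oh * oh_density)
        print(f"C2H6 lifetime ~ {tau_seconds / 86400 / 30:.1f} months")   # on the order of a few months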

  3. Stable isotope ratio measurements on highly enriched water samples by means of laser spectrometry

    NARCIS (Netherlands)

    van Trigt, R; Kerstel, E.R.T.; Visser, GH; Meijer, H.A.J.

    2001-01-01

    We demonstrate the feasibility of using laser spectrometry (LS) to analyze isotopically highly enriched water samples (i.e., δ²H ≤ 15,000‰, δ¹⁸O ≤ 1,200‰), as often used in the biomedical doubly labeled water (DLW)

  4. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part A, Characterization, decontamination, dismantlement

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This report is part A of Volume 3 concerning characterization, decontamination, and dismantlement.

  5. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies require the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss allowed in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter hamper precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances, and experiences in solid-phase extraction are presented by way of example on the basis of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972

  6. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    Full Text Available In the USA and Europe, medicines agencies require the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss allowed in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects. The latter hamper precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. The challenges faced, advances, and experiences in solid-phase extraction are presented by way of example on the basis of the bioanalytical method development and validation for low-volume samples (50 μL serum). Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method comprising sample extraction by solid-phase extraction was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  7. Ubiquitous Supercritical Wing Design Cuts Billions in Fuel Costs

    Science.gov (United States)

    2015-01-01

    A Langley Research Center engineer’s work in the 1960s and ’70s to develop a wing with better performance near the speed of sound resulted in a significant increase in subsonic efficiency. The design was shared with industry. Today, Renton, Washington-based Boeing Commercial Airplanes, as well as most other plane manufacturers, apply it to all their aircraft, saving the airline industry billions of dollars in fuel every year.

  8. Tracer techniques for urine volume determination and urine collection and sampling back-up system

    Science.gov (United States)

    Ramirez, R. V.

    1971-01-01

    The feasibility, functionality, and overall accuracy of the use of lithium were investigated as a chemical tracer in urine for providing a means of indirect determination of total urine volume by the atomic absorption spectrophotometry method. Experiments were conducted to investigate the parameters of instrumentation, tracer concentration, mixing times, and methods for incorporating the tracer material in the urine collection bag, and to refine and optimize the urine tracer technique to comply with the Skylab scheme and operational parameters of ±2% of volume error and ±1% accuracy of amount of tracer added to each container. In addition, a back-up method for urine collection and sampling system was developed and evaluated. This back-up method incorporates the tracer technique for volume determination in event of failure of the primary urine collection and preservation system. One chemical preservative was selected and evaluated as a contingency chemical preservative for the storage of urine in event of failure of the urine cooling system.
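
    The volume determination itself rests on simple tracer dilution: a known mass of lithium mixed into the collection bag implies the total volume once the equilibrium concentration is measured by atomic absorption. A minimal sketch follows; the numbers are invented for illustration and neglect the volume of the tracer solution itself.

        def urine_volume_ml(tracer_mass_mg, measured_conc_mg_per_ml):
            """Total volume implied by diluting a known tracer mass to the measured concentration."""
            return tracer_mass_mg / measured_conc_mg_per_ml

        # Example with invented numbers: 5 mg of lithium measured at 0.004 mg/mL implies ~1250 mL.
        print(urine_volume_ml(tracer_mass_mg=5.0, measured_conc_mg_per_ml=0.004))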

  9. Water resources inventory of Connecticut Part 10: Lower Connecticut River basin

    Science.gov (United States)

    Weiss, Lawrence A.; Bingham, James W.; Thomas, Mendall P.

    1982-01-01

    The lower Connecticut River basin study area in south-central Connecticut includes 639 square miles and is drained principally by the Connecticut River and by seven smaller streams that flow directly to Long Island Sound between the West River on the west and the Connecticut River on the east. The population in 1979 was estimated to be 210,380. Much of the industrial development and population centers are in the Mattabesset River basin in the northwestern part, and the largest water use is also in the Mattabesset River basin. Precipitation averages 47 inches per year and provides an abundant supply of water. About 20 inches returns to the atmosphere as evapotranspiration, and the remainder either flows directly to streams or percolates to the water table, eventually discharging to Long Island Sound. Small quantities of water are exported from the basin by the New Haven and Meriden Water Departments, and small quantities are imported by the New Britain Water Department and Metropolitan District Commission. Precipitation during 1931-60 resulted in an average annual runoff of 302 billion gallons. If inflow from the Connecticut River is added to the average annual runoff, then 4,370 billion gallons per year is potentially available for water use. The domestic, institutional, commercial, and industrial (other than cooling water) water use for 1970 was 7 billion gallons, which is only 3 percent of the total water used, whereas 97 percent of the total is cooling water for power plants. Approximately 60 percent of the 7 billion gallons is treated before being discharged back to the streams. The total amount of fresh water used during 1970 was estimated to be 256,000 million gallons (Mgal), of which 247,000 Mgal was used for cooling water at steam electric-generating plants. The quantity for domestic, commercial, industrial, and agricultural use was 9,000 Mgal, which was approximately 120 gallons a day per person. Public water systems providing 70 percent of these
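
    The per-capita figure quoted above can be checked directly from the quantities given (a quick arithmetic check, not part of the original report):

        annual_use_gal = 9000 * 1e6   # 9,000 Mgal/yr for domestic, commercial, industrial, and agricultural use
        population = 210380           # 1979 population estimate for the study area
        print(annual_use_gal / population / 365)   # ~117 gallons per person per day, i.e. roughly 120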

  10. The Muon g-2 experiment at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Chapelain, Antoine [Cornell U., Phys. Dept.

    2017-01-01

    The upcoming Fermilab E989 experiment will measure the muon anomalous magnetic moment aμ. This measurement is motivated by the previous measurement performed in 2001 by the BNL E821 experiment that reported a 3-4 standard deviation discrepancy between the measured value and the Standard Model prediction. The new measurement at Fermilab aims to improve the precision by a factor of four reducing the total uncertainty from 540 parts per billion (BNL E821) to 140 parts per billion (Fermilab E989). This paper gives the status of the experiment.

  11. Satellite Power Systems (SPS) concept definition study, exhibit C. Volume 2, part 2: System engineering, cost and programmatics

    Science.gov (United States)

    Hanley, G. M.

    1979-01-01

    Volume 2, Part 2, of a seven volume Satellite Power Systems (SPS) report is presented. Part 2 covers cost and programmatics and is divided into four sections. The first section gives illustrations of the SPS reference satellite and rectenna concept, and an overall scenario for SPS space transportation involvement. The second section presents SPS program plans for the implementation of PHASE C/D activities. These plans describe SPS program schedules and networks, critical items of systems evolution/technology development, and the natural resources analysis. The fourth section presents summary comments on the methods and rationale followed in arriving at the results documented. Suggestions are also provided in those areas where further analysis or evaluation will enhance SPS cost and programmatic definitions.

  12. Parts, materials, and processes experience summary, volume 2. [design, engineering, and quality control

    Science.gov (United States)

    1973-01-01

    This summary provides the general engineering community with the accumulated experience from ALERT reports issued by NASA and the Government-Industry Data Exchange Program, and related experience gained by Government and industry. It provides expanded information on selected topics by relating the problem area (failure) to the cause, the investigation and findings, the suggestions for avoidance (inspections, screening tests, proper part applications, requirements for manufacturer's plant facilities, etc.), and failure analysis procedures. Diodes, integrated circuits, and transistors are covered in this volume.

  13. Mathematical modelling in volume per hectare of Pinus caribaea Morelet var. caribaea Barret y Golfari at the «Jazmines» silvicultural unit, Viñales

    Directory of Open Access Journals (Sweden)

    Juana Teresa Suárez Sarria

    2013-12-01

    Full Text Available Mathematical modelling constitutes a very useful tool for the planning and administration of forest ecosystems. With the objective of predicting the behavior of volume per hectare of Pinus caribaea Morelet var. caribaea Barret y Golfari plantations at the «Jazmines» Silvicultural Unit, Viñales, seven non-linear regression models were evaluated. The model with the best goodness of fit for volume per hectare was the Hossfeld I, with a coefficient of determination of 63.9% and a highly significant parameter estimate (P < 0.001). Curves describing the mean annual increment (IMA) and the current annual increment (ICA) of this variable over time were also provided.
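
    For readers who want to reproduce this kind of fit, a minimal nonlinear least-squares sketch is given below. The Hossfeld-type functional form, the starting values, and the age-volume data are illustrative assumptions only; the abstract does not give the exact parameterisation or the plantation data.

        import numpy as np
        from scipy.optimize import curve_fit

        def hossfeld(t, a, b, c):
            # One common Hossfeld-type growth form; the study's exact equation may differ.
            return a * t**c / (b + t**c)

        age = np.array([5, 10, 15, 20, 25, 30], dtype=float)        # years (invented)
        vol = np.array([8, 45, 110, 170, 215, 245], dtype=float)    # m^3/ha (invented)

        params, _ = curve_fit(hossfeld, age, vol, p0=[300.0, 1500.0, 2.5], maxfev=10000)
        pred = hossfeld(age, *params)
        r_squared = 1 - np.sum((vol - pred) ** 2) / np.sum((vol - vol.mean()) ** 2)
        print(params, r_squared)

        # Mean annual increment (IMA) and current annual increment (ICA) curves from the fit.
        t = np.linspace(1, 40, 400)
        ima = hossfeld(t, *params) / t
        ica = np.gradient(hossfeld(t, *params), t)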

  14. 16 CFR Appendix A to Part 436 - Sample Item 10 Table-Summary of Financing Offered

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sample Item 10 Table-Summary of Financing Offered A Appendix A to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. A Appendix A to Part 436—Sample...

  15. Rapid determination of benzene derivatives in water samples by trace volume solvent DLLME prior to GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Chun Peng; Wei, Chao Hai; Feng, Chun Hua [South China Univ. of Technology, Guangzhou Higher Education Mega Center (China). College of Environmental Science and Engineering; Guangdong Regular Higher Education Institutions, Guangzhou (China). Key Lab. of Environmental Protection and Eco-Remediation

    2012-05-15

    An inexpensive, simple and environmentally friendly method based on dispersive liquid-liquid microextraction (DLLME) for rapid determination of benzene derivatives in water samples was proposed. A significant improvement of the DLLME procedure was achieved. A trace volume of ethyl acetate (60 μL) was exploited as the dispersion solvent instead of common ones such as methanol and acetone, the volumes of which exceed 0.5 mL, so the organic solvent required in DLLME was reduced to a great extent. Only 83 μL of organic solvent was consumed in the whole analytical process, and the preconcentration procedure took less than 10 min. The approach, coupled with a gas chromatograph-flame ionization detector, was proposed for the rapid determination of benzene, toluene, ethylbenzene and xylene isomers in water samples. Results showed that the proposed approach was an efficient method for rapid determination of benzene derivatives in aqueous samples. (orig.)

  16. Land-Use Change and the Billion Ton 2016 Resource Assessment: Understanding the Effects of Land Management on Environmental Indicators

    Science.gov (United States)

    Kline, K. L.; Eaton, L. M.; Efroymson, R.; Davis, M. R.; Dunn, J.; Langholtz, M. H.

    2016-12-01

    The federal government, led by the U.S. Department of Energy (DOE), quantified potential U.S. biomass resources for expanded production of renewable energy and bioproducts in the 2016 Billion-Ton Report: Advancing Domestic Resources for a Thriving Bioeconomy (BT16) (DOE 2016). Volume 1 of the report provides analysis of projected supplies from 2015 to 2040. Volume 2 (forthcoming) evaluates changes in environmental indicators for water quality and quantity, carbon, air quality, and biodiversity associated with production scenarios in BT16 volume 1. This presentation will review land-use allocations under the projected biomass production scenarios and the changes in land management that are implied, including drivers of direct and indirect LUC. National and global concerns such as deforestation and displacement of food production are addressed. The choice of reference scenario, input parameters and constraints (e.g., regarding land classes, availability, and productivity) drive LUC results in any model simulation and are reviewed to put BT16 impacts into context. The principal LUC implied in BT16 supply scenarios involves the transition of 25 to 47 million acres (net) from annual crops in the 2015 baseline to perennial cover by 2040 under the base case and 3% yield growth case, respectively. We conclude that clear definitions of land parameters and effects are essential to assess LUC. A lack of consistency in parameters and outcomes of historic LUC analysis in the U.S. underscores the need for science-based approaches.

  17. Community access networks: how to connect the next billion to the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Community access networks: how to connect the next billion to the Internet ... services is a prerequisite to sustainable socio-economic development. ... It will provide case studies and formulate recommendations with respect to ... An IDRC delegation will join international delegates and city representatives at the ICLEI World ...

  18. One billion cubic meters of gas produced in Kikinda area

    Energy Technology Data Exchange (ETDEWEB)

    Vicicevic, M; Duric, N

    1969-10-01

    The Kikinda gas reservoir has just passed a milestone in producing one billion cubic meters of natural gas. The reservoir was discovered in 1962, and its present production amounts to 26 million cu m. One of the biggest problems was formation of hydrates, which has successfully been solved by using methanol. Four tables show production statistics by years and productive formations.

  19. Effect of NaOH on large-volume sample stacking of haloacetic acids in capillary zone electrophoresis with a low-pH buffer.

    Science.gov (United States)

    Tu, Chuanhong; Zhu, Lingyan; Ang, Chay Hoon; Lee, Hian Kee

    2003-06-01

    Large-volume sample stacking (LVSS) is an effective on-capillary sample concentration method in capillary zone electrophoresis, which can be applied to samples in a low-conductivity matrix. NaOH solution is commonly used to back-extract acidic compounds from organic solvent in sample pretreatment. The effect of NaOH as the sample matrix on LVSS of haloacetic acids was investigated in this study. It was found that the presence of NaOH in the sample did not compromise, but rather helped, the sample stacking performance when a low-pH background electrolyte (BGE) was used. The sensitivity enhancement factor was higher than when the sample was dissolved in pure water or diluted BGE. Compared with conventional injection (0.4% capillary volume), 97-120-fold sensitivity enhancement in terms of peak height was obtained without deterioration of separation with an injection amount equal to 20% of the capillary volume. This method was applied to determine haloacetic acids in tap water by combination with liquid-liquid extraction and back-extraction into NaOH solution. Limits of detection at sub-ppb levels were obtained for real samples with direct UV detection.

  20. Air sampling in the workplace

    International Nuclear Information System (INIS)

    Hickey, E.E.; Stoetzel, G.A.; Strom, D.J.; Cicotte, G.R.; Wiblin, C.M.; McGuire, S.A.

    1993-09-01

    This report provides technical information on air sampling that will be useful for facilities following the recommendations in the NRC's Regulatory Guide 8.25, Revision 1, "Air Sampling in the Workplace." That guide addresses air sampling to meet the requirements in NRC's regulations on radiation protection, 10 CFR Part 20. This report describes how to determine the need for air sampling based on the amount of material in process modified by the type of material, release potential, and confinement of the material. The purposes of air sampling and how the purposes affect the types of air sampling provided are discussed. The report discusses how to locate air samplers to accurately determine the concentrations of airborne radioactive materials that workers will be exposed to. The need for and the methods of performing airflow pattern studies to improve the accuracy of air sampling results are included. The report presents and gives examples of several techniques that can be used to evaluate whether the airborne concentrations of material are representative of the air inhaled by workers. Methods to adjust derived air concentrations for particle size are described. Methods to calibrate for volume of air sampled and estimate the uncertainty in the volume of air sampled are described. Statistical tests for determining minimum detectable concentrations are presented. How to perform an annual evaluation of the adequacy of the air sampling is also discussed.

  1. Tasks related to increase of RA reactor exploitation and experimental potential, Independent CO2 loop for cooling the samples irradiated in RA reactor vertical experimental channels, (I-IV), part I

    International Nuclear Information System (INIS)

    Pavicevic, M.

    1963-07-01

    This volume contains the description of the design project of the head of the low-temperature coolant loops needed for cooling the samples to be irradiated in the RA vertical experimental channels. The thermal and mechanical calculations are included, as well as the calculation of antireactivity and activation of the construction materials. Cost estimation data are included as well. The drawings included are: head of the coolant loop; diagram of CO2 coolant temperature dependence; diagrams of weight of the loop tubes in the channels; axial distribution of the thermal neutron flux. Engineering drawings of two design solutions of the low-temperature loops with details are part of this volume.

  2. Methyl iodide retention on charcoal sorbents at parts-per-million concentrations

    International Nuclear Information System (INIS)

    Wood, G.O.; Vogt, G.J.; Kasunic, C.A.

    1978-01-01

    Breakthrough curves for charcoal beds challenged by air containing parts-per-million methyl iodide (127I) vapor concentrations were obtained and analyzed. A goal of this research is to determine if sorbent tests at relatively high vapor concentrations give data that can be extrapolated many orders of magnitude to the region of interest for radioiodine retention and removal. Another objective is to identify and characterize parameters that are critical to the performance of a charcoal bed in a respirator cartridge application. Towards these ends, a sorbent test system was built that allows experimental variations of the parameters of challenge vapor concentration, volumetric flow rate, bed depth, bed diameter, and relative humidity. Methyl iodide breakthrough was measured at a limit of 0.002 ppM using a gas chromatograph equipped with a linearized electron capture detector. Several models that have been proposed to describe breakthrough curves were tested against experimental data. A variety of charcoals used or proposed for use in radioiodine air filtration systems have been tested against 25.7 ppM methyl iodide to obtain these parameters and protection (decontamination) factors. Effects of challenge concentration, relative humidity, and bed diameter were also investigated. Significant challenge concentration dependence was measured (more efficiency at lower concentration) for two types of charcoals. Increased relative humidity greatly decreased breakthrough times for a given protection factor. Increased bed diameter greatly increased breakthrough times for a given protection factor. Implications of these effects for a test method are discussed

  3. CTC Sentinel. Volume 6, Issue 10

    Science.gov (United States)

    2013-10-01


  4. Per- and polyfluoroalkyl substances in human serum and urine samples from a residentially exposed community.

    Science.gov (United States)

    Worley, Rachel Rogers; Moore, Susan McAfee; Tierney, Bruce C; Ye, Xiaoyun; Calafat, Antonia M; Campbell, Sean; Woudneh, Million B; Fisher, Jeffrey

    2017-09-01

    Per- and polyfluoroalkyl substances (PFAS) are considered chemicals of emerging concern, in part due to their environmental and biological persistence and the potential for widespread human exposure. In 2007, a PFAS manufacturer near Decatur, Alabama notified the United States Environmental Protection Agency (EPA) it had discharged PFAS into a wastewater treatment plant, resulting in environmental contamination and potential exposures to the local community. To characterize PFAS exposure over time, the Agency for Toxic Substances and Disease Registry (ATSDR) collected blood and urine samples from local residents. Eight PFAS were measured in serum in 2010 (n=153). Eleven PFAS were measured in serum, and five PFAS were measured in urine (n=45) from some of the same residents in 2016. Serum concentrations were compared to nationally representative data and change in serum concentration over time was evaluated. Biological half-lives were estimated for perfluorooctanoic acid (PFOA), perfluorooctane sulfonic acid (PFOS), and perfluorohexane sulfonic acid (PFHxS) using a one-compartment pharmacokinetic model. In 2010 and 2016, geometric mean PFOA and PFOS serum concentrations were elevated in participants compared to the general U.S. In 2016, the geometric mean PFHxS serum concentration was elevated compared to the general U.S. Geometric mean serum concentrations of PFOA, PFOS, and perfluorononanoic acid (PFNA) were significantly (p≤0.0001) lower (49%, 53%, and 58%, respectively) in 2016 compared to 2010. Half-lives for PFOA, PFOS, and PFHxS were estimated to be 3.9, 3.3, and 15.5years, respectively. Concentrations of PFOA in serum and urine were highly correlated (r=0.75) in males. Serum concentrations of some PFAS are decreasing in this residentially exposed community, but remain elevated compared to the U.S. general population. Published by Elsevier Ltd.
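
    The half-life estimates above come from a one-compartment pharmacokinetic model fitted to the serial serum measurements. As a much-simplified illustration, first-order elimination between two sampling times reduces to the expression below; this version ignores ongoing background intake, which the study's model accounts for, so it is not expected to reproduce the reported values.

        import math

        def half_life_years(c_start, c_end, years_elapsed):
            """Naive two-point, first-order elimination half-life (ignores ongoing exposure)."""
            return years_elapsed * math.log(2) / math.log(c_start / c_end)

        # Example with invented serum concentrations (ng/mL) measured six years apart:
        print(half_life_years(c_start=16.0, c_end=8.0, years_elapsed=6.0))   # 6.0 years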

  5. Random sampling technique for ultra-fast computations of molecular opacities for exoplanet atmospheres

    Science.gov (United States)

    Min, M.

    2017-10-01

    Context. Opacities of molecules in exoplanet atmospheres rely on increasingly detailed line-lists for these molecules. The line lists available today contain, for many species, up to several billion lines. Computation of the spectral line profile created by pressure and temperature broadening, the Voigt profile, of all of these lines is becoming a computational challenge. Aims: We aim to create a method to compute the Voigt profile in a way that automatically focusses the computation time into the strongest lines, while still maintaining the continuum contribution of the high number of weaker lines. Methods: Here, we outline a statistical line sampling technique that samples the Voigt profile quickly and with high accuracy. The number of samples is adjusted to the strength of the line and the local spectral line density. This automatically provides high accuracy line shapes for strong lines or lines that are spectrally isolated. The line sampling technique automatically preserves the integrated line opacity for all lines, thereby also providing the continuum opacity created by the large number of weak lines at very low computational cost. Results: The line sampling technique is tested for accuracy when computing line spectra and correlated-k tables. Extremely fast computations (~3.5 × 10⁵ lines per second per core on a standard current-day desktop computer) with high accuracy (≤1% almost everywhere) are obtained. A detailed recipe on how to perform the computations is given.
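
    A toy illustration of the idea is sketched below: each line deposits its full integrated strength onto the opacity grid through Monte Carlo frequency samples drawn from its profile, with the sample count scaled to line strength so that strong lines get well-resolved shapes while weak lines still contribute their integrated opacity. The pure-Lorentzian profile, the grid handling, and the sample-allocation rule are assumptions for illustration, not the authors' algorithm.

        import numpy as np

        def sampled_opacity(nu_grid, line_nu, line_strength, gamma, rng=None):
            """Deposit each line's integrated strength onto nu_grid via profile sampling."""
            rng = rng or np.random.default_rng()
            dnu = nu_grid[1] - nu_grid[0]                  # uniform grid spacing assumed
            kappa = np.zeros_like(nu_grid)
            strongest = line_strength.max()
            for nu0, s, g in zip(line_nu, line_strength, gamma):
                n = max(4, int(1000 * s / strongest))      # more samples for stronger lines
                draws = nu0 + g * rng.standard_cauchy(n)   # Cauchy draws = Lorentzian profile
                idx = np.searchsorted(nu_grid, draws)
                idx = idx[(idx > 0) & (idx < nu_grid.size)]
                # Each sample carries an equal share of the line's integrated strength, so the
                # summed opacity preserves the total (up to samples falling outside the grid).
                np.add.at(kappa, idx, s / (n * dnu))
            return kappa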

  6. NASA Aerospace Flight Battery Program: Wet Life of Nickel-Hydrogen (Ni-H2) Batteries. Volume 2, Part 3; Appendices

    Science.gov (United States)

    Jung, David S,; Lee, Leonine S.; Manzo, Michelle A.

    2010-01-01

    This NASA Aerospace Flight Battery Systems Working Group was chartered within the NASA Engineering and Safety Center (NESC). The Battery Working Group was tasked to complete tasks and to propose proactive work to address battery related, agency-wide issues on an annual basis. In its first year of operation, this proactive program addressed various aspects of the validation and verification of aerospace battery systems for NASA missions. Studies were performed, issues were discussed and in many cases, test programs were executed to generate recommendations and guidelines to reduce risk associated with various aspects of implementing battery technology in the aerospace industry. This document contains Part 3 - Volume II Appendices to Part 3 - Volume I.

  7. Spatial variability in oceanic redox structure 1.8 billion years ago

    DEFF Research Database (Denmark)

    Poulton, Simon W.; Fralick, Philip W.; Canfield, Donald Eugene

    2010-01-01

    to reconstruct oceanic redox conditions from the 1.88- to 1.83-billion-year-old Animikie group from the Superior region, North America. We find that surface waters were oxygenated, whereas at mid-depths, anoxic and sulphidic (euxinic) conditions extended over 100 km from the palaeoshoreline. The spatial extent...

  8. Neutron activation analysis of absolutely-dated tree rings

    International Nuclear Information System (INIS)

    Uenlue, K.; Hauck, D.K.; Kuniholm, P.I.; Chiment, J.J.

    2005-01-01

    Gold concentration was determined for dendrochronologically-dated wood samples using neutron activation analysis (NAA) and correlation sought with known environmental changes, e.g., volcanic activities, during historic periods. Uptake of gold is sensitive to soil pH for many plants. Data presented are from a single, cross-dated tree that grew in Greece. Using NAA, gold was measured with parts-per-billion sensitivity in individual tree rings from 1411 to 1988 AD. (author)

  9. Development and application of a luminol-based nitrogen dioxide detector

    International Nuclear Information System (INIS)

    Wendel, G.J.

    1985-01-01

    An instrument for the continuous measurement of nitrogen dioxide (NO2) at all atmospheric concentration ranges and conditions was developed. The detector is based on the chemiluminescent reaction between 5-amino-2,3-dihydro-1,4-phthalazinedione (luminol) and NO2 in alkaline aqueous solution. Development included the optimization of the cell design and the solution composition. Sodium sulfite (Na2SO3) and methanol (CH3OH) were added to the solution to improve sensitivity and specificity. The detector was favorably compared to two different instruments measuring NO2: one based on NO + O3 chemiluminescence and one based on tunable diode laser absorption spectrometry. The detector has demonstrated a detection limit of 30 parts-per-trillion by volume (ppt) and a frequency response of 0.3 Hz. The instrument was operated for two one-month periods on Bermuda. The purpose was to study air masses from the East Coast of the United States after transport over the ocean. Average daily values were 400 ppt with values as low as 100 ppt measured. Other field experiments involved monitoring of NO2 in ambient air in the range of 1 to 60 parts-per-billion by volume.

  10. 32 CFR Appendix C to Part 113 - Sample DD Form 2653, “Involuntary Allotment Application”

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false Sample DD Form 2653, “Involuntary Allotment Application” C Appendix C to Part 113 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE... Part 113—Sample DD Form 2653, “Involuntary Allotment Application” ER05JA95.002 ER05JA95.003 ...

  11. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, including a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method, the concentration fluctuations during the entire LVCC sampling process were shown to be minor, and recoveries from real samples were achieved in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for the rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  12. Gaia: Science with 1 billion objects in three dimensions

    Science.gov (United States)

    Prusti, Timo

    2018-02-01

    Gaia is an operational satellite in the ESA science programme. It is gathering data for more than a billion objects. Gaia measures positions and motions of stars in our Milky Way Galaxy, but captures many asteroids and extragalactic sources as well. The first data release has already been made and exploitation by the world-wide scientific community is underway. Further data releases will be made with further increasing accuracy. Gaia is well underway to provide its promised set of fundamental astronomical data.

  13. General Public Space Travel and Tourism. Volume 1; Executive Summary

    Science.gov (United States)

    ONeil, Daniel (Compiler); Bekey, Ivan; Mankins, John; Rogers, Thomas F.; Stallmer, Eric W.

    1998-01-01

    Travel and tourism is one of the world's largest businesses. Its gross revenues exceed $400 billion per year in the U.S. alone, and it is our second largest employer. U.S. private sector business revenues in the space information area now approximate $10 billion per year, and are increasing rapidly. Not so in the human spaceflight area. After spending $100s of billions (1998 dollars) in public funds thereon, and continuing to spend over $5 billion per year, the government is still the only customer for human spaceflight goods and services. Serious and detailed consideration was first given to the possibility of space being opened up to trips by the general public three decades ago, and some initial attempts to do so were made a dozen years ago. But the difficulties were great and the Challenger disaster put an end to them. In recent years professional space tourism studies have been conducted in the United Kingdom, Germany and, especially, Japan. In the U.S., technological progress has been pronounced; we have had nearly a decade's experience in seeing our astronauts travel to-from low Earth orbit (LEO) safely, and we expect to commence assembly of a LEO space station housing a half-dozen people this year. Too, NASA and our space industry now have new and promising space transportation development programs underway, especially the X-33 and X-34 programs, and some related, further generation, basic technology development programs. And five private companies are also working on the design of new surface - LEO vehicles. The first professional space tourism market studies have been conducted in several countries in the past few years, especially in Japan and here. The U.S. study makes it clear that, conceptually, tens of millions of us would like to take a trip to space if we could do so with reasonable safety, comfort and reliability, and at an acceptable price. Initial businesses will address the desires of those willing to pay a greater price and accept a greater

  14. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, which included colour difference based metrics, gamut based metrics, memory based metrics as well as combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for the conditions where correlated colour temperatures differed.
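
    A minimal sketch of the optimization criterion described above (the weighted average correlation between metric predictions and subjective ratings across the 8 studies) is given below; this is illustrative Python, not the authors' code, and weighting each study by its number of observations is an assumption.

        import numpy as np

        def weighted_avg_correlation(predictions, ratings, weights):
            """Weighted average of per-study Pearson correlations.

            predictions, ratings: lists of 1-D arrays, one pair per psychophysical study.
            weights: one weight per study (study size is assumed here).
            """
            r = np.array([np.corrcoef(p, s)[0, 1] for p, s in zip(predictions, ratings)])
            w = np.asarray(weights, dtype=float)
            return float(np.sum(w * r) / np.sum(w))

        # Toy example with two "studies" of different sizes.
        rng = np.random.default_rng(0)
        preds = [rng.normal(size=20), rng.normal(size=30)]
        rates = [p + rng.normal(scale=0.5, size=p.size) for p in preds]
        print(weighted_avg_correlation(preds, rates, weights=[20, 30]))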

  15. Evaluation of the effects of insufficient blood volume samples on the performance of blood glucose self-test meters.

    Science.gov (United States)

    Pfützner, Andreas; Schipper, Christina; Ramljak, Sanja; Flacke, Frank; Sieber, Jochen; Forst, Thomas; Musholt, Petra B

    2013-11-01

    Accuracy of blood glucose readings is (among other things) dependent on the test strip being completely filled with sufficient sample volume. The devices are supposed to display an error message in case of incomplete filling. This laboratory study was performed to test the performance of 31 commercially available devices in case of incomplete strip filling. Samples with two different glucose levels (60-90 and 300-350 mg/dl) were used to generate three different sample volumes: 0.20 µl (too low volume for any device), 0.32 µl (borderline volume), and 1.20 µl (low but supposedly sufficient volume for all devices). After a point-of-care capillary reference measurement (StatStrip, NovaBiomedical), the meter strip was filled (6x) with the respective volume, and the response of the meters (two devices) was documented (72 determinations/meter type). Correct response was defined as either an error message indicating incomplete filling or a correct reading (±20% compared with reference reading). Only five meters showed 100% correct responses [BGStar and iBGStar (both Sanofi), ACCU-CHEK Compact+ and ACCU-CHEK Mobile (both Roche Diagnostics), OneTouch Verio (LifeScan)]. The majority of the meters (17) had up to 10% incorrect reactions [predominantly incorrect readings with sufficient volume; Precision Xceed and Xtra, FreeStyle Lite, and Freedom Lite (all Abbott); GlucoCard+ and GlucoMen GM (both Menarini); Contour, Contour USB, and Breeze2 (all Bayer); OneTouch Ultra Easy, Ultra 2, and Ultra Smart (all LifeScan); Wellion Dialog and Premium (both MedTrust); FineTouch (Terumo); ACCU-CHEK Aviva (Roche); and GlucoTalk (Axis-Shield)]. Ten percent to 20% incorrect reactions were seen with OneTouch Vita (LifeScan), ACCU-CHEK Aviva Nano (Roche), OmniTest+ (BBraun), and AlphaChek+ (Berger Med). More than 20% incorrect reactions were obtained with Pura (Ypsomed), GlucoCard Meter and GlucoMen LX (both Menarini), Elite (Bayer), and MediTouch (Medisana). In summary, partial and

  16. Universities Report $1.8-Billion in Earnings on Inventions in 2011

    Science.gov (United States)

    Blumenstyk, Goldie

    2012-01-01

    Universities and their inventors earned more than $1.8-billion from commercializing their academic research in the 2011 fiscal year, collecting royalties from new breeds of wheat, from a new drug for the treatment of HIV, and from longstanding arrangements over enduring products like Gatorade. Northwestern University earned the most of any…

  17. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    Science.gov (United States)

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
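
    For readers unfamiliar with the δ13C notation used above: the delta value is the relative deviation of the sample's 13C/12C ratio from that of the VPDB reference, expressed in per mil. The snippet below is a generic illustration with a commonly cited reference ratio, not part of the RSIL procedure.

        R_VPDB = 0.0111802   # 13C/12C ratio of the VPDB reference (commonly cited value)

        def delta13C_permil(r_sample, r_reference=R_VPDB):
            """delta13C = (R_sample / R_reference - 1) * 1000, in per mil."""
            return (r_sample / r_reference - 1.0) * 1000.0

        print(delta13C_permil(0.0110685))   # about -10 per mil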

  18. Minimization of the volume and Pu content of the waste generated at a plutonium fuel fabrication plant

    International Nuclear Information System (INIS)

    Pauwels, H.

    1992-01-01

    The amounts of waste generated during 1987, 1989 and a past reference period have been reported in great detail. The main conclusions which can be drawn from these figures are: (i) for all kinds of waste, the waste-to-product ratio has decreased very substantially during the past few years. This reduction results partly from a scale effect, i.e. the better load factor of the plant, and partly from Belgonucleare's continuous effort to minimize the radioactive waste arisings; (ii) the ratio of the Pu content of the waste to the total Pu throughput of the plant has also decreased substantially; (iii) the mean Pu content of the solid Pu contaminated waste equals 1.39 g Pu per unit volume of 25 l. Only for a small fraction of this waste (<5% by volume) does the Pu content exceed 5 g per unit volume of 25 l; (iv) even after the implementation of waste reducing measures, some 45% of the solid Pu contaminated waste is generated by operations which involve the handling and transfer of powders. Finally, some 63% of the total amount of Pu in the waste can be imputed to these operations

  19. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Science.gov (United States)

    2010-04-01

    ..., and as collateral for financial derivatives and other securities transactions $ Total Memorandum 1... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Sample Large Position Report B Appendix B to Part 420 Commodity and Securities Exchanges DEPARTMENT OF THE TREASURY REGULATIONS UNDER...

  20. Readability of the web: a study on 1 billion web pages

    NARCIS (Netherlands)

    de Heus, Marije; Hiemstra, Djoerd

    We have performed a readability study on more than 1 billion web pages. The Automated Readability Index was used to determine the average grade level required to easily comprehend a website. Some of the results are that a 16-year-old can easily understand 50% of the web and an 18-year-old can easily

  1. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part B, Characterization; robotics/automation

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part B of Volume 3 and contains the Characterization and Robotics/Automation sections.

  2. Determination of submicrogram-per-liter concentrations of caffeine in surface water and groundwater samples by solid-phase extraction and liquid chromatography

    Science.gov (United States)

    Burkhardt, M.R.; Soliven, P.P.; Werner, S.L.; Vaught, D.G.

    1999-01-01

    A method for determining submicrogram-per-liter concentrations of caffeine in surface water and groundwater samples has been developed. Caffeine is extracted from a 1 L water sample with a 0.5 g graphitized carbon-based solid-phase cartridge, eluted with methylene chloride-methanol (80 + 20, v/v), and analyzed by liquid chromatography with photodiode-array detection. The single-operator method detection limit for organic-free water samples was 0.02 µg/L. Mean recoveries and relative standard deviations were 93 ± 13% for organic-free water samples fortified at 0.04 µg/L and 84 ± 4% for laboratory reagent spikes fortified at 0.5 µg/L. Environmental concentrations of caffeine ranged from 0.003 to 1.44 µg/L in surface water samples and from 0.01 to 0.08 µg/L in groundwater samples.

  3. Effects of large volume injection of aliphatic alcohols as sample diluents on the retention of low hydrophobic solutes in reversed-phase liquid chromatography.

    Science.gov (United States)

    David, Victor; Galaon, Toma; Aboul-Enein, Hassan Y

    2014-01-03

    Recent studies showed that the injection of large volumes of hydrophobic solvents used as sample diluents could be applied in reversed-phase liquid chromatography (RP-LC). This study reports systematic research on the influence of a series of aliphatic alcohols (from methanol to 1-octanol) on the retention process in RP-LC when large sample volumes are injected on the column. Several model analytes with low hydrophobic character were studied by RP-LC, with mobile phases containing methanol or acetonitrile as organic modifiers in different proportions with the aqueous component. It was found that, starting with 1-butanol, the aliphatic alcohols can be used as sample solvents and injected in high volumes, but they may influence the retention factor and peak shape of the dissolved solutes. The dependence of the retention factor of the studied analytes on the injection volume of these alcohols is linear, with its value decreasing as the sample volume is increased. The retention process when injecting up to 200 μL of the higher alcohols also depends on the content of the organic modifier (methanol or acetonitrile) in the mobile phase.

  4. Evaluating droplet digital PCR for the quantification of human genomic DNA: converting copies per nanoliter to nanograms nuclear DNA per microliter.

    Science.gov (United States)

    Duewer, David L; Kline, Margaret C; Romsos, Erica L; Toman, Blaza

    2018-05-01

    The highly multiplexed polymerase chain reaction (PCR) assays used for forensic human identification perform best when used with an accurately determined quantity of input DNA. To help ensure the reliable performance of these assays, we are developing a certified reference material (CRM) for calibrating human genomic DNA working standards. To enable sharing information over time and place, CRMs must provide accurate and stable values that are metrologically traceable to a common reference. We have shown that droplet digital PCR (ddPCR) limiting dilution end-point measurements of the concentration of DNA copies per volume of sample can be traceably linked to the International System of Units (SI). Unlike values assigned using conventional relationships between ultraviolet absorbance and DNA mass concentration, entity-based ddPCR measurements are expected to be stable over time. However, the forensic community expects DNA quantity to be stated in terms of mass concentration rather than entity concentration. The transformation can be accomplished given SI-traceable values and uncertainties for the number of nucleotide bases per human haploid genome equivalent (HHGE) and the average molar mass of a nucleotide monomer in the DNA polymer. This report presents the considerations required to establish the metrological traceability of ddPCR-based mass concentration estimates of human nuclear DNA. [Graphical abstract: the roots of metrological traceability for human nuclear DNA mass concentration results; some factors must be established experimentally, while others are taken from authoritative source materials.] HHGE stands for "haploid human genome equivalent"; there are two HHGE per diploid human genome.
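
    The conversion described above, from an entity concentration (genome copies per microliter) to a mass concentration (nanograms of nuclear DNA per microliter), amounts to multiplying by the mass of one haploid human genome equivalent. The sketch below uses round illustrative values for the genome size and mean base-pair molar mass; it is not the certified assignment.

        AVOGADRO = 6.02214076e23      # 1/mol
        BP_PER_HHGE = 3.1e9           # approximate haploid human genome size, base pairs (illustrative)
        G_PER_MOL_PER_BP = 618.0      # approximate molar mass of one base pair in the polymer (illustrative)

        MASS_PER_HHGE_G = BP_PER_HHGE * G_PER_MOL_PER_BP / AVOGADRO   # ~3.2e-12 g per copy

        def copies_per_uL_to_ng_per_uL(copies_per_uL):
            """Convert a ddPCR-style entity concentration to a mass concentration."""
            return copies_per_uL * MASS_PER_HHGE_G * 1e9   # g -> ng

        print(copies_per_uL_to_ng_per_uL(1000.0))   # ~3.2 ng/uL for 1000 copies/uL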

  5. Top-level DB design for Big Data in ATLAS Experiment at CERN

    CERN Document Server

    Dimitrov, Gancho; The ATLAS collaboration

    2017-01-01

    This presentation describes a system that accumulates a set of key quantities for a very large number of particle collision events recorded by the ATLAS experiment at the LHC (Large Hadron Collider) at CERN. The main project requirements are the handling of tens of billions of rows per year with minimal DB resources, and providing outstanding performance for the fundamental use cases. Various challenges were faced in the process of project development, such as large data volume, large transactions (tens to hundreds of millions of rows per transaction) requiring a significant amount of undo, row duplication checks, adequate table statistics gathering, and SQL execution plan stability. Currently the system hosts about 120 billion rows as the data ingestion rate has gone beyond the initially foreseen 30 billion rows per year. The crucial DB schema design decisions and the Oracle DB features and techniques will be shared with the audience. By attending this session you will learn how big physics data can be organize...

  6. Chip-Oriented Fluorimeter Design and Detection System Development for DNA Quantification in Nano-Liter Volumes

    Directory of Open Access Journals (Sweden)

    Da-Sheng Lee

    2009-12-01

    Chip-based polymerase chain reaction (PCR) systems have been developed in recent years to achieve DNA quantification. Using a microstructured, miniature chip, the volume consumed by a PCR can be reduced to a nano-liter. With high-speed cycling and a low reaction volume, the time consumed by one PCR cycle performed on a chip can be reduced. However, most of the presented prototypes employ commercial fluorimeters that are not optimized for fluorescence detection in such small sample quantities. This limits the performance of DNA quantification and, in particular, leads to low experimental reproducibility. This study discusses the concept of a chip-oriented fluorimeter design. Using an analytical model, the current study analyzes the sensitivity and dynamic range of the fluorimeter to fit the requirements for detecting fluorescence in nano-liter volumes. Through the optimized processes, a real-time PCR-on-a-chip system with a test sample of only one nano-liter is as sensitive as a commercial real-time PCR machine using a twenty micro-liter sample. The signal-to-noise (S/N) ratio of the chip system for DNA quantification with hepatitis B virus (HBV) plasmid samples is 3 dB higher. DNA quantification by the miniature chip shows higher reproducibility than the commercial machine for samples with initial concentrations from 10^3 to 10^5 copies per reaction.

  7. TRU waste-sampling program

    International Nuclear Information System (INIS)

    Warren, J.L.; Zerwekh, A.

    1985-08-01

    As part of a TRU waste-sampling program, Los Alamos National Laboratory retrieved and examined 44 drums of 238Pu- and 239Pu-contaminated waste. The drums ranged in age from 8 months to 9 years. The majority of drums were tested for pressure, and gas samples withdrawn from the drums were analyzed by a mass spectrometer. Real-time radiography and visual examination were used to determine both void volumes and waste content. Drum walls were measured for deterioration, and selected drum contents were reassayed for comparison with original assays and WIPP criteria. Each drum tested at atmospheric pressure. Mass spectrometry revealed no problem with 239Pu-contaminated waste, but three 8-month-old drums of 238Pu-contaminated waste contained a potentially hazardous gas mixture. Void volumes fell within the 81 to 97% range. Measurements of drum walls showed no significant corrosion or deterioration. All reassayed contents were within WIPP waste acceptance criteria. Five of the drums opened and examined (15%) could not be certified as packaged. Three contained free liquids, one had corrosive materials, and one had too much unstabilized particulate. Eleven drums had the wrong (or not the most appropriate) waste code. In many cases, disposal volumes had been inefficiently used. 2 refs., 23 figs., 7 tabs

  8. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.
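
    A rough sketch of the KDE step described above (not the authors' implementation): the user-picked voxel samples in the multivariate value space are fed to a kernel density estimator, and the estimated density at every voxel's value vector then acts as a transfer-function opacity. The snippet below assumes NumPy and SciPy are available and uses synthetic data.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Toy multivariate volume: 3 variables on a 16x16x16 grid.
        rng = np.random.default_rng(1)
        volume = rng.normal(size=(3, 16, 16, 16))

        # Pretend the user probed 50 voxels of a feature of interest on one slice.
        picked = volume[:, 8, :8, :8].reshape(3, -1)[:, :50]

        # Kernel density estimate over the picked sample values (the high-dimensional TF).
        kde = gaussian_kde(picked)

        # Evaluate the density for every voxel and normalize to [0, 1] as an opacity map.
        density = kde(volume.reshape(3, -1)).reshape(16, 16, 16)
        opacity = density / density.max()
        print(opacity.shape, float(opacity.max()))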

  9. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang; Hansen, Charles

    2013-01-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied for simulation and complicated seismic data sets. © 2013 IEEE.

  10. Mycotoxin monitoring for commercial foodstuffs in Taiwan

    Directory of Open Access Journals (Sweden)

    Ming-Tzai Chen

    2016-01-01

    Mycotoxins are toxic food contaminants that are naturally produced by certain fungi. They induce negative effects on human health by making food unsafe for consumption. In this study, analyses were performed to determine the levels and incidence of aflatoxins (AFs) in peanut products, tree nuts, spices, and Coix seeds; ochratoxin A (OTA) in wheat and roasted coffee, as well as OTA and AFs in rice; and citrinin (CIT) in red yeast rice (RYR) products. A total of 712 samples from nine different food categories were collected between 2012 and 2013. The samples were analyzed over 2 years for AFs, OTA, and CIT by methods recommended by the Ministry of Health and Welfare. These official analytical methods were extensively validated in-house and through interlaboratory trials. The analytical values of suspected contaminated specimens were confirmed by liquid chromatography – tandem mass spectrometry analysis to identify the specific mycotoxin present in the sample. We show that 689 samples (96.8%) complied with the regulations set by the Ministry of Health and Welfare. AFs were found in four peanut-candy products, one peanut-flour product, one pistachio product, one Sichuan-pepper product, and one Coix seed product. All had exceeded the maximum levels of 15 parts per billion for peanut and 10 parts per billion for other food products. Furthermore, 14 RYR samples contained CIT above 5 parts per million, and one RYR tablet exceeded the maximum amount allowed. Instances of AFs in substandard Sichuan pepper and Coix seeds were first detected in Taiwan. Measures were taken by the relevant authorities to remove substandard products from the market in order to decrease consumer exposure to mycotoxin. Border control measures were applied to importing food commodities with a higher risk of mycotoxin contamination, such as peanut, Sichuan pepper, and RYR products. Declining trends were observed in the noncompliance rate of AFs in peanut products, as well as that of

  11. Quantum Cascade Laser-Based Photoacoustic Sensor for Trace Detection of Formaldehyde Gas

    Directory of Open Access Journals (Sweden)

    Pietro Mario Lugarà

    2009-04-01

    We report on the development of a photoacoustic sensor for the detection of formaldehyde (CH2O) using a thermoelectrically cooled distributed-feedback quantum cascade laser operating in pulsed mode at 5.6 µm. A resonant photoacoustic cell, equipped with four electret microphones, is excited in its first longitudinal mode at 1,380 Hz. The absorption line at 1,778.9 cm-1 is selected for CH2O detection. A detection limit of 150 parts per billion by volume in nitrogen is achieved using a 10-second time constant and 4 mW laser power. Measurements in ambient air will require water vapour filters.
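
    For reference (added here, not part of the original abstract), the quoted line position and the laser wavelength are two descriptions of the same quantity: the wavelength in micrometres is 10,000 divided by the wavenumber in cm-1.

        def wavenumber_cm1_to_um(nu_cm1):
            """Convert a wavenumber in cm^-1 to the corresponding vacuum wavelength in micrometres."""
            return 1.0e4 / nu_cm1

        print(wavenumber_cm1_to_um(1778.9))   # ~5.62 um, consistent with the ~5.6 um QCL emission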

  12. Self-reference and random sampling approach for label-free identification of DNA composition using plasmonic nanomaterials.

    Science.gov (United States)

    Freeman, Lindsay M; Pang, Lin; Fainman, Yeshaiahu

    2018-05-09

    The analysis of DNA has led to revolutionary advancements in the fields of medical diagnostics, genomics, prenatal screening, and forensic science, with the global DNA testing market expected to reach revenues of USD 10.04 billion per year by 2020. However, the current methods for DNA analysis remain dependent on the necessity for fluorophores or conjugated proteins, leading to high costs associated with consumable materials and manual labor. Here, we demonstrate a potential label-free DNA composition detection method using surface-enhanced Raman spectroscopy (SERS) in which we identify the composition of cytosine and adenine within single strands of DNA. This approach depends on the fact that there is one phosphate backbone per nucleotide, which we use as a reference to compensate for systematic measurement variations. We utilize plasmonic nanomaterials with random Raman sampling to perform label-free detection of the nucleotide composition within DNA strands, generating a calibration curve from standard samples of DNA and demonstrating the capability of resolving the nucleotide composition. The work represents an innovative way for detection of the DNA composition within DNA strands without the necessity of attached labels, offering a highly sensitive and reproducible method that factors in random sampling to minimize error.

  13. Personal monitors for inorganic gases. Final report, 28 September 1978-31 August 1979

    Energy Technology Data Exchange (ETDEWEB)

    West, P.W.

    1979-01-01

    Successful passive sampling techniques were developed for ammonia and hydrogen cyanide. For ammonia, collection of the sample required permeation through a silicone membrane into boric acid. Determination of the ammonia was carried out spectrophotometrically using Nessler's reagent or potentiometrically with an ion-selective electrode. Linearity was demonstrated using the monitor over concentration ranges of less than 10 to over 160 parts per million. The ammonia monitor was relatively free from interference by amines and could be used in a badge-type monitor, as it was very light in weight and compact. The hydrogen cyanide monitor functioned by permeation through a silicone membrane. The permeated gas was stabilized in sodium hydroxide and determined by means of a standard pyridine/barbituric acid colorimetric method. A detection limit of 10 parts per billion for an 8-hour exposure was obtained. Difficulties were encountered in developing a sampler for hydrogen sulfide.

  14. AppWeb per a una Aplicació per Android, iOS i Windows Phone

    OpenAIRE

    López Cuadros, Francesc Marc

    2016-01-01

    The project consists of developing a web application that adapts to any current device and works on any operating system. It must also be able to make the most of the features and functionality of the device on which it runs, in order to produce a much more powerful and attractive application for the user. The project has two parts: a fully functional web platform adaptable to any device, and an application that provides access to this platform and that...

  15. 40 CFR Table 1 to Subpart III of... - Emission Limitations

    Science.gov (United States)

    2010-07-01

    ... Requirements for Commercial and Industrial Solid Waste Incineration Units That Commenced Construction On or... determining compliance using this method: cadmium, 0.004 milligrams per dry standard cubic meter, 3-run average... carbon monoxide, 157 parts per million by dry volume, 3-run average (1 hour minimum sample time per run). Performance...

  16. A Structural Molar Volume Model for Oxide Melts Part III: Fe Oxide-Containing Melts

    Science.gov (United States)

    Thibodeau, Eric; Gheribi, Aimen E.; Jung, In-Ho

    2016-04-01

    As part III of this series, the model is extended to iron oxide-containing melts. All available experimental data in the FeO-Fe2O3-Na2O-K2O-MgO-CaO-MnO-Al2O3-SiO2 system were critically evaluated based on the experimental condition. The variations of FeO and Fe2O3 in the melts were taken into account by using FactSage to calculate the Fe2+/Fe3+ distribution. The molar volume model with unary and binary model parameters can be used to predict the molar volume of the molten oxide of the Li2O-Na2O-K2O-MgO-CaO-MnO-PbO-FeO-Fe2O3-Al2O3-SiO2 system in the entire range of compositions, temperatures, and oxygen partial pressures from Fe saturation to 1 atm pressure.

  17. AREVA - First quarter 2011 revenue: 2.7% growth like for like to 1.979 billion euros

    International Nuclear Information System (INIS)

    2011-01-01

    The group reported consolidated revenue of 1.979 billion euros in the first quarter of 2011, for 2.2% growth compared with the first quarter of 2010 (+ 2.7% like for like). The increase was driven by the Mining / Front End Business Group (+ 20.8% LFL). Revenue from outside France rose 12.0% to 1.22 billion euros and represented 62% of total revenue. The impacts of foreign exchange and changes in consolidation scope were negligible during the period. The March 11 events in Japan had no significant impact on the group's performance in the first quarter of 2011. The group's backlog of 43.5 billion euros at March 31, 2011 was stable in relation to March 31, 2010. The growth in the backlog of the Mining / Front End and Renewable Energies Business Groups offset the partial depletion of the backlog in the Reactors and Services and Back End Business Groups as contracts were completed.

  18. Two ten-billion-solar-mass black holes at the centres of giant elliptical galaxies.

    Science.gov (United States)

    McConnell, Nicholas J; Ma, Chung-Pei; Gebhardt, Karl; Wright, Shelley A; Murphy, Jeremy D; Lauer, Tod R; Graham, James R; Richstone, Douglas O

    2011-12-08

    Observational work conducted over the past few decades indicates that all massive galaxies have supermassive black holes at their centres. Although the luminosities and brightness fluctuations of quasars in the early Universe suggest that some were powered by black holes with masses greater than 10 billion solar masses, the remnants of these objects have not been found in the nearby Universe. The giant elliptical galaxy Messier 87 hosts the hitherto most massive known black hole, which has a mass of 6.3 billion solar masses. Here we report that NGC 3842, the brightest galaxy in a cluster at a distance from Earth of 98 megaparsecs, has a central black hole with a mass of 9.7 billion solar masses, and that a black hole of comparable or greater mass is present in NGC 4889, the brightest galaxy in the Coma cluster (at a distance of 103 megaparsecs). These two black holes are significantly more massive than predicted by linearly extrapolating the widely used correlations between black-hole mass and the stellar velocity dispersion or bulge luminosity of the host galaxy. Although these correlations remain useful for predicting black-hole masses in less massive elliptical galaxies, our measurements suggest that different evolutionary processes influence the growth of the largest galaxies and their black holes.

  19. Sub-parts-per-quadrillion-level graphite furnace atomic absorption spectrophotometry based on laser wave mixing.

    Science.gov (United States)

    Mickadeit, Fritz K; Berniolles, Sandrine; Kemp, Helen R; Tong, William G

    2004-03-15

    Nonlinear laser wave mixing in a common graphite furnace atomizer is presented as a zeptomole-level, sub-Doppler, high-resolution atomic absorption spectrophotometric method. A nonplanar three-dimensional wave-mixing optical setup is used to generate the signal beam in its own space. Signal collection is efficient and convenient using a template-based optical alignment. The graphite furnace atomizer offers advantages including fast and convenient introduction of solid, liquid, or gas analytes, clean atomization environment, and minimum background noise. Taking advantage of the unique features of the wave-mixing optical method and those of the graphite furnace atomizer, one can obtain both excellent spectral resolution and detection sensitivity. A preliminary concentration detection limit of 0.07 parts-per-quadrillion and a preliminary mass detection limit of 0.7 ag or 8 zmol are determined for rubidium using a compact laser diode as the excitation source.

  20. Air-deployable oil spill sampling devices review phase 2 testing. Volume 1

    International Nuclear Information System (INIS)

    Hawke, L.; Dumouchel, A.; Fingas, M.; Brown, C.E.

    2007-01-01

    SAIC Canada tested air deployable oil sampling devices for the Emergencies Science and Technology Division of Environment Canada in order to determine the applicability and status of these devices. The 3 devices tested were: Canada's SABER (sampling autonomous buoy for evidence recovery), the United States' POPEIE (probe for oil pollution evidence in the environment); and, Sweden's SAR Floatation 2000. They were tested for buoyancy properties, drift behaviour and sampler sorbent pickup ratios. The SAR and SABER both had lesser draft and greater freeboard, while the POPEIE had much greater draft than freeboard. All 3 devices could be used for oil sample collection in that their drift characteristics would allow for the SABER and SAR devices to be placed upwind of the slick while the POPEIE device could be placed downwind of an oil spill. The sorbent testing revealed that Sefar sorbent and Spectra sorbent used in the 3 devices had negative pickup ratios for diesel but performance improved as oil viscosity increased. Both sorbents are inert and capable of collecting oil in sufficient volumes for consistent fingerprinting analysis. 10 refs., 8 tabs., 8 figs

  1. Pore volume and pore size distribution of cement samples measured by a modified mercury intrusion porosimeter

    International Nuclear Information System (INIS)

    Zamorani, E.; Blanchard, H.

    1987-01-01

    Important parameters for the characterization of cement specimens are mechanical properties and porosity. This work is carried out at the Ispra Establishment of the Joint Research Centre in the scope of the Radioactive Waste Management programme. A commercial Mercury Intrusion Porosimeter was modified in an attempt to improve the performance of the instrument and to provide fast processing of the recorded values: pressure-volume of pores. The dead volume of the instrument was reduced and the possibility of leakage from the moving parts eliminated. In addition, the modification allows an improvement of data acquisition thus increasing data accuracy and reproducibility. In order to test the improved performance of the modified instrument, physical characterizations of cement forms were carried out. Experimental procedures and results are reported
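
    Mercury intrusion porosimetry converts each recorded intrusion pressure into a pore size through the Washburn equation; the snippet below is a generic illustration using commonly assumed values for the mercury surface tension and contact angle, not parameters specific to the modified instrument described above.

        import math

        GAMMA_HG = 0.485     # surface tension of mercury, N/m (commonly assumed value)
        THETA_DEG = 140.0    # mercury contact angle, degrees (commonly assumed value)

        def pore_diameter_m(pressure_pa):
            """Washburn equation for a cylindrical pore: d = -4*gamma*cos(theta) / P."""
            return -4.0 * GAMMA_HG * math.cos(math.radians(THETA_DEG)) / pressure_pa

        # Example: an intrusion pressure of 100 MPa corresponds to a pore diameter of about 15 nm.
        print(pore_diameter_m(100e6) * 1e9, "nm")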

  2. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  3. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  4. Anhui Tongling Invests 1 Billion Yuan to Set up “Copper Industry Fund”

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    On September 12, the signing ceremony for "Anhui Copper Industry Fund" set up by Anhui Tongling Development & Investment Group Co., Ltd. and Shanghai V. Stone Investment Management Co., Ltd. was held in Tongling. The fund is 1 billion yuan.

  5. Relationship between LIBS Ablation and Pit Volume for Geologic Samples: Applications for in situ Absolute Geochronology

    Science.gov (United States)

    Devismes, D.; Cohen, Barbara A.

    2014-01-01

    In planetary sciences, in situ absolute geochronology is a scientific and engineering challenge. Currently, the age of the Martian surface can only be determined by crater density counting. However this method has significant uncertainties and needs to be calibrated with absolute ages. We are developing an instrument to acquire in situ absolute geochronology based on the K-Ar method. The protocol is based on the laser ablation of a rock by hundreds of laser pulses. Laser Induced Breakdown Spectroscopy (LIBS) gives the potassium content of the ablated material and a mass spectrometer (quadrupole or ion trap) measures the quantity of 40Ar released. In order to accurately measure the quantity of released 40Ar in cases where Ar is an atmospheric constituent (e.g., Mars), the sample is first put into a chamber under high vacuum. The 40Ar quantity, the concentration of K and the estimation of the ablated mass are the parameters needed to give the age of the rocks. The main uncertainties with this method are directly linked to the measurements of the mass (typically some µg) and of the concentration of K by LIBS (up to 10%). Because the ablated mass is small compared to the mass of the sample, and because material is redeposited onto the sample after ablation, it is not possible to directly measure the ablated mass. Our current protocol measures the ablated volume and estimates the sample density to calculate ablated mass. The precision and accuracy of this method may be improved by using knowledge of the sample's geologic properties to predict its response to laser ablation, i.e., understanding whether natural samples have a predictable relationship between laser energy deposited and resultant ablation volume. In contrast to most previous studies of laser ablation, theoretical equations are not highly applicable. The reasons are numerous, but the most important are: a) geologic rocks are complex, polymineralic materials; b) the conditions of ablation are unusual (for example
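
    The age determination sketched above combines the K concentration (from LIBS), the ablated mass, and the measured amount of radiogenic 40Ar through the standard K-Ar age equation. The snippet below is a generic illustration using conventional decay constants and 40K abundance; it is not the instrument team's code, and the input values are hypothetical.

        import math

        LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant, 1/yr (conventional value)
        LAMBDA_EC = 0.581e-10      # decay constant of the branch producing 40Ar, 1/yr
        K40_PER_K = 1.167e-4       # 40K / total K atomic abundance
        M_K = 39.0983              # molar mass of K, g/mol

        def k_ar_age_yr(k_mass_fraction, ablated_mass_g, mol_ar40):
            """K-Ar age: t = (1/lambda) * ln(1 + (lambda/lambda_ec) * 40Ar*/40K)."""
            mol_k40 = ablated_mass_g * k_mass_fraction / M_K * K40_PER_K
            return math.log(1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * mol_ar40 / mol_k40) / LAMBDA_TOTAL

        # Hypothetical example: 10 micrograms ablated, 2 wt% K, 3.7e-13 mol of 40Ar released.
        print(f"{k_ar_age_yr(0.02, 10e-6, 3.7e-13) / 1e9:.2f} Gyr")   # roughly 3.5 Gyr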

  6. CONTAMINATED SOIL VOLUME ESTIMATE TRACKING METHODOLOGY

    International Nuclear Information System (INIS)

    Durham, L.A.; Johnson, R.L.; Rieman, C.; Kenna, T.; Pilon, R.

    2003-01-01

    The U.S. Army Corps of Engineers (USACE) is conducting a cleanup of radiologically contaminated properties under the Formerly Utilized Sites Remedial Action Program (FUSRAP). The largest cost element for most of the FUSRAP sites is the transportation and disposal of contaminated soil. Project managers and engineers need an estimate of the volume of contaminated soil to determine project costs and schedule. Once excavation activities begin and additional remedial action data are collected, the actual quantity of contaminated soil often deviates from the original estimate, resulting in cost and schedule impacts to the project. The project costs and schedule need to be frequently updated by tracking the actual quantities of excavated soil and contaminated soil remaining during the life of a remedial action project. A soil volume estimate tracking methodology was developed to provide a mechanism for project managers and engineers to create better project controls of costs and schedule. For the FUSRAP Linde site, an estimate of the initial volume of in situ soil above the specified cleanup guidelines was calculated on the basis of discrete soil sample data and other relevant data using indicator geostatistical techniques combined with Bayesian analysis. During the remedial action, updated volume estimates of remaining in situ soils requiring excavation were calculated on a periodic basis. In addition to taking into account the volume of soil that had been excavated, the updated volume estimates incorporated both new gamma walkover surveys and discrete sample data collected as part of the remedial action. A civil survey company provided periodic estimates of actual in situ excavated soil volumes. By using the results from the civil survey of actual in situ volumes excavated and the updated estimate of the remaining volume of contaminated soil requiring excavation, the USACE Buffalo District was able to forecast and update project costs and schedule. The soil volume
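
    A minimal sketch of the bookkeeping implied above (updating the remaining contaminated-soil estimate, and hence cost and schedule, each reporting period) might look like the following; the quantities and the simple update rule are illustrative assumptions, not the geostatistical/Bayesian methodology itself.

        def update_remaining_volume(previous_m3, excavated_m3, revision_m3):
            """Remaining in situ soil above cleanup guidelines after a reporting period.

            previous_m3: prior estimate of contaminated soil still in the ground
            excavated_m3: volume actually removed (e.g., from the civil survey)
            revision_m3: net change implied by new walkover surveys and discrete samples
            """
            return max(previous_m3 - excavated_m3 + revision_m3, 0.0)

        remaining = 12000.0   # m^3, prior estimate (hypothetical)
        remaining = update_remaining_volume(remaining, excavated_m3=1500.0, revision_m3=400.0)
        unit_cost = 250.0     # $/m^3 for transport and disposal (hypothetical)
        print(remaining, "m^3 remaining; forecast disposal cost $", remaining * unit_cost)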

  7. Balancing Conservation and Economic Sustainability: The Future of the Amazon Timber Industry

    Science.gov (United States)

    Merry, Frank; Soares-Filho, Britaldo; Nepstad, Daniel; Amacher, Gregory; Rodrigues, Hermann

    2009-09-01

    Logging has been a much maligned feature of frontier development in the Amazon. Most discussions ignore the fact that logging can be part of a renewable, environmentally benign, and broadly equitable economic activity in these remote places. We estimate there to be some 4.5 ± 1.35 billion m3 of commercial timber volume in the Brazilian Amazon today, of which 1.2 billion m3 is currently profitable to harvest, with a total potential stumpage value of $15.4 billion. A successful forest sector in the Brazilian Amazon will integrate timber harvesting on private lands and on unprotected and unsettled government lands with timber concessions on public lands. If a legal, productive timber industry can be established outside of protected areas, it will deliver environmental benefits in synergy with those provided by the region’s network of protected areas, the latter of which we estimate to have an opportunity cost from lost timber revenues of $2.3 billion over 30 years. Indeed, on all land accessible to harvesting, the timber industry could produce an average of more than 16 million m3 per year over a 30-year harvest cycle—entirely outside of current protected areas—providing $4.8 billion in returns to landowners and generating $1.8 billion in sawnwood sales tax revenue. This level of harvest could be profitably complemented with an additional 10% from logging concessions on National Forests. This advance, however, should be realized only through widespread adoption of reduced impact logging techniques.

  8. Examining the effect of psychopathic traits on gray matter volume in a community substance abuse sample.

    Science.gov (United States)

    Cope, Lora M; Shane, Matthew S; Segall, Judith M; Nyalakanti, Prashanth K; Stevens, Michael C; Pearlson, Godfrey D; Calhoun, Vince D; Kiehl, Kent A

    2012-11-30

    Psychopathy is believed to be associated with brain abnormalities in both paralimbic (i.e., orbitofrontal cortex, insula, temporal pole, parahippocampal gyrus, posterior cingulate) and limbic (i.e., amygdala, hippocampus, anterior cingulate) regions. Recent structural imaging studies in both community and prison samples are beginning to support this view. Sixty-six participants, recruited from community corrections centers, were administered the Hare psychopathy checklist-revised (PCL-R), and underwent magnetic resonance imaging (MRI). Voxel-based morphometry was used to test the hypothesis that psychopathic traits would be associated with gray matter reductions in limbic and paralimbic regions. Effects of lifetime drug and alcohol use on gray matter volume were covaried. Psychopathic traits were negatively associated with gray matter volumes in right insula and right hippocampus. Additionally, psychopathic traits were positively associated with gray matter volumes in bilateral orbital frontal cortex and right anterior cingulate. Exploratory regression analyses indicated that gray matter volumes within right hippocampus and left orbital frontal cortex combined to explain 21.8% of the variance in psychopathy scores. These results support the notion that psychopathic traits are associated with abnormal limbic and paralimbic gray matter volume. Furthermore, gray matter increases in areas shown to be functionally impaired suggest that the structure-function relationship may be more nuanced than previously thought.

  9. Estudi sobre l’eficàcia de l’ús de la hidroteràpia durant el primer període de part

    OpenAIRE

    Blasco i Guitart, Marta

    2017-01-01

    Introduction: For decades, labour pain has been classified as one of the most intense forms of pain. Major advances have established epidural analgesia as the main resource used to relieve labour pain; however, the sociocultural changes of recent years have generated a change of outlook among obstetric professionals and hospitals, making childbirth more personalized and humanized, encouraging the participation and decision-making of the mother and her partner, and decentralizing the use of pharma...

  10. CTC Sentinel. Volume 7, Issue 4, January 2014

    Science.gov (United States)

    2014-01-01

    new threat, the “biohacker.” “Biohacking” is not necessarily malicious and could be as innocent as a beer enthusiast altering yeast to create a... turn operated by the Egyptian government. The canal generates around $5 billion per year for Egypt and is an important source of foreign currency... toll charges at Panama and the deployment of 18,000 20-foot equivalent unit (TEU) vessels, further increasing the importance of the Suez route to...

  11. 3D DVH-based metric analysis versus per-beam planar analysis in IMRT pretreatment verification

    International Nuclear Information System (INIS)

    Carrasco, Pablo; Jornet, Núria; Latorre, Artur; Eudaldo, Teresa; Ruiz, Agustí; Ribas, Montserrat

    2012-01-01

    Purpose: To evaluate methods of pretreatment IMRT analysis, using real measurements performed with a commercial 2D detector array, for clinical relevance and accuracy by comparing clinical DVH parameters. Methods: We divided the work into two parts. The first part consisted of six in-phantom tests aimed to study the sensitivity of the different analysis methods. Beam fluences, 3D dose distribution, and DVH of an unaltered original plan were compared to those of the delivered plan, in which an error had been intentionally introduced. The second part consisted of comparing gamma analysis with DVH metrics for 17 patient plans from various sites. Beam fluences were measured with the MapCHECK 2 detector, per-beam planar analysis was performed with the MapCHECK software, and 3D gamma analysis and the DVH evaluation were performed using 3DVH software. Results: In a per-beam gamma analysis some of the tests yielded false positives or false negatives. However, the 3DVH software correctly described the DVH of the plan which included the error. The measured DVH from the plan with controlled error agreed with the planned DVH within 2% dose or 2% volume. We also found that a gamma criterion of 3%/3 mm was too lax to detect some of the forced errors. Global analysis masked some problems, while local analysis magnified irrelevant errors at low doses. Small hotspots were missed for all metrics due to the spatial resolution of the detector panel. DVH analysis for patient plans revealed small differences between treatment plan calculations and 3DVH results, with the exception of very small volume structures such as the eyes and the lenses. Target coverage (D98 and D95) of the measured plan was systematically lower than that predicted by the treatment planning system, while other DVH characteristics varied depending on the parameter and organ. Conclusions: We found no correlation between the gamma index and the clinical impact of a discrepancy for any of the gamma index evaluation

  12. Quantification of Protozoa and Viruses from Small Water Volumes

    Directory of Open Access Journals (Sweden)

    J. Alfredo Bonilla

    2015-06-01

    Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The aims of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device with protozoa capture by size (top filter) and viruses capture by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation—IFA-microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%), whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels.

  13. U of M seeking $1.1 billion in projects for Soudan Mine lab.

    CERN Multimedia

    2003-01-01

    The University of Minnesota is hoping that groundbreaking research underway at its labs at the Soudan Underground Mine near Tower will help secure up to $1.1 billion in the next 5 to 20 years to expand its work into particle physics (1 page).

  14. Using hair, nail and urine samples for human exposure assessment of legacy and emerging per- and polyfluoroalkyl substances.

    Science.gov (United States)

    Wang, Yuan; Shi, Yali; Vestergren, Robin; Zhou, Zhen; Liang, Yong; Cai, Yaqi

    2018-09-15

    Non-invasive samples present ethical and practical benefits for investigating human exposure to hazardous contaminants, but analytical challenges and difficulties in interpreting the results limit their application in biomonitoring. Here we investigated the potential for using hair, nail and urine samples as a measure of internal exposure to an array of legacy and emerging per- and polyfluoroalkyl substances (PFASs) in two populations with different exposure conditions. Paired urine-serum measurements of PFASs from a group of highly exposed fishery employees displayed strong correlations for PFASs with three to eight perfluorinated carbons (ρ > 0.653; p < 0.01). Consistent statistical correlations and transfer ratios in nails and hair from both populations demonstrated that these non-invasive samples can be used as a measure of internal exposure to perfluorooctane sulfonic acid and C8 chlorinated polyfluoroalkyl ether sulfonic acid (C8 Cl-PFESA). Contrastingly, the infrequent detections and/or lack of consistent transfer ratios for perfluorooctanoic acid, perfluorononanoic acid and short-chain PFASs in hair and nail samples indicate passive uptake from the external environment rather than uptake and internal distribution. Collectively, the study supports the use of urine samples as a valid measure of internal exposure for a range of short- and medium-chain PFASs, while the validity of nail and hair samples as a measure of internal exposure may vary for different PFASs and populations. The ubiquitous detection of C8 Cl-PFESA in all sample matrices from both populations indicates widespread exposure to this contaminant of emerging concern in China.

  15. 32 CFR Appendix D to Part 113 - Sample DD Form 2654, “Involuntary Allotment Notice and Processing”

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false Sample DD Form 2654, “Involuntary Allotment Notice and Processing” D Appendix D to Part 113 National Defense Department of Defense OFFICE OF THE..., App. D Appendix D to Part 113—Sample DD Form 2654, “Involuntary Allotment Notice and Processing...

  16. Los Alamos Scientific Laboratory approach to hydrogeochemical and stream sediment reconnaissance for uranium in the United States

    International Nuclear Information System (INIS)

    Bolivar, S.L.

    1980-01-01

    The Los Alamos Scientific Laboratory of the United States is conducting a geochemical survey for uranium in the Rocky Mountain states of New Mexico, Colorado, Wyoming, and Montana and in Alaska. This survey is part of a national hydrogeochemical and stream sediment reconnaissance in which four Department of Energy laboratories will study the uranium resources of the United States to provide data for the National Uranium Resource Evaluation program. The reconnaissance will identify areas having higher than background concentrations of uranium in ground waters, surface waters, and water-transported sediments. The reconnaissance data will be combined with data from airborne radiometric surveys and geological and geophysical investigations to provide an improved estimate for the economics and availability of nuclear fuel resources in the United States and to make information available to industry for use in the exploration and development of uranium resources. Water samples are analyzed for uranium by fluorometry, which has a 0.02 parts-per-billion lower limit of detection. Concentrations of 12 additional elements in water are determined by plasma-source emission spectrography. All sediments are analyzed for uranium by delayed-neutron counting, with a 20 parts-per-billion lower limit of detection. Elemental concentrations in sediments are also determined by neutron activation analysis, x-ray fluorescence, and arc-source emission spectrography. To date, all four Rocky Mountain states and about 80% of Alaska have been sampled. About 220,000 samples have been collected from an area of nearly 2,500,000 km2. The philosophy, sampling methodology, analytical techniques, and progress of the reconnaissance are described in several published pilot study, reconnaissance, and technical reports. The Los Alamos program was designed to maximize the identification of uranium in terrains of varied geography, geology, and climate.

  17. Use of environmental isotopes in hydrogeological studies in Kedah and Perlis, Malaysia

    International Nuclear Information System (INIS)

    Daud bin Mohamad.

    1981-01-01

    A preliminary study of the isotope hydrology of the Kedah and Perlis area was undertaken under the RCA programme. This project is an attempt at elucidating the mechanism of recharge, the origin, the area of recharge and the dating of the groundwater system in the area. The results show that all groundwater samples in the area vary within a narrow range for ¹⁸O (-7.58 to -5.06%) while ²H ranges from -50.3 to -35.1%. The mean isotopic composition of precipitation collected at the Alor Star meteorological station falls within the range of variation of the Kedah/Perlis groundwaters. In the southern part of the study site, the isotopic results indicate the occurrence of two types of water: in the first, recharge is from the highlands, where more negative ¹⁸O values and low tritium were observed; the second type is of local recharge, where high tritium and less negative ¹⁸O values were observed. On the other hand, in the northern part of the basin the interpretation of the stable isotopic results is difficult to make at this stage, as there was no correlation between tritium and ¹⁸O. Results of the tritium assay show that some of the groundwater samples are pre-nuclear in age, as indicated by their low tritium content. Consequently, a Carbon-14 investigation was carried out at a few selected sites and the ages were found to be in the range of about 3000 to 5000 years. (author)

  18. Comparison of bone-implant contact and bone-implant volume between 2D-histological sections and 3D-SRµCT slices

    Directory of Open Access Journals (Sweden)

    R Bernhardt

    2012-04-01

    Full Text Available Histological imaging is still considered the gold standard for analysing bone formation around metallic implants. Generally, a limited number of histological sections per sample are used for the approximation of mean values of peri-implant bone formation. In this study we statistically compared the results of bone-implant contact (BIC) and bone-implant volume (BIV) obtained by histological sections with those obtained by X-ray absorption images from synchrotron radiation micro-computed tomography (SRµCT), using osseointegrated screw-shaped implants from a mini-pig study. Comparing the BIC results of 3-4 histological sections per implant sample with the corresponding 3-4 SRµCT slices showed a non-significant difference of 1.9% (p = 0.703). The contact area assessed using the whole 3D information from the SRµCT measurement, compared with the histomorphometric results, showed a non-significant difference in BIC of 4.9% (p = 0.171). The bone-implant volume in the histological sections and the corresponding SRµCT slices showed a non-significant difference of only 1.4% (p = 0.736), and the difference also remained non-significant at 2.6% (p = 0.323) when using the volumetric SRµCT information. We conclude that for a clinical evaluation of implant osseointegration with histological imaging, at least 3-4 sections per sample are sufficient to represent the BIC or BIV for a sample. Because in this study we found a significant intra-sample variation in BIC of up to ±35%, the selection of only one or two histological sections per sample may strongly influence the determined BIC.
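
    The per-sample comparison summarized above (matched 2D sections versus SRµCT slices, reported as a mean difference with a p-value) can be illustrated with a short Python sketch. The BIC values below are hypothetical, and the paired t-test is only one reasonable choice; the abstract does not state which statistical test the authors actually used.

        # Minimal sketch: paired comparison of per-implant BIC values from matched
        # 2D histological sections and SRuCT slices.  All numbers are hypothetical.
        from scipy import stats

        bic_histology = [52.1, 47.8, 61.3, 39.5, 55.0, 44.2]   # % BIC per implant (hypothetical)
        bic_sr_uct    = [50.4, 49.9, 59.8, 41.0, 53.2, 46.1]   # matched SRuCT slices (hypothetical)

        t_stat, p_value = stats.ttest_rel(bic_histology, bic_sr_uct)
        mean_diff = sum(h - s for h, s in zip(bic_histology, bic_sr_uct)) / len(bic_histology)
        print(f"mean paired difference = {mean_diff:.1f} %BIC, p = {p_value:.3f}")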

  19. Concentration of Beryllium (Be) and Depleted Uranium (DU) in Marine Fauna and Sediment Samples from Illeginni and Boggerik Islands at Kwajalein Atoll

    International Nuclear Information System (INIS)

    Robison, W L; Hamilton, T F; Martinelli, R E; Kehl, S R; Lindman, T R

    2005-01-01

    Lawrence Livermore National Laboratory (LLNL) personnel have supported US Air Force (USAF) ballistic missile flight tests for about 15 years for Peacekeeper and Minuteman missiles launched at Vandenberg Air Force Base (VAFB). Associated re-entry vehicles (RVs) re-enter at the Reagan Test Site (RTS) at the US Army base at Kwajalein Atoll (USAKA), where LLNL has supported scoring, recovery operations for RV materials, and environmental assessments. As part of ongoing USAF ballistic missile flight test programs, LLNL is participating in an updated EA being written for flights originating at VAFB. Marine fauna and sediments (beach-sand samples) were collected by the US Fish and Wildlife Service (USFWS), the National Marine Fisheries Service (NMFS), and LLNL at Illeginni Island and Boggerik Island (serving as a control site) at Kwajalein Atoll. Data on the concentrations of DU (hereafter, U) and Be in the collected samples were requested by USFWS and NMFS to determine whether or not U and Be in RVs entering the Illeginni area are increasing U and Be concentrations in marine fauna and sediments. LLNL agreed to do the analyses for U and Be in support of the EA process and provide a report of the results. There is no statistically significant difference in the concentrations of U and Be in six species of marine fauna from Illeginni and Boggerik Islands (p = 0.14 for U and p = 0.34 for Be). Thus, there is no evidence that there has been any increase in U and Be concentrations in marine fauna as a result of the missile flight test program. The concentration of U in beach sand at Illeginni is the same as in soil and beach sand in the rest of the Marshall Islands and again reflects an insignificant impact from the flight test program. Beach sand from Illeginni has a mean concentration of Be higher than that from the control site, Boggerik Island. Seven of 21 samples from Illeginni had detectable Be. Four samples had a concentration of Be ranging from 4 to 7 ng g⁻¹ (4 to 7 parts per billion (ppb)), one

  20. Outcomes of PCI in Relation to Procedural Characteristics and Operator Volumes in the United States.

    Science.gov (United States)

    Fanaroff, Alexander C; Zakroysky, Pearl; Dai, David; Wojdyla, Daniel; Sherwood, Matthew W; Roe, Matthew T; Wang, Tracy Y; Peterson, Eric D; Gurm, Hitinder S; Cohen, Mauricio G; Messenger, John C; Rao, Sunil V

    2017-06-20

    Professional guidelines have reduced the recommended minimum number of percutaneous coronary intervention (PCI) procedures performed annually by each operator to an average of 50. Operator volume patterns and associated outcomes since this change are unknown. The authors describe herein PCI operator procedure volumes; characteristics of low-, intermediate-, and high-volume operators; and the relationship between operator volume and clinical outcomes in a large, contemporary, nationwide sample. Using data from the National Cardiovascular Data Registry collected between July 1, 2009, and March 31, 2015, we examined operator annual PCI volume. We divided operators into low- (<50 PCIs per year), intermediate- (50 to 100 PCIs per year), and high-volume (>100 PCIs per year) groups, and determined the adjusted association between annual PCI volume and in-hospital outcomes, including mortality. The median annual number of procedures performed per operator was 59; 44% of operators performed <50 PCI procedures per year. Low-volume operators more frequently performed emergency and primary PCI procedures and practiced at hospitals with lower annual PCI volumes. Unadjusted in-hospital mortality was 1.86% for low-volume operators, 1.73% for intermediate-volume operators, and 1.48% for high-volume operators. The adjusted risk of in-hospital mortality was higher for PCI procedures performed by low- and intermediate-volume operators compared with those performed by high-volume operators (adjusted odds ratio: 1.16 for low vs. high; adjusted odds ratio: 1.05 for intermediate vs. high volume), as was the risk of new dialysis post PCI. No volume relationship was observed for post-PCI bleeding. Many PCI operators in the United States are performing fewer than the recommended number of PCI procedures annually. Although absolute risk differences are small and may be partially explained by unmeasured differences in case mix between operators, there remains an inverse relationship between PCI operator volume and in-hospital mortality that persisted in risk
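
    Adjusted odds ratios of the kind quoted above are typically obtained from a multivariable logistic regression of in-hospital mortality on operator volume group plus case-mix covariates. The Python sketch below illustrates only the general form: the data are simulated, the covariate set is reduced to a single age variable, and this is not the registry's actual risk-adjustment model.

        # Hedged sketch: odds ratios for in-hospital mortality by operator volume
        # group, relative to high-volume operators.  Simulated data only; the real
        # registry model adjusts for a much richer set of patient covariates.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(1)
        n = 5000
        df = pd.DataFrame({
            "volume_group": rng.choice(["low", "intermediate", "high"], size=n),
            "age": rng.normal(65, 10, n),
        })
        logit_p = (-4.5 + 0.02 * (df["age"] - 65)
                   + df["volume_group"].map({"low": 0.15, "intermediate": 0.05, "high": 0.0}))
        df["died"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

        fit = smf.logit("died ~ C(volume_group, Treatment('high')) + age", data=df).fit(disp=0)
        print(np.exp(fit.params))   # exponentiated coefficients = odds ratios vs. high volume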

  1. Final report on Phase II remedial action at the former Middlesex Sampling Plant and associated properties. Volume 2

    International Nuclear Information System (INIS)

    1985-04-01

    Volume 2 presents the radiological measurement data taken after remedial action on properties surrounding the former Middlesex Sampling Plant during Phase II of the DOE Middlesex Remedial Action Program. Also included are analyses of the confirmatory radiological survey data for each parcel with respect to the remedial action criteria established by DOE for the Phase II cleanup and a discussion of the final status of each property. Engineering details of this project and a description of the associated health physics and environmental monitoring activities are presented in Volume 1

  2. Statistical evaluations concerning the failure behaviour of formed parts with superheated steam flow. Pt. 1

    International Nuclear Information System (INIS)

    Oude-Hengel, H.H.; Vorwerk, K.; Heuser, F.W.; Boesebeck, K.

    1976-01-01

    Statistical evaluations concerning the failure behaviour of formed parts with superheated-steam flow were carried out using data from VdTUEV inventory and failure statistics. Due to the great number of results, the findings will be published in two volumes. This first part will describe and classify the stock of data and will make preliminary quantitative statements on failure behaviour. More differentiated statements are made possible by including the operation time and the number of start-ups per failed part. On the basis of time-constant failure rates some materials-specific statements are given. (orig./ORU) [de

  3. BioSAXS Sample Changer: a robotic sample changer for rapid and reliable high-throughput X-ray solution scattering experiments

    Energy Technology Data Exchange (ETDEWEB)

    Round, Adam, E-mail: around@embl.fr; Felisaz, Franck [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Fodinger, Lukas; Gobbo, Alexandre [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Huet, Julien [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Villard, Cyril [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Blanchet, Clement E., E-mail: around@embl.fr [EMBL c/o DESY, Notkestrasse 85, 22603 Hamburg (Germany); Pernot, Petra; McSweeney, Sean [ESRF, 6 Rue Jules Horowitz, 38000 Grenoble (France); Roessle, Manfred; Svergun, Dmitri I. [EMBL c/o DESY, Notkestrasse 85, 22603 Hamburg (Germany); Cipriani, Florent, E-mail: around@embl.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France)

    2015-01-01

    A robotic sample changer for solution X-ray scattering experiments optimized for speed and to use the minimum amount of material has been developed. This system is now in routine use at three high-brilliance European synchrotron sites, each capable of several hundred measurements per day. Small-angle X-ray scattering (SAXS) of macromolecules in solution is in increasing demand by an ever more diverse research community, both academic and industrial. To better serve user needs, and to allow automated and high-throughput operation, a sample changer (BioSAXS Sample Changer) that is able to perform unattended measurements of up to several hundred samples per day has been developed. The Sample Changer is able to handle and expose sample volumes of down to 5 µl with a measurement/cleaning cycle of under 1 min. The samples are stored in standard 96-well plates and the data are collected in a vacuum-mounted capillary with automated positioning of the solution in the X-ray beam. Fast and efficient capillary cleaning avoids cross-contamination and ensures reproducibility of the measurements. Independent temperature control for the well storage and for the measurement capillary allows the samples to be kept cool while still collecting data at physiological temperatures. The Sample Changer has been installed at three major third-generation synchrotrons: on the BM29 beamline at the European Synchrotron Radiation Facility (ESRF), the P12 beamline at the PETRA-III synchrotron (EMBL@PETRA-III) and the I22/B21 beamlines at Diamond Light Source, with the latter being the first commercial unit supplied by Bruker ASC.

  4. Web tools for the preparation of the Bachelor's and Master's Final Projects in Chemical Engineering

    Directory of Open Access Journals (Sweden)

    Maria del Carmen Marquez Moreno

    2014-12-01

    Full Text Available The need for a large volume of information when preparing professional-type projects in Chemical Engineering can become a serious problem in those cases where the project must be prepared by students for their Bachelor's or Master's Final Project, since obtaining this information is complicated and often involves a significant financial outlay. To solve this problem, the aim of this article is to provide students with freely accessible web tools that give them information relevant to the parts of the project they must prepare. To do so, an analysis was made of the fields in which students need information, together with a study to determine the freely accessible web tools available for obtaining that information. From this search, freely accessible web pages containing the information needed to prepare a professional project were found, and their references are included in this work.

  5. Industrial experience feedback of a geostatistical estimation of contaminated soil volumes - 59181

    International Nuclear Information System (INIS)

    Faucheux, Claire; Jeannee, Nicolas

    2012-01-01

    Geostatistics is attracting growing interest for remediation forecasting at potentially contaminated sites, as it provides suitable methods for mapping both chemical and radiological pollution, estimating contaminated volumes (potentially integrating auxiliary information), and setting up adaptive sampling strategies. As part of demonstration studies carried out for GeoSiPol (Geostatistics for Polluted Sites), geostatistics has been applied for the detailed diagnosis of a former oil depot in France. The ability within the geostatistical framework to generate pessimistic / probable / optimistic scenarios for the contaminated volumes allows a quantification of the risks associated with the remediation process: e.g., the financial risk of excavating clean soil and the health risk of leaving contaminated soil in place. After a first mapping, an iterative approach leads to the collection of additional samples in areas previously identified as highly uncertain. Estimated volumes are then updated and compared to the volumes actually excavated. This benchmarking therefore provides practical feedback on the performance of the geostatistical methodology. (authors)

  6. Residual limb fluid volume change and volume accommodation: Relationships to activity and self-report outcomes in people with trans-tibial amputation.

    Science.gov (United States)

    Sanders, Joan E; Youngblood, Robert T; Hafner, Brian J; Ciol, Marcia A; Allyn, Katheryn J; Gardner, David; Cagle, John C; Redd, Christian B; Dietrich, Colin R

    2018-02-01

    Fluctuations in limb volume degrade prosthesis fit and require users to accommodate changes using management strategies, such as donning and doffing prosthetic socks. To examine how activities and self-report outcomes relate to daily changes in residual limb fluid volume and volume accommodation. Standardized, two-part laboratory protocol with an interim observational period. Participants were classified as "accommodators" or "non-accommodators," based on self-reported prosthetic sock use. Participants' residual limb fluid volume change was measured using a custom bioimpedance analyzer and a standardized in-laboratory activity protocol. Self-report health outcomes were assessed with the Socket Comfort Score and the Prosthesis Evaluation Questionnaire. Activity was monitored while participants left the laboratory for at least 3 h. They then returned to repeat the bioimpedance test protocol. Twenty-nine people were enrolled. Morning-to-afternoon percent limb fluid volume change per hour was not strongly correlated with percent time weight-bearing or with self-report outcomes. As a group, non-accommodators (n = 15) spent more time with their prosthesis doffed and reported better outcomes than accommodators. Factors other than time weight-bearing may contribute to morning-to-afternoon limb fluid volume changes and reported satisfaction with the prosthesis among trans-tibial prosthesis users. Temporary doffing may be a more effective and satisfying accommodation method than sock addition. Clinical relevance: Practitioners should be mindful that daily limb fluid volume change and prosthesis satisfaction are not dictated exclusively by activity. Temporarily doffing the prosthesis may slow daily limb fluid volume loss and should be investigated as an alternative strategy to sock addition.

  7. Green Ocean Amazon 2014/15 High-Volume Filter Sampling: Atmospheric Particulate Matter of an Amazon Tropical City and its Relationship to Population Health Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Machado, C. M. [Federal Univ. of Amazonas (Brazil); Santos, Erickson O. [Federal Univ. of Amazonas (Brazil); Fernandes, Karenn S. [Federal Univ. of Amazonas (Brazil); Neto, J. L. [Federal Univ. of Amazonas (Brazil); Souza, Rodrigo A. [Univ. of the State of Amazonas (Brazil)

    2016-08-01

    Manaus, the capital of the Brazilian state of Amazonas, is developing very rapidly. Its pollution plume contains aerosols from fossil fuel combustion, mainly due to vehicular emissions, industrial activity, and a thermal power plant. Soil resuspension is probably a secondary source of atmospheric particles. The plume transports urban pollutants from Manaus, as well as pollutants from pottery factories along its route, to the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility site at Manacapuru. Considering the effects of particulate matter on health, atmospheric particulate matter was evaluated at this site as part of the ARM Facility’s Green Ocean Amazon 2014/15 (GoAmazon 2014/15) field campaign. Aerosol or particulate matter (PM) is typically defined by size, with the smaller particles having more health impact. Total suspended particulates (TSP) are particles smaller than 100 μm; particles smaller than 2.5 μm are called PM2.5. In this work, PM2.5 levels were obtained from March to December 2015, totaling 34 samples, and TSP levels from October to December 2015, totaling 17 samples. Sampling was conducted with PM2.5 and TSP high-volume samplers using quartz filters (Figure 1). Filters were stored for 24 hours in a room with controlled temperature (21.1 °C) and humidity (44.3%) in order to perform gravimetric analyses by weighing before and after sampling. This procedure followed the recommendations of the Brazilian Association for Technical Standards norm NBR 9547:1997. Mass concentrations of particulate matter were obtained as the ratio between the collected sample mass and the volume of air sampled. Defining a relationship between particulate matter (PM2.5 and TSP) and respiratory diseases of the local population is an important goal of this project, since no information exists on that topic.
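
    The gravimetric calculation described in the last sentences (collected mass divided by sampled air volume) can be written as a short Python helper. The filter masses, flow rate, and sampling time below are hypothetical values chosen only to illustrate the arithmetic; they are not taken from the campaign.

        # Gravimetric PM concentration from a high-volume filter sample.
        # All input values here are hypothetical illustrations.
        def pm_concentration_ug_m3(mass_before_g, mass_after_g, flow_m3_per_min, minutes):
            collected_ug = (mass_after_g - mass_before_g) * 1e6   # grams -> micrograms
            air_volume_m3 = flow_m3_per_min * minutes             # total volume of air sampled
            return collected_ug / air_volume_m3

        # Example: a 24 h sample at 1.13 m3/min that collected 30 mg of particulate
        print(pm_concentration_ug_m3(3.2000, 3.2300, 1.13, 24 * 60))   # about 18.4 ug/m3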

  8. Electric power annual 1997. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-01

    The Electric Power Annual presents a summary of electric power industry statistics at national, regional, and State levels. The objective of the publication is to provide industry decisionmakers, government policy-makers, analysts, and the general public with data that may be used in understanding US electricity markets. The Electric Power Annual is prepared by the Electric Power Division; Office of Coal, Nuclear, Electric and Alternate Fuels; Energy Information Administration (EIA); US Department of Energy. Volume 1 -- with a focus on US electric utilities -- contains final 1997 data on net generation and fossil fuel consumption, stocks, receipts, and cost; preliminary 1997 data on generating unit capability, and retail sales of electricity, associated revenue, and the average revenue per kilowatthour of electricity sold (based on a monthly sample: Form EIA-826, “Monthly Electric Utility Sales and Revenue Report with State Distributions”). Additionally, information on net generation from renewable energy sources and on the associated generating capability is included in Volume 1 of the EPA.

  9. Detection of atmospheric gaseous amines and amides by a high-resolution time-of-flight chemical ionization mass spectrometer with protonated ethanol reagent ions

    Directory of Open Access Journals (Sweden)

    L. Yao

    2016-11-01

    Full Text Available Amines and amides are important atmospheric organic-nitrogen compounds, but high-time-resolution, highly sensitive, and simultaneous ambient measurements of these species are rather sparse. Here, we present the development of a high-resolution time-of-flight chemical ionization mass spectrometer (HR-ToF-CIMS) method, utilizing protonated ethanol as reagent ions to simultaneously detect atmospheric gaseous amines (C1 to C6) and amides (C1 to C6). This method possesses sensitivities of 5.6–19.4 Hz pptv⁻¹ for amines and 3.8–38.0 Hz pptv⁻¹ for amides at total reagent ion signals of ∼0.32 MHz. The detection limits were 0.10–0.50 pptv for amines and 0.29–1.95 pptv for amides, at 3σ of the background signal for a 1 min integration time. Controlled characterization in the laboratory indicates that relative humidity has a significant influence on the detection of amines and amides, whereas the presence of organics has no obvious effect. Ambient measurements of amines and amides utilizing this method were conducted from 25 July to 25 August 2015 in urban Shanghai, China. While the concentrations of amines ranged from a few parts per trillion by volume to hundreds of parts per trillion by volume, concentrations of amides varied from tens of parts per trillion by volume to a few parts per billion by volume. Among the C1- to C6-amines, the C2-amines were the dominant species, with concentrations up to 130 pptv. For amides, the C3-amides (up to 8.7 ppb) were the most abundant species. Diurnal profiles and backward-trajectory analysis of the amides suggest that, in addition to secondary formation of amides in the atmosphere, industrial emissions could be important sources of amides in urban Shanghai. During the campaign, photo-oxidation of amines and amides might have been their main daytime loss pathway, and wet deposition was also an important sink.
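
    The detection limits quoted above follow the usual pattern LOD = 3σ(background) / sensitivity for a given integration time. The sketch below illustrates that relationship in Python; the background count rate, integration time, and sensitivity are hypothetical, and the assumption that σ follows Poisson counting statistics is an assumption of this sketch, not stated in the abstract.

        # Detection limit at 3 sigma of the background for a chosen integration time.
        # Hypothetical inputs; sigma is estimated here from Poisson counting statistics.
        import math

        def detection_limit_pptv(background_hz, integration_s, sensitivity_hz_per_pptv):
            counts = background_hz * integration_s
            sigma_hz = math.sqrt(counts) / integration_s   # std of the background count rate
            return 3 * sigma_hz / sensitivity_hz_per_pptv

        # e.g. a 2 Hz background, 60 s integration, 10 Hz/pptv sensitivity
        print(f"{detection_limit_pptv(2.0, 60, 10.0):.3f} pptv")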

  10. Fat Injection: A Systematic Review of Injection Volumes by Facial Subunit.

    Science.gov (United States)

    Shue, Shirley; Kurlander, David E; Guyuron, Bahman

    2017-08-08

    Fat grafting to the aging face has become an integral component of esthetic surgery. However, the amount of fat to inject to each area of the face is not standardized and has been based mainly on the surgeon's experience. The purpose of this study was to perform a systematic review of injected fat volumes to different facial zones. A systematic review of the literature was performed through a MEDLINE search using the keywords "facial," "fat grafting," "lipofilling," "Coleman technique," "autologous fat transfer," and "structural fat grafting." Articles were then sorted by facial subunit and analyzed for: author(s), year of publication, study design, sample size, donor site, fat preparation technique, average and range of volume injected, time to follow-up, percentage of volume retention, and complications. Descriptive statistics were performed. Nineteen articles involving a total of 510 patients were included. Rhytidectomy was the most common procedure performed concurrently with fat injection. The mean volume of fat injected to the forehead is 6.5 mL (range 4.0-10.0 mL); to the glabellar region 1.4 mL (range 1.0-4.0 mL); to the temple 5.9 mL per side (range 2.0-10.0 mL); to the eyebrow 5.5 mL per side; to the upper eyelid 1.7 mL per side (range 1.5-2.5 mL); to the tear trough 0.65 mL per side (range 0.3-1.0 mL); to the infraorbital area (infraorbital rim to lower lid/cheek junction) 1.4 mL per side (range 0.9-3.0 mL); to the midface 1.4 mL per side (range 1.0-4.0 mL); to the nasolabial fold 2.8 mL per side (range 1.0-7.5 mL); to the mandibular area 11.5 mL per side (range 4.0-27.0 mL); and to the chin 6.7 mL (range 1.0-20.0 mL). Data on exactly how much fat to inject to each area of the face in facial fat grafting are currently limited and vary widely based on the different methods and anatomical terms used. This review offers the ranges and averages for the injected volume in each zone.

  11. Influência do período de coleta sobre o volume, motilidade e doses de sêmen em suínos / Influence of the collection period on volume, motility and semen doses in swine

    Directory of Open Access Journals (Sweden)

    Martha Lopes Schuch de Castro

    1996-12-01

    Full Text Available The aim of this experiment was to determine the influence of the collection period on volume (VOL), motility (MOT) and semen doses (DO), the correlations among these variables, and their repeatabilities. Semen samples from ninety-six (96) boars of the Landrace (41), Large White (31) and Duroc (24) breeds were analyzed, covering the period (1981 to 1987) during which the boars remained at the Swine Artificial Insemination Center of Estrela - RS. The numbers of semen samples collected were: Landrace 7,264, Large White 3,589 and Duroc 3,051. Year and month of collection influenced (P<0.01) the variables analyzed. Minimum and maximum average values within each breed were: VOL 236.9 and 300.4 mL (Landrace), 238.1 and 284.1 mL (Large White) and 150.0 and 201.1 mL (Duroc); MOT 79.2 and 80.3% (Landrace), 76.7 and 78.0% (Large White) and 77.8 and 79.1% (Duroc); DO 12.0 and 14.7 (Landrace), 10.1 and 13.0 (Large White) and 9.1 and 11.9 (Duroc), respectively. Correlations between VOL and DO were 0.30 (Landrace), 0.36 (Large White) and 0.36 (Duroc). Correlations between VOL and MOT were close to zero (Landrace -0.05, Large White 0.03 and Duroc 0.01), and between MOT and DO were 0.08 (Landrace), 0.15 (Large White) and 0.13 (Duroc). Repeatabilities were VOL 0.49 (Landrace), 0.59 (Large White and

  12. Development of a novel ozone- and photo-stable HyPer5 red fluorescent dye for array CGH and microarray gene expression analysis with consistent performance irrespective of environmental conditions

    Directory of Open Access Journals (Sweden)

    Kille Peter

    2008-11-01

    Full Text Available Abstract Background Array-based comparative genomic hybridization (CGH) and gene expression profiling have become vital techniques for identifying molecular defects underlying genetic diseases. Regardless of the microarray platform, the cyanine dyes Cy3 and Cy5 are among the most widely used fluorescent dye pairs for microarray analysis owing to their brightness and ease of incorporation, enabling a high level of assay sensitivity. However, combining both dyes on arrays can become problematic during summer months, when ozone levels rise to near 25 parts per billion (ppb). Under such conditions, Cy5 is known to rapidly degrade, leading to loss of signal from either "homebrew" or commercial arrays. Cy5 can also suffer disproportionately from dye photobleaching, resulting in distortion of the Cy5/Cy3 ratios used in copy number analysis. Our laboratory has been active in fluorescent dye research to find a suitable alternative to Cy5 that is stable to ozone and resistant to photobleaching. Here, we report on the development of such a dye, called HyPer5, and describe its exceptional ozone stability and photostability on microarrays. Results Our results show HyPer5 signal to be stable to high ozone levels. Repeated exposure of mouse arrays hybridized with HyPer5-labeled cDNA to 300 ppb ozone at 5, 10 and 15 minute intervals resulted in no signal loss from the dye. In comparison, Cy5 arrays showed a dramatic 80% decrease in total signal during the same interval. Photobleaching experiments show HyPer5 to be resistant to light-induced damage, with a 3-fold improvement in dye stability over Cy5. In high-resolution array CGH experiments, HyPer5 is demonstrated to detect chromosomal aberrations at loci 2p21-16.3 and 15q26.3-26.2 from three patient samples using bacterial artificial chromosome (BAC) arrays. The photostability of HyPer5 is further documented by repeat array scanning without loss of detection. Additionally, HyPer5 arrays are shown to preserve sensitivity and

  13. Analysis of plant hormones by microemulsion electrokinetic capillary chromatography coupled with on-line large volume sample stacking.

    Science.gov (United States)

    Chen, Zongbao; Lin, Zian; Zhang, Lin; Cai, Yan; Zhang, Lan

    2012-04-07

    A novel method of microemulsion electrokinetic capillary chromatography (MEEKC) coupled with on-line large-volume sample stacking was developed for the analysis of six plant hormones: indole-3-acetic acid, indole-3-butyric acid, indole-3-propionic acid, 1-naphthaleneacetic acid, abscisic acid and salicylic acid. Baseline separation of the six plant hormones was achieved within 10 min by using a microemulsion background electrolyte containing 97.2% (w/w) 10 mM borate buffer at pH 9.2, 1.0% (w/w) ethyl acetate as oil droplets, 0.6% (w/w) sodium dodecyl sulphate as surfactant and 1.2% (w/w) 1-butanol as cosurfactant. In addition, an on-line concentration method based on the large-volume sample stacking technique and multiple-wavelength detection was adopted to improve the detection sensitivity, in order to determine trace-level hormones in real samples. The optimized method provided about a 50-100-fold increase in detection sensitivity compared with the single MEEKC method, and the detection limits (S/N = 3) were between 0.005 and 0.02 μg mL⁻¹. The proposed method was simple, rapid and sensitive and could be applied to the determination of the six plant hormones in spiked water samples and tobacco leaves, and of 1-naphthylacetic acid in leaf fertilizer. The recoveries ranged from 76.0% to 119.1%, and good reproducibility was obtained, with relative standard deviations (RSDs) of less than 6.6%.
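
    Two figures of merit in this abstract, the sensitivity-enhancement factor of the stacking step and the S/N = 3 detection limit, reduce to simple ratios. The short sketch below shows both calculations with hypothetical peak data; the assumption of a linear response through the origin for the LOD estimate is part of the sketch, not of the original method description.

        # Enhancement factor of the stacking step and an S/N = 3 detection limit.
        # Hypothetical values, assuming a linear response through the origin.
        def enhancement_factor(lod_without_stacking, lod_with_stacking):
            return lod_without_stacking / lod_with_stacking

        def lod_sn3(concentration, peak_signal, baseline_noise):
            # concentration that would give a peak equal to 3x the baseline noise
            return 3 * baseline_noise * concentration / peak_signal

        print(enhancement_factor(1.0, 0.01))                # 100-fold, cf. the 50-100x reported
        print(lod_sn3(concentration=0.1, peak_signal=150.0, baseline_noise=5.0))  # 0.01 ug/mL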

  14. Plasma diagnostics package. Volume 2: Spacelab 2 section, part A

    Science.gov (United States)

    Pickett, Jolene S. (Compiler); Frank, L. A. (Compiler); Kurth, W. S. (Compiler)

    1988-01-01

    This volume (Volume 2, consisting of Parts A and B) of the Plasma Diagnostics Package (PDP) Final Science Report contains a summary of all of the data reduction and scientific analyses which were performed using PDP data obtained on STS-51F as a part of the Spacelab 2 (SL-2) payload. This work was performed during the period from launch, July 29, 1985, through June 30, 1988. During this period the primary data reduction effort consisted of processing summary plots of the data received by 12 of the 14 instruments located on the PDP and submitting these data to the National Space Science Data Center (NSSDC). The scientific analyses during the performance period consisted of follow-up studies of the shuttle orbiter environment and orbiter/ionosphere interactions and various plasma particle and wave studies which dealt with data taken when the PDP was on the Remote Manipulator System (RMS) arm and when the PDP was in free flight. Of particular interest during the RMS operations and free flight were the orbiter wake studies and joint studies of beam/plasma interactions with the SL-2 Fast Pulse Electron Generator (FPEG) of the Vehicle Charging and Potential Investigation (VCAP). Internal reports, published papers and presentations which involve PDP/SL-2 data are listed in Sections 3 and 4. A PDP/SL-2 scientific results meeting was held at the University of Iowa on June 10, 1986. This meeting was attended by most of the PDP and VCAP investigators and provided a forum for discussing and comparing the various results, particularly with regard to the PDP free flight.

  15. Neutron activation analysis of geochemical samples

    International Nuclear Information System (INIS)

    Rosenberg, R.; Zilliacus, R.; Kaistila, M.

    1983-06-01

    The present paper describes the work done at the Technical Research Centre of Finland in developing methods for the large-scale activation analysis of samples for the geochemical prospecting of metals. Geochemical prospecting for uranium started in Finland in 1974, and consequently a manually operated device for the delayed-neutron activation analysis of uranium was taken into use. During 1974, 9000 samples were analyzed. The small capacity of the analyzer made it necessary to develop a completely automated analyzer, which was taken into use in August 1975. Since then, 20,000-30,000 samples have been analyzed annually, the annual capacity being about 60,000 samples when running seven hours per day. Multielemental instrumental neutron activation analysis is used for the analysis of more than 40 elements. Using instrumental epithermal neutron activation analysis, 25-27 elements can be analyzed with a single irradiation and a 20-min measurement. During 1982, 12,000 samples were analyzed for mining companies and the Geological Survey of Finland. The capacity is 600 samples per week. Besides these two analytical methods, the analysis of lanthanoids is an important part of the work. Eleven lanthanoids have been analyzed using instrumental neutron activation analysis. Radiochemical separation methods have been developed for several elements to improve the sensitivity of the analysis.

  16. Resource Conservation and Recovery Act: Part B Permit application. Volume 2, Chapter C, Appendix C1-Appendix C8

    International Nuclear Information System (INIS)

    1995-01-01

    Volume 2 contains appendices for the following: chemical compatibility analysis of waste forms and container materials; data accumulated from headspace-gas analyses; totals analysis versus toxicity characteristic leaching procedure; waste characterization sampling methods; applicability of real-time radiography; quality assurance objectives for waste characterization sampling and analytical methods; quality assurance project plan requirements; and Waste Isolation Pilot Plant generator/storage site waste screening and acceptance audit program

  17. Waste tank vapor project: Vapor characterization of Tank 241-C-103: Data report for OVS samples collected from Sample Job 7b, Parts I and II, received 5/18/94 and 5/24/94

    International Nuclear Information System (INIS)

    Clauss, T.R.; Edwards, J.A.; Fruchter, J.S.

    1994-09-01

    On 5/18/94, Westinghouse Hanford Company (WHC) delivered samples to Pacific Northwest Laboratory (PNL) that were collected from waste Tank 241-C-103 on 5/16/94. These samples were from Sample Job (SJ) 7b, Part 1. On 5/24/94, WHC delivered samples to PNL that were collected from waste Tank 241-C-103 on 5/18/94. These samples were from SJ7b, Part 2. A summary of the data derived from the sampling of waste Tank 241-C-103 for gravimetric (H₂O) and normal paraffin hydrocarbon (NPH) concentrations is shown for SJ7b. Gravimetric analysis was performed on the samples within 24 hours of receipt by PNL. The average NPH concentration of the 10 samples collected for Part 1 was slightly higher than the average concentration of the 15 samples collected in Part 2: 812 (± 133) mg/m³ and 659 (± 88) mg/m³, respectively. The higher concentrations measured in Part 1 samples may be because the samples in Part 1 were collected at a single level, 0.79 meters above the air-liquid interface. Part 2 samples were collected at three different tank levels: 0.79, 2.92, and 5.05 m above the air-liquid interface. In Part 2, the average NPH concentrations for the 5 samples collected at each of the three levels were similar: 697 (60) mg/m³ at the low level, 631 (51) mg/m³ at the mid level, and 651 (134) mg/m³ at the high level. It is important to note that the measured tridecane-to-dodecane concentration ratio remained constant in all samples collected in Parts 1 and 2. That ratio is 1.2 ± 0.05. This consistent ratio indicates that there were no random analytical biases towards either compound.

  18. Insects for human food: barriers and drivers of consumer acceptance

    NARCIS (Netherlands)

    Materia, V.C.; Cavallo, Carla

    2015-01-01

    As the overall demand for food increases, there is an urgent need of sustainable proteins. To what extent European consumers are willing to accept innovative ingredients in their diet is the main research question we address. Interviews are conducted on a sample of 45 Italian consumers and results

  19. Reduction of PowerPlex® Y23 reaction volume for genotyping buccal cell samples on FTA™ cards.

    Science.gov (United States)

    Raziel, Aliza; Dell'Ariccia-Carmon, Aviva; Zamir, Ashira

    2015-01-01

    PowerPlex® Y23 is a novel kit for Y-STR typing that includes new, highly discriminating loci. The Israel DNA Database laboratory has recently adopted it for routine Y-STR analysis. This study examined PCR amplification from a 1.2-mm FTA punch in reduced volumes of 5 and 10 μL. Direct amplification and washing of the FTA punches were examined with different PCR cycle numbers. One short, robotically performed wash was found to improve the quality and the percentage of profiles obtained. The optimal PCR cycle number was determined for the 5 and 10 μL reaction volumes. The percentage of obtained profiles, color balance, and reproducibility were examined. High-quality profiles were achieved in 90% and 88% of the samples amplified in 5 and 10 μL, respectively, on the first attempt. Volume reduction to 5 μL has a substantial economic impact, especially for DNA database laboratories. © 2014 American Academy of Forensic Sciences.

  20. Guangdong Aluminum to Raise RMB 3 billion for New Production Base in Guizhou

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    On July 7, a loan signing ceremony was held between the Guangdong Aluminum Group, China Construction Bank, Hua Xia Bank and the Guangzhou Bank Consortium. It is reported that these banks will provide Guangdong Aluminum Group with RMB 30 billion for an aluminum oxide and supporting bauxite mining project in Guizhou.

  1. Ultrarelativistic heavy ion collisions: the first billion seconds

    Energy Technology Data Exchange (ETDEWEB)

    Baym, Gordon

    2016-12-15

    I first review the early history of the ultrarelativistic heavy ion program, starting with the 1974 Bear Mountain Workshop and the 1983 Aurora meeting of the U.S. Nuclear Science Committee, just one billion seconds ago, which laid out the initial science goals of an ultrarelativistic collider. The primary goal, to discover the properties of nuclear matter at the highest energy densities, included finding new states of matter – the quark-gluon plasma primarily – and using collisions to open a new window on related problems of matter in cosmology, neutron stars, supernovae, and elsewhere. To bring out how the study of heavy ions and hot, dense matter in QCD has been fulfilling these goals, I concentrate on a few topics: the phase diagram of matter in QCD, and the connections of heavy ion physics to cold atoms, cosmology, and neutron stars.

  2. A Quantitative Analysis of the Relationship between Medicare Payment and Service Volume for Glaucoma Procedures from 2005 through 2009.

    Science.gov (United States)

    Gong, Dan; Jun, Lin; Tsai, James C

    2015-05-01

    To calculate the association between Medicare payment and service volume for 6 commonly performed glaucoma procedures. Retrospective, longitudinal database study. A 100% dataset of all glaucoma procedures performed on Medicare Part B beneficiaries within the United States from 2005 to 2009. Fixed-effects regression model using Medicare Part B carrier data for all 50 states and the District of Columbia, controlling for time-invariant carrier-specific characteristics, national trends in glaucoma service volume, Medicare beneficiary population, number of ophthalmologists, and income per capita. Payment-volume elasticities, defined as the percent change in service volume per 1% change in Medicare payment, were estimated for laser trabeculoplasty (Current Procedural Terminology [CPT] code 65855), trabeculectomy without previous surgery (CPT code 66170), trabeculectomy with previous surgery (CPT code 66172), aqueous shunt to reservoir (CPT code 66180), laser iridotomy (CPT code 66761), and scleral reinforcement with graft (CPT code 67255). The payment-volume elasticity was nonsignificant for 4 of the 6 procedures studied: laser trabeculoplasty (elasticity, -0.27; 95% confidence interval [CI], -1.31 to 0.77; P = 0.61), trabeculectomy without previous surgery (elasticity, -0.42; 95% CI, -0.85 to 0.01; P = 0.053), trabeculectomy with previous surgery (elasticity, -0.28; 95% CI, -0.83 to 0.28; P = 0.32), and aqueous shunt to reservoir (elasticity, -0.47; 95% CI, -3.32 to 2.37; P = 0.74). Two procedures yielded significant associations between Medicare payment and service volume. For laser iridotomy, the payment-volume elasticity was -1.06 (95% CI, -1.39 to -0.72): for every 1% decrease in CPT code 66761 payment, laser iridotomy service volume increased by 1.06%. For scleral reinforcement with graft, the payment-volume elasticity was -2.92 (95% CI, -5.72 to -0.12; P = 0.041): for every 1% decrease in CPT code 67255 payment, scleral reinforcement with graft service volume increased by 2.92%. This study calculated the association
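
    A payment-volume elasticity of this kind is commonly read off as the coefficient on log(payment) in a log-log regression with fixed effects. The Python sketch below is only an illustration of that structure: the data are simulated, the fixed effect is a generic carrier indicator, and the additional controls listed in the abstract (beneficiary population, number of ophthalmologists, income per capita, national trends) are omitted.

        # Hedged sketch of a log-log fixed-effects model: the coefficient on
        # log(payment) is the payment-volume elasticity.  The DataFrame is simulated.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n = 200
        df = pd.DataFrame({
            "carrier": rng.integers(0, 10, n).astype(str),
            "payment": rng.uniform(200, 600, n),
        })
        # Simulate volumes with a true elasticity of -1.0 plus noise
        df["volume"] = np.exp(5 - 1.0 * np.log(df["payment"]) + rng.normal(0, 0.1, n))

        model = smf.ols("np.log(volume) ~ np.log(payment) + C(carrier)", data=df).fit()
        print(model.params["np.log(payment)"])   # recovers an elasticity near -1.0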

  3. Hippocampal volumes are important predictors for memory function in elderly women

    Directory of Open Access Journals (Sweden)

    Adolfsdottir Steinunn

    2009-08-01

    Full Text Available Abstract Background Normal aging involves a decline in cognitive function that has been shown to correlate with volumetric change in the hippocampus, and with genetic variability in the APOE gene. In the present study we utilize 3D MR imaging, genetic analysis and assessment of verbal memory function to investigate relationships between these factors in a sample of 170 healthy volunteers (age range 46–77 years). Methods Brain morphometric analysis was performed with the automated segmentation workflow implemented in FreeSurfer. The APOE genotype was determined with polymerase chain reaction (PCR) on DNA from whole blood. All individuals were subjected to extensive neuropsychological testing, including the California Verbal Learning Test-II (CVLT). To obtain robust and easily interpretable relationships between explanatory variables and verbal memory function we applied the recent method of conditional inference trees in addition to scatterplot matrices and simple pairwise linear least-squares regression analysis. Results APOE genotype had no significant impact on the CVLT results (scores on long-delay free recall, CVLT-LD) or on the ICV-normalized hippocampal volumes. Hippocampal volumes were found to decrease with age, and a right-larger-than-left hippocampal asymmetry was also found. These findings are in accordance with previous studies. CVLT-LD score was shown to correlate with hippocampal volume. Multivariate conditional inference analysis showed that gender and left hippocampal volume largely dominated predictive values for CVLT-LD scores in our sample. Left hippocampal volume dominated predictive values for females but not for males. APOE genotype did not alter the model significantly, and age only partly influenced the results. Conclusion Gender and left hippocampal volumes are the main predictors of verbal memory function in normal aging. APOE genotype did not affect the results in any part of our analysis.

  4. Assessment of undiscovered oil and gas resources of the Central Burma Basin and the Irrawaddy-Andaman and Indo-Burman Geologic Provinces, Myanmar

    Science.gov (United States)

    Wandrey, Craig J.; Schenk, Christopher J.; Klett, Timothy R.; Brownfield, Michael E.; Charpentier, Ronald R.; Cook, Troy A.; Pollastro, Richard M.; Tennyson, Marilyn E.

    2012-01-01

    The Irrawaddy-Andaman and Indo-Burman Geologic Provinces were recently assessed for undiscovered technically recoverable oil, natural gas, and natural gas liquids resources as part of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment. Using a geology-based assessment methodology, the USGS estimated mean volumes of 2.3 billion barrels of oil, 79.6 trillion cubic feet of gas, and 2.1 billion barrels of natural gas liquids.

  5. INCOME PER BED AS A DETERMINANT OF HOSPITAL’S FINANCIAL LIQUIDITY

    Directory of Open Access Journals (Sweden)

    Agnieszka Bem

    2014-10-01

    Full Text Available Hospitals’ financial condition is very important for the availability and quality of inpatient health care services. Inpatient services consume an important part (about 50%) of National Health Fund resources in Poland, but the financial situation of hospitals is difficult, and many hospitals report problems with liquidity and solvency. The purpose of this research is to study the relationship between the intensity of care, measured by the annual income per bed, and the static liquidity ratios (current ratio and quick ratio). The research was conducted on a sample of 138 Polish hospitals, using data covering the period 2009-2011. In order to test the research hypotheses, statistical tools were used (Student's t-distribution). The study has shown that, during the analyzed period, liquidity ratios declined and that the level of financial liquidity of Polish hospitals is lower than recommended in the literature. The authors also confirmed the existence of a relationship between annual income per bed and the liquidity ratios. However, the most important finding is that the relationship between a hospital's income per bed and its financial liquidity ratios is positive only up to a certain level, which has been estimated at about 60,000-70,000 EUR per bed. Above this level, a further increase in income per bed decreases the liquidity ratios. This finding seems to be extremely important for health care managers, who usually strive for income maximization.
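
    The two static liquidity ratios used in the study are simple balance-sheet quotients. A minimal Python sketch with hypothetical balance-sheet figures is shown below; the threshold values quoted in the financial literature vary and are not taken from this paper.

        # Static liquidity ratios referenced in the abstract; inputs are hypothetical.
        def current_ratio(current_assets, current_liabilities):
            return current_assets / current_liabilities

        def quick_ratio(current_assets, inventory, current_liabilities):
            return (current_assets - inventory) / current_liabilities

        # Hypothetical hospital balance-sheet figures (thousand EUR)
        print(current_ratio(4200, 3000))       # 1.4
        print(quick_ratio(4200, 600, 3000))    # 1.2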

  6. A suspended-particle rosette multi-sampler for discrete biogeochemical sampling in low-particle-density waters

    Energy Technology Data Exchange (ETDEWEB)

    Breier, J. A.; Rauch, C. G.; McCartney, K.; Toner, B. M.; Fakra, S. C.; White, S. N.; German, C. R.

    2010-06-22

    To enable detailed investigations of early-stage hydrothermal plume formation and abiotic and biotic plume processes we developed a new oceanographic tool. The Suspended Particulate Rosette sampling system has been designed to collect geochemical and microbial samples from the rising portion of deep-sea hydrothermal plumes. It can be deployed on a remotely operated vehicle for sampling rising plumes, on a wire-deployed water rosette for spatially discrete sampling of non-buoyant hydrothermal plumes, or on a fixed mooring in a hydrothermal vent field for time series sampling. It has performed successfully during both its first mooring deployment at the East Pacific Rise and its first remotely operated vehicle deployments along the Mid-Atlantic Ridge. It is currently capable of rapidly filtering 24 discrete large-water-volume samples (30-100 L per sample) for suspended particles during a single deployment (e.g. >90 L per sample at 4-7 L per minute through 1 µm pore diameter polycarbonate filters). The Suspended Particulate Rosette sampler has been designed with a long-term goal of seafloor observatory deployments, where it can be used to collect samples in response to tectonic or other events. It is compatible with in situ optical sensors, such as laser Raman or visible reflectance spectroscopy systems, enabling in situ particle analysis immediately after sample collection and before the particles alter or degrade.

  7. High Efficiency, 100 mJ per pulse, Nd:YAG Oscillator Optimized for Space-Based Earth and Planetary Remote Sensing

    Science.gov (United States)

    Coyle, D. Barry; Stysley, Paul R.; Poulios, Demetrios; Fredrickson, Robert M.; Kay, Richard B.; Cory, Kenneth C.

    2014-01-01

    We report on a newly developed solid-state laser transmitter, designed and packaged for Earth and planetary space-based remote sensing applications, with high efficiency, low part count, high pulse-energy scalability and stability, and long life. Finally, we have completed a long-term operational test which surpassed 2 billion pulses with no measured decay in pulse energy.

  8. A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.

    Science.gov (United States)

    Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R

    2017-07-01

    The conventional radio-analytical technique used for the estimation of Pu isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly under repeat and incidental/emergency situations. Therefore, there is a need to reduce the analysis time for the estimation of Pu isotopes in bioassay samples. This paper gives the details of the standardization of a rapid method for the estimation of Pu isotopes in urine samples using a multi-purpose centrifuge and TEVA resin, followed by alpha spectrometry. The rapid method involves oxidation of the urine samples, co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high-volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and the activity estimated by alpha spectrometry, using ²³⁶Pu tracer recovery. Ten routine urine samples from radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean and standard deviation of 64.4% and 11.3%, respectively. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
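
    The ²³⁶Pu tracer recovery and the recovery-corrected activity estimate follow directly from the alpha spectrometry count rates. The sketch below illustrates that arithmetic; the counting time, detector efficiency, spike activity and count numbers are hypothetical, and the real procedure includes peak-area and background corrections not shown here.

        # Tracer recovery and recovery-corrected Pu activity from alpha counts.
        # All numerical inputs are hypothetical illustrations.
        def activity_bq(net_counts, count_time_s, efficiency):
            return net_counts / (count_time_s * efficiency)

        added_tracer_bq = 0.200                               # 236Pu spike added to the sample
        measured_tracer_bq = activity_bq(2100, 60000, 0.25)   # counts in the 236Pu region
        recovery = measured_tracer_bq / added_tracer_bq       # chemical recovery (~70% here)

        # Analyte activity in the Pu region of interest, corrected for recovery
        sample_pu_bq = activity_bq(180, 60000, 0.25) / recovery
        print(f"recovery = {recovery:.0%}, Pu activity = {sample_pu_bq * 1000:.1f} mBq")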

  9. Background levels of methane in Mars’ atmosphere show strong seasonal variations

    Science.gov (United States)

    Webster, Christopher R.; Mahaffy, Paul R.; Atreya, Sushil K.; Moores, John E.; Flesch, Gregory J.; Malespin, Charles; McKay, Christopher P.; Martinez, German; Smith, Christina L.; Martin-Torres, Javier; Gomez-Elvira, Javier; Zorzano, Maria-Paz; Wong, Michael H.; Trainer, Melissa G.; Steele, Andrew; Archer, Doug; Sutter, Brad; Coll, Patrice J.; Freissinet, Caroline; Meslin, Pierre-Yves; Gough, Raina V.; House, Christopher H.; Pavlov, Alexander; Eigenbrode, Jennifer L.; Glavin, Daniel P.; Pearson, John C.; Keymeulen, Didier; Christensen, Lance E.; Schwenzer, Susanne P.; Navarro-Gonzalez, Rafael; Pla-García, Jorge; Rafkin, Scot C. R.; Vicente-Retortillo, Álvaro; Kahanpää, Henrik; Viudez-Moreiras, Daniel; Smith, Michael D.; Harri, Ari-Matti; Genzer, Maria; Hassler, Donald M.; Lemmon, Mark; Crisp, Joy; Sander, Stanley P.; Zurek, Richard W.; Vasavada, Ashwin R.

    2018-06-01

    Variable levels of methane in the martian atmosphere have eluded explanation, partly because the measurements are not repeatable in time or location. We report in situ measurements at Gale crater made over a 5-year period by the Tunable Laser Spectrometer on the Curiosity rover. The background levels of methane have a mean value of 0.41 ± 0.16 parts per billion by volume (ppbv) (95% confidence interval) and exhibit a strong, repeatable seasonal variation (0.24 to 0.65 ppbv). This variation is greater than that predicted from either ultraviolet degradation of impact-delivered organics on the surface or the annual surface pressure cycle. The large seasonal variation in the background and the occurrence of higher temporary spikes (~7 ppbv) are consistent with small localized sources of methane released from martian surface or subsurface reservoirs.

  10. International Linear Collider Technical Design Report (Volumes 1 through 4)

    Energy Technology Data Exchange (ETDEWEB)

    Harrison M.

    2013-03-27

    The design report consists of four volumes: Volume 1, Executive Summary; Volume 2, Physics; Volume 3, Accelerator (Part I, R and D in the Technical Design Phase, and Part II, Baseline Design); and Volume 4, Detectors.

  11. Preliminary study for the determination of heavy metals in soil samples by GF-AAS Zeeman; Studio preliminare per la determinazione di metalli pesanti in campioni di suolo mediante analisi GF-AAS Zeeman

    Energy Technology Data Exchange (ETDEWEB)

    Casabianca, T.; Bitonte, R.; Epifani, M.; Ubaldi, C. [ENEA, Divisione Tecnologie Ingegneria e Servizi Ambientali, Centro Ricerche Trisaia, MT (Italy)

    2001-07-01

    In the framework of the SIMOA project, methods have been investigated to evaluate the level of soil contamination due to heavy metals. In this work, a procedure is discussed to measure the topsoil bioavailable fraction of seven heavy metals (Cd, Cu, Pb, Ni, Cr, Hg). The adopted procedure is based on acid digestion followed by instrumental detection by means of graphite furnace atomic absorption spectrophotometry (GFAAS), using the Zeeman effect to reduce the background contribution. Details of sample preparation and analysis, experimental setup optimization and statistical data analysis are presented, together with a discussion of the method's accuracy and of data interpretation. [Italian] Within the SIMOA project (Sistema Integrato di Monitoraggio Ambientale) for environmental monitoring in the Basento basin (Basilicata Region, Italy), methods for controlling the levels of soil pollution by heavy metals are being investigated. In the present work, a procedure is proposed for determining the concentration of the bioavailable fraction of seven heavy metals (cadmium, copper, lead, nickel, chromium, mercury) in samples of surface soil. The method is based on acid digestion in a microwave oven, followed by instrumental detection by graphite furnace atomic absorption spectrophotometry (GFAAS) with Zeeman background correction. The sample preparation steps, the measurement methodology and the statistical analysis of the data are described in detail, together with a discussion of the reliability of the method and of future developments.

  12. Fluorometric analysis for uranium in natural waters

    International Nuclear Information System (INIS)

    Waterbury, G.R.

    1977-01-01

    A fluorometric method is used for the routine determination of uranium at 0.2 to 10 parts-per-billion (ppb) concentrations in natural surface waters. Duplicate 200-μl aliquots of the water samples are pipetted onto 0.4-g pellets of 98 percent NaF-2 percent LiF flux contained in platinum dishes. The pellets are dried under heat lamps and fused over special propane burners. The fused pellets are subjected to ultraviolet radiation and the fluorescence is measured in a fluorometer. The lower limit of detection is 0.2 ppb of uranium, and the precision is about 15 relative percent in the 0.2 to 10 ppb uranium concentration range. Two analysts determine uranium in 750 to 900 samples per week using this method. Samples containing solids or more than 19 ppb of uranium are analyzed by a delayed neutron counting method.

  13. Quantifying landfill biogas production potential in the U.S.

    Science.gov (United States)

    This study presents an overview of the biogas (biomethane) availability in U.S. landfills, calculated from EPA estimates of landfill capacities. This survey concludes that the volume of landfill-derived methane in the U.S. is 466 billion cubic feet per year, of which 66 percent is collected and onl...

  14. Advanced Quadrupole Ion Trap Instrumentation for Low Level Vehicle Emissions Measurements

    International Nuclear Information System (INIS)

    McLuckey, S.A.

    1997-01-01

    Quadrupole ion trap mass spectrometry has been evaluated for its potential use in vehicle emissions measurements in vehicle test facilities as an analyzer for the top 15 compounds contributing to smog generation. A variety of ionization methods were explored, including ion trap in situ chemical ionization, atmospheric sampling glow discharge ionization, and nitric oxide chemical ionization in a glow discharge ionization source coupled with an ion trap mass spectrometer. Emphasis was placed on the determination of hydrocarbons and oxygenated hydrocarbons at parts per million to parts per billion levels. Ion trap in situ water chemical ionization and atmospheric sampling glow discharge ionization were both shown to be amenable to the analysis of arenes, alcohols, aldehydes and, to some degree, alkenes. Atmospheric sampling glow discharge also generated molecular ions of methyl-t-butyl ether (MTBE). Neither of these ionization methods, however, was found to generate diagnostic ions for the alkanes. Nitric oxide chemical ionization, on the other hand, was found to yield diagnostic ions for alkanes, alkenes, arenes, alcohols, aldehydes, and MTBE. The ability to measure a variety of hydrocarbons present at roughly 15 parts per billion at measurement rates of 3 Hz was demonstrated. All of the ions with potential to serve as parent ions in a tandem mass spectrometry experiment were found to yield parent-to-product conversion efficiencies greater than 75%. The flexibility afforded to the ion trap by use of tailored waveforms applied to the end-caps allows parallel monitoring schemes to be devised that provide many of the advantages of tandem mass spectrometry without major loss in measurement rate. A large loss in measurement rate would ordinarily result from the use of conventional tandem mass spectrometry experiments carried out in series for a large number of targeted components. These results have demonstrated that the ion trap has an excellent combination of

  15. Investigations of per- and polyfluorinated compounds in environmental samples and contemporary products

    Science.gov (United States)

    Due to the usefulness of per- and polyfluoro alkyl substances (PFAS), they are in many applications and products, such as fluoropolymer dispersions, fluoro-surfactants, paper/paperboard coatings, and aqueous film forming foam (AFFF). Recent regulatory pressure has altered the che...

  16. The problem of large samples. An activation analysis study of electronic waste material

    International Nuclear Information System (INIS)

    Segebade, C.; Goerner, W.; Bode, P.

    2007-01-01

    Large-volume instrumental photon activation analysis (IPAA) was used for the investigation of shredded electronic waste material. Sample masses from 1 to 150 grams were analyzed to obtain an estimate of the minimum sample size that must be taken to achieve a representativeness of the results that is satisfactory for a defined investigation task. Furthermore, the influence of irradiation and measurement parameters upon the quality of the analytical results was studied. Finally, the analytical data obtained from IPAA and instrumental neutron activation analysis (INAA), both carried out in a large-volume mode, were compared. Only part of the values were found to be in satisfactory agreement. (author)

  17. Orbital forcing of climate 1.4 billion years ago

    DEFF Research Database (Denmark)

    Zhang, Shuichang; Wang, Xiaomei; Hammarlund, Emma U

    2015-01-01

    Fluctuating climate is a hallmark of Earth. As one transcends deep into Earth time, however, both the evidence for and the causes of climate change become difficult to establish. We report geochemical and sedimentological evidence for repeated, short-term climate fluctuations from the exceptionally well-preserved ∼1.4-billion-year-old Xiamaling Formation of the North China Craton. We observe two patterns of climate fluctuations: On long time scales, over what amounts to tens of millions of years, sediments of the Xiamaling Formation record changes in geochemistry consistent with long-term changes ... reflect what appear to be orbitally forced changes in wind patterns and ocean circulation as they influenced rates of organic carbon flux, trace metal accumulation, and the source of detrital particles to the sediment.

  18. Mg II ABSORPTION CHARACTERISTICS OF A VOLUME-LIMITED SAMPLE OF GALAXIES AT z ∼ 0.1

    International Nuclear Information System (INIS)

    Barton, Elizabeth J.; Cooke, Jeff

    2009-01-01

    We present an initial survey of Mg II absorption characteristics in the halos of a carefully constructed, volume-limited subsample of galaxies embedded in the spectroscopic part of the Sloan Digital Sky Survey (SDSS). We observed quasars near sightlines to 20 low-redshift (z ∼ 0.1), luminous (M_r + 5 log h ≤ -20.5) galaxies in SDSS DR4 and DR6 with the LRIS-B spectrograph on the Keck I telescope. The primary systematic criteria for the targeted galaxies are a redshift z ≳ 0.1 and the presence of an appropriate bright background quasar within a projected 75 h^-1 kpc of its center, although we preferentially sample galaxies with lower impact parameters and slightly more star formation within this range. Of the observed systems, six exhibit strong (W_eq(2796) ≥ 0.3 Å) Mg II absorption at the galaxy's redshift, six systems have upper limits which preclude strong Mg II absorption, while the remaining observations rule out very strong (W_eq(2796) ≥ 1-2 Å) absorption. The absorbers fall at higher impact parameters than many non-absorber sightlines, indicating a covering fraction f_c below unity within the projected 75 h^-1 kpc (f_c ∼ 0.25). The data are consistent with a possible dependence of covering fraction and/or absorption halo size on the environment or star-forming properties of the central galaxy.

  19. Using Dried Blood Spot Sampling to Improve Data Quality and Reduce Animal Use in Mouse Pharmacokinetic Studies

    Science.gov (United States)

    Wickremsinhe, Enaksha R; Perkins, Everett J

    2015-01-01

    Traditional pharmacokinetic analysis in nonclinical studies is based on the concentration of a test compound in plasma and requires approximately 100 to 200 µL blood collected per time point. However, the total blood volume of mice limits the number of samples that can be collected from an individual animal—often to a single collection per mouse—thus necessitating dosing multiple mice to generate a pharmacokinetic profile in a sparse-sampling design. Compared with traditional methods, dried blood spot (DBS) analysis requires smaller volumes of blood (15 to 20 µL), thus supporting serial blood sampling and the generation of a complete pharmacokinetic profile from a single mouse. Here we compare plasma-derived data with DBS-derived data, explain how to adopt DBS sampling to support discovery mouse studies, and describe how to generate pharmacokinetic and pharmacodynamic data from a single mouse. Executing novel study designs that use DBS enhances the ability to identify and streamline better drug candidates during drug discovery. Implementing DBS sampling can reduce the number of mice needed in a drug discovery program. In addition, the simplicity of DBS sampling and the smaller numbers of mice needed translate to decreased study costs. Overall, DBS sampling is consistent with 3Rs principles by achieving reductions in the number of animals used, decreased restraint-associated stress, improved data quality, direct comparison of interanimal variability, and the generation of multiple endpoints from a single study. PMID:25836959

  20. Volume Measurements of Laser-generated Pits for In Situ Geochronology using KArLE (Potassium-Argon Laser Experiment)

    Science.gov (United States)

    French, R. A.; Cohen, B. A.; Miller, J. S.

    2014-01-01

    The Potassium-Argon Laser Experiment (KArLE) is composed of two main instruments: a spectrometer used for the Laser-Induced Breakdown Spectroscopy (LIBS) method and a mass spectrometer (MS). The LIBS laser ablates a sample and creates a plasma cloud, generating a pit in the sample. The LIBS plasma is measured for K abundance in weight percent, and the released gas is measured using the MS, which determines Ar abundance in moles. To relate the K and Ar measurements, the total mass of the ablated sample is needed but can be difficult to measure directly. Instead, density and volume are used to calculate mass, where density is calculated from the elemental composition of the rock (from the emission spectrum) and volume is determined from pit morphology. This study aims to reduce the uncertainty for KArLE by analyzing pit volume relationships in several analog materials and comparing methods of pit volume measurement and their associated uncertainties.
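
    As a rough illustration of the mass bookkeeping described above, the sketch below combines a pit volume, a composition-derived density, and the standard K-Ar age equation (Steiger and Jaeger 1977 decay constants). The numerical inputs are generic placeholders, not KArLE calibration values.

        import math

        # Hypothetical inputs (not KArLE data): pit volume from morphology, density
        # from the LIBS-derived composition, K in weight percent, Ar in moles.
        pit_volume_cm3 = 2.0e-5      # ablated pit volume, cm^3
        density_g_cm3 = 2.7          # density estimated from elemental composition
        k_wt_percent = 1.5           # K abundance from the LIBS spectrum, wt.%
        ar40_mol = 1.0e-13           # radiogenic 40Ar measured by the MS, mol

        # Mass of the ablated sample: mass = density * volume
        mass_g = density_g_cm3 * pit_volume_cm3

        # Moles of 40K in the ablated mass (K molar mass 39.0983 g/mol, 40K/K = 1.17e-4)
        k40_mol = mass_g * (k_wt_percent / 100.0) / 39.0983 * 1.17e-4

        # Conventional K-Ar age equation
        LAMBDA_TOTAL = 5.543e-10     # total 40K decay constant, 1/yr
        LAMBDA_EC = 0.581e-10        # decay constant for the branch producing 40Ar, 1/yr
        age_yr = (1.0 / LAMBDA_TOTAL) * math.log(
            1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * ar40_mol / k40_mol)
        print(f"ablated mass: {mass_g:.2e} g, model age: {age_yr / 1e9:.2f} Gyr")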

  1. Aneurysm coil embolization: cost per volumetric filling analysis and strategy for cost reduction.

    Science.gov (United States)

    Wang, Charlie; Ching, Esteban Cheng; Hui, Ferdinand K

    2016-05-01

    One of the primary device expenditures associated with the endovascular treatment of aneurysms is that of detachable coils. Analyzing the cost efficiency of detachable coils is difficult, given the differences in design, implantable volume, and the presence of additives. However, applying a volume-per-cost metric may provide an index analogous to the unit price found in grocery stores. The price information for 509 different coils belonging to 31 different coil lines, available as of September 2013, was obtained through the inventory management system at the study site and normalized to the price of the least expensive coil. These values were used to calculate the logarithmic ratio of volume over cost. Operator choice among coil sizes can vary the material costs by five-fold in a hypothetical aneurysm, and coil costs as a function of volume per cost can vary tremendously. Under the present pricing algorithms, selecting the longest available length at a particular helical dimension and system yields improved cost efficiency.
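
    A minimal sketch of the volume-per-cost index described above, treating each coil as a cylinder of its primary wind diameter and pushable length; the catalogue entries and prices below are invented placeholders, not the study's data.

        import math

        # Hypothetical coil catalogue: (name, primary wind diameter in mm,
        # length in cm, price normalized to the cheapest coil).
        coils = [
            ("coil_A", 0.010 * 25.4, 10.0, 1.0),
            ("coil_B", 0.010 * 25.4, 30.0, 1.4),   # same wind diameter, longer length
            ("coil_C", 0.012 * 25.4, 20.0, 1.8),
        ]

        for name, d_mm, length_cm, rel_price in coils:
            # implanted volume of a cylindrical wind: V = pi * (d/2)^2 * L
            volume_mm3 = math.pi * (d_mm / 2.0) ** 2 * (length_cm * 10.0)
            # logarithmic ratio of volume over cost, analogous to a unit price
            log_ratio = math.log10(volume_mm3 / rel_price)
            print(f"{name}: {volume_mm3:.2f} mm^3, log10(volume/cost) = {log_ratio:.2f}")

    On this toy catalogue the longer coil at a given wind diameter scores highest, in line with the efficiency observation above.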

  2. Energy tax price tag for CPI: $1.2 billion, jobs, and production

    International Nuclear Information System (INIS)

    Begley, R.

    1993-01-01

    If President Clinton's proposed energy tax had been fully in place last year, it would have cost the US chemical industry an additional $1.2 billion and 9,900 jobs, according to Chemical Manufacturers Association (CMA; Washington) estimates. It also would have driven output down 3% and prices up 5%, CMA says. Allen Lenz, CMA director/trade and economics, says the increase in production costs that would accompany the tax will not be shared by foreign competitors, cannot be neutralized with higher border taxes because of existing trade agreements, and provides another reason to move production offshore. Worse, the US chemical industry's generally impressive trade surplus declined by $2.5 billion last year, and a further drop is projected for this year. The margin of error gets thinner all the time as competition increases, Lenz says. We're not concerned only with the chemical industry, but the rest of US-based manufacturing, because they take half our output, he adds. One problem is the energy intensiveness of the chemical process industries: a CMA report says that 55% of the cost of producing ethylene glycol is energy related. And double taxation of such things as coproducts returned for credit to oil refineries could add up to $115 million/year, the report says.

  3. INEL Sample Management Office

    International Nuclear Information System (INIS)

    Watkins, C.

    1994-01-01

    The Idaho National Engineering Laboratory (INEL) Sample Management Office (SMO) was formed as part of the EG&G Idaho Environmental Restoration Program (ERP) in June 1990. Since then, the SMO has been recognized and sought out by other prime contractors and programs at the INEL. Since December 1991, the DOE-ID Division Directors for the Environmental Restoration Division and Waste Management Division have supported the expansion of the INEL ERP SMO into the INEL site-wide SMO. The INEL SMO serves as a point of contact for multiple environmental analytical chemistry and laboratory issues (e.g., capacity, capability). The SMO chemists work with project managers during planning to help develop data quality objectives, select appropriate analytical methods, identify special analytical services needs, identify a source for the services, and ensure that requirements for sampling and analysis (e.g., preservations, sample volumes) are clear and technically accurate. The SMO chemists also prepare work scope statements for the laboratories performing the analyses.

  4. 16 CFR Appendix F to Part 436 - Sample Item 20(5) Table-Projected New Franchised Outlets

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sample Item 20(5) Table-Projected New Franchised Outlets F Appendix F to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. F Appendix F to Part...

  5. 16 CFR Appendix C to Part 436 - Sample Item 20(2) Table-Transfers of Franchised Outlets

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sample Item 20(2) Table-Transfers of Franchised Outlets C Appendix C to Part 436 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DISCLOSURE REQUIREMENTS AND PROHIBITIONS CONCERNING FRANCHISING Pt. 436, App. C Appendix C to Part...

  6. A 17-billion-solar-mass black hole in a group galaxy with a diffuse core.

    Science.gov (United States)

    Thomas, Jens; Ma, Chung-Pei; McConnell, Nicholas J; Greene, Jenny E; Blakeslee, John P; Janish, Ryan

    2016-04-21

    Quasars are associated with and powered by the accretion of material onto massive black holes; the detection of highly luminous quasars with redshifts greater than z = 6 suggests that black holes of up to ten billion solar masses already existed 13 billion years ago. Two possible present-day 'dormant' descendants of this population of 'active' black holes have been found in the galaxies NGC 3842 and NGC 4889 at the centres of the Leo and Coma galaxy clusters, which together form the central region of the Great Wall--the largest local structure of galaxies. The most luminous quasars, however, are not confined to such high-density regions of the early Universe; yet dormant black holes of this high mass have not yet been found outside of modern-day rich clusters. Here we report observations of the stellar velocity distribution in the galaxy NGC 1600--a relatively isolated elliptical galaxy near the centre of a galaxy group at a distance of 64 megaparsecs from Earth. We use orbit superposition models to determine that the black hole at the centre of NGC 1600 has a mass of 17 billion solar masses. The spatial distribution of stars near the centre of NGC 1600 is rather diffuse. We find that the region of depleted stellar density in the cores of massive elliptical galaxies extends over the same radius as the gravitational sphere of influence of the central black holes, and interpret this as the dynamical imprint of the black holes.

  7. Introduction to Part 3

    DEFF Research Database (Denmark)

    Petersen, Nils Holger

    2015-01-01

    A brief contextualising discussion of Western Music History and its relations to Theological Aesthetical Thought since Carolingian Times as an introduction to 3 music articles in Part 3 of the volume.

  8. Uranium hydrogeochemical and stream sediment reconnaissance of the Cheyenne NTMS Quadrangle, Wyoming

    International Nuclear Information System (INIS)

    Trexler, P.K.

    1978-06-01

    Between June 1976 and October 1977, 1138 water and 600 sediment samples were systematically collected from 1498 locations in the Cheyenne NTMS quadrangle of southeast Wyoming. The samples were analyzed for total uranium at the Los Alamos Scientific Laboratory. The uranium concentration in waters ranged from 0.01 to 296.30 parts per billion (ppB), with a median of 3.19 ppB and a mean of 8.34 ppB. The uranium in sediments ranged from 0.8 to 83.0 parts per million (ppM) with a median of 3.4 ppM and a mean of 4.5 ppM. Arbitrary anomaly thresholds were selected to isolate those water and sediment samples containing uranium concentrations above those of 98% of the population sampled. Using this procedure, 23 water samples above 54.50 ppB and 12 sediment samples above 14.0 ppM were considered anomalous. Several areas appear favorable for further investigation for possible uranium mineralization. High uranium concentrations were detected in waters from the northeast corner of the Cheyenne quadrangle. High uranium concentrations were detected in sediments from locations in the southern and central Laramie Mountains and along the southeast and east-central edges of the study area
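
    The 98% anomaly cut described above amounts to a single percentile computation; the concentration arrays below are random stand-ins for the measured uranium values, not the survey data.

        import numpy as np

        # Placeholder concentrations (ppB for waters, ppM for sediments).
        rng = np.random.default_rng(0)
        water_u_ppb = rng.lognormal(mean=1.2, sigma=1.0, size=1138)
        sediment_u_ppm = rng.lognormal(mean=1.2, sigma=0.6, size=600)

        # Anomaly threshold: the concentration exceeded by only 2% of the samples.
        water_threshold = np.percentile(water_u_ppb, 98)
        sediment_threshold = np.percentile(sediment_u_ppm, 98)

        n_anomalous = int(np.sum(water_u_ppb > water_threshold))
        print(f"water threshold: {water_threshold:.2f} ppB, {n_anomalous} anomalous samples")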

  9. Nuestras cuentas diarias: Matematicas. Primaria para adultos, Primera parte, Volumens 1 y 2. Edicion Experimental (Our Daily Accounting: Mathematics. Primer for Adults, Part One, Volumes 1 and 2. Experimental Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    These workbooks are part of a Mexican series of instructional materials designed for Spanish speaking adults who are in the process of becoming literate or have recently become literate in their native language. The workbooks, divided in two volumes, are designed to teach skills required in managing ordinary financial transactions and daily tasks…

  10. The energy sector abroad. Part 5. Norwegian energy sector large exporter of natural gas

    International Nuclear Information System (INIS)

    Van Gelder, J.W.

    1997-01-01

    Some facts about the Norwegian natural gas and oil industry are presented. In 1995 the industry accounted for 12.5% of GNP and no less than 47.6% of export revenues. The use of natural gas within Norway is low. In 1996 Norway exported 37.9 billion m3 of natural gas, and it is planned to double that volume within the next 10 years. Therefore, a strategic alliance between two major foreign competitors (Gasunie in the Netherlands and Gazprom in the Russian Federation) was not met with enthusiasm. The three most important companies in the Norwegian oil and gas industry are Statoil, Norsk Hydro, and Saga Petroleum. Overall turnover of the sector in 1994 was 40.6 billion Dutch guilders. Some 17,500 people are directly employed by the sector. 5 ills., 5 tabs

  11. Neutron multicounter detector for investigation of content and spatial distribution of fission materials in large volume samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1998-01-01

    The experimental device is a neutron coincidence well counter. It can be applied for passive assay of fissile materials, especially plutonium-bearing materials. It consists of a set of 3He tubes placed inside a polyethylene moderator; outputs from the tubes, first processed by preamplifier/amplifier/discriminator circuits, are then analysed using a neutron correlator connected to a PC, with correlation techniques implemented in software. Such a neutron counter allows determination of plutonium mass (240Pu effective mass) in non-multiplying samples of fairly large volume (up to 0.14 m3). For determination of the neutron source distribution inside the sample, heuristic methods based on hierarchical cluster analysis are applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transform of the count-profile matrices, for known point-source distributions and for the examined samples, are taken. Such matrices are collected by scanning the sample with the detection head. During the clustering process, count profiles for unknown samples are fitted into dendrograms using a 'proximity' criterion between the examined sample profile and the standard sample profiles. The distribution of neutron sources in an examined sample is then evaluated on the basis of comparison with standard source distributions. (author)
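
    A schematic of the profile-matching step, assuming each scan yields a small 2-D matrix of count rates per detector-head position. The amplitudes and phases of its 2-D Fourier transform serve as features, which are clustered hierarchically together with the profiles of known point-source layouts; the array sizes and linkage choice are illustrative assumptions, not the authors' exact processing chain.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def profile_features(counts_2d):
            """Amplitudes and phases of the 2-D Fourier transform of a count-profile matrix."""
            spectrum = np.fft.fft2(counts_2d)
            return np.concatenate([np.abs(spectrum).ravel(), np.angle(spectrum).ravel()])

        # Placeholder scans: count-profile matrices for known point-source layouts
        # (standards) plus one examined sample, each an 8 x 8 grid of count rates.
        rng = np.random.default_rng(1)
        standards = [rng.poisson(50, size=(8, 8)).astype(float) for _ in range(5)]
        examined = rng.poisson(50, size=(8, 8)).astype(float)

        features = np.array([profile_features(m) for m in standards + [examined]])

        # Agglomerative clustering; the examined sample is assigned to the cluster
        # of the standard distribution whose profile it is most 'proximate' to.
        tree = linkage(features, method="average")
        labels = fcluster(tree, t=2, criterion="maxclust")
        print("cluster labels (last entry = examined sample):", labels)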

  12. Part-time work among pediatricians expands.

    Science.gov (United States)

    Cull, William L; O'Connor, Karen G; Olson, Lynn M

    2010-01-01

    The objective of this study was to track trends in part-time employment among pediatricians from 2000 to 2006 and to examine differences within subgroups of pediatricians. As part of the Periodic Survey of Fellows, national random samples of American Academy of Pediatrics members were surveyed in 2000, 2003, and 2006. These surveys shared questions concerning working part-time and other practice characteristics. Roughly 1600 pediatricians were included in each random sample. Totals of 812 (51%), 1020 (63%), and 1013 (62%) pediatricians completed the surveys in 2000, 2003, and 2006, respectively. Analyses were limited to nonretired, posttrainee pediatricians. The number of pediatricians who reported that they work part-time increased from 15% in 2000, to 20% in 2003, to 23% in 2006. The pattern of increased part-time work from 2000 to 2006 held for many subgroups, including men, women, pediatricians younger than 40 years, pediatricians aged 50 years or older, pediatricians who worked in an urban inner city, pediatricians who worked in suburban areas, general pediatricians, and subspecialist pediatricians. Those who were working part-time were more satisfied with their professional and personal activities. Part-time pediatricians worked on average 14.3 fewer hours per week in direct patient care. Increases in part-time work are apparent throughout pediatrics. The possible continued growth of part-time work is an important trend within the field of pediatrics that will need to be monitored.

  13. Parts per billion-level detection of benzene using SnO2/graphene nanocomposite composed of sub-6 nm SnO2 nanoparticles.

    Science.gov (United States)

    Meng, Fan-Li; Li, Hui-Hua; Kong, Ling-Tao; Liu, Jin-Yun; Jin, Zhen; Li, Wei; Jia, Yong; Liu, Jin-Huai; Huang, Xing-Jiu

    2012-07-29

    In the present work, a SnO2/graphene nanocomposite composed of 4-5 nm SnO2 nanoparticles was synthesized using a simple wet chemical method for ppb-level detection of benzene. The formation mechanism of the nanocomposite was investigated systematically by means of simultaneous thermogravimetric analysis, X-ray diffraction, and X-ray photoelectron spectroscopy combined with transmission electron microscopy observations. The SnO2/graphene nanocomposite showed a very attractive, improved sensitivity to toxic volatile organic compounds, especially to benzene, compared with traditional SnO2. The responses of the nanocomposite to benzene were slightly higher than those to ethanol, and the detection limit for benzene reached 5 ppb, which is, to the best of our knowledge, far lower than those reported previously.

  14. Parts per billion-level detection of benzene using SnO2/graphene nanocomposite composed of sub-6 nm SnO2 nanoparticles

    International Nuclear Information System (INIS)

    Meng Fanli; Li Huihua; Kong Lingtao; Liu Jinyun; Jin Zhen; Li Wei; Jia Yong; Liu Jinhuai; Huang Xingjiu

    2012-01-01

    Graphical abstract: SnO2/graphene nanocomposite composed of 4-5 nm SnO2 nanoparticles was synthesized by a one-step wet chemical method and the formation mechanism of the nanocomposite is clearly interpreted. The detection limit of the nanocomposite was as low as 5 ppb for toxic benzene. Highlights: ► We synthesized a SnO2/graphene nanocomposite using a simple one-step wet chemical method. ► The nanocomposite is composed of 4-5 nm SnO2 nanoparticles. ► Toxic benzene was detected by this kind of nanocomposite. ► The detection limit for toxic benzene was as low as 5 ppb. - Abstract: In the present work, the SnO2/graphene nanocomposite composed of 4-5 nm SnO2 nanoparticles was synthesized using a simple wet chemical method for ppb-level detection of benzene. The formation mechanism of the nanocomposite was investigated systematically by means of simultaneous thermogravimetric analysis, X-ray diffraction, and X-ray photoelectron spectroscopy combined with transmission electron microscopy observations. The SnO2/graphene nanocomposite showed a very attractive, improved sensitivity to toxic volatile organic compounds, especially to benzene, compared with traditional SnO2. The responses of the nanocomposite to benzene were slightly higher than those to ethanol, and the detection limit for benzene reached 5 ppb, which is, to the best of our knowledge, far lower than those reported previously.

  15. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source references. Part 2. Bibliography for treatment, storage, disposal and transportation regulatory constraints

    Energy Technology Data Exchange (ETDEWEB)

    Jolley, R.L.; Rodgers, B.R.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Federal, state, and local regulations affect the decision process for selecting technology applications. Regulations may favor a particular technology and may prevent application of others. Volume 3, part 2 presents abstracts of the regulatory constraint documents that relate to all phases of LLRW management (e.g., treatment, packaging, storage, transportation, and disposal).

  16. Missing billions. How the Australian government's climate policy is penalising farmers

    International Nuclear Information System (INIS)

    Riguet, T.

    2006-10-01

    The Climate Institute analysis suggests ratifying the Kyoto Protocol and implementing a national emissions trading scheme today could provide Australian farmers with an income of $1.8 billion over the period 2008-2012, due to the emissions saved by limiting land clearing. Separately, a report to the National Farmers Federation by the Allen Consulting Group earlier this year concluded that a carbon emission trading system which recognised Kyoto Protocol rules could create an additional income stream of $0.7-0.9 billion over a five-year period from revenue to farmers from forestry sinks. These two studies suggest that ratification of the Kyoto Protocol and the introduction of a national emissions trading scheme could provide farmers with an income stream on the order of $2.5 billion. A central tenet of the Federal Government's greenhouse policy for over a decade has been to not ratify Kyoto, but to meet its Kyoto target - a national emissions increase of 8% from 1990 levels, in the period 2008-2012. Australia's National Greenhouse Gas Accounts show that farmers, by reducing land clearing rates since 1990, have offset substantial increases in greenhouse gas emissions from other sectors, mainly energy. Official Federal Government projections show that without land clearing reductions, Australia's greenhouse emissions would be 30% above 1990 levels by 2010. Australia's farmers have been responsible for virtually the entire share of the nation's greenhouse gas emissions reductions, but their efforts, worth around $2 billion, have not been recognised or financially rewarded by the Government. By reducing land clearing, farmers have already reduced greenhouse gas emissions by about 75 million tonnes since 1990. By 2010, the savings are projected to be about 83 million tonnes. This level of emissions reductions is equivalent to eliminating the total annual emissions of New Zealand or Ireland. Over that same period, emissions from energy and transport have skyrocketed and continue to do so.

  17. Epidural anesthesia, hypotension, and changes in intravascular volume

    DEFF Research Database (Denmark)

    Holte, Kathrine; Foss, Nicolai B; Svensén, Christer

    2004-01-01

    BACKGROUND: The most common side effect of epidural or spinal anesthesia is hypotension with functional hypovolemia prompting fluid infusions or administration of vasopressors. Short-term studies (20 min) in patients undergoing lumbar epidural anesthesia suggest that plasma volume may increase when ... receiving hydroxyethyl starch. RESULTS: Plasma volume did not change per se after thoracic epidural anesthesia despite a decrease in blood pressure. Plasma volume increased with fluid administration but remained unchanged with vasopressors despite that both treatments had similar hemodynamic effects ... constant was 56 ml/min. CONCLUSIONS: Thoracic epidural anesthesia per se does not lead to changes in blood volumes despite a reduction in blood pressure. When fluid is infused, there is a dilution, and the fluid initially seems to be located centrally. Because administration of hydroxyethyl starch ...

  18. Rapid surface sampling and archival record system

    Energy Technology Data Exchange (ETDEWEB)

    Barren, E.; Penney, C.M.; Sheldon, R.B. [GE Corporate Research and Development Center, Schenectady, NY (United States)] [and others]

    1995-10-01

    A number of contamination sites exist in this country where the area and volume of material to be remediated are very large, approaching or exceeding 10^6 m^2 and 10^6 m^3. Typically, only a small fraction of this material is actually contaminated. In such cases there is a strong economic motivation to test the material with a sufficient density of measurements to identify which portions are uncontaminated, so that they can be left in place or be disposed of as uncontaminated waste. Unfortunately, since contamination often varies rapidly from position to position, this procedure can involve upwards of one million measurements per site. The situation is complicated further in many cases by the difficulties of sampling porous surfaces, such as concrete. This report describes a method for sampling concretes in which an immediate distinction can be made between contaminated and uncontaminated surfaces. Sample acquisition and analysis will be automated.

  19. The Scrap Collection per Industry Sector and the Circulation Times of Steel in the U.S. between 1900 and 2016, Calculated Based on the Volume Correlation Model

    Directory of Open Access Journals (Sweden)

    Alicia Gauffin

    2018-05-01

    On the basis of the Volume Correlation Model (VCM) as well as data on steel consumption and scrap collection per industry sector (construction, automotive, industrial goods, and consumer goods), it was possible to estimate service lifetimes of steel in the United States between 1900 and 2016. Input data on scrap collection per industry sector were based on a scrap survey conducted by the World Steel Association for a static year, 2014, in the United States. The lifetimes of steel calculated with the VCM method were within the range of previously reported measured lifetimes of products and applications for all industry sectors. Scrapped (and apparent) lifetimes of steel compared with measured lifetimes were calculated to be as follows: for the construction sector, a scrapped lifetime of 29 years (apparent lifetime: 52 years) compared with 44 years measured in 2014; industrial goods, 16 (27) years compared with 19 years measured in 2010; consumer goods, 12 (14) years compared with 13 years measured in 2014; automotive sector, 14 (19) years compared with 17 years measured in 2011. The results show that the VCM can estimate reasonable values of scrap collection and availability per industry sector over time.
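
    A toy illustration of the volume-correlation idea, under the simplifying assumption that the service lifetime of a sector is the lag at which cumulative scrap collection best matches earlier cumulative consumption; the series are synthetic and the matching rule is a simplification of the published VCM.

        import numpy as np

        def estimated_lifetime(consumption, scrap, max_lag=60):
            """Lag (in years) minimizing the mismatch between cumulative scrap
            and cumulative consumption shifted by that lag."""
            cum_cons, cum_scrap = np.cumsum(consumption), np.cumsum(scrap)
            best_lag, best_err = 0, np.inf
            for lag in range(1, max_lag):
                n = len(cum_scrap) - lag
                if n <= 0:
                    break
                err = np.mean((cum_scrap[lag:] - cum_cons[:n]) ** 2)
                if err < best_err:
                    best_lag, best_err = lag, err
            return best_lag

        # Synthetic sector data: scrap echoes consumption 30 years later.
        years = np.arange(1900, 2017)
        consumption = np.linspace(1.0, 10.0, len(years))
        scrap = np.concatenate([np.zeros(30), consumption[:-30]])

        print("estimated service lifetime:", estimated_lifetime(consumption, scrap), "years")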

  20. Microstructure-Based Counterfeit Detection in Metal Part Manufacturing

    Science.gov (United States)

    Dachowicz, Adam; Chaduvula, Siva Chaitanya; Atallah, Mikhail; Panchal, Jitesh H.

    2017-11-01

    Counterfeiting in metal part manufacturing has become a major global concern. Although significant effort has been made toward detecting such counterfeits, modern approaches suffer from high expense during production, invasiveness during manufacture, and unreliability in practice if parts are damaged during use. In this paper, a practical microstructure-based counterfeit detection methodology is proposed, which draws on the inherent randomness present in the microstructure as a result of the manufacturing process. An optical Physically Unclonable Function (PUF) protocol is developed which takes a micrograph as input and outputs a compact, unique string representation of the micrograph. The uniqueness of the outputs and their robustness to moderate wear and tear are demonstrated by applying the methodology to brass samples. The protocol is shown to have good discriminatory power even between samples manufactured in the same batch, and it runs on the order of several seconds per part on inexpensive machines.
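
    A toy analogue of the micrograph-to-string step: a grayscale micrograph is reduced to a block-average bit string (a perceptual-hash-style fingerprint) and two parts are compared by Hamming distance. This stands in for, and is much simpler than, the optical PUF protocol developed in the paper; the images are random placeholders.

        import numpy as np

        def micrograph_fingerprint(image, grid=16):
            """Compact bit string derived from a 2-D grayscale micrograph."""
            h, w = image.shape
            bh, bw = h // grid, w // grid
            # mean intensity over a grid x grid tiling of the image
            blocks = image[:bh * grid, :bw * grid].reshape(grid, bh, grid, bw).mean(axis=(1, 3))
            bits = (blocks > np.median(blocks)).astype(np.uint8).ravel()
            return "".join(map(str, bits))

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        # Placeholder micrographs: a genuine part, a re-imaged (slightly noisy) copy,
        # and an unrelated part standing in for a counterfeit.
        rng = np.random.default_rng(2)
        genuine = rng.random((512, 512))
        reimaged = genuine + 0.02 * rng.standard_normal((512, 512))
        counterfeit = rng.random((512, 512))

        fp = micrograph_fingerprint(genuine)
        print("re-imaged distance:  ", hamming(fp, micrograph_fingerprint(reimaged)))
        print("counterfeit distance:", hamming(fp, micrograph_fingerprint(counterfeit)))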

  1. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    Science.gov (United States)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of leading or strongest publicly-traded two thousand companies in the world (G-2000) based on four independent metrics: sales or revenues, profits, assets and market value. Every one of these wealth metrics yields particular information on the corporate size or wealth size of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part, and a Pareto power-law in the higher part. These two-class structure per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto is about 49% in sales per employee, and 33% after averaging on the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be adjusted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms in 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, then the 82 points of the aggregate wealth distribution by industry per employee can be well adjusted by quasi-exponential curves for the four metrics.
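
    A hedged sketch of how such a two-class fit can be checked on a per-employee wealth sample: a Gamma fit to the bulk (a log-normal could be compared the same way) and a Hill-type maximum-likelihood estimate of the Pareto exponent for the upper tail. The synthetic sample and the 90th-percentile tail cut are assumptions for illustration, not the Forbes G-2000 data.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        # Synthetic per-employee wealth sample: gamma-like bulk plus a Pareto tail.
        bulk = rng.gamma(shape=1.2, scale=1.0, size=1500)
        tail = (rng.pareto(a=1.5, size=500) + 1.0) * np.quantile(bulk, 0.95)
        wealth = np.concatenate([bulk, tail])

        # Bulk: Gamma fit below the cut separating the two classes.
        cut = np.quantile(wealth, 0.90)
        shape, loc, scale = stats.gamma.fit(wealth[wealth <= cut], floc=0)

        # Tail: Hill estimator of the density exponent alpha, with p(x) ~ x**(-alpha).
        x_tail = wealth[wealth > cut]
        alpha = 1.0 + x_tail.size / np.sum(np.log(x_tail / cut))

        print(f"gamma bulk fit: shape={shape:.2f}, scale={scale:.2f}")
        print(f"Pareto tail exponent: {alpha:.2f}, tail fraction: {x_tail.size / wealth.size:.2f}")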

  2. Is there a role for anterior zone sampling as part of saturation trans-rectal ultrasound guided prostate biopsy?

    Science.gov (United States)

    Cole, Eric; Margel, David; Greenspan, Michael; Shayegan, Bobby; Matsumoto, Edward; Fischer, Marc A; Patlas, Michael; Daya, Dean; Pinthus, Jehonathan H

    2014-05-03

    The prostatic anterior zone (AZ) is not targeted routinely by TRUS-guided prostate biopsy (TRUS-Pbx). MRI is an accurate diagnostic tool for AZ tumors, but is often unavailable due to cost or system restrictions. We examined the diagnostic yield of office-based AZ TRUS-Pbx. 127 men at risk for AZ tumors were studied: patients with elevated PSA and previous extended negative TRUS-Pbx (group 1, n = 78) and actively surveilled low-risk prostate cancer patients (group 2, n = 49). None of the participants had a previous AZ biopsy. The biopsy template included suspicious ultrasonic areas, 16 peripheral zone (PZ), 4 transitional zone (TZ), and 6 AZ cores. All biopsies were performed by a single urologist under local peri-prostatic anaesthetic, using the B-K Medical US System, an end-firing 4-12 MHz probe, and an 18 ga/25 cm needle. All samples were reviewed by a single specialized uro-pathologist. Multivariate analysis was used to detect predictors of AZ tumors, accounting for age, PSA, PSA density, prostate volume, BMI, and number of previous biopsies. Median PSA was 10.4 (group 1) and 7.3 (group 2). Age (63.9, 64.5), number of previous biopsies (1.5), number of cores (17.8, 21.3), and prostate volume (56.4 cc, 51 cc) were similar for both groups. The overall diagnostic yield was 34.6% (group 1) and 85.7% (group 2). AZ cancers were detected in 21.8% (group 1) and 34.7% (group 2) but were rarely the only zone involved (1.3% and 4.1%, respectively). Gleason ≥ 7 AZ cancers were often accompanied by equal-grade PZ tumors. In multivariate analysis only prostate volume predicted for AZ tumors. Patients detected with AZ tumors had significantly smaller prostates (36.9 cc vs. 61.1 cc, p < 0.001). Suspicious AZ ultrasonic findings were uncommon (6.3%). TRUS-Pbx AZ sampling rarely improves the diagnostic yield of extended PZ sampling in patients with elevated PSA and previous negative biopsies. In low-risk prostate cancer patients who are followed by active surveillance, AZ sampling changes risk

  3. The in vivo estrogenic and in vitro anti-estrogenic activity of permethrin and bifenthrin

    OpenAIRE

    Brander, Susanne M.; He, Guochun; Smalling, Kelly L.; Denison, Michael S.; Cherr, Gary N.

    2012-01-01

    Pyrethroids are highly toxic to fish at parts per billion or parts per trillion concentrations. Their intended mechanism is prolonged sodium channel opening, but recent studies reveal that pyrethroids such as permethrin and bifenthrin also have endocrine activity. Additionally, metabolites may have greater endocrine activity than parent compounds. We evaluated the in vivo concentration-dependent ability of bifenthrin and permethrin to induce choriogenin (an estrogen-responsive protein) in Men...

  4. Assessment of reagent effectiveness and preservation methods for equine faecal samples

    Directory of Open Access Journals (Sweden)

    Eva Vavrouchova

    2015-03-01

    The aim of our study was to identify the most suitable flotation solution and the most effective preservation method for the examination of equine faecal samples using the FLOTAC technique. Samples from naturally infected horses were transported to the laboratory and analysed accordingly. The sample from each horse was homogenized and divided into four parts: one was frozen, another two were preserved in different reagents, sodium acetate-acetic-acid-formalin (SAF) or 5% formalin, and the last part was examined as a fresh sample in three different flotation solutions (Sheather's solution, sodium chloride solution, and sodium nitrate solution, all with a specific gravity of 1.200). The preserved samples were examined 14 to 21 days after collection. According to our results, the sucrose solution was the most suitable flotation solution for fresh samples (small strongyle eggs per gram: 706, compared with 360 in sodium chloride and 507 in sodium nitrate) and the sodium nitrate solution was the most efficient for the preserved samples (eggs per gram: 382, compared with 295 in the salt solution and 305 in the sucrose solution). Freezing appears to be the most effective method of sample preservation, resulting in minimal damage to fragile strongyle eggs; it is therefore the simplest and most effective preservation method for the examination of large numbers of faecal samples without the necessity of examining them all within 48 hours of collection. To our knowledge, deep freezing as a preservation method for equine faecal samples has not yet been described in the literature.

  5. Plate tectonic influences on Earth's baseline climate: a 2 billion-year record

    Science.gov (United States)

    McKenzie, R.; Evans, D. A.; Eglington, B. M.; Planavsky, N.

    2017-12-01

    Plate tectonic processes exert strong influences on the long-term carbon cycle, and thus on global climate. Here we utilize multiple aspects of the geologic record to assess the role plate tectonics has played in driving major icehouse-greenhouse transitions over the past 2 billion years. Refined paleogeographic reconstructions allow us to quantitatively assess the area of continents in various latitudinal belts throughout this interval. From these data we are able to test the hypothesis that concentrating continental masses in low latitudes will drive cooler climates due to increased silicate weathering. We further superimpose records of events that are believed to increase the 'weatherability' of the crust, such as large igneous province emplacement, island-arc accretion, and continental collisional belts. Climatic records are then compared with global detrital zircon U-Pb age data as a proxy for continental magmatism. Our results show a consistent relationship between zircon-generating magmatism and icehouse-greenhouse transitions for more than 2 billion years, whereas paleogeographic records show no clear, consistent relationship between continental configurations and prominent climate transitions. Volcanic outgassing appears to exert a first-order control on major baseline climatic shifts; however, paleogeography likely plays an important role in the magnitude of this change. Notably, climatic extremes, such as the Cryogenian icehouse, occur during a combination of reduced volcanism and end-member concentrations of low-latitude continents.

  6. Portable sensor for hazardous waste. Topical report, October 1, 1993--September 30, 1994

    International Nuclear Information System (INIS)

    1997-01-01

    The authors describe an innovative technique to detect hazardous materials at sub part-per-billion levels. The approach exploits active nitrogen energy-transfer (ANET) to excite atomic and molecular fluorescence characteristic of various hazardous species. ANET excitation is very state specific, generating simple spectra that are easily detected with instrumentation of modest resolution. Typical spectral features include 254 nm emission from Hg, 388 and 420 nm emission from CN when organics are sampled, and 278 nm emission from CCl when chlorinated organics are sampled. They also observe several broadbands between 450 and 540 nm where uranium compounds are added to the D-B discharge region. They attribute this spectrum to electronic transitions of uranium oxide, probably UO. Additionally, they have used ANET to detect a number of heavy metals such as Cr, Se, Cd, Pb, and Cu. Dielectric-barrier (D-B) discharge technology generates the active nitrogen. This approach affords atmospheric-pressure operation, fluorescence excitation in gaseous, particulate, and aqueous sample matrices, and is amenable to field operation because the discharge and associated electronics are compact and can be powered by 12V batteries. This report details the results of the first phase of a three and a half year program designed to develop a portable monitor for sensitive hazardous waste detection. The ultimate goal of the program is to develop the concept to the prototype instrument level. In this first phase they have demonstrated the applicability of the ANET technology to a variety of hazardous species, and have determined detection sensitivity limits for Hg, Se, organics, and chlorinated organics to be at part-per-billion levels or below

  7. Portable sensor for hazardous waste. Topical report, October 1993--September 1994

    International Nuclear Information System (INIS)

    Piper, L.G.

    1994-10-01

    We describe an innovative technique to detect hazardous materials at sub part-per-billion levels. Our approach exploits active nitrogen energy-transfer (ANET) to excite atomic and molecular fluorescence characteristic of various hazardous species. ANET excitation is very state specific, generating simple spectra that are easily detected with instrumentation of modest resolution. Typical spectral features include 254 nm emission from Hg, 388 and 420 nm emission from CN when organics are sampled, and 278 nm emission from CCl when chlorinated organics are sampled. We also observe several broadbands between 450 and 540 nm where uranium compounds are added to the D-B discharge region. We attribute this spectrum to electronic transitions of uranium oxide, probably UO. Additionally, we have used ANET to detect a number of heavy metals such as Cr, Se, Cd, Pb, and Cu. Dielectric-barrier (D-B) discharge technology generates the active nitrogen. This approach affords atmospheric-pressure operation, fluorescence excitation in gaseous, particulate, and aqueous sample matrices, and is amenable to field operation because the discharge and associated electronics are compact and can be powered by 12V batteries. This report details the results of the first phase of a three and a half year program designed to develop a portable monitor for sensitive hazardous waste detection. The ultimate goal of the program is to develop our concept to the prototype instrument level. In this first phase we have demonstrated the applicability of the ANET technology to a variety of hazardous species, and have determined detection sensitivity limits for Hg, Se, organics, and chlorinated organics to be at part-per-billion levels or below

  8. Prospection for natural 231Pa in India

    International Nuclear Information System (INIS)

    Anupama, P.; Gantayet, L.M.; Verma, R.; Parthasarathy, R.; Anil Kumar, S.; Dingankar, M.V.; Ghosh, S.K.; Patra, R.N.

    2001-08-01

    Protactinium-231 (231Pa) occurs in nature as a member of the decay chain of naturally occurring 235U of the 4n+3 radioactive series. The expected protactinium concentration in the Jaduguda ore body (with a uranium concentration of 0.03-0.06%) is around 0.2 parts per billion (ppb) and that in monazite ore (uranium concentration 0.3%) is 0.9 ppb. The process at the uranium ore processing plant at Jaduguda was studied, and the 231Pa content in samples from the process streams of the plant was determined. The gamma-ray spectrometry method was chosen and standardised in our laboratory to detect and measure 231Pa at parts-per-billion levels in these samples. A concentrated source of protactinium could not be found among the assessed streams of the Jaduguda uranium plant. The monazite processing plant at IRE, Aluva was then studied. From the known chemistry of protactinium, the likely distribution of 231Pa was estimated, and the process streams of the IRE plant were selected accordingly to prospect for 231Pa and determine the fractionation of protactinium. For analysis of 231Pa, the thorium-bearing samples were chemically treated to remove the thorium daughter products, which interfere in gamma spectrometry. This report describes the planning of the prospecting, sample selection, the standardisation of the analysis procedure for determination of the 231Pa content, and the analysis results. The 231Pa content in various streams of the Indian Rare Earths plant was found to be in the range 0.2-6.5 ppb. Some of the streams did not carry any protactinium. The fractionation of 231Pa in the various streams of the plant and the selection of a source for recovery of protactinium are discussed in detail. (author)

  9. Diagnostic peritoneal lavage: volume of lavage effluent needed for accurate determination of a negative lavage.

    Science.gov (United States)

    Sweeney, J F; Albrink, M H; Bischof, E; McAllister, E W; Rosemurgy, A S

    1994-12-01

    While the ability of diagnostic peritoneal lavage (DPL) to 'rule out' occult intra-abdominal injuries has been well established, the volume of lavage effluent necessary for accurate prediction of a negative lavage has not been determined. To address this, 60 injured adults with blunt (N = 45) or penetrating (N = 15) trauma undergoing DPL were evaluated prospectively through protocol. After infusion of 1 l of Ringer's lactate solution, samples of lavage effluent were obtained at 100 cm3, 250 cm3, 500 cm3, and 750 cm3, and when no more effluent could be returned (final sample). DPL was considered negative if the final-sample RBC count was ≤ 100,000/mm3 for blunt injury and < 50,000/mm3 for penetrating injury. The conclusion is that at 100 cm3 of lavage effluent returned, negative results are highly predictive of a negative DPL (98 per cent), though 250 cm3 of lavage effluent is required to predict a negative DPL uniformly (100 per cent).

  10. The effect of hospital volume on patient outcomes in severe acute pancreatitis

    Directory of Open Access Journals (Sweden)

    Shen Hsiu-Nien

    2012-08-01

    Background: We investigated the relation between hospital volume and outcome in patients with severe acute pancreatitis (SAP). The determination is important because patient outcome may be improved through volume-based selective referral. Methods: In this cohort study, we analyzed 22,551 SAP patients in 2,208 hospital-years (between 2000 and 2009) from Taiwan's National Health Insurance Research Database. The primary outcome was hospital mortality. Secondary outcomes were hospital length of stay and charges. Hospital SAP volume was measured both as a categorical and as a continuous variable (per one-case increase each hospital-year). The effect was assessed using multivariable logistic regression models with generalized estimating equations accounting for the hospital clustering effect. Adjusted covariates included patient and hospital characteristics (model 1) and additional treatment variables (model 2). Results: Irrespective of the measurements, increasing hospital volume was associated with a reduced risk of hospital mortality after adjusting for patient and hospital characteristics (adjusted odds ratio [OR] 0.995, 95% confidence interval [CI] 0.993-0.998 per one-case increase). Patients treated in the highest volume quartile (≥14 cases per hospital-year) had a 42% lower risk of hospital mortality than those in the lowest volume quartile (1 case per hospital-year) after adjusting for patient and hospital characteristics (adjusted OR 0.58, 95% CI 0.40-0.83). However, an inverse relation between volume and hospital stay or hospital charges was observed only when volume was analyzed as a categorical variable. After adjusting for the treatment covariates, the volume effect on hospital mortality disappeared regardless of the volume measures. Conclusions: These findings support the use of volume-based selective referral for patients with SAP and suggest that differences in levels or processes of care among hospitals may have contributed to the volume effect.
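
    A minimal sketch of the kind of cluster-adjusted model described in the Methods, assuming a patient-level table with a hospital identifier: a GEE logistic regression with an exchangeable working correlation to account for hospital clustering. The column names and simulated data are hypothetical.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        # Hypothetical patient-level data: hospital id, that hospital's annual SAP
        # volume, two case-mix covariates, and in-hospital mortality.
        rng = np.random.default_rng(4)
        n = 2000
        df = pd.DataFrame({
            "hospital_id": rng.integers(0, 120, n),
            "volume": rng.integers(1, 20, n),      # SAP cases per hospital-year
            "age": rng.normal(55, 12, n),
            "male": rng.integers(0, 2, n),
        })
        logit = -1.5 - 0.005 * df["volume"] + 0.02 * (df["age"] - 55)
        df["died"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        # GEE logistic regression with exchangeable correlation within hospitals.
        model = sm.GEE.from_formula(
            "died ~ volume + age + male",
            groups="hospital_id",
            data=df,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        result = model.fit()
        print("adjusted OR per one-case increase:", float(np.exp(result.params["volume"])))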

  11. Reducing overlay sampling for APC-based correction per exposure by replacing measured data with computational prediction

    Science.gov (United States)

    Noyes, Ben F.; Mokaberi, Babak; Oh, Jong Hun; Kim, Hyun Sik; Sung, Jun Ha; Kea, Marc

    2016-03-01

    One of the keys to successful mass production of sub-20nm nodes in the semiconductor industry is the development of an overlay correction strategy that can meet specifications, reduce the number of layers that require dedicated chuck overlay, and minimize measurement time. Three important aspects of this strategy are: correction per exposure (CPE), integrated metrology (IM), and the prioritization of automated correction over manual subrecipes. The first and third aspects are accomplished through an APC system that uses measurements from production lots to generate CPE corrections that are dynamically applied to future lots. The drawback of this method is that production overlay sampling must be extremely high in order to provide the system with enough data to generate CPE. That drawback makes IM particularly difficult because of the throughput impact that can be created on expensive bottleneck photolithography process tools. The goal is to realize the cycle time and feedback benefits of IM coupled with the enhanced overlay correction capability of automated CPE without impacting process tool throughput. This paper will discuss the development of a system that sends measured data with reduced sampling via an optimized layout to the exposure tool's computational modelling platform to predict and create "upsampled" overlay data in a customizable output layout that is compatible with the fab user CPE APC system. The result is dynamic CPE without the burden of extensive measurement time, which leads to increased utilization of IM.
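
    A simplified sketch of the "upsampling" idea: fit a low-order per-field overlay model (here just translation, magnification, and rotation terms) to the sparsely measured marks, then evaluate it on the dense layout handed back to the CPE/APC system. The model form and coordinates are generic assumptions, not the exposure tool's actual computational platform.

        import numpy as np

        def fit_field_model(xy, err):
            """Least-squares fit of dx = tx + mag*x - rot*y, dy = ty + mag*y + rot*x."""
            x, y = xy[:, 0], xy[:, 1]
            zeros, ones = np.zeros_like(x), np.ones_like(x)
            A = np.vstack([
                np.column_stack([ones, zeros, x, -y]),   # dx rows
                np.column_stack([zeros, ones, y, x]),    # dy rows
            ])
            b = np.concatenate([err[:, 0], err[:, 1]])
            params, *_ = np.linalg.lstsq(A, b, rcond=None)
            return params                                # [tx, ty, mag, rot]

        def predict(xy, params):
            tx, ty, mag, rot = params
            x, y = xy[:, 0], xy[:, 1]
            return np.column_stack([tx + mag * x - rot * y, ty + mag * y + rot * x])

        # Sparsely measured marks (field coordinates, mm) and overlay errors (nm).
        measured_xy = np.array([[-10., -10.], [10., -10.], [0., 0.], [-10., 10.], [10., 10.]])
        measured_err = np.array([[1.2, -0.8], [0.9, -0.1], [1.0, -0.4], [0.7, -0.9], [0.6, 0.1]])

        params = fit_field_model(measured_xy, measured_err)

        # Dense "upsampled" layout in a format a CPE-capable APC system could ingest.
        grid = np.linspace(-12.0, 12.0, 7)
        dense_xy = np.array([[x, y] for x in grid for y in grid])
        dense_err = predict(dense_xy, params)
        print("upsampled points:", dense_err.shape[0])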

  12. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source documents. Part 1. Open-literature abstracts for low-level radioactive waste

    International Nuclear Information System (INIS)

    Bowers, M.K.; Rodgers, B.R.; Jolley, R.L.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Volume 3, part 1 presents abstracts of the open literature relating to LLRW treatment methodologies. Some of these references pertain to treatment processes for hazardous wastes that may also be applicable to LLRW management. All abstracts have been limited to 21 lines (for brevity), but each abstract contains sufficient information to enable the reader to determine the potential usefulness of the source document and to locate each article. The abstracts are arranged alphabetically by author or organization, and indexed by keyword

  13. Low-level radioactive waste from commercial nuclear reactors. Volume 3. Bibliographic abstracts of significant source documents. Part 1. Open-literature abstracts for low-level radioactive waste

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, M.K.; Rodgers, B.R.; Jolley, R.L.

    1986-05-01

    The overall task of this program was to provide an assessment of currently available technology for treating commercial low-level radioactive waste (LLRW), to initiate development of a methodology for choosing one technology for a given application, and to identify research needed to improve current treatment techniques and decision methodology. The resulting report is issued in four volumes. Volume 3 of this series is a collection of abstracts of most of the reference documents used for this study. Because of the large volume of literature, the abstracts have been printed in two separate parts. Volume 3, part 1 presents abstracts of the open literature relating to LLRW treatment methodologies. Some of these references pertain to treatment processes for hazardous wastes that may also be applicable to LLRW management. All abstracts have been limited to 21 lines (for brevity), but each abstract contains sufficient information to enable the reader to determine the potential usefulness of the source document and to locate each article. The abstracts are arranged alphabetically by author or organization, and indexed by keyword.

  14. Reachable volume RRT

    KAUST Repository

    McMahon, Troy

    2015-05-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.
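
    For readers coming from the graph-based side, a bare-bones RRT extend loop is sketched below; the sample_reachable and step_toward hooks mark where the reachable-volume sampling and stepping functions of RVRRT would plug in. This is a generic 2-D illustration, not the authors' implementation, and the collision check is omitted.

        import math
        import random

        def sample_reachable():
            # Placeholder: RVRRT would sample only from the reachable volume of the
            # constrained linkage; here we sample a free 2-D point.
            return (random.uniform(0.0, 10.0), random.uniform(0.0, 10.0))

        def step_toward(q_near, q_rand, step=0.5):
            # Placeholder for the reachable-volume stepping function.
            dx, dy = q_rand[0] - q_near[0], q_rand[1] - q_near[1]
            d = math.hypot(dx, dy)
            if d <= step:
                return q_rand
            return (q_near[0] + step * dx / d, q_near[1] + step * dy / d)

        def rrt(q_start, q_goal, iterations=2000, goal_tol=0.5):
            tree = {q_start: None}                       # node -> parent
            for _ in range(iterations):
                q_rand = sample_reachable()
                q_near = min(tree, key=lambda q: math.dist(q, q_rand))
                q_new = step_toward(q_near, q_rand)
                tree[q_new] = q_near
                if math.dist(q_new, q_goal) < goal_tol:
                    path = [q_new]
                    while tree[path[-1]] is not None:
                        path.append(tree[path[-1]])
                    return path[::-1]
            return None

        path = rrt((1.0, 1.0), (9.0, 9.0))
        print("nodes on found path:", len(path) if path else 0)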

  15. Reachable volume RRT

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2015-01-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.

  16. Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates

    Science.gov (United States)

    John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin

    2014-01-01

    Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...

  17. Assessing genetic polymorphisms using DNA extracted from cells present in saliva samples

    Directory of Open Access Journals (Sweden)

    Nemoda Zsofia

    2011-12-01

    Background: Technical advances following the Human Genome Project revealed that high-quality and -quantity DNA may be obtained from whole saliva samples. However, usability of previously collected samples and the effects of environmental conditions on the samples during collection have not been assessed in detail. In five studies we document the effects of sample volume, handling and storage conditions, type of collection device, and oral sampling location, on quantity, quality, and genetic assessment of DNA extracted from cells present in saliva. Methods: Saliva samples were collected from ten adults in each study. Saliva volumes from 0.10-1.0 ml, different saliva collection devices, sampling locations in the mouth, room temperature storage, and multiple freeze-thaw cycles were tested. One representative single nucleotide polymorphism (SNP) in the catechol-O-methyltransferase gene (COMT, rs4680) and one representative variable number of tandem repeats (VNTR) in the serotonin transporter gene (5-HTTLPR: serotonin transporter linked polymorphic region) were selected for genetic analyses. Results: The smallest tested whole saliva volume of 0.10 ml yielded, on average, 1.43 ± 0.77 μg DNA and gave accurate genotype calls in both genetic analyses. The usage of collection devices reduced the amount of DNA extracted from the saliva filtrates compared to the whole saliva sample, as 54-92% of the DNA was retained on the device. An "adhered cell" extraction enabled recovery of this DNA and provided good quality and quantity DNA. The DNA from both the saliva filtrates and the adhered cell recovery provided accurate genotype calls. The effects of storage at room temperature (up to 5 days), repeated freeze-thaw cycles (up to 6 cycles), and oral sampling location on DNA extraction and on genetic analysis from saliva were negligible. Conclusions: Whole saliva samples with volumes of at least 0.10 ml were sufficient to extract good quality and quantity DNA. Using

  18. Accelerator mass spectrometry in biomedical research

    International Nuclear Information System (INIS)

    Vogel, J.S.; Turteltaub, K.W.

    1993-01-01

    Biological effects occur in natural systems at chemical concentrations of parts per billion (1:10⁹) or less. Affected biomolecules may be separable in only milligram or microgram quantities. Quantification at attomole sensitivity is needed to study these interactions. AMS measures isotope concentrations to parts per 10¹³-10¹⁵ on milligram-sized samples and is ideal for quantifying long-lived radioisotopic labels that are commonly used to trace biochemical pathways in natural systems. ¹⁴C-AMS has now been coupled to a variety of organic separation and definition technologies. The primary research investigates pharmacokinetics and genotoxicities of toxins and drugs at very low doses. Human subject research using AMS includes nutrition, toxicity and elemental balance studies. ³H, ⁴¹Ca and ²⁶Al are also traced by AMS for fundamental biochemical kinetic research. Expansion of biomedical AMS awaits further development of biochemical and accelerator technologies designed specifically for these applications.
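    As a back-of-the-envelope check on the sensitivity figures quoted above, a milligram-sized carbon sample with an isotope ratio of one part in 10¹⁵ contains only a sub-attomole quantity of the labelled isotope. The sample mass and ratio below are illustrative assumptions, not measured values.

      # Rough arithmetic linking isotope ratio and absolute amount of label.
      AVOGADRO = 6.022e23
      sample_mass_g = 1e-3          # assumed: 1 mg of carbon
      molar_mass_c = 12.0           # g/mol
      isotope_ratio = 1e-15         # assumed: labelled atoms per total carbon atoms

      total_atoms = sample_mass_g / molar_mass_c * AVOGADRO
      label_atoms = total_atoms * isotope_ratio
      label_attomol = label_atoms / AVOGADRO * 1e18
      print(f"{label_atoms:.1e} labelled atoms ~= {label_attomol:.2f} attomol")
      # -> about 5.0e4 atoms, i.e. roughly 0.08 attomol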

  19. Determination of arsenic in ambient water at sub-part-per-trillion levels by hydride generation Pd coated platform collection and GFAAS detection.

    Science.gov (United States)

    Liang, L; Lazoff, S; Chan, C; Horvat, M; Woods, J S

    1998-11-01

    A method for trace determination of total arsenic in ambient waters is described. Arsenic is separated on-line from a large volume water sample by hydride generation and purging, pre-collected on a Pd coated pyrolytic platform cuvette using a simple and inexpensive system, and finally detected by GFAAS. Instrument parameters, hydride generation, transportation, and collection were optimized. The analytical behavior of the major species, including As(3+), As(5+), monomethyl As (MMA), and dimethyl As (DMA), was investigated individually. Problems arising from use of the system were discussed and eliminated. The necessity of sample digestion and an efficient digestion method were studied. Sample digestion can be omitted for water with low organic content, such as tap water, clean ground water, and some clean surface water. The method detection limit (MDL) is 0.3 ng l⁻¹ for a 25 ml water sample. Recoveries close to 100% with low R.S.D. were obtained. Rain, sewage effluent, and saline water samples from different origins in the US, China, and Canada were collected and analyzed using ultra clean sampling and analysis techniques. The background levels of As in most water analyzed were established for the first time, and found to be far above the EPA's health effect criterion of 18 ng l⁻¹.
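    For orientation, the reported detection limit can be restated as an absolute mass of arsenic per analysis; the short calculation below simply combines the MDL and the sample volume given in the abstract.

      # Absolute arsenic mass corresponding to the reported method detection limit.
      mdl_ng_per_l = 0.3        # reported MDL
      sample_volume_l = 0.025   # 25 ml sample
      absolute_mass_pg = mdl_ng_per_l * sample_volume_l * 1000.0
      print(f"{absolute_mass_pg:.1f} pg As per 25 ml sample")   # 7.5 pg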

  20. Genotyping for DQA1 and PM loci in urine using PCR-based amplification: effects of sample volume, storage temperature, preservatives, and aging on DNA extraction and typing.

    Science.gov (United States)

    Vu, N T; Chaturvedi, A K; Canfield, D V

    1999-05-31

    Urine is often the sample of choice for drug screening in aviation/general forensic toxicology and in workplace drug testing. In some instances, the origin of the submitted samples may be challenged because of the medicolegal and socioeconomic consequences of a positive drug test. Methods for individualization of biological samples have reached a new boundary with the application of the polymerase chain reaction (PCR) in DNA profiling, but a successful characterization of the urine specimens depends on the quantity and quality of DNA present in the samples. Therefore, the present study investigated the influence of storage conditions, sample volume, concentration modes, extraction procedures, and chemical preservatives on the quantity of DNA recovered, as well as the success rate of PCR-based genotyping for DQA1 and PM loci in urine. Urine specimens from male and female volunteers were divided and stored at various temperatures for up to 30 days. The results suggested that sample purification by diafiltration, using 3000-100,000 molecular weight cut-off filters, did not enhance DNA recovery and typing rate as compared with simple centrifugation procedures. Extraction of urinary DNA by the organic method and by the resin method gave comparable typing results. Larger sample volume yielded a higher amount of DNA, but the typing rates were not affected for sample volumes between 1 and 5 ml. The quantifiable amounts of DNA present were found to be greater in female (14-200 ng/ml) than in male (4-60 ng/ml) samples and decreased with the elapsed time under both room temperature (RT) and frozen storage. Typing of the male samples also demonstrated that RT storage samples produced significantly higher success rates than did frozen samples, while there was only a marginal difference in the DNA typing rates among the conditions tested using female samples. Successful assignment of DQA1 + PM genotype was achieved for all samples of fresh urine, independent of gender

  1. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution of the sample in a given solvent. That last process alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction: it is very likely that a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, the process of sample preparation is not as simple as dissolution of the component of interest. At times enrichment is necessary; that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC. 88 refs

  2. Unlocking the EUR53 billion savings from smart meters in the EU. How increasing the adoption of dynamic tariffs could make or break the EU's smart grid investment

    International Nuclear Information System (INIS)

    Faruqui, Ahmad; Hledik, Ryan; Harris, Dan

    2010-01-01

    We estimate the cost of installing smart meters in the EU to be EUR51 billion, and that operational savings will be worth between EUR26 and 41 billion, leaving a gap of EUR10-25 billion between benefits and costs. Smart meters can fill this gap because they enable the provision of dynamic pricing, which reduces peak demand and lowers the need for building and running expensive peaking power plants. The present value of savings in peaking infrastructure could be as high as EUR67 billion for the EU if policy-makers can overcome barriers to consumers adopting dynamic tariffs, but only EUR14 billion otherwise. We outline a number of ways to increase the adoption of dynamic tariffs. (author)
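    The quoted benefit-cost gap follows directly from the cost and operational-savings estimates in the abstract; the short check below reproduces it.

      # Reproduce the EUR10-25 billion gap between metering costs and operational savings.
      cost = 51.0                               # EUR billion, installation cost estimate
      op_savings_low, op_savings_high = 26.0, 41.0
      gap_low, gap_high = cost - op_savings_high, cost - op_savings_low
      print(f"Gap: EUR{gap_low:.0f}-{gap_high:.0f} billion")    # EUR10-25 billion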

  3. Childhood adversity is linked to differential brain volumes in adolescents with alcohol use disorder: a voxel-based morphometry study.

    Science.gov (United States)

    Brooks, Samantha J; Dalvie, Shareefa; Cuzen, Natalie L; Cardenas, Valerie; Fein, George; Stein, Dan J

    2014-06-01

    Previous neuroimaging studies link both alcohol use disorder (AUD) and early adversity to neurobiological differences in the adult brain. However, the association between AUD and childhood adversity and effects on the developing adolescent brain are less clear, due in part to the confound of psychiatric comorbidity. Here we examine early life adversity and its association with brain volume in a unique sample of 116 South African adolescents (aged 12-16) with AUD but without psychiatric comorbidity. Participants were 58 adolescents with DSM-IV alcohol dependence and with no other psychiatric comorbidities, and 58 age-, gender- and protocol-matched light/non-drinking controls (HC). Assessments included the Childhood Trauma Questionnaire (CTQ). MR images were acquired on a 3T Siemens Magnetom Allegra scanner. Volumes of global and regional structures were estimated using SPM8 Voxel Based Morphometry (VBM), with analysis of covariance (ANCOVA) and regression analyses. In whole brain ANCOVA analyses, a main effect of group when examining the AUD effect after covarying out CTQ was observed on brain volume in bilateral superior temporal gyrus. Subsequent regression analyses to examine how childhood trauma scores are linked to brain volumes in the total cohort revealed a negative correlation in the left hippocampus and right precentral gyrus. Furthermore, bilateral (but most significantly left) hippocampal volume was negatively associated with sub-scores on the CTQ in the total cohort. These findings support our view that some alterations found in brain volumes in studies of adolescent AUD may reflect the impact of confounding factors such as psychiatric comorbidity rather than the effects of alcohol per se. In particular, early life adversity may influence the developing adolescent brain in specific brain regions, such as the hippocampus.

  4. Solid phase extraction of large volume of water and beverage samples to improve detection limits for GC-MS analysis of bisphenol A and four other bisphenols.

    Science.gov (United States)

    Cao, Xu-Liang; Popovic, Svetlana

    2018-01-01

    Solid phase extraction (SPE) of large volumes of water and beverage products was investigated for the GC-MS analysis of bisphenol A (BPA), bisphenol AF (BPAF), bisphenol F (BPF), bisphenol E (BPE), and bisphenol B (BPB). While absolute recoveries of the method were improved for water and some beverage products (e.g. diet cola, iced tea), breakthrough may also have occurred during SPE of 200 mL of other beverages (e.g. BPF in cola). Improvements in method detection limits were observed with the analysis of large sample volumes for all bisphenols at ppt (pg/g) to sub-ppt levels. This improvement was found to be proportional to sample volume for water and for beverage products with fewer interferences and lower noise levels around the analytes. Matrix effects and interferences were observed during SPE of larger volumes (100 and 200 mL) of the beverage products, and affected the accurate analysis of BPF. This improved method was used to analyse bisphenols in various beverage samples, and only BPA was detected, with levels ranging from 0.022 to 0.030 ng/g for products in PET bottles, and 0.085 to 0.32 ng/g for products in cans.
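    The observation that the detection-limit improvement scales roughly with the extracted sample volume can be sketched as follows. The reference MDL and volumes are illustrative assumptions (the abstract reports only the final ppt to sub-ppt levels), and complete recovery into a fixed final extract volume is assumed.

      # Illustrative scaling of a method detection limit with SPE sample volume.
      def scaled_mdl(mdl_ref_ng_per_g, v_ref_ml, v_new_ml):
          # Assumes complete analyte recovery and the same final extract volume.
          return mdl_ref_ng_per_g * v_ref_ml / v_new_ml

      # Hypothetical: a 0.010 ng/g (10 pg/g) MDL at 10 ml improves to 0.5 pg/g at 200 ml.
      print(scaled_mdl(0.010, 10.0, 200.0))     # 0.0005 ng/g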

  5. A New Approach to Identification of Biomarkers for Early Cancer Stage Detection

    Directory of Open Access Journals (Sweden)

    Buszewski Bogusław

    2014-06-01

    Gas chromatography-mass spectrometry (GC/MS) was applied for the determination of concentrations of volatile organic compounds present in human breath samples. The technique allows rapid determination of compounds in human breath at the parts-per-billion level. It showed linear ranges of 0.83-234.05 ppb, detection limits in the range of 0.31-0.75 ppb, and precision, expressed as RSD, of less than 10.00%. Moreover, trained dogs are able to discriminate breath samples of patients with diagnosed cancer disease. We found a positive correlation between dog indications and the content of ethyl acetate and 2-pentanone in breath (r=0.85 and r=0.97, respectively).

  6. A Continuous Flow System for the Measurement of Ambient Nitrogen Oxides [NO + NO2] Using Rhodamine B Hydrazide as a Chemosensor

    Directory of Open Access Journals (Sweden)

    Pandurangappa Malingappa

    2014-01-01

    A new chemosensor has been used to monitor atmospheric nitrogen oxides [NO + NO2] at the parts per billion (ppb) level. It is based on the catalytic reaction of nitrogen oxides with rhodamine B hydrazide (RBH) to produce a colored compound through the hydrolysis of the amide bond of the molecule. A simple colorimeter has been used to monitor atmospheric nitrogen dioxide at the ppb level. The air samples were purged through a sampling cuvette containing RBH solution using a peristaltic pump. The proposed method has been successfully applied to monitor ambient nitrogen dioxide levels at traffic junction points within the city limits, and the results obtained are compared with those of the standard Griess-Ilosvay method.
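    In a flow system of this kind the colorimeter reading is ultimately converted to a mixing ratio through a calibration against known NO2 levels; the sketch below assumes a hypothetical linear calibration (the slope and blank are not given in the abstract) purely to illustrate that conversion.

      # Hypothetical conversion of a colorimeter absorbance reading to an ambient
      # NO2 mixing ratio via an assumed linear calibration.
      def absorbance_to_ppb(absorbance, slope=0.004, blank=0.002):
          # slope: absorbance units per ppb; blank: absorbance of the reagent blank
          return (absorbance - blank) / slope

      print(f"{absorbance_to_ppb(0.122):.1f} ppb NO2")   # 30.0 ppb for this assumed calibration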

  7. History of CERN. Volume 3

    International Nuclear Information System (INIS)

    Krige, J.

    1996-01-01

    The present volume continues the story of the history of the European Organization for Nuclear Research (CERN) in Geneva, Switzerland, concentrating on the years between the mid 1960s and the late 1970s. Whereas the first two volumes were the product of a team of historians, this book is rather a collection of studies by authors with very different professional backgrounds and institutional locations. It also differs from the predecessor volumes in the fact that it consists of distinct case studies dealing with a number of issues deemed important. The first part of this volume, containing contributions by historians of science, perceives the laboratory as being at the node of a complex of interconnected relationships between scientists and science managers on the staff, the users in the member states, and the governments which were called upon to finance the laboratory. In part 2 the physical results, obtained at CERN, are surveyed, while in part 3 two chapters are presented, one on engineering and technology, and the other on the research and development of electronic position detectors

  8. Nuclear fuel technology - Tank calibration and volume determination for nuclear materials accountancy - Part 4: Accurate determination of liquid height in accountancy tanks equipped with dip tubes, slow bubbling rate

    International Nuclear Information System (INIS)

    2008-01-01

    ISO 18213 deals with the acquisition, standardization, analysis, and use of calibration to determine liquid volumes in process tanks for the accountancy of nuclear materials. This part of ISO 18213 is complementary to the other parts, ISO 18213-1 (procedural overview), ISO 18213-2 (data standardization), ISO 18213-3 (statistical methods), ISO 18213-5 (fast bubbling rate) and ISO 18213-6 (in-tank determination of liquid density). The procedure presented herein for determining liquid height from measurements of induced pressure applies specifically when a very slow bubbling rate is employed. A similar procedure that is appropriate for a fast bubbling rate is given in ISO 18213-5. Measurements of the volume and height of liquid in a process accountancy tank are often made in order to estimate or verify the tank's calibration or volume measurement equation. The calibration equation relates the response of the tank's measurement system to some independent measure of tank volume. Beginning with an empty tank, calibration data are typically acquired by introducing a series of carefully measured quantities of some calibration liquid into the tank. The quantity of liquid added, the response of the tank's measurement system, and relevant ambient conditions such as temperature are measured for each incremental addition. Several calibration runs are made to obtain data for estimating or verifying a tank's calibration or measurement equation. A procedural overview of the tank calibration and volume measurement process is given in ISO 18213-1. An algorithm for standardizing tank calibration and volume measurement data to minimize the effects of variability in ambient conditions that prevail during the measurement period is given in ISO 18213-2. The procedure presented in this part of ISO 18213 for determining the height of calibration liquid in the tank from a measurement of the pressure it induces in the tank's measurement system is a vital component of that algorithm. In some
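    The relation underlying the dip-tube measurement is hydrostatic: the induced pressure is proportional to the height of liquid above the probe, P = rho * g * h, to which ISO 18213 adds bubbling-rate and standardization corrections. A minimal numerical illustration with assumed values (corrections omitted) is:

      # Hydrostatic estimate of liquid height from dip-tube differential pressure.
      # ISO 18213 applies bubbling-rate and temperature corrections omitted here.
      def liquid_height_m(delta_p_pa, density_kg_m3=1000.0, g=9.80665):
          return delta_p_pa / (density_kg_m3 * g)

      print(f"{liquid_height_m(14700.0):.3f} m")   # ~1.499 m of water for 14.7 kPa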

  9. Nuclear fuel technology - Tank calibration and volume determination for nuclear materials accountancy - Part 5: Accurate determination of liquid height in accountancy tanks equipped with dip tubes, fast bubbling rate

    International Nuclear Information System (INIS)

    2008-01-01

    ISO 18213 deals with the acquisition, standardization, analysis, and use of calibration to determine liquid volumes in process tanks for the accountancy of nuclear materials. This part of ISO 18213 is complementary to the other parts, ISO 18213-1 (procedural overview), ISO 18213-2 (data standardization), ISO 18213-3 (statistical methods), ISO 18213-4 (slow bubbling rate) and ISO 18213-6 (in-tank determination of liquid density). The procedure presented herein for determining liquid height from measurements of induced pressure applies specifically when a fast bubbling rate is employed. A similar procedure that is appropriate for a slow bubbling rate is given in ISO 18213-4. Measurements of the volume and height of liquid in a process accountancy tank are often made in order to estimate or verify the tank's calibration or volume measurement equation. The calibration equation relates the response of the tank's measurement system to some independent measure of tank volume. Beginning with an empty tank, calibration data are typically acquired by introducing a series of carefully measured quantities of some calibration liquid into the tank. The quantity of liquid added, the response of the tank's measurement system, and relevant ambient conditions such as temperature are measured for each incremental addition. Several calibration runs are made to obtain data for estimating or verifying a tank's calibration or measurement equation. A procedural overview of the tank calibration and volume measurement process is given in ISO 18213-1. An algorithm for standardizing tank calibration and volume measurement data to minimize the effects of variability in ambient conditions that prevail during the measurement period is given in ISO 18213-2. The procedure presented in this part of ISO 18213 for determining the height of calibration liquid in the tank from a measurement of the pressure it induces in the tank's measurement system is a vital component of that algorithm. In some

  10. NMT - A new individual ion counting method: Comparison to a Faraday cup

    Science.gov (United States)

    Burton, Michael; Gorbunov, Boris

    2018-03-01

    Two sample detectors used to analyze the emission from Gas Chromatography (GC) columns are the Flame Ionization Detector (FID) and the Electron Capture Detector (ECD). Both of these detectors involve ionization of the sample molecules and then measuring the electric current in the gas using a Faraday cup. In this paper a newly discovered method of ion counting, Nanotechnology Molecular Tagging (NMT), is tested as a replacement for the Faraday cup in GCs. In this method the effective physical volume of individual molecules is enlarged up to 1 billion times, enabling them to be detected by an optical particle counter. It was found that the sensitivity of NMT was considerably greater than that of the Faraday cup. The background in the NMT was circa 200 ions per cm³, corresponding to an extremely low electric current of ∼10⁻¹⁷ A.
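    The correspondence between an ion number concentration and an equivalent Faraday-cup current depends on the sampled gas flow, which the abstract does not give; the sketch below uses an assumed flow rate purely to show the order of magnitude of the quoted ∼10⁻¹⁷ A background.

      # Equivalent current carried by singly charged ions at a given number
      # concentration; the sample flow rate is an assumption for illustration.
      E_CHARGE = 1.602e-19                      # C per elementary charge
      def ion_current_a(ions_per_cm3, flow_cm3_per_s):
          return ions_per_cm3 * flow_cm3_per_s * E_CHARGE

      print(f"{ion_current_a(200, 1.7):.1e} A")  # ~5e-17 A at roughly 0.1 l/min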

  11. Enhanced Oil Recovery: Aqueous Flow Tracer Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Joseph Rovani; John Schabron

    2009-02-01

    A low detection limit analytical method was developed to measure a suite of benzoic acid and fluorinated benzoic acid compounds intended for use as tracers for enhanced oil recovery operations. Although the new high performance liquid chromatography separation successfully measured the tracers in an aqueous matrix at low part per billion levels, the low detection limits could not be achieved in oil field water due to interference problems with the hydrocarbon-saturated water using the system's UV detector. Commercial instrument vendors were contacted in an effort to determine if mass spectrometry could be used as an alternate detection technique. The results of their work demonstrate that low part per billion analysis of the tracer compounds in oil field water could be achieved using ultra performance liquid chromatography mass spectrometry.

  12. A high volume sampling system for isotope determination of volatile halocarbons and hydrocarbons

    Directory of Open Access Journals (Sweden)

    E. Bahlmann

    2011-10-01

    The isotopic composition of volatile organic compounds (VOCs) can provide valuable information on their sources and fate not deducible from mixing ratios alone. In particular, the reported carbon stable isotope ratios of chloromethane and bromomethane from different sources cover a δ¹³C-range of almost 100‰, making isotope ratios a very promising tool for studying the biogeochemistry of these compounds. So far, the determination of the isotopic composition of C1 and C2 halocarbons other than chloromethane is hampered by their low mixing ratios.

    In order to determine the carbon isotopic composition of C1 and C2 halocarbons with mixing ratios as low as 1 pptv, (i) a field-suitable cryogenic high volume sampling system and (ii) a chromatographic set-up for processing these samples have been developed and validated. The sampling system was tested at two different sampling sites, an urban and a coastal location in Northern Germany. The average δ¹³C-values for bromomethane at the urban site were −42.9 ± 1.1‰ and agreed well with previously published results. But at the coastal site bromomethane was substantially enriched in ¹³C by almost 10‰. Less pronounced differences were observed for chlorodifluoromethane, 1,1,1-trichloroethane and chloromethane. We suggest that these differences are related to the turnover of these compounds in ocean surface waters. Furthermore, we report first carbon isotope ratios for iodomethane (−40.4‰ to −79.8‰), bromoform (−13.8‰ to 22.9‰), and other halocarbons.
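    For reference, the δ¹³C values quoted above follow the conventional delta definition relative to a standard (VPDB for carbon); the measured ratio in the example below is illustrative, not data from the paper.

      # Conventional delta notation for carbon isotope ratios, relative to VPDB.
      R_VPDB = 0.0112372    # commonly used 13C/12C ratio of the VPDB standard

      def delta13c_permil(r_sample, r_standard=R_VPDB):
          return (r_sample / r_standard - 1.0) * 1000.0

      print(f"{delta13c_permil(0.0107552):.1f} per mil")  # about -42.9 per mil (illustrative)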

  13. Petroleum industry in Latin America: volume I

    International Nuclear Information System (INIS)

    Reinsch, A.E.; Tissot, R.R.

    1995-01-01

    This first volume of a three-volume series provided an overview of major economic trends and energy reserves (i.e. crude oil, natural gas and electricity) in Latin America. Established crude oil reserves were estimated at 125 billion barrels, with Mexico and Venezuela accounting for over 90 percent of the total. Established natural gas reserves were estimated at 249 Tcf, roughly one half of which is in Venezuela. It was noted that since natural gas exploration was still in its infancy in the region, this figure was very likely an underestimate of available resources. The current physical and market characteristics of the petroleum sector in each of the seven Latin American countries were examined in detail, as were the legal, regulatory, fiscal and political environments. Latin American efforts at integration were examined, with emphasis on regional trade agreements and energy integration. The central conclusion of the study was that Latin America appeared poised for a period of sustained economic development, with the energy sector occupying center stage. tabs., figs., refs.

  14. Does use of a PACS increase the number of images per study? A case study in ultrasound.

    Science.gov (United States)

    Horii, Steven; Nisenbaum, Harvey; Farn, James; Coleman, Beverly; Rowling, Susan; Langer, Jill; Jacobs, Jill; Arger, Peter; Pinheiro, Lisa; Klein, Wendy; Reber, Michele; Iyoob, Christopher

    2002-03-01

    The purpose of this study was to determine if the use of a picture archiving and communications system (PACS) in ultrasonography increased the number of images acquired per examination. The hypothesis that such an increase does occur was based on anecdotal information; this study sought to test the hypothesis. A random sample of all ultrasound examination types was drawn from the period 1998 through 1999. The ultrasound PACS in use (ACCESS; Kodak Health Information Systems, Dallas, TX) records the number of grayscale and color images saved as part of each study. Each examination in the sample was checked in the ultrasound PACS database, and the number of grayscale and color images was recorded. The comparison film-based sample was drawn from the period 1994 through 1995. The number of examinations of each type selected was based on the overall statistics of the section; that is, the sample was designed to represent the approximate frequency with which the various examination types are done. For film-based image counts, the jackets were retrieved, and the number of grayscale and color images was counted. The number of images obtained per examination (for most examinations) in ultrasound increased with PACS use. This was more evident with some examination types (e.g., pelvis). This result, however, has to be examined for possible systematic biases because ultrasound practice has changed over the time since the authors stopped using film routinely. The use of PACS in ultrasonography was not associated with an increase in the number of images per examination based solely on the use of PACS, with the exception of neonatal head studies. Increases in the number of images per study were otherwise associated with examinations for which changes in protocols resulted in the increased image counts.

  15. Effect of natural ageing on volume stability of MSW and wood waste incineration residues

    International Nuclear Information System (INIS)

    Gori, Manuela; Bergfeldt, Britta; Reichelt, Jürgen; Sirini, Piero

    2013-01-01

    Highlights: ► The effect of natural weathering on BA from MSW and wood waste incineration was evaluated. ► Type of mineral phases, pH and volume stability were considered. ► Weathering reactions result in improved stability of the materials. - Abstract: This paper presents the results of a study on the effect of natural weathering on the volume stability of bottom ash (BA) from municipal solid waste (MSW) and wood waste incineration. BA samples were taken at different steps of treatment (fresh, 4 weeks and 12 weeks aged) and then characterised for their chemical and mineralogical composition and for volume stability by means of the mineralogical test method (M HMVA-StB), which is part of the German quality control system for using aggregates in road construction (TL Gestein-StB 04). Changes in mineralogical composition as the weathering treatment proceeded were also monitored by leaching tests. At the end of the 12 weeks of treatment, almost all of the samples considered were found to be usable without restrictions in road construction with respect to the volume stability test parameter.

  16. Influence of the volume ratio of solid phase on carrying capacity of regular porous structure

    Directory of Open Access Journals (Sweden)

    Monkova Katarina

    2017-01-01

    Direct metal laser sintering is a widespread technology today. The main advantage of this method is the ability to produce parts with very complex geometry that could be produced only with great difficulty by classical conventional methods. A special category of such components is parts with a porous structure, which can give the product an extraordinary combination of properties. The article deals with some aspects that influence the manufacturing of regular porous structures, even though the input technological parameters were the same for the various samples. The main goal of the presented research has been to investigate the influence of the volume ratio of solid phase on the carrying capacity of a regular porous structure. The tests carried out indicate that a unit of regular porous structure with a lower volume ratio is able to carry a greater load to failure than a unit with a higher volume ratio.
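    The volume ratio of solid phase discussed above is simply the solid fraction of a structural unit, i.e. the complement of its porosity; a small illustrative computation with assumed volumes is:

      # Volume ratio of solid phase for a porous unit (volumes are assumed).
      def solid_volume_ratio(v_solid_mm3, v_total_mm3):
          return v_solid_mm3 / v_total_mm3

      ratio = solid_volume_ratio(120.0, 400.0)
      print(f"solid fraction {ratio:.2f}, porosity {1.0 - ratio:.2f}")   # 0.30 / 0.70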

  17. Bulk density of soil samples measured in the field and through volume measurement of sieved soil

    Directory of Open Access Journals (Sweden)

    Bernardo Van Raij

    1989-01-01

    In soil testing laboratories, soil samples for chemical analysis are usually measured by volume, using appropriate measuring spoons. It is tacitly assumed that such measurements reflect the amount of soil existing in the same volume under field conditions. This hypothesis was tested using 12 soil samples from the A and B horizons of six soil profiles. Bulk density in the field was evaluated with a cylindrical metal sampler of 50 cm³ and, in the laboratory, using spoons of different sizes. Measurements of soil volumes by spoons were quite precise. Values of bulk density varied between 0.63 and 1.46 g/cm³ for field measurements and between 0.91 and 1.33 g/cm³ for laboratory measurements with spoons. Thus, laboratory measurements overestimated the lower values of bulk density and underestimated the higher ones.
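    Both measurements described above reduce to the same relation, bulk density = dry soil mass divided by sampled volume; the masses in the example below are assumed for illustration, while the 50 cm³ ring volume is taken from the abstract.

      # Bulk density from a field core ring and from a laboratory measuring spoon.
      def bulk_density_g_cm3(dry_mass_g, volume_cm3):
          return dry_mass_g / volume_cm3

      print(bulk_density_g_cm3(63.5, 50.0))   # field ring (50 cm3): 1.27 g/cm3
      print(bulk_density_g_cm3(11.8, 10.0))   # assumed 10 cm3 spoon: 1.18 g/cm3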

  18. Tax treatment of factoring in Peru

    OpenAIRE

    Rocío Liu Arévalo; Eduardo Sotelo Castañeda

    2015-01-01

    One of the financing alternatives within companies' reach is the one offered by the factoring contract. Under this contract, one of the parties (the factor) acquires all or part of the receivables that the other party (the factored party) holds against third parties, advancing it in exchange the amount of those receivables. In this article, the authors describe and analyze the tax regime applicable to the factoring contract in Peru, ...

  19. Dongfeng has fixed a sales goal of 80 billion yuan supported by Nissan

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The Nissan and Dongfeng Group-based Dongfeng Automobile Co., Ltd, which has the largest investment in the history of the industry, opened officially for business on July 1. With a total investment of USD 2 billion and 70,000 employees, the company is the first joint venture in China which plans a full range of truck, light commercial and passenger vehicles. According to president Nakamura, the company has established a management

  20. Uranium hydrogeochemical and stream sediment reconnaissance data release for the Socorro NRMS Quadrangle, New Mexico, including concentrations of forty-two additional elements

    International Nuclear Information System (INIS)

    Planner, H.N.; Fuka, M.A.; Hanks, D.E.; Hansel, J.M.; Minor, M.M.; Montoya, J.D.; Sandoval, W.F.

    1980-10-01

    Results for uranium in water samples and uranium and 42 additional elements in sediment samples are given. A total of 650 water samples was collected from wells (525), springs (99), streams (25), and one pond. Uranium concentrations for all water samples range from below the detection limit to 157.20 parts per billion (ppB). Mean concentrations in springs and well waters are 4.91 ppB and 5.04 ppB, respectively, compared to a value of 2.78 ppB in stream waters. Of the 1384 sediment samples collected, 1246 are from dry stream beds. The remaining 138 samples are from springs (68), ponds (50), and flowing streams (20). Uranium concentrations in sediments range from 0.84 to 13.40 parts per million (ppM) with the exception of a single 445.10-ppM concentration. The mean uranium content of all sediments is 3.12 ppM. Field data, recorded at the collection site, are reported with the elemental concentrations for each water and sediment sample listed in Appendixes I-A and I-B. These data include a scintillometer determination of the equivalent uranium, pH and conductivity measurements, and geographic and weather information. Appendix II explains the codes used in Appendix I and describes the standard field and analytical procedures used by the LASL in the HSSR program