WorldWideScience

Sample records for large volume sample

  1. A self-sampling method to obtain large volumes of undiluted cervicovaginal secretions.

    Science.gov (United States)

    Boskey, Elizabeth R; Moench, Thomas R; Hees, Paul S; Cone, Richard A

    2003-02-01

    Studies of vaginal physiology and pathophysiology sometimes require larger volumes of undiluted cervicovaginal secretions than can be obtained by current methods. A convenient method for self-sampling these secretions outside a clinical setting can facilitate such studies of reproductive health. The goal was to develop a vaginal self-sampling method for collecting large volumes of undiluted cervicovaginal secretions. A menstrual collection device (the Instead cup) was inserted briefly into the vagina to collect secretions, which were then retrieved from the cup by centrifugation in a 50-ml conical tube. All 16 women asked to perform this procedure found it feasible and acceptable. Among 27 samples, an average of 0.5 g of secretions (range, 0.1-1.5 g) was collected. This is a rapid and convenient self-sampling method for obtaining relatively large volumes of undiluted cervicovaginal secretions. It should prove suitable for a wide range of assays, including those involving sexually transmitted diseases, microbicides, vaginal physiology, immunology, and pathophysiology.

  2. Absolute activity determinations on large volume geological samples independent of self-absorption effects

    International Nuclear Information System (INIS)

    Wilson, W.E.

    1980-01-01

    This paper describes a method for measuring the absolute activity of large-volume samples by γ-spectrometry with Ge detectors, independent of self-absorption effects. The method yields accurate, matrix-independent results at the expense of replicative counting of the unknown sample. (orig./HP)

  3. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits; trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The concentrations of ethylene and SO2 fluctuated only minimally during the entire LVCC sampling process, and recoveries of the gas targets from real samples were in the range of 95.0-101% and 97.0-104%, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. SyPRID sampler: A large-volume, high-resolution, autonomous, deep-ocean precision plankton sampling system

    Science.gov (United States)

    Billings, Andrew; Kaiser, Carl; Young, Craig M.; Hiebert, Laurel S.; Cole, Eli; Wagner, Jamie K. S.; Van Dover, Cindy Lee

    2017-03-01

    The current standard for large-volume (thousands of cubic meters) zooplankton sampling in the deep sea is the MOCNESS, a system of multiple opening-closing nets, typically lowered to within 50 m of the seabed and towed obliquely to the surface to obtain low-spatial-resolution samples that integrate across tens of meters of water depth. The SyPRID (Sentry Precision Robotic Impeller Driven) sampler is an innovative, deep-rated (6000 m) plankton sampler that partners with the Sentry Autonomous Underwater Vehicle (AUV) to obtain paired, large-volume plankton samples at specified depths and along survey lines to within 1.5 m of the seabed, with simultaneous collection of sensor data. SyPRID uses a perforated Ultra-High-Molecular-Weight (UHMW) plastic tube to support a fine mesh net within an outer carbon composite tube (tube-within-a-tube design), with an axial flow pump located aft of the capture filter. The pump facilitates flow through the system and reduces or possibly eliminates the bow wave at the mouth opening. The cod end, a hollow truncated cone, is also made of UHMW plastic and includes a collection volume designed to provide an area where zooplankton can collect, out of the high-flow region. SyPRID attaches as a saddle-pack to the Sentry vehicle. Sentry itself is configured with a flight control system that enables autonomous survey paths at low altitudes. In its verification deployment at the Blake Ridge Seep (2160 m) on the US Atlantic Margin, SyPRID was operated for 6 h at an altitude of 5 m. It recovered plankton samples, including delicate living larvae, from the near-bottom stratum that is seldom sampled by a typical MOCNESS tow. The prototype SyPRID and its next generations will enable studies of plankton or other particulate distributions associated with localized physico-chemical strata in the water column or above patchy habitats on the seafloor.

  5. Large-volume injection of sample diluents not miscible with the mobile phase as an alternative approach in sample preparation for bioanalysis: an application for fenspiride bioequivalence.

    Science.gov (United States)

    Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor

    2011-09-01

    Liquid-liquid extraction of target compounds from biological matrices, followed by injection of a large volume of the organic layer into a chromatographic column operated under reversed-phase (RP) conditions, combines selectivity with a straightforward procedure to enhance sensitivity, compared with the usual approach involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced into chromatographic practice. The risk of random errors produced during sample manipulation is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride-containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples into 1-octanol. A volume of 75 µl of the octanol layer was directly injected onto a Zorbax SB C18 Rapid Resolution column (50 mm length × 4.6 mm internal diameter, 1.8 µm particle size), with the RP separation carried out under gradient elution conditions. Detection was performed by positive ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess the bioequivalence of a modified-release pharmaceutical formulation containing 80 mg fenspiride hydrochloride in two different studies, carried out as single-dose administration under fasting and fed conditions (four arms) and as multiple-dose administration, respectively. The quality attributes assigned to the bioanalytical method, as resulting from its application to the bioequivalence studies, are highlighted and fully demonstrate that sample preparation based on large-volume injection of immiscible diluents has considerable potential for application in bioanalysis.

  6. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    Science.gov (United States)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed a large, alterable and well-controlled sampling volume, which kept the concentration of the gas target in the headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which made the LVCC sampling technique conveniently adaptable to subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits; trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The concentrations of ethylene and SO2 fluctuated only minimally during the entire LVCC sampling process, and recoveries of the gas targets from real samples were in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  7. Analysis of plant hormones by microemulsion electrokinetic capillary chromatography coupled with on-line large volume sample stacking.

    Science.gov (United States)

    Chen, Zongbao; Lin, Zian; Zhang, Lin; Cai, Yan; Zhang, Lan

    2012-04-07

    A novel method of microemulsion electrokinetic capillary chromatography (MEEKC) coupled with on-line large-volume sample stacking was developed for the analysis of six plant hormones: indole-3-acetic acid, indole-3-butyric acid, indole-3-propionic acid, 1-naphthaleneacetic acid, abscisic acid and salicylic acid. Baseline separation of the six plant hormones was achieved within 10 min using a microemulsion background electrolyte containing 97.2% (w/w) 10 mM borate buffer at pH 9.2, 1.0% (w/w) ethyl acetate as oil droplets, 0.6% (w/w) sodium dodecyl sulphate as surfactant and 1.2% (w/w) 1-butanol as cosurfactant. In addition, an on-line concentration method based on large-volume sample stacking with multiple-wavelength detection was adopted to improve the detection sensitivity for determining trace-level hormones in real samples. The optimized method provided about a 50-100-fold increase in detection sensitivity compared with MEEKC alone, and the detection limits (S/N = 3) were between 0.005 and 0.02 µg mL⁻¹. The proposed method was simple, rapid and sensitive, and could be applied to the determination of the six plant hormones in spiked water samples and tobacco leaves, and of 1-naphthylacetic acid in leaf fertilizer. The recoveries ranged from 76.0% to 119.1%, and good reproducibility was obtained, with relative standard deviations (RSDs) of less than 6.6%.
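The S/N = 3 detection limits quoted above follow the usual convention: the LOD is the concentration whose peak signal equals three times the baseline noise, estimated from the calibration slope. A minimal sketch, with invented noise and slope values (not from the paper):

```python
# Detection limit at S/N = 3: the lowest concentration giving a peak
# response three times the baseline noise, estimated from the
# calibration slope (response per unit concentration).

def lod_sn3(noise, slope):
    """LOD in the concentration units of the slope's denominator."""
    return 3.0 * noise / slope

# Hypothetical values: 0.5 mAU baseline noise, 300 mAU per (ug/mL)
print(lod_sn3(noise=0.5, slope=300.0))  # 0.005 ug/mL
```

Stacking raises the effective slope (more analyte loaded per unit concentration), which is why the same noise level yields a 50-100-fold lower LOD.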

  8. Effects of large volume injection of aliphatic alcohols as sample diluents on the retention of low hydrophobic solutes in reversed-phase liquid chromatography.

    Science.gov (United States)

    David, Victor; Galaon, Toma; Aboul-Enein, Hassan Y

    2014-01-03

    Recent studies showed that injection of large volumes of hydrophobic solvents used as sample diluents can be applied in reversed-phase liquid chromatography (RP-LC). This study reports systematic research on the influence of a series of aliphatic alcohols (from methanol to 1-octanol) on the retention process in RP-LC when large sample volumes are injected onto the column. Several model analytes with low hydrophobic character were studied under RP-LC conditions, with mobile phases containing methanol or acetonitrile as organic modifier in different proportions with the aqueous component. It was found that, starting with 1-butanol, the aliphatic alcohols can be used as sample solvents and injected in large volumes, although they may influence the retention factor and peak shape of the dissolved solutes. The retention factor of the studied analytes depends linearly on the injection volume of these alcohols, decreasing as the sample volume is increased. When injecting up to 200 µL of the higher alcohols, the retention process also depends on the content of the organic modifier (methanol or acetonitrile) in the mobile phase. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Water pollution screening by large-volume injection of aqueous samples and application to GC/MS analysis of a river Elbe sample

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, S.; Efer, J.; Engewald, W. [Leipzig Univ. (Germany). Inst. fuer Analytische Chemie]

    1997-03-01

    Large-volume sampling of aqueous samples in a programmed-temperature vaporizer (PTV) injector was used successfully for the target and non-target analysis of real samples. In this still rarely applied method, e.g., 1 mL of the water sample to be analyzed is slowly injected directly into the PTV. The vaporized water is eliminated through the split vent. The analytes are concentrated onto an adsorbent inside the insert and subsequently thermally desorbed. The capability of the method is demonstrated using a sample from the river Elbe. By coupling this method with a mass-selective detector in SIM mode (target analysis), pollutants can be determined at concentrations down to 0.01 µg/L. Furthermore, PTV enrichment is an effective and time-saving method for non-target analysis in SCAN mode. In a sample from the river Elbe, over 20 compounds were identified. (orig.)

  10. Large Sample Neutron Activation Analysis of Heterogeneous Samples

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Vasilopoulou, T.; Tzika, F.

    2018-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) technique was developed for non-destructive analysis of heterogeneous bulk samples. The technique combined collimated scanning with the joint use of experimental measurements and Monte Carlo simulations to identify inhomogeneities in large-volume samples and to correct for their effect on the interpretation of gamma-spectrometry data. Corrections were applied for neutron self-shielding, gamma-ray attenuation, the geometrical factor and the heterogeneous activity distribution within the sample. A benchmark experiment was performed to investigate the effect of heterogeneity on the accuracy of LSNAA. Moreover, a ceramic vase was analyzed as a whole, demonstrating the feasibility of the technique. The LSNAA results were compared against results obtained by INAA, and satisfactory agreement between the two methods was observed. This study showed that LSNAA is capable of accurate, non-destructive, multi-elemental compositional analysis of heterogeneous objects. It also revealed the great potential of the technique for the analysis of precious objects and artefacts that need to be preserved intact and cannot be damaged for sampling purposes. (author)

  11. Large Sample Neutron Activation Analysis: A Challenge in Cultural Heritage Studies

    International Nuclear Information System (INIS)

    Stamatelatos, I.E.; Tzika, F.

    2007-01-01

    Large sample neutron activation analysis complements and significantly extends the analytical tools available for cultural heritage and authentication studies, providing unique capabilities: non-destructive, multi-element analysis of materials that are too precious to damage for sampling purposes, representative sampling of heterogeneous materials, and even analysis of whole objects. In this work, correction factors for neutron self-shielding, gamma-ray attenuation and the volume distribution of activity in large-volume samples composed of iron and ceramic material were derived. Moreover, the effect of inhomogeneity on the accuracy of the technique was examined.

  12. Effect of NaOH on large-volume sample stacking of haloacetic acids in capillary zone electrophoresis with a low-pH buffer.

    Science.gov (United States)

    Tu, Chuanhong; Zhu, Lingyan; Ang, Chay Hoon; Lee, Hian Kee

    2003-06-01

    Large-volume sample stacking (LVSS) is an effective on-capillary sample concentration method in capillary zone electrophoresis that can be applied to samples in a low-conductivity matrix. NaOH solution is commonly used to back-extract acidic compounds from organic solvents during sample pretreatment. The effect of NaOH as the sample matrix on LVSS of haloacetic acids was investigated in this study. It was found that the presence of NaOH in the sample did not compromise, but rather helped, the stacking performance when a low-pH background electrolyte (BGE) was used. The sensitivity enhancement factor was higher than when the sample was dissolved in pure water or diluted BGE. Compared with conventional injection (0.4% of the capillary volume), a 97-120-fold sensitivity enhancement in terms of peak height was obtained, without deterioration of the separation, for an injection equal to 20% of the capillary volume. The method was applied to determine haloacetic acids in tap water in combination with liquid-liquid extraction and back-extraction into NaOH solution. Limits of detection at sub-ppb levels were obtained for real samples with direct UV detection.

  13. Solid phase extraction of large volume of water and beverage samples to improve detection limits for GC-MS analysis of bisphenol A and four other bisphenols.

    Science.gov (United States)

    Cao, Xu-Liang; Popovic, Svetlana

    2018-01-01

    Solid-phase extraction (SPE) of large volumes of water and beverage products was investigated for the GC-MS analysis of bisphenol A (BPA), bisphenol AF (BPAF), bisphenol F (BPF), bisphenol E (BPE), and bisphenol B (BPB). While absolute recoveries of the method were improved for water and some beverage products (e.g. diet cola, iced tea), breakthrough may also have occurred during SPE of 200 mL of other beverages (e.g. BPF in cola). Improvements in method detection limits were observed with the analysis of large sample volumes for all bisphenols, down to ppt (pg/g) and sub-ppt levels. This improvement was proportional to sample volume for water and beverage products with fewer interferences and lower noise levels around the analytes. Matrix effects and interferences were observed during SPE of larger volumes (100 and 200 mL) of the beverage products, and affected the accurate analysis of BPF. The improved method was used to analyse bisphenols in various beverage samples; only BPA was detected, at levels ranging from 0.022 to 0.030 ng/g for products in PET bottles and from 0.085 to 0.32 ng/g for products in cans.

  14. A low-volume cavity ring-down spectrometer for sample-limited applications

    Science.gov (United States)

    Stowasser, C.; Farinas, A. D.; Ware, J.; Wistisen, D. W.; Rella, C.; Wahl, E.; Crosson, E.; Blunier, T.

    2014-08-01

    In atmospheric and environmental sciences, optical spectrometers are used for the measurements of greenhouse gas mole fractions and the isotopic composition of water vapor or greenhouse gases. The large sample cell volumes (tens of milliliters to several liters) in commercially available spectrometers constrain the usefulness of such instruments for applications that are limited in sample size and/or need to track fast variations in the sample stream. In an effort to make spectrometers more suitable for sample-limited applications, we developed a low-volume analyzer capable of measuring mole fractions of methane and carbon monoxide based on a commercial cavity ring-down spectrometer. The instrument has a small sample cell (9.6 ml) and can selectively be operated at a sample cell pressure of 140, 45, or 20 Torr (effective internal volume of 1.8, 0.57, and 0.25 ml). We present the new sample cell design and the flow path configuration, which are optimized for small sample sizes. To quantify the spectrometer's usefulness for sample-limited applications, we determine the renewal rate of sample molecules within the low-volume spectrometer. Furthermore, we show that the performance of the low-volume spectrometer matches the performance of the standard commercial analyzers by investigating linearity, precision, and instrumental drift.
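The renewal rate of sample molecules mentioned above can be approximated with a well-mixed-cell model: at a steady flow Q, gas in an effective cell volume V is exchanged with an e-folding time τ = V/Q. Only the effective volumes (1.8, 0.57 and 0.25 ml at 140, 45 and 20 Torr) come from the abstract; the flow rate below is an assumed illustrative value, and flow is taken as measured at cell conditions.

```python
# E-folding renewal time of a well-mixed optical cell: tau = V_eff / Q.
# V_eff values (ml) are the effective internal volumes quoted for the
# 9.6 ml cell at each operating pressure; the 2 ml/min flow is assumed.

def renewal_time_s(v_eff_ml, flow_ml_per_min):
    """e-folding time (seconds) for sample exchange in the cell."""
    return 60.0 * v_eff_ml / flow_ml_per_min

for torr, v_eff in [(140, 1.8), (45, 0.57), (20, 0.25)]:
    tau = renewal_time_s(v_eff, flow_ml_per_min=2.0)
    print(f"{torr:>3} Torr: V_eff = {v_eff} ml, tau = {tau:.1f} s")
```

Lowering the cell pressure shrinks the effective volume and thus the renewal time, which is exactly why the low-pressure modes suit sample-limited, fast-varying streams.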

  15. Solid-Phase Extraction and Large-Volume Sample Stacking-Capillary Electrophoresis for Determination of Tetracycline Residues in Milk

    Directory of Open Access Journals (Sweden)

    Gabriela Islas

    2018-01-01

    Solid-phase extraction in combination with large-volume sample stacking-capillary electrophoresis (SPE-LVSS-CE) was applied to measure chlortetracycline, doxycycline, oxytetracycline, and tetracycline in milk samples. Under optimal conditions, the proposed method had a linear range of 29 to 200 µg·L−1, with limits of detection ranging from 18.6 to 23.8 µg·L−1 and inter- and intraday repeatabilities below 10% (as relative standard deviations) in all cases. The enrichment factors obtained ranged from 50.33 to 70.85 for all the tetracyclines (TCs) compared with conventional capillary zone electrophoresis (CZE). The method is adequate for analyzing tetracyclines below the most restrictive established maximum residue limits. It was employed in the analysis of 15 milk samples from different brands; two of the tested samples were positive for oxytetracycline, at concentrations of 95 and 126 µg·L−1. SPE-LVSS-CE is a robust, easy, and efficient strategy for the online preconcentration of tetracycline residues in complex matrices.
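Enrichment factors such as those reported (50.33 to 70.85 relative to conventional CZE) are commonly computed as the ratio of calibration sensitivities (slopes) of the preconcentration method and the conventional method. A minimal sketch with hypothetical peak responses; none of the numbers below are from the paper:

```python
# Enrichment factor (EF) of a stacking method relative to conventional
# injection: ratio of calibration slopes (sensitivities), assuming both
# calibrations cover comparable concentration ranges.

def slope(concs, responses):
    """Least-squares slope through the origin: sum(x*y) / sum(x*x)."""
    sxy = sum(c * r for c, r in zip(concs, responses))
    sxx = sum(c * c for c in concs)
    return sxy / sxx

def enrichment_factor(concs, resp_stacked, resp_conventional):
    return slope(concs, resp_stacked) / slope(concs, resp_conventional)

# Hypothetical peak areas for one tetracycline (ug/L vs arbitrary units)
concs = [50.0, 100.0, 200.0]
stacked = [3050.0, 6100.0, 12200.0]      # SPE-LVSS-CE
conventional = [61.0, 122.0, 244.0]      # plain CZE
print(enrichment_factor(concs, stacked, conventional))  # ratio of slopes, ~50
```

A slope ratio is preferred over a single-point peak ratio because it averages out injection-to-injection variability.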

  16. Assembly, operation and disassembly manual for the Battelle Large Volume Water Sampler (BLVWS)

    International Nuclear Information System (INIS)

    Thomas, V.W.; Campbell, R.M.

    1984-12-01

    Assembly, operation and disassembly of the Battelle Large Volume Water Sampler (BLVWS) are described in detail. Step-by-step instructions for assembly, general operation and disassembly are provided to allow an operator completely unfamiliar with the sampler to apply the BLVWS successfully to their research sampling needs. The sampler permits concentration of both particulate and dissolved radionuclides from large volumes of ocean and fresh water. The water sample passes through a filtration section for particle removal, then through sorption or ion-exchange beds where the species of interest are removed. The sampler components that contact the water being sampled are constructed of polyvinyl chloride (PVC). The sampler has been successfully applied to many sampling needs over the past fifteen years.

  17. Geophysics Under Pressure: Large-Volume Presses Versus the Diamond-Anvil Cell

    Science.gov (United States)

    Hazen, R. M.

    2002-05-01

    Prior to 1970, the legacy of Harvard physicist Percy Bridgman dominated high-pressure geophysics. Massive presses with large-volume devices, including piston-cylinder, opposed-anvil, and multi-anvil configurations, were widely used in both science and industry to achieve a range of crustal and upper-mantle temperatures and pressures. George Kennedy of UCLA was a particularly influential advocate of large-volume apparatus for geophysical research prior to his death in 1980. The high-pressure scene began to change in 1959 with the invention of the diamond-anvil cell, designed simultaneously and independently by John Jamieson at the University of Chicago and Alvin Van Valkenburg at the National Bureau of Standards in Washington, DC. The compact, inexpensive diamond cell achieved record static pressures and had the advantage of optical access to the high-pressure environment. Nevertheless, members of the geophysical community, who favored the substantial sample volumes, geothermally relevant temperature range, and satisfying bulk of large-volume presses, initially viewed the diamond cell with indifference or even contempt. Several factors led to a gradual shift in emphasis from large-volume presses to diamond-anvil cells in geophysical research during the 1960s and 1970s: (1) their relatively low cost at a time of fiscal restraint; (2) Alvin Van Valkenburg's new position as a Program Director at the National Science Foundation in 1964 (when George Kennedy's proposal for a National High-Pressure Laboratory was rejected); (3) the development of lasers and micro-analytical spectroscopic techniques suitable for analyzing samples in a diamond cell; and (4) the attainment of record pressures (e.g., 100 GPa in 1975 by Mao and Bell at the Geophysical Laboratory). Today, a more balanced collaborative approach has been adopted by the geophysics and mineral physics community. Many high-pressure laboratories operate a new generation of less expensive

  18. An open-flow pulse ionization chamber for alpha spectrometry of large-area samples

    International Nuclear Information System (INIS)

    Johansson, L.; Roos, B.; Samuelsson, C.

    1992-01-01

    The open-flow pulse ionization chamber presented here was developed to make alpha spectrometry of large-area surfaces easy. One side of the chamber is left open, where the sample is placed. The sample acts as a chamber wall and thereby defines the detector volume. The sample area can be as large as 400 cm². To prevent air from entering the volume, a constant gas flow passes through the detector, entering at the bottom of the chamber and leaking out at the sides of the sample. The method gives good energy resolution and has considerable applicability in retrospective radon research. The alpha spectra obtained in the retrospective measurements originate from ²¹⁰Po, built up in the sample from radon daughters recoiled into a glass surface. (au)

  19. High Energy Performance Tests of Large Volume LaBr₃:Ce Detector

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, A.A.; Gondal, M.A.; Khiari, F.Z.; Dastageer, M.A. [Department of Physics, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Maslehuddin, M.M. [Center for Engineering Research, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia); Al-Amoudi, O.S.B. [Department of Civil Engineering, King Fahd University of Petroleum and Minerals, Dhahran (Saudi Arabia)

    2015-07-01

    High-energy prompt gamma-ray tests of a large-volume cylindrical 100 mm × 100 mm (height × diameter) LaBr₃:Ce detector were carried out using a portable neutron generator-based Prompt Gamma Neutron Activation Analysis (PGNAA) setup. In this study, prompt gamma-ray yields were measured from water samples contaminated with toxic elements such as nickel, chromium and mercury compounds, with gamma-ray energies up to 10 MeV. The experimental yields of prompt gamma-rays from the toxic elements were compared with the results of Monte Carlo calculations. In spite of the higher intrinsic background due to its larger volume, excellent agreement between the experimental and calculated yields of high-energy gamma-rays from the Ni, Cr and Hg samples was achieved for the large-volume LaBr₃:Ce detector. (authors)

  20. The problem of large samples. An activation analysis study of electronic waste material

    International Nuclear Information System (INIS)

    Segebade, C.; Goerner, W.; Bode, P.

    2007-01-01

    Large-volume instrumental photon activation analysis (IPAA) was used for the investigation of shredded electronic waste material. Sample masses from 1 to 150 grams were analyzed to estimate the minimum sample size required for the results to be representative for a defined investigation task. Furthermore, the influence of irradiation and measurement parameters on the quality of the analytical results was studied. Finally, the analytical data obtained from IPAA and from instrumental neutron activation analysis (INAA), both carried out in a large-volume mode, were compared. Only part of the values were found to be in satisfactory agreement. (author)

  1. Volume independence in large Nc QCD-like gauge theories

    International Nuclear Information System (INIS)

    Kovtun, Pavel; Uensal, Mithat; Yaffe, Laurence G.

    2007-01-01

    Volume independence in large Nc gauge theories may be viewed as a generalized orbifold equivalence. The reduction to zero volume (or Eguchi-Kawai reduction) is a special case of this equivalence. So is temperature independence in confining phases. A natural generalization concerns volume independence in 'theory space' of quiver gauge theories. In pure Yang-Mills theory, the failure of volume independence for sufficiently small volumes (at weak coupling) due to spontaneous breaking of center symmetry, together with its validity above a critical size, nicely illustrates the symmetry realization conditions which are both necessary and sufficient for large Nc orbifold equivalence. The existence of a minimal size below which volume independence fails also applies to Yang-Mills theory with antisymmetric representation fermions [QCD(AS)]. However, in Yang-Mills theory with adjoint representation fermions [QCD(Adj)], endowed with periodic boundary conditions, volume independence remains valid down to arbitrarily small size. In sufficiently large volumes, QCD(Adj) and QCD(AS) have a large Nc 'orientifold' equivalence, provided charge conjugation symmetry is unbroken in the latter theory. Therefore, via a combined orbifold-orientifold mapping, a well-defined large Nc equivalence exists between QCD(AS) in large, or infinite, volume and QCD(Adj) in arbitrarily small volume. Since asymptotically free gauge theories, such as QCD(Adj), are much easier to study (analytically or numerically) in small volume, this equivalence should allow greater understanding of large Nc QCD in infinite volume.

  2. Capillary gas chromatographic analysis of nerve agents using large volume injections

    NARCIS (Netherlands)

    Degenhardt, C.E.A.M.; Kientz, C.E.

    1996-01-01

    The use of large-volume injections has been studied for the verification of intact organophosphorus chemical warfare agents in water samples. As the use of ethyl acetate caused severe detection problems, new potential solvents were evaluated. With the developed procedure, the nerve agents sarin,

  3. Rapid estimate of solid volume in large tuff cores using a gas pycnometer

    International Nuclear Information System (INIS)

    Thies, C.; Geddis, A.M.; Guzman, A.G.

    1996-09-01

    A thermally insulated, rigid-volume gas pycnometer system has been developed. The pycnometer chambers have been machined from solid PVC cylinders. Two chambers confine dry high-purity helium at different pressures. A thick-walled design ensures minimal heat exchange with the surrounding environment and a constant-volume system while expansion takes place between the chambers. The internal energy of the gas is assumed constant over the expansion. The ideal gas law is used to estimate the volume of solid material sealed in one of the chambers. Temperature is monitored continuously and incorporated into the calculation of solid volume. Temperature variation between measurements is less than 0.1 °C. The data are used to compute grain density for oven-dried Apache Leap tuff core samples. The measured volume of solid and the sample bulk volume are used to estimate porosity and bulk density. Intrinsic permeability was estimated from the porosity and measured pore surface area and is compared to in-situ measurements by the air permeability method. The gas pycnometer accommodates large core samples (0.25 m length × 0.11 m diameter) and can measure solid volumes greater than 2.20 cm³ with less than 1% error.
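The ideal-gas estimate described above can be sketched numerically. Assuming the sample chamber (empty volume V_c, containing solid volume V_s) is charged to gauge pressure P1 and then vented into the reference chamber (volume V_r), settling at gauge pressure P2, an isothermal ideal-gas balance yields V_s. All volumes and pressures below are illustrative, not values from the report:

```python
# Gas pycnometer: estimate solid (skeletal) volume from an isothermal
# helium expansion between two rigid chambers, using the ideal gas law.
# All pressures are gauge; temperature is assumed equal and constant.

def solid_volume(v_cell, v_ref, p1, p2):
    """Solid volume sealed in the sample chamber.

    v_cell : empty sample-chamber volume (cm^3)
    v_ref  : reference-chamber volume (cm^3)
    p1     : gauge pressure in the sample chamber before expansion
    p2     : gauge pressure after opening the valve to v_ref

    Isothermal ideal-gas balance:
        p1 * (v_cell - v_s) = p2 * (v_cell - v_s + v_ref)
    solved for v_s.
    """
    return v_cell - v_ref * p2 / (p1 - p2)

# Illustrative numbers: a 3000 cm^3 cell sized for a 0.25 m x 0.11 m
# core, with a 1000 cm^3 reference chamber.
v_s = solid_volume(v_cell=3000.0, v_ref=1000.0, p1=150.0, p2=100.0)
print(v_s)  # 1000.0 cm^3 of solid in the cell
```

Continuous temperature monitoring, as in the system described, lets a correction term replace the strict isothermal assumption used in this sketch.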

  4. Adsorption of transuranic elements from large volume sea water

    International Nuclear Information System (INIS)

    Holm, E.; Ballestra, S.

    1976-01-01

    Some years ago a sampler for concentrating radionuclides from large volumes of sea water was developed by Silker et al. of the Battelle Northwest Laboratories. They used pure Al2O3 as the adsorbent. The device has been applied successfully to the determination of 238Pu and 239Pu in several sea water samples. Our experience with an identical system for the determination of transuranics in Mediterranean sea water was not quite as satisfactory as we had hoped. The chemistry involved in leaching up to 1 kg of Al2O3 with acid, followed by removal of dissolved aluminium from the transuranic fraction, is rather tedious and time-consuming for routine use. The adsorption efficiency for transuranics, checked by dual-bed adsorption, did not give consistent results. However, since the principle of the device is attractive enough for handling large volume water samples, it was felt worthwhile to test other adsorbents which are easier to handle than Al2O3. For this purpose, chitosan and manganese dioxide were chosen and a series of experiments was conducted to examine the suitability of these materials as adsorbents in the system

  5. Rapid estimate of solid volume in large tuff cores using a gas pycnometer

    Energy Technology Data Exchange (ETDEWEB)

    Thies, C. [ed.]; Geddis, A.M.; Guzman, A.G. [and others]

    1996-09-01

    A thermally insulated, rigid-volume gas pycnometer system has been developed. The pycnometer chambers have been machined from solid PVC cylinders. Two chambers confine dry high-purity helium at different pressures. A thick-walled design ensures minimal heat exchange with the surrounding environment and a constant volume system while expansion takes place between the chambers. The internal energy of the gas is assumed constant over the expansion. The ideal gas law is used to estimate the volume of solid material sealed in one of the chambers. Temperature is monitored continuously and incorporated into the calculation of solid volume. Temperature variation between measurements is less than 0.1 degrees C. The data are used to compute grain density for oven-dried Apache Leap tuff core samples. The measured volume of solid and the sample bulk volume are used to estimate porosity and bulk density. Intrinsic permeability was estimated from the porosity and measured pore surface area and is compared to in-situ measurements by the air permeability method. The gas pycnometer accommodates large core samples (0.25 m length x 0.11 m diameter) and can measure solid volume greater than 2.20 cm^3 with less than 1% error.

  6. Sampling soils for 137Cs using various field-sampling volumes

    International Nuclear Information System (INIS)

    Nyhan, J.W.; Schofield, T.G.; White, G.C.; Trujillo, G.

    1981-10-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12,500-cm^3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils ranged from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2 to 4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor
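    The nested variance structure behind these optimization results can be illustrated with a toy calculation. The component model below is the standard one for a locations/aliquots/counts design; the variance values are made up, not taken from the study:

    ```python
    def var_of_mean(v_spatial, v_aliquot, v_count, n_loc, n_aliq, n_cnt):
        """Variance of the mean concentration for a nested sampling design.

        Spatial variance is reduced only by extra field samples; aliquoting
        variance by samples*aliquots; counting variance by
        samples*aliquots*counts per aliquot.
        """
        return (v_spatial / n_loc
                + v_aliquot / (n_loc * n_aliq)
                + v_count / (n_loc * n_aliq * n_cnt))

    # When the spatial component dominates (as observed at the Trinity site),
    # repeated counting of each aliquot buys almost nothing:
    base        = var_of_mean(9.0, 1.0, 0.5, n_loc=10, n_aliq=2, n_cnt=1)
    more_counts = var_of_mean(9.0, 1.0, 0.5, n_loc=10, n_aliq=2, n_cnt=4)
    more_sites  = var_of_mean(9.0, 1.0, 0.5, n_loc=20, n_aliq=2, n_cnt=1)
    ```

    Doubling the number of field samples halves the dominant spatial term, while quadrupling counts per aliquot trims only the smallest term, which is the logic behind the "count each aliquot once" conclusion.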

  7. A cache-friendly sampling strategy for texture-based volume rendering on GPU

    Directory of Open Access Journals (Sweden)

    Junpeng Wang

    2017-06-01

    Texture-based volume rendering is a memory-intensive algorithm. Its performance relies heavily on the performance of the texture cache. However, most existing texture-based volume rendering methods blindly map computational resources to texture memory, resulting in incoherent memory access patterns and low cache hit rates in certain cases. The distance between samples taken by threads of an atomic scheduling unit (e.g. a warp of 32 threads in CUDA) of the GPU is a crucial factor that affects texture cache performance. Based on this fact, we present a new sampling strategy, called Warp Marching, for the ray-casting algorithm of texture-based volume rendering. The effects of different sample organizations and different thread-pixel mappings in the ray-casting algorithm are thoroughly analyzed. In addition, a pipelined color blending approach is introduced, and warp-level GPU operations are leveraged to improve the efficiency of parallel execution on the GPU. The rendering performance of Warp Marching is view-independent, and it outperforms existing empty space skipping techniques in scenarios that require rendering large dynamic volumes at low image resolution. Through a series of micro-benchmarks and real-life data experiments, we rigorously analyze our sampling strategies and demonstrate significant performance enhancements over existing sampling methods.

  8. SparseLeap: Efficient Empty Space Skipping for Large-Scale Volume Rendering

    KAUST Repository

    Hadwiger, Markus; Al-Awami, Ali K.; Beyer, Johanna; Agus, Marco; Pfister, Hanspeter

    2017-01-01

    Recent advances in data acquisition produce volume data of very high resolution and large size, such as terabyte-sized microscopy volumes. These data often contain many fine and intricate structures, which pose huge challenges for volume rendering, and make it particularly important to efficiently skip empty space. This paper addresses two major challenges: (1) The complexity of large volumes containing fine structures often leads to highly fragmented space subdivisions that make empty regions hard to skip efficiently. (2) The classification of space into empty and non-empty regions changes frequently, because the user or the evaluation of an interactive query activate a different set of objects, which makes it unfeasible to pre-compute a well-adapted space subdivision. We describe the novel SparseLeap method for efficient empty space skipping in very large volumes, even around fine structures. The main performance characteristic of SparseLeap is that it moves the major cost of empty space skipping out of the ray-casting stage. We achieve this via a hybrid strategy that balances the computational load between determining empty ray segments in a rasterization (object-order) stage, and sampling non-empty volume data in the ray-casting (image-order) stage. Before ray-casting, we exploit the fast hardware rasterization of GPUs to create a ray segment list for each pixel, which identifies non-empty regions along the ray. The ray-casting stage then leaps over empty space without hierarchy traversal. Ray segment lists are created by rasterizing a set of fine-grained, view-independent bounding boxes. Frame coherence is exploited by re-using the same bounding boxes unless the set of active objects changes. We show that SparseLeap scales better to large, sparse data than standard octree empty space skipping.

  9. SparseLeap: Efficient Empty Space Skipping for Large-Scale Volume Rendering

    KAUST Repository

    Hadwiger, Markus

    2017-08-28

    Recent advances in data acquisition produce volume data of very high resolution and large size, such as terabyte-sized microscopy volumes. These data often contain many fine and intricate structures, which pose huge challenges for volume rendering, and make it particularly important to efficiently skip empty space. This paper addresses two major challenges: (1) The complexity of large volumes containing fine structures often leads to highly fragmented space subdivisions that make empty regions hard to skip efficiently. (2) The classification of space into empty and non-empty regions changes frequently, because the user or the evaluation of an interactive query activate a different set of objects, which makes it unfeasible to pre-compute a well-adapted space subdivision. We describe the novel SparseLeap method for efficient empty space skipping in very large volumes, even around fine structures. The main performance characteristic of SparseLeap is that it moves the major cost of empty space skipping out of the ray-casting stage. We achieve this via a hybrid strategy that balances the computational load between determining empty ray segments in a rasterization (object-order) stage, and sampling non-empty volume data in the ray-casting (image-order) stage. Before ray-casting, we exploit the fast hardware rasterization of GPUs to create a ray segment list for each pixel, which identifies non-empty regions along the ray. The ray-casting stage then leaps over empty space without hierarchy traversal. Ray segment lists are created by rasterizing a set of fine-grained, view-independent bounding boxes. Frame coherence is exploited by re-using the same bounding boxes unless the set of active objects changes. We show that SparseLeap scales better to large, sparse data than standard octree empty space skipping.

  10. Large-volume static compression using nano-polycrystalline diamond for opposed anvils in compact cells

    International Nuclear Information System (INIS)

    Okuchi, T; Sasaki, S; Ohno, Y; Osakabe, T; Odake, S; Kagi, H

    2010-01-01

    In order to extend the pressure regime of intrinsically low-sensitivity measurement methods, such as neutron scattering and NMR, the sample volume compressed in compact opposed-anvil cells needs to be significantly increased. We therefore conducted a series of experiments using two types of compact cells equipped with enforced loading mechanisms. Super-hard nano-polycrystalline diamond (NPD) anvils were carefully prepared for large-volume compression in these cells. These anvils are harder, larger and stronger than single-crystal diamond anvils, so they could play an ideal role in accepting the larger forces. Supported and unsupported anvil geometries were tested separately to evaluate this expectation. In spite of insufficient support to the anvils, pressures up to 14 GPa were generated for a sample volume of > 0.1 mm^3 without damaging the NPD anvils. These results demonstrate a large future potential for compact cells equipped with NPD anvils and an enforced loading mechanism.

  11. Brachytherapy dose-volume histogram computations using optimized stratified sampling methods

    International Nuclear Information System (INIS)

    Karouzakis, K.; Lahanas, M.; Milickovic, N.; Giannouli, S.; Baltas, D.; Zamboglou, N.

    2002-01-01

    A stratified sampling method for the efficient repeated computation of dose-volume histograms (DVHs) in brachytherapy is presented, as used for anatomy-based brachytherapy optimization methods. The aim of the method is to reduce the number of sampling points required for the calculation of DVHs for the body and the PTV. Quantities such as the conformity index COIN and COIN integrals are derived from the DVHs. This is achieved by using partially uniform distributed sampling points, with a density in each region obtained from a survey of the gradients or the variance of the dose distribution in these regions. The shape of the sampling regions is adapted to the patient anatomy and the shape and size of the implant. The application of this method requires a single preprocessing step taking only a few seconds. Ten clinical implants were used to study the appropriate number of sampling points, given a required accuracy for quantities such as cumulative DVHs, COIN indices and COIN integrals. We found that DVHs of very large tissue volumes surrounding the PTV, as well as COIN distributions, can be obtained using 5-10 times fewer sampling points than with uniformly distributed points
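    The idea of region-dependent sampling density can be sketched as follows. This is a toy stand-in: two invented strata whose importance weights play the role of the gradient/variance survey, with per-stratum fractions recombined using the true volume weights so the uneven allocation does not bias the DVH estimate:

    ```python
    import random

    def stratified_dvh(strata, total_points, thresholds, seed=0):
        """Stratified estimate of a cumulative DVH.

        strata: list of (volume_fraction, importance, sampler); sampler(rng)
        draws a dose from that region. Points are allocated proportionally to
        volume_fraction * importance, then per-stratum survival fractions are
        recombined with the true volume fractions.
        """
        rng = random.Random(seed)
        weights = [vf * imp for vf, imp, _ in strata]
        total_w = sum(weights)
        dvh = [0.0] * len(thresholds)
        for (vf, _imp, sampler), w in zip(strata, weights):
            n_k = max(1, round(total_points * w / total_w))
            doses = [sampler(rng) for _ in range(n_k)]
            for i, t in enumerate(thresholds):
                dvh[i] += vf * sum(d >= t for d in doses) / n_k
        return dvh

    # A high-gradient PTV-like stratum (20% of volume) gets most of the points;
    # the nearly flat background stratum gets few:
    strata = [
        (0.2, 10.0, lambda r: r.uniform(50.0, 70.0)),  # steep-dose region
        (0.8, 1.0,  lambda r: r.uniform(0.0, 10.0)),   # flat background
    ]
    dvh = stratified_dvh(strata, 1000, thresholds=[5.0, 50.0])
    ```

    Here dvh[1] (volume fraction receiving at least 50) is exactly the PTV-like stratum's 0.2, even though the background was sampled sparsely.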

  12. Large volume cryogenic silicon detectors

    International Nuclear Information System (INIS)

    Braggio, C.; Boscardin, M.; Bressi, G.; Carugno, G.; Corti, D.; Galeazzi, G.; Zorzi, N.

    2009-01-01

    We present preliminary measurements for the development of a large volume silicon detector to detect low energy, low rate energy depositions. The tested detector is a 1 cm-thick silicon PIN diode with an active volume of 31 cm^3, cooled to liquid helium temperature to obtain depletion from thermally generated free carriers. A thorough study shows that the effect of charge trapping during drift disappears at a bias field of the order of 100 V/cm.

  13. Large volume cryogenic silicon detectors

    Energy Technology Data Exchange (ETDEWEB)

    Braggio, C. [Dipartimento di Fisica, Universita di Padova, via Marzolo 8, 35131 Padova (Italy); Boscardin, M. [Fondazione Bruno Kessler (FBK), via Sommarive 18, I-38100 Povo (Italy); Bressi, G. [INFN sez. di Pavia, via Bassi 6, 27100 Pavia (Italy); Carugno, G.; Corti, D. [INFN sez. di Padova, via Marzolo 8, 35131 Padova (Italy); Galeazzi, G. [INFN lab. naz. Legnaro, viale dell' Universita 2, 35020 Legnaro (Italy); Zorzi, N. [Fondazione Bruno Kessler (FBK), via Sommarive 18, I-38100 Povo (Italy)

    2009-12-15

    We present preliminary measurements for the development of a large volume silicon detector to detect low energy, low rate energy depositions. The tested detector is a 1 cm-thick silicon PIN diode with an active volume of 31 cm^3, cooled to liquid helium temperature to obtain depletion from thermally generated free carriers. A thorough study shows that the effect of charge trapping during drift disappears at a bias field of the order of 100 V/cm.

  14. Capillary gas chromatographic analysis of nerve agents using large volume injections. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Deinum, T.; Nieuwenhuy, C.

    1994-11-01

    The procedure developed at TNO-Prins Maurits Laboratory (TNO-PML) for the verification of intact organophosphorus chemical warfare agents in water samples was improved. The last step in this procedure, the laborious and non-reproducible transfer of an ethyl acetate extract onto a Tenax adsorption tube followed by thermal desorption of the tube, was replaced by large volume injection of the extract onto a capillary gas chromatographic system. The parameters controlling the injection of a large extract volume (200 µl) were investigated and optimized. As ethyl acetate caused severe problems, potential new solvents were evaluated. With the improved procedure, the nerve agents sarin, tabun, soman, diisopropyl fluorophosphate (DFP) and VX could be determined in freshly prepared water samples at pg/ml (ppt) levels. The fate of the nerve agents under study in water at two pH values (4.8 and 6) was investigated. For VX, the pH should be adjusted before extraction. Moreover, it is worthwhile to acidify water samples to diminish hydrolysis.

  15. Effects of uncertainty in model predictions of individual tree volume on large area volume estimates

    Science.gov (United States)

    Ronald E. McRoberts; James A. Westfall

    2014-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding model predictions of volumes for individual trees. However, the uncertainty in the model predictions is generally ignored with the result that the precision of the large area volume estimates is overestimated. The primary study objective was to estimate the effects of model...

  16. TRU Waste Sampling Program: Volume I. Waste characterization

    International Nuclear Information System (INIS)

    Clements, T.L. Jr.; Kudera, D.E.

    1985-09-01

    Volume I of the TRU Waste Sampling Program report presents the waste characterization information obtained from sampling and characterizing various aged transuranic waste retrieved from storage at the Idaho National Engineering Laboratory and the Los Alamos National Laboratory. The data contained in this report include the results of gas sampling and gas generation, radiographic examinations, waste visual examination results, and waste compliance with the Waste Isolation Pilot Plant-Waste Acceptance Criteria (WIPP-WAC). A separate report, Volume II, contains data from the gas generation studies

  17. Pulsed Direct Current Electrospray: Enabling Systematic Analysis of Small Volume Sample by Boosting Sample Economy.

    Science.gov (United States)

    Wei, Zhenwei; Xiong, Xingchuang; Guo, Chengan; Si, Xingyu; Zhao, Yaoyao; He, Muyi; Yang, Chengdui; Xu, Wei; Tang, Fei; Fang, Xiang; Zhang, Sichun; Zhang, Xinrong

    2015-11-17

    We developed pulsed direct current electrospray ionization mass spectrometry (pulsed-dc-ESI-MS) for systematically profiling and determining components in small volume samples. Pulsed-dc-ESI uses a constant high voltage to remotely induce the generation of single-polarity pulsed electrospray. This method significantly boosts sample economy, yielding several minutes of MS signal duration from a sample of merely picoliter volume. The elongated MS signal duration enables the collection of abundant MS(2) information on components of interest in a small volume sample for systematic analysis. The method was successfully applied to single cell metabolomics analysis. We obtained 2-D profiles of metabolites (including exact mass and MS(2) data) from single plant and mammalian cells, covering 1034 components for Allium cepa and 656 components for HeLa cells, respectively. Further identification found 162 compounds and 28 different modification groups of 141 saccharides in a single Allium cepa cell, indicating that pulsed-dc-ESI is a powerful tool for the systematic analysis of small volume samples.

  18. A propidium monoazide–quantitative PCR method for the detection and quantification of viable Enterococcus faecalis in large-volume samples of marine waters

    KAUST Repository

    Salam, Khaled W.; El-Fadel, Mutasem E.; Barbour, Elie K.; Saikaly, Pascal

    2014-01-01

    The development of rapid detection assays of cell viability is essential for monitoring the microbiological quality of water systems. Coupling propidium monoazide with quantitative PCR (PMA-qPCR) has been successfully applied in different studies for the detection and quantification of viable cells in small-volume samples (0.25-1.00 mL), but it has not been evaluated sufficiently in marine environments or in large-volume samples. In this study, we successfully integrated blue light-emitting diodes for photoactivating PMA and membrane filtration into the PMA-qPCR assay for the rapid detection and quantification of viable Enterococcus faecalis cells in 10-mL samples of marine waters. The assay was optimized in phosphate-buffered saline and seawater, reducing the qPCR signal of heat-killed E. faecalis cells by 4 log10 and 3 log10 units, respectively. Results suggest that a high total dissolved solids concentration (32 g/L) in seawater can reduce PMA activity. Optimal PMA-qPCR standard curves with a 6-log dynamic range and a detection limit of 10^2 cells/mL were generated for quantifying viable E. faecalis cells in marine waters. The developed assay was compared with the standard membrane filter (MF) method by quantifying viable E. faecalis cells in seawater samples exposed to solar radiation. The results of the developed PMA-qPCR assay did not match those of the standard MF method. This difference reflects the different physiological states of E. faecalis cells in seawater. In conclusion, the developed assay is a rapid (∼5 h) method for the quantification of viable E. faecalis cells in marine recreational waters, which should be further improved and tested in different seawater settings. © 2014 Springer-Verlag Berlin Heidelberg.
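    The standard-curve quantification step common to qPCR assays like this one can be sketched with a small least-squares fit. The Cq values, slope and intercept below are illustrative textbook numbers (a slope of -3.32 corresponds to 100% amplification efficiency), not data from the study:

    ```python
    def fit_standard_curve(log10_conc, cq):
        """Least-squares fit of Cq = intercept + slope * log10(concentration),
        the usual form of a qPCR standard curve."""
        n = len(cq)
        mx = sum(log10_conc) / n
        my = sum(cq) / n
        sxx = sum((x - mx) ** 2 for x in log10_conc)
        sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, cq))
        slope = sxy / sxx
        return my - slope * mx, slope

    def quantify(cq, intercept, slope):
        """Invert the standard curve to get concentration (cells/mL) from Cq."""
        return 10 ** ((cq - intercept) / slope)

    # A 6-log dynamic range of standards (10^2..10^7 cells/mL) with an ideal
    # slope of -3.32:
    logs = [2, 3, 4, 5, 6, 7]
    cqs = [38.0 - 3.32 * (x - 2) for x in logs]
    intercept, slope = fit_standard_curve(logs, cqs)
    ```

    An unknown sample's Cq is then substituted into quantify() to read off viable cells/mL within the curve's dynamic range.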

  19. A propidium monoazide–quantitative PCR method for the detection and quantification of viable Enterococcus faecalis in large-volume samples of marine waters

    KAUST Repository

    Salam, Khaled W.

    2014-08-23

    The development of rapid detection assays of cell viability is essential for monitoring the microbiological quality of water systems. Coupling propidium monoazide with quantitative PCR (PMA-qPCR) has been successfully applied in different studies for the detection and quantification of viable cells in small-volume samples (0.25-1.00 mL), but it has not been evaluated sufficiently in marine environments or in large-volume samples. In this study, we successfully integrated blue light-emitting diodes for photoactivating PMA and membrane filtration into the PMA-qPCR assay for the rapid detection and quantification of viable Enterococcus faecalis cells in 10-mL samples of marine waters. The assay was optimized in phosphate-buffered saline and seawater, reducing the qPCR signal of heat-killed E. faecalis cells by 4 log10 and 3 log10 units, respectively. Results suggest that a high total dissolved solids concentration (32 g/L) in seawater can reduce PMA activity. Optimal PMA-qPCR standard curves with a 6-log dynamic range and a detection limit of 10^2 cells/mL were generated for quantifying viable E. faecalis cells in marine waters. The developed assay was compared with the standard membrane filter (MF) method by quantifying viable E. faecalis cells in seawater samples exposed to solar radiation. The results of the developed PMA-qPCR assay did not match those of the standard MF method. This difference reflects the different physiological states of E. faecalis cells in seawater. In conclusion, the developed assay is a rapid (∼5 h) method for the quantification of viable E. faecalis cells in marine recreational waters, which should be further improved and tested in different seawater settings. © 2014 Springer-Verlag Berlin Heidelberg.

  20. An experimental study on the excitation of large volume airguns in a small volume body of water

    International Nuclear Information System (INIS)

    Wang, Baoshan; Yang, Wei; Yuan, Songyong; Ge, Hongkui; Chen, Yong; Guo, Shijun; Xu, Ping

    2010-01-01

    A large volume airgun array is effective in generating seismic waves and is extensively used in large bodies of water such as oceans, lakes and reservoirs. So far, the application of large volume airguns has been limited by the distribution of large bodies of water. This paper reports an attempt to utilize large volume airguns in a small body of water as a seismic source for seismotectonic studies. We carried out a field experiment in Mapaoquan pond, Fangshan district, Beijing, during the period 25-30 May 2009. Bolt LL1500 airguns, each with a volume of 2000 in^3, the largest commercial airguns available today, were used in this experiment. We tested the excitation of the airgun array with one or two guns. The airgun array was placed 7-11 m below the water's surface. The near- and far-field seismic motions induced by the airgun source were recorded by a 100 km long seismic profile composed of 16 portable seismometers and a 100 m long strong-motion seismograph profile, respectively. The following conclusions can be drawn from this experiment. First, it is feasible to excite large volume airguns in a small body of water. Second, seismic signals from a single shot of one airgun can be recognized at offsets up to 15 km. Taking advantage of high source repeatability, we stacked records from 128 shots to enhance the signal-to-noise ratio, and direct P-waves can be easily identified at an offset of ∼50 km in the stacked records. Third, no detectable damage to fish or near-field constructions was caused by the airgun shots. These results suggest that large volume airguns excited in small bodies of water can be used as a routinely operated seismic source for mid-scale (tens of kilometres) subsurface exploration and monitoring under various running conditions
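    The gain from stacking 128 repeatable shots can be illustrated numerically: averaging N aligned records preserves the coherent signal while suppressing uncorrelated noise by roughly 1/sqrt(N). A toy simulation with invented amplitudes, not the experiment's data:

    ```python
    import math
    import random

    def stack(traces):
        """Sample-wise average of repeated, aligned shot records."""
        n = len(traces)
        return [sum(col) / n for col in zip(*traces)]

    rng = random.Random(1)
    signal = [math.sin(2 * math.pi * k / 50) for k in range(500)]  # coherent part

    def one_shot():
        # Each shot = the same source signal plus independent Gaussian noise
        return [s + rng.gauss(0.0, 3.0) for s in signal]

    stacked = stack([one_shot() for _ in range(128)])
    residual = [a - b for a, b in zip(stacked, signal)]
    noise_rms = math.sqrt(sum(r * r for r in residual) / len(residual))
    # Per-shot noise rms is 3.0; after 128 stacks it falls to about
    # 3/sqrt(128), i.e. roughly 0.27, pulling a buried arrival above the noise.
    ```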

  1. Neutron multicounter detector for investigation of content and spatial distribution of fission materials in large volume samples

    International Nuclear Information System (INIS)

    Swiderska-Kowalczyk, M.; Starosta, W.; Zoltowski, T.

    1998-01-01

    The experimental device is a neutron coincidence well counter. It can be applied for passive assay of fissile materials, especially plutonium-bearing materials. It consists of a set of 3He tubes placed inside a polyethylene moderator; outputs from the tubes are first processed by preamplifier/amplifier/discriminator circuits and then analysed using a neutron correlator connected to a PC, with correlation techniques implemented in software. Such a neutron counter allows determination of plutonium mass (240Pu effective mass) in nonmultiplying samples of fairly large volume (up to 0.14 m^3). For determination of the neutron source distribution inside the sample, heuristic methods based on hierarchical cluster analysis are applied. As input parameters, the amplitudes and phases of the two-dimensional Fourier transform of the count-profile matrices are taken, both for known point-source distributions and for the examined samples. Such matrices are collected by scanning the sample with the detection head. During the clustering process, count profiles for unknown samples are fitted into dendrograms using the 'proximity' of the examined sample profile to the standard sample profiles. The distribution of neutron sources in an examined sample is then evaluated by comparison with standard source distributions. (author)
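    The final 'proximity' assignment step can be sketched as a nearest-profile match. The example below uses plain Euclidean distance on raw count profiles as a simple stand-in; the actual method builds its feature vectors from 2-D Fourier amplitudes and phases and clusters them hierarchically, and the profiles here are invented:

    ```python
    def nearest_standard(sample_profile, standards):
        """Name of the standard count profile closest to the sample's profile."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return min(standards, key=lambda name: dist(sample_profile, standards[name]))

    # Scanning count profiles for two known point-source positions:
    standards = {
        "source centered": [1.0, 3.0, 1.0],
        "source at edge":  [3.0, 1.0, 1.0],
    }
    match = nearest_standard([1.1, 2.8, 1.0], standards)
    ```

    The measured profile [1.1, 2.8, 1.0] sits closest to the "source centered" standard, so that distribution would be reported for the sample.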

  2. Improvement of 137Cs analysis in small volume seawater samples using the Ogoya underground facility

    International Nuclear Information System (INIS)

    Hirose, K.; Komura, K.; Kanazawa University, Ishikawa; Aoyama, M.; Igarashi, Y.

    2008-01-01

    137Cs in seawater is one of the most powerful tracers of water motion. Large sample volumes have traditionally been required for the determination of 137Cs in seawater. This paper describes improvements to the separation and purification processes for 137Cs in seawater, including purification of 137Cs using hexachloroplatinic acid in addition to ammonium phosphomolybdate (AMP) precipitation. As a result, we succeeded in determining 137Cs in seawater with a smaller sample volume of 10 liters by using ultra-low background gamma-spectrometry in the Ogoya underground facility. The 137Cs detection limit was about 0.1 mBq (counting time: 10^6 s). This method is applied to determine 137Cs in small samples of South Pacific deep waters. (author)

  3. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

    The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank with the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of RRP. The solution volume in a tank is determined by substituting the solution level into a calibration function obtained in advance, which expresses the relation between the solution level and the solution volume in the tank. Precise solution volume measurement therefore requires a carefully determined, precise calibration function. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
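    The substitution step described here can be sketched with a simple fitted calibration function. A single straight line stands in for the segmented or higher-order functions used in real tank calibration, and the level/volume pairs below are invented, not LASTAC data:

    ```python
    def fit_calibration(levels, volumes):
        """Least-squares straight-line calibration: volume = a + b * level."""
        n = len(levels)
        mx = sum(levels) / n
        my = sum(volumes) / n
        b = sum((x - mx) * (y - my) for x, y in zip(levels, volumes)) / \
            sum((x - mx) ** 2 for x in levels)
        return my - b * mx, b

    # Calibration runs: known water additions vs. measured bubbler-probe level.
    levels  = [10.0, 20.0, 30.0, 40.0]      # cm
    volumes = [205.0, 402.0, 601.0, 799.0]  # liters
    a, b = fit_calibration(levels, volumes)

    # Accountancy measurement: substitute the observed level into the function.
    measured_level = 25.0
    solution_volume = a + b * measured_level
    ```

    The measurement precision then rests entirely on how carefully the calibration runs pin down a and b, which is the point the abstract makes.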

  4. Rugged Large Volume Injection for Sensitive Capillary LC-MS Environmental Monitoring

    Directory of Open Access Journals (Sweden)

    Hanne Roberg-Larsen

    2017-08-01

    A rugged, high-throughput capillary column (cLC) LC-MS switching platform using large volume injection and on-line automatic filtration and filter back-flush (AFFL) solid phase extraction (SPE), for the analysis of environmental water samples with minimal sample preparation, is presented. Although narrow columns and on-line sample preparation are used in the platform, high ruggedness is achieved: e.g., injection of 100 non-filtrated water samples did not result in a pressure rise or clogging of the SPE/capillary columns (inner diameter 300 μm). Satisfactory retention time stability and chromatographic resolution were also features of the system. The potential of the platform for environmental water samples was demonstrated with various pharmaceutical products, which had detection limits (LOD) in the 0.05-12.5 ng/L range. Between-day and within-day repeatabilities of selected analytes were <20% RSD.

  5. Large volume axionic Swiss cheese inflation

    Science.gov (United States)

    Misra, Aalok; Shukla, Pramod

    2008-09-01

    Continuing with the ideas of (Section 4 of) [A. Misra, P. Shukla, Moduli stabilization, large-volume dS minimum without anti-D3-branes, (non-)supersymmetric black hole attractors and two-parameter Swiss cheese Calabi-Yau's, arXiv: 0707.0105 [hep-th], Nucl. Phys. B, in press], after inclusion of perturbative and non-perturbative α′ corrections to the Kähler potential and (D1- and D3-) instanton generated superpotential, we show the possibility of slow roll axionic inflation in the large volume limit of Swiss cheese Calabi-Yau orientifold compactifications of type IIB string theory. We also include one- and two-loop corrections to the Kähler potential but find the same to be subdominant to the (perturbative and non-perturbative) α′ corrections. The NS-NS axions provide a flat direction for slow roll inflation to proceed from a saddle point to the nearest dS minimum.

  6. Large volume axionic Swiss cheese inflation

    International Nuclear Information System (INIS)

    Misra, Aalok; Shukla, Pramod

    2008-01-01

    Continuing with the ideas of (Section 4 of) [A. Misra, P. Shukla, Moduli stabilization, large-volume dS minimum without anti-D3-branes, (non-)supersymmetric black hole attractors and two-parameter Swiss cheese Calabi-Yau's, (arXiv: 0707.0105 [hep-th]), Nucl. Phys. B, in press], after inclusion of perturbative and non-perturbative α′ corrections to the Kaehler potential and (D1- and D3-) instanton generated superpotential, we show the possibility of slow roll axionic inflation in the large volume limit of Swiss cheese Calabi-Yau orientifold compactifications of type IIB string theory. We also include one- and two-loop corrections to the Kaehler potential but find the same to be subdominant to the (perturbative and non-perturbative) α′ corrections. The NS-NS axions provide a flat direction for slow roll inflation to proceed from a saddle point to the nearest dS minimum.

  7. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    Science.gov (United States)

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500-, and 12,500-cm³ field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2–4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
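
    The variance-component reasoning above can be sketched numerically: in a nested design, the variance of the mean splits into spatial, aliquoting, and counting terms, and extra counts per aliquot only shrink the smallest term. A minimal sketch (the variance components and design sizes below are illustrative, not the paper's values):

```python
def var_of_mean(s2_spatial, s2_aliquot, s2_count, n_plots, n_aliquots, n_counts):
    """Variance of the mean for a nested design: plots > aliquots > counts."""
    return (s2_spatial / n_plots
            + s2_aliquot / (n_plots * n_aliquots)
            + s2_count / (n_plots * n_aliquots * n_counts))

# Spatial variance dominates, as observed at the Trinity site.
base = var_of_mean(9.0, 1.0, 0.5, n_plots=10, n_aliquots=4, n_counts=1)
more_counts = var_of_mean(9.0, 1.0, 0.5, n_plots=10, n_aliquots=4, n_counts=5)
# Counting each aliquot 5x instead of once barely changes the variance,
# consistent with the "count each aliquot once" conclusion.
```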

  8. Large sample neutron activation analysis: establishment at CDTN/CNEN, Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Angela de B.C., E-mail: menezes@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Jacimovic, Radojko, E-mail: radojko.jacimovic@ijs.s [Jozef Stefan Institute, Ljubljana (Slovenia). Dept. of Environmental Sciences. Group for Radiochemistry and Radioecology

    2011-07-01

    In order to improve the application of the neutron activation technique at CDTN/CNEN, large sample instrumental neutron activation analysis (LS-INAA) is being established under the IAEA BRA 14798 and FAPEMIG APQ-01259-09 projects. This procedure usually requires special facilities for the activation as well as for the detection. However, the TRIGA Mark I IPR-R1 reactor at CDTN/CNEN has not been adapted for such irradiations, and the usual gamma spectrometry has been carried out. To start the establishment of the LS-INAA, a 5 g sample of the IAEA/Soil 7 reference material was analyzed by the k0-standardized method. This paper is about the detector efficiency over the volume source using KayWin v2.23 and ANGLE V3.0 software. (author)

  9. Large Volume, Behaviorally-relevant Illumination for Optogenetics in Non-human Primates.

    Science.gov (United States)

    Acker, Leah C; Pino, Erica N; Boyden, Edward S; Desimone, Robert

    2017-10-03

    This protocol describes a large-volume illuminator, which was developed for optogenetic manipulations in the non-human primate brain. The illuminator is a modified plastic optical fiber with an etched tip, such that the light-emitting surface area is >100× that of a conventional fiber. In addition to describing the construction of the large-volume illuminator, this protocol details the quality-control calibration used to ensure even light distribution. Further, this protocol describes techniques for inserting and removing the large-volume illuminator. Both superficial and deep structures may be illuminated. This large-volume illuminator does not need to be physically coupled to an electrode, and because the illuminator is made of plastic, not glass, it will simply bend in circumstances when traditional optical fibers would shatter. Because this illuminator delivers light over behaviorally-relevant tissue volumes (≈10 mm³) with no greater penetration damage than a conventional optical fiber, it facilitates behavioral studies using optogenetics in non-human primates.

  10. Large sample neutron activation analysis of a reference inhomogeneous sample

    International Nuclear Information System (INIS)

    Vasilopoulou, T.; Athens National Technical University, Athens; Tzika, F.; Stamatelatos, I.E.; Koster-Ammerlaan, M.J.J.

    2011-01-01

    A benchmark experiment was performed for Neutron Activation Analysis (NAA) of a large inhomogeneous sample. The reference sample was developed in-house and consisted of a SiO₂ matrix and an Al-Zn alloy 'inhomogeneity' body. Monte Carlo simulations were employed to derive appropriate correction factors for neutron self-shielding during irradiation as well as self-attenuation of gamma rays and sample geometry during counting. The large sample neutron activation analysis (LSNAA) results were compared against reference values and the trueness of the technique was evaluated. An agreement within ±10% was observed between LSNAA and reference elemental mass values, for all matrix and inhomogeneity elements except samarium, provided that the inhomogeneity body was fully simulated. However, in cases where the inhomogeneity was treated as unknown, the results showed a reasonable agreement for most matrix elements, while large discrepancies were observed for the inhomogeneity elements. This study provided a quantification of the uncertainties associated with inhomogeneity in large sample analysis and contributed to the identification of the needs for future development of LSNAA facilities for analysis of inhomogeneous samples. (author)
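
    The correction chain described above reduces to a one-line calculation; a hedged sketch, assuming the Monte Carlo factors are defined as transmission-type fractions (< 1) that divide the raw result (the numbers are invented for illustration):

```python
def corrected_mass(raw_mass_mg, f_neutron, f_gamma, f_geometry):
    """Correct a raw LSNAA elemental mass for neutron self-shielding,
    gamma-ray self-attenuation, and counting-geometry effects."""
    return raw_mass_mg / (f_neutron * f_gamma * f_geometry)

# Illustrative factors for a large sample; real values come from simulation.
m = corrected_mass(9.0, f_neutron=0.95, f_gamma=0.90, f_geometry=1.00)
```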

  11. Fast concentration of dissolved forms of cesium radioisotopes from large seawater samples

    International Nuclear Information System (INIS)

    Jan Kamenik; Henrieta Dulaiova; Ferdinand Sebesta; Kamila St'astna; Czech Technical University, Prague

    2013-01-01

    The method developed for cesium concentration from large freshwater samples was tested and adapted for analysis of cesium radionuclides in seawater. Concentration of dissolved forms of cesium in large seawater samples (about 100 L) was performed using the composite absorbers AMP-PAN and KNiFC-PAN, with ammonium molybdophosphate and potassium–nickel hexacyanoferrate(II) as active components, respectively, and polyacrylonitrile as a binding polymer. A specially designed chromatography column with a bed volume (BV) of 25 mL allowed fast flow rates of seawater (up to 1,200 BV h⁻¹). The recovery yields were determined by ICP-MS analysis of stable cesium added to the seawater sample. Both absorbers proved usable for cesium concentration from large seawater samples. The KNiFC-PAN material was slightly more effective in cesium concentration from acidified seawater (recovery yield around 93% for 700 BV h⁻¹). This material showed similar efficiency in cesium concentration from natural seawater. The activity concentrations of 137Cs determined in seawater from the central Pacific Ocean were 1.5 ± 0.1 and 1.4 ± 0.1 Bq m⁻³ for an offshore (January 2012) and a coastal (February 2012) locality, respectively; 134Cs activities were below the detection limit. (author)
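
    The recovery-yield correction works as in this sketch: stable cesium added as a tracer is re-measured by ICP-MS, and the gamma-counting result is divided by the recovered fraction. Function names and all numbers are illustrative, not the paper's values:

```python
def recovery_yield(cs_added_ug, cs_recovered_ug):
    """Chemical recovery determined from the stable-Cs tracer by ICP-MS."""
    return cs_recovered_ug / cs_added_ug

def activity_conc_bq_m3(net_counts, live_time_s, efficiency, emission_prob,
                        recovery, volume_m3):
    """137Cs activity concentration corrected for recovery and counting."""
    return net_counts / (live_time_s * efficiency * emission_prob
                         * recovery * volume_m3)

y = recovery_yield(100.0, 93.0)                  # ~0.93, as for KNiFC-PAN
c = activity_conc_bq_m3(1000, 10000, 0.05, 0.851, y, 0.1)
```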

  12. Higgs, moduli problem, baryogenesis and large volume compactifications

    Energy Technology Data Exchange (ETDEWEB)

    Higaki, Tetsutaro [RIKEN Nishina Center, Saitama (Japan). Mathematical Physics Lab.; Kamada, Kohei [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Takahashi, Fuminobu [Tohoku Univ., Sendai (Japan). Dept. of Physics

    2012-07-15

    We consider the cosmological moduli problem in the context of high-scale supersymmetry breaking suggested by the recent discovery of the standard-model-like Higgs boson. In order to solve the notorious moduli-induced gravitino problem, we focus on the LARGE volume scenario, in which the modulus decay into gravitinos can be kinematically forbidden. We then consider the Affleck-Dine mechanism with or without an enhanced coupling with the inflaton, taking account of possible Q-ball formation. We show that the baryon asymmetry of the present Universe can be generated by the Affleck-Dine mechanism in the LARGE volume scenario, solving the moduli and gravitino problems.

  13. Higgs, moduli problem, baryogenesis and large volume compactifications

    International Nuclear Information System (INIS)

    Higaki, Tetsutaro; Takahashi, Fuminobu

    2012-07-01

    We consider the cosmological moduli problem in the context of high-scale supersymmetry breaking suggested by the recent discovery of the standard-model-like Higgs boson. In order to solve the notorious moduli-induced gravitino problem, we focus on the LARGE volume scenario, in which the modulus decay into gravitinos can be kinematically forbidden. We then consider the Affleck-Dine mechanism with or without an enhanced coupling with the inflaton, taking account of possible Q-ball formation. We show that the baryon asymmetry of the present Universe can be generated by the Affleck-Dine mechanism in the LARGE volume scenario, solving the moduli and gravitino problems.

  14. Validation Of Intermediate Large Sample Analysis (With Sizes Up to 100 G) and Associated Facility Improvement

    International Nuclear Information System (INIS)

    Bode, P.; Koster-Ammerlaan, M.J.J.

    2018-01-01

    Pragmatic rather than physical correction factors for neutron and gamma-ray shielding were studied for samples of intermediate size, i.e., in the 10–100 gram range. It was found that for most biological and geological materials, the neutron self-shielding is less than 5% and the gamma-ray self-attenuation can easily be estimated. A 1 kg trueness control material was made from leftovers of materials used in laboratory intercomparisons. A design study for a large sample pool-side facility, handling plate-type volumes, had to be stopped because of a reduction in the human resources available for this CRP. The large sample NAA facilities were made available to guest scientists from Greece and Brazil. The laboratory for neutron activation analysis participated in the world's first laboratory intercomparison utilizing large samples. (author)

  15. Calibration of a large volume argon-41 gas-effluent monitor

    International Nuclear Information System (INIS)

    Wilson, William E.; Lovas, Thomas A.

    1976-01-01

    In September of 1975, a large volume Argon-41 sampler was calibrated using a series-connected calibration chamber of known sensitivity and a constant flow of activated argon gas. The calibration included analysis of the effects of flow rate through the large volume sampler and yielded a calibration constant of 2.34 × 10⁻⁸ μCi/cm³ per cpm. (author)
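
    With the calibration constant above, converting a monitor count rate to an Ar-41 concentration is a single multiplication. A sketch (the count rate is invented; units as quoted in the abstract):

```python
K_AR41 = 2.34e-8  # uCi/cm^3 per cpm, from the 1975 calibration

def ar41_concentration_uci_cm3(count_rate_cpm, k=K_AR41):
    """Effluent-monitor count rate (cpm) -> Ar-41 concentration (uCi/cm^3)."""
    return k * count_rate_cpm

c = ar41_concentration_uci_cm3(5000.0)
```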

  16. Center-stabilized Yang-Mills Theory: Confinement and Large N Volume Independence

    International Nuclear Information System (INIS)

    Unsal, Mithat; Yaffe, Laurence G.

    2008-01-01

    We examine a double trace deformation of SU(N) Yang-Mills theory which, for large N and large volume, is equivalent to unmodified Yang-Mills theory up to O(1/N²) corrections. In contrast to the unmodified theory, large N volume independence is valid in the deformed theory down to arbitrarily small volumes. The double trace deformation prevents the spontaneous breaking of center symmetry which would otherwise disrupt large N volume independence in small volumes. For small values of N, if the theory is formulated on R³ × S¹ with a sufficiently small compactification size L, then an analytic treatment of the non-perturbative dynamics of the deformed theory is possible. In this regime, we show that the deformed Yang-Mills theory has a mass gap and exhibits linear confinement. Increasing the circumference L or number of colors N decreases the separation of scales on which the analytic treatment relies. However, there are no order parameters which distinguish the small and large radius regimes. Consequently, for small N the deformed theory provides a novel example of a locally four-dimensional pure gauge theory in which one has analytic control over confinement, while for large N it provides a simple fully reduced model for Yang-Mills theory. The construction is easily generalized to QCD and other QCD-like theories.

  17. Center-stabilized Yang-Mills theory: Confinement and large N volume independence

    International Nuclear Information System (INIS)

    Uensal, Mithat; Yaffe, Laurence G.

    2008-01-01

    We examine a double trace deformation of SU(N) Yang-Mills theory which, for large N and large volume, is equivalent to unmodified Yang-Mills theory up to O(1/N²) corrections. In contrast to the unmodified theory, large N volume independence is valid in the deformed theory down to arbitrarily small volumes. The double trace deformation prevents the spontaneous breaking of center symmetry which would otherwise disrupt large N volume independence in small volumes. For small values of N, if the theory is formulated on R³ × S¹ with a sufficiently small compactification size L, then an analytic treatment of the nonperturbative dynamics of the deformed theory is possible. In this regime, we show that the deformed Yang-Mills theory has a mass gap and exhibits linear confinement. Increasing the circumference L or number of colors N decreases the separation of scales on which the analytic treatment relies. However, there are no order parameters which distinguish the small and large radius regimes. Consequently, for small N the deformed theory provides a novel example of a locally four-dimensional pure-gauge theory in which one has analytic control over confinement, while for large N it provides a simple fully reduced model for Yang-Mills theory. The construction is easily generalized to QCD and other QCD-like theories.

  18. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.

    2009-11-01

    Direct volume rendering and isosurfacing are ubiquitous rendering techniques in scientific visualization, commonly employed in imaging 3D data from simulation and scan sources. Conventionally, these methods have been treated as separate modalities, necessitating different sampling strategies and rendering algorithms. In reality, an isosurface is a special case of a transfer function, namely a Dirac impulse at a given isovalue. However, artifact-free rendering of discrete isosurfaces in a volume rendering framework is an elusive goal, requiring either infinite sampling or smoothing of the transfer function. While preintegration approaches solve the most obvious deficiencies in handling sharp transfer functions, artifacts can still result, limiting classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches the frequency of the image plane, resulting in fewer artifacts near the eye and better overall performance. These techniques exhibit clear advantages over standard uniform ray casting with and without preintegration, and allow for high-quality interactive volume rendering with sharp C⁰ transfer functions. © 2009 IEEE.
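
    The core idea of explicitly solving for isovalues along a ray can be illustrated with first-order (linear) interpolation between successive samples; the paper solves this within the full volume-rendering integral, so this is only the simplest sketch of the peak-finding step:

```python
def isosurface_crossings(samples, isovalue):
    """Given (t, f(t)) samples along a ray, return the interpolated ray
    parameters where f crosses the isovalue (sign change between samples)."""
    hits = []
    for (t0, f0), (t1, f1) in zip(samples, samples[1:]):
        if (f0 - isovalue) * (f1 - isovalue) < 0:
            hits.append(t0 + (isovalue - f0) / (f1 - f0) * (t1 - t0))
    return hits

# A ray whose scalar field rises through 0.5 and falls back through it.
hits = isosurface_crossings([(0.0, 0.2), (1.0, 0.8), (2.0, 0.4)], 0.5)
```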

  19. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached into a cassette); the airflow through the filter is 800 m³/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1–10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of CTBTO for aerosol monitoring. The concept is well suited to nuclear material safeguards, too.
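
    The quoted minimum detectable concentration (order 10⁻⁶ Bq/m³) follows from a Currie-style detection limit divided by detector efficiency, gamma emission probability, counting time, and the sampled air volume. A hedged sketch with invented but order-of-magnitude-realistic inputs:

```python
import math

def mdc_bq_m3(bkg_counts, live_time_s, efficiency, emission_prob, air_volume_m3):
    """Currie detection limit (95% confidence) expressed as an air concentration."""
    ld = 2.71 + 4.65 * math.sqrt(bkg_counts)   # detection limit in counts
    return ld / (efficiency * emission_prob * live_time_s * air_volume_m3)

# 1 d counting after 1 d sampling at 800 m^3/h; HPGe efficiency ~3% (assumed).
mdc = mdc_bq_m3(10000, 86400, 0.03, 1.0, 800 * 24)
```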

  20. The position response of a large-volume segmented germanium detector

    International Nuclear Information System (INIS)

    Descovich, M.; Nolan, P.J.; Boston, A.J.; Dobson, J.; Gros, S.; Cresswell, J.R.; Simpson, J.; Lazarus, I.; Regan, P.H.; Valiente-Dobon, J.J.; Sellin, P.; Pearson, C.J.

    2005-01-01

    The position response of a large-volume segmented coaxial germanium detector is reported. The detector has 24-fold segmentation on its outer contact. The output from each contact was sampled with fast digital signal processing electronics in order to determine the position of the γ-ray interaction from the signal pulse shape. The interaction position was reconstructed in a polar coordinate system by combining the radial information, contained in the rise-time of the pulse leading edge, with the azimuthal information, obtained from the magnitude of the transient charge signals induced on the neighbouring segments. With this method, a position resolution of 3–7 mm is achieved in both the radial and the azimuthal directions.
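
    The two-coordinate reconstruction can be caricatured as follows: radius from the normalized rise time of the pulse leading edge, azimuth from the left/right transient-charge asymmetry within the hit segment. Both linear mappings are simplifying assumptions (the real detector response is calibrated, not linear), and all numbers are invented:

```python
def polar_position(rise_ns, t_min_ns, t_max_ns, r_max_mm,
                   q_left, q_right, seg_center_deg, seg_width_deg):
    """Toy (r, phi) estimate from pulse rise time and neighbour transients."""
    r = r_max_mm * (rise_ns - t_min_ns) / (t_max_ns - t_min_ns)
    asym = (q_right - q_left) / (q_right + q_left)   # in [-1, 1]
    phi = seg_center_deg + asym * seg_width_deg / 2.0
    return r, phi

# Mid-range rise time, slightly asymmetric transient charges.
r, phi = polar_position(150.0, 100.0, 200.0, 35.0, 0.8, 1.2, 45.0, 15.0)
```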

  1. The position response of a large-volume segmented germanium detector

    Energy Technology Data Exchange (ETDEWEB)

    Descovich, M. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom)]. E-mail: mdescovich@lbl.gov; Nolan, P.J. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Boston, A.J. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Dobson, J. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Gros, S. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Cresswell, J.R. [Oliver Lodge Laboratory, Physics Department, University of Liverpool, Liverpool L69 7ZE (United Kingdom); Simpson, J. [CCLRC Daresbury Laboratory, Daresbury, Warrington, Cheshire WA4 4AD (United Kingdom); Lazarus, I. [CCLRC Daresbury Laboratory, Daresbury, Warrington, Cheshire WA4 4AD (United Kingdom); Regan, P.H. [Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Valiente-Dobon, J.J. [Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Sellin, P. [Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom); Pearson, C.J. [Department of Physics, University of Surrey, Guildford GU2 7XH (United Kingdom)

    2005-11-21

    The position response of a large-volume segmented coaxial germanium detector is reported. The detector has 24-fold segmentation on its outer contact. The output from each contact was sampled with fast digital signal processing electronics in order to determine the position of the γ-ray interaction from the signal pulse shape. The interaction position was reconstructed in a polar coordinate system by combining the radial information, contained in the rise-time of the pulse leading edge, with the azimuthal information, obtained from the magnitude of the transient charge signals induced on the neighbouring segments. With this method, a position resolution of 3–7 mm is achieved in both the radial and the azimuthal directions.

  2. Planck/SDSS Cluster Mass and Gas Scaling Relations for a Volume-Complete redMaPPer Sample

    Science.gov (United States)

    Jimeno, Pablo; Diego, Jose M.; Broadhurst, Tom; De Martino, I.; Lazkoz, Ruth

    2018-04-01

    Using Planck satellite data, we construct Sunyaev-Zel'dovich (SZ) gas pressure profiles for a large, volume-complete sample of optically selected clusters. We have defined a sample of over 8,000 redMaPPer clusters from the Sloan Digital Sky Survey (SDSS), within the volume-complete redshift region 0.100 trend towards larger break radius with increasing cluster mass. Our SZ-based masses fall ~16% below the mass-richness relations from weak lensing, in a similar fashion to the "hydrostatic bias" associated with X-ray-derived masses. Finally, we derive a tight Y500-M500 relation over a wide range of cluster mass, with a power-law slope equal to 1.70 ± 0.07, which agrees well with the independent slope obtained by the Planck team with an SZ-selected cluster sample, but extends to lower masses with higher precision.
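
    The 1.70 ± 0.07 slope is a power-law index, i.e. the least-squares slope of log Y500 against log M500. A minimal sketch with synthetic data (the masses and the exact relation are illustrative only):

```python
import math

def loglog_slope(x, y):
    """Least-squares slope of log(y) vs log(x): the power-law index."""
    lx, ly = [math.log(v) for v in x], [math.log(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    sxx = sum((a - mx) ** 2 for a in lx)
    sxy = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    return sxy / sxx

masses = [1e14, 2e14, 5e14, 1e15]     # M500-like values (solar masses)
ys = [m ** 1.7 for m in masses]       # exact Y scaling with index 1.7
slope = loglog_slope(masses, ys)
```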

  3. Sampling of high amounts of bioaerosols using a high-volume electrostatic field sampler

    DEFF Research Database (Denmark)

    Madsen, A. M.; Sharma, Anoop Kumar

    2008-01-01

    For studies of the biological effects of bioaerosols, large samples are necessary. To be able to sample enough material and to cover the variations in aerosol content during and between working days, a long sampling time is necessary. Recently, a high-volume transportable electrostatic field...... and 315 mg dust (net recovery of the lyophilized dust) was sampled during a period of 7 days, respectively. The sampling rates of the electrostatic field samplers were between 1.34 and 1.96 mg dust per hour, the value for the Gravikon was between 0.083 and 0.108 mg dust per hour and the values for the GSP...... samplers were between 0.0031 and 0.032 mg dust per hour. The standard deviations of replica samplings and the following microbial analysis using the electrostatic field sampler and GSP samplers were at the same levels. The exposure to dust in the straw storage was 7.7 mg m⁻³ when measured......

  4. Chromatographic lipophilicity determination using large volume injections of the solvents non-miscible with the mobile phase.

    Science.gov (United States)

    Sârbu, Costel; Naşcu-Briciu, Rodica Domnica; Casoni, Dorina; Kot-Wasik, Agata; Wasik, Andrzej; Namieśnik, Jacek

    2012-11-30

    A new perspective on lipophilicity evaluation through RP-HPLC is provided by analysis of the retention factor (k) obtained by injecting large volumes of test samples prepared in solvents immiscible with the mobile phase. The experiment was carried out on representative groups of compounds with increased toxicity (mycotoxins and alkaloids) and amines with important biological activity (naturally occurring monoamine compounds and related drugs), which cover a large interval of lipophilicity. The stock solution of each compound was prepared in hexane, and the mobile phases used were mixtures of methanol or acetonitrile and water in suitable volume ratios. The injected volume was between 10 and 100 μL, and the stationary phases used were RP-18 and RP-8. On both reversed stationary phases the retention factors decreased linearly as the injection volume increased. In all cases, the linear models were highly statistically significant. On the basis of the obtained results, new lipophilicity indices were proposed and discussed. The developed lipophilicity indices and the computationally expressed ones are correlated at a high level of statistical significance. Copyright © 2012 Elsevier B.V. All rights reserved.
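
    Since k decreases linearly with the injected volume, one natural index is the intercept of that line extrapolated to zero injection volume. A sketch with invented data points (the paper proposes its own indices, which may be defined differently):

```python
def linear_fit(x, y):
    """Ordinary least squares: return (intercept, slope) of y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

volumes_ul = [10.0, 40.0, 70.0, 100.0]     # injected volumes (uL)
k_obs = [4.9, 4.6, 4.3, 4.0]               # retention factors, decreasing
k0, b = linear_fit(volumes_ul, k_obs)      # k0: extrapolated value at V = 0
```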

  5. Optimal sampling designs for large-scale fishery sample surveys in Greece

    Directory of Open Access Journals (Sweden)

    G. BAZIGOS

    2007-12-01

    The paper deals with the optimization of the following three large-scale sample surveys: the biological sample survey of commercial landings (BSCL), the experimental fishing sample survey (EFSS), and the commercial landings and effort sample survey (CLES).

  6. Effect of large volume paracentesis on plasma volume--a cause of hypovolemia

    International Nuclear Information System (INIS)

    Kao, H.W.; Rakov, N.E.; Savage, E.; Reynolds, T.B.

    1985-01-01

    Large volume paracentesis, while effectively relieving symptoms in patients with tense ascites, has been generally avoided due to reports of complications attributed to an acute reduction in intravascular volume. Measurements of plasma volume in these subjects have been by indirect methods and have not uniformly confirmed hypovolemia. We have prospectively evaluated 18 patients (20 paracenteses) with tense ascites and peripheral edema due to chronic liver disease undergoing 5 liter paracentesis for relief of symptoms. Plasma volume pre- and postparacentesis was assessed by a 125I-labeled human serum albumin dilution technique as well as by the change in hematocrit and postural blood pressure difference. No significant change in serum sodium, urea nitrogen, hematocrit or postural systolic blood pressure difference was noted at 24 or 48 hr after paracentesis. Serum creatinine at 24 hr after paracentesis was unchanged but a small but statistically significant increase in serum creatinine was noted at 48 hr postparacentesis. Plasma volume changed -2.7% (n = 6, not statistically significant) during the first 24 hr and -2.8% (n = 12, not statistically significant) during the 0- to 48-hr period. No complications from paracentesis were noted. These results suggest that 5 liter paracentesis for relief of symptoms is safe in patients with tense ascites and peripheral edema from chronic liver disease.
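
    The 125I-albumin dilution measurement is a standard indicator-dilution calculation: plasma volume equals the injected activity divided by the equilibrated plasma concentration. A sketch with invented numbers, not the study's data:

```python
def plasma_volume_ml(injected_bq, plasma_bq_per_ml):
    """Indicator dilution: V = dose / concentration after mixing."""
    return injected_bq / plasma_bq_per_ml

def percent_change(before, after):
    return 100.0 * (after - before) / before

v_pre = plasma_volume_ml(3.7e5, 100.0)    # 3700 mL
v_post = plasma_volume_ml(3.7e5, 102.8)   # slightly higher concentration
change = percent_change(v_pre, v_post)    # small negative change, as reported
```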

  7. Sampling procedures for inventory of commercial volume tree species in Amazon Forest.

    Science.gov (United States)

    Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R

    2017-01-01

    The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories; therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. For this, the present study aims to evaluate the conventional sampling procedures and introduce adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that the density, the spatial distribution and the zero-plots affect the consistency of the estimators, and that adaptive cluster sampling allows more accurate volumetric estimation. We use data from a census carried out in the Jamari National Forest, Brazil, where trees with diameters equal to or higher than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, whereby the accuracy of the volumetric estimation and presence of zero-plots were evaluated. The sampling procedures applied to species were affected by the low density of trees and the large number of zero-plots, wherein the adaptive clusters allowed concentrating the sampling effort in plots with trees and, thus, accumulating more representative samples to estimate the commercial volume.
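
    Adaptive cluster sampling estimates are commonly formed with the modified Hansen-Hurwitz estimator: each network intersected by the initial sample contributes its within-network mean volume. A minimal sketch (edge units ignored, as in the standard estimator; the plot volumes are invented):

```python
def hansen_hurwitz_acs(networks):
    """Mean volume per plot from the within-network means of the networks
    intersected by the initial random sample (modified Hansen-Hurwitz)."""
    w = [sum(net) / len(net) for net in networks]
    return sum(w) / len(w)

# Two empty initial plots plus one network of 3 occupied plots (m^3/plot).
est = hansen_hurwitz_acs([[0.0], [0.0], [10.0, 20.0, 30.0]])
```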

  8. A Volume-Limited Sample of L and T Dwarfs Defined by Parallaxes

    Science.gov (United States)

    Best, William M. J.; Liu, Michael C.; Magnier, Eugene; Dupuy, Trent

    2018-01-01

    Volume-limited samples are the gold standard for stellar population studies, as they enable unbiased measurements of space densities and luminosity functions. Parallaxes are the most direct measures of distance and are therefore essential for defining high-confidence volume-limited samples. Previous efforts to model the local brown dwarf population were hampered by samples based on a small number of parallaxes. We are using UKIRT/WFCAM to conduct the largest near-infrared program to date to measure parallaxes and proper motions of L and T dwarfs. For the past 3+ years we have monitored over 350 targets, ≈90% of which are too faint to be observed by Gaia. We present preliminary results from our observations. Our program more than doubles the number of known L and T dwarf parallaxes, defining a volume-limited sample of ≈400 L0-T6 dwarfs out to 25 parsecs, the first L and T dwarf sample of this size and depth based entirely on parallaxes. Our sample will combine with the upcoming stellar census from Gaia DR2 parallaxes to form a complete volume-limited sample of nearby stars and brown dwarfs.
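
    Membership in such a parallax-defined volume is a one-line test: distance in parsecs is 1000 divided by the parallax in milliarcseconds, compared against the 25 pc limit. A sketch (the target names and parallaxes are invented):

```python
def distance_pc(parallax_mas):
    """d [pc] = 1000 / parallax [mas]."""
    return 1000.0 / parallax_mas

def in_volume(parallax_mas, limit_pc=25.0):
    return distance_pc(parallax_mas) <= limit_pc

parallaxes = {"dwarfA": 45.0, "dwarfB": 38.0, "dwarfC": 21.0}  # mas
members = sorted(name for name, p in parallaxes.items() if in_volume(p))
```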

  9. The large sample size fallacy.

    Science.gov (United States)

    Lantz, Björn

    2013-06-01

    Significance in the statistical sense has little to do with significance in the common practical sense. Statistical significance is a necessary but not a sufficient condition for practical significance. Hence, results that are extremely statistically significant may be highly nonsignificant in practice. The degree of practical significance is generally determined by the size of the observed effect, not the p-value. The results of studies based on large samples are often characterized by extreme statistical significance despite small or even trivial effect sizes. Interpreting such results as significant in practice without further analysis is referred to as the large sample size fallacy in this article. The aim of this article is to explore the relevance of the large sample size fallacy in contemporary nursing research. Relatively few nursing articles display explicit measures of observed effect sizes or include a qualitative discussion of observed effect sizes. Statistical significance is often treated as an end in itself. Effect sizes should generally be calculated and presented along with p-values for statistically significant results, and observed effect sizes should be discussed qualitatively through direct and explicit comparisons with the effects in related literature. © 2012 Nordic College of Caring Science.
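
    The fallacy is easy to demonstrate numerically: with a very large n, a trivially small standardized effect produces an extreme p-value. A sketch using a two-sample z-test under the normal approximation (all numbers invented):

```python
import math

def cohens_d(m1, m2, pooled_sd):
    """Standardized effect size."""
    return (m1 - m2) / pooled_sd

def two_sample_z_pvalue(m1, m2, sd, n_per_group):
    """Two-sided p-value for equal-variance groups, normal approximation."""
    z = (m1 - m2) / (sd * math.sqrt(2.0 / n_per_group))
    return math.erfc(abs(z) / math.sqrt(2.0))

d = cohens_d(10.02, 10.00, 1.0)                      # d = 0.02: trivial effect
p = two_sample_z_pvalue(10.02, 10.00, 1.0, 100_000)  # yet p is tiny
```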

  10. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Full Text Available Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of the Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
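
    The critical-point detection underlying both algorithms can be illustrated in its simplest form: flag the places where the force changes abruptly between consecutive samples (a jump). The paper's algorithms also detect changes of slope and curvature, which this sketch omits; the data and threshold are invented:

```python
def detect_jumps(force_pn, threshold_pn):
    """Indices where the force changes by more than threshold_pn between
    consecutive samples: candidate jump-type critical points."""
    return [i for i in range(1, len(force_pn))
            if abs(force_pn[i] - force_pn[i - 1]) > threshold_pn]

# Smooth baseline, then an adhesion-rupture jump between samples 3 and 4.
jumps = detect_jumps([0.0, 0.1, 0.2, 0.3, 2.1, 2.2], threshold_pn=0.5)
```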

  11. Nonperturbative volume reduction of large-N QCD with adjoint fermions

    International Nuclear Information System (INIS)

    Bringoltz, Barak; Sharpe, Stephen R.

    2009-01-01

We use nonperturbative lattice techniques to study the volume-reduced 'Eguchi-Kawai' version of four-dimensional large-N QCD with a single adjoint Dirac fermion. We explore the phase diagram of this single-site theory in the space of quark mass and gauge coupling using Wilson fermions for a number of colors in the range 8≤N≤15. Our evidence suggests that these values of N are large enough to determine the nature of the phase diagram for N→∞. We identify the region in the parameter space where the (Z_N)^4 center symmetry is intact. According to previous theoretical work using the orbifolding paradigm, and assuming that translation invariance is not spontaneously broken in the infinite-volume theory, in this region volume reduction holds: the single-site and infinite-volume theories become equivalent when N→∞. We find strong evidence that this region includes both light and heavy quarks (with masses that are at the cutoff scale), and our results are consistent with this region extending toward the continuum limit. We also compare the action density and the eigenvalue density of the overlap Dirac operator in the fundamental representation with those obtained in large-N pure-gauge theory.

  12. Heterogeneous asymmetric recombinase polymerase amplification (haRPA) for rapid hygiene control of large-volume water samples.

    Science.gov (United States)

    Elsäßer, Dennis; Ho, Johannes; Niessner, Reinhard; Tiehm, Andreas; Seidel, Michael

    2018-04-01

Hygiene of drinking water is periodically controlled by cultivation and enumeration of indicator bacteria. Rapid and comprehensive measurements of emerging pathogens are of increasing interest to improve drinking water safety. In this study, the feasibility of detecting bacteriophage PhiX174 as a potential indicator for virus contamination in large volumes of water is demonstrated. Three consecutive concentration methods (continuous ultrafiltration, monolithic adsorption filtration, and centrifugal ultrafiltration) were combined to concentrate phages stepwise from 1250 L of drinking water into 1 mL. Heterogeneous asymmetric recombinase polymerase amplification (haRPA) is applied as a rapid detection method. Field measurements were conducted to test the developed system for online hygiene monitoring under realistic conditions. We could show that this system allows the detection of artificial contamination with bacteriophage PhiX174 in drinking water pipelines. Copyright © 2018 Elsevier Inc. All rights reserved.

  13. Sampling Large Graphs for Anticipatory Analytics

    Science.gov (United States)

    2015-05-15

Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas around them are sampled. ... systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges
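The snowball-style area sampling mentioned above can be sketched as breadth-first expansion from randomly chosen seed vertices up to a vertex budget. This is a generic illustration, not the paper's implementation; the toy graph, seed count and budget are hypothetical.

```python
import random
from collections import deque

def random_area_sample(adj, n_seeds, budget, rng):
    """Snowball-style sample: BFS outward from random seeds until
    `budget` vertices have been collected."""
    seeds = rng.sample(sorted(adj), n_seeds)
    visited = set(seeds)
    frontier = deque(seeds)
    while frontier and len(visited) < budget:
        v = frontier.popleft()
        for w in adj[v]:
            if w not in visited:
                visited.add(w)
                frontier.append(w)
                if len(visited) >= budget:
                    break
    return visited

# Toy graph: a path 0-1-2-...-9 (hypothetical input).
adj = {i: [j for j in (i - 1, i + 1) if 0 <= j <= 9] for i in range(10)}
sample = random_area_sample(adj, n_seeds=2, budget=5, rng=random.Random(0))
print(len(sample))  # 5
```

Because only the seeds are chosen uniformly, the resulting vertex set is biased toward dense neighbourhoods — the usual trade-off of snowball sampling.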

  14. Amplification volume reduction on DNA database samples using FTA™ Classic Cards.

    Science.gov (United States)

    Wong, Hang Yee; Lim, Eng Seng Simon; Tan-Siew, Wai Fun

    2012-03-01

The DNA forensic community always strives towards improvements in aspects such as sensitivity, robustness, and efficacy balanced with cost efficiency. Our laboratory therefore decided to study the feasibility of reducing the PCR amplification volume using DNA entrapped in the FTA™ Classic Card, to bring cost savings to the laboratory. There were a few concerns the laboratory needed to address. First, the kinetics of the amplification reaction could be significantly altered. Second, an increase in sensitivity might affect interpretation due to increased stochastic effects, even though these were pristine samples. Third, static might cause FTA punches to jump out of their allocated wells into others, causing sample-to-sample contamination. Fourth, the size of the punches might be too small for visual inspection. Last, there would be a limit to the extent of volume reduction due to evaporation and the possible need to re-inject samples for capillary electrophoresis. The laboratory successfully optimized a reduced amplification volume of 10 μL for FTA samples. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  15. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy

    2014-09-01

© 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.
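For intuition about reachable sets, a planar serial chain with revolute joints has a closed-form reachable region computable in linear time: an annulus with outer radius Σlᵢ and inner radius max(0, 2·max(lᵢ) − Σlᵢ). This textbook special case only illustrates the idea, not the paper's general reachable-volume construction.

```python
def planar_chain_reachable_annulus(link_lengths):
    """Reachable set of the end effector of a planar revolute chain
    anchored at the origin: an annulus [r_inner, r_outer].
    Computed in a single O(n) pass over the link lengths."""
    total = sum(link_lengths)
    longest = max(link_lengths)
    r_outer = total
    r_inner = max(0.0, 2.0 * longest - total)
    return r_inner, r_outer

print(planar_chain_reachable_annulus([1.0, 1.0, 1.0]))  # (0.0, 3.0)
print(planar_chain_reachable_annulus([5.0, 1.0, 1.0]))  # (3.0, 7.0)
```

Constraint-satisfying end-effector positions can then be drawn directly from the annulus instead of rejection-sampling joint angles, which is the efficiency argument made in the abstract.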

  16. Sampling based motion planning with reachable volumes: Application to manipulators and closed chain systems

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

© 2014 IEEE. Reachable volumes are a geometric representation of the regions the joints of a robot can reach. They can be used to generate constraint satisfying samples for problems including complicated linkage robots (e.g. closed chains and graspers). They can also be used to assist robot operators and to help in robot design. We show that reachable volumes have an O(1) complexity in unconstrained problems as well as in many constrained problems. We also show that reachable volumes can be computed in linear time and that reachable volume samples can be generated in linear time in problems without constraints. We experimentally validate reachable volume sampling, both with and without constraints on end effectors and/or internal joints. We show that reachable volume samples are less likely to be invalid due to self-collisions, making reachable volume sampling significantly more efficient for higher dimensional problems. We also show that these samples are easier to connect than others, resulting in better connected roadmaps. We demonstrate that our method can be applied to 262-dof, multi-loop, and tree-like linkages including combinations of planar, prismatic and spherical joints. In contrast, existing methods either cannot be used for these problems or do not produce good quality solutions.

  17. Large-volume injection in gas chromatographic trace analysis using temperature-programmable (PTV) injectors

    NARCIS (Netherlands)

    Mol, J.G.J.; Janssen, J.G.M.; Cramers, C.A.M.G.; Brinkman, U.A.T.

    1996-01-01

    The use of programmed-temperature vaporising (PTV) injectors for large-volume injection in capillary gas chromatography is briefly reviewed. The principles and optimisation of large-volume PTV injection are discussed. Guidelines are given for selection of the PTV conditions and injection mode for

  18. Sample to moderator volume ratio effects in neutron yield from a PGNAA setup

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, A.A. [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia)]. E-mail: aanaqvi@kfupm.edu.sa; Fazal-ur-Rehman [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia); Nagadi, M.M. [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia); Khateeb-ur-Rehman [Department of Physics, King Fahd University of Petroleum and Minerals, KFUPM Box 1815, Dhahran-31261 (Saudi Arabia)

    2007-02-15

The performance of a prompt gamma ray neutron activation analysis (PGNAA) setup depends upon the thermal neutron yield at the sample location. For a moderator which encloses a sample, the thermal neutron intensity depends upon the effective moderator volume, excluding the void due to the sample volume. A rectangular moderator assembly has been designed for the King Fahd University of Petroleum and Minerals (KFUPM) PGNAA setup. The thermal and fast neutron yields have been measured inside the sample cavity as a function of its front moderator thickness, using the alpha particle track density and recoil proton track density inside CR-39 nuclear track detectors (NTDs). The thermal/fast neutron yield ratio, obtained from the ratio of alpha particle track density to proton track density in the NTDs, shows an inverse correlation with the sample to moderator volume ratio. Comparison of the present results with previously published results for smaller moderators of the KFUPM PGNAA setup confirms this observation.

  19. Refined universal laws for hull volumes and perimeters in large planar maps

    International Nuclear Information System (INIS)

    Guitter, Emmanuel

    2017-01-01

We consider ensembles of planar maps with two marked vertices at distance k from each other, and look at the closed line separating these vertices and lying at distance d from the first one (d < k). This line divides the map into two components: the hull at distance d, which corresponds to the part of the map lying on the same side as the first vertex, and its complement. The number of faces within the hull is called the hull volume, and the length of the separating line the hull perimeter. We study the statistics of the hull volume and perimeter for arbitrary d and k in the limit of infinitely large planar quadrangulations, triangulations and Eulerian triangulations. We consider more precisely situations where both d and k become large with the ratio d/k remaining finite. For infinitely large maps, two regimes may be encountered: either the hull has a finite volume and its complement is infinitely large, or the hull itself has an infinite volume and its complement is of finite size. We compute the probability for the map to be in either regime as a function of d/k, as well as a number of universal statistical laws for the hull perimeter and volume when maps are conditioned to be in one regime or the other.

  20. An immunomagnetic separator for concentration of pathogenic micro-organisms from large volume samples

    International Nuclear Information System (INIS)

    Rotariu, Ovidiu; Ogden, Iain D.; MacRae, Marion; Badescu, Vasile; Strachan, Norval J.C.

    2005-01-01

The standard method of immunomagnetic separation of pathogenic bacteria from food and environmental matrices processes 1 ml volumes, so pathogens present at low levels are likely to be missed. An immunomagnetic separator has therefore been developed for concentrating pathogens from large volume samples (>50 ml). Preliminary results show that between 70 and 113 times more Escherichia coli O157 are recovered compared with the standard 1 ml method.

  1. Coupling of RF antennas to large volume helicon plasma

    Directory of Open Access Journals (Sweden)

    Lei Chang

    2018-04-01

Large volume helicon plasma sources are of particular interest for large scale semiconductor processing, high power plasma propulsion and, more recently, plasma-material interaction under fusion conditions. This work studies the coupling of four typical RF antennas to a helicon plasma of infinite length and 0.5 m diameter, and explores its frequency dependence in the range 13.56-70 MHz for coupling optimization. It is found that the loop antenna is more efficient for power absorption than the half helix, Boswell and Nagoya III antennas, and that a radially parabolic density profile couples better than a Gaussian profile for low-density plasma, while the ordering reverses for high-density plasma. Increasing the driving frequency shifts power absorption towards the plasma edge, but the overall power absorption increases with frequency. Perpendicular stream plots of the wave magnetic field, wave electric field and perturbed current are also presented. This work can serve as an important reference for the experimental design of large volume helicon plasma sources with high RF power.

  2. Development of Large Sample Neutron Activation Technique for New Applications in Thailand

    International Nuclear Information System (INIS)

    Laoharojanaphand, S.; Tippayakul, C.; Wonglee, S.; Channuie, J.

    2018-01-01

The development of Large Sample Neutron Activation Analysis (LSNAA) in Thailand is presented in this paper. The technique was first developed with rice as the test sample, using the Thai Research Reactor-1/Modification 1 (TRR-1/M1) as the neutron source. The first step was to select and characterize an appropriate irradiation facility. An out-core irradiation facility (A4 position) was attempted first, and the results obtained there guided the subsequent experiments with the thermal column facility. The characterization of the thermal column was performed with Cu wire to determine the spatial flux distribution with and without the rice sample. The flux depression without the rice sample was observed to be less than 30%, while the flux depression with the rice sample increased to about 60%. Flux monitors internal to the rice sample were used to determine the average flux over the sample. The gamma self-shielding effect during gamma measurement was corrected using Monte Carlo simulation: the ratio between the efficiencies of the volume source and a point source at each energy point was calculated with the MCNPX code. The research team adopted the k0-NAA methodology to calculate element concentrations. The k0-NAA program developed by the IAEA was set up to simulate the conditions of the irradiation and measurement facilities used in this research. The element concentrations in the bulk rice sample were then calculated taking into account the flux depression and gamma efficiency corrections. At the moment, the results still show large discrepancies with the reference values; further validation work will be performed to identify the sources of error. Moreover, this LSNAA technique was applied to the activation analysis of the IAEA archaeological mock-up. The results are provided in this report.

  3. Novel regenerative large-volume immobilized enzyme reactor: preparation, characterization and application.

    Science.gov (United States)

    Ruan, Guihua; Wei, Meiping; Chen, Zhengyi; Su, Rihui; Du, Fuyou; Zheng, Yanjie

    2014-09-15

A novel large-volume immobilized enzyme reactor (IMER) on a small column was prepared with organic-inorganic hybrid silica particles and applied to the fast (10 min) and oriented digestion of proteins. First, a thin enzyme support layer was formed at the bottom of the small column by polymerization of α-methacrylic acid and dimethacrylate. Amino-SiO2 particles were then prepared by the sol-gel method with tetraethoxysilane and 3-aminopropyltriethoxysilane, and activated with glutaraldehyde for covalent immobilization of trypsin. The digestive capability of the large-volume IMER was investigated using bovine serum albumin (BSA) and cytochrome c (Cyt-c) as model proteins. Results showed that although the sequence coverage of BSA (20%) and Cyt-c (19%) was low, the large-volume IMER produced peptides with stable specific sequences at positions 101-105, 156-160, 205-209, 212-218, 229-232, 257-263 and 473-451 of the amino acid sequence of BSA when digesting 1 mg/mL BSA. Eight common peptides were observed in each of ten runs of the large-volume IMER. In addition, the IMER could be easily regenerated by reactivating with glutaraldehyde and cross-linking with trypsin after breaking the -C=N- bond with 0.01 M HCl. The sequence coverage of BSA from the regenerated IMER increased to 25%, compared with 17% for the non-regenerated IMER. Fourteen common peptides, accounting for 87.5% of those from the first use of the IMER, were produced with both the IMER and the regenerated IMER. When the IMER was applied to ginkgo albumin digestion, the sequence coverage of the two main proteins of ginkgo, ginnacin and legumin, was 56% and 55%, respectively. The fast and selective digestion properties of the large-volume IMER indicate that the regenerative IMER could tentatively be used for the production of potential bioactive peptides and the study of oriented protein digestion. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Nuclear waste calorimeter for very large drums with 385 litres sample volume

    Energy Technology Data Exchange (ETDEWEB)

    Jossens, G.; Mathonat, C. [SETARAM Instrumentation, Caluire (France); Bachelet, F. [CEA Valduc, Is sur Tille (France)

    2015-03-15

Calorimetry is a precise and well adapted tool for classifying drums containing nuclear waste according to their level of activity (low, medium, high). A new calorimeter, designed for drums with volumes above 100 liters, has been developed by SETARAM Instrumentation and the CEA Valduc in France. It guarantees high operator safety by optimizing drum handling and the air circulation used for cooling, and its software is optimized for direct measurement of the quantity of nuclear material. The LVC1380 calorimeter operates over the range 10 to 3000 mW, which corresponds to approximately 0.03 to 10 g of tritium or 3 to 955 g of 241Pu in a volume of up to 385 liters. The instrument is based on heat flow measurement using Peltier elements that surround the drum in all three dimensions and therefore capture all the heat emitted by the radioactive material, whatever its position inside the drum. The calorimeter's insulating layers constitute a thermal barrier designed to filter disturbances to below 0.001 °C and to eliminate long-term disturbances associated, for example, with laboratory temperature variations between day and night. A calibration device based on the Joule effect has also been designed. Measurement time has been optimized but remains long compared with other measurement methods such as gamma spectrometry; the method's main asset is its good accuracy at low activity levels.
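The quoted ranges imply simple specific-power conversions (3000 mW for 10 g of tritium ≈ 300 mW/g; 3000 mW for 955 g of 241Pu ≈ 3.1 mW/g), so the mass of material follows from the measured heat flow by a single division. A minimal sketch, using only the constants implied by the abstract as illustrative values:

```python
def mass_from_heat(power_mw, specific_power_mw_per_g):
    """Estimate radioactive material mass from the measured heat flow."""
    return power_mw / specific_power_mw_per_g

# Specific powers implied by the ranges quoted in the abstract (illustrative).
SP_TRITIUM = 3000.0 / 10.0   # ~300 mW/g
SP_PU241 = 3000.0 / 955.0    # ~3.14 mW/g

print(round(mass_from_heat(300.0, SP_TRITIUM), 2))  # 1.0 g of tritium
print(round(mass_from_heat(314.0, SP_PU241), 1))    # ~100 g of 241Pu
```

The measured range of 10-3000 mW thus bounds the detectable mass for each nuclide directly.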

  5. 21 CFR 201.323 - Aluminum in large and small volume parenterals used in total parenteral nutrition.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Aluminum in large and small volume parenterals... for Specific Drug Products § 201.323 Aluminum in large and small volume parenterals used in total parenteral nutrition. (a) The aluminum content of large volume parenteral (LVP) drug products used in total...

  6. Volume Ray Casting with Peak Finding and Differential Sampling

    KAUST Repository

    Knoll, A.; Hijazi, Y.; Westerteiger, R.; Schott, M.; Hansen, C.; Hagen, H.

    2009-01-01

    classification. In this paper, we introduce a method for rendering such features by explicitly solving for isovalues within the volume rendering integral. In addition, we present a sampling strategy inspired by ray differentials that automatically matches
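Explicitly solving for an isovalue inside the volume rendering integral can be illustrated in one dimension: once coarse sampling brackets a crossing of f(t) = iso along the ray, the exact hit point can be refined by bisection. The scalar field and isovalue below are illustrative, not the paper's test data.

```python
def find_isosurface_hit(f, t0, t1, iso, iters=40):
    """Bisect for f(t) == iso inside a ray segment [t0, t1]
    that brackets a sign change of f(t) - iso."""
    a, b = t0, t1
    fa = f(a) - iso
    for _ in range(iters):
        m = 0.5 * (a + b)
        fm = f(m) - iso
        if fa * fm <= 0.0:
            b = m            # crossing lies in [a, m]
        else:
            a, fa = m, fm    # crossing lies in [m, b]
    return 0.5 * (a + b)

# Hypothetical scalar field sampled along a ray parameter t.
f = lambda t: t * t
iso = 2.0                    # isovalue; true crossing at t = sqrt(2)
# Coarse sampling brackets the crossing between t = 1 and t = 2.
t_hit = find_isosurface_hit(f, 1.0, 2.0, iso)
print(round(t_hit, 6))  # 1.414214
```

This is why peak and isovalue features can be rendered crisply even when the ray sampling rate alone would miss them.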

  7. Large-Scale Multi-Resolution Representations for Accurate Interactive Image and Volume Operations

    KAUST Repository

    Sicat, Ronell B.

    2015-11-25

The resolutions of acquired image and volume data are ever increasing. However, the resolutions of commodity display devices remain limited. This leads to an increasing gap between data and display resolutions. To bridge this gap, the standard approach is to employ output-sensitive operations on multi-resolution data representations. Output-sensitive operations facilitate interactive applications since their required computations are proportional only to the size of the data that is visible, i.e., the output, and not the full size of the input. Multi-resolution representations, such as image mipmaps, and volume octrees, are crucial in providing these operations direct access to any subset of the data at any resolution corresponding to the output. Despite its widespread use, this standard approach has some shortcomings in three important application areas, namely non-linear image operations, multi-resolution volume rendering, and large-scale image exploration. This dissertation presents new multi-resolution representations for large-scale images and volumes that address these shortcomings. Standard multi-resolution representations require low-pass pre-filtering for anti-aliasing. However, linear pre-filters do not commute with non-linear operations. This becomes problematic when applying non-linear operations directly to any coarse resolution levels in standard representations. Particularly, this leads to inaccurate output when applying non-linear image operations, e.g., color mapping and detail-aware filters, to multi-resolution images. Similarly, in multi-resolution volume rendering, this leads to inconsistency artifacts which manifest as erroneous differences in rendering outputs across resolution levels. To address these issues, we introduce the sparse pdf maps and sparse pdf volumes representations for large-scale images and volumes, respectively. These representations sparsely encode continuous probability density functions (pdfs) of multi-resolution pixel
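The shortcoming motivating the sparse pdf representations — linear pre-filtering does not commute with non-linear operations — is easy to verify numerically: applying a non-linear map after downsampling disagrees with downsampling the mapped data. The 1D averaging filter and gamma-style curve below are illustrative stand-ins for a mipmap filter and a color map.

```python
def downsample2(pixels):
    """Linear pre-filter: average neighbouring pixel pairs (1D for brevity)."""
    return [(a + b) / 2.0 for a, b in zip(pixels[::2], pixels[1::2])]

def gamma_map(pixels, g=2.2):
    """A non-linear per-pixel operation (gamma-style curve)."""
    return [p ** g for p in pixels]

pixels = [0.1, 0.9, 0.4, 0.6]
coarse_then_map = gamma_map(downsample2(pixels))   # operate on the coarse level
map_then_coarse = downsample2(gamma_map(pixels))   # the "correct" reference
print(coarse_then_map)
print(map_then_coarse)
# The two coarse results disagree — the inconsistency across resolution
# levels that the sparse pdf representations are designed to avoid.
```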

  8. Sample preparation

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

Sample preparation prior to HPLC analysis is certainly one of the most important steps to consider in trace or ultratrace analysis. For many years scientists have tried to simplify the sample preparation process. It is rarely possible to inject a neat liquid sample, or a sample whose preparation is no more complex than dissolution in a given solvent. Dissolution alone can remove insoluble materials, which is especially helpful with samples in complex matrices if other interactions do not affect extraction: a large number of components will not dissolve and are therefore eliminated by a simple filtration step. In most cases, however, sample preparation is not as simple as dissolving the component of interest. At times enrichment is necessary, that is, the component of interest is present in a very large volume or mass of material and needs to be concentrated in some manner so that a small volume of the concentrated or enriched sample can be injected into the HPLC system.

  9. Known volume air sampling pump. Final summary report Jun 1975--Nov 1976

    International Nuclear Information System (INIS)

    McCullough, J.E.; Peterson, A.

    1976-11-01

The purpose of this development program was to design and develop a known volume air sampling pump for use in measuring the amount of radioactive material in the atmosphere of an underground uranium mine. The principal nuclear radiation hazard in underground uranium mines comes from the mine atmosphere: daughter products of radon-222 are inhaled by the miner, resulting in a relatively high lung cancer rate among these workers. Current exposure control practice employs spot sampling in working areas to measure working level values, and currently available personal air sampling pumps fail to deliver known volumes of air under widely changing differential pressures. A unique type of gas pump known as the scroll compressor, developed by Arthur D. Little, Inc., has no valves and few moving parts and is expected to provide a practical, efficient, and dependable air pump for use in dosimeters. The three deliverable known volume air sampling pumps resulting from this work incorporate a scroll pump, drive motor, speed control electronics, and battery pack in a container suitable for attachment to a miner's belt.

  10. Comparison of uncertainties related to standardization of urine samples with volume and creatinine concentration

    DEFF Research Database (Denmark)

    Garde, Anne Helene; Hansen, Ase Marie; Kristiansen, Jesper

    2004-01-01

When measuring biomarkers in urine, volume (and time) or concentration of creatinine are both accepted methods of standardization for diuresis. Both types of standardization contribute uncertainty to the final result. The aim of the present paper was to compare the uncertainty introduced when using the two types of standardization on 24 h samples from healthy individuals. Estimates of uncertainties were based on results from the literature supplemented with data from our own studies. Only the difference in uncertainty related to the two standardization methods was evaluated. It was found that the uncertainty associated with creatinine standardization (19-35%) was higher than the uncertainty related to volume standardization (up to 10%, when not correcting for deviations from 24 h) for 24 h urine samples. However, volume standardization introduced an average bias of 4% due to missed volumes, and there is an increase in convenience for the participants when collecting small volumes rather than complete 24 h samples.
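As a sketch of how such contributions compare, independent relative uncertainties combine in quadrature; with the figures from the abstract (up to 10% for volume, 19-35% for creatinine), the standardization term dominates the creatinine route. The 5% analytical uncertainty below is a hypothetical placeholder, not a value from the study.

```python
import math

def combined_relative_uncertainty(*rel_uncertainties):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(u * u for u in rel_uncertainties))

u_analytical = 0.05   # hypothetical 5% analytical CV
u_volume = 0.10       # volume standardization, upper value from the abstract
u_creatinine = 0.19   # lower bound of the creatinine range from the abstract

print(round(combined_relative_uncertainty(u_analytical, u_volume), 3))      # 0.112
print(round(combined_relative_uncertainty(u_analytical, u_creatinine), 3))  # 0.196
```

Even at the lower bound of the creatinine range, the combined uncertainty is nearly double that of the volume-standardized result.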

  11. Systems and methods for the detection of low-level harmful substances in a large volume of fluid

    Science.gov (United States)

    Carpenter, Michael V.; Roybal, Lyle G.; Lindquist, Alan; Gallardo, Vincente

    2016-03-15

A method and device for the detection of low-level harmful substances in a large volume of fluid, comprising using a concentrator system to produce a retentate and analyzing the retentate for the presence of at least one harmful substance. The concentrator system performs a method comprising pumping at least 10 liters of fluid from a sample source through a filter. While pumping, the concentrator system diverts retentate from the filter into a container and recirculates at least part of the retentate in the container through the filter again. The concentrator system controls the speed of the pump with a control system, thereby maintaining a fluid pressure of less than 25 psi during pumping; it also monitors the quantity of retentate within the container and maintains the retentate at a reduced, target volume.
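The pressure- and volume-maintaining behaviour can be sketched as a simple control step: back off the pump speed at the pressure limit, otherwise adjust it toward a target retentate volume. The gain, target volume and limit handling below are illustrative assumptions, not the patented control system.

```python
def pump_speed_step(speed, pressure_psi, retentate_ml,
                    max_psi=25.0, target_ml=250.0, gain=0.1):
    """One control-loop step: back off when pressure reaches the limit,
    otherwise nudge the pump speed to hold the retentate near its target."""
    if pressure_psi >= max_psi:
        return speed * 0.5          # halve the speed at the pressure limit
    error = (retentate_ml - target_ml) / target_ml
    return max(0.0, speed * (1.0 + gain * error))

print(pump_speed_step(100.0, 30.0, 250.0))  # 50.0 (pressure limit hit)
print(pump_speed_step(100.0, 10.0, 250.0))  # 100.0 (on target, unchanged)
```

Called once per sensor reading, a loop like this keeps the pressure below 25 psi while holding the retentate near the target volume.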

  12. Hierarchical imaging: a new concept for targeted imaging of large volumes from cells to tissues.

    Science.gov (United States)

    Wacker, Irene; Spomer, Waldemar; Hofmann, Andreas; Thaler, Marlene; Hillmer, Stefan; Gengenbach, Ulrich; Schröder, Rasmus R

    2016-12-12

Imaging large volumes such as entire cells or small model organisms at nanoscale resolution has so far seemed an unrealistic, rather tedious task. Now, technical advances have led to several electron microscopy (EM) large volume imaging techniques. One is array tomography, where ribbons of ultrathin serial sections are deposited on solid substrates like silicon wafers or glass coverslips. To ensure reliable retrieval of multiple ribbons from the boat of a diamond knife we introduce a substrate holder with 7 axes of translation or rotation specifically designed for that purpose. With this device we are able to deposit hundreds of sections in an ordered way in an area of 22 × 22 mm, the size of a coverslip. Imaging such arrays in a standard wide field fluorescence microscope produces reconstructions with 200 nm lateral resolution and 100 nm (the section thickness) resolution in z. By hierarchical imaging cascades in the scanning electron microscope (SEM), using a new software platform, we can address volumes from single cells to complete organs. In our first example, a cell population isolated from zebrafish spleen, we characterize different cell types according to their organelle inventory by segmenting 3D reconstructions of complete cells imaged at nanoscale resolution. In addition, by screening large numbers of cells at decreased resolution we can define the percentage at which different cell types are present in our preparation. With the second example, the root tip of cress, we illustrate how combining information from intermediate resolution data with high resolution data from selected regions of interest can drastically reduce the amount of data that has to be recorded. By imaging only the interesting parts of a sample, considerably less data need to be stored, handled and eventually analysed. Our custom-designed substrate holder allows reproducible generation of section libraries, which can then be imaged in a hierarchical way. We demonstrate that EM
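The data-reduction argument can be made concrete with a back-of-envelope calculation: survey the whole section coarsely and record only regions of interest at full resolution. All areas and pixel sizes below are hypothetical, chosen only to show the scale of the saving.

```python
def stored_pixels(area_um2, pixel_nm):
    """Number of pixels needed to cover an area at a given pixel size."""
    px_um = pixel_nm / 1000.0
    return area_um2 / (px_um * px_um)

full_area = 1_000_000.0   # total section area, um^2 (hypothetical)
roi_area = 10_000.0       # selected regions of interest, um^2 (hypothetical)

# Uniform high-resolution imaging vs. coarse survey + high-res ROIs only.
uniform_high = stored_pixels(full_area, pixel_nm=5)
hierarchical = stored_pixels(full_area, pixel_nm=100) + stored_pixels(roi_area, pixel_nm=5)

print(round(uniform_high / hierarchical, 1))  # 80.0, i.e. ~80x less data
```

With these (made-up) numbers, hierarchical acquisition stores roughly 80 times fewer pixels than imaging everything at the highest resolution.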

  13. On the accuracy of protein determination in large biological samples by prompt gamma neutron activation analysis

    International Nuclear Information System (INIS)

    Kasviki, K.; Stamatelatos, I.E.; Yannakopoulou, E.; Papadopoulou, P.; Kalef-Ezra, J.

    2007-01-01

A prompt gamma neutron activation analysis (PGNAA) facility has been developed for the determination of nitrogen, and thus total protein, in large volume biological samples or the whole body of small animals. In the present work, the accuracy of nitrogen determination by PGNAA in phantoms of known composition as well as in four raw ground meat samples of about 1 kg mass was examined. Dumas combustion and Kjeldahl techniques were also used for the assessment of nitrogen concentration in the meat samples. No statistically significant differences were found between the concentrations assessed by the three techniques. The results of this work demonstrate the applicability of PGNAA for the assessment of total protein in biological samples of 0.25-1.5 kg mass, such as a meat sample or the body of a small animal, even in vivo, with an equivalent radiation dose of about 40 mSv.
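Since total protein is obtained from measured nitrogen, the final step is a single multiplication by a nitrogen-to-protein conversion factor (conventionally 6.25 for meat, as in Kjeldahl analysis). A minimal sketch, with an illustrative nitrogen mass:

```python
def protein_from_nitrogen(n_mass_g, factor=6.25):
    """Total protein from measured nitrogen, using the conventional
    Kjeldahl nitrogen-to-protein conversion factor (6.25 for meat)."""
    return n_mass_g * factor

# Hypothetical example: 32 g of nitrogen measured by PGNAA in a ~1 kg sample.
print(protein_from_nitrogen(32.0))  # 200.0 g of protein
```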

  14. On the accuracy of protein determination in large biological samples by prompt gamma neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kasviki, K. [Institute of Nuclear Technology and Radiation Protection, NCSR ' Demokritos' , Aghia Paraskevi, Attikis 15310 (Greece); Medical Physics Laboratory, Medical School, University of Ioannina, Ioannina 45110 (Greece); Stamatelatos, I.E. [Institute of Nuclear Technology and Radiation Protection, NCSR ' Demokritos' , Aghia Paraskevi, Attikis 15310 (Greece)], E-mail: ion@ipta.demokritos.gr; Yannakopoulou, E. [Institute of Physical Chemistry, NCSR ' Demokritos' , Aghia Paraskevi, Attikis 15310 (Greece); Papadopoulou, P. [Institute of Technology of Agricultural Products, NAGREF, Lycovrissi, Attikis 14123 (Greece); Kalef-Ezra, J. [Medical Physics Laboratory, Medical School, University of Ioannina, Ioannina 45110 (Greece)

    2007-10-15

A prompt gamma neutron activation analysis (PGNAA) facility has been developed for the determination of nitrogen, and thus total protein, in large volume biological samples or the whole body of small animals. In the present work, the accuracy of nitrogen determination by PGNAA in phantoms of known composition as well as in four raw ground meat samples of about 1 kg mass was examined. Dumas combustion and Kjeldahl techniques were also used for the assessment of nitrogen concentration in the meat samples. No statistically significant differences were found between the concentrations assessed by the three techniques. The results of this work demonstrate the applicability of PGNAA for the assessment of total protein in biological samples of 0.25-1.5 kg mass, such as a meat sample or the body of a small animal, even in vivo, with an equivalent radiation dose of about 40 mSv.

  15. Methods of pre-concentration of radionuclides from large volume samples

    International Nuclear Information System (INIS)

    Olahova, K.; Matel, L.; Rosskopfova, O.

    2006-01-01

    The development of radioanalytical methods for low level radionuclides in environmental samples is presented. In particular, emphasis is placed on the introduction of extraction chromatography as a tool for improving the quality of results as well as reducing the analysis time. However, the advantageous application of extraction chromatography often depends on the effective use of suitable preconcentration techniques, such as co-precipitation, to reduce the amount of matrix components which accompany the analytes of interest. On-going investigations in this field relevant to the determination of environmental levels of actinides and ⁹⁰Sr are discussed. (authors)

  16. Critical length sampling: a method to estimate the volume of downed coarse woody debris

    Science.gov (United States)

    Göran Ståhl; Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey

    2010-01-01

    In this paper, critical length sampling for estimating the volume of downed coarse woody debris is presented. Using this method, the volume of downed wood in a stand can be estimated by summing the critical lengths of down logs included in a sample obtained using a relascope or wedge prism; typically, the instrument should be tilted 90° from its usual...

  17. Radiation from Large Gas Volumes and Heat Exchange in Steam Boiler Furnaces

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, A. N., E-mail: tgtu-kafedra-ese@mail.ru [Tver State Technical University (Russian Federation)

    2015-09-15

    Radiation from large cylindrical gas volumes is studied as a means of simulating the flare in steam boiler furnaces. Calculations of heat exchange in a furnace by the zonal method and by simulation of the flare with cylindrical gas volumes are described. The latter method is more accurate and yields more reliable information on heat transfer processes taking place in furnaces.

  18. Genetic Influences on Pulmonary Function: A Large Sample Twin Study

    DEFF Research Database (Denmark)

    Ingebrigtsen, Truls S; Thomsen, Simon F; van der Sluis, Sophie

    2011-01-01

    Heritability of forced expiratory volume in one second (FEV(1)), forced vital capacity (FVC), and peak expiratory flow (PEF) has not been previously addressed in large twin studies. We evaluated the genetic contribution to individual differences observed in FEV(1), FVC, and PEF using data from...... the largest population-based twin study on spirometry. Specially trained lay interviewers with previous experience in spirometric measurements tested 4,314 Danish twins (individuals), 46-68 years of age, in their homes using a hand-held spirometer, and their flow-volume curves were evaluated. Modern variance...

  19. “Finite” non-Gaussianities and tensor-scalar ratio in large volume Swiss-cheese compactifications

    Science.gov (United States)

    Misra, Aalok; Shukla, Pramod

    2009-03-01

    Developing on the ideas of (Section 4 of) [A. Misra, P. Shukla, Moduli stabilization, large-volume dS minimum without anti-D3-branes, (non-)supersymmetric black hole attractors and two-parameter Swiss cheese Calabi-Yau's, Nucl. Phys. B 799 (2008) 165-198, arXiv: 0707.0105] and [A. Misra, P. Shukla, Large volume axionic Swiss-cheese inflation, Nucl. Phys. B 800 (2008) 384-400, arXiv: 0712.1260 [hep-th

  20. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna

    2015-05-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.
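The working-set idea described above can be sketched concretely: for each brick that survives visibility culling, pick the coarsest mip level whose projected voxel size still covers about one pixel, so the set of resident bricks scales with what is on screen rather than with the full data set. The function names and the pinhole-style footprint formula below are illustrative assumptions, not the survey's API.

```python
import math

MAX_LOD = 8  # coarsest mip level available (assumed)

def required_lod(brick_distance, voxel_size, focal_px):
    # Projected size of one voxel in pixels under a pinhole camera model:
    # footprint ~ focal_px * voxel_size / distance.
    footprint = focal_px * voxel_size / brick_distance
    if footprint >= 1.0:
        return 0  # full resolution: one voxel already covers >= 1 pixel
    # Otherwise coarsen: each mip level doubles the voxel size.
    return min(MAX_LOD, int(math.floor(math.log2(1.0 / footprint))))

def working_set(visible_bricks, voxel_size=1e-3, focal_px=1000.0):
    # visible_bricks: (brick_id, distance) pairs that survived ray-guided
    # visibility culling (assumed done upstream by the renderer).
    return {(bid, required_lod(d, voxel_size, focal_px))
            for bid, d in visible_bricks}
```

Near bricks resolve to level 0 and distant bricks to coarser levels, which is what makes memory traffic 'output-sensitive'.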

  1. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna; Hadwiger, Markus; Pfister, Hanspeter

    2015-01-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks-the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context-the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.

  2. Sample volume and alignment analysis for an optical particle counter sizer, and other applications

    International Nuclear Information System (INIS)

    Holve, D.J.; Davis, G.W.

    1985-01-01

    Optical methods for particle size distribution measurements in practical high temperature environments are approaching feasibility and offer significant advantages over conventional sampling methods. A key requirement of single particle counting techniques is the need to know features of the sample volume intensity distribution which in general are a function of the particle scattering properties and optical system geometry. In addition, the sample volume intensity distribution is sensitive to system alignment and thus calculations of alignment sensitivity are required for assessment of practical alignment tolerances. To this end, an analysis of sample volume characteristics for single particle counters in general has been developed. Results from the theory are compared with experimental measurements and shown to be in good agreement. A parametric sensitivity analysis is performed and a criterion for allowable optical misalignment is derived for conditions where beam steering caused by fluctuating refractive-index gradients is significant.

  3. Development of a solid-phase extraction system modified for preconcentration of emerging contaminants in large sample volumes from rivers of the lagoon system in the city of Rio de Janeiro, Brazil.

    Science.gov (United States)

    Lopes, Vitor Sergio Almeida; Riente, Roselene Ribeiro; da Silva, Alexsandro Araújo; Torquilho, Delma Falcão; Carreira, Renato da Silva; Marques, Mônica Regina da Costa

    2016-09-15

    A single method modified for monitoring of emerging contaminants in river water was developed for large sample volumes. Water samples from rivers of the lagoon system in the city of Rio de Janeiro (Brazil) were analyzed by the SPE-HPLC-MS-TOF analytical method. Acetaminophen was detected in four rivers in the concentration range of 0.09 μg/L to 0.14 μg/L. Salicylic acid was also found in the four rivers in the concentration range of 1.65 μg/L to 4.81 μg/L. Bisphenol-A was detected in all rivers in the concentration range of 1.37 μg/L to 39.86 μg/L. Diclofenac was found in only one river, with a concentration of 0.22 μg/L. The levels of emerging organic pollutants in the water samples of the Jacarepaguá hydrographical basin are significant. The compounds are not routinely monitored and present potential risks to environmental health. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Ultrasensitive multiplex optical quantification of bacteria in large samples of biofluids

    Science.gov (United States)

    Pazos-Perez, Nicolas; Pazos, Elena; Catala, Carme; Mir-Simon, Bernat; Gómez-de Pedro, Sara; Sagales, Juan; Villanueva, Carlos; Vila, Jordi; Soriano, Alex; García de Abajo, F. Javier; Alvarez-Puebla, Ramon A.

    2016-01-01

    Efficient treatments in bacterial infections require the fast and accurate recognition of pathogens, with concentrations as low as one per milliliter in the case of septicemia. Detecting and quantifying bacteria in such low concentrations is challenging and typically demands cultures of large samples of blood (~1 milliliter) extending over 24–72 hours. This delay seriously compromises the health of patients. Here we demonstrate a fast microorganism optical detection system for the exhaustive identification and quantification of pathogens in volumes of biofluids with clinical relevance (~1 milliliter) in minutes. We drive each type of bacteria to accumulate antibody functionalized SERS-labelled silver nanoparticles. Particle aggregation on the bacteria membranes renders dense arrays of inter-particle gaps in which the Raman signal is exponentially amplified by several orders of magnitude relative to the dispersed particles. This enables a multiplex identification of the microorganisms through the molecule-specific spectral fingerprints. PMID:27364357

  5. Enhanced FIB-SEM systems for large-volume 3D imaging

    Science.gov (United States)

    Xu, C Shan; Hayworth, Kenneth J; Lu, Zhiyuan; Grob, Patricia; Hassan, Ahmed M; García-Cerdán, José G; Niyogi, Krishna K; Nogales, Eva; Weinberg, Richard J; Hess, Harald F

    2017-01-01

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology. DOI: http://dx.doi.org/10.7554/eLife.25916.001 PMID:28500755

  6. GPU-Based 3D Cone-Beam CT Image Reconstruction for Large Data Volume

    Directory of Open Access Journals (Sweden)

    Xing Zhao

    2009-01-01

    Full Text Available Currently, 3D cone-beam CT image reconstruction speed is still a severe limitation for clinical application. The computational power of modern graphics processing units (GPUs has been harnessed to provide impressive acceleration of 3D volume image reconstruction. For extra large data volume exceeding the physical graphic memory of GPU, a straightforward compromise is to divide data volume into blocks. Different from the conventional Octree partition method, a new partition scheme is proposed in this paper. This method divides both projection data and reconstructed image volume into subsets according to geometric symmetries in circular cone-beam projection layout, and a fast reconstruction for large data volume can be implemented by packing the subsets of projection data into the RGBA channels of GPU, performing the reconstruction chunk by chunk and combining the individual results in the end. The method is evaluated by reconstructing 3D images from computer-simulation data and real micro-CT data. Our results indicate that the GPU implementation can maintain original precision and speed up the reconstruction process by 110–120 times for circular cone-beam scan, as compared to traditional CPU implementation.
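Stripped of the GPU and cone-beam specifics, the chunk-by-chunk strategy above amounts to reconstructing disjoint sub-volumes independently and stitching them together, which bounds the memory needed per pass. A minimal 2D parallel-beam analogue with NumPy (unfiltered backprojection; all names are illustrative, not the paper's implementation):

```python
import numpy as np

def backproject_rows(sinogram, angles, size, row0, row1):
    # Unfiltered backprojection of a parallel-beam sinogram onto rows
    # [row0, row1) of a size x size image grid.
    xs = np.arange(size) - size / 2 + 0.5
    X, Y = np.meshgrid(xs, xs[row0:row1])
    recon = np.zeros((row1 - row0, size))
    for proj, th in zip(sinogram, angles):
        # Detector coordinate of each pixel for this view angle.
        t = X * np.cos(th) + Y * np.sin(th) + size / 2 - 0.5
        recon += np.interp(t, np.arange(size), proj)
    return recon / len(angles)

def backproject_chunked(sinogram, angles, size, n_chunks):
    # Reconstruct independent row blocks (the "chunks"), then combine
    # the partial results at the end.
    bounds = np.linspace(0, size, n_chunks + 1, dtype=int)
    return np.vstack([backproject_rows(sinogram, angles, size, a, b)
                      for a, b in zip(bounds[:-1], bounds[1:])])
```

Because each output row depends only on the projection data, the chunked result is identical to a single-pass reconstruction, so splitting costs no accuracy.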

  7. Transportable aerosol sampling station with fixed volume (15 l) DMPA-15

    International Nuclear Information System (INIS)

    Giolu, G.; Guta, V.

    1999-01-01

    The mobile installation is used for air-sampling operations with fixed intake volumes, to be analysed by laboratories of routine environmental air monitoring. The station consists of several units, installed on a two-wheel mobile carriage-type platform: - a double-diaphragm pump (ensuring oil separation) that provides air intake and its evacuation to the air analysers. The sampling and control unit has the following functions: - intake, ensured by the pump, which aspirates fixed volumes of air from the ambient atmosphere and feeds an inflatable rubber chamber. Air intake is automatically stopped once the cushion is completely filled; a separation clamp is provided to seal the cushion; - exhaust, which allows the residual air to be evacuated from the cushion, ensuring its 'self-cleaning'; - shut down, manually operated; - analyse, in which the aerosol-containing sample is aspirated from the inflatable rubber chamber and evacuated through a flow regulator to the analyser; - stop, cancelling any previous commands. A relay unit controls the pneumatic lines and a pressure relay provides automatic stop of the air intake process. The following technical features are given: - fixed air volume in the chamber, 15 l; - air flow at the exit from the flow-meter, 0-15 l/min; - power requirements, 220 V/50 Hz; - power consumption, max. 1.5 kW; - overall dimensions, 460 x 500 x 820 mm; - weight, 53 kg. (authors)

  8. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

    The recent tendency of radioactive waste treatment and disposal both in China and abroad is reviewed. The feasibility of the large volume casting cementation process for treating and disposing of the intermediate level radioactive waste from a spent fuel reprocessing plant in shallow land is assessed on the basis of analyses of the experimental results (such as formulation studies and measurements of the properties of the solidified radioactive waste). It can be concluded that the large volume casting cementation process is a promising, safe and economic process. It is feasible to dispose of the intermediate level radioactive waste from the reprocessing plant if the disposal site chosen has reasonable geological and geographical conditions and some additional effective protection means are taken

  9. Development of large-volume rhyolitic ignimbrites (LRI's): The Chalupas Caldera, an example from Ecuador

    International Nuclear Information System (INIS)

    Hammersley, L.; DePaolo, D.J; Beate, B

    2001-01-01

    The mechanisms responsible for the generation of large volumes of silicic magma and the eruption of large-volume rhyolitic ignimbrites (LRI's) remain poorly understood. Of particular interest are the relative roles of crustal assimilation, fractional crystallization and magma supply and the processes by which large volumes of magma accumulate in crustal chambers rather than erupt in smaller batches. Isotope geochemistry, combined with study of major and trace element variations of lavas, can be used to infer the relative contribution of crustal material and continued magmatic supply. Timescales for the accumulation of magma can be estimated using detailed geochronology. Magma supply rates can be estimated from eruption rates of nearby volcanoes. In this study we investigate the evolution of the Chalupas LRI, a caldera system in the Ecuadorian Andes where LRI's are rare in comparison to the Southern Volcanic Zone (SVZ) of South America (au)

  10. Measurement of Atmospheric Neutrino Oscillations with Very Large Volume Neutrino Telescopes

    Directory of Open Access Journals (Sweden)

    J. P. Yáñez

    2015-01-01

    Full Text Available Neutrino oscillations have been probed during the last few decades using multiple neutrino sources and experimental set-ups. In the recent years, very large volume neutrino telescopes have started contributing to the field. First ANTARES and then IceCube have relied on large and sparsely instrumented volumes to observe atmospheric neutrinos for combinations of baselines and energies inaccessible to other experiments. Using this advantage, the latest result from IceCube starts approaching the precision of other established technologies and is paving the way for future detectors, such as ORCA and PINGU. These new projects seek to provide better measurements of neutrino oscillation parameters and eventually determine the neutrino mass ordering. The results from running experiments and the potential from proposed projects are discussed in this review, emphasizing the experimental challenges involved in the measurements.

  11. Critical point relascope sampling for unbiased volume estimation of downed coarse woody debris

    Science.gov (United States)

    Jeffrey H. Gove; Michael S. Williams; Mark J. Ducey; Mark J. Ducey

    2005-01-01

    Critical point relascope sampling is developed and shown to be design-unbiased for the estimation of log volume when used with point relascope sampling for downed coarse woody debris. The method is closely related to critical height sampling for standing trees when trees are first sampled with a wedge prism. Three alternative protocols for determining the critical...

  12. Flow-through electroporation based on constant voltage for large-volume transfection of cells.

    Science.gov (United States)

    Geng, Tao; Zhan, Yihong; Wang, Hsiang-Yu; Witting, Scott R; Cornetta, Kenneth G; Lu, Chang

    2010-05-21

    Genetic modification of cells is a critical step involved in many cell therapy and gene therapy protocols. In these applications, cell samples of large volume (10⁸-10⁹ cells) are often processed for transfection. This poses new challenges for current transfection methods and practices. Here we present a novel flow-through electroporation method for delivery of genes into cells at high flow rates (up to approximately 20 mL/min) based on disposable microfluidic chips, a syringe pump, and a low-cost direct current (DC) power supply that provides a constant voltage. By eliminating the pulse generators used in conventional electroporation, we dramatically lowered the cost of the apparatus and improved the stability and consistency of the electroporation field for long-time operation. We tested the delivery of pEGFP-C1 plasmids encoding enhanced green fluorescent protein into Chinese hamster ovary (CHO-K1) cells in devices of various dimensions and geometries. Cells were mixed with plasmids and then flowed through a fluidic channel continuously while a constant voltage was established across the device. Together with the applied voltage, the geometry and dimensions of the fluidic channel determined the electrical parameters of the electroporation. With the optimal design, approximately 75% of the viable CHO cells were transfected after the procedure. We also generalize the guidelines for scaling up these flow-through electroporation devices. We envision that this technique will serve as a generic and low-cost tool for a variety of clinical applications requiring large volumes of transfected cells. Copyright 2010 Elsevier B.V. All rights reserved.
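In this flow-through geometry the two key electrical parameters fall out of the channel dimensions: the constant voltage over the channel length sets the field strength, and the channel volume over the volumetric flow rate sets the residence time that replaces the pulse duration of a conventional pulse generator. A back-of-the-envelope sketch of these relations (the numbers in the usage note are illustrative, not the authors' device values):

```python
def flow_through_params(voltage_v, channel_length_cm,
                        channel_volume_ul, flow_rate_ul_per_min):
    # Field strength is set by the constant DC voltage over the channel
    # length; the residence time (the effective "pulse duration") is the
    # channel volume divided by the volumetric flow rate.
    field_v_per_cm = voltage_v / channel_length_cm
    residence_s = channel_volume_ul / flow_rate_ul_per_min * 60.0
    return field_v_per_cm, residence_s
```

For example, 100 V across a 0.5 cm channel gives 200 V/cm, and a 10 µL channel at 600 µL/min exposes each cell for about 1 s.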

  13. The parallel volume at large distances

    DEFF Research Database (Denmark)

    Kampf, Jürgen

    In this paper we examine the asymptotic behavior of the parallel volume of planar non-convex bodies as the distance tends to infinity. We show that the difference between the parallel volume of the convex hull of a body and the parallel volume of the body itself tends to 0. This yields a new proof...... for the fact that a planar body can only have polynomial parallel volume, if it is convex. Extensions to Minkowski spaces and random sets are also discussed....

  14. The parallel volume at large distances

    DEFF Research Database (Denmark)

    Kampf, Jürgen

    In this paper we examine the asymptotic behavior of the parallel volume of planar non-convex bodies as the distance tends to infinity. We show that the difference between the parallel volume of the convex hull of a body and the parallel volume of the body itself tends to 0. This yields a new proof...... for the fact that a planar body can only have polynomial parallel volume, if it is convex. Extensions to Minkowski spaces and random sets are also discussed....
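The claim in the two records above can be written out explicitly. The notation below (area $V_2$, unit disc $B^2$, Minkowski sum $\oplus$) is assumed standard convex-geometry usage, not taken from the paper:

```latex
\lim_{r\to\infty}\Bigl[\,V_2\bigl(\operatorname{conv}(K)\oplus rB^2\bigr)
  - V_2\bigl(K\oplus rB^2\bigr)\Bigr] = 0 .
```

For convex $K$ the Steiner formula gives the exact quadratic $V_2(K \oplus rB^2) = V_2(K) + S(K)\,r + \pi r^2$, with $S(K)$ the perimeter; the limit above is what forces any planar body whose parallel volume is a polynomial in $r$ to be convex.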

  15. Sampling large random knots in a confined space

    International Nuclear Information System (INIS)

    Arsuaga, J; Blackstone, T; Diao, Y; Hinson, K; Karadayi, E; Saito, M

    2007-01-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications

  16. Sampling large random knots in a confined space

    Science.gov (United States)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.

  17. Sampling large random knots in a confined space

    Energy Technology Data Exchange (ETDEWEB)

    Arsuaga, J [Department of Mathematics, San Francisco State University, 1600 Holloway Ave, San Francisco, CA 94132 (United States); Blackstone, T [Department of Computer Science, San Francisco State University, 1600 Holloway Ave., San Francisco, CA 94132 (United States); Diao, Y [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Hinson, K [Department of Mathematics and Statistics, University of North Carolina at Charlotte, Charlotte, NC 28223 (United States); Karadayi, E [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States); Saito, M [Department of Mathematics, University of South Florida, 4202 E Fowler Avenue, Tampa, FL 33620 (United States)

    2007-09-28

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is at the order of O(n^2). Therefore, the two-dimensional uniform random polygons offer an effective way in sampling large (prime) knots, which can be useful in various applications.
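The O(n^2) crossing growth is easy to probe empirically for the 2D uniform random polygon itself: draw n i.i.d. uniform vertices, close them into a cycle, and count proper crossings between non-adjacent edges. A small self-contained sketch (not the authors' code; function names are illustrative):

```python
import random

def _orient(a, b, c):
    # Sign of the cross product (b - a) x (c - a).
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p, q, r, s):
    # Proper (transversal) intersection test for 2D segments pq and rs.
    return (_orient(r, s, p) * _orient(r, s, q) < 0 and
            _orient(p, q, r) * _orient(p, q, s) < 0)

def crossings_of_uniform_polygon(n, rng=random):
    # 2D uniform random polygon: n i.i.d. uniform vertices in the unit
    # square, closed into a cycle; count crossings of non-adjacent edges.
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if (i + 1) % n == j or (j + 1) % n == i:
                continue  # edges sharing a vertex cannot properly cross
            if segments_cross(*edges[i], *edges[j]):
                count += 1
    return count
```

Averaging over many polygons, the crossing count grows roughly quadratically in n, consistent with the abstract's O(n^2) statement.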

  18. Removal of rare gases from large volume airstreams

    International Nuclear Information System (INIS)

    Hopke, P.K.; Leong, K.H.; Stukel, J.J.; Lewis, C.; Jebackumar, R.; Illinois Univ., Urbana; Illinois Univ., Urbana

    1986-01-01

    The cost-effective removal of low levels of rare gases and particularly radon from large volume air flows is a difficult problem. The use of various scrubbing systems using non-conventional fluids has been studied. The parameters for both a packed tower absorber and a gas scrubber have been calculated for a system using perfluorobenzene as the fluid. Based on these parameters, a packed bed tower of conventional proportions is feasible for the removal of >95% of 37 Bq/m³ of radon from a flow of 4.7 m³/second. (author)

  19. High-Dimensional Function Approximation With Neural Networks for Large Volumes of Data.

    Science.gov (United States)

    Andras, Peter

    2018-02-01

    Approximation of high-dimensional functions is a challenge for neural networks due to the curse of dimensionality. Often the data for which the approximated function is defined resides on a low-dimensional manifold and in principle the approximation of the function over this manifold should improve the approximation performance. It has been shown that projecting the data manifold into a lower dimensional space, followed by the neural network approximation of the function over this space, provides a more precise approximation of the function than the approximation of the function with neural networks in the original data space. However, if the data volume is very large, the projection into the low-dimensional space has to be based on a limited sample of the data. Here, we investigate the nature of the approximation error of neural networks trained over the projection space. We show that such neural networks should have better approximation performance than neural networks trained on high-dimensional data even if the projection is based on a relatively sparse sample of the data manifold. We also find that it is preferable to use a uniformly distributed sparse sample of the data for the purpose of the generation of the low-dimensional projection. We illustrate these results considering the practical neural network approximation of a set of functions defined on high-dimensional data including real world data as well.
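The projection step described above can be sketched with plain NumPy: build a projection from a small, uniformly drawn subsample of data that lives on a low-dimensional manifold, then map the full data set through it before training. PCA and the toy 2-D-in-20-D linear setup are illustrative stand-ins (the paper does not prescribe a specific projection method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set in R^20 that actually lives on a 2-D linear "manifold":
# latent coordinates z embedded by a fixed linear map A. The manifold is
# linear here (an assumption) so plain PCA recovers it exactly.
d_high, d_low, n = 20, 2, 2000
A = rng.standard_normal((d_high, d_low))
z = rng.standard_normal((n, d_low))
X = z @ A.T
y = np.sin(z[:, 0]) + z[:, 1]  # target depends only on the manifold coords

# Build the projection from a sparse, uniformly drawn sample of the data
# rather than from the full data volume.
sample = X[rng.choice(n, size=50, replace=False)]
center = sample.mean(axis=0)
_, _, Vt = np.linalg.svd(sample - center, full_matrices=False)
P = Vt[:d_low]  # top-2 principal directions estimated from 50 points

# Low-dimensional coordinates for every point; a neural network would now
# be trained on (X_proj, y) instead of (X, y).
X_proj = (X - center) @ P.T
```

Even though only 50 of the 2000 points defined the projection, it captures the manifold, so no information relevant to y is lost in the 20-to-2 reduction.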

  20. Importance sampling large deviations in nonequilibrium steady states. I

    Science.gov (United States)

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T.

    2018-03-01

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.

  1. Importance sampling large deviations in nonequilibrium steady states. I.

    Science.gov (United States)

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T

    2018-03-28

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
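For the simplest member of the paper's test set, a free Brownian walker with the time-integrated displacement as observable, the scaled cumulant generating function can be estimated by brute-force trajectory sampling, and the Gaussian answer psi(s) = s^2 sigma^2 / 2 is exact, which makes the naive estimator easy to check. This is a sketch under those assumptions, not the paper's algorithm; the growing variance of exp(s*A_T) as s increases is exactly why the guiding functions discussed above become necessary.

```python
import numpy as np

def scgf_direct(s, T=10.0, dt=0.01, n_traj=5000, sigma=1.0, seed=1):
    # Naive estimate of psi(s) = (1/T) ln E[exp(s * A_T)] for a free
    # Brownian walker, with A_T = x(T) - x(0) ~ N(0, sigma^2 T).
    rng = np.random.default_rng(seed)
    steps = int(T / dt)
    incr = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_traj, steps))
    A = incr.sum(axis=1)  # time-integrated observable per trajectory
    m = (s * A).max()     # log-sum-exp shift for numerical stability
    return (m + np.log(np.mean(np.exp(s * A - m)))) / T

# Exact Gaussian result for comparison: psi(s) = s**2 * sigma**2 / 2.
```

At moderate bias the estimate agrees with s^2/2, but pushing s larger makes the same estimator collapse: the mean becomes dominated by a handful of trajectories, the Monte Carlo face of the exponentially diverging correlations noted in the abstract.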

  2. Zirconia coated stir bar sorptive extraction combined with large volume sample stacking capillary electrophoresis-indirect ultraviolet detection for the determination of chemical warfare agent degradation products in water samples.

    Science.gov (United States)

    Li, Pingjing; Hu, Bin; Li, Xiaoyong

    2012-07-20

    In this study, a sensitive, selective and reliable analytical method by combining zirconia (ZrO₂) coated stir bar sorptive extraction (SBSE) with large volume sample stacking capillary electrophoresis-indirect ultraviolet (LVSS-CE/indirect UV) was developed for the direct analysis of chemical warfare agent degradation products of alkyl alkylphosphonic acids (AAPAs) (including ethyl methylphosphonic acid (EMPA) and pinacolyl methylphosphonate (PMPA)) and methylphosphonic acid (MPA) in environmental waters. ZrO₂ coated stir bar was prepared by adhering nanometer-sized ZrO₂ particles onto the surface of stir bar with commercial PDMS sol as adhesion agent. Due to the high affinity of ZrO₂ to the electronegative phosphonate group, ZrO₂ coated stir bars could selectively extract the strongly polar AAPAs and MPA. After systematically optimizing the extraction conditions of ZrO₂-SBSE, the analytical performance of ZrO₂-SBSE-CE/indirect UV and ZrO₂-SBSE-LVSS-CE/indirect UV was assessed. The limits of detection (LODs, at a signal-to-noise ratio of 3) obtained by ZrO₂-SBSE-CE/indirect UV were 13.4-15.9 μg/L for PMPA, EMPA and MPA. The relative standard deviations (RSDs, n=7, c=200 μg/L) of the corrected peak area for the target analytes were in the range of 6.4-8.8%. Enhancement factors (EFs) in terms of LODs were found to be from 112- to 145-fold. By combining ZrO₂ coating SBSE with LVSS as a dual preconcentration strategy, the EFs were magnified up to 1583-fold, and the LODs of ZrO₂-SBSE-LVSS-CE/indirect UV were 1.4, 1.2 and 3.1 μg/L for PMPA, EMPA, and MPA, respectively. The RSDs (n=7, c=20 μg/L) were found to be in the range of 9.0-11.8%. The developed ZrO₂-SBSE-LVSS-CE/indirect UV method has been successfully applied to the analysis of PMPA, EMPA, and MPA in different environmental water samples, and the recoveries for the spiked water samples were found to be in the range of 93.8-105.3%. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  4. The BREAST-V: a unifying predictive formula for volume assessment in small, medium, and large breasts.

    Science.gov (United States)

    Longo, Benedetto; Farcomeni, Alessio; Ferri, Germano; Campanale, Antonella; Sorotos, Micheal; Santanelli, Fabio

    2013-07-01

    Breast volume assessment enhances preoperative planning of both aesthetic and reconstructive procedures, helping the surgeon in the decision-making process of shaping the breast. Numerous methods of breast size determination are currently reported but are limited by methodologic flaws and variable estimations. The authors aimed to develop a unifying predictive formula for volume assessment in small to large breasts based on anthropomorphic values. Ten anthropomorphic breast measurements and direct volumes of 108 mastectomy specimens from 88 women were collected prospectively. The authors performed a multivariate regression to build the optimal model for development of the predictive formula. The final model was then internally validated. A previously published formula was used as a reference. Mean (±SD) breast weight was 527.9 ± 227.6 g (range, 150 to 1250 g). After model selection, the sternal notch-to-nipple, inframammary fold-to-nipple, and inframammary fold-to-fold projection distances emerged as the most important predictors. The resulting formula (the BREAST-V) showed an adjusted R² of 0.73. The estimated expected absolute error on new breasts is 89.7 g (95 percent CI, 62.4 to 119.1 g) and the expected relative error is 18.4 percent (95 percent CI, 12.9 to 24.3 percent). Application of the reference formula to the sample yielded worse predictions than those derived from the new formula, showing an R² of 0.55. The BREAST-V is a reliable tool for accurately predicting small to large breast volumes, for use as a complement to the surgeon's evaluation. An app entitled BREAST-V for both iOS and Android devices is currently available for free download in the Apple App Store and Google Play Store. Diagnostic, II.
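    The model-building step summarized above, a multivariate linear regression from three distance measurements to breast volume with adjusted R² as the quality metric, can be sketched on synthetic data. The coefficients, measurement ranges, and noise level below are invented for illustration and are not the BREAST-V formula.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 108  # same sample size as the study; the data here are synthetic

# Hypothetical anthropomorphic predictors (cm): sternal notch-to-nipple,
# inframammary fold-to-nipple, and fold-to-fold projection distances.
X = rng.uniform([19, 7, 14], [30, 12, 22], size=(n, 3))
# Synthetic volumes (g) with noise; coefficients are illustrative only.
y = -900 + 30 * X[:, 0] + 45 * X[:, 1] + 25 * X[:, 2] + rng.normal(0, 90, n)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
ss_res = resid @ resid
ss_tot = ((y - y.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
p = X.shape[1]
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)  # penalizes predictor count
print(round(adj_r2, 2))
```

    Internal validation in the paper then estimates the expected error on new cases; the adjusted R² printed here plays the same role as the reported 0.73.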

  5. Rejecting escape events in large volume Ge detectors by a pulse shape selection procedure

    International Nuclear Information System (INIS)

    Del Zoppo, A.; Agodi, C.; Alba, R.; Bellia, G.; Coniglione, R.; Loukachine, K.; Maiolino, C.; Migneco, E.; Piattelli, P.; Santonocito, D.; Sapienza, P.

    1993-01-01

    The dependence of the response to γ-rays of a large volume Ge detector on the interval width of a selected initial rise pulse slope is investigated. The number of escape events associated with a small pulse slope is found to be greater than the corresponding number of full energy events. An escape event rejection procedure based on the observed correlation between energy deposition and pulse shape is discussed. Such a procedure seems particularly suited for the design of highly granular large volume Ge detector arrays. (orig.)
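    The rejection procedure amounts to a cut in the (energy, initial-slope) plane: events whose initial rise slope falls below a threshold are discarded as likely escape events. A minimal one-dimensional slope cut can be sketched as follows; the energies, slope values, and threshold are made up for illustration.

```python
import numpy as np

def reject_by_slope(energies, slopes, slope_min):
    """Pulse-shape selection sketch: keep only events whose initial
    rise slope exceeds a threshold. Events with a small initial slope
    are enriched in escape (partial-energy) events, so cutting on the
    slope raises the full-energy fraction. Returns accepted energies."""
    energies = np.asarray(energies)
    slopes = np.asarray(slopes)
    return energies[slopes >= slope_min]

# Toy event list (energy in keV, slope in arbitrary units);
# the 0.5 threshold is an illustrative choice.
e = [1332.5, 821.5, 1332.5, 310.5, 1332.5]
s = [0.9, 0.2, 0.8, 0.15, 0.7]
kept = reject_by_slope(e, s, slope_min=0.5)
print(kept)
```

    In a real detector the threshold would be tuned from the measured slope distributions of full-energy and escape populations.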

  6. Method for Determination of Neptunium in Large-Sized Urine Samples Using Manganese Dioxide Coprecipitation and 242Pu as Yield Tracer

    DEFF Research Database (Denmark)

    Qiao, Jixin; Hou, Xiaolin; Roos, Per

    2013-01-01

    A novel method for bioassay of large volumes of human urine samples using manganese dioxide coprecipitation for preconcentration was developed for rapid determination of 237Np. 242Pu was utilized as a nonisotopic tracer to monitor the chemical yield of 237Np. A sequential injection extraction chr...... and rapid analysis of neptunium contamination level for emergency preparedness....

  7. Broadband frequency ECR ion source concepts with large resonant plasma volumes

    International Nuclear Information System (INIS)

    Alton, G.D.

    1995-01-01

    New techniques are proposed for enhancing the performances of ECR ion sources. The techniques are based on the use of high-power, variable-frequency, multiple-discrete-frequency, or broadband microwave radiation, derived from standard TWT technology, to effect large resonant ''volume'' ECR sources. The creation of a large ECR plasma ''volume'' permits coupling of more power into the plasma, resulting in the heating of a much larger electron population to higher energies, the effect of which is to produce higher charge state distributions and much higher intensities within a particular charge state than possible in present forms of the ECR ion source. If successful, these developments could significantly impact future accelerator designs and accelerator-based, heavy-ion-research programs by providing multiply-charged ion beams with the energies and intensities required for nuclear physics research from existing ECR ion sources. The methods described in this article can be used to retrofit any ECR ion source predicated on B-minimum plasma confinement techniques

  8. Numerical simulation of seismic wave propagation from land-excited large volume air-gun source

    Science.gov (United States)

    Cao, W.; Zhang, W.

    2017-12-01

    The land-excited large volume air-gun source can be used to study regional underground structures and to detect temporal velocity changes. The air-gun source is characterized by rich low-frequency energy (from bubble oscillation, 2-8 Hz) and high repeatability. It can be excited in rivers, reservoirs or man-made pools. Numerical simulation of the seismic wave propagation from the air-gun source helps in understanding the energy partitioning and the characteristics of the waveform records at stations. However, the effective energy recorded at a distant station comes from the bubble-oscillation process, which cannot be approximated by a single point source. We propose a method to simulate the seismic wave propagation from the land-excited large volume air-gun source by the finite difference method. The process can be divided into three parts: bubble oscillation and source coupling, solid-fluid coupling, and propagation in the solid medium. For the first part, the wavelet of the bubble oscillation can be simulated by a bubble model. We use the wave injection method, combining the bubble wavelet with the elastic wave equation, to achieve the source coupling. Then, the solid-fluid boundary condition is implemented along the water bottom. The last part is the seismic wave propagation in the solid medium, which can be readily implemented by the finite difference method. Our method yields accurate waveforms for the land-excited large volume air-gun source. Based on this forward modeling technology, we analyze the energy of the excited P wave and of the converted S wave for different water-body shapes. We study two land-excited large volume air-gun fields, one in Binchuan, Yunnan, and the other in Hutubi, Xinjiang. The station in Binchuan, Yunnan is located in a large irregular reservoir, and the waveform records show a clear S wave. In contrast, the station in Hutubi, Xinjiang is located in a small man-made pool, and the waveform records show a very weak S wave. Better understanding of
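    As a hedged illustration of the final stage (propagation in the medium), here is a minimal 1D acoustic finite-difference scheme with an injected low-frequency wavelet standing in for the bubble-oscillation source. The grid, velocity, and wavelet parameters are illustrative, and the solid-fluid coupling and 3D geometry of the actual method are omitted.

```python
import numpy as np

def fd1d_wave(nx=300, nt=600, dx=10.0, dt=1e-3, c=2000.0, src_ix=50):
    """Second-order 1D acoustic finite-difference propagation with a
    Ricker wavelet injected at one grid point (wave injection in the
    simplest sense). A 1D stand-in for the paper's 3D elastic solver."""
    assert c * dt / dx <= 1.0, "CFL stability condition"
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    r2 = (c * dt / dx) ** 2
    t = np.arange(nt) * dt
    f0 = 5.0  # Hz, inside the 2-8 Hz bubble band reported for the source
    arg = (np.pi * f0 * (t - 0.2)) ** 2
    wavelet = (1 - 2 * arg) * np.exp(-arg)  # Ricker wavelet, 0.2 s delay
    for it in range(nt):
        u_next = np.empty_like(u)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_next[0] = u_next[-1] = 0.0  # rigid boundaries
        u_next[src_ix] += wavelet[it] * dt ** 2  # inject source wavelet
        u_prev, u = u, u_next
    return u

u = fd1d_wave()
print(np.abs(u).max() > 0.0)
```

    Replacing the Ricker wavelet with a bubble-model wavelet, and the rigid top boundary with a fluid layer and solid-fluid condition, moves this sketch toward the method described in the abstract.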

  9. Sampling of finite elements for sparse recovery in large scale 3D electrical impedance tomography

    International Nuclear Information System (INIS)

    Javaherian, Ashkan; Moeller, Knut; Soleimani, Manuchehr

    2015-01-01

    This study proposes a method to improve performance of sparse recovery inverse solvers in 3D electrical impedance tomography (3D EIT), especially when the volume under study contains small-sized inclusions, e.g. 3D imaging of breast tumours. Initially, a quadratic regularized inverse solver is applied in a fast manner with a stopping threshold much greater than the optimum. Based on assuming a fixed level of sparsity for the conductivity field, finite elements are then sampled via applying a compressive sensing (CS) algorithm to the rough blurred estimation previously made by the quadratic solver. Finally, a sparse inverse solver is applied solely to the sampled finite elements, with the solution to the CS as its initial guess. The results show the great potential of the proposed CS-based sparse recovery in improving accuracy of sparse solution to the large-size 3D EIT. (paper)
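    The two-stage strategy can be sketched on a generic linear inverse problem standing in for the linearized EIT forward map: a fast quadratic (Tikhonov) solve gives a blurred estimate, a fixed sparsity level is assumed to select the most significant unknowns (a simple stand-in for the CS sampling of finite elements), and a least-squares solve is restricted to that support. The problem sizes and sparsity level below are illustrative.

```python
import numpy as np

def two_stage_sparse_solve(J, v, k, alpha=1e-2):
    """Sketch of a quadratic-then-sparse two-stage solve for J @ x = v:
    stage 1 is a regularized quadratic solve, stage 2 keeps the k
    largest-magnitude entries of the blurred estimate, stage 3
    re-solves by least squares on the selected support only."""
    n = J.shape[1]
    x0 = np.linalg.solve(J.T @ J + alpha * np.eye(n), J.T @ v)  # stage 1
    support = np.sort(np.argsort(np.abs(x0))[-k:])              # stage 2
    xs = np.zeros(n)
    xs[support] = np.linalg.lstsq(J[:, support], v, rcond=None)[0]  # stage 3
    return xs, support

# Toy problem: 60 measurements, 100 unknowns, a 3-sparse "inclusion" field.
rng = np.random.default_rng(0)
J = rng.normal(size=(60, 100))
x_true = np.zeros(100)
x_true[[10, 45, 80]] = [1.0, -2.0, 1.5]
v = J @ x_true
x_hat, support = two_stage_sparse_solve(J, v, k=3)
print(np.count_nonzero(x_hat), len(support))
```

    The paper's version replaces the hard top-k selection with a CS algorithm and uses the CS output as the initial guess of the final sparse solver; the restriction of the expensive solve to a small support is the point in common.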

  10. Doping the 1 kton Large Volume Detector with Gd

    International Nuclear Information System (INIS)

    Bruno, Gianmarco; Fulgione, Walter; Porta, Amanda; Machado, Ana Amelia Bergamini; Mal'gin, Alexei; Molinario, Andrea; Vigorito, Carlo

    2011-01-01

    The Large Volume Detector (LVD) in the INFN Gran Sasso National Laboratory (LNGS), Italy, is a ν observatory which has been monitoring the Galaxy since June 1992 to study neutrinos from core collapse supernovae. In its present configuration the experiment is made up of 840 scintillator detectors, for a total active mass of 1000 tons. The detector sensitivity to neutrino bursts due to a core collapse supernova has already been discussed in terms of maximum detectable distance. In this paper we evaluate the improvements that LVD could obtain if all its active scintillator mass were doped with a small amount (0.14% by weight) of gadolinium. We simulated neutron captures following ν̄_e inverse beta decay reactions in one LVD counter (1.2 ton) with Gd-doped liquid scintillator, obtaining a detection efficiency for this process of η_n|_Gd = 80% and a mean capture time τ = 25 μs, in good agreement with measured results. This implies a gain of a factor ∼20 in the signal-to-noise ratio for neutron capture detection with respect to the undoped liquid scintillator. We discuss how captures on Gd of neutrons from rock radioactivity modify the background conditions of the detector, and we calculate the curves expressing the sensitivity to a ν̄_e burst from core collapse supernovae as a function of the distance of the collapsing star. The result is that doping the 1 kton Large Volume Detector with Gd would assure a 90% detection efficiency at the distance of the Large Magellanic Cloud (50 kpc), an achievement equivalent to doubling the number of counters in LVD.
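    The quoted mean capture time τ = 25 μs fixes how long a delayed-coincidence window must be to collect most capture signals, since the capture time is exponentially distributed. A worked example follows; the 100 μs window length is an illustrative choice, not a parameter from the paper.

```python
import math

tau = 25.0          # us, mean neutron capture time with 0.14% Gd (paper)
eff_capture = 0.80  # detection efficiency for the capture process (paper)

def window_fraction(T, tau=tau):
    """Fraction of exponentially distributed capture times that fall
    inside a coincidence window of width T: 1 - exp(-T / tau)."""
    return 1.0 - math.exp(-T / tau)

T = 100.0  # us, hypothetical delayed-coincidence window (4 * tau)
eff_total = eff_capture * window_fraction(T)
print(round(window_fraction(T), 3), round(eff_total, 3))
```

    A window of a few τ therefore costs almost nothing in efficiency while keeping the accidental-coincidence rate, and hence the background, low.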

  11. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera

    2014-12-31

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
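    The consistency claim, that applying the transfer function to a pdf gives resolution-independent results where filter-then-map does not, can be checked in one dimension with an empirical histogram standing in for the sparse pdf. The transfer function and voxel values below are invented for illustration.

```python
import numpy as np

def transfer(v):
    # Hypothetical step-like transfer function: highlight intensities > 0.6
    return (v > 0.6).astype(float)

voxels = np.array([0.1, 0.3, 0.9, 0.7])  # a 4-voxel neighborhood

# Full resolution: apply the transfer function, then average the result.
full_res = transfer(voxels).mean()
# Naive low resolution: down-sample (average) first, then apply it.
naive_lowres = transfer(voxels.mean())
# pdf-based low resolution: average the transfer function against the
# neighborhood's empirical pdf (value histogram), as the paper advocates.
values, counts = np.unique(voxels, return_counts=True)
pdf_lowres = (transfer(values) * counts / counts.sum()).sum()

print(full_res, float(naive_lowres), pdf_lowres)
```

    The naive path loses the two bright voxels entirely, while the pdf path reproduces the full-resolution answer; the paper's contribution is making such pdfs sparse enough to store and fast to convolve with transfer functions at run time.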

  12. In-Situ Systematic Error Correction for Digital Volume Correlation Using a Reference Sample

    KAUST Repository

    Wang, B.

    2017-11-27

    The self-heating effect of a laboratory X-ray computed tomography (CT) scanner causes slight change in its imaging geometry, which induces translation and dilatation (i.e., artificial displacement and strain) in reconstructed volume images recorded at different times. To realize high-accuracy internal full-field deformation measurements using digital volume correlation (DVC), these artificial displacements and strains associated with unstable CT imaging must be eliminated. In this work, an effective and easily implemented reference sample compensation (RSC) method is proposed for in-situ systematic error correction in DVC. The proposed method utilizes a stationary reference sample, which is placed beside the test sample to record the artificial displacement fields caused by the self-heating effect of CT scanners. The detected displacement fields are then fitted by a parametric polynomial model, which is used to remove the unwanted artificial deformations in the test sample. Rescan tests of a stationary sample and real uniaxial compression tests performed on copper foam specimens demonstrate the accuracy, efficacy, and practicality of the presented RSC method.
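    The compensation step can be sketched in one dimension: fit a low-order polynomial to the displacement measured on the stationary reference sample (pure CT-geometry artifact) and subtract the fitted model from the test sample's measured displacement. The drift and deformation profiles below are synthetic, and the real method works on full 3D fields.

```python
import numpy as np

def rsc_correct(z, u_ref, u_test, deg=2):
    """Reference-sample-compensation sketch: the reference sample sees
    only the artificial displacement, so a polynomial fitted to it
    models the CT drift; subtracting that model from the test sample's
    displacement leaves the real deformation."""
    coeffs = np.polyfit(z, u_ref, deg)
    return u_test - np.polyval(coeffs, z)

z = np.linspace(0.0, 1.0, 50)      # position along the scan axis
artifact = 0.02 + 0.05 * z         # synthetic thermal translation + dilatation
true_def = 0.10 * z**2             # synthetic real deformation of the test sample
u_ref = artifact                   # reference sample: artifact only
u_test = true_def + artifact       # test sample: real deformation + artifact
u_corr = rsc_correct(z, u_ref, u_test)
print(np.abs(u_corr - true_def).max() < 1e-8)
```

    Because the artifact here is exactly representable by the polynomial, the correction is essentially exact; in practice the residual reflects how well the parametric model captures the scanner drift.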

  13. In-Situ Systematic Error Correction for Digital Volume Correlation Using a Reference Sample

    KAUST Repository

    Wang, B.; Pan, B.; Lubineau, Gilles

    2017-01-01

    The self-heating effect of a laboratory X-ray computed tomography (CT) scanner causes slight change in its imaging geometry, which induces translation and dilatation (i.e., artificial displacement and strain) in reconstructed volume images recorded at different times. To realize high-accuracy internal full-field deformation measurements using digital volume correlation (DVC), these artificial displacements and strains associated with unstable CT imaging must be eliminated. In this work, an effective and easily implemented reference sample compensation (RSC) method is proposed for in-situ systematic error correction in DVC. The proposed method utilizes a stationary reference sample, which is placed beside the test sample to record the artificial displacement fields caused by the self-heating effect of CT scanners. The detected displacement fields are then fitted by a parametric polynomial model, which is used to remove the unwanted artificial deformations in the test sample. Rescan tests of a stationary sample and real uniaxial compression tests performed on copper foam specimens demonstrate the accuracy, efficacy, and practicality of the presented RSC method.

  14. F-term stabilization of odd axions in LARGE volume scenario

    International Nuclear Information System (INIS)

    Gao, Xin; Shukla, Pramod

    2014-01-01

    In the context of the LARGE volume scenario, stabilization of axionic moduli is revisited. This includes both even and odd axions with their scalar potential being generated by F-term contributions via various tree-level and non-perturbative effects like fluxed E3-brane instantons and fluxed poly-instantons. In all the cases, we estimate the decay constants and masses of the axions involved

  15. Analysis of large soil samples for actinides

    Science.gov (United States)

    Maxwell, III; Sherrod, L [Aiken, SC

    2009-03-24

    A method of analyzing relatively large soil samples for actinides by employing a separation process that includes cerium fluoride precipitation for removing the soil matrix and precipitates plutonium, americium, and curium with cerium and hydrofluoric acid followed by separating these actinides using chromatography cartridges.

  16. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  17. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    Science.gov (United States)

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
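    The sample-size figures quoted above come from standard two-group power calculations. A generic normal-approximation version is sketched below; it is not the paper's FreeSurfer-specific tool, and the 15% between-subject SD in the example is hypothetical.

```python
import math
from statistics import NormalDist

def n_per_group(diff, sd, power=0.8, alpha=0.05):
    """Per-group sample size for a two-sample comparison of means,
    normal approximation: n = 2 * (z_{1-alpha/2} + z_power)^2 * (sd/diff)^2."""
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    return math.ceil(2.0 * (z_a + z_b) ** 2 * (sd / diff) ** 2)

# Detecting a 10% group difference when the between-subject SD is,
# hypothetically, 15% of the mean:
print(n_per_group(diff=0.10, sd=0.15))
```

    The per-measure differences reported in the abstract (N=21 for surface area vs. N=81 for volume) then simply reflect the different effective SD-to-difference ratios of those measures after smoothing.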

  18. Absorption and scattering coefficient dependence of laser-Doppler flowmetry models for large tissue volumes

    International Nuclear Information System (INIS)

    Binzoni, T; Leung, T S; Ruefenacht, D; Delpy, D T

    2006-01-01

    Based on quasi-elastic scattering theory (and random walk on a lattice approach), a model of laser-Doppler flowmetry (LDF) has been derived which can be applied to measurements in large tissue volumes (e.g. when the interoptode distance is >30 mm). The model holds for a semi-infinite medium and takes into account the transport-corrected scattering coefficient and the absorption coefficient of the tissue, and the scattering coefficient of the red blood cells. The model holds for anisotropic scattering and for multiple scattering of the photons by the moving scatterers of finite size. In particular, it has also been possible to take into account the simultaneous presence of both Brownian and pure translational movements. An analytical and simplified version of the model has also been derived and its validity investigated, for the case of measurements in human skeletal muscle tissue. It is shown that at large optode spacing it is possible to use the simplified model, taking into account only a 'mean' light pathlength, to predict the blood flow related parameters. It is also demonstrated that the 'classical' blood volume parameter, derived from LDF instruments, may not represent the actual blood volume variations when the investigated tissue volume is large. The simplified model does not need knowledge of the tissue optical parameters and thus should allow the development of very simple and cost-effective LDF hardware

  19. The use of cosmic muons in detecting heterogeneities in large volumes

    International Nuclear Information System (INIS)

    Grabski, V.; Reche, R.; Alfaro, R.; Belmont-Moreno, E.; Martinez-Davalos, A.; Sandoval, A.; Menchaca-Rocha, A.

    2008-01-01

    The muon intensity attenuation method to detect heterogeneities in large matter volumes is analyzed. Approximate analytical expressions to estimate the collection time and the signal-to-noise ratio are proposed and validated by Monte Carlo simulations. Important parameters, including the point spread function and the coordinate reconstruction uncertainty, are also estimated using Monte Carlo simulations
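    The collection-time estimate rests on Poisson counting statistics: a heterogeneity that removes a fraction of the transmitted muons becomes significant once the deficit exceeds the statistical fluctuation of the counts. A back-of-envelope version is sketched below; the rate and contrast values are illustrative, not the paper's.

```python
import math

def collection_time(rate, contrast, snr_target):
    """Poisson counting sketch for muon transmission imaging: with a
    transmitted rate `rate` (muons/s per pixel) and a fractional
    intensity deficit `contrast` from a heterogeneity, the significance
    after time t is snr = contrast * rate * t / sqrt(rate * t),
    so t = (snr_target / contrast)^2 / rate."""
    return (snr_target / contrast) ** 2 / rate

# Hypothetical numbers: 0.1 muons/s per pixel, a 5% deficit, 3-sigma target.
t = collection_time(rate=0.1, contrast=0.05, snr_target=3.0)
print(round(t / 3600.0, 1), "hours")
```

    The quadratic dependence on 1/contrast is why small or low-density-contrast heterogeneities dominate the required exposure time.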

  20. Automation of registration of sample weights for high-volume neutron activation analysis at the IBR-2 reactor of FLNP, JINR

    International Nuclear Information System (INIS)

    Dmitriev, A.Yu.; Dmitriev, F.A.

    2015-01-01

    The 'Weight' software tool was created at FLNP JINR to automate the reading of analytical balance readouts and the saving of these values in the NAA database. An analytical balance connected to a personal computer is used to measure weight values. The 'Weight' software tool controls the reading of weight values and the exchange of information with the NAA database, ensuring reliable weighing of the large numbers of samples processed during high-volume neutron activation analysis.
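    The workflow, parse a balance readout and persist it against a sample identifier, can be sketched as follows. The readout frame format and the table layout are assumptions for illustration; the abstract does not specify the FLNP balance protocol or database schema, and SQLite stands in for the NAA database.

```python
import sqlite3

def parse_balance_line(line):
    """Parse one line of a serial balance readout. The 'S S   0.1234 g'
    stable-weight frame is a common balance convention, assumed here
    for illustration."""
    parts = line.split()
    if len(parts) >= 2 and parts[-1] == "g":
        return float(parts[-2])
    raise ValueError(f"unstable or malformed reading: {line!r}")

def store_weight(conn, sample_id, grams):
    # Hypothetical minimal schema standing in for the NAA database.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS weights (sample_id TEXT, grams REAL)")
    conn.execute("INSERT INTO weights VALUES (?, ?)", (sample_id, grams))
    conn.commit()

conn = sqlite3.connect(":memory:")
w = parse_balance_line("S S   0.1234 g")
store_weight(conn, "NAA-0001", w)
rows = conn.execute("SELECT sample_id, grams FROM weights").fetchall()
print(rows)
```

    Automating exactly this read-validate-store loop is what removes transcription errors when thousands of samples are weighed.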

  1. Doping the 1 kton Large Volume Detector with Gd

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, Gianmarco [University of L'Aquila, Via Vetoio snc, 67100 Coppito (AQ) (Italy); Fulgione, Walter; Porta, Amanda [Istituto di Fisica dello Spazio Interplanetario, INAF, Corso Fiume 4, Torino (Italy); Machado, Ana Amelia Bergamini [Laboratori Nazionali del Gran Sasso, INFN, s.s. 17bis Km 18-10, Assergi (AQ) (Italy); Mal'gin, Alexei [Institute for Nuclear Research, Russian Academy of Sciences, pr. Shestidesyatiletiya Oktyabrya 7a, Moscow, 117312 (Russian Federation); Molinario, Andrea; Vigorito, Carlo, E-mail: bruno@to.infn.it, E-mail: fulgione@to.infn.it, E-mail: ana.machado@lngs.infn.it, E-mail: malgin@lngs.infn.it, E-mail: amolinar@to.infn.it, E-mail: Amanda.Porta@subatech.in2p3.fr, E-mail: vigorito@to.infn.it [INFN, Via Pietro Giuria 1, Torino (Italy)

    2011-06-01

    The Large Volume Detector (LVD) in the INFN Gran Sasso National Laboratory (LNGS), Italy, is a ν observatory which has been monitoring the Galaxy since June 1992 to study neutrinos from core collapse supernovae. In its present configuration the experiment is made up of 840 scintillator detectors, for a total active mass of 1000 tons. The detector sensitivity to neutrino bursts due to a core collapse supernova has already been discussed in terms of maximum detectable distance. In this paper we evaluate the improvements that LVD could obtain if all its active scintillator mass were doped with a small amount (0.14% by weight) of gadolinium. We simulated neutron captures following ν̄_e inverse beta decay reactions in one LVD counter (1.2 ton) with Gd-doped liquid scintillator, obtaining a detection efficiency for this process of η_n|_Gd = 80% and a mean capture time τ = 25 μs, in good agreement with measured results. This implies a gain of a factor ∼20 in the signal-to-noise ratio for neutron capture detection with respect to the undoped liquid scintillator. We discuss how captures on Gd of neutrons from rock radioactivity modify the background conditions of the detector, and we calculate the curves expressing the sensitivity to a ν̄_e burst from core collapse supernovae as a function of the distance of the collapsing star. The result is that doping the 1 kton Large Volume Detector with Gd would assure a 90% detection efficiency at the distance of the Large Magellanic Cloud (50 kpc), an achievement equivalent to doubling the number of counters in LVD.

  2. A Novel Technique for Endovascular Removal of Large Volume Right Atrial Tumor Thrombus

    Energy Technology Data Exchange (ETDEWEB)

    Nickel, Barbara, E-mail: nickel.ba@gmail.com [US Teleradiology and Quantum Medical Radiology Group (United States); McClure, Timothy, E-mail: tmcclure@gmail.com; Moriarty, John, E-mail: jmoriarty@mednet.ucla.edu [UCLA Medical Center, Department of Interventional Radiology (United States)

    2015-08-15

    Venous thromboembolic disease is a significant cause of morbidity and mortality, particularly in the setting of large volume pulmonary embolism. Thrombolytic therapy has been shown to be a successful treatment modality; however, its use is somewhat limited due to the risk of hemorrhage and the potential for distal embolization in the setting of large mobile thrombi. In patients where either thrombolysis is contraindicated or unsuccessful, and conventional therapies prove inadequate, surgical thrombectomy may be considered. We present a case of percutaneous endovascular extraction of a large mobile mass extending from the inferior vena cava into the right atrium using the Angiovac device, a venovenous bypass system designed for high-volume aspiration of undesired endovascular material. Standard endovascular methods for removal of cancer-associated thrombus, such as catheter-directed lysis, maceration, and exclusion, may prove inadequate in the setting of underlying tumor thrombus. Where conventional endovascular methods either fail or are unsuitable, endovascular thrombectomy with the Angiovac device may be a useful and safe minimally invasive alternative to open resection.

  3. A Novel Technique for Endovascular Removal of Large Volume Right Atrial Tumor Thrombus

    International Nuclear Information System (INIS)

    Nickel, Barbara; McClure, Timothy; Moriarty, John

    2015-01-01

    Venous thromboembolic disease is a significant cause of morbidity and mortality, particularly in the setting of large volume pulmonary embolism. Thrombolytic therapy has been shown to be a successful treatment modality; however, its use is somewhat limited due to the risk of hemorrhage and the potential for distal embolization in the setting of large mobile thrombi. In patients where either thrombolysis is contraindicated or unsuccessful, and conventional therapies prove inadequate, surgical thrombectomy may be considered. We present a case of percutaneous endovascular extraction of a large mobile mass extending from the inferior vena cava into the right atrium using the Angiovac device, a venovenous bypass system designed for high-volume aspiration of undesired endovascular material. Standard endovascular methods for removal of cancer-associated thrombus, such as catheter-directed lysis, maceration, and exclusion, may prove inadequate in the setting of underlying tumor thrombus. Where conventional endovascular methods either fail or are unsuitable, endovascular thrombectomy with the Angiovac device may be a useful and safe minimally invasive alternative to open resection

  4. Toxicity Profile With a Large Prostate Volume After External Beam Radiotherapy for Localized Prostate Cancer

    International Nuclear Information System (INIS)

    Pinkawa, Michael; Fischedick, Karin; Asadpour, Branka; Gagel, Bernd; Piroth, Marc D.; Nussen, Sandra; Eble, Michael J.

    2008-01-01

    Purpose: To assess the impact of prostate volume on health-related quality of life (HRQOL) before and at different intervals after radiotherapy for prostate cancer. Methods and Materials: A group of 204 patients was surveyed prospectively before (Time A), on the last day of (Time B), 2 months after (Time C), and a median of 16 months after (Time D) radiotherapy, with a validated questionnaire (Expanded Prostate Cancer Index Composite). The group was divided into subgroups with a small (11-43 cm³) and a large (44-151 cm³) prostate volume. Results: Patients with large prostates presented with lower urinary bother scores (median 79 vs. 89; p = 0.01) before treatment. Urinary function/bother scores for patients with large prostates decreased significantly compared to patients with small prostates, due to irritative/obstructive symptoms, only at Time B (pain with urination more than once daily in 48% vs. 18%; p < 0.01; … vs. 47 cm³; p < 0.01). Conclusions: Patients with a large prostate volume have a great risk of irritative/obstructive symptoms (particularly dysuria) in the acute radiotherapy phase. These symptoms recover rapidly and do not influence long-term HRQOL.

  5. Prospects for accelerator neutron sources for large volume minerals analysis

    International Nuclear Information System (INIS)

    Clayton, C.G.; Spackman, R.

    1988-01-01

    The electron Linac can be regarded as a practical source of thermal neutrons for activation analysis of large volume mineral samples. With a suitable target and moderator, a neutron flux of about 10¹⁰ n/cm²/s over 2-3 kg of rock can be generated. The proton Linac offers the possibility of a high yield (> 10¹² n/s) of fast neutrons at selected energies. For the electron Linac, targets of W-U and W-Be are discussed. The advantages and limitations of the system are demonstrated for the analysis of gold in rocks and ores and for platinum in chromitite. These elements were selected as they are the most likely to justify an accelerator installation at the present time. Errors due to self-shielding of thermal neutrons in gold particles are discussed. The proton Linac is considered for neutrons generated from a lithium target through the ⁷Li(p,n)⁷Be reaction. The analysis of gold by fast neutron activation is considered. This approach avoids particle self-absorption and, by appropriate proton energy selection, avoids potentially dominating interfering reactions. The analysis of ²³⁵U in the presence of ²³⁸U and ²³²Th is also considered. (author)
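    The feasibility of such an installation rests on the standard activation equation, A = N σ φ (1 − e^{−λ t_irr}): the flux quoted above fixes the induced activity, and hence the count rate, per gram of analyte. A worked sketch for gold follows; the sample mass and irradiation time are illustrative, while σ ≈ 98.65 b and the 2.695 d half-life of ¹⁹⁸Au are standard nuclear data.

```python
import math

N_A = 6.02214076e23  # Avogadro's number

def induced_activity(mass_g, molar_mass, abundance, sigma_barn,
                     flux, t_irr, half_life):
    """Standard activation equation A = N * sigma * phi * (1 - exp(-lambda*t)),
    giving the induced activity in Bq at the end of irradiation."""
    N = mass_g / molar_mass * N_A * abundance   # target atoms
    lam = math.log(2) / half_life               # decay constant, 1/s
    sigma_cm2 = sigma_barn * 1e-24              # barn -> cm^2
    return N * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr))

# Example: 1 mg of gold (197Au, 100% abundant, sigma ~ 98.65 b) in a
# 1e10 n/cm^2/s thermal flux, irradiated for one 198Au half-life (2.695 d).
t_half = 2.695 * 86400.0
A = induced_activity(1e-3, 196.97, 1.0, 98.65, 1e10, t_half, t_half)
print(f"{A:.2e} Bq")
```

    Activities at the MBq-per-milligram level are what make gold a favourable target for this flux; particle self-shielding, discussed in the abstract, reduces the effective σφ seen by atoms inside a gold grain.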

  6. Gibbs sampling on large lattice with GMRF

    Science.gov (United States)

    Marcotte, Denis; Allard, Denis

    2018-02-01

    Gibbs sampling is routinely used to sample truncated Gaussian distributions. These distributions naturally occur when associating latent Gaussian fields to category fields obtained by discrete simulation methods like multipoint, sequential indicator simulation and object-based simulation. The latent Gaussians are often used in data assimilation and history matching algorithms. When Gibbs sampling is applied on a large lattice, the computing cost can become prohibitive. The usual practice of using local neighborhoods is unsatisfactory, as it can diverge and does not reproduce the desired covariance exactly. A better approach is to use Gaussian Markov Random Fields (GMRF), which make it possible to compute the conditional distributions at any point without computing and inverting the full covariance matrix. As the GMRF is locally defined, it allows simultaneous updating of all points that do not share neighbors (coding sets). We propose a new simultaneous Gibbs updating strategy on coding sets that can be efficiently computed by convolution and applied with an acceptance/rejection method in the truncated case. We study empirically the speed of convergence and the effects of the choice of boundary conditions, of the correlation range, and of GMRF smoothness. We show that convergence is slower in the Gaussian case on the torus than for the finite case studied in the literature. However, in the truncated Gaussian case, we show that short-scale correlation is quickly restored and the conditioning categories at each lattice point imprint the long-scale correlation. Hence our approach makes it practical to apply Gibbs sampling on large 2D or 3D lattices with the desired GMRF covariance.
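    The coding-set idea can be sketched for the simplest nearest-neighbour GMRF on a torus: sites on the black and white checkerboard sets share no neighbours, so each whole set can be updated at once (here with vectorised rolls standing in for the paper's convolution). This is the plain Gaussian case, without the truncation and acceptance/rejection step; the lattice size and β are illustrative.

```python
import numpy as np

def checkerboard_gibbs(field, beta, n_sweeps, rng):
    """Coding-set Gibbs sketch for a nearest-neighbour GMRF on a torus:
    each site's conditional law is N(beta * sum of its 4 neighbours, 1).
    The two checkerboard colors are updated alternately, each as one
    simultaneous vectorised step. beta < 0.25 keeps the implied
    precision matrix diagonally dominant (hence valid)."""
    mask = (np.indices(field.shape).sum(axis=0) % 2).astype(bool)
    for _ in range(n_sweeps):
        for color in (mask, ~mask):
            nb = (np.roll(field, 1, 0) + np.roll(field, -1, 0)
                  + np.roll(field, 1, 1) + np.roll(field, -1, 1))
            noise = rng.normal(size=field.shape)
            field[color] = beta * nb[color] + noise[color]
    return field

rng = np.random.default_rng(0)
f = checkerboard_gibbs(np.zeros((64, 64)), beta=0.2, n_sweeps=200, rng=rng)
print(f.shape, round(float(f.std()), 2))
```

    The truncated version of the paper would replace the Gaussian draw on each coding set with a draw constrained to the category-dependent interval, accepted or rejected as described in the abstract.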

  7. A spinner magnetometer for large Apollo lunar samples

    Science.gov (United States)

    Uehara, M.; Gattacceca, J.; Quesnel, Y.; Lepaulard, C.; Lima, E. A.; Manfredi, M.; Rochette, P.

    2017-10-01

    We developed a spinner magnetometer to measure the natural remanent magnetization of large Apollo lunar rocks in the storage vault of the Lunar Sample Laboratory Facility (LSLF) of NASA. The magnetometer mainly consists of a commercially available three-axial fluxgate sensor and a hand-rotating sample table with an optical encoder recording the rotation angles. The distance between the sample and the sensor is adjustable according to the sample size and magnetization intensity. The sensor and the sample are placed in a two-layer mu-metal shield to measure the sample natural remanent magnetization. The magnetic signals are acquired together with the rotation angle to obtain stacking of the measured signals over multiple revolutions. The developed magnetometer has a sensitivity of 5 × 10⁻⁷ Am² at the standard sensor-to-sample distance of 15 cm. This sensitivity is sufficient to measure the natural remanent magnetization of almost all the lunar basalt and breccia samples with mass above 10 g in the LSLF vault.
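
    The stacking over multiple revolutions can be illustrated with a toy calculation (all numbers hypothetical, not instrument specifications): samples falling in the same encoder-angle bin are averaged across revolutions, suppressing sensor noise by roughly the square root of the number of revolutions, and the sinusoidal dipole amplitude is then recovered by projection.

```python
import numpy as np

rng = np.random.default_rng(1)

n_rev, n_bins = 50, 360    # hand-driven revolutions and encoder-angle bins (assumed)
theta = np.tile(np.linspace(0.0, 2*np.pi, n_bins, endpoint=False), n_rev)

amp = 2.0e-9               # dipole signal amplitude at the sensor (tesla), hypothetical
signal = amp * np.sin(theta) + rng.normal(0.0, 20e-9, theta.size)  # buried in noise

# Stack: average every sample that falls in the same encoder-angle bin,
# reducing the noise level by roughly sqrt(n_rev).
stacked = signal.reshape(n_rev, n_bins).mean(axis=0)

# Recover the amplitude by projecting the stacked curve onto sin(theta).
grid = np.linspace(0.0, 2*np.pi, n_bins, endpoint=False)
est = 2.0 * float(np.mean(stacked * np.sin(grid)))
print(f"recovered amplitude: {est:.2e} T")
```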

  9. Management of Large Volumes of Waste Arising in a Nuclear or Radiological Emergency

    International Nuclear Information System (INIS)

    2017-10-01

    This publication, prepared in light of the IAEA Action Plan on Nuclear Safety developed after the accident at the Fukushima Daiichi nuclear power plant, addresses the management of large volumes of radioactive waste arising in a nuclear or radiological emergency, as part of overall emergency preparedness. The management of large volumes of waste will be one of many efforts to be dealt with to allow recovery of affected areas, to support return of evacuated or relocated populations and preparations for normal social and economic activities, and/or to mitigate additional environmental impacts. The publication is intended to be of use to national planners and policy makers, facility and programme managers, and other professionals responsible for developing and implementing national plans and strategies to manage radioactive waste arising from nuclear or radiological emergencies.

  10. SUSY’s Ladder: reframing sequestering at Large Volume

    Energy Technology Data Exchange (ETDEWEB)

    Reece, Matthew [Department of Physics, Harvard University,Cambridge, MA 02138 (United States); Xue, Wei [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States)

    2016-04-07

    Theories with approximate no-scale structure, such as the Large Volume Scenario, have a distinctive hierarchy of multiple mass scales in between TeV gaugino masses and the Planck scale, which we call SUSY’s Ladder. This is a particular realization of Split Supersymmetry in which the same small parameter suppresses gaugino masses relative to scalar soft masses, scalar soft masses relative to the gravitino mass, and the UV cutoff or string scale relative to the Planck scale. This scenario has many phenomenologically interesting properties, and can avoid dangers including the gravitino problem, flavor problems, and the moduli-induced LSP problem that plague other supersymmetric theories. We study SUSY’s Ladder using a superspace formalism that makes the mysterious cancellations in previous computations manifest. This opens the possibility of a consistent effective field theory understanding of the phenomenology of these scenarios, based on power-counting in the small ratio of string to Planck scales. We also show that four-dimensional theories with approximate no-scale structure enforced by a single volume modulus arise only from two special higher-dimensional theories: five-dimensional supergravity and ten-dimensional type IIB supergravity. This gives a phenomenological argument in favor of ten-dimensional ultraviolet physics which is different from standard arguments based on the consistency of superstring theory.

  11. A test of alternative estimators for volume at time 1 from remeasured point samples

    Science.gov (United States)

    Francis A. Roesch; Edwin J. Green; Charles T. Scott

    1993-01-01

    Two estimators for volume at time 1 for use with permanent horizontal point samples are evaluated. One estimator, used traditionally, uses only the trees sampled at time 1, while the second estimator, originally presented by Roesch and coauthors (F.A. Roesch, Jr., E.J. Green, and C.T. Scott. 1989. For. Sci. 35(2):281-293). takes advantage of additional sample...

  12. Large electrically induced height and volume changes in poly(3,4- ethylenedioxythiophene) /poly(styrenesulfonate) thin films

    NARCIS (Netherlands)

    Charrier, D.S.H.; Janssen, R.A.J.; Kemerink, M.

    2010-01-01

    We demonstrate large, partly reversible height and volume changes of thin films of poly(3,4-ethylenedioxythiophene)/poly(styrenesulfonate) (PEDOT:PSS) on the anode of interdigitating gold electrodes under ambient conditions by applying an electrical bias. The height and volume changes were monitored

  13. Preliminary level 2 specification for the nested, fixed-depth sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    This preliminary Level 2 Component Specification establishes the performance, design, development, and test requirements for the in-tank sampling system which will support the BNFL contract in the final disposal of Hanford's High Level Wastes (HLW) and Low Activity Wastes (LAW). The PHMC will provide Low Activity Wastes (LAW) tank wastes for final treatment by BNFL from double-shell feed tanks. Concerns about the inability of the baseline "grab" sampling to provide large volume samples within time constraints have led to the development of a nested, fixed-depth sampling system. This sampling system will provide large volume, representative samples without the environmental, radiation exposure, and sample volume impacts of the current baseline "grab" sampling method. This preliminary Level 2 Component Specification is not a general specification for tank sampling, but is based on a "record of decision", AGA (HNF-SD-TWR-AGA-001), the System Specification for the Double Shell Tank System (HNF-SD-WM-TRD-O07), and the BNFL privatization contract

  14. Fast and effective determination of strontium-90 in high volumes water samples

    International Nuclear Information System (INIS)

    Basarabova, B.; Dulanska, S.

    2014-01-01

    A simple and fast method was developed for the determination of ⁹⁰Sr in large volumes of water samples from the vicinity of nuclear power facilities. Samples were taken from the environment near the nuclear power plants at Jaslovske Bohunice and Mochovce in Slovakia. ⁹⁰Sr was determined by solid phase extraction using the commercial sorbent Analig® Sr-01 from IBC Advanced Technologies, Inc. The determination was performed in dilute HNO₃ (1.5-2 M) and was also tested in basic medium with NaOH. ⁹⁰Sr was eluted with EDTA at pH 8-9. To achieve fast determination, automation was applied, which significantly reduces the separation time. Concentration of the water samples by evaporation was not necessary; separation was performed immediately after filtration of the analyzed samples. The aim of this study was the development of a less expensive, time-saving and energy-saving method for the determination of ⁹⁰Sr in comparison with conventional methods. The separation time in fast-flow mode for 10 dm³ water samples was 3.5 hours (flow rate approximately 3.2 dm³ per hour). The radiochemical strontium yield was traced using the radionuclide ⁸⁵Sr. Samples were measured with an HPGe (high-purity germanium) detector at the energy Eγ = 514 keV. With Analig® Sr-01, yields in the range 72-96 % were achieved. Separation based on solid phase extraction using Analig® Sr-01, combined with automation, offers a new, fast and effective method for the determination of ⁹⁰Sr in water matrices. After ingrowth of yttrium, the samples were measured with a Packard Tricarb 2900 TR liquid scintillation spectrometer with Quanta Smart software. (authors)

  15. Naturally light hidden photons in LARGE volume string compactifications

    International Nuclear Information System (INIS)

    Goodsell, M.; Jaeckel, J.; Redondo, J.; Ringwald, A.

    2009-09-01

    Extra "hidden" U(1) gauge factors are a generic feature of string theory and are of particular phenomenological interest. They can kinetically mix with the Standard Model photon and are thereby accessible to a wide variety of astrophysical and cosmological observations and laboratory experiments. In this paper we investigate the masses and the kinetic mixing of hidden U(1)s in LARGE volume compactifications of string theory. We find that in these scenarios the hidden photons can be naturally light and that their kinetic mixing with the ordinary electromagnetic photon can be of a size interesting for near future experiments and observations. (orig.)

  16. ESB application for effective synchronization of large volume measurements data

    CERN Document Server

    Wyszkowski, Przemysław Michał

    2011-01-01

    The TOTEM experiment at CERN aims at the measurement of the total cross section, elastic scattering and diffractive processes of colliding protons in the Large Hadron Collider. In order for the research to be possible, it is necessary to process huge amounts of data coming from a variety of sources: TOTEM detectors, CMS detectors, measurement devices around the Large Hadron Collider tunnel and many other external systems. Preparing the final results also involves calculating plenty of intermediate figures, which likewise need to be stored. For the work of the scientists to be effective and convenient, it is crucial to provide a central point for data storage, where all raw and intermediate figures are kept. This thesis aims at presenting the usage of the Enterprise Service Bus concept in building a software infrastructure for transferring large volumes of measurement data. Topics discussed here include technologies and mechanisms realizing the concept of an integration bus, and a model of a data transferring system based on ...

  17. A review of methods for sampling large airborne particles and associated radioactivity

    International Nuclear Information System (INIS)

    Garland, J.A.; Nicholson, K.W.

    1990-01-01

    Radioactive particles, tens of μm or more in diameter, are unlikely to be emitted directly from nuclear facilities with exhaust gas cleansing systems, but may arise in the case of an accident or where resuspension from contaminated surfaces is significant. Such particles may dominate deposition and, according to some workers, may contribute to inhalation doses. Quantitative sampling of large airborne particles is difficult because of their inertia and large sedimentation velocities. The literature describes conditions for unbiased sampling and the magnitude of sampling errors for idealised sampling inlets in steady winds. However, few air samplers for outdoor use have been assessed for adequacy of sampling. Many size selective sampling methods are found in the literature but few are suitable at the low concentrations that are often encountered in the environment. A number of approaches for unbiased sampling of large particles have been found in the literature. Some are identified as meriting further study, for application in the measurement of airborne radioactivity. (author)

  18. Sampling data summary for the ninth run of the Large Slurry Fed Melter

    International Nuclear Information System (INIS)

    Sabatino, D.M.

    1983-01-01

    The ninth experimental run of the Large Slurry Fed Melter (LSFM) was completed June 27, 1983, after 63 days of continuous operation. During the run, the various melter and off-gas streams were sampled and analyzed to determine melter material balances and to characterize off-gas emissions. Sampling methods and preliminary results were reported earlier. The emphasis was on the chemical analyses of the off-gas entrainment, deposits, and scrubber liquid. The significant sampling results from the run are summarized below: Flushing the Frit 165 with Frit 131 without bubbler agitation required 3 to 4.5 melter volumes. The off-gas cesium concentration during feeding was on the order of 36 to 56 μgCs/scf. The cesium concentration in the melter plenum (based on air in-leakage only) was on the order of 110 to 210 μgCs/scf. Using <1 micron as the cut point for semivolatile material, 60% of the chloride, 35% of the sodium and less than 5% of the manganese and iron in the entrainment are present as semivolatiles. A material balance on the scrubber tank solids shows good agreement with entrainment data. An overall cesium balance using LSFM-9 data and the DWPF production rate indicates an emission of 0.11 mCi/yr of cesium from the DWPF off-gas. This is a factor of 27 less than the maximum allowable 3 mCi/yr
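
    The closing margin quoted in the abstract can be checked directly from the two figures it gives:

```python
# Check of the closing cesium balance: estimated DWPF off-gas emission
# (from LSFM-9 data) against the maximum allowable release.
emission_mci_per_yr = 0.11   # estimated cesium emission, mCi/yr
limit_mci_per_yr = 3.0       # maximum allowable, mCi/yr
factor = limit_mci_per_yr / emission_mci_per_yr
print(round(factor))  # prints 27
```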

  19. Dilute scintillators for large-volume tracking detectors

    Energy Technology Data Exchange (ETDEWEB)

    Reeder, R.A. (University of New Mexico, Albuquerque, NM (United States)); Dieterle, B.D. (University of New Mexico, Albuquerque, NM (United States)); Gregory, C. (University of New Mexico, Albuquerque, NM (United States)); Schaefer, F. (University of New Mexico, Albuquerque, NM (United States)); Schum, K. (University of New Mexico, Albuquerque, NM (United States)); Strossman, W. (University of California, Riverside, CA (United States)); Smith, D. (Embry-Riddle Aeronautical Univ., Prescott, AZ (United States)); Christofek, L. (Los Alamos National Lab., NM (United States)); Johnston, K. (Los Alamos National Lab., NM (United States)); Louis, W.C. (Los Alamos National Lab., NM (United States)); Schillaci, M. (Los Alamos National Lab., NM (United States)); Volta, M. (Los Alamos National Lab., NM (United States)); White, D.H. (Los Alamos National Lab., NM (United States)); Whitehouse, D. (Los Alamos National Lab., NM (United States)); Albert, M. (University of Pennsylvania, Phi

    1993-10-01

    Dilute scintillation mixtures emit isotropic light for both fast and slow particles, but retain the Cherenkov light cone from fast particles. Large volume detectors using photomultipliers to reconstruct relativistic tracks will also be sensitive to slow particles if they are filled with these mixtures. Our data show that 0.03 g/l of b-PBD in mineral oil has a 2.4:1 ratio (in the first 12 ns) of isotropic light to Cherenkov light for positron tracks. The light attenuation length is greater than 15 m for wavelength above 400 nm, and the scintillation decay time is about 2 ns for the fast component. There is also a slow isotropic light component that is larger (relative to the fast component) for protons than for electrons. This effect allows particle identification by a technique similar to pulse shape discrimination. These features will be utilized in LSND, a neutrino detector at LAMPF. (orig.)

  20. Large volumes and spectroscopy of walking theories

    CERN Document Server

    Del Debbio, L; Patella, A; Pica, C; Rago, A

    2016-01-01

    A detailed investigation of finite size effects is performed for SU(2) gauge theory with two fermions in the adjoint representation, which previous lattice studies have shown to be inside the conformal window. The system is investigated with different spatial and temporal boundary conditions on lattices of various spatial and temporal extensions, for two values of the bare fermion mass representing a heavy and a light fermion regime. Our study shows that the infinite volume limit of masses and decay constants in the mesonic sector is reached only when the mass of the pseudoscalar particle M_PS and the spatial lattice size L satisfy the relation L·M_PS ≥ 15. This bound, which is at least a factor of three higher than what is observed in QCD, is a likely consequence of the different spectral signatures of the two theories, with the scalar isosinglet (0⁺⁺ glueball) being the lightest particle in our model. In addition to stressing the importance of simulating large lattice s...

  1. A digital gain stabilizer for large volume organic scintillation detectors

    International Nuclear Information System (INIS)

    Braunsfurth, J.; Geske, K.

    1976-01-01

    A digital gain stabilizer is described, optimized for use with photomultipliers mounted on large volume organic scintillators, or other radiation detectors, which exhibit no prominent peaks in their amplitude spectra. As applications of this kind usually involve many phototubes or detectors, circuit simplicity, production reproducibility, and the possibility of computer-controlled operation were major design criteria. Two versions were built, the first one using standard TTL-SSI and MSI circuitry, the second one - to reduce power requirements - using a mixture of TTL- and CMOS-LSI circuits. (Auth.)

  2. National comparison on volume sample activity measurement methods

    International Nuclear Information System (INIS)

    Sahagia, M.; Grigorescu, E.L.; Popescu, C.; Razdolescu, C.

    1992-01-01

    A national comparison on volume sample activity measurement methods may be regarded as a step toward accomplishing the traceability of the environmental and food chain activity measurements to national standards. For this purpose, the Radionuclide Metrology Laboratory has distributed ¹³⁷Cs and ¹³⁴Cs water-equivalent solid standard sources to 24 laboratories having responsibilities in this matter. Every laboratory has to measure the activity of the received source(s) by using its own standards, equipment and methods and report the obtained results to the organizer. The 'measured activities' will be compared with the 'true activities'. A final report will be issued, which plans to evaluate the national level of precision of such measurements and give some suggestions for improvement. (Author)

  3. Biological intrusion barriers for large-volume waste-disposal sites

    International Nuclear Information System (INIS)

    Hakonson, T.E.; Cline, J.F.; Rickard, W.H.

    1982-01-01

    Intrusion of plants and animals into shallow land burial sites, with subsequent mobilization of toxic and radiotoxic materials, has occurred. Based on recent pathway modeling studies, such intrusions can contribute to the dose received by man. This paper describes past work on developing biological intrusion barrier systems for application to large volume waste site stabilization. State-of-the-art concepts employing rock and chemical barriers are discussed relative to long term serviceability and cost of application. The interaction of bio-intrusion barrier systems with other processes affecting trench cover stability is discussed to ensure that trench cover designs minimize the potential dose to man. 3 figures, 6 tables

  4. 105-DR Large Sodium Fire Facility decontamination, sampling, and analysis plan

    International Nuclear Information System (INIS)

    Knaus, Z.C.

    1995-01-01

    This is the decontamination, sampling, and analysis plan for the closure activities at the 105-DR Large Sodium Fire Facility at Hanford Reservation. This document supports the 105-DR Large Sodium Fire Facility Closure Plan, DOE-RL-90-25. The 105-DR LSFF, which operated from about 1972 to 1986, was a research laboratory that occupied the former ventilation supply room on the southwest side of the 105-DR Reactor facility in the 100-D Area of the Hanford Site. The LSFF was established to investigate fire fighting and safety associated with alkali metal fires in the liquid metal fast breeder reactor facilities. The decontamination, sampling, and analysis plan identifies the decontamination procedures, sampling locations, any special handling requirements, quality control samples, required chemical analysis, and data validation needed to meet the requirements of the 105-DR Large Sodium Fire Facility Closure Plan in compliance with the Resource Conservation and Recovery Act

  5. Multivariate statistics high-dimensional and large-sample approximations

    CERN Document Server

    Fujikoshi, Yasunori; Shimizu, Ryoichi

    2010-01-01

    A comprehensive examination of high-dimensional analysis of multivariate methods and their real-world applications Multivariate Statistics: High-Dimensional and Large-Sample Approximations is the first book of its kind to explore how classical multivariate methods can be revised and used in place of conventional statistical tools. Written by prominent researchers in the field, the book focuses on high-dimensional and large-scale approximations and details the many basic multivariate methods used to achieve high levels of accuracy. The authors begin with a fundamental presentation of the basic

  6. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed by using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparison with classical instrumental neutron activation analysis (INAA) methods and an international inter-comparison exercise have been performed to validate the new methodology. (authors)

  7. Large volume syringe pump extruder for desktop 3D printers

    Directory of Open Access Journals (Sweden)

    Kira Pusch

    2018-04-01

    Syringe pump extruders are required for a wide range of 3D printing applications, including bioprinting, embedded printing, and food printing. However, the mass of the syringe becomes a major challenge for most printing platforms, requiring compromises in speed, resolution and/or volume. To address these issues, we have designed a syringe pump large volume extruder (LVE) that is compatible with low-cost, open source 3D printers, and herein demonstrate its performance on a PrintrBot Simple Metal. Key aspects of the LVE include: (1) it is open source and compatible with open source hardware and software, making it inexpensive and widely accessible to the 3D printing community; (2) it utilizes a standard 60 mL syringe as its ink reservoir, effectively increasing the print volume of the average bioprinter; (3) it is capable of retraction and high speed movements; and (4) it can print fluids using nozzle diameters as small as 100 μm, enabling the printing of complex shapes/objects when used in conjunction with the freeform reversible embedding of suspended hydrogels (FRESH) 3D printing method. Printing performance of the LVE is demonstrated by utilizing alginate as a model biomaterial ink to fabricate parametric CAD models and standard calibration objects. Keywords: Additive manufacturing, 3D bioprinting, Embedded printing, FRESH, Soft materials extrusion
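
    The scale mismatch between a 60 mL syringe and a 100 μm nozzle can be illustrated by a volumetric-flow-matching estimate; the barrel bore and print speed below are assumed values, not LVE specifications.

```python
import math

# Assumed geometry: ~26.7 mm bore for a 60 mL syringe barrel, a 100 um nozzle,
# a 10 mm/s print speed, and extrudate diameter equal to the nozzle bore.
d_barrel = 26.7e-3   # m, assumed
d_nozzle = 100e-6    # m
v_print = 10e-3      # m/s, assumed

flow = math.pi * (d_nozzle / 2) ** 2 * v_print      # required volumetric flow, m^3/s
v_plunger = flow / (math.pi * (d_barrel / 2) ** 2)  # plunger speed that delivers it
print(f"plunger speed: {v_plunger * 1e6:.3f} um/s")
```

    Under these assumptions the plunger must creep at a fraction of a micrometre per second, which is one reason fine motor control and retraction matter for syringe extruders.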

  8. Studies on a pulse shaping system for fast coincidence with very large volume HPGe detectors

    International Nuclear Information System (INIS)

    Bose, S.; Chatterjee, M.B.; Sinha, B.K.; Bhattacharya, R.

    1987-01-01

    A variant of the leading edge timing (LET) has been proposed which compensates for the "walk" due to risetime spread in very large volume (~100 cm³) HPGe detectors. The method - shape compensated leading edge timing (SCLET) - can be used over a wide dynamic range of energies with 100% efficiency and has been compared with the LET and ARC methods. A time resolution of 10 ns fwhm and 21 ns fwtm has been obtained with ²²Na gamma rays and two HPGe detectors of 96 and 114 cm³ volume. This circuit is easy to duplicate and use, and can be a low cost alternative to commercial circuits in experiments requiring a large number of detectors. (orig.)

  9. Sample preparation method for ICP-MS measurement of 99Tc in a large amount of environmental samples

    International Nuclear Information System (INIS)

    Kondo, M.; Seki, R.

    2002-01-01

    Sample preparation for the measurement of ⁹⁹Tc in large amounts of soil and water samples by ICP-MS has been developed using ⁹⁵ᵐTc as a yield tracer. The method is based on the conventional method for small amounts of soil samples using incineration, acid digestion, extraction chromatography (TEVA resin) and ICP-MS measurement. Preconcentration of Tc by co-precipitation with ferric oxide has been introduced. Matrix materials in large samples were removed more thoroughly than in the previous method, while keeping a high recovery of Tc. The recovery of Tc was 70-80% for 100 g soil samples and 60-70% for 500 g of soil and 500 L of water samples. The detection limit of this method was evaluated as 0.054 mBq/kg in 500 g soil and 0.032 μBq/L in 500 L water. The determined value of ⁹⁹Tc in the IAEA-375 (soil sample collected near the Chernobyl Nuclear Reactor) was 0.25 ± 0.02 Bq/kg. (author)

  10. Electro-mechanical probe positioning system for large volume plasma device

    Science.gov (United States)

    Sanyasi, A. K.; Sugandhi, R.; Srivastava, P. K.; Srivastav, Prabhakar; Awasthi, L. M.

    2018-05-01

    An automated electro-mechanical system for the positioning of plasma diagnostics has been designed and implemented in a Large Volume Plasma Device (LVPD). The system consists of 12 electro-mechanical assemblies, which are orchestrated using the Modbus communication protocol on 4-wire RS485 communications to meet the experimental requirements. Each assembly has a lead screw-based mechanical structure, Wilson feed-through-based vacuum interface, bipolar stepper motor, micro-controller-based stepper drive, and optical encoder for online positioning correction of probes. The novelty of the system lies in the orchestration of multiple drives on a single interface, fabrication and installation of the system for a large experimental device like the LVPD, in-house developed software, and adopted architectural practices. The paper discusses the design, description of hardware and software interfaces, and performance results in LVPD.
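
    The positioning scheme described above (lead screw drive plus optical encoder for online correction) can be sketched as follows; the lead-screw pitch, microstepping and encoder resolution are assumed values, not LVPD specifications.

```python
# Hypothetical drive parameters, not LVPD specifications.
STEPS_PER_REV = 200 * 16       # 1.8-degree stepper with 16x microstepping
LEAD_MM = 2.0                  # lead-screw travel per motor revolution
ENC_COUNTS_PER_REV = 2000      # optical encoder counts per revolution

def steps_for(distance_mm: float) -> int:
    """Open-loop step count for a requested probe travel."""
    return round(distance_mm * STEPS_PER_REV / LEAD_MM)

def encoder_mm(counts: int) -> float:
    """Travel actually measured by the optical encoder."""
    return counts * LEAD_MM / ENC_COUNTS_PER_REV

def correction_steps(target_mm: float, counts: int) -> int:
    """Trim move obtained by comparing the encoder reading with the target."""
    return steps_for(target_mm - encoder_mm(counts))

move = steps_for(25.0)          # command a 25 mm probe insertion
measured_counts = 24800         # encoder reports 24.8 mm (slightly short)
trim = correction_steps(25.0, measured_counts)
print(move, trim)  # prints 40000 320
```

    In the actual system each of the 12 assemblies would receive such commands over Modbus on the shared RS485 bus; the sketch only covers the position arithmetic of a single drive.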

  11. APPHi: Automated Photometry Pipeline for High Cadence Large Volume Data

    Science.gov (United States)

    Sánchez, E.; Castro, J.; Silva, J.; Hernández, J.; Reyes, M.; Hernández, B.; Alvarez, F.; García T.

    2018-04-01

    APPHi (Automated Photometry Pipeline) carries out aperture and differential photometry of TAOS-II project data. It is computationally efficient and can also be used with other astronomical wide-field image data. APPHi works with large volumes of data and handles both FITS and HDF5 formats. Due to the large number of stars that the software has to handle in an enormous number of frames, it is optimized to automatically find the best values for the parameters of the photometry, such as the mask size for the aperture, the size of the window for extraction of a single star, and the count threshold for detecting a faint star. Although intended to work with TAOS-II data, APPHi can analyze any set of astronomical images and is a robust and versatile tool for performing stellar aperture and differential photometry.
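
    The aperture photometry step that such a pipeline automates can be illustrated by a minimal sketch (not APPHi code; the radii and the synthetic frame are hypothetical): sum the counts inside a circular mask around the star and subtract a sky level estimated from a surrounding annulus.

```python
import numpy as np

def aperture_flux(img, x0, y0, r_ap=4.0, r_in=7.0, r_out=10.0):
    """Counts inside a circular aperture minus the median sky in an annulus."""
    yy, xx = np.indices(img.shape)
    r = np.hypot(xx - x0, yy - y0)
    aperture = r <= r_ap
    sky = img[(r >= r_in) & (r <= r_out)]
    return float(img[aperture].sum() - np.median(sky) * aperture.sum())

# Synthetic frame: flat sky of 100 counts plus a 5000-count point source.
img = np.full((32, 32), 100.0)
img[16, 16] += 5000.0
print(aperture_flux(img, 16, 16))  # prints 5000.0
```

    Differential photometry then divides such fluxes by those of nearby comparison stars measured the same way, cancelling frame-to-frame atmospheric variations.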

  12. Effect, Feasibility, and Clinical Relevance of Cell Enrichment in Large Volume Fat Grafting

    DEFF Research Database (Denmark)

    Rasmussen, Bo Sonnich; Lykke Sørensen, Celine; Vester-Glowinski, Peter Viktor

    2017-01-01

    Large volume fat grafting is limited by unpredictable volume loss; therefore, methods of improving graft retention have been developed. Fat graft enrichment with either stromal vascular fraction (SVF) cells or adipose tissue-derived stem/stromal cells (ASCs) has been investigated in several animal and human studies, and significantly improved graft retention has been reported. Improvement of graft retention and the feasibility of these techniques are equally important in evaluating the clinical relevance of cell enrichment. We conducted a systematic search of PubMed to identify studies on fat graft enrichment that used either SVF cells or ASCs, and only studies reporting volume assessment were included. A total of 38 articles (15 human and 23 animal) were included to investigate the effects of cell enrichment on graft retention as well as the feasibility and clinical relevance of cell-enriched fat...

  13. Large-volume paracentesis with indwelling peritoneal catheter and albumin infusion: a community hospital study

    Directory of Open Access Journals (Sweden)

    Daniel K. Martin

    2016-10-01

    Background: The management of ascites can be problematic. This is especially true in patients with diuretic refractory ascites who develop a tense abdomen. This often results in hypotension and decreased venous return with resulting renal failure. In this paper, we further examine the risks and benefits of utilizing an indwelling peritoneal catheter to remove large-volume ascites over a 72-h period while maintaining intravascular volume and preventing renal failure. Methods: We retrospectively reviewed charts and identified 36 consecutive patients undergoing continuous large-volume paracentesis with an indwelling peritoneal catheter. At the time of drain placement, no patients had signs or laboratory parameters suggestive of spontaneous bacterial peritonitis. The patients underwent ascitic fluid removal through an indwelling peritoneal catheter and were supported with scheduled albumin throughout the duration. The catheter was used to remove up to 3 L every 8 h for a maximum of 72 h. Regular laboratory and ascitic fluid testing was performed. All patients had a clinical follow-up within 3 months after the drain placement. Results: An average of 16.5 L was removed over the 72-h time frame of indwelling peritoneal catheter maintenance. The albumin infusion utilized correlated to 12 mg/L removed. The average creatinine trend improved in a statistically significant manner from 1.37 on the day of admission to 1.21 on the day of drain removal. No patients developed renal failure during the hospital course. There were no documented episodes of neutrocytic ascites or bacterial peritonitis throughout the study review. Conclusion: Large-volume peritoneal drainage with an indwelling peritoneal catheter is safe and effective for patients with tense ascites. Concomitant albumin infusion allows for maintenance of renal function, and no increase in infectious complications was noted.

  14. Determination of air-loop volume and radon partition coefficient for measuring radon in water sample.

    Science.gov (United States)

    Lee, Kil Yong; Burnett, William C

A simple method for the direct determination of the air-loop volume in a RAD7 system as well as the radon partition coefficient was developed, allowing for an accurate measurement of the radon activity in any type of water. The air-loop volume may be measured directly using an external radon source and an empty bottle with a precisely measured volume. The partition coefficient and activity of radon in the water sample may then be determined via the RAD7 using the determined air-loop volume. Activity ratios instead of absolute activities were used to measure the air-loop volume and the radon partition coefficient. In order to verify this approach, we measured the radon partition coefficient in deionized water in the temperature range of 10-30 °C and compared the values to those calculated from the well-known Weigel equation. The results were within 5 % variance throughout the temperature range. We also applied the approach for measurement of the radon partition coefficient in synthetic saline water (0-75 ppt salinity) as well as tap water. The radon activity of the tap water sample was determined by this method as well as the standard RAD-H₂O and BigBottle RAD-H₂O. The results have shown good agreement between this method and the standard methods.
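The validation step above compares measured partition coefficients against the Weigel equation. A minimal sketch of that reference curve and the closed-loop mass balance it feeds into; the k(T) parameterization below is the commonly cited Weigel form, and treating it as the exact one used in the study is an assumption:

```python
import math

def weigel_partition_coefficient(temp_c: float) -> float:
    """Radon water/air partition coefficient k(T), Weigel (1978) form."""
    return 0.105 + 0.405 * math.exp(-0.0502 * temp_c)

def radon_activity_in_water(c_air_bq_m3: float, v_air_l: float,
                            v_water_l: float, temp_c: float) -> float:
    """Total radon activity (Bq) in a closed loop from the measured air
    concentration: A = C_air * (V_air + k * V_water), volumes in litres."""
    k = weigel_partition_coefficient(temp_c)
    return c_air_bq_m3 * 1e-3 * (v_air_l + k * v_water_l)  # 1 L = 1e-3 m^3

# k decreases with temperature: radon is less soluble in warm water.
for t in (10, 20, 30):
    print(f"{t:2d} degC: k = {weigel_partition_coefficient(t):.3f}")
```

This is why the air-loop volume must be known precisely: the dissolved term k·V_water is comparable in size to V_air for typical sample bottles.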

  15. Determination of air-loop volume and radon partition coefficient for measuring radon in water sample

    International Nuclear Information System (INIS)

    Kil Yong Lee; Burnett, W.C.

    2013-01-01

A simple method for the direct determination of the air-loop volume in a RAD7 system as well as the radon partition coefficient was developed, allowing for an accurate measurement of the radon activity in any type of water. The air-loop volume may be measured directly using an external radon source and an empty bottle with a precisely measured volume. The partition coefficient and activity of radon in the water sample may then be determined via the RAD7 using the determined air-loop volume. Activity ratios instead of absolute activities were used to measure the air-loop volume and the radon partition coefficient. In order to verify this approach, we measured the radon partition coefficient in deionized water in the temperature range of 10-30 deg C and compared the values to those calculated from the well-known Weigel equation. The results were within 5 % variance throughout the temperature range. We also applied the approach for measurement of the radon partition coefficient in synthetic saline water (0-75 ppt salinity) as well as tap water. The radon activity of the tap water sample was determined by this method as well as the standard RAD-H₂O and BigBottle RAD-H₂O. The results have shown good agreement between this method and the standard methods. (author)

  16. Examining gray matter structures associated with individual differences in global life satisfaction in a large sample of young adults

    Science.gov (United States)

    Kong, Feng; Ding, Ke; Yang, Zetian; Dang, Xiaobin; Hu, Siyuan; Song, Yiying

    2015-01-01

Although much attention has been directed towards life satisfaction, which refers to an individual’s general cognitive evaluations of his or her life as a whole, little is known about the neural basis underlying global life satisfaction. In this study, we used voxel-based morphometry to investigate the structural neural correlates of life satisfaction in a large sample of young healthy adults (n = 299). We showed that individuals’ life satisfaction was positively correlated with the regional gray matter volume (rGMV) in the right parahippocampal gyrus (PHG), and negatively correlated with the rGMV in the left precuneus and left ventromedial prefrontal cortex. This pattern of results remained significant even after controlling for the effect of general positive and negative affect, suggesting unique structural correlates of life satisfaction. Furthermore, we found that self-esteem partially mediated the association between the PHG volume and life satisfaction as well as that between the precuneus volume and global life satisfaction. Taken together, we provide the first evidence for the structural neural basis of life satisfaction, and highlight that self-esteem might play a crucial role in cultivating an individual’s life satisfaction. PMID:25406366

  17. Large volume liquid silicone injection in the upper thighs : a never ending story

    NARCIS (Netherlands)

    Hofer, SOP; Damen, A; Nicolai, JPA

    This report concerns a 26-year-old male-to-female transsexual who had received a large volume liquid silicone injection of unknown grade into her upper lateral thighs to gain female contour. She presented at our outpatient clinic 4 years after the silicone injection with complaints of pain and

  18. Does Size Really Matter? Analysis of the Effect of Large Fibroids and Uterine Volumes on Complication Rates of Uterine Artery Embolisation

    International Nuclear Information System (INIS)

    Parthipun, A. A.; Taylor, J.; Manyonda, I.; Belli, A. M.

    2010-01-01

The purpose of this study was to determine whether there is a correlation between large uterine fibroid diameter, uterine volume, number of vials of embolic agent used and risk of complications from uterine artery embolisation (UAE). This was a prospective study involving 121 patients undergoing UAE embolisation for symptomatic uterine fibroids at a single institution. Patients were grouped according to diameter of largest fibroid and uterine volume. Results were also stratified according to the number of vials of embolic agent used and rate of complications. No statistical difference in complication rate was demonstrated between the two groups according to diameter of the largest fibroid (large fibroids were classified as ≥10 cm; Fisher's exact test P = 1.00), and no statistical difference in complication rate was demonstrated according to uterine volume (large uterine volume was defined as ≥750 cm³; Fisher's exact test P = 0.70). 84 of the 121 patients had documentation of the number of vials used during the procedure. Patients were divided into two groups, with ≥4 vials defined as a large quantity of embolic agent. There was no statistical difference between these two groups and no associated increased risk of developing complications. This study showed no increased incidence of complications in women with large-diameter fibroids or uterine volumes as defined. In addition, there was no evidence of increased complications according to quantity of embolic material used. Therefore, UAE should be offered to women with large fibroids and uterine volumes.
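The group comparisons above rely on Fisher's exact test on 2×2 tables. A self-contained sketch of the two-sided test follows; the complication counts in the example are hypothetical, since the abstract reports only the resulting P values:

```python
from math import comb

def fisher_exact_two_sided(a: int, b: int, c: int, d: int) -> float:
    """Two-sided Fisher's exact p for the 2x2 table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same margins
    whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2
    def p_table(x):  # P(first cell = x) under fixed margins
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)
    p_obs = p_table(a)
    lo = max(0, col1 - row2)
    hi = min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-12))

# Hypothetical example: 3/30 complications with large fibroids
# vs 8/91 with small ones -- a non-significant difference.
p = fisher_exact_two_sided(3, 27, 8, 83)
print(f"p = {p:.3f}")
```

Fisher's exact test is the natural choice here because several cells (complication counts per group) are small, where the chi-squared approximation is unreliable.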

  19. An annular BF3 counter of large sensitive volume

    International Nuclear Information System (INIS)

    Janardhanan, S.; Swaminathan, N.

    1975-01-01

An annular neutron counter with a large sensitive volume, inner and outer diameter 31 cm, and a multiple-electrode system, fabricated especially to measure the neutron output from the fissile region of standard fast reactor fuel of nearly 500 cm length, is described. The counter efficiency is nearly 0.3% for neutrons, with a sensitivity of 0.0018 counts/neutron for (alpha, neutron) and spontaneous fission sources. Other potential applications indicated are: (1) quality control of fast reactor fuel pins, (2) fuel inventory, (3) assessing the radioactivity of solid waste packets containing PuO₂, (4) checking the uniformity of fuel loading of a reactor, and (5) neutron monitoring in a fuel plant. (M.G.B.)

  20. Dependence of the clustering properties of galaxies on stellar velocity dispersion in the Main galaxy sample of SDSS DR10

    Science.gov (United States)

    Deng, Xin-Fa; Song, Jun; Chen, Yi-Qing; Jiang, Peng; Ding, Ying-Ping

    2014-08-01

Using two volume-limited Main galaxy samples of the Sloan Digital Sky Survey Data Release 10 (SDSS DR10), we investigate the dependence of the clustering properties of galaxies on stellar velocity dispersion by cluster analysis. It is found that in the luminous volume-limited Main galaxy sample, except at r=1.2, richer and larger systems can be more easily formed in the large stellar velocity dispersion subsample, while in the faint volume-limited Main galaxy sample, at r≥0.9, an opposite trend is observed. According to statistical analyses of the multiplicity functions, we conclude that in both volume-limited Main galaxy samples small stellar velocity dispersion galaxies preferentially form isolated galaxies, close pairs and small groups, while large stellar velocity dispersion galaxies preferentially inhabit dense groups and clusters. However, we note a difference between the two volume-limited Main galaxy samples: in the faint volume-limited Main galaxy sample, at r≥0.9, the small stellar velocity dispersion subsample has a higher proportion of galaxies in superclusters (n ≥ 200) than the large stellar velocity dispersion subsample.
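The cluster analysis above groups galaxies at a given linking length r and summarizes the resulting systems via multiplicity functions. A toy friends-of-friends sketch in 2D; the points and linking lengths are illustrative, not SDSS data:

```python
from collections import Counter

def friends_of_friends(points, r):
    """Group 2D points whose pairwise distance is <= r (union-find),
    and return system sizes in descending order (a multiplicity count)."""
    n = len(points)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    r2 = r * r
    for i in range(n):
        for j in range(i + 1, n):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            if dx * dx + dy * dy <= r2:
                union(i, j)
    groups = Counter(find(i) for i in range(n))
    return sorted(groups.values(), reverse=True)

pts = [(0, 0), (0.5, 0), (1.0, 0), (5, 5), (5.4, 5), (9, 9)]
print(friends_of_friends(pts, 0.6))  # [3, 2, 1]: a triplet, a pair, an isolated galaxy
```

Varying r is what produces the trends at r = 0.9 and r = 1.2 discussed above: a larger linking length merges small systems into richer ones.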

  1. SDSS Log Viewer: visual exploratory analysis of large-volume SQL log data

    Science.gov (United States)

    Zhang, Jian; Chen, Chaomei; Vogeley, Michael S.; Pan, Danny; Thakar, Ani; Raddick, Jordan

    2012-01-01

User-generated Structured Query Language (SQL) queries are a rich source of information for database analysts, information scientists, and the end users of databases. In this study, a group of astronomers and computer and information scientists worked together to analyze a large volume of SQL log data generated by users of the Sloan Digital Sky Survey (SDSS) data archive in order to better understand users' data seeking behavior. While statistical analysis of such logs is useful at aggregated levels, efficiently exploring specific patterns of queries is often a challenging task due to the typically large volume of the data, multivariate features, and data requirements specified in SQL queries. To enable and facilitate effective and efficient exploration of the SDSS log data, we designed an interactive visualization tool, called the SDSS Log Viewer, which integrates time series visualization, text visualization, and dynamic query techniques. We describe two analysis scenarios of visual exploration of SDSS log data, including understanding unusually high daily query traffic and modeling the types of data seeking behaviors of massive query generators. The two scenarios demonstrate that the SDSS Log Viewer provides a novel and potentially valuable approach to support these targeted tasks.

  2. Intraoperative ventilation: incidence and risk factors for receiving large tidal volumes during general anesthesia

    Directory of Open Access Journals (Sweden)

    Fernandez-Bustamante Ana

    2011-11-01

Full Text Available Abstract Background There is a growing concern about the potentially injurious role of ventilatory over-distention in patients without lung injury. No formal guidelines exist for intraoperative ventilation settings, but the use of tidal volumes (VT) under 10 mL/kg predicted body weight (PBW) has been recommended in healthy patients. We explored the incidence of and risk factors for receiving large tidal volumes (VT > 10 mL/kg PBW). Methods We performed a cross-sectional analysis of our prospectively collected perioperative electronic database for current intraoperative ventilation practices and risk factors for receiving large tidal volumes (VT > 10 mL/kg PBW). We included all adults undergoing prolonged (≥ 4 h) elective abdominal surgery and collected demographic, preoperative (comorbidities), intraoperative (i.e. ventilatory settings, fluid administration) and postoperative (outcomes) information. We compared patients receiving exhaled tidal volumes > 10 mL/kg PBW with those who received 8-10 or < 8 mL/kg PBW. Results Ventilatory settings were non-uniform in the 429 adults included in the analysis. 17.5% of all patients received VT > 10 mL/kg PBW, as did 34.0% of all obese patients (body mass index, BMI, ≥ 30) and 51% of all patients below a certain height. Conclusions Ventilation with VT > 10 mL/kg PBW is still common, although poor correlation with PBW suggests it may be unintentional. BMI ≥ 30, female gender and height were identified as risk factors for receiving VT > 10 mL/kg PBW.
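The VT > 10 mL/kg PBW criterion above depends on predicted body weight. A sketch using the widely cited ARDSNet/Devine PBW formula; assuming this particular formula is what the study used, since the abstract does not state it:

```python
def predicted_body_weight(height_cm: float, female: bool) -> float:
    """ARDSNet/Devine predicted body weight in kg (assumed formula)."""
    base = 45.5 if female else 50.0
    return base + 0.91 * (height_cm - 152.4)

def vt_per_kg_pbw(tidal_volume_ml: float, height_cm: float, female: bool) -> float:
    """Normalize a set tidal volume (mL) by predicted body weight."""
    return tidal_volume_ml / predicted_body_weight(height_cm, female)

# The same absolute VT of 500 mL is ~7.1 mL/kg PBW for a 175 cm man but
# >10 mL/kg PBW for a 155 cm woman -- the height and gender effect reported.
print(round(vt_per_kg_pbw(500, 175, female=False), 1))  # 7.1
print(round(vt_per_kg_pbw(500, 155, female=True), 1))   # 10.4
```

Because PBW depends only on height and sex, a fixed absolute VT systematically over-ventilates shorter and female patients, which is consistent with the risk factors identified above.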

  3. Large volume TENAX® extraction of the bioaccessible fraction of sediment-associated organic compounds for a subsequent effect-directed analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schwab, K.; Brack, W. [UFZ - Helmholtz Centre for Environmental Research, Leipzig (Germany). Dept. of Effect-Directed Analysis

    2007-06-15

Background, Aim and Scope: Effect-directed analysis (EDA) is a powerful tool for the identification of key toxicants in complex environmental samples. In most cases, EDA is based on total extraction of organic contaminants, leading to an erroneous prioritization with regard to hazard and risk. Bioaccessibility-directed extraction aims to discriminate between contaminants that take part in partitioning between sediment and biota in a relevant time frame and those that are enclosed in structures that do not allow rapid desorption. Standard protocols for targeted extraction of the rapidly desorbing, and thus bioaccessible, fraction using TENAX® are based on only small amounts of sediment. In order to obtain sufficient amounts of extract for subsequent biotesting, fractionation and structure elucidation, a large volume extraction technique needed to be developed, applying one selected extraction time and excluding toxic procedural blanks. Materials and Methods: Desorption behaviour of sediment contaminants was determined by consecutive solid-solid extraction of sediment using TENAX®, fitting a tri-compartment model to the experimental data. The time needed to remove the rapidly desorbing fraction, t_rap, was calculated in order to select a fixed extraction time for single extraction procedures. Up-scaling by about a factor of 100 provided a large volume extraction technique for EDA. Reproducibility and comparability to the small volume approach were verified. Blanks of the respective TENAX® mass were investigated using Scenedesmus vacuolatus and Artemia salina as test organisms. Results: Desorption kinetics showed that 12 to 30 % of sediment-associated pollutants are available for rapid desorption. t_rap is compound dependent and covers a range of 2 to 18 h. On that basis a fixed extraction time of 24 h was selected. Validation of the large volume approach was done by means of comparison to the small volume method and by reproducibility. The large volume showed a good
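The tri-compartment desorption model mentioned above can be sketched as three first-order pools (rapid, slow, very slow); the rate constants below are illustrative, not the paper's fitted values:

```python
import math

def fraction_remaining(t_h, f_rap, k_rap, f_slow, k_slow, k_vslow):
    """Sediment-bound fraction left after t_h hours of TENAX extraction,
    modelled as rapid + slow + very slow first-order desorbing pools."""
    f_vslow = 1.0 - f_rap - f_slow
    return (f_rap * math.exp(-k_rap * t_h)
            + f_slow * math.exp(-k_slow * t_h)
            + f_vslow * math.exp(-k_vslow * t_h))

def t_rap(k_rap, depletion=0.95):
    """Hours until `depletion` of the rapidly desorbing pool is removed."""
    return -math.log(1.0 - depletion) / k_rap

# With an assumed k_rap of 0.3 /h, ~95 % of the rapid pool is stripped in
# about 10 h, consistent with fixing a 24 h single-extraction time.
print(round(t_rap(0.3), 1))  # 10.0
```

Fixing one extraction time (24 h above) trades a little precision for a protocol that scales to the 100× larger sediment masses the study needed.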

  4. Performance of large electron energy filter in large volume plasma device

    International Nuclear Information System (INIS)

    Singh, S. K.; Srivastava, P. K.; Awasthi, L. M.; Mattoo, S. K.; Sanyasi, A. K.; Kaw, P. K.; Singh, R.

    2014-01-01

This paper describes an in-house designed large Electron Energy Filter (EEF) utilized in the Large Volume Plasma Device (LVPD) [S. K. Mattoo, V. P. Anita, L. M. Awasthi, and G. Ravi, Rev. Sci. Instrum. 72, 3864 (2001)] to secure objectives of (a) removing the presence of remnant primary ionizing energetic electrons and the non-thermal electrons, (b) introducing a radial gradient in plasma electron temperature without greatly affecting the radial profile of plasma density, and (c) providing a control on the scale length of gradient in electron temperature. A set of 19 independent coils of the EEF makes a variable aspect ratio, rectangular solenoid producing a magnetic field (B_x) of 100 G along its axis and transverse to the ambient axial field (B_z ∼ 6.2 G) of LVPD, when all its coils are used. Outside the EEF, the magnetic field reduces rapidly to 1 G at a distance of 20 cm from the center of the solenoid on either side of target and source plasma. The EEF divides LVPD plasma into three distinct regions of source, EEF and target plasma. We report that the target plasma (n_e ∼ 2 × 10^11 cm^-3 and T_e ∼ 2 eV) has no detectable energetic electrons and the radial gradients in its electron temperature can be established with scale length between 50 and 600 cm by controlling the EEF magnetic field. Our observations reveal that the role of the EEF magnetic field is manifested by the energy dependence of transverse electron transport and enhanced transport caused by the plasma turbulence in the EEF plasma.

  5. Utilization of AHWR critical facility for research and development work on large sample NAA

    International Nuclear Information System (INIS)

    Acharya, R.; Dasari, K.B.; Pujari, P.K.; Swain, K.K.; Reddy, A.V.R.; Verma, S.K.; De, S.K.

    2014-01-01

The graphite reflector position of the AHWR critical facility (CF) was utilized for analysis of large size (g-kg scale) samples using internal mono-standard neutron activation analysis (IM-NAA). The reactor position was characterized by the cadmium ratio method using an In monitor for total flux and the sub-cadmium to epithermal flux ratio (f). Large sample neutron activation analysis (LSNAA) work was carried out for samples of stainless steel, ancient and new clay potteries and dross. Large as well as non-standard geometry samples (1 g - 0.5 kg) were irradiated. Radioactive assay was carried out using high resolution gamma ray spectrometry. Concentration ratios obtained by IM-NAA were used for a provenance study of 30 clay potteries obtained from excavated Buddhist sites of AP, India. Concentrations of Au and Ag were determined in three large, not-so-homogeneous samples of dross. An X-Z rotary scanning unit has been installed for counting large and not-so-homogeneous samples. (author)

  6. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    Science.gov (United States)

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-02-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of small size and volume cryogel material. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale up to much larger gel dimensions. This method can be used for production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting production of large gel-based devices and matrices. The proposed method could serve as a new design concept for functional 3D macroporous gels and composites preparation for biomedical, biotechnological and environmental applications.

  7. Topologically-based visualization of large-scale volume data

    International Nuclear Information System (INIS)

    Takeshima, Y.; Tokunaga, M.; Fujishiro, I.; Takahashi, S.

    2004-01-01

    Due to the recent progress in the performance of computing/measurement environments and the advent of ITBL environments, volume datasets have become larger and more complicated. Although computer visualization is one of the tools to analyze such datasets effectively, it is almost impossible to adjust the visualization parameter value by trial and error without taking the feature of a given volume dataset into consideration. In this article, we introduce a scheme of topologically-based volume visualization, which is intended to choose appropriate visualization parameter values automatically through topological volume skeletonization. (author)

  8. Evaluation of Sampling Methods for Bacillus Spore ...

    Science.gov (United States)

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  9. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Jonušauskas, Steponas; Raišienė, Agota Giedrė

    2016-01-01

With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample consisting of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  10. Safe total corporal contouring with large-volume liposuction for the obese patient.

    Science.gov (United States)

    Dhami, Lakshyajit D; Agarwal, Meenakshi

    2006-01-01

The advent of the tumescent technique in 1987 allowed for safe total corporal contouring as an ambulatory, single-session megaliposuction with the patient under regional anesthesia supplemented by local anesthetic only in selected areas. Safety and aesthetic issues define large-volume liposuction as having a 5,000-ml aspirate, mega-volume liposuction as having an 8,000-ml aspirate, and giganto-volume liposuction as having an aspirate of 12,000 ml or more. Clinically, a total volume comprising 5,000 ml of fat and wetting solution aspirated during the procedure qualifies for megaliposuction/large-volume liposuction. Between September 2000 and August 2005, 470 cases of liposuction were managed. In 296 (63%) of the 470 cases, the total volume of aspirate exceeded 5 l (range, 5,000-22,000 ml). Concurrent limited or total-block lipectomy was performed in 70 of 296 cases (23.6%). Regional anesthesia with conscious sedation was preferred, except where liposuction targeted areas above the subcostal region (the upper trunk, lateral chest, gynecomastia, breast, arms, and face), or when the patient so desired. Tumescent infiltration was achieved with hypotonic lactated Ringer's solution, adrenalin, triamcinolone, and hyalase in all cases during the last one year of the series. This approach has clinically shown less tissue edema in the postoperative period than with conventional physiologic saline used in place of the Ringer's lactate solution. The amount injected varied from 1,000 to 8,000 ml depending on the size, site, and area. Local anesthetic was included only for the terminal portion of the tumescent mixture, wherever the subcostal regions were infiltrated. The aspirate was restricted to the unstained white/yellow fat, and the amount of fat aspirated did not have any bearing on the amount of solution infiltrated. There were no major complications, and no blood transfusions were administered. The hospital stay ranged from 8 to 24 h for both liposuction and liposuction

  11. 105-DR Large sodium fire facility soil sampling data evaluation report

    International Nuclear Information System (INIS)

    Adler, J.G.

    1996-01-01

    This report evaluates the soil sampling activities, soil sample analysis, and soil sample data associated with the closure activities at the 105-DR Large Sodium Fire Facility. The evaluation compares these activities to the regulatory requirements for meeting clean closure. The report concludes that there is no soil contamination from the waste treatment activities

  12. New experimental procedure for measuring volume magnetostriction on powder samples

    International Nuclear Information System (INIS)

    Rivero, G.; Multigner, M.; Valdes, J.; Crespo, P.; Martinez, A.; Hernando, A.

    2005-01-01

Conventional techniques used for volume magnetostriction measurements, such as the strain gauge or cantilever methods, are very useful for ribbons or thin films but cannot be applied when the samples are in powder form. To overcome this problem a new experimental procedure has been developed. In this work, the experimental set-up is described, together with the results obtained in amorphous FeCuZr powders, which exhibit a strong dependence of the magnetization on the strength of the applied magnetic field. The magnetostriction measurements presented in this work point out that this dependence is related to a magnetovolume effect.

  13. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy

    2014-05-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.
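The Minkowski-sum property described above has a familiar planar special case: for a chain of revolute links rooted at the origin, the reachable volume is the Minkowski sum of circles of the link lengths, i.e. an annulus. A toy 2D sketch (the paper's formulation is more general, covering spherical and prismatic joints as well):

```python
def annulus_reachable(link_lengths):
    """(inner, outer) radius of the reachable annulus of a planar chain
    of freely rotating links; the outer radius is the fully extended
    chain, the inner radius is nonzero only if one link dominates."""
    total = sum(link_lengths)
    longest = max(link_lengths)
    inner = max(0.0, 2.0 * longest - total)  # longest - sum(others)
    return inner, total

def reachable(point_r, link_lengths):
    """Can the end effector reach a point at distance point_r from the base?"""
    inner, outer = annulus_reachable(link_lengths)
    return inner <= point_r <= outer

# Two links of length 2 and 1: reachable radii span [1, 3].
print(annulus_reachable([2.0, 1.0]))  # (1.0, 3.0)
# Three equal links can reach the base itself: inner radius 0.
print(annulus_reachable([1.0, 1.0, 1.0]))  # (0.0, 3.0)
```

Computing the composite reachable set from per-link sets, rather than sampling joint angles, is the efficiency the abstract's configuration-generation method exploits.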

  14. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2014-01-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.

  15. Subcortical intelligence: caudate volume predicts IQ in healthy adults.

    Science.gov (United States)

    Grazioplene, Rachael G; G Ryman, Sephira; Gray, Jeremy R; Rustichini, Aldo; Jung, Rex E; DeYoung, Colin G

    2015-04-01

    This study examined the association between size of the caudate nuclei and intelligence. Based on the central role of the caudate in learning, as well as neuroimaging studies linking greater caudate volume to better attentional function, verbal ability, and dopamine receptor availability, we hypothesized the existence of a positive association between intelligence and caudate volume in three large independent samples of healthy adults (total N = 517). Regression of IQ onto bilateral caudate volume controlling for age, sex, and total brain volume indicated a significant positive correlation between caudate volume and intelligence, with a comparable magnitude of effect across each of the three samples. No other subcortical structures were independently associated with IQ, suggesting a specific biological link between caudate morphology and intelligence. © 2014 Wiley Periodicals, Inc.
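The analysis described above is a regression of IQ on caudate volume controlling for age, sex, and total brain volume. An OLS sketch on synthetic data; all numbers are illustrative, and only the regression structure mirrors the study:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
age = rng.uniform(18, 45, n)
sex = rng.integers(0, 2, n).astype(float)
tbv = rng.normal(1200, 100, n)                  # total brain volume, cm^3
caudate = 0.006 * tbv + rng.normal(0, 0.4, n)   # caudate tracks TBV
# Synthetic IQ with a built-in caudate effect plus noise.
iq = 100 + 5.0 * (caudate - caudate.mean()) + rng.normal(0, 10, n)

# Design matrix: intercept, covariates (age, sex, TBV), predictor (caudate).
X = np.column_stack([np.ones(n), age, sex, tbv, caudate])
beta, *_ = np.linalg.lstsq(X, iq, rcond=None)
print(f"caudate coefficient, controlling for age/sex/TBV: {beta[-1]:.2f}")
```

Including total brain volume as a covariate is the key step: because caudate volume scales with overall brain size, omitting it would conflate a caudate-specific effect with a global one.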

  16. Examining gray matter structures associated with individual differences in global life satisfaction in a large sample of young adults.

    Science.gov (United States)

    Kong, Feng; Ding, Ke; Yang, Zetian; Dang, Xiaobin; Hu, Siyuan; Song, Yiying; Liu, Jia

    2015-07-01

Although much attention has been directed towards life satisfaction, which refers to an individual's general cognitive evaluations of his or her life as a whole, little is known about the neural basis underlying global life satisfaction. In this study, we used voxel-based morphometry to investigate the structural neural correlates of life satisfaction in a large sample of young healthy adults (n = 299). We showed that individuals' life satisfaction was positively correlated with the regional gray matter volume (rGMV) in the right parahippocampal gyrus (PHG), and negatively correlated with the rGMV in the left precuneus and left ventromedial prefrontal cortex. This pattern of results remained significant even after controlling for the effect of general positive and negative affect, suggesting unique structural correlates of life satisfaction. Furthermore, we found that self-esteem partially mediated the association between the PHG volume and life satisfaction as well as that between the precuneus volume and global life satisfaction. Taken together, we provide the first evidence for the structural neural basis of life satisfaction, and highlight that self-esteem might play a crucial role in cultivating an individual's life satisfaction. © The Author (2014). Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.

  17. Human blood RNA stabilization in samples collected and transported for a large biobank

    Science.gov (United States)

    2012-01-01

Background The Norwegian Mother and Child Cohort Study (MoBa) is a nation-wide population-based pregnancy cohort initiated in 1999, comprising more than 108,000 pregnancies recruited between 1999 and 2008. In this study we evaluated the feasibility of integrating RNA analyses into existing MoBa protocols. We compared two different blood RNA collection tube systems – the PAXgene™ Blood RNA system and the Tempus™ Blood RNA system – and assessed the effects of suboptimal blood volumes in collection tubes and of transportation of blood samples by standard mail. Endpoints to characterize the samples were RNA quality and yield, and the RNA transcript stability of selected genes. Findings High-quality RNA could be extracted from blood samples stabilized with both PAXgene and Tempus tubes. The RNA yields obtained from the blood samples collected in Tempus tubes were consistently higher than from PAXgene tubes. Higher RNA yields were obtained from cord blood (3 – 4 times) compared to adult blood with both types of tubes. Transportation of samples by standard mail had moderate effects on RNA quality and RNA transcript stability; the overall RNA quality of the transported samples was high. Some unexplained changes in gene expression were noted, which seemed to correlate with suboptimal blood volumes collected in the tubes. Temperature variations during transportation may also be of some importance. Conclusions Our results strongly suggest that special collection tubes are necessary for RNA stabilization and they should be used for establishing new biobanks. We also show that the 50,000 samples collected in the MoBa biobank provide RNA of high quality and in sufficient amounts to allow gene expression analyses for studying the association of disease with altered patterns of gene expression. PMID:22988904

  18. Characterization of segmented large volume, high purity germanium detectors

    International Nuclear Information System (INIS)

    Bruyneel, B.

    2006-01-01

    γ-ray tracking in future HPGe arrays like AGATA will rely on pulse shape analysis (PSA) of multiple γ-interactions. For this purpose, a simple and fast procedure was developed which enabled the first full characterization of a segmented large-volume HPGe detector. An analytical model for the hole mobility in a Ge crystal lattice was developed to describe the hole drift anisotropy, with experimental velocity values along the crystal axes as parameters. The new model is based on the drifted Maxwellian hole distribution in Ge. It is verified by successfully reproducing experimental longitudinal hole anisotropy data. A comparison between electron and hole mobility shows large differences in the longitudinal and tangential velocity anisotropy as a function of the electric field orientation. Measurements on a 12-fold segmented, n-type, large-volume, irregularly shaped HPGe detector were performed in order to determine the parameters of anisotropic mobility for electrons and holes as charge carriers created by γ-ray interactions. To characterize the electron mobility, the complete outer detector surface was scanned in small steps employing photopeak interactions at 60 keV. A precise measurement of the hole drift anisotropy was performed with 356 keV rays. The drift velocity anisotropy and crystal geometry cause considerable rise-time differences in pulse shapes depending on the position of the spatial charge carrier creation. Pulse shapes of direct and transient signals are reproduced by weighting potential calculations with high precision. The measured angular dependence of rise times is caused by the anisotropic mobility, crystal geometry, changing field strength and space charge effects. Preamplified signals were processed employing digital spectroscopy electronics. Response functions, crosstalk contributions and averaging procedures were taken into account, requiring novel methods due to the segmentation of the Ge crystal and the digital electronics. The results are

  20. Associations between sociodemographic, sampling and health factors and various salivary cortisol indicators in a large sample without psychopathology

    NARCIS (Netherlands)

    Vreeburg, Sophie A.; Kruijtzer, Boudewijn P.; van Pelt, Johannes; van Dyck, Richard; DeRijk, Roel H.; Hoogendijk, Witte J. G.; Smit, Johannes H.; Zitman, Frans G.; Penninx, Brenda

    Background: Cortisol levels are increasingly often assessed in large-scale psychosomatic research. Although determinants of different salivary cortisol indicators have been described, they have not yet been systematically studied within the same study with a large sample size. Sociodemographic,

  1. Sampling and analysis plan for the consolidated sludge samples from the canisters and floor of the 105-K East basin

    International Nuclear Information System (INIS)

    BAKER, R.B.

    1999-01-01

    This Sampling and Analysis Plan (SAP) provides direction for sampling of fuel canister and floor sludge from the K East Basin to complete the inventory of samples needed for sludge treatment process testing. Sample volumes and sources reflect recent reviews made by the sludge treatment subproject. The representative samples will be characterized to the extent needed for the material to be used effectively for testing. The sampling equipment used allows drawing of large-volume sludge samples and consolidation of sample material from a number of basin locations into one container. Once filled, the containers will be placed in a cask and transported to Hanford laboratories for recovery and evaluation. Included in the present SAP are the logic for sample location selection, the laboratory analysis procedures required, and the reporting needed to meet the Data Quality Objectives (DQOs) for this initiative

  2. A course in mathematical statistics and large sample theory

    CERN Document Server

    Bhattacharya, Rabi; Patrangenaru, Victor

    2016-01-01

    This graduate-level textbook is primarily aimed at graduate students of statistics, mathematics, science, and engineering who have had an undergraduate course in statistics, an upper division course in analysis, and some acquaintance with measure theoretic probability. It provides a rigorous presentation of the core of mathematical statistics. Part I of this book constitutes a one-semester course on basic parametric mathematical statistics. Part II deals with the large sample theory of statistics — parametric and nonparametric, and its contents may be covered in one semester as well. Part III provides brief accounts of a number of topics of current interest for practitioners and other disciplines whose work involves statistical methods. Large Sample theory with many worked examples, numerical calculations, and simulations to illustrate theory Appendices provide ready access to a number of standard results, with many proofs Solutions given to a number of selected exercises from Part I Part II exercises with ...

  3. Polymeric ionic liquid-based portable tip microextraction device for on-site sample preparation of water samples.

    Science.gov (United States)

    Chen, Lei; Pei, Junxian; Huang, Xiaojia; Lu, Min

    2018-06-05

    On-site sample preparation is highly desirable because it avoids the transportation of large-volume samples and ensures the accuracy of the analytical results. In this work, a portable prototype of a tip microextraction device (TMD) was designed and developed for on-site sample pretreatment. The assembly procedure of the TMD is quite simple. Firstly, a polymeric ionic liquid (PIL)-based adsorbent was prepared in situ in a pipette tip. After that, the tip was connected with a syringe driven by a bidirectional motor. The flow rates in the adsorption and desorption steps were controlled accurately by the motor. To evaluate the practicability of the developed device, the TMD was used for on-site sample preparation of waters and combined with high-performance liquid chromatography with diode array detection to measure trace estrogens in water samples. Under the most favorable conditions, the limits of detection (LODs, S/N = 3) for the target analytes were in the range of 4.9-22 ng/L, with good coefficients of determination. A confirmatory study provides good evidence that the extraction performance of the TMD is comparable to that of the traditional laboratory solid-phase extraction process, but the proposed TMD is simpler and more convenient. At the same time, the TMD avoids complicated sampling and transferring steps for large-volume water samples. Copyright © 2018 Elsevier B.V. All rights reserved.
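The "S/N = 3" criterion cited above is the common convention for reporting limits of detection: LOD = 3 × (standard deviation of blank measurements) / (calibration slope). A minimal sketch of that calculation (all numbers are hypothetical, not values from the study):

```python
# Estimate a limit of detection (LOD) from the S/N = 3 convention:
# LOD = 3 * sigma_blank / calibration slope.
# All numbers below are hypothetical, for illustration only.

def lod_sn3(blank_signals, slope):
    """LOD = 3 * stdev(blank signals) / slope (concentration units)."""
    n = len(blank_signals)
    mean = sum(blank_signals) / n
    var = sum((s - mean) ** 2 for s in blank_signals) / (n - 1)  # sample variance
    return 3 * var ** 0.5 / slope

blanks = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0]  # detector counts for blank injections
slope = 120.0                                # counts per (ng/L), hypothetical
print(f"LOD ~ {lod_sn3(blanks, slope):.4f} ng/L")
```

The same convention applies regardless of the detector, which is why abstracts can quote an LOD without describing the blank measurements in detail.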

  4. Resuscitation of a Polytraumatized Patient with Large Volume Crystalloid-Colloid Infusions – Correlation Between Global and Regional Hemodynamics: Case Report

    OpenAIRE

    Lončarić-Katušin, Mirjana; Belavić, Matija; Žunić, Josip; Gučanin, Snježana; Žilić, Antonio; Korać, Želimir

    2010-01-01

    Aggressive large volume resuscitation is obligatory to achieve necessary tissue oxygenation. An adequate venous preload normalizes global hemodynamics and avoids multiorgan failure (MOF) and death in patients with multiple injuries. Large volume resuscitation is associated with complications in minimally monitored patients. A properly guided resuscitation procedure will finally prevent MOF and patient death. Transpulmonary thermodilution technique and gastric tonometry are used in venous prel...

  5. Test plan for evaluating the performance of the in-tank fluidic sampling system

    International Nuclear Information System (INIS)

    BOGER, R.M.

    1999-01-01

    The PHMC will provide Low Activity Wastes (LAW) tank wastes for final treatment by a privatization contractor from double-shell feed tanks 241-AP-102 and 241-AP-104. Concerns about the inability of the baseline "grab" sampling to provide large-volume samples within time constraints led to the development of a conceptual sampling system that would be deployed in a feed-tank riser. This sampling system will provide large-volume, representative samples without the environmental, radiation exposure, and sample volume impacts of the current baseline "grab" sampling method. This test plan identifies "proof-of-principle" cold tests for the conceptual sampling system using simulant materials. The need for additional testing was identified as a result of completing the tests described in the previous revision of this test plan. Revision 1 outlines tests that will evaluate the system's performance and its ability to provide samples that are representative of a tank's contents within a 95 percent confidence interval, to recover from plugging, and to sample supernatant wastes with over 25 wt% solids content, and will evaluate the impact of sampling at different heights within the feed tank. The test plan also identifies operating parameters that will optimize the performance of the sampling system
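The 95 percent confidence interval criterion mentioned above amounts to a standard t-interval on repeated sub-sample measurements. A minimal sketch, with hypothetical solids-content data and a table value for the t critical point:

```python
# Two-sided 95% t-interval for the mean of repeated sub-sample measurements.
# Data values are hypothetical, for illustration only.
import statistics

def ci95_mean(samples, t_crit):
    """Return (lower, upper) of the two-sided 95% t-interval for the mean."""
    m = statistics.mean(samples)
    se = statistics.stdev(samples) / len(samples) ** 0.5  # standard error
    return m - t_crit * se, m + t_crit * se

solids = [26.1, 27.3, 25.8, 26.9, 27.0, 26.4, 25.9, 26.7, 27.1, 26.2]  # wt%, hypothetical
lo, hi = ci95_mean(solids, t_crit=2.262)  # t(0.975, df=9) from standard tables
print(f"95% CI for mean solids content: {lo:.2f}-{hi:.2f} wt%")
```

If the interval stays within the acceptance band defined for the tank, the sample set is judged representative at the 95 percent level.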

  6. State of art data acquisition system for large volume plasma device

    International Nuclear Information System (INIS)

    Sugandhi, Ritesh; Srivastava, Pankaj; Sanyasi, Amulya Kumar; Srivastav, Prabhakar; Awasthi, Lalit Mohan; Mattoo, Shiban Krishna; Parmar, Vijay; Makadia, Keyur; Patel, Ishan; Shah, Sandeep

    2015-01-01

    The Large Volume Plasma Device (LVPD) is a cylindrical device (ϕ = 2 m, L = 3 m) dedicated to investigations of plasma physics problems ranging from the excitation of whistler structures to plasma turbulence, especially the linear and nonlinear aspects of electron temperature gradient (ETG) driven turbulence and plasma transport over the entire cross section of the LVPD. The machine operates in a pulsed mode with a repetition cycle of 1 Hz and an acquisition pulse length of 15 ms. Presently, the LVPD has a VXI data acquisition system, but this is being phased out because of the non-functioning of its various amplifier stages, limited expandability, and the unavailability of service support. The VXI system has limited capabilities to meet new experimental requirements in terms of number of channels (16), bit resolution (8 bit), record length (30K points) and calibration support. Recently, the integration of a new acquisition system for simultaneous sampling of 40 channels of data, collected over multiple time scales at high speed, was successfully demonstrated by configuring the latest available hardware and in-house developed software solutions. The operational feasibility provided by the LabVIEW platform covers not only operating the DAQ system but also controlling the various subsystems associated with the device. The new system is based on the PXI Express instrumentation bus and supersedes the existing VXI-based data acquisition system in terms of instrumentation capabilities. This system has the capability to measure 32 signals at 60 MHz sampling frequency and 8 signals at 1.25 GHz, with 10 bit and 12 bit resolution for amplitude measurements. The PXI-based system successfully addresses the issues of high channel count, high-speed data streaming and synchronization of multiple I/O modules. The system consists of a chassis (NI 1085), 4 high-sampling digitizers (NI 5105), 2 very-high-sampling digitizers (NI 5162), data streaming RAID drive (NI

  7. Gender moderates the association between dorsal medial prefrontal cortex volume and depressive symptoms in a subclinical sample.

    Science.gov (United States)

    Carlson, Joshua M; Depetro, Emily; Maxwell, Joshua; Harmon-Jones, Eddie; Hajcak, Greg

    2015-08-30

    Major depressive disorder is associated with lower medial prefrontal cortex volumes. The role that gender might play in moderating this relationship and what particular medial prefrontal cortex subregion(s) might be implicated is unclear. Magnetic resonance imaging was used to assess dorsal, ventral, and anterior cingulate regions of the medial prefrontal cortex in a normative sample of male and female adults. The Depression, Anxiety, and Stress Scale (DASS) was used to measure these three variables. Voxel-based morphometry was used to test for correlations between medial prefrontal gray matter volume and depressive traits. The dorsal medial frontal cortex was correlated with greater levels of depression, but not anxiety and stress. Gender moderates this effect: in males greater levels of depression were associated with lower dorsal medial prefrontal volumes, but in females no relationship was observed. The results indicate that even within a non-clinical sample, male participants with higher levels of depressive traits tend to have lower levels of gray matter volume in the dorsal medial prefrontal cortex. Our finding is consistent with low dorsal medial prefrontal volume contributing to the development of depression in males. Future longitudinal work is needed to substantiate this possibility. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
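"Gender moderates the association" means the volume-symptom regression slope differs between groups (formally, an interaction term in the model). A minimal per-group slope comparison with synthetic data mirroring the reported pattern (negative slope for males, essentially flat for females); the numbers are illustrative assumptions, not data from the study:

```python
# Per-group least-squares slopes of depressive score on gray matter volume.
# Synthetic data: depression falls with volume in males, is flat in females,
# mirroring the moderation pattern reported in the abstract.

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# gray matter volume (arbitrary units) vs depressive score, synthetic:
male_vol = [4.0, 4.2, 4.5, 4.8, 5.0, 5.3]
male_dep = [14.0, 12.5, 11.0, 9.5, 8.0, 6.5]     # falls as volume rises
female_vol = [4.0, 4.2, 4.5, 4.8, 5.0, 5.3]
female_dep = [10.0, 10.4, 9.8, 10.2, 9.9, 10.1]  # essentially flat

print(f"male slope:   {slope(male_vol, male_dep):.2f}")
print(f"female slope: {slope(female_vol, female_dep):.2f}")
```

In a full analysis the two slopes would be compared via a volume × gender interaction term in a single regression, with covariates such as total brain volume.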

  8. A New Electropositive Filter for Concentrating Enterovirus and Norovirus from Large Volumes of Water - MCEARD

    Science.gov (United States)

    The detection of enteric viruses in environmental water usually requires the concentration of viruses from large volumes of water. The 1MDS electropositive filter is commonly used for concentrating enteric viruses from water but unfortunately these filters are not cost-effective...

  9. Pre-column dilution large volume injection ultra-high performance liquid chromatography-tandem mass spectrometry for the analysis of multi-class pesticides in cabbages.

    Science.gov (United States)

    Zhong, Qisheng; Shen, Lingling; Liu, Jiaqi; Yu, Dianbao; Li, Siming; Yao, Jinting; Zhan, Song; Huang, Taohong; Hashi, Yuki; Kawano, Shin-ichi; Liu, Zhaofeng; Zhou, Ting

    2016-04-15

    Pre-column dilution large volume injection (PD-LVI), a novel sample injection technique for reversed-phase ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), was developed in this study. The PD-LVI UHPLC-MS/MS system was designed by slightly modifying commercial UHPLC-MS/MS equipment with a mixer chamber. During the PD-LVI procedure, 200 μL of sample solution was carried directly by the organic mobile phase to the mixer and diluted with the aqueous mobile phase. After the mixture was introduced to the UHPLC column in a mobile phase of acetonitrile-water (15/85, v/v), the target analytes were stacked on the head of the column prior to separation. Using QuEChERS extraction, no additional steps such as solvent evaporation or residue redissolution were needed before injection. The features of the PD-LVI UHPLC-MS/MS system were systematically investigated, including the injection volume, the mixer volume, the precondition time and the gradient elution. The efficiency of this approach was demonstrated by direct analysis of 24 pesticides in cabbages. Under the optimized conditions, low limits of detection (0.00074-0.8 ng/kg) were obtained. The recoveries were in the range of 63.3-109% with relative standard deviations less than 8.1%. Compared with the common UHPLC-MS/MS technique, PD-LVI UHPLC-MS/MS showed significant advantages in sensitivity and reliability. The mechanism of PD-LVI was demonstrated to be based on the column-head stacking effect with pre-column dilution. Based on these results, PD-LVI, as a simple and effective sample injection technique in reversed-phase UHPLC-MS/MS for the analysis of trace analytes in complex samples, shows great promise. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Comparison of pathogen DNA isolation methods from large volumes of whole blood to improve molecular diagnosis of bloodstream infections.

    Directory of Open Access Journals (Sweden)

    Anne J M Loonen

    Full Text Available For patients suffering from bloodstream infections (BSI), molecular diagnostics from whole blood holds promise to provide fast and adequate treatment. However, this approach is hampered by the need for large blood volumes. Three methods for pathogen DNA isolation from whole blood were compared, i.e. an enzymatic method (MolYsis; 1-5 ml), a novel non-enzymatic procedure (Polaris; 1-5 ml), and a method that does not entail removal of human DNA (Triton-Tris-EDTA EasyMAG; 200 µl). These methods were evaluated by processing blood spiked with 0-1000 CFU/ml of Staphylococcus aureus, Pseudomonas aeruginosa and Candida albicans. Downstream detection was performed with real-time PCR assays. Polaris and MolYsis processing followed by real-time PCRs enabled pathogen detection at clinically relevant concentrations of 1-10 CFU/ml blood. By increasing sample volumes, concurrently lower cycle threshold (Ct) values were obtained at clinically relevant pathogen concentrations, demonstrating the benefit of using larger blood volumes. A 100% detection rate at a concentration of 10 CFU/ml for all tested pathogens was obtained with the Polaris enrichment, whereas comparatively lower detection rates were measured for MolYsis (50-67%) and EasyMAG (58-79%). For the samples with a concentration of 1 CFU/ml, Polaris resulted in the most optimal detection rates of 70-75% (MolYsis 17-50% and TTE-EasyMAG 20-36%). The Polaris method was more reproducible, less labour-intensive, and faster (45 minutes, including Qiagen DNA extraction, vs. 2 hours for MolYsis). In conclusion, Polaris and MolYsis enrichment followed by DNA isolation and real-time PCR enables reliable and sensitive detection of bacteria and fungi from 5 ml blood. With Polaris, results are available within 3 hours, showing potential for improved BSI diagnostics.

  11. Micro- and nano-volume samples by electrothermal, near-torch vaporization sample introduction using removable, interchangeable and portable rhenium coiled-filament assemblies and axially-viewed inductively coupled plasma-atomic emission spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Badiei, Hamid R.; Lai, Bryant; Karanassios, Vassili

    2012-11-15

    An electrothermal, near-torch vaporization (NTV) sample introduction for micro- or nano-volume samples is described. Samples were pipetted onto coiled-filament assemblies that were purposely developed to be removable and interchangeable, and were dried and vaporized into a small-volume vaporization chamber that clips onto any ICP torch with a ball joint. Interchangeable assemblies were also constructed to be small (e.g., less than 3 cm long with a maximum diameter of 0.65 cm) and light-weight (1.4 g) so that they can be portable. Interchangeable assemblies with volume capacities in three ranges (i.e., < 1 µL, 1-10 µL and 10-100 µL) were fabricated and used. The horizontally-operated NTV sample introduction was interfaced to an axially-viewed ICP-AES (inductively coupled plasma-atomic emission spectrometry) system and NTV was optimized using ICP-AES and 8 elements (Pb, Cd, Zn, V, Ba, Mg, Be and Ca). Precision was 1.0-2.3% (peak height) and 1.1-2.4% (peak area). Detection limits (obtained using 5 µL volumes) expressed in absolute amounts ranged between 4 pg for Pb to 0.3 fg (≈ 5 million atoms) for Ca. Detection limits expressed in concentration units (obtained using 100 µL volumes of diluted, single-element standard solutions) were: 50 pg/mL for Pb; 10 pg/mL for Cd; 9 pg/mL for Zn; 1 pg/mL for V; 0.9 pg/mL for Ba; 0.5 pg/mL for Mg; 50 fg/mL for Be; and 3 fg/mL for Ca. Analytical capability and utility was demonstrated using the determination of Pb in pg/mL levels of diluted natural water Certified Reference Material (CRM) and the determination of Zn in 80 nL volumes of the liquid extracted from an individual vesicle. It is shown that portable and interchangeable assemblies with dried sample residues on them can be transported without analyte loss (for the concentrations tested), thus opening up the possibility for 'taking part of the lab to the sample' applications, such as testing for Cu concentration-compliance with the lead
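The two ways of expressing detection limits above (absolute amounts vs. concentrations) are linked by the deposited sample volume: absolute amount = concentration × volume. A sketch of that conversion using the concentration DLs quoted in the abstract with its 100 µL deposit volume (the results illustrate the unit arithmetic only; they are not the paper's 5 µL absolute figures):

```python
# Convert concentration detection limits (pg/mL) to absolute amounts (pg)
# for a given deposited volume: amount = concentration * volume.
# Concentration DLs are taken from the abstract; the conversion is illustrative.

def absolute_amount_pg(conc_pg_per_ml, volume_ul):
    """Absolute amount (pg) deposited, from a concentration DL and volume in uL."""
    return conc_pg_per_ml * (volume_ul / 1000.0)  # uL -> mL

dl_conc = {"Pb": 50.0, "Cd": 10.0, "Zn": 9.0, "V": 1.0, "Ba": 0.9, "Mg": 0.5}  # pg/mL
for element, c in dl_conc.items():
    print(f"{element}: {c} pg/mL x 100 uL = {absolute_amount_pg(c, 100):.2f} pg")
```

This is why small deposited volumes favor reporting in absolute amounts: the same filament assembly holds fewer picograms as the volume shrinks, even at a fixed concentration.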

  12. P08.52 Proton therapy re-Irradiation in large-volume recurrent glioblastoma.

    Science.gov (United States)

    Amelio, D.; Widesott, L.; Vennarini, S.; Fellin, F.; Maines, F.; Righetto, R.; Lorentini, S.; Farace, P.; Schwarz, M.; Amichetti, M.

    2016-01-01

    , concentration impairment, and dysphasia. During follow-up two pts (20%) developed radionecrosis (diagnosed at imaging) with mild symptoms controlled with steroids. There were no grade 3 or higher toxicities. The median progression-free survival (PFS) was 6.4 months, while the 3-, 6- and 9-month PFS rates were 80%, 67% and 22%, respectively. Median overall survival (OS) after PT was not reached, while the 6- and 12-month survival rates after PT were 100% and 60%, respectively. Conclusion: PT re-irradiation of large-volume rGBM was shown to be feasible and safe, even with concomitant chemotherapy administration. Despite the small number of patients and the retrospective nature of the study, the PFS and OS rates were promising and deserve further evaluation in a larger patient sample.

  13. Relationship Between LIBS Ablation and Pit Volume for Geologic Samples: Applications for the In Situ Absolute Geochronology

    Science.gov (United States)

    Devismes, Damien; Cohen, Barbara; Miller, J.-S.; Gillot, P.-Y.; Lefevre, J.-C.; Boukari, C.

    2014-01-01

    These first results demonstrate that LIBS spectra can be an interesting tool to estimate the ablated volume. When the ablated volume is larger than 9 × 10^6 cubic micrometers, this method has less than 10% uncertainty, accurate enough to be directly implemented in the KArLE experiment protocol. Nevertheless, depending on the samples and their mean grain size, the difficulty of obtaining homogeneous spectra will increase with the ablated volume. Several K-Ar dating studies based on this approach will be carried out, after which the results will be presented and discussed.

  14. Using Mobile Device Samples to Estimate Traffic Volumes

    Science.gov (United States)

    2017-12-01

    In this project, TTI worked with StreetLight Data to evaluate a beta version of its traffic volume estimates derived from global positioning system (GPS)-based mobile devices. TTI evaluated the accuracy of average annual daily traffic (AADT) volume :...
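AADT is the mean of daily traffic volumes over the year, and sample-derived estimates such as these are typically scored by their percent error against ground-truth counts. A minimal sketch with hypothetical one-week counts (a full AADT uses 365 days; the week here just keeps the example short):

```python
# AADT = mean of daily traffic volumes; estimate accuracy is scored by
# absolute percent error against ground-truth counter data.
# All counts below are hypothetical.

def aadt(daily_volumes):
    """Average daily traffic over the counted period."""
    return sum(daily_volumes) / len(daily_volumes)

def abs_pct_error(estimate, ground_truth):
    """Absolute percent error of an estimate vs. ground truth."""
    return abs(estimate - ground_truth) / ground_truth * 100.0

truth_daily = [12000, 12500, 11800, 13000, 12200, 9000, 8500]  # permanent counter
estimated   = [11500, 12900, 11200, 13600, 12800, 8700, 8900]  # GPS-sample estimate

err = abs_pct_error(aadt(estimated), aadt(truth_daily))
print(f"estimated AADT error: {err:.1f}%")
```

Note that daily errors partly cancel in the average, which is why AADT estimates can score well even when individual days are off.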

  15. Quantification of Protozoa and Viruses from Small Water Volumes

    Directory of Open Access Journals (Sweden)

    J. Alfredo Bonilla

    2015-06-01

    Full Text Available Large sample volumes are traditionally required for the analysis of waterborne pathogens. The need for large volumes greatly limits the number of samples that can be processed. The aims of this study were to compare extraction and detection procedures for quantifying protozoan parasites and viruses from small volumes of marine water. The intent was to evaluate a logistically simpler method of sample collection and processing that would facilitate direct pathogen measures as part of routine monitoring programs. Samples were collected simultaneously using a bilayer device, with protozoa captured by size (top filter) and viruses captured by charge (bottom filter). Protozoan detection technologies utilized for recovery of Cryptosporidium spp. and Giardia spp. were qPCR and the more traditional immunomagnetic separation with IFA microscopy, while virus (poliovirus) detection was based upon qPCR versus plaque assay. Filters were eluted using reagents consistent with the downstream detection technologies. Results showed higher mean recoveries using traditional detection methods over qPCR for Cryptosporidium (91% vs. 45%) and poliovirus (67% vs. 55%), whereas for Giardia the qPCR-based methods were characterized by higher mean recoveries (41% vs. 28%). Overall mean recoveries are considered high for all detection technologies. Results suggest that simultaneous filtration may be suitable for isolating different classes of pathogens from small marine water volumes. More research is needed to evaluate the suitability of this method for detecting pathogens at low ambient concentration levels.
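The recovery percentages above follow the usual spiking definition: recovery = (quantity detected / quantity spiked) × 100. A minimal sketch (counts chosen to match the reported Cryptosporidium means, for illustration only):

```python
# Percent recovery for a spiked sample: detected / spiked * 100.
# Counts below are hypothetical, chosen to match the reported mean recoveries.

def percent_recovery(detected, spiked):
    """Recovery of a spiked quantity, as a percentage."""
    return detected / spiked * 100.0

spiked_oocysts = 100   # Cryptosporidium oocysts spiked into the sample
detected_ifa = 91      # recovered by IMS-IFA microscopy (hypothetical count)
detected_qpcr = 45     # recovered by qPCR (hypothetical count)

print(f"IFA recovery:  {percent_recovery(detected_ifa, spiked_oocysts):.0f}%")
print(f"qPCR recovery: {percent_recovery(detected_qpcr, spiked_oocysts):.0f}%")
```

In practice the spiked quantity is itself enumerated (e.g., by flow cytometry), so the denominator carries its own uncertainty.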

  16. Fan-beam scanning laser optical computed tomography for large volume dosimetry

    Science.gov (United States)

    Dekker, K. H.; Battista, J. J.; Jordan, K. J.

    2017-05-01

    A prototype scanning-laser fan beam optical CT scanner is reported which is capable of high resolution, large volume dosimetry with reasonable scan time. An acylindrical, asymmetric aquarium design is presented which serves to 1) generate parallel-beam scan geometry, 2) focus light towards a small acceptance angle detector, and 3) avoid interference fringe-related artifacts. Preliminary experiments with uniform solution phantoms (11 and 15 cm diameter) and finger phantoms (13.5 mm diameter FEP tubing) demonstrate that the design allows accurate optical CT imaging, with optical CT measurements agreeing within 3% of independent Beer-Lambert law calculations.
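The "independent Beer-Lambert law calculations" used for validation follow I = I0·exp(−µL) for a uniform solution, so the expected transmission, and the attenuation coefficient recovered from a measurement, can be computed directly. A minimal sketch with a hypothetical attenuation coefficient and the 11 cm phantom diameter from the abstract:

```python
# Beer-Lambert check for a uniform solution phantom:
# transmission I/I0 = exp(-mu * L); inverting gives mu = -ln(I/I0) / L.
# The attenuation coefficient below is hypothetical.
import math

def transmission(mu_per_cm, path_cm):
    """Fraction of light transmitted through a uniform attenuating solution."""
    return math.exp(-mu_per_cm * path_cm)

def mu_from_transmission(t, path_cm):
    """Invert Beer-Lambert: mu = -ln(I/I0) / L."""
    return -math.log(t) / path_cm

mu = 0.05   # cm^-1, hypothetical dye solution
L = 11.0    # cm, phantom diameter from the abstract
t = transmission(mu, L)
print(f"expected transmission through {L} cm: {t:.3f}")
print(f"recovered mu: {mu_from_transmission(t, L):.3f} cm^-1")
```

The "within 3%" agreement quoted above refers to comparing optical-CT-reconstructed µ values against this kind of direct calculation.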

  18. Diversity in the stellar velocity dispersion profiles of a large sample of brightest cluster galaxies z ≤ 0.3

    Science.gov (United States)

    Loubser, S. I.; Hoekstra, H.; Babul, A.; O'Sullivan, E.

    2018-06-01

    We analyse spatially resolved deep optical spectroscopy of brightest cluster galaxies (BCGs) located in 32 massive clusters with redshifts of 0.05 ≤ z ≤ 0.30 to investigate their velocity dispersion profiles. We compare these measurements to those of other massive early-type galaxies, as well as central group galaxies, where relevant. This unique, large sample extends to the most extreme of massive galaxies, spanning MK between -25.7 and -27.8 mag, and host cluster halo mass M500 up to 1.7 × 10^15 M⊙. To compare the kinematic properties between brightest group and cluster members, we analyse similar spatially resolved long-slit spectroscopy for 23 nearby brightest group galaxies (BGGs) from the Complete Local-Volume Groups Sample. We find a surprisingly large variety in velocity dispersion slopes for BCGs, with a significantly larger fraction of positive slopes, unique compared to other (non-central) early-type galaxies as well as the majority of the brightest members of the groups. We find that the velocity dispersion slopes of the BCGs and BGGs correlate with the luminosity of the galaxies, and we quantify this correlation. It is not clear whether the full diversity in velocity dispersion slopes that we see is reproduced in simulations.
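A velocity dispersion "slope" is commonly quantified by fitting σ(R) ∝ R^η, i.e. a straight line in log-log space, with η > 0 a rising profile and η < 0 a falling one. A minimal least-squares sketch on synthetic data (the power-law form and the numbers are illustrative assumptions, not the paper's fitting procedure):

```python
# Fit the slope eta of sigma(R) ~ R**eta by least squares in log-log space.
# Synthetic data below; a rising profile with eta = 0.1 is recovered exactly.
import math

def fit_loglog_slope(radii, sigmas):
    """Least-squares slope of log(sigma) vs log(R)."""
    xs = [math.log(r) for r in radii]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

radii = [1.0, 2.0, 4.0, 8.0, 16.0]            # kpc, hypothetical
sigmas = [300.0 * r ** 0.1 for r in radii]    # synthetic rising profile, eta = 0.1
print(f"fitted slope eta = {fit_loglog_slope(radii, sigmas):.3f}")
```

With real spectra each σ(R) point carries a measurement error, so a weighted fit (and bootstrapped slope uncertainties) would replace the plain least squares shown here.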

  19. The impact of large tidal volume ventilation on the absorption of inhaled insulin in rabbits

    DEFF Research Database (Denmark)

    Petersen, Astrid Heide; Laursen, Torben; Ahrén, Bo

    2007-01-01

    Previous studies have shown that ventilation patterns affect absorption of inhaled compounds. Thus, the aim of this study was to investigate the effect of large tidal volume ventilation (LTVV) on the absorption of inhaled insulin in rabbits. Mechanically ventilated rabbits were given human insulin...

  20. Double sampling with multiple imputation to answer large sample meta-research questions: Introduction and illustration by evaluating adherence to two simple CONSORT guidelines

    Directory of Open Access Journals (Sweden)

    Patrice L. Capers

    2015-03-01

    Full Text Available BACKGROUND: Meta-research can involve manual retrieval and evaluation of research, which is resource-intensive. Creation of high-throughput methods (e.g., search heuristics, crowdsourcing) has improved the feasibility of large meta-research questions, but possibly at the cost of accuracy. OBJECTIVE: To evaluate the use of double sampling combined with multiple imputation (DS+MI) to address meta-research questions, using as an example the adherence of PubMed entries to two simple Consolidated Standards of Reporting Trials (CONSORT) guidelines for titles and abstracts. METHODS: For the DS large sample, we retrieved all PubMed entries satisfying the filters: RCT; human; abstract available; and English language (n=322,107). For the DS subsample, we randomly sampled 500 entries from the large sample. The large sample was evaluated with a lower rigor, higher throughput (RLOTHI) method using search heuristics, while the subsample was evaluated using a higher rigor, lower throughput (RHITLO) human rating method. Multiple imputation of the missing-completely-at-random RHITLO data for the large sample was informed by: RHITLO data from the subsample; RLOTHI data from the large sample; whether a study was an RCT; and country and year of publication. RESULTS: The RHITLO and RLOTHI methods in the subsample largely agreed (phi coefficients: title=1.00, abstract=0.92). Compliance with abstract and title criteria has increased over time, with non-US countries improving more rapidly. DS+MI logistic regression estimates were more precise than subsample estimates (e.g., 95% CI for change in title and abstract compliance by year: subsample RHITLO 1.050-1.174 vs. DS+MI 1.082-1.151). As evidence of improved accuracy, DS+MI coefficient estimates were closer to RHITLO than the large-sample RLOTHI. CONCLUSIONS: Our results support our hypothesis that DS+MI would result in improved precision and accuracy. 
This method is flexible and may provide a practical way to examine large corpora of
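The DS+MI pipeline can be illustrated with a deliberately simplified sketch (all data below are hypothetical, and a single-predictor imputation model stands in for the paper's full logistic model with country, year, and RCT status): estimate P(gold rating | heuristic rating) from the doubly rated subsample, impute gold ratings for the large sample m times, and pool the resulting estimates.

```python
import random

def ds_mi_estimate(sub_gold, sub_heur, large_heur, m=20, rng=None):
    """Double sampling + multiple imputation, simplified.

    sub_gold, sub_heur -- high-rigor and heuristic ratings on the subsample
    large_heur         -- heuristic ratings only, on the large sample
    m                  -- number of imputations to pool
    """
    rng = rng or random.Random(0)
    # Imputation model: P(gold = 1 | heuristic = h), fit on the subsample.
    p = {}
    for h in (0, 1):
        matched = [g for g, hh in zip(sub_gold, sub_heur) if hh == h]
        p[h] = sum(matched) / len(matched)
    # Impute the missing gold ratings m times and average the proportions.
    estimates = []
    for _ in range(m):
        imputed = [1 if rng.random() < p[h] else 0 for h in large_heur]
        estimates.append(sum(imputed) / len(imputed))
    return sum(estimates) / m

# Hypothetical data: heuristic agrees with the gold rating 90% of the time.
sub_heur = [1] * 10 + [0] * 10
sub_gold = [1] * 9 + [0] + [0] * 9 + [1]
large_heur = [1] * 500 + [0] * 500
est = ds_mi_estimate(sub_gold, sub_heur, large_heur)  # ≈ 0.5
```

A real analysis would also propagate between-imputation variance (Rubin's rules), which is where the precision gains reported above come from.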

  1. Dark Radiation predictions from general Large Volume Scenarios

    Science.gov (United States)

    Hebecker, Arthur; Mangat, Patrick; Rompineve, Fabrizio; Witkowski, Lukas T.

    2014-09-01

    Recent observations constrain the amount of Dark Radiation (Δ N eff ) and may even hint towards a non-zero value of Δ N eff . It is by now well-known that this puts stringent constraints on the sequestered Large Volume Scenario (LVS), i.e. on LVS realisations with the Standard Model at a singularity. We go beyond this setting by considering LVS models where SM fields are realised on 7-branes in the geometric regime. As we argue, this naturally goes together with high-scale supersymmetry. The abundance of Dark Radiation is determined by the competition between the decay of the lightest modulus to axions, to the SM Higgs and to gauge fields, and leads to strict constraints on these models. Nevertheless, these constructions can in principle meet current DR bounds due to decays into gauge bosons alone. Further, a rather robust prediction for a substantial amount of Dark Radiation can be made. This applies both to cases where the SM 4-cycles are stabilised by D-terms and are small `by accident', i.e. tuning, as well as to fibred models with the small cycles stabilised by loops. In these constructions the DR axion and the QCD axion are the same field and we require a tuning of the initial misalignment to avoid Dark Matter overproduction. Furthermore, we analyse a closely related setting where the SM lives at a singularity but couples to the volume modulus through flavour branes. We conclude that some of the most natural LVS settings with natural values of model parameters lead to Dark Radiation predictions just below the present observational limits. Barring a discovery, rather modest improvements of present Dark Radiation bounds can rule out many of these most simple and generic variants of the LVS.

  2. Analysis of polycyclic aromatic hydrocarbons in water and beverages using membrane-assisted solvent extraction in combination with large volume injection-gas chromatography-mass spectrometric detection.

    Science.gov (United States)

    Rodil, Rosario; Schellin, Manuela; Popp, Peter

    2007-09-07

    Membrane-assisted solvent extraction (MASE) in combination with large volume injection-gas chromatography-mass spectrometry (LVI-GC-MS) was applied for the determination of 16 polycyclic aromatic hydrocarbons (PAHs) in aqueous samples. The MASE conditions were optimized for achieving high enrichment of the analytes from aqueous samples, in terms of extraction conditions (shaking speed, extraction temperature and time), extraction solvent and composition (ionic strength, sample pH and presence of organic solvent). Parameters like linearity and reproducibility of the procedure were determined. The extraction efficiency was above 65% for all the analytes and the relative standard deviation (RSD) for five consecutive extractions ranged from 6 to 18%. At optimized conditions detection limits at the ng/L level were achieved. The effectiveness of the method was tested by analyzing real samples, such as river water, apple juice, red wine and milk.

  3. Computational methods and modeling. 1. Sampling a Position Uniformly in a Trilinear Hexahedral Volume

    International Nuclear Information System (INIS)

    Urbatsch, Todd J.; Evans, Thomas M.; Hughes, H. Grady

    2001-01-01

    Monte Carlo particle transport plays an important role in some multi-physics simulations. These simulations, which may additionally involve deterministic calculations, typically use a hexahedral or tetrahedral mesh. Trilinear hexahedrons are attractive for physics calculations because faces between cells are uniquely defined, distance-to-boundary calculations are deterministic, and hexahedral meshes tend to require fewer cells than tetrahedral meshes. We discuss one aspect of Monte Carlo transport: sampling a position in a tri-linear hexahedron, which is made up of eight control points, or nodes, and six bilinear faces, where each face is defined by four non-coplanar nodes in three-dimensional Cartesian space. We derive, code, and verify the exact sampling method and propose an approximation to it. Our proposed approximate method uses about one-third the memory and can be twice as fast as the exact sampling method, but we find that its inaccuracy limits its use to well-behaved hexahedrons. Daunted by the expense of the exact method, we propose an alternate approximate sampling method. First, calculate beforehand an approximate volume for each corner of the hexahedron by taking one-eighth of the volume of an imaginary parallelepiped defined by the corner node and the three nodes to which it is directly connected. For the sampling, assume separability in the parameters, and sample each parameter, in turn, from a linear pdf defined by the sum of the four corner volumes at each limit (-1 and 1) of the parameter. This method ignores the quadratic portion of the pdf, but it requires less storage, has simpler sampling, and needs no extra, on-the-fly calculations. We simplify verification by designing tests that consist of one or more cells that entirely fill a unit cube. Uniformly sampling complicated cells that fill a unit cube will result in uniformly sampling the unit cube. Unit cubes are easily analyzed. 
The first problem has four wedges (or tents, or A frames) whose
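The approximate sampling method described above is concrete enough to sketch. Below is a minimal Python illustration; the node-labelling convention (corners keyed by signs in {-1, 1}^3) and all names are my own, not the authors' implementation. For a unit cube the eight corner volumes are equal, so the scheme reduces to uniform sampling, which makes it easy to check.

```python
import numpy as np

rng = np.random.default_rng(0)

def corner_volumes(nodes):
    # Approximate corner volume: one-eighth of the parallelepiped spanned by
    # the three edges leaving that corner, as described in the abstract.
    vols = {}
    for (su, sv, sw), p in nodes.items():
        e1 = nodes[(-su, sv, sw)] - p
        e2 = nodes[(su, -sv, sw)] - p
        e3 = nodes[(su, sv, -sw)] - p
        vols[(su, sv, sw)] = abs(np.linalg.det(np.column_stack([e1, e2, e3]))) / 8.0
    return vols

def sample_linear(a, b):
    # Sample t in [-1, 1] from a linear pdf with weight a at t=-1 and b at t=+1,
    # as a mixture of the two triangle densities (1-t)/2 and (1+t)/2.
    if rng.random() < a / (a + b):
        return 1.0 - 2.0 * np.sqrt(rng.random())   # triangle peaked at -1
    return 2.0 * np.sqrt(rng.random()) - 1.0        # triangle peaked at +1

def sample_position(nodes):
    # Assume separability: sample each parameter from the linear pdf defined
    # by the summed corner volumes at its -1 and +1 limits.
    vols = corner_volumes(nodes)
    t = []
    for axis in range(3):
        lo = sum(v for s, v in vols.items() if s[axis] == -1)
        hi = sum(v for s, v in vols.items() if s[axis] == +1)
        t.append(sample_linear(lo, hi))
    u, v, w = t
    # Map the parameters to a position by trilinear interpolation of the nodes.
    pos = np.zeros(3)
    for (su, sv, sw), p in nodes.items():
        pos += p * (1 + su * u) * (1 + sv * v) * (1 + sw * w) / 8.0
    return pos

# Verification in the spirit of the abstract: a hexahedron filling a unit cube
# should be sampled uniformly.
signs = [(a, b, c) for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)]
nodes = {s: np.array([(s[0] + 1) / 2, (s[1] + 1) / 2, (s[2] + 1) / 2]) for s in signs}
pts = np.array([sample_position(nodes) for _ in range(2000)])
```

Note this sketch implements only the approximate (linear-pdf) method; the exact method additionally accounts for the quadratic part of the per-parameter pdf.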

  4. Optimization of Large Volume Injection for Improved Detection of Polycyclic Aromatic Hydrocarbons (PAH) in Mussels

    DEFF Research Database (Denmark)

    Duedahl-Olesen, Lene; Ghorbani, Faranak

    2008-01-01

    Detection of PAH of six benzene rings is somewhat troublesome and lowering the limits of detection (LODs) for these compounds in food is necessary. For this purpose, we optimized a Programmable-Temperature-Vaporisation (PTV) injection with Large Volume Injection (LVI) with regard to the GC-MS det...

  5. Forced transport of thermal energy in magmatic and phreatomagmatic large volume ignimbrites: Paleomagnetic evidence from the Colli Albani volcano, Italy

    Science.gov (United States)

    Trolese, Matteo; Giordano, Guido; Cifelli, Francesca; Winkler, Aldo; Mattei, Massimo

    2017-11-01

    Few studies have detailed the thermal architecture of large-volume pyroclastic density current deposits, although such work has a clear importance for understanding the dynamics of eruptions of this magnitude. Here we examine the temperature of emplacement of large-volume caldera-forming ignimbrites related to magmatic and phreatomagmatic eruptions at the Colli Albani volcano, Italy, by using thermal remanent magnetization analysis on both lithic and juvenile clasts. Results show that all the magmatic ignimbrites were deposited at high temperature, between the maximum blocking temperature of the magnetic carrier (600-630 °C) and the glass transition temperature (about 710 °C). Temperature estimations for the phreatomagmatic ignimbrite range between 200 and 400 °C, with most of the clasts emplaced between 200 and 320 °C. Because all the investigated ignimbrites, magmatic and phreatomagmatic, share similar magma composition, volume and mobility, we attribute the temperature difference to magma-water interaction, highlighting its pronounced impact on thermal dissipation, even in large-volume eruptions. The homogeneity of the deposit temperature of each ignimbrite across its areal extent, which is maintained across topographic barriers, suggests that these systems are thermodynamically isolated from the external environment for several tens of kilometers. Based on these findings, we propose that these large-volume ignimbrites are dominated by the mass flux, which forces the lateral transport of mass, momentum, and thermal energy for distances up to tens of kilometers away from the vent. We conclude that spatial variation of the emplacement temperature can be used as a proxy for determining the degree of forced-convection flow.

  6. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Science.gov (United States)

    Burckhardt, Bjoern B.; Laeer, Stephanie

    2015-01-01

    In the USA and Europe, medicines agencies force the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss permitted in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects, which hinder precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. Challenges faced, advances, and experiences in solid-phase extraction are presented using the bioanalytical method development and validation of low-volume samples (50 μL serum) as an example. Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers. PMID:25873972

  7. Sample Preparation and Extraction in Small Sample Volumes Suitable for Pediatric Clinical Studies: Challenges, Advances, and Experiences of a Bioanalytical HPLC-MS/MS Method Validation Using Enalapril and Enalaprilat

    Directory of Open Access Journals (Sweden)

    Bjoern B. Burckhardt

    2015-01-01

    Full Text Available In the USA and Europe, medicines agencies force the development of child-appropriate medications and intend to increase the availability of information on pediatric use. This calls for bioanalytical methods that can deal with small sample volumes, as the trial-related blood loss permitted in children is very restricted. The broadly used HPLC-MS/MS, while able to cope with small volumes, is susceptible to matrix effects, which hinder precise drug quantification by, for example, causing signal suppression. Sophisticated sample preparation and purification utilizing solid-phase extraction was applied to reduce and control matrix effects. A scale-up from a vacuum manifold to a positive-pressure manifold was conducted to meet the demands of high throughput within a clinical setting. Challenges faced, advances, and experiences in solid-phase extraction are presented using the bioanalytical method development and validation of low-volume samples (50 μL serum) as an example. Enalapril, enalaprilat, and benazepril served as sample drugs. The applied sample preparation and extraction successfully reduced the absolute and relative matrix effects to comply with international guidelines. Recoveries ranged from 77 to 104% for enalapril and from 93 to 118% for enalaprilat. The bioanalytical method, comprising sample extraction by solid-phase extraction, was fully validated according to FDA and EMA bioanalytical guidelines and was used in a Phase I study in 24 volunteers.

  8. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
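This kind of optimisation is typically done with Neyman allocation: the total number of measurements is set by the target standard error of the regional mean, and is split across house categories in proportion to stratum weight times the pilot-study standard deviation. A minimal sketch with hypothetical numbers (not the Norwegian survey's strata), ignoring the finite-population correction:

```python
import math

def neyman_allocation(weights, sigmas, target_se):
    """Total sample size and per-stratum allocation for a stratified mean.

    weights   -- population fraction of each house category (sums to 1)
    sigmas    -- pilot-study standard deviations of dose rate per category
    target_se -- desired standard error of the estimated regional mean
    """
    s = sum(w * sig for w, sig in zip(weights, sigmas))
    # Under Neyman allocation, Var(mean) = (sum_h W_h * sigma_h)^2 / n.
    n_total = math.ceil((s / target_se) ** 2)
    n_h = [n_total * w * sig / s for w, sig in zip(weights, sigmas)]
    return n_total, n_h

# Hypothetical pilot results for three house categories:
n, alloc = neyman_allocation([0.5, 0.3, 0.2], [10.0, 20.0, 30.0], target_se=1.0)
# → n = 289, allocation [85.0, 102.0, 102.0]
```

Tightening the predetermined precision level by a factor of two quadruples the required number of measurements, which is why the pilot-project variance estimates matter.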

  9. Tracer techniques for urine volume determination and urine collection and sampling back-up system

    Science.gov (United States)

    Ramirez, R. V.

    1971-01-01

    The feasibility, functionality, and overall accuracy of using lithium as a chemical tracer in urine were investigated as a means of indirectly determining total urine volume by atomic absorption spectrophotometry. Experiments were conducted to investigate the parameters of instrumentation, tracer concentration, mixing times, and methods for incorporating the tracer material in the urine collection bag, and to refine and optimize the urine tracer technique to comply with the Skylab scheme and its operational parameters of ±2% volume error and ±1% accuracy in the amount of tracer added to each container. In addition, a back-up method for the urine collection and sampling system was developed and evaluated. This back-up method incorporates the tracer technique for volume determination in the event of failure of the primary urine collection and preservation system. One chemical preservative was selected and evaluated as a contingency preservative for the storage of urine in the event of failure of the urine cooling system.

  10. Investigation of a large volume negative hydrogen ion source

    International Nuclear Information System (INIS)

    Courteille, C.; Bruneteau, A.M.; Bacal, M.

    1995-01-01

    The electron and negative ion densities and temperatures are reported for a large volume hybrid multicusp negative ion source. Based on the scaling laws an analysis is made of the plasma formation and loss processes. It is shown that the positive ions are predominantly lost to the walls, although the observed scaling law is n+ ∝ I_d^0.57. However, the total plasma loss scales linearly with the discharge current, in agreement with the theoretical model. The negative ion formation and loss is also discussed. It is shown that at low pressure (1 mTorr) the negative ion wall loss becomes a significant part of the total loss. The dependence of n-/n_e versus the electron temperature is reported. When the negative ion wall loss is negligible, all the data on n-/n_e versus the electron temperature fit a single curve. copyright 1995 American Institute of Physics

  11. Predicting Volume and Biomass Change from Multi-Temporal Lidar Sampling and Remeasured Field Inventory Data in Panther Creek Watershed, Oregon, USA

    Directory of Open Access Journals (Sweden)

    Krishna P. Poudel

    2018-01-01

    Full Text Available Using lidar for large-scale forest management can improve operational and management decisions. Using multi-temporal lidar sampling and remeasured field inventory data collected from 78 plots in the Panther Creek Watershed, Oregon, USA, we evaluated the performance of different fixed and mixed models in estimating change in aboveground biomass (ΔAGB) and cubic volume including top and stump (ΔCVTS) over a five-year period. Actual values of CVTS and AGB were obtained using newly fitted volume and biomass equations or the equations used by the Pacific Northwest unit of the Forest Inventory and Analysis program. Estimates of change based on fixed and mixed-effect linear models were more accurate than change estimates based on differences in lidar-based estimates. This may have been due to the compounding of errors in lidar-based estimates over the two time periods. Models used to predict volume and biomass at a given time were, however, more precise than the models used to predict change. Models used to estimate ΔCVTS were not as accurate as the models employed to estimate ΔAGB. Final models had cross-validation root mean squared errors as low as 40.90% for ΔAGB and 54.36% for ΔCVTS.

  12. No evidence of a threshold in traffic volume affecting road-kill mortality at a large spatio-temporal scale

    Energy Technology Data Exchange (ETDEWEB)

    Grilo, Clara, E-mail: clarabentesgrilo@gmail.com [Departamento de Biología de la Conservación, Estación Biológica de Doñana (EBD-CSIC), Calle Américo Vespucio s/n, E-41092 Sevilla (Spain); Centro Brasileiro de Estudos em Ecologia de Estradas, Departamento de Biologia, Universidade Federal de Lavras, Campus Universitário, 37200-000 Lavras, Minas Gerais (Brazil); Ferreira, Flavio Zanchetta; Revilla, Eloy [Departamento de Biología de la Conservación, Estación Biológica de Doñana (EBD-CSIC), Calle Américo Vespucio s/n, E-41092 Sevilla (Spain)

    2015-11-15

    Previous studies have found that the relationship between wildlife road mortality and traffic volume follows a threshold effect on low traffic volume roads. We aimed at evaluating the response of several species to increasing traffic intensity on highways over a large geographic area and temporal period. We used data of four terrestrial vertebrate species with different biological and ecological features known by their high road-kill rates: the barn owl (Tyto alba), hedgehog (Erinaceus europaeus), red fox (Vulpes vulpes) and European rabbit (Oryctolagus cuniculus). Additionally, we checked whether road-kill likelihood varies when traffic patterns depart from the average. We used annual average daily traffic (AADT) and road-kill records observed along 1000 km of highways in Portugal over seven consecutive years (2003–2009). We fitted candidate models using Generalized Linear Models with a binomial distribution through a sample unit of 1 km segments to describe the effect of traffic on the probability of finding at least one victim in each segment during the study. We also assigned for each road-kill record the traffic of that day and the AADT on that year to test for differences using Paired Student's t-test. Mortality risk declined significantly with traffic volume but varied among species: the probability of finding road-killed red foxes and rabbits occurs up to moderate traffic volumes (< 20,000 AADT) whereas barn owls and hedgehogs occurred up to higher traffic volumes (40,000 AADT). Perception of risk may explain differences in responses towards high traffic highway segments. Road-kill rates did not vary significantly when traffic intensity departed from the average. In summary, we did not find evidence of traffic thresholds for the analysed species and traffic intensities. We suggest mitigation measures to reduce mortality be applied in particular on low traffic roads (< 5000 AADT) while additional measures to reduce barrier effects should take into

  13. No evidence of a threshold in traffic volume affecting road-kill mortality at a large spatio-temporal scale

    International Nuclear Information System (INIS)

    Grilo, Clara; Ferreira, Flavio Zanchetta; Revilla, Eloy

    2015-01-01

    Previous studies have found that the relationship between wildlife road mortality and traffic volume follows a threshold effect on low traffic volume roads. We aimed at evaluating the response of several species to increasing traffic intensity on highways over a large geographic area and temporal period. We used data of four terrestrial vertebrate species with different biological and ecological features known by their high road-kill rates: the barn owl (Tyto alba), hedgehog (Erinaceus europaeus), red fox (Vulpes vulpes) and European rabbit (Oryctolagus cuniculus). Additionally, we checked whether road-kill likelihood varies when traffic patterns depart from the average. We used annual average daily traffic (AADT) and road-kill records observed along 1000 km of highways in Portugal over seven consecutive years (2003–2009). We fitted candidate models using Generalized Linear Models with a binomial distribution through a sample unit of 1 km segments to describe the effect of traffic on the probability of finding at least one victim in each segment during the study. We also assigned for each road-kill record the traffic of that day and the AADT on that year to test for differences using Paired Student's t-test. Mortality risk declined significantly with traffic volume but varied among species: the probability of finding road-killed red foxes and rabbits occurs up to moderate traffic volumes (< 20,000 AADT) whereas barn owls and hedgehogs occurred up to higher traffic volumes (40,000 AADT). Perception of risk may explain differences in responses towards high traffic highway segments. Road-kill rates did not vary significantly when traffic intensity departed from the average. In summary, we did not find evidence of traffic thresholds for the analysed species and traffic intensities. We suggest mitigation measures to reduce mortality be applied in particular on low traffic roads (< 5000 AADT) while additional measures to reduce barrier effects should take into

  14. Rapid determination of benzene derivatives in water samples by trace volume solvent DLLME prior to GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Diao, Chun Peng; Wei, Chao Hai; Feng, Chun Hua [South China Univ. of Technology, Guangzhou Higher Education Mega Center (China). College of Environmental Science and Engineering; Guangdong Regular Higher Education Institutions, Guangzhou (China). Key Lab. of Environmental Protection and Eco-Remediation

    2012-05-15

    An inexpensive, simple and environmentally friendly method based on dispersive liquid-liquid microextraction (DLLME) for rapid determination of benzene derivatives in water samples was proposed. A significant improvement of the DLLME procedure was achieved. A trace volume of ethyl acetate (60 μL) was exploited as the dispersion solvent instead of common ones such as methanol and acetone, whose volumes typically exceed 0.5 mL, and the organic solvent required in DLLME was thereby greatly reduced. Only 83 μL of organic solvent was consumed in the whole analytical process, and the preconcentration procedure took less than 10 min. The approach, coupled with a gas chromatograph-flame ionization detector, was proposed for the rapid determination of benzene, toluene, ethylbenzene and xylene isomers in water samples. Results showed that the proposed approach was an efficient method for rapid determination of benzene derivatives in aqueous samples. (orig.)

  15. Full-scale borehole sealing test in salt under simulated downhole conditions. Volume 2

    International Nuclear Information System (INIS)

    Scheetz, B.E.; Licastro, P.H.; Roy, D.M.

    1986-05-01

    Large-scale testing of the brine permeability of a salt/grout sample designed to simulate a borehole plug was conducted. The results of these tests showed that a quantity of fluid equivalent to a permeability of 3 microdarcys was collected during the course of the test. This flow rate was used to estimate the smooth-bore aperture. Details of this test were presented in Volume 1 of this report. This report, Volume 2, covers post-test characterization including a detailed study of the salt/grout interface, as well as determination of the physical/mechanical properties of grout samples molded at Terra Tek, Inc. at the time of the large-scale test. Additional studies include heat of hydration, radial stress, and longitudinal volume changes for an equivalent grout mixture.

  16. Sample preparation for large-scale bioanalytical studies based on liquid chromatographic techniques.

    Science.gov (United States)

    Medvedovici, Andrei; Bacalum, Elena; David, Victor

    2018-01-01

    The quality of the analytical data obtained in large-scale, long-term bioanalytical studies based on liquid chromatography depends on a number of experimental factors, including the choice of sample preparation method. This review discusses this tedious part of bioanalytical studies as applied to large-scale sample sets, using liquid chromatography coupled with different detector types as the core analytical technique. The main sample preparation methods covered are protein precipitation, liquid-liquid extraction, solid-phase extraction, derivatization, and their variants. They are discussed in terms of analytical performance, fields of application, advantages, and disadvantages. The cited literature mainly covers analytical achievements of the last decade, although several earlier papers that have become more valuable over time are also included. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Successful Large-volume Leukapheresis for Hematopoietic Stem Cell Collection in a Very-low-weight Brain Tumor Infant with Coagulopathy

    Directory of Open Access Journals (Sweden)

    Yu-Mei Liao

    2013-06-01

    Full Text Available Peripheral apheresis has become a safe procedure to collect hematopoietic stem cells, even in pediatric patients and donors. However, the apheresis procedure for small and sick children is more complicated due to difficult venous access, relatively large extracorporeal volume, toxicity of citrate, and unstable hemostasis. We report a small and sick child with refractory medulloblastoma, impaired liver function, and coagulopathy after several major cycles of cisplatin-based chemotherapy. She successfully received large-volume leukapheresis for hematopoietic stem cell collection, although the patient experienced severe coagulopathy during the procedures. Health care providers should be alert to this potential risk.

  18. Paraffin scintillator for radioassay of solid support samples

    International Nuclear Information System (INIS)

    Fujii, Haruo; Takiue, Makoto

    1989-01-01

    A new paraffin scintillator used for solid support sample counting has been proposed, and its composition and various characteristics are described. The solid support sample treated with this scintillator can be easily handled because of rigid sample conditions. This technique provides great advantages such as the elimination of a large volume of scintillator and little radioactive waste material by using an economical polyethylene bag instead of the conventional counting vial. (author)

  19. Galaxy redshift surveys with sparse sampling

    International Nuclear Information System (INIS)

    Chiang, Chi-Ting; Wullstein, Philipp; Komatsu, Eiichiro; Jee, Inh; Jeong, Donghui; Blanc, Guillermo A.; Ciardullo, Robin; Gronwall, Caryl; Hagen, Alex; Schneider, Donald P.; Drory, Niv; Fabricius, Maximilian; Landriau, Martin; Finkelstein, Steven; Jogee, Shardha; Cooper, Erin Mentuch; Tuttle, Sarah; Gebhardt, Karl; Hill, Gary J.

    2013-01-01

    Survey observations of the three-dimensional locations of galaxies are a powerful approach to measure the distribution of matter in the universe, which can be used to learn about the nature of dark energy, physics of inflation, neutrino masses, etc. A competitive survey, however, requires a large volume (e.g., V_survey ∼ 10 Gpc³) to be covered, and thus tends to be expensive. A "sparse sampling" method offers a more affordable solution to this problem: within a survey footprint covering a given survey volume, V_survey, we observe only a fraction of the volume. The distribution of observed regions should be chosen such that their separation is smaller than the length scale corresponding to the wavenumber of interest. Then one can recover the power spectrum of galaxies with precision expected for a survey covering a volume of V_survey (rather than the volume of the sum of observed regions) with the number density of galaxies given by the total number of observed galaxies divided by V_survey (rather than the number density of galaxies within an observed region). We find that regularly-spaced sampling yields an unbiased power spectrum with no window function effect, and deviations from regularly-spaced sampling, which are unavoidable in realistic surveys, introduce calculable window function effects and increase the uncertainties of the recovered power spectrum. On the other hand, we show that the two-point correlation function (pair counting) is not affected by sparse sampling. While we discuss the sparse sampling method within the context of the forthcoming Hobby-Eberly Telescope Dark Energy Experiment, the method is general and can be applied to other galaxy surveys.
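The volume bookkeeping in this abstract can be made concrete with a toy calculation (all numbers hypothetical): the effective number density, and hence the Poisson shot-noise power, are set by the full footprint volume, not by the observed regions.

```python
def effective_survey_params(n_galaxies, v_survey, v_observed):
    """Sparse-sampling bookkeeping as described in the abstract.

    The recovered power spectrum has the precision of a survey covering the
    FULL footprint volume v_survey, at number density n_galaxies / v_survey,
    rather than the observed volume at its (higher) local density.
    Volumes in Gpc^3.
    """
    n_eff = n_galaxies / v_survey        # effective number density
    shot_noise = 1.0 / n_eff             # Poisson shot-noise power, P_shot = 1/n
    n_local = n_galaxies / v_observed    # density inside the observed regions
    return n_eff, shot_noise, n_local

# Hypothetical survey: 10 Gpc^3 footprint, 20% of it observed, 10^6 galaxies.
n_eff, p_shot, n_local = effective_survey_params(1e6, 10.0, 2.0)
```

So observing one-fifth of the footprint dilutes the effective density fivefold relative to the local density, which is the price of the cheaper survey.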

  20. Use of nitrogen to remove solvent from through oven transfer adsorption desorption interface during analysis of polycyclic aromatic hydrocarbons by large volume injection in gas chromatography.

    Science.gov (United States)

    Aragón, Álvaro; Toledano, Rosa M; Cortés, José M; Vázquez, Ana M; Villén, Jesús

    2014-04-25

    The through oven transfer adsorption desorption (TOTAD) interface allows large volume injection (LVI) in gas chromatography and the on-line coupling of liquid chromatography and gas chromatography (LC-GC), enabling the LC step to be carried out in normal as well as in reversed phase. However, large amounts of helium, which is both expensive and scarce, are necessary for solvent elimination. We describe how slight modification of the interface and the operating mode allows nitrogen to be used during the solvent elimination steps. In order to evaluate the performance of the new system, volumes ranging from 20 to 100 μL of methanolic solutions of four polycyclic aromatic hydrocarbons (PAHs) were sampled. No significant differences were found in the repeatability and sensitivity of the analyses of standard PAH solutions when using nitrogen or helium. The performance using the proposed modification was similar and equally satisfactory when using nitrogen or helium for solvent elimination in the TOTAD interface. In conclusion, the use of nitrogen will make analyses less expensive. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Evaluation of the total uncertainty budget on a determined concentration value is important under a quality assurance programme. Concentration calculations in NAA are carried out by the relative NAA method or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  2. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
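The alias method itself is standard; a minimal sketch in Vose's formulation (generic Python, not the authors' MCNP subroutine) shows the O(n) table construction and the O(1) per-sample draw:

```python
import random

def build_alias(weights):
    # Vose's alias method: O(n) construction of two tables for a discrete pdf.
    n = len(weights)
    total = sum(weights)
    prob = [w * n / total for w in weights]        # scaled so the mean is 1
    small = [i for i, p in enumerate(prob) if p < 1.0]
    large = [i for i, p in enumerate(prob) if p >= 1.0]
    alias = [0] * n
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                               # donor column l tops up s
        prob[l] += prob[s] - 1.0
        (small if prob[l] < 1.0 else large).append(l)
    for i in small + large:                        # numerical leftovers
        prob[i] = 1.0
    return prob, alias

def draw(prob, alias, rng):
    # O(1) sample: pick a column uniformly, then keep it or take its alias.
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]

# Hypothetical voxel source strengths; draw 40,000 samples and tally.
rng = random.Random(0)
prob, alias = build_alias([1.0, 2.0, 3.0, 4.0])
counts = [0, 0, 0, 0]
for _ in range(40000):
    counts[draw(prob, alias, rng)] += 1
```

Source biasing as described above would amount to building the tables from biased weights and carrying the ratio of true to biased weight as a particle weight correction.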

  3. First results on material identification and imaging with a large-volume muon tomography prototype

    Energy Technology Data Exchange (ETDEWEB)

    Pesente, S. [INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy); Vanini, S. [University of Padova and INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy)], E-mail: sara.vanini@pd.infn.it; Benettoni, M. [INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy); Bonomi, G. [University of Brescia, via Branze 38, 25123 Brescia and INFN Sezione di Pavia, via Bassi 6, 27100 Pavia (Italy); Calvini, P. [University of Genova and INFN Sezione di Genova, via Dodecaneso 33, 16146 Genova (Italy); Checchia, P.; Conti, E.; Gonella, F.; Nebbia, G. [INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy); Squarcia, S. [University of Genova and INFN Sezione di Genova, via Dodecaneso 33, 16146 Genova (Italy); Viesti, G. [University of Padova and INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy); Zenoni, A. [University of Brescia, via Branze 38, 25123 Brescia and INFN Sezione di Pavia, via Bassi 6, 27100 Pavia (Italy); Zumerle, G. [University of Padova and INFN Sezione di Padova, via Marzolo 8, 35131 Padova (Italy)

    2009-06-11

    The muon tomography technique, based on the Multiple Coulomb Scattering of cosmic ray muons, has been proposed recently as a tool to perform non-destructive assays of large-volume objects without any radiation hazard. In this paper we discuss experimental results obtained with a scanning system prototype, assembled using two large-area CMS Muon Barrel drift chambers. The capability of the apparatus to produce 3D images of objects and to classify them according to their density is presented. We show that the absorption of low-momentum muons in the scanned objects produces an underestimate of their scattering density, making the discrimination of materials heavier than lead more difficult.

  4. First results on material identification and imaging with a large-volume muon tomography prototype

    Energy Technology Data Exchange (ETDEWEB)

    Viesti, G. [Dipartimento di Fisica, Universita di Padova, via Marzolo 8, I-35131 Padova (Italy); Pesente, S.; Benettoni, M.; Checchia, P.; Conti, E.; Gonella, F.; Nebbia, G. [INFN, Sez. di Padova, Via Marzolo 8, I-35131 Padova (Italy); Vanini, S.; Viesti, G.; Zumerle, G. [Dip. di Fisica G. Galilei, Universita di Padova, I-35131 Padova (Italy); INFN, Sez. di Padova, Via Marzolo 8, I-35131 Padova (Italy); Bonomi, G.; Zenoni, A. [Universita di Brescia, I-25133 Brescia (Italy); INFN, Sez. di Pavia, Via Valotti 9, I-25133 Brescia (Italy); Calvini, P.; Squarcia, S. [Dip. di Fisica, Universita di Genova, Genova (Italy); INFN, Sez. di Genova, Via Dodecaneso 33, I-16146 Genova (Italy)

    2009-07-01

    The muon tomography technique, based on the multiple Coulomb scattering of cosmic ray muons, has been proposed recently as a tool to perform non-destructive assays of large volume objects without any radiation hazard. In this paper we present the experimental results obtained with a scanning system prototype, assembled using two large area CMS Muon Barrel drift chambers. The imaging capability of the apparatus is shown, and the possibility to discriminate among different materials is discussed in a specific case of detecting lead objects inside a metal matrix. This specific case is dictated by a need in safely handling scrap metal cargoes in the steel industry. (authors)

  5. Colloids Versus Albumin in Large Volume Paracentesis to Prevent Circulatory Dysfunction: Evidence-based Case Report.

    Science.gov (United States)

    Widjaja, Felix F; Khairan, Paramita; Kamelia, Telly; Hasan, Irsan

    2016-04-01

    Large volume paracentesis may cause paracentesis induced circulatory dysfunction (PICD). Albumin is recommended to prevent this abnormality; however, albumin is expensive, and a cheaper alternative that prevents PICD would be valuable. This report aimed to compare albumin to colloids in preventing PICD. The search strategy used PubMed, Scopus, Proquest, and Academic Health Complete from EBSCO with the keywords "ascites", "albumin", "colloid", "dextran", "hydroxyethyl starch", "gelatin", and "paracentesis induced circulatory dysfunction". Articles were limited to randomized clinical trials and meta-analyses addressing the clinical question "In hepatic cirrhosis patients undergoing large volume paracentesis, are colloids similar to albumin in preventing PICD?". We found one meta-analysis and four randomized clinical trials (RCTs). The meta-analysis showed that albumin was still superior, with an odds ratio of 0.34 (0.23-0.51). Three RCTs showed the same result and one RCT showed albumin was not superior to colloids. We conclude that colloids cannot substitute for albumin to prevent PICD, but colloids still have a role in patients undergoing paracentesis of less than five liters.

  6. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, which included colour difference based metrics, gamut based metrics, memory based metrics as well as combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for the conditions where correlated colour temperatures differed.

  7. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as access for diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
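The Modbus RTU framing used on such an RS485 backbone is standardized: each frame is the unit address plus the protocol data unit, followed by a CRC-16 appended low byte first. The sketch below shows generic Modbus framing, not the LVPD control software itself; the function names are hypothetical:

```python
def crc16_modbus(data: bytes) -> int:
    """CRC-16/MODBUS: poly 0x8005 (reflected 0xA001), init 0xFFFF, no final XOR."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def build_rtu_frame(unit: int, function: int, payload: bytes) -> bytes:
    """Assemble a Modbus RTU frame: address + PDU + CRC (low byte first)."""
    pdu = bytes([unit, function]) + payload
    crc = crc16_modbus(pdu)
    return pdu + bytes([crc & 0xFF, crc >> 8])

# e.g. read 2 holding registers starting at address 0 from unit 1 (function 0x03)
frame = build_rtu_frame(1, 0x03, (0).to_bytes(2, "big") + (2).to_bytes(2, "big"))
```

Recomputing the CRC over a whole valid frame (message plus appended CRC) yields zero, which is the usual receive-side integrity check.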

  8. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain operational difficulties, such as access for diagnostics, manual control of subsystems, and the monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  9. Measurement of lung fluid volumes and albumin exclusion in sheep

    International Nuclear Information System (INIS)

    Pou, N.A.; Roselli, R.J.; Parker, R.E.; Clanton, J.A.; Harris, T.R.

    1989-01-01

    A radioactive tracer technique was used to determine interstitial diethylenetriaminepentaacetic acid (DTPA) and albumin distribution volume in sheep lungs. 125I- and/or 131I-labeled albumin were injected intravenously and allowed to equilibrate for 24 h. 99mTc-labeled DTPA and 51Cr-labeled erythrocytes were injected and allowed to equilibrate (2 h and 15 min, respectively) before a lethal dose of thiamylal sodium. Two biopsies (1-3 g) were taken from each lung and the remaining tissue was homogenized for wet-to-dry lung weight and volume calculations. Estimates of distribution volumes from whole lung homogenized samples were statistically smaller than biopsy samples for extravascular water, interstitial 99mTc-DTPA, and interstitial albumin. The mean fraction of the interstitium (Fe) which excludes albumin was 0.68 +/- 0.04 for whole lung samples compared with 0.62 +/- 0.03 for biopsy samples. Hematocrit may explain the consistent difference. To make the Fe for biopsy samples match that for homogenized samples, a mean hematocrit which was 82% of large vessel hematocrit was required. The excluded volume fraction for exogenous sheep albumin was compared with that of exogenous human albumin in two sheep, and no difference was found at 24 h

  10. Evaluation of the effects of insufficient blood volume samples on the performance of blood glucose self-test meters.

    Science.gov (United States)

    Pfützner, Andreas; Schipper, Christina; Ramljak, Sanja; Flacke, Frank; Sieber, Jochen; Forst, Thomas; Musholt, Petra B

    2013-11-01

    Accuracy of blood glucose readings is (among other things) dependent on the test strip being completely filled with sufficient sample volume. The devices are supposed to display an error message in case of incomplete filling. This laboratory study was performed to test the performance of 31 commercially available devices in case of incomplete strip filling. Samples with two different glucose levels (60-90 and 300-350 mg/dl) were used to generate three different sample volumes: 0.20 µl (too low volume for any device), 0.32 µl (borderline volume), and 1.20 µl (low but supposedly sufficient volume for all devices). After a point-of-care capillary reference measurement (StatStrip, NovaBiomedical), the meter strip was filled (6x) with the respective volume, and the response of the meters (two devices) was documented (72 determinations/meter type). Correct response was defined as either an error message indicating incomplete filling or a correct reading (±20% compared with reference reading). Only five meters showed 100% correct responses [BGStar and iBGStar (both Sanofi), ACCU-CHEK Compact+ and ACCU-CHEK Mobile (both Roche Diagnostics), OneTouch Verio (LifeScan)]. The majority of the meters (17) had up to 10% incorrect reactions [predominantly incorrect readings with sufficient volume; Precision Xceed and Xtra, FreeStyle Lite, and Freedom Lite (all Abbott); GlucoCard+ and GlucoMen GM (both Menarini); Contour, Contour USB, and Breeze2 (all Bayer); OneTouch Ultra Easy, Ultra 2, and Ultra Smart (all LifeScan); Wellion Dialog and Premium (both MedTrust); FineTouch (Terumo); ACCU-CHEK Aviva (Roche); and GlucoTalk (Axis-Shield)]. Ten percent to 20% incorrect reactions were seen with OneTouch Vita (LifeScan), ACCU-CHEK Aviva Nano (Roche), OmniTest+ (BBraun), and AlphaChek+ (Berger Med). More than 20% incorrect reactions were obtained with Pura (Ypsomed), GlucoCard Meter and GlucoMen LX (both Menarini), Elite (Bayer), and MediTouch (Medisana). In summary, partial and
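The study's pass/fail criterion, an error message indicating incomplete filling, or a reading within ±20% of the reference, can be expressed as a small decision function. This is an illustrative sketch; the function name and the "ERROR" sentinel are hypothetical, not from the study protocol:

```python
def classify_response(meter_result, reference_mgdl, tolerance=0.20):
    """Classify a meter's response to a possibly under-filled test strip.

    meter_result is either the string "ERROR" (an incomplete-fill message) or
    a numeric reading in mg/dl. A response counts as correct if the meter
    flags the error, or if its reading is within ±20% of the reference value.
    """
    if meter_result == "ERROR":
        return "correct"
    deviation = abs(meter_result - reference_mgdl) / reference_mgdl
    return "correct" if deviation <= tolerance else "incorrect"
```

Under this rule, a meter that silently returns 130 mg/dl against a 100 mg/dl reference after incomplete filling is scored as an incorrect reaction, while an error message is always acceptable.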

  11. Aerodynamics of the Large-Volume, Flow-Through Detector System. Final report

    International Nuclear Information System (INIS)

    Reed, H.; Saric, W.; Laananen, D.; Martinez, C.; Carrillo, R.; Myers, J.; Clevenger, D.

    1996-03-01

    The Large-Volume Flow-Through Detector System (LVFTDS) was designed to monitor alpha radiation from Pu, U, and Am in mixed-waste incinerator offgases; however, it can be adapted to other important monitoring uses that span a number of potential markets, including site remediation, indoor air quality, radon testing, and mine shaft monitoring. Goal of this effort was to provide mechanical design information for installation of LVFTDS in an incinerator, with emphasis on ability to withstand the high temperatures and high flow rates expected. The work was successfully carried out in three stages: calculation of pressure drop through the system, materials testing to determine surrogate materials for wind-tunnel testing, and wind-tunnel testing of an actual configuration

  12. Exploring Technostress: Results of a Large Sample Factor Analysis

    Directory of Open Access Journals (Sweden)

    Steponas Jonušauskas

    2016-06-01

    With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of a sample of 1013 individuals who use ICT in their everyday work were implemented in the research. Thirteen factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to analyze the respondents’ answers, revealing technostress causes and consequences as well as technostress prevalence in the population in a statistically validated pattern. The key elements of technostress identified by the factor analysis can serve for the construction of technostress measurement scales in further research.
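As a sketch of where a variance-explained figure like the 59.13 per cent above comes from, principal component analysis reduces to an eigendecomposition of the item covariance matrix. The data below are simulated, not the study's survey responses:

```python
import numpy as np

def explained_variance_ratio(X, n_components):
    """Share of total variance captured by the first n principal components."""
    Xc = X - X.mean(axis=0)                       # centre each item (column)
    cov = np.cov(Xc, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return eigvals[:n_components].sum() / eigvals.sum()

# e.g. 1013 respondents x 68 questionnaire items (simulated stand-in data)
rng = np.random.default_rng(0)
X = rng.normal(size=(1013, 68))
ratio = explained_variance_ratio(X, 13)
```

With real survey data the first components pick up correlated item clusters, so 13 components can explain far more variance than they would for uncorrelated noise.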

  13. Large volume recycling of oceanic lithosphere over short time scales: geochemical constraints from the Caribbean Large Igneous Province

    Science.gov (United States)

    Hauff, F.; Hoernle, K.; Tilton, G.; Graham, D. W.; Kerr, A. C.

    2000-01-01

    Oceanic flood basalts are poorly understood, short-term expressions of highly increased heat flux and mass flow within the convecting mantle. The uniqueness of the Caribbean Large Igneous Province (CLIP, 92-74 Ma) with respect to other Cretaceous oceanic plateaus is its extensive sub-aerial exposures, providing an excellent basis to investigate the temporal and compositional relationships within a starting plume head. We present major element, trace element and initial Sr-Nd-Pb isotope compositions of 40 extrusive rocks from the Caribbean Plateau, including onland sections in Costa Rica, Colombia and Curaçao as well as DSDP Sites in the Central Caribbean. Even though the lavas were erupted over an area of ~3×10^6 km^2, the majority have strikingly uniform incompatible element patterns (La/Yb = 0.96±0.16, n = 64 out of 79 samples, 2σ) and initial Nd-Pb isotopic compositions (e.g. (143Nd/144Nd)i = 0.51291±3, εNd(i) = 7.3±0.6, (206Pb/204Pb)i = 18.86±0.12, n = 54 out of 66, 2σ). Lavas with endmember compositions have only been sampled at the DSDP Sites, Gorgona Island (Colombia) and the 65-60 Ma accreted Quepos and Osa igneous complexes (Costa Rica) of the subsequent hotspot track. Despite the relatively uniform composition of most lavas, linear correlations exist between isotope ratios and between isotope and highly incompatible trace element ratios. The Sr-Nd-Pb isotope and trace element signatures of the chemically enriched lavas are compatible with derivation from recycled oceanic crust, while the depleted lavas are derived from a highly residual source. This source could represent either oceanic lithospheric mantle left after ocean crust formation or gabbros with interlayered ultramafic cumulates of the lower oceanic crust. 3He/4He ratios in olivines of enriched picrites at Quepos are ~12 times higher than the atmospheric ratio, suggesting that the enriched component may have once resided in the lower mantle. Evaluation of the Sm-Nd and U-Pb isotope systematics on

  14. Hippocampal volume is positively associated with behavioural inhibition (BIS) in a large community-based sample of mid-life adults: the PATH through life study.

    Science.gov (United States)

    Cherbuin, Nicolas; Windsor, Tim D; Anstey, Kaarin J; Maller, Jerome J; Meslin, Chantal; Sachdev, Perminder S

    2008-09-01

    The fields of personality research and neuropsychology have developed with very little overlap. Gray and McNaughton were among the first to recognize that personality traits must have neurobiological correlates and developed models relating personality factors to brain structures. Of particular note was their description of associations between conditioning, inhibition and activation of behaviours, and specific neural structures such as the hippocampus, amygdala and the prefrontal cortex. The aim of this study was to determine whether personality constructs representing the behavioural inhibition and activation systems (BIS/BAS) were associated with volumetric measures of the hippocampus and amygdala in humans. Amygdalar and hippocampal volumes were measured in 430 brain scans of cognitively intact community-based volunteers. Linear associations between brain volumes and the BIS/BAS measures were assessed using multiple regression, controlling for age, sex, education, intra-cranial and total brain volume. Results showed that hippocampal volumes were positively associated with BIS sensitivity and to a lesser extent with BAS sensitivity. No association was found between amygdalar volume and either the BIS or BAS. These findings add support to the model of Gray and McNaughton, which proposes a role of the hippocampus in the regulation of defensive/approach behaviours and trait anxiety but suggest an absence of associations between amygdala volume and BIS/BAS measures.
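The study's analysis, a linear association between a brain volume and a BIS score while controlling for covariates, amounts to multiple regression. A minimal sketch on synthetic data (all variable names and values below are simulated, not the PATH study's data):

```python
import numpy as np

def partial_association(y, x, covariates):
    """OLS coefficient of x on y, controlling for covariates (plus an intercept)."""
    n = len(y)
    design = np.column_stack([np.ones(n), x, covariates])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    return beta[1]  # coefficient on x

# Simulated stand-in: hippocampal volume (ml) vs. BIS score, controlling
# for age and total brain volume; the true simulated effect is 0.05 ml/unit.
rng = np.random.default_rng(1)
n = 430
age = rng.normal(50, 5, n)
tbv = rng.normal(1100, 80, n)
bis = rng.normal(0, 1, n)
hippocampus = 3.0 + 0.05 * bis + 0.002 * tbv + rng.normal(0, 0.1, n)

slope = partial_association(hippocampus, bis, np.column_stack([age, tbv]))
```

The recovered coefficient estimates the BIS-volume association net of the covariates, which is the quantity the study reports as "positively associated".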

  15. Evaluation of a laser scanner for large volume coordinate metrology: a comparison of results before and after factory calibration

    International Nuclear Information System (INIS)

    Ferrucci, M; Muralikrishnan, B; Sawyer, D; Phillips, S; Petrov, P; Yakovlev, Y; Astrelin, A; Milligan, S; Palmateer, J

    2014-01-01

    Large volume laser scanners are increasingly being used for a variety of dimensional metrology applications. Methods to evaluate the performance of these scanners are still under development and there are currently no documentary standards available. This paper describes the results of extensive ranging and volumetric performance tests conducted on a large volume laser scanner. The results demonstrated small but clear systematic errors that are explained in the context of a geometric error model for the instrument. The instrument was subsequently returned to the manufacturer for factory calibration. The ranging and volumetric tests were performed again and the results are compared against those obtained prior to the factory calibration. (paper)

  16. On the fairness of the main galaxy sample of SDSS

    International Nuclear Information System (INIS)

    Meng Kelai; Pan Jun; Feng Longlong; Ma Bin

    2011-01-01

    Flux-limited and volume-limited galaxy samples are constructed from the Sloan Digital Sky Survey (SDSS) data releases DR4, DR6 and DR7 for statistical analysis. The two-point correlation function ξ(s), the monopole of the three-point correlation function ζ0, the projected two-point correlation function wp and the pairwise velocity dispersion σ12 are measured to test whether the galaxy samples are fair for these statistics. We find that with the increasing sky coverage of subsequent data releases in SDSS, ξ(s) of the flux-limited sample is extremely robust and insensitive to local structures at low redshift. However, for volume-limited samples fainter than L* at large scales s ≳ 10 h^-1 Mpc, the deviation of ξ(s) between different SDSS data releases (DR7, DR6 and DR4) increases with increasing absolute magnitude. The case of ζ0(s) is similar to that of ξ(s). In the weakly nonlinear regime, there is no agreement between ζ0 of different data releases in all luminosity bins. Furthermore, wp of volume-limited samples of DR7 in luminosity bins fainter than M_r,0.1 = [-18.5, -19.5] is significantly larger, and σ12 of the two faintest volume-limited samples of DR7 displays a very different scale dependence than results from DR4 and DR6. Our findings call for caution in understanding clustering analysis results of SDSS faint galaxy samples and higher order statistics of SDSS volume-limited samples in the weakly nonlinear regime. The first zero-crossing points of ξ(s) from volume-limited samples are also investigated and discussed. (research papers)
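The two-point correlation function ξ(s) measured above is commonly estimated with the Landy-Szalay estimator, ξ = (DD − 2DR + RR)/RR, built from normalised pair counts of the data catalogue against a random catalogue. A minimal brute-force sketch on a small unclustered mock (illustrative only, not the SDSS pipeline, which uses tree-based pair counting and survey masks):

```python
import numpy as np

def pair_counts(a, b, bins):
    """Histogram of pairwise separations between point sets a and b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.histogram(d.ravel(), bins=bins)[0]

def landy_szalay(data, randoms, bins):
    """xi = (DD - 2DR + RR) / RR, pair counts normalised per (ordered) pair."""
    nd, nr = len(data), len(randoms)
    dd = pair_counts(data, data, bins) / (nd * (nd - 1))
    rr = pair_counts(randoms, randoms, bins) / (nr * (nr - 1))
    dr = pair_counts(data, randoms, bins) / (nd * nr)
    return (dd - 2 * dr + rr) / rr

rng = np.random.default_rng(2)
bins = np.linspace(1.0, 20.0, 6)
data = rng.uniform(0, 100, size=(400, 3))    # unclustered mock catalogue
randoms = rng.uniform(0, 100, size=(800, 3))
xi = landy_szalay(data, randoms, bins)       # fluctuates around 0 for random data
```

For a genuinely clustered catalogue, ξ(s) rises above zero on small scales and crosses zero at larger separations, which is the zero-crossing behaviour the abstract discusses.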

  17. Automated, feature-based image alignment for high-resolution imaging mass spectrometry of large biological samples

    NARCIS (Netherlands)

    Broersen, A.; Liere, van R.; Altelaar, A.F.M.; Heeren, R.M.A.; McDonnell, L.A.

    2008-01-01

    High-resolution imaging mass spectrometry of large biological samples is the goal of several research groups. In mosaic imaging, the most common method, the large sample is divided into a mosaic of small areas that are then analyzed with high resolution. Here we present an automated alignment

  18. Capillary ion chromatography with on-column focusing for ultra-trace analysis of methanesulfonate and inorganic anions in limited volume Antarctic ice core samples.

    Science.gov (United States)

    Rodriguez, Estrella Sanz; Poynter, Sam; Curran, Mark; Haddad, Paul R; Shellie, Robert A; Nesterenko, Pavel N; Paull, Brett

    2015-08-28

    Preservation of ionic species within Antarctic ice yields a unique proxy record of the Earth's climate history. Studies have until now focused on two proxies: the ionic components of sea salt aerosol and methanesulfonic acid. Measurement of all of the major ionic species in ice core samples is typically carried out by ion chromatography. Former methods, whilst providing suitable detection limits, have been based upon off-column preconcentration techniques, requiring larger sample volumes, with potential for sample contamination and/or carryover. Here, a new capillary ion chromatography based analytical method has been developed for quantitative analysis of limited volume Antarctic ice core samples. The developed analytical protocol applies capillary ion chromatography (with suppressed conductivity detection) and direct on-column sample injection and focusing, thus eliminating the requirement for off-column sample preconcentration. This limits the total sample volume needed to 300 μL per analysis, allowing for triplicate sample analysis. Application to composite ice-core samples is demonstrated, with coupling of the capillary ion chromatograph to high-resolution mass spectrometry used to confirm the presence and purity of the observed methanesulfonate peak. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. A rheo-optical apparatus for real time kinetic studies on shear-induced alignment of self-assembled soft matter with small sample volumes

    Science.gov (United States)

    Laiho, Ari; Ikkala, Olli

    2007-01-01

    In soft materials, self-assembled nanoscale structures can allow new functionalities but a general problem is to align such local structures aiming at monodomain overall order. In order to achieve shear alignment in a controlled manner, a novel type of rheo-optical apparatus has here been developed that allows small sample volumes and in situ monitoring of the alignment process during the shear. Both the amplitude and orientation angles of low level linear birefringence and dichroism are measured while the sample is subjected to large amplitude oscillatory shear flow. The apparatus is based on a commercial rheometer where we have constructed a flow cell that consists of two quartz teeth. The lower tooth can be set in oscillatory motion whereas the upper one is connected to the force transducers of the rheometer. A custom made cylindrical oven allows the operation of the flow cell at elevated temperatures up to 200 °C. Only a small sample volume is needed (from 9 to 25 mm³), which makes the apparatus suitable especially for studying new materials which are usually obtainable only in small quantities. Using this apparatus the flow alignment kinetics of a lamellar polystyrene-b-polyisoprene diblock copolymer is studied during shear under two different conditions which lead to parallel and perpendicular alignment of the lamellae. The open device geometry allows even combined optical/x-ray in situ characterization of the alignment process by combining small-angle x-ray scattering using concepts shown by Polushkin et al. [Macromolecules 36, 1421 (2003)].

  20. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    International Nuclear Information System (INIS)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K.

    2015-01-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  1. Characteristic Performance Evaluation of a new SAGe Well Detector for Small and Large Sample Geometries

    Energy Technology Data Exchange (ETDEWEB)

    Adekola, A.S.; Colaresi, J.; Douwen, J.; Jaederstroem, H.; Mueller, W.F.; Yocum, K.M.; Carmichael, K. [Canberra Industries Inc., 800 Research Parkway, Meriden, CT 06450 (United States)

    2015-07-01

    Environmental scientific research requires a detector that has sensitivity low enough to reveal the presence of any contaminant in the sample at a reasonable counting time. Canberra developed the germanium detector geometry called Small Anode Germanium (SAGe) Well detector, which is now available commercially. The SAGe Well detector is a new type of low capacitance germanium well detector manufactured using small anode technology capable of advancing many environmental scientific research applications. The performance of this detector has been evaluated for a range of sample sizes and geometries counted inside the well, and on the end cap of the detector. The detector has energy resolution performance similar to semi-planar detectors, and offers significant improvement over the existing coaxial and Well detectors. Energy resolution performance of 750 eV Full Width at Half Maximum (FWHM) at 122 keV γ-ray energy and resolution of 2.0 - 2.3 keV FWHM at 1332 keV γ-ray energy are guaranteed for detector volumes up to 425 cm³. The SAGe Well detector offers an optional 28 mm well diameter with the same energy resolution as the standard 16 mm well. Such outstanding resolution performance will benefit environmental applications in revealing the detailed radionuclide content of samples, particularly at low energy, and will enhance the detection sensitivity resulting in reduced counting time. The detector is compatible with electric coolers without any sacrifice in performance and supports the Canberra Mathematical efficiency calibration method (In situ Object Calibration Software or ISOCS, and Laboratory Source-less Calibration Software or LABSOCS). In addition, the SAGe Well detector supports true coincidence summing available in the ISOCS/LABSOCS framework. The improved resolution performance greatly enhances detection sensitivity of this new detector for a range of sample sizes and geometries counted inside the well. This results in lower minimum detectable

  2. Percutaneous Image-Guided Cryoablation of Challenging Mediastinal Lesions Using Large-Volume Hydrodissection: Technical Considerations and Outcomes

    Energy Technology Data Exchange (ETDEWEB)

    Garnon, Julien, E-mail: juliengarnon@gmail.com; Koch, Guillaume, E-mail: Guillaume.koch@gmail.com; Caudrelier, Jean, E-mail: caudjean@yahoo.fr [University Hospital of Strasbourg, Department of Interventional Radiology (France); Ramamurthy, Nitin, E-mail: Nitin-ramamurthy@hotmail.com [Norfolk and Norwich University Hospital, Department of Radiology (United Kingdom); Rao, Pramod, E-mail: pramodrao@me.com [University of Strasbourg, ICube (France); Tsoumakidou, Georgia, E-mail: Georgia.tsoumakidou@chru-strasbourg.fr; Cazzato, Roberto Luigi, E-mail: cazzatorobertoluigi@gmail.com; Gangi, Afshin, E-mail: Afshin.gangi@chru-strasbourg.fr [University Hospital of Strasbourg, Department of Interventional Radiology (France)

    2016-11-15

    Objective: This study was designed to describe the technique of percutaneous image-guided cryoablation with large-volume hydrodissection for the treatment of challenging mediastinal lesions. Methods: Between March 2014 and June 2015, three patients (mean age 62.7 years) with four neoplastic anterior mediastinal lesions underwent five cryoablation procedures using large-volume hydrodissection. Procedures were performed under general anaesthesia using CT guidance. Lesion characteristics, hydrodissection and cryoablation data, technical success, complications, and clinical outcomes were assessed using retrospective chart review. Results: Lesions (mean size 2.7 cm; range 2–4.3 cm) were in contact with great vessels (n = 13), trachea (n = 3), and mediastinal nerves (n = 6). Hydrodissection was performed intercostally (n = 4), suprasternally (n = 2), transsternally (n = 1), or via the sternoclavicular joint (n = 1) using 1–3 spinal needles over 13.4 (range 7–26) minutes; 450 ml of dilute contrast was injected (range 300–600 ml), increasing the mean lesion-collateral structure distance from 1.9 to 7.7 mm. Vulnerable mediastinal nerves were identified in four of five procedures. Technical success was 100%, with one immediate complication (recurrent laryngeal nerve injury). Mean follow-up period was 15 months. One lesion demonstrated residual disease on restaging PET-CT and was retreated to achieve complete ablation. At last follow-up, two patients remained disease-free, and one patient developed distant disease after 1 year without local recurrence. Conclusions: Cryoablation using large-volume hydrodissection is a feasible technique, enabling safe and effective treatment of challenging mediastinal lesions.

  3. Sampling large landscapes with small-scale stratification-User's Manual

    Science.gov (United States)

    Bart, Jonathan

    2011-01-01

    This manual explains procedures for partitioning a large landscape into plots, assigning the plots to strata, and selecting plots in each stratum to be surveyed. These steps are referred to as the "sampling large landscapes (SLL) process." We assume that users of the manual have a moderate knowledge of ArcGIS and Microsoft® Excel. The manual is written for a single user, but in many cases some steps will be carried out by a biologist designing the survey and some by a quantitative assistant; thus, the manual essentially may be passed back and forth between these users. The SLL process has primarily been used to survey birds, and we refer to birds as the subjects of the counts. The process, however, could be used to count any objects.
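The selection step described above is stratified random sampling of plots. A minimal sketch of that step (not the manual's ArcGIS/Excel workflow; the stratum labels and sample sizes are hypothetical):

```python
import random
from collections import defaultdict

def select_plots(plot_strata, n_per_stratum, seed=0):
    """Randomly select plots to survey within each stratum.
    plot_strata: dict mapping plot id -> stratum label
    n_per_stratum: dict mapping stratum label -> number of plots to survey
    """
    rng = random.Random(seed)  # fixed seed makes the draw reproducible
    by_stratum = defaultdict(list)
    for plot, stratum in sorted(plot_strata.items()):
        by_stratum[stratum].append(plot)
    selection = {}
    for stratum, plots in by_stratum.items():
        k = min(n_per_stratum.get(stratum, 0), len(plots))
        selection[stratum] = sorted(rng.sample(plots, k))
    return selection

# Hypothetical landscape: 30 plots assigned to two strata.
plots = {f"plot{i:03d}": ("wetland" if i % 3 == 0 else "upland") for i in range(30)}
chosen = select_plots(plots, {"wetland": 4, "upland": 6})
```

Keeping the draw seeded and per-stratum mirrors the manual's separation of design decisions (strata, sample sizes) from the mechanical selection step.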

  4. Development of production methods of volume source by the resinous solution which has hardening

    CERN Document Server

    Motoki, R

    2002-01-01

    Volume sources are used as standard sources for radioactivity measurements, with Ge semiconductor detectors, of environmental samples (e.g. water and soil) that require a large volume. The commercial volume source used in measurement of water samples is made of agar-agar, and that used in measurement of soil samples is made of alumina powder. If the plastic receptacles of these two kinds of volume sources are damaged, the leaking contents cause contamination. Moreover, if the hermetic seal of a volume source made of agar-agar fails, the volume decrease due to evaporation of moisture introduces an error into the radioactivity measurement. Therefore, we developed two methods using unsaturated polyester resin, vinylester resin, their hardening agents, and acrylic resin. The first type disperses the hydrochloric acid solution containing the radioisotopes uniformly in each resin and hardens the resin. The second disperses the alumina powder that has absorbed the radioisotopes in each resin an...

  5. Complex Security System for Premises Under Conditions of Large Volume of Passenger Traffic

    Directory of Open Access Journals (Sweden)

    Yakubov Vladimir

    2016-01-01

    Full Text Available Subsystems of the design of a complex security system for premises under conditions of large-volume passenger traffic are considered. These subsystems provide video and thermal-imaging control, radio wave tomography, and gas analysis. Simultaneous application of all examined variants will essentially increase the probability of timely prevention of dangerous situations while keeping the probability of false alarm as low as possible. Ultimately, this will protect the population and facilitate the work of intelligence services.

  6. Open loop control of filament heating power supply for large volume plasma device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in [Institute for Plasma Research, Gandhinagar, Gujarat 382428 (India); Homi Bhabha National Institute, Mumbai 400094 (India); Srivastava, P.K.; Sanyasi, A.K. [Homi Bhabha National Institute, Mumbai 400094 (India); Srivastav, Prabhakar [Institute for Plasma Research, Gandhinagar, Gujarat 382428 (India); Homi Bhabha National Institute, Mumbai 400094 (India); Awasthi, L.M., E-mail: kushagra.lalit@gmail.com [Institute for Plasma Research, Gandhinagar, Gujarat 382428 (India); Homi Bhabha National Institute, Mumbai 400094 (India); Mattoo, S.K. [Homi Bhabha National Institute, Mumbai 400094 (India)

    2017-02-15

    A power supply (20 V, 10 kA) for powering the filamentary cathode has been procured, interfaced, and integrated with the centralized control system of the Large Volume Plasma Device (LVPD). A software interface has been developed on the standard Modbus RTU communication protocol. It provides a dashboard for configuration, online status monitoring, alarm management, data acquisition, synchronization, and control. The power supply has been tested for stable operation across its operational capabilities. The paper highlights the motivation, interface description, implementation, and results obtained.
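Modbus RTU, the protocol named above, frames each request with a CRC-16 checksum (polynomial 0xA001, reflected). A minimal sketch of frame construction (the slave address and register map below are hypothetical, not details of the LVPD supply):

```python
def modbus_crc16(frame: bytes) -> bytes:
    """CRC-16/Modbus over an RTU frame, returned little-endian
    (low byte first), as it is appended on the wire."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001  # reflected Modbus polynomial
            else:
                crc >>= 1
    return crc.to_bytes(2, "little")

# Example request: read two holding registers (function 0x03) from
# slave address 1, starting at register 0x0000.
pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
frame = pdu + modbus_crc16(pdu)
```

In practice a library such as pymodbus handles framing; the sketch only shows what "standard Modbus RTU" implies at the byte level.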

  7. Open loop control of filament heating power supply for large volume plasma device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2017-01-01

    A power supply (20 V, 10 kA) for powering the filamentary cathode has been procured, interfaced, and integrated with the centralized control system of the Large Volume Plasma Device (LVPD). A software interface has been developed on the standard Modbus RTU communication protocol. It provides a dashboard for configuration, online status monitoring, alarm management, data acquisition, synchronization, and control. The power supply has been tested for stable operation across its operational capabilities. The paper highlights the motivation, interface description, implementation, and results obtained.

  8. Applications of large-volume injection gas chromatography with infrared spectrometric detection

    NARCIS (Netherlands)

    Visser T; Vredenbregt MJ; Hankemeier Th; Hooischuur E; Laan R van der; LOC; VU, vakgroep Analytische Chemie, Amsterdam

    1997-01-01

    Research has been carried out to improve the analyte detectability of gas chromatography with infrared spectrometric detection (GC-IR) using techniques that allow injection of large volumes of liquid samples (typically 100 µl). Two techniques have been investigated: loop-type and on-column interfacing.

  9. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user-selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices, and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high-dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality-reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Our system is interactive, and its different views are tightly linked. Use cases show that the system has been successfully applied to simulation and complicated seismic data sets. © 2013 IEEE.
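The KDE step above scores every voxel's multivariate value by its density under the user-picked samples; high-density voxels belong to the probed feature. A minimal sketch of that idea (an isotropic Gaussian KDE with a single bandwidth, simpler than the paper's HDTFs; all values below are hypothetical):

```python
import numpy as np

def kde_density(samples, points, bandwidth=1.0):
    """Gaussian kernel density estimate.
    samples: (n, d) user-picked voxel values; points: (m, d) voxel values
    to score. Returns (m,) densities -- a minimal stand-in for a
    high-dimensional transfer function."""
    n, d = samples.shape
    diff = points[:, None, :] - samples[None, :, :]          # (m, n, d)
    sq = np.sum(diff ** 2, axis=-1) / (2.0 * bandwidth ** 2)
    norm = n * (np.sqrt(2.0 * np.pi) * bandwidth) ** d
    return np.exp(-sq).sum(axis=1) / norm

# Voxels whose multivariate values lie near the user-selected samples get
# high density (rendered opaque); distant voxels get low density.
picked = np.array([[0.80, 0.10], [0.82, 0.12], [0.79, 0.09]])
voxels = np.array([[0.80, 0.10], [0.10, 0.90]])
dens = kde_density(picked, voxels, bandwidth=0.05)
```

Mapping the density through an opacity ramp then turns the scored voxels into a renderable transfer function.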

  10. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang; Hansen, Charles

    2013-01-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user-selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices, and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high-dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality-reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Our system is interactive, and its different views are tightly linked. Use cases show that the system has been successfully applied to simulation and complicated seismic data sets. © 2013 IEEE.

  11. Volume Stability of Bitumen Bound Building Blocks

    Directory of Open Access Journals (Sweden)

    Thanaya I.N.A.

    2010-01-01

    Full Text Available This paper covers results of laboratory investigations on the volume stability of masonry units incorporating waste materials bound with bitumen (Bitublocks) under moisture adsorption, thermal exposure, and vacuum saturation. The materials used were steel slag, crushed glass, coal fly ash, and 50 pen bitumen. The samples were produced by a hot-mix method, compacted, and then exposed to moisture and temperature. It was found that moisture adsorption from the environment caused the Bitublocks to expand; samples with a less intense curing regime experienced lower expansion and became stable faster, and vice versa. Under thermal exposure (at 70°C), samples with a less intense curing regime underwent higher expansion, and vice versa; these expansions were also highly reversible. Volume stability under water exposure was distinctive: the expansion on the first vacuum-saturation cycle was irreversible, then largely reversible on the following cycles.

  12. 'Dip-sticks' calibration handles self-attenuation and coincidence effects in large-volume gamma-ray spectrometry

    CERN Document Server

    Wolterbeek, H T

    2000-01-01

    Routine gamma-spectrometric analyses of samples with low-level activities (e.g. food, water, environmental and industrial samples) are often performed on large samples placed close to the detector. In these geometries, detection sensitivity is improved, but large errors are introduced by self-attenuation and coincidence summing. Current approaches to these problems comprise computational methods and spiked standard materials. However, the former are often regarded as too complex for routine use, and the latter never fully match real samples. In the present study, we introduce a 'dip-stick' calibration as a fast and easy practical solution to this quantification problem in a routine analytical setting. In the proposed set-up, calibrations are performed within the sample itself, making this a broadly accessible matching-reference approach that is in principle usable for all sample matrices.

  13. An automatic sample changer for gamma spectrometry

    International Nuclear Information System (INIS)

    Andrews, D.J.

    1984-01-01

    An automatic sample changer for gamma spectrometry is described which is designed for large-volume, low-radioactivity environmental samples of various sizes, up to maximum dimensions of 100 mm diameter x 60 mm high. The sample changer is suitable for use with most existing gamma spectrometry systems that utilize Ge(Li) or NaI detectors in vertical mode, in conjunction with a pulse height analyzer having auto-cycle and suitable data output facilities; it is linked to a Nuclear Data ND 6620 computer-based analysis system. (U.K.)

  14. Temperature monitoring in large volume spread footing foundations: case study "Parque da Cidade" - São Paulo

    Directory of Open Access Journals (Sweden)

    D. Couto

    Full Text Available ABSTRACT In recent years, the construction of large-volume reinforced-concrete foundation elements has become increasingly common. This implies a potential increase in the risk of cracks of thermal origin due to the heat of hydration of cement. Under these circumstances, such concrete elements need to be treated using mass-concrete theory, widespread in dam construction but little used when designing buildings. This paper presents a case study of the procedures and problems involved in the construction of a spread footing with a volume of approximately 800 m³, designed for the foundation of a shopping center in São Paulo, Brazil.

  15. A model for steady-state large-volume plasma generation

    International Nuclear Information System (INIS)

    Uhm, H.S.; Miller, J.D.; Schneider, R.F.

    1991-01-01

    In this paper, a simple, new scheme to generate a uniform, steady-state, large-volume plasma is presented. The weakly magnetized plasma is created by direct ionization of the background gas by low-energy electrons generated from thermionic filaments. An annular arrangement of the filaments ensures a uniform plasma density in the radial direction as predicted by theory. Experiments have been performed to characterize the plasma generated in such a configuration. In order to explain the experimental observation, we develop a bulk plasma theory based on plasma transport via cross-field diffusion. As assumed in the theoretical model, the experimental measurements indicate a uniform plasma density along the axis. Both the theory and experiment indicate that the plasma density is a function of the square of the external magnetic field. The theory also predicts the plasma density to be proportional to the neutral density to the two-thirds power in agreement with the experimental data. We also observe the experimental data to agree remarkably well with theoretical prediction for a broad range of system parameters
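The two scalings reported above (density proportional to the square of the magnetic field and to the two-thirds power of the neutral density) can be written as one illustrative relation. A sketch under the assumption of a single hypothetical fit constant, which is not a value from the paper:

```python
def plasma_density(b_field, neutral_density, c=1.0):
    """Illustrative scaling law combining the paper's two findings:
    n_e proportional to B^2 and to n_0^(2/3). The prefactor c is a
    hypothetical fit constant absorbing all other system parameters."""
    return c * b_field ** 2 * neutral_density ** (2.0 / 3.0)

# Doubling B quadruples the density; raising n_0 eightfold also
# quadruples it (8^(2/3) = 4).
base = plasma_density(1.0, 1.0)
```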

  16. Mass anomalous dimension of Adjoint QCD at large N from twisted volume reduction

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori

    2015-01-01

    In this work we consider the $SU(N)$ gauge theory with two Dirac fermions in the adjoint representation, in the limit of large $N$. In this limit the infinite-volume physics of this model can be studied by means of the corresponding twisted reduced model defined on a single site lattice. Making use of this strategy we study the reduced model for various values of $N$ up to 289. By analyzing the eigenvalue distribution of the adjoint Dirac operator we test the conformality of the theory and extract the corresponding mass anomalous dimension.

  17. Mass anomalous dimension of adjoint QCD at large N from twisted volume reduction

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Nicolás Cabrera 13-15, Universidad Autónoma de Madrid,E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Nicolás Cabrera 13-15, Universidad Autónoma de Madrid,E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI, Universidad Autónoma de Madrid,E-28049-Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan)

    2015-08-07

    In this work we consider the SU(N) gauge theory with two Dirac fermions in the adjoint representation, in the limit of large N. In this limit the infinite-volume physics of this model can be studied by means of the corresponding twisted reduced model defined on a single site lattice. Making use of this strategy we study the reduced model for various values of N up to 289. By analyzing the eigenvalue distribution of the adjoint Dirac operator we test the conformality of the theory and extract the corresponding mass anomalous dimension.

  18. In vitro validation of a Pitot-based flow meter for the measurement of respiratory volume and flow in large animal anaesthesia.

    Science.gov (United States)

    Moens, Yves P S; Gootjes, Peter; Ionita, Jean-Claude; Heinonen, Erkki; Schatzmann, Urs

    2009-05-01

    To remodel and validate commercially available monitors and their Pitot tube-based flow sensors for use in large animals, using in vitro techniques. Prospective, in vitro experiment. Both the original and the remodelled sensor were studied with a reference flow generator. Measurements were taken of the static flow-pressure relationship and the linearity of the flow signal. Sensor airway resistance was calculated. Following recalibration of the host monitor, volumes ranging from 1 to 7 L were generated by a calibration syringe, and the bias and precision of spirometric volume were determined. Where manual recalibration was not available, a conversion factor for volume measurement was determined. The influence of gas mixture composition and peak flow on the conversion factor was studied. Both the original and the remodelled sensor showed similar static flow-pressure relationships and linearity of the flow signal. Mean bias (%) of displayed values compared with the reference volumes of 3, 5 and 7 L varied between -0.4% and +2.4%, and this was significantly smaller than that for 1 L (4.8% to +5.0%). Conversion factors for 3, 5 and 7 L were very similar (mean 6.00 ± 0.2, range 5.91-6.06) and were not significantly influenced by the gas mixture used. Increasing peak flow caused a small decrease in the conversion factor. Volume measurement errors and conversion factors for inspiration and expiration were close to identical. The combination of the host monitor with the remodelled flow sensor allowed accurate in vitro measurement of flows and volumes in the range expected during large animal anaesthesia. This combination has potential as a reliable spirometric monitor for use during large animal anaesthesia.
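The conversion-factor correction and the bias metric used above are simple arithmetic. A minimal sketch (the factor 6.00 is the study's reported mean; the displayed readings below are hypothetical):

```python
def true_volume(displayed_volume, factor=6.00):
    """Scale the monitor's displayed volume by the empirically
    determined conversion factor (study mean 6.00 +/- 0.2)."""
    return displayed_volume * factor

def bias_percent(measured, reference):
    """Signed bias of a measured volume relative to the reference."""
    return 100.0 * (measured - reference) / reference

# A hypothetical displayed reading of 0.5 L corresponds to ~3.0 L delivered.
corrected = true_volume(0.5)
```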

  19. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography

    International Nuclear Information System (INIS)

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-01-01

    There are only a few quantitative studies commenting on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large-scan-area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at best writing homogeneity. The ultra-large scan range of the utilized AFM allowed a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistics. (paper)
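The steady-state regime reported above is a linear dwell-time/volume relationship, so dot volume can be dialled in by dwell time. A sketch of that model (the ~16 fL/s rate is the paper's figure; the intercept and target volumes are hypothetical):

```python
FLOW_RATE_FL_PER_S = 16.0  # reported steady-state transfer rate (~16 fL/s)

def dot_volume_fl(dwell_time_s, offset_fl=0.0):
    """Linear steady-state model of DPN ink transfer: dot volume grows
    linearly with tip dwell time. offset_fl is a hypothetical intercept."""
    return offset_fl + FLOW_RATE_FL_PER_S * dwell_time_s

def dwell_time_for(volume_fl, offset_fl=0.0):
    """Dwell time needed to deposit a target dot volume."""
    return (volume_fl - offset_fl) / FLOW_RATE_FL_PER_S
```

The observed dependence on pause length means the intercept (and the initial-decline regime) would need calibrating per pattern, which is why the model is only a planning aid.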

  20. Ultra-large scale AFM of lipid droplet arrays: investigating the ink transfer volume in dip pen nanolithography

    Science.gov (United States)

    Förste, Alexander; Pfirrmann, Marco; Sachs, Johannes; Gröger, Roland; Walheim, Stefan; Brinkmann, Falko; Hirtz, Michael; Fuchs, Harald; Schimmel, Thomas

    2015-05-01

    There are only a few quantitative studies commenting on the writing process in dip-pen nanolithography with lipids. Lipids are important carrier ink molecules for the delivery of bio-functional patterns in bio-nanotechnology. In order to better understand and control the writing process, more information on the transfer of lipid material from the tip to the substrate is needed. The dependence of the transferred ink volume on the dwell time of the tip on the substrate was investigated by topography measurements with an atomic force microscope (AFM) characterized by an ultra-large scan range of 800 × 800 μm². For this purpose, arrays of dots of the phospholipid 1,2-dioleoyl-sn-glycero-3-phosphocholine were written onto planar glass substrates and the resulting pattern was imaged by large-scan-area AFM. Two writing regimes were identified, characterized by either a steady decline or a constant ink volume transfer per dot feature. For the steady-state ink transfer, a linear relationship between the dwell time and the dot volume was determined, characterized by a flow rate of about 16 femtoliters per second. A dependence of the ink transport on the length of pauses before and in between writing the structures was observed and should be taken into account during pattern design when aiming at best writing homogeneity. The ultra-large scan range of the utilized AFM allowed a simultaneous study of the entire preparation area of almost 1 mm², yielding good statistics.

  1. Dosimetric Comparison of Split Field and Fixed Jaw Techniques for Large IMRT Target Volumes in the Head and Neck

    International Nuclear Information System (INIS)

    Srivastava, Shiv P.; Das, Indra J.; Kumar, Arvind; Johnstone, Peter A.S.

    2011-01-01

    Some treatment planning systems (TPSs), when used for large-field (>14 cm) intensity-modulated radiation therapy (IMRT), create split fields that produce excessive multileaf collimator segments, match-line dose inhomogeneity, and longer treatment times than nonsplit fields. A new method using a fixed-jaw technique (FJT), which forces the jaw to stay at a fixed position during optimization, is proposed to reduce the problems associated with split fields. A dosimetric comparison between the split-field technique (SFT) and the FJT for IMRT treatment is presented. Five patients with head and neck malignancies and regional target volumes were studied and compared with both techniques. Treatment planning was performed on an Eclipse TPS using beam data generated for a Varian 2100C linear accelerator. A standard beam arrangement of nine equally spaced coplanar fields was used in both techniques. Institutional dose-volume constraints used in head and neck cancer were kept the same for both techniques. The dosimetric coverage of the target volumes between SFT and FJT for head and neck IMRT plans is identical within ±1% up to the 90% dose. Similarly, dose-volume coverage of the organs at risk (OARs) is nearly identical for all patients. When the total monitor units (MU) and segments were analyzed, SFT produced statistically significantly more segments (17.3 ± 6.3%) and higher MU (13.7 ± 4.4%) than FJT. There is no match line in FJT, and hence dose uniformity in the target volume is superior to SFT. Dosimetrically, SFT and FJT are similar in dose-volume coverage; however, the FJT method provides better logistics, lower MU, shorter treatment time, and better dose uniformity. The number of segments and MU has also been correlated with whole-body radiation dose and long-term complications. Thus, FJT should be the preferred option over SFT for large target volumes.

  2. Neuronal correlates of the five factor model (FFM) of human personality: Multimodal imaging in a large healthy sample.

    Science.gov (United States)

    Bjørnebekk, Astrid; Fjell, Anders M; Walhovd, Kristine B; Grydeland, Håkon; Torgersen, Svenn; Westlye, Lars T

    2013-01-15

    Advances in neuroimaging techniques have recently provided a glimpse into the neurobiology of the complex traits of human personality. Whereas some intriguing findings have connected aspects of personality to variations in brain morphology, the relations are complex and our current understanding is incomplete. We therefore aimed to provide a comprehensive investigation of brain-personality relations using a multimodal neuroimaging approach in a large sample comprising 265 healthy individuals. The NEO Personality Inventory was used to provide measures of core aspects of human personality, and imaging phenotypes included measures of total and regional brain volumes, regional cortical thickness and arealization, and diffusion tensor imaging indices of white matter (WM) microstructure. Neuroticism was the trait most clearly linked to brain structure. Higher neuroticism, including facets reflecting anxiety, depression, and vulnerability to stress, was associated with smaller total brain volume, widespread decrease in WM microstructure, and smaller frontotemporal surface area. Higher scores on extraversion were associated with a thinner inferior frontal gyrus, and conscientiousness was negatively associated with arealization of the temporoparietal junction. No reliable associations between brain structure and agreeableness or openness were found. The results provide novel evidence of the associations between brain structure and variations in human personality, and corroborate previous findings of a consistent neuroanatomical basis of negative emotionality. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Air-deployable oil spill sampling devices review phase 2 testing. Volume 1

    International Nuclear Information System (INIS)

    Hawke, L.; Dumouchel, A.; Fingas, M.; Brown, C.E.

    2007-01-01

    SAIC Canada tested air-deployable oil sampling devices for the Emergencies Science and Technology Division of Environment Canada in order to determine the applicability and status of these devices. The 3 devices tested were Canada's SABER (sampling autonomous buoy for evidence recovery), the United States' POPEIE (probe for oil pollution evidence in the environment), and Sweden's SAR Floatation 2000. They were tested for buoyancy properties, drift behaviour, and sampler sorbent pickup ratios. The SAR and SABER both had smaller draft and greater freeboard, while the POPEIE had much greater draft than freeboard. All 3 devices could be used for oil sample collection: their drift characteristics allow the SABER and SAR devices to be placed upwind of the slick, while the POPEIE device can be placed downwind of an oil spill. The sorbent testing revealed that the Sefar sorbent and Spectra sorbent used in the 3 devices had negative pickup ratios for diesel, but performance improved as oil viscosity increased. Both sorbents are inert and capable of collecting oil in sufficient volumes for consistent fingerprinting analysis. 10 refs., 8 tabs., 8 figs

  4. Procedure for plutonium analysis of large (100g) soil and sediment samples

    International Nuclear Information System (INIS)

    Meadows, J.W.T.; Schweiger, J.S.; Mendoza, B.; Stone, R.

    1975-01-01

    A method for the complete dissolution of large soil or sediment samples is described. This method is in routine usage at Lawrence Livermore Laboratory for the analysis of fall-out levels of Pu in soils and sediments. Intercomparison with partial dissolution (leach) techniques shows the complete dissolution method to be superior for the determination of plutonium in a wide variety of environmental samples. (author)

  5. Large volume serial section tomography by Xe Plasma FIB dual beam microscopy

    International Nuclear Information System (INIS)

    Burnett, T.L.; Kelley, R.; Winiarski, B.; Contreras, L.; Daly, M.; Gholinia, A.; Burke, M.G.; Withers, P.J.

    2016-01-01

    Ga+ Focused Ion Beam-Scanning Electron Microscopes (FIB-SEM) have revolutionised the level of microstructural information that can be recovered in 3D by block face serial section tomography (SST), as well as enabling the site-specific removal of smaller regions for subsequent transmission electron microscope (TEM) examination. However, Ga+ FIB material removal rates limit the volumes and depths that can be probed to dimensions in the tens of microns range. Emerging Xe+ Plasma Focused Ion Beam-Scanning Electron Microscope (PFIB-SEM) systems promise faster removal rates. Here we examine the potential of the method for large volume serial section tomography as applied to bainitic steel and WC–Co hard metals. Our studies demonstrate that with careful control of milling parameters precise automated serial sectioning can be achieved with low levels of milling artefacts at removal rates some 60× faster. Volumes that are hundreds of microns in dimension have been collected using fully automated SST routines in feasible timescales (<24 h), showing good grain orientation contrast and capturing microstructural features at the tens of nanometres to the tens of microns scale. Accompanying electron backscattered diffraction (EBSD) maps show high indexing rates, suggesting low levels of surface damage. Further, under high-current Ga+ FIB milling WC–Co is prone to amorphisation of WC surface layers and phase transformation of the Co phase, neither of which has been observed at PFIB currents as high as 60 nA at 30 kV. Xe+ PFIB dual beam microscopes promise to radically extend our capability for 3D tomography, 3D EDX, and 3D EBSD as well as correlative tomography. - Highlights: • The uptake of dual beam FIBs has been rapid, but long milling times have limited imaged volumes to tens of micron dimensions. • Emerging plasma Xe+ PFIB-SEM technology offers materials removal rates at least 60× greater than conventional Ga+ FIB systems with comparable or less damage.

  6. Fast sampling from a Hidden Markov Model posterior for large data

    DEFF Research Database (Denmark)

    Bonnevie, Rasmus; Hansen, Lars Kai

    2014-01-01

    Hidden Markov Models are of interest in a broad set of applications, including modern data-driven systems involving very large data sets. However, approximate inference methods based on Bayesian averaging are precluded in such applications, as each sampling step requires a full sweep over the data...
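The full-sweep baseline alluded to above is standard forward-filtering backward-sampling (FFBS), which draws an exact state path from the HMM posterior but touches every observation per sample. A minimal sketch of that baseline (the standard algorithm, not the paper's accelerated method; the toy model in the demo is hypothetical):

```python
import numpy as np

def ffbs(obs_lik, A, pi, rng):
    """Forward-filtering backward-sampling: draw one state path from the
    exact HMM posterior p(z_1:T | x_1:T).
    obs_lik: (T, K) likelihoods p(x_t | z_t = k)
    A: (K, K) transition matrix, A[i, j] = p(z_{t+1} = j | z_t = i)
    pi: (K,) initial state distribution."""
    T, K = obs_lik.shape
    alpha = np.zeros((T, K))
    alpha[0] = pi * obs_lik[0]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):            # forward pass: one full data sweep
        alpha[t] = (alpha[t - 1] @ A) * obs_lik[t]
        alpha[t] /= alpha[t].sum()
    z = np.empty(T, dtype=int)
    z[-1] = rng.choice(K, p=alpha[-1])
    for t in range(T - 2, -1, -1):   # backward sampling pass
        w = alpha[t] * A[:, z[t + 1]]  # p(z_t | z_{t+1}, x_1:t)
        z[t] = rng.choice(K, p=w / w.sum())
    return z

# Toy 2-state chain with near-certain emissions: the sampled path
# should follow the evidence.
A = np.array([[0.9, 0.1], [0.1, 0.9]])
pi = np.array([0.5, 0.5])
obs_lik = np.array([[1e6, 1.0], [1.0, 1e6], [1e6, 1.0]])
z = ffbs(obs_lik, A, pi, np.random.default_rng(0))
```

Because both passes sweep all T observations, each posterior sample costs O(TK²) — exactly the per-sample cost that becomes prohibitive for very large data.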

  7. 17 CFR Appendix B to Part 420 - Sample Large Position Report

    Science.gov (United States)

    2010-04-01

    Appendix B to 17 CFR Part 420 (Commodity and Securities Exchanges; Department of the Treasury) provides a sample Large Position Report form, including dollar-amount line items for positions held and for securities used as collateral for financial derivatives and other securities transactions, together with totals and memorandum entries (2010-04-01 edition).

  8. Large area synchrotron X-ray fluorescence mapping of biological samples

    International Nuclear Information System (INIS)

    Kempson, I.; Thierry, B.; Smith, E.; Gao, M.; De Jonge, M.

    2014-01-01

    Large area mapping of inorganic material in biological samples has suffered severely from prohibitively long acquisition times. With the advent of new detector technology we can now generate statistically relevant information for studying cell populations, inter-variability and bioinorganic chemistry in large specimens. We have been implementing ultrafast synchrotron-based XRF mapping afforded by the MAIA detector for large area mapping of biological material. For example, a 2.5 million pixel map can be acquired in 3 hours, compared to a typical synchrotron XRF set-up needing over 1 month of uninterrupted beamtime. Of particular focus to us is the fate of metals and nanoparticles in cells, 3D tissue models and animal tissues. The large area scanning has for the first time provided statistically significant information on sufficiently large numbers of cells to provide data on intercellular variability in uptake of nanoparticles. Techniques such as flow cytometry generally require analysis of thousands of cells for statistically meaningful comparison, due to the large degree of variability. Large area XRF now gives comparable information in a quantifiable manner. Furthermore, we can now image localised deposition of nanoparticles in tissues that would be highly improbable to 'find' by typical XRF imaging. In addition, the ultrafast nature also makes it viable to conduct 3D XRF tomography over large dimensions. This technology opens new opportunities in biomonitoring and understanding metal and nanoparticle fate ex vivo. Following from this is an extension to molecular imaging through antibody-targeted nanoparticles to label specific tissues and monitor cellular processes or biological consequences.
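The speed-up quoted in the record reduces to simple dwell-time arithmetic. A short sketch using the figures from the abstract ("over 1 month" is assumed here to mean roughly 31 days of beamtime):

```python
# Acquisition-rate arithmetic behind the quoted figures (illustrative).
pixels = 2.5e6                 # "a 2.5 million pixel map"
maia_hours = 3.0               # "acquired in 3 hours" with the MAIA detector
per_pixel_ms = maia_hours * 3600 * 1000 / pixels   # dwell time per pixel, ~4.3 ms

conventional_hours = 31 * 24   # assumes "over 1 month" ~= 31 days of beamtime
speedup = conventional_hours / maia_hours          # roughly 250x
```

At ~4 ms per pixel, statistically meaningful cell populations become reachable in a single shift rather than a dedicated beamtime allocation.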

  9. Validation of low-volume enrichment protocols for detection of Escherichia coli O157 in raw ground beef components, using commercial kits.

    Science.gov (United States)

    Ahmed, Imtiaz; Hughes, Denise; Jenson, Ian; Karalis, Tass

    2009-03-01

    Testing of beef destined for use in ground beef products for the presence of Escherichia coli O157:H7 has become an important cornerstone of control and verification activities within many meat supply chains. Validation of the ability of methods to detect low levels of E. coli O157:H7 is critical to confidence in test systems. Many rapid methods have been validated against standard cultural methods for 25-g samples. In this study, a number of previously validated enrichment broths and commercially available test kits were validated for the detection of low numbers of E. coli O157:H7 in 375-g samples of raw ground beef component matrices using 1 liter of enrichment broth (large-sample:low-volume enrichment protocol). Standard AOAC International methods for 25-g samples in 225 ml of enrichment broth, using the same media, incubation conditions, and test kits, were used as reference methods. No significant differences were detected in the ability of any of the tests to detect low levels of E. coli O157:H7 in samples of raw ground beef components when enriched according to standard or large-sample:low-volume enrichment protocols. The use of large-sample:low-volume enrichment protocols provides cost savings for media and logistical benefits when handling and incubating large numbers of samples.

  10. Soft SUSY breaking parameters and RG running of squark and slepton masses in large volume Swiss Cheese compactifications

    International Nuclear Information System (INIS)

    Misra, Aalok; Shukla, Pramod

    2010-01-01

    We consider type IIB large volume compactifications involving orientifolds of the Swiss Cheese Calabi-Yau WCP^4[1,1,1,6,9] with a single mobile space-time filling D3-brane and stacks of D7-branes wrapping the 'big' divisor Σ_B (as opposed to the 'small' divisor usually done in the literature thus far) as well as supporting D7-brane fluxes. After reviewing our proposal of (Misra and Shukla, 2010) for resolving a long-standing tension between large volume cosmology and phenomenology pertaining to obtaining a 10^12 GeV gravitino in the inflationary era and a TeV gravitino in the present era, and summarizing our results of (Misra and Shukla, 2010) on soft supersymmetry breaking terms and open-string moduli masses, we discuss the one-loop RG running of the squark and slepton masses in mSUGRA-like models (using the running of the gaugino masses) to the EW scale in the large volume limit. Phenomenological constraints and some of the calculated soft SUSY parameters identify the D7-brane Wilson line moduli as the first two generations/families of squarks and sleptons and the D3-brane (restricted to the big divisor) position moduli as the two Higgses for MSSM-like models at TeV scale. We also discuss how the obtained open-string/matter moduli make it easier to impose FCNC constraints, as well as RG flow of off-diagonal squark mass(-squared) matrix elements.

  11. Soft SUSY breaking parameters and RG running of squark and slepton masses in large volume Swiss Cheese compactifications

    Science.gov (United States)

    Misra, Aalok; Shukla, Pramod

    2010-03-01

    We consider type IIB large volume compactifications involving orientifolds of the Swiss Cheese Calabi-Yau WCP^4[1,1,1,6,9] with a single mobile space-time filling D3-brane and stacks of D7-branes wrapping the “big” divisor Σ_B (as opposed to the “small” divisor usually done in the literature thus far) as well as supporting D7-brane fluxes. After reviewing our proposal of [1] (Misra and Shukla, 2010) for resolving a long-standing tension between large volume cosmology and phenomenology pertaining to obtaining a 10^12 GeV gravitino in the inflationary era and a TeV gravitino in the present era, and summarizing our results of [1] (Misra and Shukla, 2010) on soft supersymmetry breaking terms and open-string moduli masses, we discuss the one-loop RG running of the squark and slepton masses in mSUGRA-like models (using the running of the gaugino masses) to the EW scale in the large volume limit. Phenomenological constraints and some of the calculated soft SUSY parameters identify the D7-brane Wilson line moduli as the first two generations/families of squarks and sleptons and the D3-brane (restricted to the big divisor) position moduli as the two Higgses for MSSM-like models at TeV scale. We also discuss how the obtained open-string/matter moduli make it easier to impose FCNC constraints, as well as RG flow of off-diagonal squark mass(-squared) matrix elements.

  12. System of large transport containers for waste from dismantling light water and gas-cooled nuclear reactors. Volume 2

    International Nuclear Information System (INIS)

    Price, M.S.T.; Lafontaine, I.

    1985-01-01

    The purpose of this volume is to assess the means of transporting decommissioning wastes, the costs of transport, and the radiological detriment attributable to transport, and to develop conceptual designs of large transport containers. The document ends with conclusions and recommendations

  13. Ultra-High-Throughput Sample Preparation System for Lymphocyte Immunophenotyping Point-of-Care Diagnostics.

    Science.gov (United States)

    Walsh, David I; Murthy, Shashi K; Russom, Aman

    2016-10-01

    Point-of-care (POC) microfluidic devices often lack the integration of common sample preparation steps, such as preconcentration, which can limit their utility in the field. In this technology brief, we describe a system that combines the necessary sample preparation methods to perform sample-to-result analysis of large-volume (20 mL) biopsy model samples with staining of captured cells. Our platform combines centrifugal-paper microfluidic filtration and an analysis system to process large, dilute biological samples. Utilizing commercialization-friendly manufacturing methods and materials, yielding a sample throughput of 20 mL/min, and allowing for on-chip staining and imaging bring together a practical, yet powerful approach to microfluidic diagnostics of large, dilute samples. © 2016 Society for Laboratory Automation and Screening.

  14. Sampling of charged liquid radwaste stored in large tanks

    International Nuclear Information System (INIS)

    Tchemitcheff, E.; Domage, M.; Bernard-Bruls, X.

    1995-01-01

    The final safe disposal of radwaste, in France and elsewhere, entails, for liquid effluents, their conversion to a stable solid form, hence implying their conditioning. The production of conditioned waste with the requisite quality, traceability of the characteristics of the packages produced, and safe operation of the conditioning processes, implies at least the accurate knowledge of the chemical and radiochemical properties of the effluents concerned. The problem in sampling the normally charged effluents is aggravated for effluents that have been stored for several years in very large tanks, without stirring and retrieval systems. In 1992, SGN was asked by Cogema to study the retrieval and conditioning of LL/ML chemical sludge and spent ion-exchange resins produced in the operation of the UP2 400 plant at La Hague, and stored temporarily in rectangular silos and tanks. The sampling aspect was crucial for validating the inventories, identifying the problems liable to arise in the aging of the effluents, dimensioning the retrieval systems and checking the transferability and compatibility with the downstream conditioning process. Two innovative self-contained systems were developed and built for sampling operations, positioned above the tanks concerned. Both systems have been operated in active conditions and have proved totally satisfactory for taking representative samples. Today SGN can propose industrially proven overall solutions, adaptable to the various constraints of many spent fuel cycle operators

  15. Relationship between LIBS Ablation and Pit Volume for Geologic Samples: Applications for in situ Absolute Geochronology

    Science.gov (United States)

    Devismes, D.; Cohen, Barbara A.

    2014-01-01

    In planetary sciences, in situ absolute geochronology is a scientific and engineering challenge. Currently, the age of the Martian surface can only be determined by crater density counting. However, this method has significant uncertainties and needs to be calibrated with absolute ages. We are developing an instrument to acquire in situ absolute geochronology based on the K-Ar method. The protocol is based on the laser ablation of a rock by hundreds of laser pulses. Laser Induced Breakdown Spectroscopy (LIBS) gives the potassium content of the ablated material and a mass spectrometer (quadrupole or ion trap) measures the quantity of 40Ar released. In order to accurately measure the quantity of released 40Ar in cases where Ar is an atmospheric constituent (e.g., Mars), the sample is first put into a chamber under high vacuum. The 40Ar quantity, the K concentration, and the estimated ablated mass are the parameters needed to give the age of the rock. The main uncertainties with this method are directly linked to the measurements of the mass (typically a few µg) and of the K concentration by LIBS (up to 10%). Because the ablated mass is small compared to the mass of the sample, and because material is redeposited onto the sample after ablation, it is not possible to directly measure the ablated mass. Our current protocol measures the ablated volume and estimates the sample density to calculate the ablated mass. The precision and accuracy of this method may be improved by using knowledge of the sample's geologic properties to predict its response to laser ablation, i.e., understanding whether natural samples have a predictable relationship between laser energy deposited and resultant ablation volume. In contrast to most previous studies of laser ablation, theoretical equations are not highly applicable. The reasons are numerous, but the most important are: a) geologic rocks are complex, polymineralic materials; b) the conditions of ablation are unusual (for example
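The age computation the record describes is the standard K-Ar equation. A minimal sketch using the conventional Steiger and Jäger (1977) decay constants; the helper converting ablated mass and LIBS concentration to mol of 40K is illustrative, not the instrument's actual pipeline:

```python
import math

# Conventional (Steiger & Jäger, 1977) constants.
LAMBDA_TOTAL = 5.543e-10   # total 40K decay constant (1/yr)
LAMBDA_EC = 0.581e-10      # 40K -> 40Ar branch (electron capture, 1/yr)
K40_ABUNDANCE = 1.167e-4   # natural 40K/K atomic abundance
M_K = 39.0983              # mean atomic mass of K (g/mol)

def k40_mol(ablated_mass_g, k_wt_pct):
    """mol of 40K in the ablated mass, given the LIBS K concentration (wt%)."""
    return ablated_mass_g * (k_wt_pct / 100.0) / M_K * K40_ABUNDANCE

def k_ar_age(ar40_mol, k40_mol_amount):
    """K-Ar age in years from radiogenic 40Ar and 40K amounts (same units)."""
    return (1.0 / LAMBDA_TOTAL) * math.log(
        1.0 + (LAMBDA_TOTAL / LAMBDA_EC) * (ar40_mol / k40_mol_amount))
```

The propagated uncertainty is dominated, as the abstract notes, by the ablated-mass estimate (via volume × density) and the LIBS K measurement.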

  16. 'Finite' non-Gaussianities and tensor-scalar ratio in large volume Swiss-cheese compactifications

    International Nuclear Information System (INIS)

    Misra, Aalok; Shukla, Pramod

    2009-01-01

    Developing on the ideas of (Section 4 of) [A. Misra, P. Shukla, Moduli stabilization, large-volume dS minimum without anti-D3-branes, (non-)supersymmetric black hole attractors and two-parameter Swiss cheese Calabi-Yau's, Nucl. Phys. B 799 (2008) 165-198, (arXiv: 0707.0105)] and [A. Misra, P. Shukla, Large volume axionic Swiss-cheese inflation, Nucl. Phys. B 800 (2008) 384-400, (arXiv: 0712.1260 [hep-th])] and using the formalisms of [S. Yokoyama, T. Suyama, T. Tanaka, Primordial non-Gaussianity in multi-scalar slow-roll inflation, (arXiv: 0705.3178 [astro-ph]); S. Yokoyama, T. Suyama, T. Tanaka, Primordial non-Gaussianity in multi-scalar inflation, Phys. Rev. D 77 (2008) 083511, (arXiv: 0711.2920 [astro-ph])], after inclusion of perturbative and non-perturbative α' corrections to the Kaehler potential and (D1- and D3-)instanton generated superpotential, we show the possibility of getting finite values for the non-linear parameter f_NL while looking for non-Gaussianities in type IIB compactifications on orientifolds of the Swiss cheese Calabi-Yau WCP^4[1,1,1,6,9] in the L(arge) V(olume) S(cenarios) limit. We show the same in two contexts. First is multi-field slow-roll inflation with D3-instanton contribution coming from a large number of multiple wrappings of a single (Euclidean) D3-brane around the 'small' divisor yielding f_NL ∼ O(1). The second is when the slow-roll conditions are violated and for the number of the aforementioned D3-instanton wrappings being of O(1) but more than one, yielding f_NL ∼ O(1). Based on general arguments not specific to our (string-theory) set-up, we argue that requiring curvature perturbations not to grow at horizon crossing and at super-horizon scales, automatically picks out hybrid inflationary scenarios which in our set up can yield f_NL ∼ O(1) and tensor-scalar ratio of O(10^-2). For all our calculations, the world-sheet instanton contributions to the Kaehler potential coming from the non-perturbative α' corrections

  17. CO2 isotope analyses using large air samples collected on intercontinental flights by the CARIBIC Boeing 767

    NARCIS (Netherlands)

    Assonov, S.S.; Brenninkmeijer, C.A.M.; Koeppel, C.; Röckmann, T.

    2009-01-01

    Analytical details for 13C and 18O isotope analyses of atmospheric CO2 in large air samples are given. The large air samples of nominally 300 L were collected during the passenger aircraft-based atmospheric chemistry research project CARIBIC and analyzed for a large number of trace gases and

  18. Examining the effect of psychopathic traits on gray matter volume in a community substance abuse sample.

    Science.gov (United States)

    Cope, Lora M; Shane, Matthew S; Segall, Judith M; Nyalakanti, Prashanth K; Stevens, Michael C; Pearlson, Godfrey D; Calhoun, Vince D; Kiehl, Kent A

    2012-11-30

    Psychopathy is believed to be associated with brain abnormalities in both paralimbic (i.e., orbitofrontal cortex, insula, temporal pole, parahippocampal gyrus, posterior cingulate) and limbic (i.e., amygdala, hippocampus, anterior cingulate) regions. Recent structural imaging studies in both community and prison samples are beginning to support this view. Sixty-six participants, recruited from community corrections centers, were administered the Hare psychopathy checklist-revised (PCL-R), and underwent magnetic resonance imaging (MRI). Voxel-based morphometry was used to test the hypothesis that psychopathic traits would be associated with gray matter reductions in limbic and paralimbic regions. Effects of lifetime drug and alcohol use on gray matter volume were covaried. Psychopathic traits were negatively associated with gray matter volumes in right insula and right hippocampus. Additionally, psychopathic traits were positively associated with gray matter volumes in bilateral orbital frontal cortex and right anterior cingulate. Exploratory regression analyses indicated that gray matter volumes within right hippocampus and left orbital frontal cortex combined to explain 21.8% of the variance in psychopathy scores. These results support the notion that psychopathic traits are associated with abnormal limbic and paralimbic gray matter volume. Furthermore, gray matter increases in areas shown to be functionally impaired suggest that the structure-function relationship may be more nuanced than previously thought. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  19. Construction and Start-up of a Large-Volume Thermostat for Dielectric-Constant Gas Thermometry

    Science.gov (United States)

    Merlone, A.; Moro, F.; Zandt, T.; Gaiser, C.; Fellmuth, B.

    2010-07-01

    A liquid-bath thermostat with a volume of about 800 L was designed to provide a suitable thermal environment for a dielectric-constant gas thermometer (DCGT) in the range from the triple point of mercury to the melting point of gallium. In the article, results obtained with the unique, huge thermostat without the DCGT measuring chamber are reported to demonstrate the capability of controlling the temperature of very large systems at a metrological level. First tests showed that the bath together with its temperature controller provide a temperature variation of less than ±0.5 mK peak-to-peak. This temperature instability could be maintained over a period of several days. In the central working volume (diameter 500 mm, height 650 mm), in which the vacuum chamber containing the measuring system of the DCGT will be placed later, the temperature inhomogeneity has been demonstrated to be also well below 1 mK.

  20. Research on volume metrology method of large vertical energy storage tank based on internal electro-optical distance-ranging method

    Science.gov (United States)

    Hao, Huadong; Shi, Haolei; Yi, Pengju; Liu, Ying; Li, Cunjun; Li, Shuguang

    2018-01-01

    A volume metrology method based on an internal electro-optical distance-ranging method is established for large vertical energy storage tanks. After analyzing the mathematical model for vertical-tank volume calculation, the key point-cloud processing algorithms, such as gross-error elimination, filtering, streamlining, and radius calculation, are studied. The corresponding volume values at different liquid levels are calculated automatically by computing the cross-sectional area in the horizontal direction and integrating along the vertical direction. To build the comparison system, a vertical tank with a nominal capacity of 20,000 m³ was selected as the research object; the results show that the method has good repeatability and reproducibility. Using the conventional capacity measurement method as reference, the relative deviation of the calculated volume is less than 0.1%, meeting the measurement requirements and demonstrating the feasibility and effectiveness of the method.
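The integration step the record describes (horizontal cross-sections integrated up the tank) can be sketched directly; the per-slice radii would come from the processed point cloud, and the constant radius used below is an illustrative stand-in chosen to match the ~20,000 m³ nominal capacity:

```python
import math

def volume_table(heights_m, radii_m):
    """Cumulative volume (m^3) at each height: trapezoidal integration of
    the horizontal cross-section area A(z) = pi * r(z)^2 up the tank."""
    areas = [math.pi * r * r for r in radii_m]
    vols = [0.0]
    for i in range(1, len(heights_m)):
        dz = heights_m[i] - heights_m[i - 1]
        vols.append(vols[-1] + 0.5 * (areas[i] + areas[i - 1]) * dz)
    return vols

# Illustrative geometry: ~20,000 m3 capacity at a 20 m fill height.
heights = [0.5 * i for i in range(41)]   # 0 .. 20 m in 0.5 m slices
radii = [17.85] * 41                     # fitted radius per slice (assumed constant)
table = volume_table(heights, radii)
```

In practice each slice radius is fitted separately from the filtered point cloud, so tank out-of-roundness and taper are captured slice by slice.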

  1. Intralesional and metastatic heterogeneity in malignant melanomas demonstrated by stereologic estimates of nuclear volume

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Erlandsen, M

    1990-01-01

    Regional variability of nuclear 3-dimensional size can be estimated objectively using point-sampled intercepts obtained from different, defined zones within individual neoplasms. In the present study, stereologic estimates of the volume-weighted mean nuclear volume, nuclear v_V, within peripheral...... melanomas showed large interindividual variation. This finding emphasizes that unbiased estimates of nuclear v_V are robust to regional heterogeneity of nuclear volume and thus suitable for purposes of objective, quantitative malignancy grading of melanomas....
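The estimator behind point-sampled intercepts is compact: the volume-weighted mean volume is (π/3) times the mean cubed intercept length. A minimal sketch of that standard stereological formula (intercept lengths here are hypothetical measurements in µm):

```python
import math

def vol_weighted_mean_volume(intercepts_um):
    """Stereological estimator from point-sampled intercepts:
    v_V = (pi/3) * mean(l^3), l = intercept length through a sampled point."""
    return (math.pi / 3.0) * sum(l ** 3 for l in intercepts_um) / len(intercepts_um)
```

Because each intercept is sampled through a point (i.e., with probability proportional to volume), no assumption about nuclear shape or regional homogeneity is needed, which is why the estimate is robust to the heterogeneity the record reports.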

  2. Statistical characterization of a large geochemical database and effect of sample size

    Science.gov (United States)

    Zhang, C.; Manheim, F.T.; Hinde, J.; Grossman, J.N.

    2005-01-01

    The authors investigated statistical distributions for concentrations of chemical elements from the National Geochemical Survey (NGS) database of the U.S. Geological Survey. At the time of this study, the NGS data set encompasses 48,544 stream sediment and soil samples from the conterminous United States analyzed by ICP-AES following a 4-acid near-total digestion. This report includes 27 elements: Al, Ca, Fe, K, Mg, Na, P, Ti, Ba, Ce, Co, Cr, Cu, Ga, La, Li, Mn, Nb, Nd, Ni, Pb, Sc, Sr, Th, V, Y and Zn. The goal and challenge for the statistical overview was to delineate chemical distributions in a complex, heterogeneous data set spanning a large geographic range (the conterminous United States), and many different geological provinces and rock types. After declustering to create a uniform spatial sample distribution with 16,511 samples, histograms and quantile-quantile (Q-Q) plots were employed to delineate subpopulations that have coherent chemical and mineral affinities. Probability groupings are discerned by changes in slope (kinks) on the plots. Major rock-forming elements, e.g., Al, Ca, K and Na, tend to display linear segments on normal Q-Q plots. These segments can commonly be linked to petrologic or mineralogical associations. For example, linear segments on K and Na plots reflect dilution of clay minerals by quartz sand (low in K and Na). Minor and trace element relationships are best displayed on lognormal Q-Q plots. These sensitively reflect discrete relationships in subpopulations within the wide range of the data. For example, small but distinctly log-linear subpopulations for Pb, Cu, Zn and Ag are interpreted to represent ore-grade enrichment of naturally occurring minerals such as sulfides. None of the 27 chemical elements could pass the test for either normal or lognormal distribution on the declustered data set. Part of the reasons relate to the presence of mixtures of subpopulations and outliers. Random samples of the data set with successively
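The Q-Q construction used throughout the record can be built from the standard library alone; the synthetic lognormal data in the test are illustrative, not NGS values:

```python
import math
from statistics import NormalDist

def qq_points(values):
    """(theoretical standard-normal quantile, ordered datum) pairs.
    Points on a straight line indicate a normal subpopulation; changes
    of slope ('kinks') mark a change of subpopulation."""
    xs = sorted(values)
    n = len(xs)
    nd = NormalDist()
    return [(nd.inv_cdf((i + 0.5) / n), x) for i, x in enumerate(xs)]

def qq_points_log(values):
    """Q-Q against a lognormal: log-transform first, as the record does
    for minor and trace elements."""
    return qq_points([math.log(v) for v in values])
```

A perfectly lognormal sample plots as the identity line on the log-transformed Q-Q plot; mixtures of subpopulations and ore-grade outliers show up as the kinks and tail departures the record describes.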

  3. Material properties of large-volume cadmium zinc telluride crystals and their relationship to nuclear detector performance

    International Nuclear Information System (INIS)

    James, R.B.; Lund, J.; Yoon, H.

    1997-01-01

    The material showing the greatest promise today for production of large-volume gamma-ray spectrometers operable at room temperature is cadmium zinc telluride (CZT). Unfortunately, because of deficiencies in the quality of the present material, high-resolution CZT spectrometers have thus far been limited to relatively small dimensions, which makes them inefficient at detecting high photon energies and ineffective for weak radiation signals except in near proximity. To exploit CZT fully, it will be necessary to make substantial improvements in the material quality. Improving the material involves advances in the purity, crystallinity, and control of the electrical compensation mechanism. Sandia National Laboratories, California, in close collaboration with US industry and academia, has initiated efforts to develop a detailed understanding of the underlying material problems limiting the performance of large volume gamma-ray spectrometers and to overcome them through appropriate corrections. A variety of analytical and numerical techniques are employed to quantify impurities, compositional and stoichiometric variations, crystallinity, strain, bulk and surface defect states, carrier mobilities and lifetimes, electric field distributions, and contact chemistry. Data from these measurements are correlated with spatial maps of the gamma-ray and alpha particle spectroscopic response to determine improvements in the material purification, crystal growth, detector fabrication, and surface passivation procedures. The results of several analytical techniques will be discussed. The intended accomplishment of this work is to develop a low-cost, high-efficiency CZT spectrometer with an active volume of 5 cm³ and energy resolution of 1-2% (at 662 keV), which would give the US a new field capability for screening radioactive substances

  4. Ribbon scanning confocal for high-speed high-resolution volume imaging of brain.

    Directory of Open Access Journals (Sweden)

    Alan M Watson

    Whole-brain imaging is becoming a fundamental means of experimental insight; however, achieving subcellular resolution imagery in a reasonable time window has not been possible. We describe the first application of multicolor ribbon scanning confocal methods to collect high-resolution volume images of chemically cleared brains. We demonstrate that ribbon scanning collects images over ten times faster than conventional high-speed confocal systems but with equivalent spectral and spatial resolution. Further, using this technology, we reconstruct large volumes of mouse brain infected with encephalitic alphaviruses and demonstrate that regions of the brain with abundant viral replication were inaccessible to vascular perfusion. This reveals that the destruction or collapse of large regions of brain microvasculature may contribute to the severe disease caused by Venezuelan equine encephalitis virus. Visualization of this fundamental impact of infection would not be possible without sampling at subcellular resolution within large brain volumes.

  5. Analytical and Experimental Investigation of Mixing in Large Passive Containment Volumes

    International Nuclear Information System (INIS)

    Peterson, Per F.

    2002-01-01

    This final report details results from the past three years of the three-year UC Berkeley NEER investigation of mixing phenomena in large-scale passive reactor containments. We have completed all of our three-year deliverables specified in our proposal, as summarized for each deliverable in the body of this report, except for the experiments on steam condensation in the presence of noncondensable gas. We have particularly exciting results from the experiments studying mixing in a large insulated containment with a vertical cooling plate. These experiments have now shown why augmentation has been observed in wall-condensation experiments due to the momentum of the steam break-flow entering large volumes. More importantly, we have also shown that the forced-jet augmentation can be predicted using relatively simple correlations, and that it is independent of the break diameter and depends only on the break flow orientation, location, and momentum. This suggests that we will now be able to take credit for this augmentation in reactor safety analysis, improving safety margins for containment structures. We have finished version 1 of the 1-D Lagrangian flow and heat transfer code BMIX++. This version can solve many complex stratified problems: multi-component problems, multi-enclosure problems (two enclosures connected by one connection in the current version), incompressible and compressible problems, problems with multiple jets, plumes, or sinks in one enclosure, problems with wall conduction, and combinations of the above. We believe the BMIX++ code is a very powerful computational tool for studying mixing problems in stratified enclosures

  6. Final report on Phase II remedial action at the former Middlesex Sampling Plant and associated properties. Volume 2

    International Nuclear Information System (INIS)

    1985-04-01

    Volume 2 presents the radiological measurement data taken after remedial action on properties surrounding the former Middlesex Sampling Plant during Phase II of the DOE Middlesex Remedial Action Program. Also included are analyses of the confirmatory radiological survey data for each parcel with respect to the remedial action criteria established by DOE for the Phase II cleanup and a discussion of the final status of each property. Engineering details of this project and a description of the associated health physics and environmental monitoring activities are presented in Volume 1

  7. Large volume serial section tomography by Xe Plasma FIB dual beam microscopy.

    Science.gov (United States)

    Burnett, T L; Kelley, R; Winiarski, B; Contreras, L; Daly, M; Gholinia, A; Burke, M G; Withers, P J

    2016-02-01

    Ga+ Focused Ion Beam-Scanning Electron Microscopes (FIB-SEM) have revolutionised the level of microstructural information that can be recovered in 3D by block face serial section tomography (SST), as well as enabling the site-specific removal of smaller regions for subsequent transmission electron microscope (TEM) examination. However, Ga+ FIB material removal rates limit the volumes and depths that can be probed to dimensions in the tens of microns range. Emerging Xe+ Plasma Focused Ion Beam-Scanning Electron Microscope (PFIB-SEM) systems promise faster removal rates. Here we examine the potential of the method for large volume serial section tomography as applied to bainitic steel and WC-Co hard metals. Our studies demonstrate that with careful control of milling parameters precise automated serial sectioning can be achieved with low levels of milling artefacts at removal rates some 60× faster. Volumes that are hundreds of microns in dimension have been collected using fully automated SST routines in feasible timescales (<24 h) showing good grain orientation contrast and capturing microstructural features at the tens of nanometres to the tens of microns scale. Accompanying electron back scattered diffraction (EBSD) maps show high indexing rates suggesting low levels of surface damage. Further, under high current Ga+ FIB milling WC-Co is prone to amorphisation of WC surface layers and phase transformation of the Co phase, neither of which have been observed at PFIB currents as high as 60 nA at 30 kV. Xe+ PFIB dual beam microscopes promise to radically extend our capability for 3D tomography, 3D EDX, 3D EBSD as well as correlative tomography. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Evaluation of environmental sampling methods for detection of Salmonella enterica in a large animal veterinary hospital.

    Science.gov (United States)

    Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey

    2018-04-01

    Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.
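The "fair agreement" the record reports is conventionally quantified with Cohen's kappa (fair agreement corresponds to kappa of roughly 0.21-0.40 on the Landis-Koch scale). A minimal sketch; the paired per-site culture calls in the test are hypothetical, not the PUVTH data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for paired binary calls from two sampling methods
    (1 = Salmonella recovered at a site, 0 = not recovered)."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)           # agreement expected by chance
    return (po - pe) / (1 - pe)
```

Kappa corrects raw percent agreement for the agreement expected by chance, which matters when most environmental sites culture negative.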

  9. Large volume serial section tomography by Xe Plasma FIB dual beam microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Burnett, T.L. [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); FEI Company, Achtseweg Noord 5, Bldg, 5651 GG, Eindhoven (Netherlands); Kelley, R. [FEI Company, 5350 NE Dawson Creek Drive, Hillsboro, OR 97124 (United States); Winiarski, B. [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); FEI Company, Achtseweg Noord 5, Bldg, 5651 GG, Eindhoven (Netherlands); Contreras, L. [FEI Company, 5350 NE Dawson Creek Drive, Hillsboro, OR 97124 (United States); Daly, M.; Gholinia, A.; Burke, M.G. [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); Withers, P.J., E-mail: P.J.Withers@manchester.ac.uk [School of Materials, University of Manchester, Manchester M13 9PL (United Kingdom); BP International Centre for Advanced Materials, University of Manchester, Manchester M13 9PL (United Kingdom)

    2016-02-15

    Ga{sup +} Focused Ion Beam-Scanning Electron Microscopes (FIB-SEM) have revolutionised the level of microstructural information that can be recovered in 3D by block face serial section tomography (SST), as well as enabling the site-specific removal of smaller regions for subsequent transmission electron microscope (TEM) examination. However, Ga{sup +} FIB material removal rates limit the volumes and depths that can be probed to dimensions in the tens of microns range. Emerging Xe{sup +} Plasma Focused Ion Beam-Scanning Electron Microscope (PFIB-SEM) systems promise faster removal rates. Here we examine the potential of the method for large volume serial section tomography as applied to bainitic steel and WC–Co hard metals. Our studies demonstrate that with careful control of milling parameters precise automated serial sectioning can be achieved with low levels of milling artefacts at removal rates some 60× faster. Volumes that are hundreds of microns in dimension have been collected using fully automated SST routines in feasible timescales (<24 h) showing good grain orientation contrast and capturing microstructural features at the tens of nanometres to the tens of microns scale. Accompanying electron back scattered diffraction (EBSD) maps show high indexing rates suggesting low levels of surface damage. Further, under high current Ga{sup +} FIB milling WC–Co is prone to amorphisation of WC surface layers and phase transformation of the Co phase, neither of which have been observed at PFIB currents as high as 60 nA at 30 kV. Xe{sup +} PFIB dual beam microscopes promise to radically extend our capability for 3D tomography, 3D EDX, 3D EBSD as well as correlative tomography. - Highlights: • The uptake of dual beam FIBs has been rapid but long milling times have limited imaged volumes to tens of micron dimensions. • Emerging plasma Xe{sup +} PFIB-SEM technology offers materials removal rates at least 60× greater than conventional Ga{sup +} FIB systems with

  10. Volume Visualization and Compositing on Large-Scale Displays Using Handheld Touchscreen Interaction

    KAUST Repository

    Gastelum, Cristhopper Jacobo Armenta

    2011-07-27

    Advances in the physical sciences have progressively delivered ever increasing, already extremely large data sets to be analyzed. High performance volume rendering has become critical for scientists to better understand the massive amounts of data to be visualized. Cluster-based rendering systems have become the baseline for achieving the power and flexibility required to perform such a task. Furthermore, display arrays have become the most suitable solution to display these data sets at their natural size and resolution, which can be critical for human perception and evaluation. The work in this thesis aims at improving the scalability and usability of volume rendering systems that target visualization on display arrays. The first part deals with improving performance by introducing implementations of two parallel compositing algorithms for volume rendering: direct send and binary swap. The High quality Volume Rendering (HVR) framework has been extended to accommodate parallel compositing where previously only serial compositing was possible. The preliminary results show improvements in the compositing times for direct send even for a small number of processors. Unfortunately, the results of binary swap exhibit a negative behavior. This is due to the naive use of the graphics hardware blending mechanism; the expensive transfers account for the lengthy compositing times. The second part targets the development of scalable and intuitive interaction mechanisms. It introduces the development of a new client application for multitouch tablet devices, like the Apple iPad. The main goal is to provide the HVR framework, which has been extended to use tiled displays, with a more intuitive and portable interaction mechanism that can take advantage of the new environment. The previous client is a PC application for typical desktop settings that uses a mouse and keyboard as sources of interaction. The current implementation of the client lets the user steer and

  11. Characterization of large volume HPGe detectors. Part II: Experimental results

    International Nuclear Information System (INIS)

    Bruyneel, Bart; Reiter, Peter; Pascovici, Gheorghe

    2006-01-01

    Measurements on a 12-fold segmented, n-type, large volume, irregularly shaped HPGe detector were performed in order to determine the parameters of anisotropic mobility for electrons and holes as charge carriers created by γ-ray interactions. To characterize the electron mobility, the complete outer detector surface was scanned in small steps employing photopeak interactions at 60 keV. A precise measurement of the hole drift anisotropy was performed with 356 keV γ-rays. The drift velocity anisotropy and crystal geometry cause considerable rise time differences in pulse shapes depending on the position of the spatial charge carrier creation. Pulse shapes of direct and transient signals are reproduced by weighting potential calculations with high precision. The measured angular dependence of rise times is caused by the anisotropic mobility, crystal geometry, changing field strength and space charge effects. Preamplified signals were processed employing digital spectroscopy electronics. Response functions, crosstalk contributions and averaging procedures were taken into account, implying novel methods due to the segmentation of the Ge crystal and the digital signal processing electronics

  12. A simple method to recover Norovirus from fresh produce with large sample size by using histo-blood group antigen-conjugated to magnetic beads in a recirculating affinity magnetic separation system (RCAMS).

    Science.gov (United States)

    Tian, Peng; Yang, David; Mandrell, Robert

    2011-06-30

    Human norovirus (NoV) outbreaks are major food safety concerns. The virus has to be concentrated from food samples in order to be detected. PEG precipitation is the most common method to recover the virus. Recently, histo-blood group antigens (HBGA) have been recognized as receptors for human NoV, and have been utilized as an alternative means to concentrate human NoV from samples up to 40 mL in volume. However, to wash the virus off contaminated fresh food samples, a wash volume of at least 250 mL is required. A recirculating affinity magnetic separation system (RCAMS) has been tried by others to concentrate human NoV from large-volume samples, but failed to yield consistent results with the standard procedure of 30 min of recirculation at the default flow rate. Our work here demonstrates that proper recirculation time and flow rate are key factors for success in using the RCAMS. The bead recovery rate increased from 28% to 47%, 67% and 90% when recirculation times were extended from 30 min to 60 min, 120 min and 180 min, respectively. The kinetics study suggests that at least 120 min of recirculation is required to obtain a good recovery of NoV. In addition, different binding and elution conditions were compared for releasing NoV from inoculated lettuce. Phosphate-buffered saline (PBS) and water result in similar efficacy for virus release, but the released virus does not bind to the RCAMS effectively unless the pH is adjusted to acidic. Either a citrate-buffered saline (CBS) wash, or a water wash followed by CBS adjustment, resulted in an enhanced recovery of virus. We also demonstrated that the standard curve generated from viral RNA extracted from serially-diluted virus samples is more accurate for quantitative analysis than standard curves generated from serially-diluted plasmid DNA or transcribed-RNA templates, both of which tend to overestimate the concentration power. The efficacy of recovery of NoV from produce using RCAMS was directly compared with that of the
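
The recovery-versus-recirculation-time figures quoted above (28%, 47%, 67% and 90% at 30, 60, 120 and 180 min) are roughly consistent with a first-order approach to complete capture, R(t) = 1 - exp(-kt). The model choice is our assumption for illustration, not the authors'; a log-linear least-squares fit of k is a few lines:

```python
import math

# (recirculation time in min, bead recovery fraction) from the abstract
data = [(30, 0.28), (60, 0.47), (120, 0.67), (180, 0.90)]

# Linearize R(t) = 1 - exp(-k*t)  =>  -ln(1 - R) = k*t,
# then fit k by least squares through the origin.
num = sum(t * -math.log(1.0 - r) for t, r in data)
den = sum(t * t for t, _ in data)
k = num / den
print(f"fitted k = {k:.4f} per minute")
for t, r in data:
    print(f"t = {t:3d} min: observed {r:.2f}, model {1.0 - math.exp(-k * t):.2f}")
```

The fit is rough (the 120 min point sits below the curve), but it captures why 30 min of recirculation recovers so little of the beads.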

  13. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    International Nuclear Information System (INIS)

    Frothingham, David; Barker, Michelle; Buechi, Steve; Durham, Lisa

    2013-01-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume estimate. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil

  14. Reducing Contingency through Sampling at the Luckey FUSRAP Site - 13186

    Energy Technology Data Exchange (ETDEWEB)

    Frothingham, David; Barker, Michelle; Buechi, Steve [U.S. Army Corps of Engineers Buffalo District, 1776 Niagara St., Buffalo, NY 14207 (United States); Durham, Lisa [Argonne National Laboratory, Environmental Science Division, 9700 S. Cass Ave., Argonne, IL 60439 (United States)

    2013-07-01

    Typically, the greatest risk in developing accurate cost estimates for the remediation of hazardous, toxic, and radioactive waste sites is the uncertainty in the estimated volume of contaminated media requiring remediation. Efforts to address this risk in the remediation cost estimate can result in large cost contingencies that are often considered unacceptable when budgeting for site cleanups. Such was the case for the Luckey Formerly Utilized Sites Remedial Action Program (FUSRAP) site near Luckey, Ohio, which had significant uncertainty surrounding the estimated volume of site soils contaminated with radium, uranium, thorium, beryllium, and lead. Funding provided by the American Recovery and Reinvestment Act (ARRA) allowed the U.S. Army Corps of Engineers (USACE) to conduct additional environmental sampling and analysis at the Luckey Site between November 2009 and April 2010, with the objective to further delineate the horizontal and vertical extent of contaminated soils in order to reduce the uncertainty in the soil volume estimate. Investigative work included radiological, geophysical, and topographic field surveys, subsurface borings, and soil sampling. Results from the investigative sampling were used in conjunction with Argonne National Laboratory's Bayesian Approaches for Adaptive Spatial Sampling (BAASS) software to update the contaminated soil volume estimate for the site. This updated volume estimate was then used to update the project cost-to-complete estimate using the USACE Cost and Schedule Risk Analysis process, which develops cost contingencies based on project risks. An investment of $1.1 M of ARRA funds for additional investigative work resulted in a reduction of 135,000 in-situ cubic meters (177,000 in-situ cubic yards) in the estimated base volume estimate. This refinement of the estimated soil volume resulted in a $64.3 M reduction in the estimated project cost-to-complete, through a reduction in the uncertainty in the contaminated soil

  15. Thermal neutron self-shielding correction factors for large sample instrumental neutron activation analysis using the MCNP code

    International Nuclear Information System (INIS)

    Tzika, F.; Stamatelatos, I.E.

    2004-01-01

    Thermal neutron self-shielding within large samples was studied using the Monte Carlo neutron transport code MCNP. The code enabled a three-dimensional modeling of the actual source and geometry configuration including reactor core, graphite pile and sample. Neutron flux self-shielding correction factors derived for a set of materials of interest for large sample neutron activation analysis are presented and evaluated. Simulations were experimentally verified by measurements performed using activation foils. The results of this study can be applied in order to determine neutron self-shielding factors of unknown samples from the thermal neutron fluxes measured at the surface of the sample
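
The correction factors discussed above are, in essence, the ratio of the volume-averaged thermal flux inside the sample to the unperturbed flux. For a simple slab there is a classical first-order estimate, G = (1 - e^(-Σt))/(Σt); the sketch below uses that closed form purely as an illustration (the paper itself derives its factors from full MCNP transport calculations):

```python
import math

def slab_self_shielding(sigma_macroscopic_cm1, thickness_cm):
    """First-order thermal self-shielding factor for a slab sample:
    G = (1 - exp(-x)) / x with x = Sigma * t (dimensionless optical depth).
    A rough estimate only; Monte Carlo transport (e.g. MCNP) is needed
    for realistic source and sample geometries."""
    x = sigma_macroscopic_cm1 * thickness_cm
    if x == 0:
        return 1.0  # no absorption, no flux depression
    return (1.0 - math.exp(-x)) / x

# Example: a weakly absorbing sample, Sigma = 0.05 cm^-1, 4 cm thick
print(round(slab_self_shielding(0.05, 4.0), 3))  # → 0.906
```

A factor of ~0.9 means the interior flux is about 10% lower than at the surface, which is the magnitude of bias the MCNP-derived corrections remove.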

  16. Study on water boiling noises in a large volume

    International Nuclear Information System (INIS)

    Masagutov, R.F.; Krivtsov, V.A.

    1977-01-01

    Presented are the results of measurements of the noise spectra during boiling of water in a large volume at a pressure of 1 at. Boiling of distilled water was accomplished using heaters made of Kh18N10T steel, 50 mm in length, 2 mm in outside diameter, with a wall thickness of 0.1 mm. The degree of water underheating was varied during the experiments from 0 to 80 deg C, and the magnitude of the specific heat flux from 0 to 0.7-0.9 qsup(x), where qsup(x) is the specific heat flux at tube burn-out. The noise spectrum of the boiling water was analyzed at frequencies of 0.5 to 200 kHz; submersible piezoelectric pressure transducers were used for the measurements. During underheated boiling, standing waves formed which determine the structure of the measured spectra. During saturated boiling of water no standing waves were revealed. At underheating above 15-20 deg C the water boiling process is accompanied by noise within the ultrasonic frequency range. The maximum upper boundary of the noise in the experiments amounts to 90-100 kHz

  17. Measuring Blood Glucose Concentrations in Photometric Glucometers Requiring Very Small Sample Volumes.

    Science.gov (United States)

    Demitri, Nevine; Zoubir, Abdelhak M

    2017-01-01

    Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. Based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples, while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load, while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, herewith decreasing the measurement time, and, thus, improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art with sufficient accuracy according to the most recent ISO standards and reduce measurement time significantly compared to state-of-the-art methods.

  18. Radioimmunoassay of h-TSH - methodological suggestions for dealing with medium to large numbers of samples

    International Nuclear Information System (INIS)

    Mahlstedt, J.

    1977-01-01

    The article deals with practical aspects of establishing a TSH-RIA for patients, with particular regard to predetermined quality criteria. Methodological suggestions are made for medium to large numbers of samples, with the aim of reducing monotonous, precision-critical working steps by means of simple aids. The required quality criteria are well met, while the test procedure is well adapted to the rhythm of work and may be carried out without loss of precision even with large numbers of samples. (orig.) [de

  19. Highly selective solid-phase extraction and large volume injection for the robust gas chromatography-mass spectrometric analysis of TCA and TBA in wines.

    Science.gov (United States)

    Insa, S; Anticó, E; Ferreira, V

    2005-09-30

    A reliable solid-phase extraction (SPE) method for the simultaneous determination of 2,4,6-trichloroanisole (TCA) and 2,4,6-tribromoanisole (TBA) in wines has been developed. In the proposed procedure, 50 mL of wine are extracted in a 1 mL cartridge filled with 50 mg of LiChrolut EN resins. Most wine volatiles are washed off with 12.5 mL of a water:methanol solution (70%, v/v) containing 1% of NaHCO3. The analytes are then eluted with 0.6 mL of dichloromethane. A 40 microL aliquot of this extract is directly injected into a PTV injector operated in the solvent split mode, and analysed by gas chromatography (GC)-ion trap mass spectrometry using the selected ion storage mode. The solid-phase extraction, including sample volume and rinsing and elution solvents, and the large volume GC injection have been carefully evaluated and optimized. The resulting method is precise (low RSDs for both TCA and TBA), robust (the absolute recoveries of both analytes are higher than 80% and consistent wine to wine) and friendly to the GC-MS system (the extract is clean, simple and free from non-volatiles).

  20. Material properties of large-volume cadmium zinc telluride crystals and their relationship to nuclear detector performance

    Energy Technology Data Exchange (ETDEWEB)

    James, R.B.; Lund, J. [Sandia National Labs., Livermore, CA (United States); Yoon, H. [Sandia National Labs., Livermore, CA (United States)]|[Univ. of California, Los Angeles, CA (United States)] [and others

    1997-09-01

    The material showing the greatest promise today for production of large-volume gamma-ray spectrometers operable at room temperature is cadmium zinc telluride (CZT). Unfortunately, because of deficiencies in the quality of the present material, high-resolution CZT spectrometers have thus far been limited to relatively small dimensions, which makes them inefficient at detecting high photon energies and ineffective for weak radiation signals except in near proximity. To exploit CZT fully, it will be necessary to make substantial improvements in the material quality. Improving the material involves advances in the purity, crystallinity, and control of the electrical compensation mechanism. Sandia National Laboratories, California, in close collaboration with US industry and academia, has initiated efforts to develop a detailed understanding of the underlying material problems limiting the performance of large volume gamma-ray spectrometers and to overcome them through appropriate corrections therein. A variety of analytical and numerical techniques are employed to quantify impurities, compositional and stoichiometric variations, crystallinity, strain, bulk and surface defect states, carrier mobilities and lifetimes, electric field distributions, and contact chemistry. Data from these measurements are correlated with spatial maps of the gamma-ray and alpha particle spectroscopic response to determine improvements in the material purification, crystal growth, detector fabrication, and surface passivation procedures. The results of several analytical techniques will be discussed. The intended accomplishment of this work is to develop a low-cost, high-efficiency CZT spectrometer with an active volume of 5 cm{sup 3} and energy resolution of 1--2% (at 662 keV), which would give the US a new field capability for screening radioactive substances.

  1. Elemental mapping of large samples by external ion beam analysis with sub-millimeter resolution and its applications

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Added, N.; Rizzutto, M. A.; Tabacniks, M. H.; Mangiarotti, A.; Curado, J. F.; Aguirre, F. R.; Aguero, N. F.; Allegro, P. R. P.; Campos, P. H. O. V.; Restrepo, J. M.; Trindade, G. F.; Antonio, M. R.; Assis, R. F.; Leite, A. R.

    2018-05-01

    The elemental mapping of large areas using ion beam techniques is a desired capability for several scientific communities, involved in topics ranging from geoscience to cultural heritage. Usually, the constraints for large-area mapping are not met in the micro- and nano-probe setups implemented all over the world. A novel setup for mapping large sized samples with an external beam was recently built at the University of São Paulo, employing a broad MeV-proton probe of sub-millimeter dimension coupled to a high-precision, large-range XYZ robotic stage (60 cm range on all axes and precision of 5 μm ensured by optical sensors). An important issue in large-area mapping is how to deal with the irregularities of the sample's surface, which may introduce artifacts in the images due to the variation of the measuring conditions. In our setup, we implemented an automatic system based on machine vision to correct the position of the sample to compensate for its surface irregularities. As an additional benefit, a 3D digital reconstruction of the scanned surface can also be obtained. Using this new and unique setup, we have produced large-area elemental maps of ceramics, stones, fossils, and other sorts of samples.

  2. Reduction of Powerplex(®) Y23 reaction volume for genotyping buccal cell samples on FTA(TM) cards.

    Science.gov (United States)

    Raziel, Aliza; Dell'Ariccia-Carmon, Aviva; Zamir, Ashira

    2015-01-01

    PowerPlex(®) Y23 is a novel kit for Y-STR typing that includes new highly discriminating loci. The Israel DNA Database laboratory has recently adopted it for routine Y-STR analysis. This study examined PCR amplification from 1.2-mm FTA punch in reduced volumes of 5 and 10 μL. Direct amplification and washing of the FTA punches were examined in different PCR cycle numbers. One short robotically performed wash was found to improve the quality and the percent of profiles obtained. The optimal PCR cycle number was determined for 5 and 10 μL reaction volumes. The percent of obtained profiles, color balance, and reproducibility were examined. High-quality profiles were achieved in 90% and 88% of the samples amplified in 5 and 10 μL, respectively, in the first attempt. Volume reduction to 5 μL has a vast economic impact especially for DNA database laboratories. © 2014 American Academy of Forensic Sciences.

  3. Software engineering the mixed model for genome-wide association studies on large samples.

    Science.gov (United States)

    Zhang, Zhiwu; Buckler, Edward S; Casstevens, Terry M; Bradbury, Peter J

    2009-11-01

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample size and number of markers used for GWAS is increasing dramatically, resulting in greater statistical power to detect those associations. The use of mixed models with increasingly large data sets depends on the availability of software for analyzing those models. While multiple software packages implement the mixed model method, no single package provides the best combination of fast computation, ability to handle large samples, flexible modeling and ease of use. Key elements of association analysis with mixed models are reviewed, including modeling phenotype-genotype associations using mixed models, population stratification, kinship and its estimation, variance component estimation, use of best linear unbiased predictors or residuals in place of raw phenotype, improving efficiency and software-user interaction. The available software packages are evaluated, and suggestions made for future software development.
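
The mixed model at the core of the software packages being reviewed is conventionally written as follows (a standard GWAS formulation stated here for context, not quoted from the paper):

```latex
y = X\beta + Zu + e, \qquad u \sim N\!\left(0,\ \sigma_g^{2} K\right), \qquad e \sim N\!\left(0,\ \sigma_e^{2} I\right)
```

where y is the phenotype vector, X the fixed-effect design matrix (the tested marker plus population-structure covariates), Z the incidence matrix for the random polygenic effects u, K the kinship matrix, and σg², σe² the genetic and residual variance components whose estimation dominates the computational cost on large samples.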

  4. New methods to interpolate large volume of data from points or particles (Mesh-Free) methods application for its scientific visualization

    International Nuclear Information System (INIS)

    Reyes Lopez, Y.; Yervilla Herrera, H.; Viamontes Esquivel, A.; Recarey Morfa, C. A.

    2009-01-01

    In the following paper we develop a new method to interpolate large volumes of scattered data, focused mainly on the results of applying Mesh-free, Point and Particle Methods. We use local radial basis functions as the interpolating functions, and an octree as the data structure that accelerates locating the data that influence the interpolated value at a new point. This speeds up the application of scientific visualization techniques that generate images from the large data volumes produced when Mesh-free, Point and Particle Methods are used to solve diverse physical-mathematical models. As an example, the results obtained after applying this method with Shepard's local interpolation functions are shown. (Author) 22 refs
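
Shepard's local interpolation mentioned above weights each scattered data point by an inverse power of its distance from the query point, restricted to a local neighbourhood. A minimal 2D sketch (a brute-force neighbour search stands in here for the tree acceleration structure described in the abstract):

```python
def shepard_interpolate(points, values, query, radius=1.0, power=2):
    """Local Shepard (inverse-distance-weighted) interpolation in 2D.
    Only data points within `radius` of the query contribute."""
    num = den = 0.0
    for (x, y), v in zip(points, values):
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0.0:
            return v                      # query coincides with a data point
        if d2 <= radius * radius:
            w = 1.0 / d2 ** (power / 2)   # weight = 1 / distance^power
            num += w * v
            den += w
    return num / den if den else None     # None: no data within range

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
vals = [0.0, 1.0, 1.0, 2.0]
print(shepard_interpolate(pts, vals, (0.5, 0.5), radius=2.0))  # → 1.0
```

In a production setting the loop over all points is replaced by a spatial-tree range query, which is exactly the acceleration the paper's data structure provides.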

  5. APS 6BM-B Large Volume High Pressure Beamline: A Workhorse for Rock and Mineral Physics

    Science.gov (United States)

    Chen, H.; Whitaker, M. L.; Baldwin, K. J.; Huebsch, W. R.; Vaughan, M. T.; Weidner, D. J.

    2017-12-01

    Inheriting decades of technical innovations from the NSLS X17B2 beamline, the APS 6BM-B beamline was established in 2015 as a dedicated beamline for synchrotron-based large volume high pressure research in the earth sciences, especially rock and mineral physics. Currently a 250-ton hydraulic press equipped with a D-DIA module is installed, and a Rotational Drickamer Apparatus from Yale University is hosted every cycle, covering a pressure range from crust to lower mantle. 6BM-B operates in white beam mode with an effective energy range of 20-100 keV. Energy dispersive X-ray diffraction data are collected using a 10-element solid state Ge array detector arranged in a circular geometry to allow for the real-time assessment of stress. Direct radiographic imaging using a Prosillica CCD camera and scintillating YAG crystals yields sample strain and strain rate. In addition to applications in phase transitions, equation of state measurements, and sound velocity measurements, this setup is ideal for studies of steady state and dynamic deformation processes. In this presentation, technical features and strengths of 6BM-B will be discussed. Most recent progress and science highlights of our user community will be showcased.

  6. Rapid separation method for {sup 237}Np and Pu isotopes in large soil samples

    Energy Technology Data Exchange (ETDEWEB)

    Maxwell, Sherrod L., E-mail: sherrod.maxwell@srs.go [Savannah River Nuclear Solutions, LLC, Building 735-B, Aiken, SC 29808 (United States); Culligan, Brian K.; Noyes, Gary W. [Savannah River Nuclear Solutions, LLC, Building 735-B, Aiken, SC 29808 (United States)

    2011-07-15

    A new rapid method for the determination of {sup 237}Np and Pu isotopes in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for large soil samples. The new soil method utilizes an acid leaching method, iron/titanium hydroxide precipitation, a lanthanum fluoride soil matrix removal step, and a rapid column separation process with TEVA Resin. The large soil matrix is removed easily and rapidly using these two simple precipitations with high chemical recoveries and effective removal of interferences. Vacuum box technology and rapid flow rates are used to reduce analytical time.

  7. The effect of duration of illness and antipsychotics on subcortical volumes in schizophrenia: Analysis of 778 subjects

    Directory of Open Access Journals (Sweden)

    Naoki Hashimoto

    2018-01-01

    Discussion: A large sample size, a uniform data collection methodology and robust statistical analysis are strengths of the current study. The result suggests that special attention is needed when discussing the relationship between subcortical regional brain volumes and the pathophysiology of schizophrenia, because regional brain volumes may be affected by antipsychotic medication.

  8. Oncoplastic volume replacement with latissimus dorsi myocutaneous flap in patients with large ptotic breasts. Is it feasible?

    International Nuclear Information System (INIS)

    El-Marakby, H.H.; Kotb, M.H.

    2011-01-01

    Oncoplastic breast conservative surgery has evolved as a safe alternative to standard mastectomy in the treatment of early breast cancer. The procedure involves tumour resection with an adequate safety margin and either breast reshaping with volume displacement procedures (large or ptotic breasts) or volume replacement with a latissimus dorsi myocutaneous flap (LDF) (small to medium sized non-ptotic breasts). A contralateral mastopexy procedure is usually necessary with volume displacement oncoplastic surgery, a procedure that is often rejected by a significant number of patients. This limits the choice of reconstruction of breast defects in such patients to autologous tissue, i.e. the LDF. Aim: To evaluate the feasibility of volume replacement oncoplastic breast conservative surgery with latissimus dorsi myocutaneous flaps for patients with large ptotic breasts. This involves testing the oncologic safety in terms of an adequate safety margin, the complication rate and the final cosmetic outcome. The locoregional recurrence rate will be recorded and compared with oncoplastic volume displacement for similar-sized breast defects. Patients and methods: A group of 50 female patients with early breast cancer (T2) who presented to the department of surgery at the National Cancer Institute, Cairo, Egypt. Annual follow-up was carried out in all patient groups. All patients were diagnosed with T2 N0 breast cancer by both clinical and radiological examination, and all underwent partial mastectomy and reconstruction with LDFs. Results: The average age at presentation was 46.5 ± 9 years, with a range of 26-65 years. Partial mastectomy was performed in 30 patients (60%); excision of a single quadrant of the four major quadrants was carried out in 15 patients (30%), while skin-sparing wide local excision was carried out in only five patients (10%).
    The safety margin ranged from

  9. Pore water sampling in acid sulfate soils: a new peeper method.

    Science.gov (United States)

    Johnston, Scott G; Burton, Edward D; Keene, Annabelle F; Bush, Richard T; Sullivan, Leigh A; Isaacson, Lloyd

    2009-01-01

    This study describes the design, deployment, and application of a modified equilibration dialysis device (peeper) optimized for sampling pore waters in acid sulfate soils (ASS). The modified design overcomes the limitations of traditional-style peepers when sampling firm ASS materials over relatively large depth intervals. The new peeper device uses removable, individual cells of 25 mL volume housed in a 1.5 m long rigid, high-density polyethylene rod. The rigid housing structure allows the device to be inserted directly into relatively firm soils without requiring a supporting frame. The use of removable cells eliminates the need for a large glove-box after peeper retrieval, thus simplifying physical handling. Removable cells are easily maintained in an inert atmosphere during sample processing, and the 25-mL sample volume is sufficient for undertaking multiple analyses. A field evaluation of equilibration times indicates that 32 to 38 d of deployment was necessary. Overall, the modified method is simple and effective and well suited to the acquisition and processing of redox-sensitive pore water profiles >1 m deep in acid sulfate soils or any other firm wetland soils.

  10. Matrix Sampling of Items in Large-Scale Assessments

    Directory of Open Access Journals (Sweden)

    Ruth A. Childs

    2003-07-01

    Matrix sampling of items, that is, division of a set of items into different versions of a test form, is used by several large-scale testing programs. Like other test designs, matrixed designs have both advantages and disadvantages. For example, testing time per student is less than if each student received all the items, but the comparability of student scores may decrease. Also, curriculum coverage is maintained, but reporting of scores becomes more complex. In this paper, matrixed designs are compared with more traditional designs in nine categories of costs: development costs, materials costs, administration costs, educational costs, scoring costs, reliability costs, comparability costs, validity costs, and reporting costs. In choosing among test designs, a testing program should examine the costs in light of its mandate(s), the content of the tests, and the financial resources available, among other considerations.
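The division of an item pool into disjoint test forms that this record describes can be sketched in a few lines. This is a minimal illustration only: the shuffle-and-deal scheme, the form count, and the item labels are assumptions, not the actual designs of any testing program.

```python
import random

def matrix_sample(items, n_forms, seed=0):
    """Divide an item pool into disjoint test forms: shuffle once, then
    deal the items round-robin, so each student answers only
    len(items) / n_forms items while the whole pool stays covered."""
    pool = list(items)
    random.Random(seed).shuffle(pool)
    return [pool[i::n_forms] for i in range(n_forms)]

items = [f"item{i:02d}" for i in range(30)]
forms = matrix_sample(items, n_forms=3)  # 3 forms of 10 items each
```

Because every item lands in exactly one form, curriculum coverage is preserved even though per-student testing time drops, which is exactly the trade-off the paper prices out.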

  11. A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.

    Science.gov (United States)

    Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R

    2017-07-01

    The conventional radio-analytical technique used for estimation of Pu-isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly in repeat and incidental/emergency situations, so there is a need to reduce the analysis time for the estimation of Pu-isotopes in bioassay samples. This paper gives the details of standardization of a rapid method for estimation of Pu-isotopes in urine samples using a multi-purpose centrifuge and TEVA resin followed by alpha spectrometry. The rapid method involves oxidation of urine samples, co-precipitation of plutonium along with calcium phosphate, sample preparation using a high volume centrifuge, and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and activity estimated by alpha spectrometry using 236Pu tracer recovery. Ten routine urine samples of radiation workers were analyzed and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean of 64.4% and standard deviation of 11.3%. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day). Copyright © 2017 Elsevier Ltd. All rights reserved.
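The tracer-recovery arithmetic behind such a measurement can be illustrated with a short sketch. The function name and all numbers below are hypothetical; the ratio method itself is standard isotope-dilution alpha spectrometry, where the added 236Pu tracer suffers the same chemical losses as the analyte.

```python
def pu_activity(counts_pu, counts_tracer, tracer_activity_bq,
                count_time_s, detector_efficiency):
    """Isotope-dilution alpha spectrometry: the 236Pu tracer is carried
    through the same chemistry as the analyte, so sample activity follows
    from the peak-count ratio; the tracer counts also give the
    radiochemical recovery."""
    activity_bq = (counts_pu / counts_tracer) * tracer_activity_bq
    recovery = counts_tracer / (tracer_activity_bq * count_time_s
                                * detector_efficiency)
    return activity_bq, recovery

# Hypothetical spectrum: 500 net counts in the analyte peak, 1000 counts
# in the 0.05 Bq 236Pu tracer peak, counted for 1e5 s at 25% efficiency.
activity, recovery = pu_activity(500, 1000, 0.05, 100_000, 0.25)
```

With these invented numbers the recovery works out to 80%, within the 47-88% range the abstract reports for real samples.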

  12. Random sampling of elementary flux modes in large-scale metabolic networks.

    Science.gov (United States)

    Machado, Daniel; Soons, Zita; Patil, Kiran Raosaheb; Ferreira, Eugénio C; Rocha, Isabel

    2012-09-15

    The description of a metabolic network in terms of elementary (flux) modes (EMs) provides an important framework for metabolic pathway analysis. However, their application to large networks has been hampered by the combinatorial explosion in the number of modes. In this work, we develop a method for generating random samples of EMs without computing the whole set. Our algorithm is an adaptation of the canonical basis approach, where we add an additional filtering step which, at each iteration, selects a random subset of the new combinations of modes. In order to obtain an unbiased sample, all candidates are assigned the same probability of getting selected. This approach avoids the exponential growth of the number of modes during computation, thus generating a random sample of the complete set of EMs within reasonable time. We generated samples of different sizes for a metabolic network of Escherichia coli, and observed that they preserve several properties of the full EM set. It is also shown that EM sampling can be used for rational strain design. A well distributed sample, that is representative of the complete set of EMs, should be suitable to most EM-based methods for analysis and optimization of metabolic networks. Source code for a cross-platform implementation in Python is freely available at http://code.google.com/p/emsampler. dmachado@deb.uminho.pt Supplementary data are available at Bioinformatics online.
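The unbiased filtering step described in the abstract, selecting a random subset of the new candidate modes with the same selection probability for each, can be sketched as follows. This is a simplified stand-in for the authors' implementation; the names and sizes are illustrative.

```python
import random

def filter_step(candidates, max_keep, rng):
    """Filtering step of the EM sampler: when one iteration of the
    canonical-basis approach produces more candidate mode combinations
    than we carry forward, keep a uniform random subset so every
    candidate has the same probability of being selected (no bias)."""
    if len(candidates) <= max_keep:
        return list(candidates)
    return rng.sample(list(candidates), max_keep)

rng = random.Random(1)
candidates = [f"em{i}" for i in range(10_000)]  # placeholder mode labels
kept = filter_step(candidates, max_keep=500, rng=rng)
```

Capping the working set at each iteration is what prevents the combinatorial explosion the abstract mentions, at the cost of returning a sample rather than the complete EM set.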

  13. Enantioselective column coupled electrophoresis employing large bore capillaries hyphenated with tandem mass spectrometry for ultra-trace determination of chiral compounds in complex real samples.

    Science.gov (United States)

    Piešťanský, Juraj; Maráková, Katarína; Kovaľ, Marián; Havránek, Emil; Mikuš, Peter

    2015-12-01

    A new multidimensional analytical approach for the ultra-trace determination of target chiral compounds in unpretreated complex real samples was developed in this work. The proposed analytical system provided high orthogonality due to the on-line combination of three different methods (separation mechanisms), i.e. (1) isotachophoresis (ITP), (2) chiral capillary zone electrophoresis (chiral CZE), and (3) triple quadrupole mass spectrometry (QqQ MS). The ITP step, performed in a large bore capillary (800 μm), was utilized for effective sample pretreatment (preconcentration and matrix clean-up) with a large injection volume (1-10 μL), enabling limits of detection as low as ca. 80 pg/mL for the target enantiomers in urine matrices. In the chiral CZE step, different chiral selectors (neutral, ionizable, and permanently charged cyclodextrins) and buffer systems were tested in terms of enantioselectivity and influence on the MS detection response. The performance parameters of the optimized ITP - chiral CZE-QqQ MS method were evaluated according to the FDA guidance for bioanalytical method validation. Successful validation and application (enantioselective monitoring of renally eliminated pheniramine and its metabolite in human urine) highlighted the great potential of this chiral approach in advanced enantioselective biomedical applications. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Improved tolerance of abdominal large-volume radiotherapy due to ornithine aspartate

    International Nuclear Information System (INIS)

    Kuttig, H.

    1983-01-01

    The influence of ornithine aspartate in supporting hepatic function was investigated in a group of 47 patients with tumour dissemination in the pelvic and abdominal region, randomised on the basis of the progress of the serum enzymes GOT, GPT, LAD, LDH, LAP and alkaline phosphatase during and following completion of a course of large-volume radiotherapy. Adjuvant therapy with ornithine aspartate resulted in reduced enzyme movement with an earlier tendency to normalisation. The results, which are statistically supported, clearly show an improvement in hepatic function through detoxication of toxic degradation products of radiotherapy, with reduced impairment of the body's own defence mechanisms. Subjectively too, the course of treatment with ornithine aspartate showed a reduced rate of side effects as regards lassitude and impairment of the patient's general well-being, compared with the group of patients to whom ornithine aspartate was not simultaneously administered. (orig.) [de]

  15. Development of metallic molds for the large volume plastic scintillator fabrication

    International Nuclear Information System (INIS)

    Calvo, Wilson A.P.; Vieira, Jose M.; Rela, Paulo R.; Bruzinga, Wilson A.; Araujo, Eduardo P.; Costa Junior, Nelson P.; Hamada, Margarida M.

    1997-01-01

    Plastic scintillators are radiation detectors made of organic fluorescent compounds dissolved in a solidified polymer matrix. The manufacturing process of large volume detectors (55 liters) at low cost, by polymerization of the styrene monomer plus the PPO and POPOP scintillators, was studied in this paper. Since the polymerization reaction is very exothermic, metallic molds of ASTM 1200 aluminum and AISI 304 stainless steel were produced by the TIG welding process. Measurements of transmittance, luminescence, X-ray fluorescence and light output were carried out on the plastic scintillators made using the different metallic molds. The characterization results of the detectors produced in an open system using the ASTM 1200 aluminum mold show no quality change in the scintillator, even though aluminum is considered unstable in contact with styrene monomer. Therefore, ASTM 1200 aluminum was found to be the best alternative for producing the detector by open-system polymerization. (author). 11 refs., 8 figs., 1 tab

  16. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
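Digital normalization, one of the two pre-assembly filters applied here, can be sketched in a few lines: a read is kept only while the median abundance of its k-mers is still below a coverage cutoff, so redundant reads are discarded before assembly. This is a toy version with tiny k and cutoff values chosen purely for illustration; real implementations use memory-efficient probabilistic k-mer counting rather than an exact dictionary.

```python
from collections import defaultdict
from statistics import median

def digital_normalize(reads, k=4, cutoff=3):
    """Streaming digital normalization: keep a read only if the median
    abundance of its k-mers (counted over reads kept so far) is below
    the cutoff; highly redundant reads are dropped before assembly."""
    counts = defaultdict(int)
    kept = []
    for read in reads:
        kmers = [read[i:i + k] for i in range(len(read) - k + 1)]
        if median(counts[km] for km in kmers) < cutoff:
            kept.append(read)
            for km in kmers:
                counts[km] += 1
    return kept

reads = ["ACGTACGT"] * 10 + ["TTTTGGGG"]   # 10 redundant reads + 1 novel
kept = digital_normalize(reads, k=4, cutoff=3)
```

The ten identical reads collapse to the first few copies while the novel read survives, which is why the downstream assembly can stay nearly identical to one built from the unprocessed data.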

  17. Study of a large rapid ashing apparatus and a rapid dry ashing method for biological samples and its application

    International Nuclear Information System (INIS)

    Jin Meisun; Wang Benli; Liu Wencang

    1988-04-01

    A large rapid dry-ashing apparatus and a rapid ashing method for biological samples are described. The apparatus consists of a specially made ashing furnace, a gas supply system and a temperature-programming control cabinet. Ashing experiments with the apparatus showed the following advantages: (1) high ashing speed and savings in electric energy; (2) the apparatus can ash a large amount of samples at a time; (3) the ashed sample is pure white (spotless), loose and easily soluble, with little residual char; (4) fresh samples can also be ashed directly. The apparatus is suitable for ashing large amounts of environmental samples containing trace elements of low-level radioactivity, as well as medical, food and agricultural research samples

  18. A fast learning method for large scale and multi-class samples of SVM

    Science.gov (United States)

    Fan, Yu; Guo, Huiming

    2017-06-01

    A fast learning method for multi-class classification SVMs (Support Vector Machines), based on a binary tree, is presented to address the low learning efficiency of SVMs when processing large-scale multi-class samples. A bottom-up method is adopted to set up the binary-tree hierarchy; according to the achieved hierarchy, a sub-classifier learns from the corresponding samples of each node. During learning, several class clusters are generated after a first clustering of the training samples. Central points are extracted from those class clusters that contain only one type of sample. For those containing two types of samples, the cluster numbers of their positive and negative samples are set according to their degree of mixture, and a secondary clustering is undertaken, after which central points are extracted from the resulting sub-class clusters. Sub-classifiers are obtained by learning from the reduced sample set formed by the integration of the extracted central points. Simulation experiments show that this fast learning method, based on multi-level clustering, can guarantee higher classification accuracy, greatly reduce sample numbers and effectively improve learning efficiency.
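The sample-reduction idea, replacing each cluster by its central point before training, can be sketched as follows. This is illustrative only: the cluster assignments are given directly here, whereas the paper obtains them by multi-level clustering, and the SVM training step on the reduced set is omitted.

```python
from statistics import mean

def centroid(points):
    """Central point of a cluster: the coordinate-wise mean."""
    return tuple(mean(axis) for axis in zip(*points))

def reduce_training_set(clusters):
    """Replace each (label, cluster-of-samples) pair by a single labelled
    central point, shrinking the set each sub-classifier learns from."""
    return [(label, centroid(points)) for label, points in clusters]

clusters = [
    ("A", [(0.0, 0.1), (0.2, -0.1), (0.1, 0.0)]),
    ("B", [(5.0, 5.2), (4.8, 5.0)]),
]
reduced = reduce_training_set(clusters)  # 5 samples collapse to 2 points
```

Five training samples become two representative points; at realistic scale this reduction is what "greatly reduces sample numbers" before the per-node SVMs are fitted.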

  19. Major risk from rapid, large-volume landslides in Europe (EU Project RUNOUT)

    Science.gov (United States)

    Kilburn, Christopher R. J.; Pasuto, Alessandro

    2003-08-01

    Project RUNOUT has investigated methods for reducing the risk from large-volume landslides in Europe, especially those involving rapid rates of emplacement. Using field data from five test sites (Bad Goisern and Köfels in Austria, Tessina and Vajont in Italy, and the Barranco de Tirajana in Gran Canaria, Spain), the studies have developed (1) techniques for applying geomorphological investigations and optical remote sensing to map landslides and their evolution; (2) analytical, numerical, and cellular automata models for the emplacement of sturzstroms and debris flows; (3) a brittle-failure model for forecasting catastrophic slope failure; (4) new strategies for integrating large-area Global Positioning System (GPS) arrays with local geodetic monitoring networks; (5) methods for raising public awareness of landslide hazards; and (6) Geographic Information System (GIS)-based databases for the test areas. The results highlight the importance of multidisciplinary studies of landslide hazards, combining subjects as diverse as geology and geomorphology, remote sensing, geodesy, fluid dynamics, and social profiling. They have also identified key goals for an improved understanding of the physical processes that govern landslide collapse and runout, as well as for designing strategies for raising public awareness of landslide hazards and for implementing appropriate land management policies for reducing landslide risk.

  20. Plasma volume changes during hypoglycaemia: the effect of arterial blood sampling

    DEFF Research Database (Denmark)

    Hilsted, J; Bendtsen, Flemming; Christensen, N J

    1990-01-01

    To investigate whether previously reported changes in venous blood volume and composition induced by acute hypoglycaemia in humans are representative for the entire body, we measured erythrocyte 51Cr content, haematocrit, plasma volume, intravascular albumin content and transcapillary escape rate of albumin in arterial and venous blood in seven healthy subjects before and during insulin-induced hypoglycaemia. In both vascular sites, blood 51Cr content and the haematocrit increased, plasma volume and intravascular albumin content decreased, and the transcapillary escape rate of albumin increased during hypoglycaemia. The magnitude of the changes in arterial and venous blood were not significantly different. These results indicate that the above changes in blood volume and composition are whole-body phenomena; furthermore, the major part of the changes are likely to occur in tissues other than the upper extremity.

  1. A Monte-Carlo code for neutron efficiency calculations for large volume Gd-loaded liquid scintillation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Trzcinski, A.; Zwieglinski, B. [Soltan Inst. for Nuclear Studies, Warsaw (Poland); Lynen, U. [Gesellschaft fuer Schwerionenforschung mbH, Darmstadt (Germany); Pochodzalla, J. [Max-Planck-Institut fuer Kernphysik, Heidelberg (Germany)

    1998-10-01

    This paper reports on a Monte-Carlo program, MSX, developed to evaluate the performance of large-volume, Gd-loaded liquid scintillation detectors used in neutron multiplicity measurements. The results of simulations are presented for the detector intended to count neutrons emitted by the excited target residue in coincidence with the charged products of the projectile fragmentation following relativistic heavy-ion collisions. The latter products could be detected with the ALADIN magnetic spectrometer at GSI-Darmstadt. (orig.) 61 refs.

  2. Development of digital gamma-activation autoradiography for analysis of samples of large area

    International Nuclear Information System (INIS)

    Kolotov, V.P.; Grozdov, D.S.; Dogadkin, N.N.; Korobkov, V.I.

    2011-01-01

    Gamma-activation autoradiography is a prospective method for screening detection of inclusions of precious metals in geochemical samples. Its characteristics allow analysis of thin sections of large size (tens of cm2), which favourably distinguishes it among the other methods for local analysis. At the same time, the activating field of the accelerator bremsstrahlung displays a sharp intensity decrease with distance along the axis. A method for activation dose "equalization" during irradiation of large-size thin sections has been developed. The method is based on the use of a hardware-software system comprising a device for moving the sample during irradiation, a program for computer modelling of the acquired activation dose for the chosen kinematics of the sample movement, and a program for pixel-by-pixel correction of the autoradiographic images. For the detection of inclusions of precious metals, a method for analysis of the acquired dose dynamics during sample decay has been developed. The method is based on software that processes, pixel by pixel, a time series of coaxial autoradiographic images and generates secondary meta-images allowing interpretation regarding the presence of interesting inclusions on the basis of half-lives. The method has been tested in the analysis of copper-nickel polymetallic ores. The developed solutions considerably expand the possible applications of digital gamma-activation autoradiography. (orig.)
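The half-life interpretation applied pixel by pixel can be illustrated with a log-linear decay fit on a single pixel's time series. This is a hedged sketch: the function name and numbers are invented, and the real system fits every pixel of registered (coaxial) image stacks rather than one series.

```python
import math

def half_life(times_h, counts):
    """Least-squares fit of ln(counts) = ln(c0) - lam * t for one pixel's
    decay series; returns the half-life ln(2)/lam in the same time unit."""
    ys = [math.log(c) for c in counts]
    n = len(times_h)
    mx = sum(times_h) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times_h, ys))
             / sum((x - mx) ** 2 for x in times_h))
    return math.log(2) / -slope

# Synthetic pixel decaying with a 10 h half-life.
times = [0.0, 2.0, 4.0, 6.0, 8.0]
counts = [1000.0 * math.exp(-math.log(2) / 10.0 * t) for t in times]
t_half = half_life(times, counts)
```

Mapping each pixel's fitted half-life against the known half-lives of activation products is what lets the meta-images flag inclusions of interest.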

  4. Tumor Volume Decrease via Feeder Occlusion for Treating a Large, Firm Trigone Meningioma.

    Science.gov (United States)

    Nakashima, Takuma; Hatano, Norikazu; Kanamori, Fumiaki; Muraoka, Shinsuke; Kawabata, Teppei; Takasu, Syuntaro; Watanabe, Tadashi; Kojima, Takao; Nagatani, Tetsuya; Seki, Yukio

    2018-01-01

    Trigone meningiomas are considered a surgical challenge, as they tend to be considerably large and hypervascularized at the time of presentation. We experienced a case of a large and very hard trigone meningioma that was effectively treated using initial microsurgical feeder occlusion followed by surgery in stages. A 19-year-old woman who presented with loss of consciousness was referred to our hospital for surgical treatment of a brain tumor. Radiological findings were compatible with a left ventricular trigone meningioma extending laterally in proximity to the Sylvian fissure. At initial surgery using the transsylvian approach, main feeders originating from the anterior and lateral posterior choroidal arteries were occluded at the inferior horn; however, only a small section of the tumor could initially be removed because of its firmness. Over time, feeder occlusion resulted in tumor necrosis and a 20% decrease in its diameter; the mass effect was alleviated within 1 year. The residual meningioma was then totally excised in staged surgical procedures after resection became more feasible owing to ischemia-induced partial softening of the tumor. When a trigone meningioma is large and very hard, initial microsurgical feeder occlusion in the inferior horn can be a safe and effective option, and can lead to necrosis, volume decrease, and partial softening of the residual tumor to allow for its staged surgical excision.

  5. Sample-path large deviations in credit risk

    NARCIS (Netherlands)

    Leijdekker, V.J.G.; Mandjes, M.R.H.; Spreij, P.J.C.

    2011-01-01

    The event of large losses plays an important role in credit risk. As these large losses are typically rare, and portfolios usually consist of a large number of positions, large deviation theory is the natural tool to analyze the tail asymptotics of the probabilities involved. We first derive a

  6. Relationship of fish indices with sampling effort and land use change in a large Mediterranean river.

    Science.gov (United States)

    Almeida, David; Alcaraz-Hernández, Juan Diego; Merciai, Roberto; Benejam, Lluís; García-Berthou, Emili

    2017-12-15

    Fish are invaluable ecological indicators in freshwater ecosystems but have been less used for ecological assessments in large Mediterranean rivers. We evaluated the effects of sampling effort (transect length) on fish metrics, such as species richness and two fish indices (the new European Fish Index EFI+ and a regional index, IBICAT2b), in the mainstem of a large Mediterranean river. For this purpose, we sampled by boat electrofishing five sites, each with 10 consecutive transects corresponding to a total length of 20 times the river width (the European standard required by the Water Framework Directive), and we also analysed the effect of sampling area on previous surveys. Species accumulation curves and richness extrapolation estimates in general suggested that species richness was reasonably estimated with transect lengths of 10 times the river width or less. The EFI+ index was significantly affected by sampling area, both for our samplings and previous data. Surprisingly, EFI+ values in general decreased with increasing sampling area, despite the higher observed richness, likely because the expected values of the metrics were higher. By contrast, the regional fish index was not dependent on sampling area, likely because it does not use a predictive model. Both fish indices, but particularly the EFI+, decreased with decreasing forest cover, even within the smaller disturbance gradient in the river type studied (mainstem of a large Mediterranean river, where environmental pressures are more general). Although the two fish-based indices are very different in terms of their development, methodology, and metrics used, they were significantly correlated and provided a similar assessment of ecological status. Our results reinforce the importance of standardization of sampling methods for bioassessment and suggest that predictive models that use sampling area as a predictor might be more affected by differences in sampling effort than simpler biotic indices.
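The species accumulation curve used above to judge whether sampling effort suffices can be computed in a few lines. The data are invented; real analyses typically also randomize transect order and add richness extrapolation, which are omitted here.

```python
def accumulation_curve(transect_species):
    """Cumulative species richness after 1..n consecutive transects:
    the curve flattens once extra effort adds no new species."""
    seen, curve = set(), []
    for species in transect_species:
        seen.update(species)
        curve.append(len(seen))
    return curve

# Hypothetical species sets from four consecutive transects at one site.
transects = [{"barbel", "chub"}, {"chub", "bleak"}, {"bleak"}, {"barbel"}]
curve = accumulation_curve(transects)
```

A plateau early in the curve is the signal that shorter transect lengths (here, after two of four transects) already capture the site's richness.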

  7. Development of a spatially uniform fast ionization wave in a large-volume discharge

    International Nuclear Information System (INIS)

    Zatsepin, D.V.; Starikovskaya, S.M.; Starikovskii, A.Yu.

    1998-01-01

    A study is made of a high-voltage nanosecond breakdown in the form of a fast ionization wave produced in a large-volume (40 l) discharge chamber. The propagation speed of the wave front and the integral energy deposition in the plasma are measured for various regimes of the air discharge at pressures of 10⁻² to 4 Torr. A high degree of both spatial uniformity of the discharge and reproducibility of the discharge parameters is obtained. The possibility of the development of a fast ionization wave in an electrodeless system is demonstrated. A transition of the breakdown occurring in the form of a fast ionization wave into streamer breakdown is observed. It is shown that such discharges are promising for technological applications

  8. Characteristics of the NE-213 large-volume neutron counters for muon catalyzed fusion investigation

    International Nuclear Information System (INIS)

    Bystritsky, V.M.; Wozniak, J.; Zinov, V.G.

    1984-01-01

    The Monte-Carlo method was used to establish the properties and feasibility of a large-volume NE-213 scintillator as an efficient neutron detector. The recoil-proton spectra and calculated efficiencies for different detection thresholds and scintillator sizes are presented for neutron energies up to 15 MeV. The time characteristics, e.g. time resolution, are discussed. It is also shown that light attenuation by the scintillator itself has no strong influence on the calculated efficiencies when the gamma-calibration technique is used. A detector volume of approximately 100 l is suggested for application in investigations of μ-atomic and μ-molecular processes
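How detector size and detection threshold enter a Monte-Carlo efficiency estimate can be conveyed with a heavily simplified toy model. This is not the actual NE-213 physics: real codes transport neutrons through elastic scattering on hydrogen and carbon and apply measured light-output functions, whereas here the interaction depth and the recoil-proton light are sampled from simple placeholder distributions.

```python
import random

def mc_efficiency(n_events, mfp_cm, depth_cm, threshold, light_fn, rng):
    """Toy MC efficiency: sample each neutron's free path from an
    exponential with the given mean free path; if it interacts inside
    the scintillator depth, sample a light output and count the event
    when the light exceeds the detection threshold."""
    detected = 0
    for _ in range(n_events):
        if rng.expovariate(1.0 / mfp_cm) < depth_cm:
            if light_fn(rng) > threshold:
                detected += 1
    return detected / n_events

rng = random.Random(7)
uniform_light = lambda r: r.random()  # placeholder light distribution
eff_low = mc_efficiency(20000, mfp_cm=10.0, depth_cm=50.0,
                        threshold=0.1, light_fn=uniform_light, rng=rng)
eff_high = mc_efficiency(20000, mfp_cm=10.0, depth_cm=50.0,
                         threshold=0.5, light_fn=uniform_light, rng=rng)
```

Even in this caricature, the qualitative behaviour the abstract tabulates appears: efficiency falls as the threshold rises, and grows with detector depth relative to the interaction mean free path.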

  10. Large single-crystal diamond substrates for ionizing radiation detection

    Energy Technology Data Exchange (ETDEWEB)

    Girolami, Marco; Bellucci, Alessandro; Calvani, Paolo; Trucchi, Daniele M. [Istituto di Struttura della Materia (ISM), Consiglio Nazionale delle Ricerche (CNR), Sede Secondaria di Montelibretti, Monterotondo Stazione, Roma (Italy)

    2016-10-15

    The need for large active volume detectors for ionizing radiations and particles, with both large area and thickness, is becoming more and more compelling in a wide range of applications, spanning from X-ray dosimetry to neutron spectroscopy. Recently, 8.0 x 8.0 mm{sup 2} wide and 1.2 mm thick single-crystal diamond plates have been put on the market, representing a first step to the fabrication of large area monolithic diamond detectors with optimized charge transport properties, obtainable up to now only with smaller samples. The more-than-double thickness, if compared to standard plates (typically 500 μm thick), demonstrated to be effective in improving the detector response to highly penetrating ionizing radiations, such as γ-rays. Here we report on the first measurements performed on large active volume single-crystal diamond plates, both in the dark and under irradiation with optical wavelengths (190-1100 nm), X-rays, and radioactive γ-emitting sources ({sup 57}Co and {sup 22}Na). (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  11. Prognostic value of defining the systemic tumor volume with FDG-PET in diffuse large b cell lymphoma

    International Nuclear Information System (INIS)

    Byun, Byung Hyun; Lim, Sang Moo; Cheon, Gi Jeong; Choi, Chang Woon; Kang, Hye Jin; Na, Im Il; Ryoo, Baek Yeol; Yang, Sung Hyun

    2007-01-01

    We measured the systemic tumor volume using FDG-PET in patients with diffuse large B cell lymphoma (DLBL), investigated its prognostic role, and compared it with that of other prognostic factors. FDG PET was performed in 38 newly diagnosed DLBL patients (20 men, 18 women; age 55.7 ± 15.1 years) before chemotherapy. Clinical staging of lymphoma was evaluated by the Ann Arbor system. On each FDG PET scan, we acquired a volume of interest (VOI) at the cut-off value of SUV = 2.5 in every measurable tumor using automatic edge-detection software. Within each VOI, we measured the metabolic volume and mean SUV, and estimated a volume-activity index (SUV Vol) as mean SUV times metabolic volume. We then calculated the summed metabolic volume (VOLsum) and summed SUV Vol (SUV Volsum) for every FDG PET scan. The maximum SUV of the involved lesions (SUVmax) was also acquired on each scan. Time to treatment failure (TTF) was compared among VOLsum (median), SUV Volsum (median), SUVmax (median), clinical stage, gender, age, LDH, and performance status by Kaplan-Meier survival analysis. Initial stages of the DLBL patients were stage I in 4, II in 14, III in 15, and IV in 4 by the Ann Arbor system. The median follow-up period was 15.5 months, and the estimated mean TTF was 22.3 months. Univariate analysis demonstrated that TTF is statistically significantly reduced in those with high VOLsum (>215.1 cm³, p=0.004), high SUV Volsum (>1577.5, p=0.003), and increased LDH (p=0.036). TTF did not correlate with SUVmax (p=0.571), clinical stage (p=0.194), gender (p=0.549), age (p=0.128), or performance status ≥ 2 (p=0.074). Multivariate analysis using VOLsum, SUV Volsum, LDH, and performance status demonstrated no statistically significant predictor of TTF (p>0.05). Systemic tumor volume measurement using FDG-PET thus appears to be a significant prognostic factor in patients with DLBL
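The volume-activity index defined in the abstract (SUV Vol = mean SUV × metabolic volume within a VOI at an SUV = 2.5 cutoff, summed over lesions) reduces to simple per-lesion arithmetic. The sketch below uses invented voxel values and a hypothetical voxel size.

```python
def lesion_metrics(voxel_suvs, voxel_vol_cm3, cutoff=2.5):
    """Metabolic volume and SUV*Vol index of one lesion: voxels at or
    above the SUV cutoff define the VOI; the index is mean SUV inside
    the VOI times its volume."""
    inside = [s for s in voxel_suvs if s >= cutoff]
    volume = len(inside) * voxel_vol_cm3
    mean_suv = sum(inside) / len(inside) if inside else 0.0
    return volume, mean_suv * volume

# Two hypothetical lesions with 0.5 cm3 voxels.
lesions = [[3.0, 4.0, 2.0, 5.0], [2.6, 2.4]]
per_lesion = [lesion_metrics(v, voxel_vol_cm3=0.5) for v in lesions]
vol_sum = sum(v for v, _ in per_lesion)       # VOLsum
suv_vol_sum = sum(i for _, i in per_lesion)   # SUV Volsum
```

Summing across all measurable lesions gives the patient-level VOLsum and SUV Volsum that were dichotomized at their medians for the survival analysis.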

  12. Measurement and genetics of human subcortical and hippocampal asymmetries in large datasets.

    Science.gov (United States)

    Guadalupe, Tulio; Zwiers, Marcel P; Teumer, Alexander; Wittfeld, Katharina; Vasquez, Alejandro Arias; Hoogman, Martine; Hagoort, Peter; Fernandez, Guillen; Buitelaar, Jan; Hegenscheid, Katrin; Völzke, Henry; Franke, Barbara; Fisher, Simon E; Grabe, Hans J; Francks, Clyde

    2014-07-01

    Functional and anatomical asymmetries are prevalent features of the human brain, linked to gender, handedness, and cognition. However, little is known about the neurodevelopmental processes involved. In zebrafish, asymmetries arise in the diencephalon before extending within the central nervous system. We aimed to identify genes involved in the development of subtle, left-right volumetric asymmetries of human subcortical structures using large datasets. We first tested the feasibility of measuring left-right volume differences in such large-scale samples, as assessed by two automated methods of subcortical segmentation (FSL|FIRST and FreeSurfer), using data from 235 subjects who had undergone MRI twice. We tested the agreement between the first and second scan, and the agreement between the segmentation methods, for measures of bilateral volumes of six subcortical structures and the hippocampus, and their volumetric asymmetries. We also tested whether there were biases introduced by left-right differences in the regional atlases used by the methods, by analyzing left-right flipped images. While many bilateral volumes were measured well (scan-rescan r = 0.6-0.8), most asymmetries, with the exception of the caudate nucleus, showed lower repeatabilities. We meta-analyzed genome-wide association scan results for caudate nucleus asymmetry in a combined sample of 3,028 adult subjects but did not detect associations at genome-wide significance, nor did candidate genes involved in left-right patterning of the viscera show evidence of association. Our results provide important information for researchers who are currently aiming to carry out large-scale genome-wide studies of subcortical and hippocampal volumes, and their asymmetries. Copyright © 2013 Wiley Periodicals, Inc.
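A common way to quantify the volumetric asymmetries studied here is a normalized left-right index. The exact formula the authors used is not given in this record, so the version below (difference divided by the mean of the two volumes) is an assumption chosen for illustration.

```python
def asymmetry_index(left_vol, right_vol):
    """Left-right volume difference normalized by the mean bilateral
    volume; positive values indicate a leftward asymmetry."""
    return (left_vol - right_vol) / ((left_vol + right_vol) / 2.0)

# Hypothetical caudate volumes in cm3.
ai = asymmetry_index(11.0, 9.0)
```

Normalizing by total size is what makes such an index comparable across subjects with different head sizes, which matters when pooling thousands of scans for a genome-wide analysis.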

  13. A simplified method to recover urinary vesicles for clinical applications, and sample banking.

    Science.gov (United States)

    Musante, Luca; Tataruch, Dorota; Gu, Dongfeng; Benito-Martin, Alberto; Calzaferri, Giulio; Aherne, Sinead; Holthofer, Harry

    2014-12-23

    Urinary extracellular vesicles provide a novel source of valuable biomarkers for kidney and urogenital diseases. Current isolation protocols include laborious, sequential centrifugation steps, which hamper their widespread research and clinical use. Furthermore, when large individual urine sample volumes or sizable target cohorts are to be processed (e.g. for biobanking), storage capacity becomes an additional problem. Thus, alternative methods are necessary to overcome such limitations. We have developed a practical vesicle isolation technique that yields easily manageable sample volumes in an exceptionally cost-efficient way, to facilitate their full utilization in less privileged environments and maximize the benefit of biobanking. Urinary vesicles were isolated by hydrostatic dialysis with minimal interference from soluble proteins and minimal vesicle loss. Large volumes of urine were concentrated up to 1/100 of the original volume, and the dialysis step allowed equalization of urine physico-chemical characteristics. Vesicle fractions were found suitable for all applications tested, including RNA analysis. In yield, our hydrostatic filtration dialysis system outperforms the conventional ultracentrifugation-based methods, and the labour-intensive and potentially hazardous ultracentrifugation steps are eliminated. Likewise, the need for trained laboratory personnel and heavy initial investment is avoided. Thus, our method qualifies as a method for laboratories working with urinary vesicles, and for sample banking.

  14. Feasibility studies on large sample neutron activation analysis using a low power research reactor

    International Nuclear Information System (INIS)

    Gyampo, O.

    2008-06-01

    Instrumental neutron activation analysis (INAA) using the Ghana Research Reactor-1 (GHARR-1) can be directly applied to samples with masses of a few grams. Sample weights were in the range of 0.5 g to 5 g; the representativity of the sample is thereby improved, as well as the sensitivity. Irradiation of the samples was done using a low-power research reactor. The correction for neutron self-shielding within the sample is determined from measurement of the neutron flux depression just outside the sample. Correction for gamma-ray self-attenuation in the sample was performed via linear attenuation coefficients derived from transmission measurements. Quantitative and qualitative analysis of the data were done using gamma-ray spectrometry (HPGe detector). The results of this study on the possibilities of large-sample NAA using a miniature neutron source reactor (MNSR) show clearly that the Ghana Research Reactor-1 (GHARR-1) at the National Nuclear Research Institute (NNRI) can be used for analyses of samples up to 5 grams (5 g) using the pneumatic transfer systems.
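
    The transmission-based self-attenuation correction mentioned above can be sketched as follows. This assumes the standard far-field slab-geometry factor mu*t / (1 - exp(-mu*t)); the transmission value and thickness are illustrative, not measurements from this study:

```python
# Sketch: gamma-ray self-attenuation correction from a transmission
# measurement, as commonly applied in large-sample NAA. Numbers are
# illustrative; the slab factor mu*t / (1 - exp(-mu*t)) is a standard
# far-field approximation, not a formula quoted from this study.
import math

def linear_attenuation(transmission, thickness_cm):
    """mu derived from T = I/I0 = exp(-mu * t)."""
    return -math.log(transmission) / thickness_cm

def self_attenuation_factor(mu, thickness_cm):
    """Multiplicative count-rate correction for a uniform slab sample."""
    x = mu * thickness_cm
    return x / (1.0 - math.exp(-x))

mu = linear_attenuation(transmission=0.80, thickness_cm=2.0)  # cm^-1
print(round(self_attenuation_factor(mu, 2.0), 4))  # → 1.1157
```

    Multiplying the measured peak count rate by this factor recovers the rate an identical but non-attenuating sample would have produced.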

  15. Analysis of Lung Tumor Motion in a Large Sample: Patterns and Factors Influencing Precise Delineation of Internal Target Volume

    International Nuclear Information System (INIS)

    Knybel, Lukas; Cvek, Jakub; Molenda, Lukas; Stieberova, Natalie; Feltl, David

    2016-01-01

    Purpose/Objective: To evaluate lung tumor motion during respiration and to describe factors affecting the range and variability of motion in patients treated with stereotactic ablative radiation therapy. Methods and Materials: Log file analysis from online respiratory tumor tracking was performed in 145 patients. Geometric tumor location in the lungs, tumor volume and origin (primary or metastatic), sex, and tumor motion amplitudes in the superior-inferior (SI), latero-lateral (LL), and anterior-posterior (AP) directions were recorded. Tumor motion variability during treatment was described using intrafraction/interfraction amplitude variability and tumor motion baseline changes. Tumor movement dependent on the tumor volume, position and origin, and sex were evaluated using statistical regression and correlation analysis. Results: After analysis of >500 hours of data, the highest rates of motion amplitudes, intrafraction/interfraction variation, and tumor baseline changes were in the SI direction (6.0 ± 2.2 mm, 2.2 ± 1.8 mm, 1.1 ± 0.9 mm, and −0.1 ± 2.6 mm). The mean motion amplitudes in the lower/upper geometric halves of the lungs were significantly different (P 15 mm were observed only in the lower geometric quarter of the lungs. Higher tumor motion amplitudes generated higher intrafraction variations (R=.86, P 3 mm indicated tumors contacting mediastinal structures or parietal pleura. On univariate analysis, neither sex nor tumor origin (primary vs metastatic) was an independent predictive factor of different movement patterns. Metastatic lesions in women, but not men, showed significantly higher mean amplitudes (P=.03) and variability (primary, 2.7 mm; metastatic, 4.9 mm; P=.002) than primary tumors. Conclusion: Online tracking showed significant irregularities in lung tumor movement during respiration. Motion amplitude was significantly lower in upper lobe tumors; higher interfraction amplitude variability indicated tumors in contact

  16. Software engineering the mixed model for genome-wide association studies on large samples

    Science.gov (United States)

    Mixed models improve the ability to detect phenotype-genotype associations in the presence of population stratification and multiple levels of relatedness in genome-wide association studies (GWAS), but for large data sets the resource consumption becomes impractical. At the same time, the sample siz...

  17. Preface to the volume Large Rivers

    Science.gov (United States)

    Latrubesse, Edgardo M.; Abad, Jorge D.

    2018-02-01

    The study and knowledge of the geomorphology of large rivers has increased significantly in recent years, and the factors that triggered these advances are multiple. On the one hand, modern technologies became more accessible, and their widespread use allowed the collection of data from large rivers as never before. The generalized use of high-tech data collection with geophysics equipment such as acoustic Doppler current profilers (ADCPs) and multibeam echosounders, plus the availability of geospatial and computational tools for morphodynamic, hydrological and hydrosedimentological modeling, has accelerated scientific production on the geomorphology of large rivers at a global scale. Despite the advances, there is still a lot of work ahead. A good part of the world's large rivers are in the tropics, and many are still unexplored. The tropics also hold crucial fluvial basins that concentrate a good part of the gross domestic product of large countries, such as the Parana River in Argentina and Brazil, the Ganges-Brahmaputra in India, the Indus River in Pakistan, and the Mekong River in several countries of Southeast Asia. The environmental importance of tropical rivers is also outstanding. They hold the highest biodiversity of fluvial fauna and alluvial vegetation, and many of them, particularly those in Southeast Asia, are among the most hazardous systems for floods in the entire world. Tropical rivers draining mountain chains such as the Himalaya, the Andes and insular Southeast Asia are also among the most heavily sediment-loaded rivers and play a key role in both the storage of sediment at continental scale and the transfer of sediment from the continents to the ocean at planetary scale (Andermann et al., 2012; Latrubesse and Restrepo, 2014; Milliman and Syvitski, 1992; Milliman and Farnsworth, 2011; Sinha and Friend, 1994).

  18. On-line combination of liquid chromatography and capillary gas chromatography : preconcentration and analysis of organic compounds in aqueous samples

    NARCIS (Netherlands)

    Noij, T.H.M.; Weiss, E.; Herps, T.; Cruchten, van H.; Rijks, J.A.

    1988-01-01

    This paper describes the design of a new, versatile, and low-cost on-line LC-GC interface that allows the fast and reliable introduction of large sample volumes onto a capillary GC column. The sample introduction procedure consists successively of: evaporation of the entire sample (LC fraction),

  19. Active species in a large volume N2-O2 post-discharge reactor

    International Nuclear Information System (INIS)

    Kutasi, K; Pintassilgo, C D; Loureiro, J; Coelho, P J

    2007-01-01

    A large-volume post-discharge reactor placed downstream from a flowing N₂-O₂ microwave discharge is modelled using a three-dimensional hydrodynamic model. The density distributions of the most populated active species present in the reactor, O(³P), O₂(a¹Δg), O₂(b¹Σg⁺), NO(X²Π), NO(A²Σ⁺), NO(B²Π), NO₂(X), O₃, O₂(X³Σg⁻) and N(⁴S), are calculated, and the main source and loss processes for each species are identified for two discharge conditions: (i) p = 2 Torr, f = 2450 MHz, and (ii) p = 8 Torr, f = 915 MHz, in the case of an N₂-2%O₂ mixture composition and a gas flow rate of 2×10³ sccm. The modification of the species' relative densities obtained by changing the oxygen percentage in the initial gas mixture composition, in the 0.2%-5% range, is presented. The possible tuning of the species concentrations in the reactor by changing the size of the connecting afterglow tube between the active discharge and the large post-discharge reactor is investigated as well

  20. Determination of 129I in large soil samples after alkaline wet disintegration

    International Nuclear Information System (INIS)

    Bunzl, K.; Kracke, W.

    1992-01-01

    Large soil samples (up to 500 g) can conveniently be disintegrated with hydrogen peroxide in a utility tank under alkaline conditions, in order subsequently to determine ¹²⁹I by neutron activation analysis. Interfering elements such as Br are removed before neutron irradiation to reduce the radiation exposure of the personnel. The precision of the method is ¹²⁹I also by the combustion method. (orig.)

  1. Radiochemical determination of ⁹⁰Sr in large volumes of drinking water

    International Nuclear Information System (INIS)

    Jeanmaire, L.; Patti, F.; Bullier, D.

    1965-01-01

    I. Principle of the method: 1. Fixation on a resin of all the cations present in the water. 2. Elution with 5 N nitric acid and precipitation of strontium as the carbonate. 3. Concentration of the strontium by the fuming nitric acid method. 4. Purification of the strontium on a resin by selective elution with ammonium citrate. 5. The strontium-90 is measured, after separation at ⁹⁰Y equilibrium, in the form of the oxalate, which is then counted. II. Advantages of the method: Concentrating the radioactivity from large volumes (100 l) is generally tedious, but this method, which makes use of fixation on a cationic resin, makes it very simple. The rest of the method consists of a series of simple chemical operations using ion exchange on resins and coprecipitation. Finally, it is also possible to determine stable strontium. (authors) [fr

  2. Digitally programmable microfluidic automaton for multiscale combinatorial mixing and sample processing†

    Science.gov (United States)

    Jensen, Erik C.; Stockton, Amanda M.; Chiesl, Thomas N.; Kim, Jungkyu; Bera, Abhisek; Mathies, Richard A.

    2013-01-01

    A digitally programmable microfluidic Automaton consisting of a 2-dimensional array of pneumatically actuated microvalves is programmed to perform new multiscale mixing and sample processing operations. Large (µL-scale) volume processing operations are enabled by precise metering of multiple reagents within individual nL-scale valves followed by serial repetitive transfer to programmed locations in the array. A novel process exploiting new combining valve concepts is developed for continuous rapid and complete mixing of reagents in less than 800 ms. Mixing, transfer, storage, and rinsing operations are implemented combinatorially to achieve complex assay automation protocols. The practical utility of this technology is demonstrated by performing automated serial dilution for quantitative analysis as well as the first demonstration of on-chip fluorescent derivatization of biomarker targets (carboxylic acids) for microchip capillary electrophoresis on the Mars Organic Analyzer. A language is developed to describe how unit operations are combined to form a microfluidic program. Finally, this technology is used to develop a novel microfluidic 6-sample processor for combinatorial mixing of large sets (>26 unique combinations) of reagents. The digitally programmable microfluidic Automaton is a versatile programmable sample processor for a wide range of process volumes, for multiple samples, and for different types of analyses. PMID:23172232
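
    The abstract mentions a language for combining unit operations into a microfluidic program. Purely as an illustration of that idea, a program can be modelled as a sequence of named unit operations; the operation names and the trace interpreter below are hypothetical, not the authors' actual syntax:

```python
# Illustrative only: a toy encoding of a "microfluidic program" as a
# sequence of unit operations (transfer, mix, rinse), in the spirit of
# the language described in the abstract. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class Op:
    name: str    # e.g. "mix", "transfer", "rinse"
    args: tuple  # valve/reservoir identifiers

def run(program):
    """Trace a program; a real controller would actuate valves here."""
    log = []
    for op in program:
        log.append(f"{op.name}{op.args}")
    return log

# A hypothetical fragment of a serial-dilution protocol.
serial_dilution = [
    Op("transfer", ("reagent_A", "valve_1")),
    Op("transfer", ("buffer", "valve_2")),
    Op("mix", ("valve_1", "valve_2")),
    Op("rinse", ("valve_2",)),
]
print(run(serial_dilution))
```

    Composing mixing, transfer, storage, and rinsing operations this way is what lets one program express assays as different as serial dilution and combinatorial reagent mixing.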

  3. Determinants of salivary evening alpha-amylase in a large sample free of psychopathology

    NARCIS (Netherlands)

    Veen, Gerthe; Giltay, Erik J.; Vreeburg, Sophie A.; Licht, Carmilla M. M.; Cobbaert, Christa M.; Zitman, Frans G.; Penninx, Brenda W. J. H.

    Objective: Recently, salivary alpha-amylase (sAA) has been proposed as a suitable index for sympathetic activity and dysregulation of the autonomic nervous system (ANS). Although determinants of sAA have been described, they have not been studied within the same study with a large sample size

  4. A new large-volume metal reference standard for radioactive waste management.

    Science.gov (United States)

    Tzika, F; Hult, M; Stroh, H; Marissens, G; Arnold, D; Burda, O; Kovář, P; Suran, J; Listkowska, A; Tyminski, Z

    2016-03-01

    A new large-volume metal reference standard has been developed. Its intended use is the calibration of free-release radioactivity measurement systems; it is made up of cast-iron tubes placed inside a box the size of a Euro-pallet (80 × 120 cm). The tubes contain certified activity concentrations of ⁶⁰Co (0.290 ± 0.006 Bq g⁻¹) and ¹¹⁰ᵐAg (3.05 ± 0.09 Bq g⁻¹) (reference date: 30 September 2013). They were produced using centrifugal casting from a smelt into which ⁶⁰Co was first added and then one piece of neutron-irradiated silver wire was progressively diluted. The iron castings were machined to the desired dimensions. The final material consists of 12 iron tubes of 20 cm outer diameter, 17.6 cm inner diameter, 40 cm length/height and 245.9 kg total mass. This paper describes the reference standard and the process of determining the reference activity values. © The Author 2015. Published by Oxford University Press.
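
    Certified activities tied to a reference date have to be decay-corrected to the day of use. A minimal sketch; the half-lives (60Co: 5.27 y, 110mAg: about 249.8 d) are standard nuclear-data values, not figures taken from the paper:

```python
# Sketch: decay-correcting a certified activity concentration from the
# reference date to a measurement date, A = A0 * exp(-lambda * t) with
# lambda = ln(2) / T_half. Half-lives are standard nuclear-data values.
import math

def decay_correct(a0_bq_per_g, half_life_days, elapsed_days):
    lam = math.log(2) / half_life_days
    return a0_bq_per_g * math.exp(-lam * elapsed_days)

# 60Co certified at 0.290 Bq/g on 30 September 2013; value one year
# (365.25 d) later, assuming a 5.27-year half-life:
print(round(decay_correct(0.290, 5.27 * 365.25, 365.25), 4))
```

    The much shorter 110mAg half-life is why a standard like this drifts in its 110mAg-to-60Co activity ratio and must always be quoted against its reference date.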

  5. Velocity dependent passive sampling for monitoring of micropollutants in dynamic stormwater discharges

    DEFF Research Database (Denmark)

    Birch, Heidi; Sharma, Anitha Kumari; Vezzaro, Luca

    2013-01-01

    Micropollutant monitoring in stormwater discharges is challenging because of the diversity of sources and thus the large number of pollutants found in stormwater. This is further complicated by the dynamics of runoff flows and the large number of discharge points. Most passive samplers are non-ideal for sampling such systems because they sample in a time-integrative manner. This paper reports a test of a flow-through passive sampler, deployed in stormwater runoff at the outlet of a residential-industrial catchment. Momentum from the water velocity during runoff events created flow through the sampler, resulting in velocity-dependent sampling. This approach enables the integrative sampling of stormwater runoff over periods of weeks to months while weighting actual runoff events higher than no-flow periods. Results were comparable to results from volume-proportional samples and results obtained from

  6. Fiber sample presentation system for spectrophotometer cotton fiber color measurements

    Science.gov (United States)

    The Uster® High Volume Instrument (HVI) is used to class U.S. cotton for fiber color, yielding the industry accepted, cotton-specific color parameters Rd and +b. The HVI examines a 9 square inch fiber sample, and it is also used to test large AMS standard cotton “biscuits” or rectangles. Much inte...

  7. Large area solid target neutron source

    International Nuclear Information System (INIS)

    Crawford, J.C.; Bauer, W.

    1974-01-01

    A potentially useful neutron source may result from the combination of a solid deuterium-tritium loaded target with the large-area, high-energy ion beams from ion sources being developed for neutral-beam injection. The resulting neutron source would have a large radiating area and thus produce the sizable experimental volume necessary for future studies of bulk and synergistic surface radiation effects, as well as experiments on engineering samples and small components. With a 200 keV D⁺-T⁺ beam and 40 kW/cm² power dissipation on a 200 cm² target spot, a total neutron yield of about 4×10¹⁵ n/s may be achieved. Although the usable neutron flux from this source is limited to 1-2×10¹³ n/cm²/s, this flux can be produced 3 cm in front of the target and over about 300 cm³ of experimental volume. Problems of total power dissipation, sputtering, isotopic flushing and thermal dissociation are reviewed. Neutron flux profiles and potential experimental configurations are presented and compared to other neutron source concepts. (U.S.)
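
    The quoted flux is consistent with a simple back-of-envelope estimate (not from the paper): very close to a large plane source of area A emitting Y neutrons per second quasi-isotropically, the forward flux is roughly Y / (2A), half the yield passing through the source area:

```python
# Back-of-envelope check (an assumption of this note, not a calculation
# from the paper): flux just in front of a large plane source.
yield_n_per_s = 4e15   # total yield quoted in the abstract
area_cm2 = 200.0       # target spot area quoted in the abstract
flux = yield_n_per_s / (2 * area_cm2)
print(f"{flux:.1e}")   # → 1.0e+13, consistent with the quoted 1-2e13 n/cm2/s
```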

  8. A hard-to-read font reduces the framing effect in a large sample.

    Science.gov (United States)

    Korn, Christoph W; Ries, Juliane; Schalk, Lennart; Oganian, Yulia; Saalbach, Henrik

    2018-04-01

    How can apparent decision biases, such as the framing effect, be reduced? Intriguing findings in recent years indicate that foreign-language settings reduce framing effects, which has been explained in terms of deeper cognitive processing. Because hard-to-read fonts have been argued to trigger deeper cognitive processing, so-called cognitive disfluency, we tested whether hard-to-read fonts reduce framing effects. We found no reliable evidence for an effect of hard-to-read fonts on four framing scenarios in a laboratory study (final N = 158) and an online study (N = 271). However, in a preregistered online study with a rather large sample (N = 732), a hard-to-read font reduced the framing effect in the classic "Asian disease" scenario (in a one-sided test). This suggests that hard-to-read fonts can modulate decision biases, albeit with rather small effect sizes. Overall, our findings stress the importance of large samples for the reliability and replicability of modulations of decision biases.

  9. A Pipeline for Large Data Processing Using Regular Sampling for Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Berres, Anne Sabine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Adhinarayanan, Vignesh [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Turton, Terece [Univ. of Texas, Austin, TX (United States); Feng, Wu [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Rogers, David Honegger [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-12

    Large simulation datasets require a lot of time and computational resources to compute, store, analyze, visualize, and run user studies on. Today, the largest cost of a supercomputer is not hardware but maintenance, in particular energy consumption. Our goal is to balance the energy consumption and the cognitive value of visualizations of the resulting data. This requires us to go through the entire processing pipeline, from simulation to user studies. To reduce the amount of resources, data can be sampled or compressed. While this adds more computation time, the computational overhead is negligible compared to the simulation time. We built a processing pipeline using regular sampling as an example. The reasons for this choice are twofold: using a simple example reduces unnecessary complexity, as we know what to expect from the results; furthermore, it provides a good baseline for future, more elaborate sampling methods. We measured time and energy for each test we ran, and we conducted user studies on Amazon Mechanical Turk (AMT) for a range of different results produced through sampling.
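
    The idea of regular sampling can be reduced to its simplest form: resampling scattered values onto a uniform grid. The sketch below does this in one dimension with nearest-point lookup; it is purely illustrative and far simpler than a pipeline for unstructured simulation grids:

```python
# Sketch: regular (uniform-grid) sampling of scattered data, the
# baseline sampling strategy named in the abstract, reduced to 1-D.
# A real pipeline would interpolate within unstructured grid cells.

def regular_sample(points, values, x0, x1, n):
    """Resample scattered (x, v) data onto n uniformly spaced positions."""
    grid = [x0 + i * (x1 - x0) / (n - 1) for i in range(n)]
    out = []
    for g in grid:
        # nearest scattered point wins
        j = min(range(len(points)), key=lambda k: abs(points[k] - g))
        out.append(values[j])
    return grid, out

xs = [0.0, 0.9, 2.2, 3.1, 4.0]
vs = [10.0, 12.0, 11.0, 9.0, 8.0]
grid, sampled = regular_sample(xs, vs, 0.0, 4.0, 5)
print(sampled)  # → [10.0, 12.0, 11.0, 9.0, 8.0]
```

    The appeal of this baseline is exactly what the abstract states: its behavior is predictable, so deviations introduced by more elaborate sampling methods are easy to recognize.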

  10. Cross-correlations in volume space: Differences between buy and sell volumes

    Science.gov (United States)

    Lee, Sun Young; Hwang, Dong Il; Kim, Min Jae; Koh, In Gyu; Kim, Soo Yong

    2011-03-01

    We study the cross-correlations of buy and sell volumes on the Korean stock market at high frequency. We observe that the pulling effects of volumes are as small as those of returns. The properties of the correlations of buy and sell volumes differ; they are explained by the degree of synchronization of stock volumes. Further, the pulling effects on the minimal spanning tree are studied. In minimal spanning trees with directed links, the large pulling effects are clustered at the center, not uniformly distributed. The Epps effect of buy and sell volumes is observed. The reversal of the cross-correlations of buy and sell volumes is also detected.
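
    The basic quantity behind a "pulling effect" is the lagged cross-correlation between two volume series. A minimal sketch with synthetic data (the series below are invented; a positive correlation at lag 1 would mean sell volume follows buy volume):

```python
# Sketch: lagged cross-correlation between buy and sell volume series.
# Synthetic data, illustration only.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def cross_corr(buy, sell, lag):
    """Correlate buy volume with sell volume shifted by `lag` steps."""
    if lag >= 0:
        return pearson(buy[:len(buy) - lag], sell[lag:])
    return cross_corr(sell, buy, -lag)

buy = [100, 120, 90, 140, 130, 110, 150, 95]
sell = [0, 100, 120, 90, 140, 130, 110, 150]  # sell lags buy by one step
print(round(cross_corr(buy, sell, 1), 3))  # → 1.0
```

    Comparing such coefficients across sampling intervals is also how the Epps effect (correlations shrinking at finer time scales) is measured.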

  11. the Preliminary Research Based on Seismic Signals Generated by Hutubi Transmitting Seismic Station with One Large-volume Airgun Array

    Science.gov (United States)

    Wang, Q.; Su, J.; Wei, Y.; Zhang, W.; Wang, H.; Wang, B.; Ji, Z.

    2017-12-01

    To study the subsurface structure and its subtle changes, we built the Hutubi transmitting seismic station, with a large-volume airgun array in an artificial water pool, in the northern segment of the Tianshan mountains, where earthquakes occur frequently. The airgun array consists of six airguns, each with a capacity of 2000 in³, and the artificial water pool has a top diameter of 100 m, a bottom diameter of 20 m and a depth of 18 m. We have run a regular excitation experiment with the large-volume airgun source every week since June 2013. Using seismic signals generated by the Hutubi airgun source, we carried out preliminary research on the airgun source, waveform characteristics and subsurface velocity changes in the northern Tianshan mountains. The results are as follows. The seismic signal excited by the airgun source is characterized by low frequency, with a dominant frequency in the range of 2-6 Hz. The Hutubi transmitting seismic station can continuously generate long-distance detectable and highly repeatable signals: the correlation coefficient of the signals is greater than 0.95, and the longest propagation distance reaches 380 km; in addition, the 5000-shot stacked signal obtained with the phase-weighted stack technique can be identified at a station about 1300 km from the Hutubi transmitting seismic station. The Hutubi large-volume airgun source is thus well suited to detecting and monitoring the regional-scale subsurface stress state. Applying a correlation-based method, we measured weak subsurface velocity changes in the northern Tianshan mountains and found that several stations within 150 km of the Hutubi transmitting seismic station showed 0.1-0.2% relative velocity changes before the Hutubi MS6.2 earthquake on Dec. 8, 2016.
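
    The reason thousands of shots can be combined is that stacking N aligned, repeatable records suppresses incoherent noise roughly as sqrt(N). The sketch below demonstrates this with plain linear stacking on synthetic records; the paper's phase-weighted stack additionally down-weights phase-inconsistent samples:

```python
# Sketch: linear stacking of repeated, highly repeatable source records.
# Synthetic data; stacking 100 shots should cut the noise by ~10x.
import math, random

random.seed(0)
n, shots = 200, 100
signal = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]  # clean tone

def record():
    """One noisy shot: the repeatable signal plus Gaussian noise."""
    return [s + random.gauss(0, 2.0) for s in signal]

stack = [0.0] * n
for _ in range(shots):
    rec = record()
    stack = [a + b / shots for a, b in zip(stack, rec)]

# RMS residual after stacking; expect roughly 2.0 / sqrt(100) = 0.2,
# far below the single-shot noise level of 2.0.
resid = (sum((a - s) ** 2 for a, s in zip(stack, signal)) / n) ** 0.5
print(resid < 0.5)
```

    This is what makes a weak airgun source detectable 1300 km away: the 5000-shot stack trades repetition for signal-to-noise ratio.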

  12. A compact time-of-flight SANS instrument optimised for measurements of small sample volumes at the European Spallation Source

    Energy Technology Data Exchange (ETDEWEB)

    Kynde, Søren, E-mail: kynde@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark); Hewitt Klenø, Kaspar [Niels Bohr Institute, University of Copenhagen (Denmark); Nagy, Gergely [SINQ, Paul Scherrer Institute (Switzerland); Mortensen, Kell; Lefmann, Kim [Niels Bohr Institute, University of Copenhagen (Denmark); Kohlbrecher, Joachim, E-mail: Joachim.kohlbrecher@psi.ch [SINQ, Paul Scherrer Institute (Switzerland); Arleth, Lise, E-mail: arleth@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark)

    2014-11-11

    The high flux at the European Spallation Source (ESS) will allow for performing experiments with relatively small beam sizes while maintaining a high intensity of the incoming beam. The pulsed nature of the source makes the facility optimal for time-of-flight small-angle neutron scattering (ToF-SANS). We find that a relatively compact SANS instrument becomes the optimal choice in order to obtain the widest possible q-range in a single setting and the best possible exploitation of the neutrons in each pulse, and hence the highest possible flux at the sample position. The instrument proposed in the present article is optimised for performing fast measurements of small scattering volumes, typically down to 2×2×2 mm³, while covering a broad q-range from about 0.005 Å⁻¹ to 0.5 Å⁻¹ in a single instrument setting. This q-range corresponds to that available at a typical good BioSAXS instrument and is relevant for a wide set of biomacromolecular samples. A central advantage of covering the whole q-range in a single setting is that each sample has to be loaded only once. This makes it convenient to use the fully automated high-throughput flow-through sample changers commonly applied at modern synchrotron BioSAXS facilities. The central drawback of choosing a very compact instrument is that the resolution in terms of δλ/λ obtained with the short-wavelength neutrons becomes worse than what is usually the standard at state-of-the-art SANS instruments. Our McStas-based simulations of the instrument performance for a set of characteristic biomacromolecular samples show that the resulting smearing effects still have relatively minor effects on the obtained data and can be compensated for in the data analysis. However, in cases where a better resolution is required in combination with the large simultaneous q-range characteristic of the instrument, we show that this can be obtained by inserting a set of choppers.
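
    The q-range of a SANS setting follows from the momentum-transfer relation q = (4π/λ)·sin(θ/2), with θ the scattering angle. A sketch with illustrative wavelength band and angular limits (these are assumptions of this note, not the proposed instrument's actual parameters):

```python
# Sketch: momentum transfer q = (4*pi/lambda) * sin(two_theta/2) for a
# ToF-SANS setting. Wavelength band and angles are illustrative.
import math

def q_inv_angstrom(wavelength_A, two_theta_rad):
    return 4 * math.pi / wavelength_A * math.sin(two_theta_rad / 2)

# smallest q: longest wavelength at the smallest angle; largest q: the reverse
q_min = q_inv_angstrom(10.0, 0.01)
q_max = q_inv_angstrom(2.5, 0.2)
print(round(q_min, 4), round(q_max, 3))  # → 0.0063 0.502
```

    A broad wavelength band per pulse is what lets a single compact geometry span roughly two decades in q, at the cost of the δλ/λ resolution discussed above.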

  13. More practical critical height sampling.

    Science.gov (United States)

    Thomas B. Lynch; Jeffrey H. Gove

    2015-01-01

    Critical Height Sampling (CHS) (Kitamura 1964) can be used to predict cubic volume per acre without using volume tables or equations. The critical height is defined as the height at which the tree stem appears to be in borderline condition when viewed with the point-sampling angle gauge (e.g. prism). An estimate of cubic volume per acre can be obtained from multiplication of the...
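
    The estimator the passage begins to describe has a very compact form: with a point sample of basal-area factor F, cubic volume per unit area is F times the sum of the tallied trees' critical heights. A sketch with invented field values:

```python
# Sketch: the Kitamura-style critical height sampling estimator,
# V per acre = F * sum(critical heights). Values are illustrative.

def chs_volume_per_acre(baf_ft2_per_acre, critical_heights_ft):
    """Volume estimate from one sample point."""
    return baf_ft2_per_acre * sum(critical_heights_ft)

# BAF 10 ft^2/acre prism, four tally trees with measured critical heights:
print(chs_volume_per_acre(10.0, [42.0, 35.5, 51.0, 28.5]))  # → 1570.0 ft^3/acre
```

    Notice that no volume table or taper equation enters the estimate; the critical-height measurement itself carries the volume information.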

  14. Annealing as grown large volume CZT single crystals for increased spectral resolution

    International Nuclear Information System (INIS)

    Li, Longxia

    2008-01-01

    The spectroscopic performance of current large-volume Cd₀.₉Zn₀.₁Te (CZT, cadmium 10% zinc telluride) detectors is impaired by the cumulative effect of tellurium precipitates (secondary phases) present in CZT single crystals grown by low-pressure Bridgman techniques (1). This statistical effect may limit the energy resolution of large-volume CZT detectors (typically 2-5% at 662 keV for 12-mm thick devices). The stochastic nature of the interaction prevents the use of any electronic or digital charge-correction techniques without a significant reduction in detector efficiency. This volume constraint hampers the utility of CZT, since the detectors are inefficient at detecting photons >1 MeV and/or in low-fluence situations. During the project, seven CZT ingots were grown, in which the indium dopant concentration was varied in the range of 0.5 ppm to 6 ppm. IR mapping was employed to study the Te precipitates. The Te precipitates in as-grown and annealed CZT wafers were systematically studied using an IR mapping system (home installed, resolution of 1.5 µm). We applied our standard annealing procedure for CZT (Zn = 4%), or two-step annealing for radiation-grade CZT (Zn = 10%), and achieved 'non'-Te precipitates (size 10⁹⁻¹⁰ Ω·cm. We believe that the Te precipitates are p-type defects; reducing their number causes the CZT to become n⁺-type. We therefore varied or reduced the indium dopant concentration during growth and changed the Te-precipitate size and density by using different Cd temperatures and different annealing procedures.
    We have made comparisons among Te-precipitate size, density and indium dopant concentration, and we found that CZT with smaller Te precipitates is suitable for radiation use, but precipitate-free CZT cannot be used in radiation detectors, because the CZT would become un-doped or 'intrinsic', with no radiation response (we

  15. Association between genetic variation in a region on chromosome 11 and schizophrenia in large samples from Europe

    DEFF Research Database (Denmark)

    Rietschel, M; Mattheisen, M; Degenhardt, F

    2012-01-01

    the recruitment of very large samples of patients and controls (that is, tens of thousands), or large, potentially more homogeneous samples that have been recruited from confined geographical areas using identical diagnostic criteria. Applying the latter strategy, we performed a genome-wide association study (GWAS) ... between emotion regulation and cognition that is structurally and functionally abnormal in SCZ and bipolar disorder. Molecular Psychiatry advance online publication, 12 July 2011; doi:10.1038/mp.2011.80.

  16. A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.

    Science.gov (United States)

    Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa

    2016-05-17

    Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss Kappa statistic was 0.601). This method presents a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large linkage projects where the number of record pairs produced may be very large, often running into millions.
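
    The threshold-stratified sampling idea can be sketched as follows: sample pairs from each score band (including bands below the cut-off), label the samples, and scale the labelled counts back up to estimate false positives and false negatives. The band sizes, match rates, and sample size below are synthetic, not the paper's data:

```python
# Sketch of the sampling-based precision/recall estimate. Synthetic
# score bands; clerical review is simulated by drawing from a known
# per-band match rate.
import random

random.seed(1)

# (band_total_pairs, true_match_rate_in_band, accepted_as_link?) per band;
# bands below the cut-off are sampled too, so false negatives are seen.
bands = [
    (1000, 0.98, True),   # high-score band, accepted
    (500,  0.60, True),   # borderline band, accepted
    (2000, 0.05, False),  # below cut-off, rejected
]

sample_per_band = 100
est_tp = est_fp = est_fn = 0.0
for total, match_rate, accepted in bands:
    # simulated clerical review of the sampled pairs
    sampled_matches = sum(random.random() < match_rate
                          for _ in range(sample_per_band))
    scaled = total * sampled_matches / sample_per_band
    if accepted:
        est_tp += scaled
        est_fp += total - scaled
    else:
        est_fn += scaled

precision = est_tp / (est_tp + est_fp)
recall = est_tp / (est_tp + est_fn)
print(round(precision, 2), round(recall, 2))
```

    With the rates above, the true precision is 1280/1500 ≈ 0.85 and the true recall 1280/1380 ≈ 0.93; the sampled estimates land close to those values, which is the point of the method.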

  17. Reachable volume RRT

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2015-05-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.
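
    For orientation, the stepping function RVRRT constrains to reachable volumes is a variant of the basic RRT extend step. The sketch below shows that basic step with a plain Euclidean metric in 2-D; RVRRT replaces both the metric and the stepping with reachable-volume versions, which this illustration does not attempt:

```python
# Sketch: the plain RRT extend step (nearest node, bounded step toward
# a random sample). Illustration only; not the RVRRT operations.
import math, random

def extend(tree, sample, step):
    """Grow the tree one bounded step from its nearest node toward sample."""
    near = min(tree, key=lambda p: math.dist(p, sample))
    d = math.dist(near, sample)
    if d <= step:
        new = sample
    else:
        new = tuple(a + step * (b - a) / d for a, b in zip(near, sample))
    tree.append(new)
    return new

random.seed(2)
tree = [(0.0, 0.0)]
for _ in range(50):
    extend(tree, (random.uniform(0, 10), random.uniform(0, 10)), step=0.5)
print(len(tree))  # → 51
```

    In RVRRT, sampling, the distance metric, and this stepping operation are all restricted to the reachable volume, so every new node already satisfies the problem's constraints.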

  18. Reachable volume RRT

    KAUST Repository

    McMahon, Troy; Thomas, Shawna; Amato, Nancy M.

    2015-01-01

    © 2015 IEEE. Reachable volumes are a new technique that allows one to efficiently restrict sampling to feasible/reachable regions of the planning space even for high degree of freedom and highly constrained problems. However, they have so far only been applied to graph-based sampling-based planners. In this paper we develop the methodology to apply reachable volumes to tree-based planners such as Rapidly-Exploring Random Trees (RRTs). In particular, we propose a reachable volume RRT called RVRRT that can solve high degree of freedom problems and problems with constraints. To do so, we develop a reachable volume stepping function, a reachable volume expand function, and a distance metric based on these operations. We also present a reachable volume local planner to ensure that local paths satisfy constraints for methods such as PRMs. We show experimentally that RVRRTs can solve constrained problems with as many as 64 degrees of freedom and unconstrained problems with as many as 134 degrees of freedom. RVRRTs can solve problems more efficiently than existing methods, requiring fewer nodes and collision detection calls. We also show that it is capable of solving difficult problems that existing methods cannot.

  19. SU-F-T-538: CyberKnife with MLC for Treatment of Large Volume Tumors: A Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    Bichay, T; Mayville, A [Mercy Health, Saint Mary’s, Grand Rapids, MI (United States)

    2016-06-15

    Purpose: CyberKnife is a well-documented modality for SRS and SBRT treatments. Typical tumors are small, and 1–5 fractions are usually used. We determined the feasibility of using CyberKnife, with the InCise multileaf collimator option, for larger tumors undergoing standard dose and fractionation. The intent was to understand the limitations of using this modality for other external beam radiation treatments. Methods: Five tumors from different anatomical sites with volumes from 127.8 cc to 1,320.5 cc were contoured and planned on a Multiplan V5.1 workstation. The target average diameter ranged from 7 cm to 13 cm. The dose fractionation was 1.8–2.0 Gy/fraction and 25–45 fractions, for total doses of 45–81 Gy. The sites planned were: pancreas, head and neck, prostate, anal, and esophagus. The plans were optimized to meet conventional dose constraints based on various RTOG protocols for conventional fractionation. Results: The Multiplan treatment planning system successfully generated clinically acceptable plans for all sites studied. The resulting dose distributions achieved reasonable target coverage, all greater than 95%, and satisfactory normal tissue sparing. Treatment times ranged from 9 minutes to 38 minutes, the longest being a head and neck plan with dual targets receiving different doses and with multiple adjacent critical structures. Conclusion: CyberKnife, with the InCise multileaf collimation option, can achieve acceptable dose distributions in large volume tumors treated with conventional dose and fractionation. Although treatment times are greater than with a conventional accelerator, target coverage and dose to critical structures can be kept within a clinically acceptable range. While time limitations exist, CyberKnife can, when necessary, provide an alternative to traditional treatment modalities for large volume tumors.

  20. A retrospective analysis of complications of large volume liposuction; local perspective from a third world country

    International Nuclear Information System (INIS)

    Arshad, S.M.; Latif, S.; Altaf, H.N.

    2017-01-01

    This study aimed to evaluate the complications that occurred in patients undergoing large volume liposuction and to determine whether there was a correlation between the amount of aspirate and the rate of complications. Methodology: A detailed history, complete physical examination, BMI, and anthropometric measurements were documented for all patients. All patients underwent liposuction using the tumescent technique under general anesthesia in Yusra General Hospital. Patients were discharged home after 24 to 48 hours. Pressure garments were advised for 6 weeks, and patients were called for weekly follow-up for 6 weeks. Complications were documented. SPSS version 20 was used for analysis of the data. Results: Out of 217 patients, 163 (75%) were female and 54 were male. Mean age was 37.1 ± 6.7 (SD) years. Bruising and seroma were the most common complications (4.1% and 2.3%, respectively). The incidence of infection was 0.9%. One patient had over-correction and four patients (1.8%) had under-correction. Significant blood loss was encountered in one patient. Two patients (0.9%) had pulmonary embolism and 2 (0.9%) suffered from necrotizing fasciitis. None of our patients undergoing large volume liposuction had fat embolism and there was no mortality. Conclusion: Careful patient selection and strict adherence to guidelines can ensure a good outcome and minimize the risk of complications. Both physicians and patients should be educated to have realistic expectations to avoid complications and improve patient safety. (author)

  1. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. It can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets.

  2. Analysis of Lung Tumor Motion in a Large Sample: Patterns and Factors Influencing Precise Delineation of Internal Target Volume

    Energy Technology Data Exchange (ETDEWEB)

    Knybel, Lukas [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic); VŠB-Technical University of Ostrava, Ostrava (Czech Republic); Cvek, Jakub, E-mail: Jakub.cvek@fno.cz [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic); Molenda, Lukas; Stieberova, Natalie; Feltl, David [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic)

    2016-11-15

    Purpose/Objective: To evaluate lung tumor motion during respiration and to describe factors affecting the range and variability of motion in patients treated with stereotactic ablative radiation therapy. Methods and Materials: Log file analysis from online respiratory tumor tracking was performed in 145 patients. Geometric tumor location in the lungs, tumor volume and origin (primary or metastatic), sex, and tumor motion amplitudes in the superior-inferior (SI), latero-lateral (LL), and anterior-posterior (AP) directions were recorded. Tumor motion variability during treatment was described using intrafraction/interfraction amplitude variability and tumor motion baseline changes. Tumor movement dependent on the tumor volume, position and origin, and sex were evaluated using statistical regression and correlation analysis. Results: After analysis of >500 hours of data, the highest rates of motion amplitudes, intrafraction/interfraction variation, and tumor baseline changes were in the SI direction (6.0 ± 2.2 mm, 2.2 ± 1.8 mm, 1.1 ± 0.9 mm, and −0.1 ± 2.6 mm). The mean motion amplitudes in the lower/upper geometric halves of the lungs were significantly different (P<.001). Motion amplitudes >15 mm were observed only in the lower geometric quarter of the lungs. Higher tumor motion amplitudes generated higher intrafraction variations (R=.86, P<.001). Interfraction variations and baseline changes >3 mm indicated tumors contacting mediastinal structures or parietal pleura. On univariate analysis, neither sex nor tumor origin (primary vs metastatic) was an independent predictive factor of different movement patterns. Metastatic lesions in women, but not men, showed significantly higher mean amplitudes (P=.03) and variability (primary, 2.7 mm; metastatic, 4.9 mm; P=.002) than primary tumors. Conclusion: Online tracking showed significant irregularities in lung tumor movement during respiration. Motion amplitude was significantly lower in upper lobe

  3. Genotyping for DQA1 and PM loci in urine using PCR-based amplification: effects of sample volume, storage temperature, preservatives, and aging on DNA extraction and typing.

    Science.gov (United States)

    Vu, N T; Chaturvedi, A K; Canfield, D V

    1999-05-31

    Urine is often the sample of choice for drug screening in aviation/general forensic toxicology and in workplace drug testing. In some instances, the origin of the submitted samples may be challenged because of the medicolegal and socioeconomic consequences of a positive drug test. Methods for individualization of biological samples have reached a new boundary with the application of the polymerase chain reaction (PCR) in DNA profiling, but successful characterization of urine specimens depends on the quantity and quality of DNA present in the samples. Therefore, the present study investigated the influence of storage conditions, sample volume, concentration modes, extraction procedures, and chemical preservation on the quantity of DNA recovered, as well as the success rate of PCR-based genotyping for DQA1 and PM loci in urine. Urine specimens from male and female volunteers were divided and stored at various temperatures for up to 30 days. The results suggested that sample purification by diafiltration, using 3,000-100,000 molecular weight cut-off filters, did not enhance DNA recovery and typing rate compared with simple centrifugation procedures. Extraction of urinary DNA by the organic method and by the resin method gave comparable typing results. Larger sample volumes yielded a higher amount of DNA, but typing rates were not affected for sample volumes between 1 and 5 ml. The quantifiable amounts of DNA present were found to be greater in female (14-200 ng/ml) than in male (4-60 ng/ml) samples and decreased with elapsed time under both room temperature (RT) and frozen storage. Typing of the male samples also demonstrated that RT storage produced significantly higher success rates than frozen storage, while there was only a marginal difference in DNA typing rates among the conditions tested using female samples. Successful assignment of DQA1 + PM genotype was achieved for all samples of fresh urine, independent of gender

  4. A high volume sampling system for isotope determination of volatile halocarbons and hydrocarbons

    Directory of Open Access Journals (Sweden)

    E. Bahlmann

    2011-10-01

    Full Text Available The isotopic composition of volatile organic compounds (VOCs) can provide valuable information on their sources and fate that is not deducible from mixing ratios alone. In particular, the reported carbon stable isotope ratios of chloromethane and bromomethane from different sources cover a δ13C-range of almost 100‰, making isotope ratios a very promising tool for studying the biogeochemistry of these compounds. So far, the determination of the isotopic composition of C1 and C2 halocarbons other than chloromethane has been hampered by their low mixing ratios.

    In order to determine the carbon isotopic composition of C1 and C2 halocarbons with mixing ratios as low as 1 pptv, (i) a field-suitable cryogenic high volume sampling system and (ii) a chromatographic set-up for processing these samples have been developed and validated. The sampling system was tested at two different sampling sites, an urban and a coastal location in Northern Germany. The average δ13C-values for bromomethane at the urban site were −42.9 ± 1.1‰ and agreed well with previously published results. But at the coastal site, bromomethane was substantially enriched in 13C by almost 10‰. Less pronounced differences were observed for chlorodifluoromethane, 1,1,1-trichloroethane and chloromethane. We suggest that these differences are related to the turnover of these compounds in ocean surface waters. Furthermore, we report first carbon isotope ratios for iodomethane (−40.4‰ to −79.8‰), bromoform (−13.8‰ to 22.9‰), and other halocarbons.
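    The δ13C values quoted above are defined relative to the VPDB reference standard; converting a measured ¹³C/¹²C ratio into per-mil notation is a one-line calculation. The VPDB ratio of 0.0112372 used below is the commonly cited approximate value, and the worked sample is an illustration, not a measurement from the paper:

```python
def delta13c_permil(r_sample, r_vpdb=0.0112372):
    """delta-13C (per mil) from a measured 13C/12C isotope ratio."""
    return (r_sample / r_vpdb - 1.0) * 1000.0

# A sample depleted in 13C by 4.29% relative to VPDB reads -42.9 per mil,
# matching the magnitude of the urban bromomethane value reported above.
d = delta13c_permil(0.0112372 * (1.0 - 0.0429))
```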

  5. Large scale air monitoring: Biological indicators versus air particulate matter

    International Nuclear Information System (INIS)

    Rossbach, M.; Jayasekera, R.; Kniewald, G.

    2000-01-01

    Biological indicator organisms have been widely used for monitoring and banking purposes for many years. Although the complexity of the interactions between bioorganisms and their environment is generally not easily comprehensible, environmental quality assessment using the bioindicator approach offers some convincing advantages compared to direct analysis of soil, water, or air. Direct measurement of air particulates is restricted to experienced laboratories with access to expensive sampling equipment. Additionally, the amount of material collected is generally just enough for one determination per sampling, so no multidimensional characterization may be possible. Further, fluctuations in air masses have a pronounced effect on the results from air filter sampling. Combining the integrating property of bioindicators with the worldwide availability and uniform matrix characteristics of air particulates, as a prerequisite for global monitoring of air pollution, will be discussed. A new approach for sampling urban dust using large volume filtering devices installed in air conditioners of large hotel buildings is assessed. A first experiment was initiated to collect air particulates (300 to 500 g each) from a number of hotels during a period of three to four months by successive vacuum cleaning of used inlet filters from high volume air conditioning installations, reflecting average concentrations per three months in different large cities. This approach is expected to be upgraded and applied for global monitoring. Highly positively correlated elements were found in lichen, such as K/S, Zn/P and the rare earth elements (REE), and a significant negative correlation between Fig and Cu was observed in these samples. The ratio of concentrations of elements in dust and Usnea spp. is highest for Cr, Zn, and Fe (400-200) and lowest for elements such as Ca, Rb, and Sr (20-10). (author)

  6. Densidade global de solos medida com anel volumétrico e por cachimbagem de terra fina seca ao ar Bulk density of soil samples measured in the field and through volume measurement of sieved soil

    Directory of Open Access Journals (Sweden)

    Bernardo Van Raij

    1989-01-01

    Full Text Available Em laboratórios de rotina de fertilidade do solo, a medida de quantidade de terra para análise é feita em volume, mediante utensílios chamados "cachimbos", que permitem medir volumes de terra. Admite-se que essas medidas reflitam a quantidade de terra existente em volume de solo similar em condições de campo. Essa hipótese foi avaliada neste trabalho, por doze amostras dos horizontes A e B de seis perfis de solos. A densidade em condições de campo foi avaliada por anel volumétrico e, no laboratório, por meio de cachimbos de diversos tamanhos. A cachimbagem revelou-se bastante precisa. Os valores de densidade global calculada variaram de 0,63 a 1,46g/cm³ para medidas de campo e de 0,91 a 1,33g/cm³ para medidas com cachimbos. Portanto, a medida de laboratório subestimou valores altos de densidade e deu resultados mais elevados para valores de campo mais baixos.In soil testing laboratories, soil samples for chemical analysis are usually measured by volume, using appropriate measuring spoons. It is tacitly assumed that such measurements would reflect amounts of soil existing in the same volume under field conditions. This hypothesis was tested, using 12 soil samples of the A and B horizons of six soil profiles. Bulk density in the field was evaluated through a cylindrical metal sampler of 50cm³ and in the laboratory using spoons of different sizes. Measurements of soil volumes by spoons were quite precise. Values of bulk density varied between 0.63 and 1.46g/cm³ for field measurements and between 0.91 and 1.33g/cm³ for laboratory measurements with spoons. Thus, laboratory measurements overestimated lower values of bulk densities and underestimated the higher ones.

  7. Dosimetric Coverage of the Prostate, Normal Tissue Sparing, and Acute Toxicity with High-Dose-Rate Brachytherapy for Large Prostate Volumes

    Directory of Open Access Journals (Sweden)

    George Yang

    2015-06-01

    Full Text Available Purpose: To evaluate dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with HDR brachytherapy for large prostate volumes. Materials and Methods: One hundred and two prostate cancer patients with prostate volumes >50 mL (range: 5-29 mL) were treated with high-dose-rate (HDR) brachytherapy ± intensity modulated radiation therapy (IMRT) to 4,500 cGy in 25 daily fractions between 2009 and 2013. HDR brachytherapy monotherapy doses consisted of two 1,350-1,400 cGy fractions separated by 2-3 weeks, and HDR brachytherapy boost doses consisted of two 950-1,150 cGy fractions separated by 4 weeks. Twelve of 32 (38%) unfavorable intermediate risk, high risk, and very high risk patients received androgen deprivation therapy. Acute toxicity was graded according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4. Results: Median follow-up was 14 months. Dosimetric goals were achieved in over 90% of cases. Three of 102 (3%) patients developed Grade 2 acute proctitis. No variables were significantly associated with Grade 2 acute proctitis. Seventeen of 102 (17%) patients developed Grade 2 acute urinary retention. American Urological Association (AUA) symptom score was the only variable significantly associated with Grade 2 acute urinary retention (p=0.04). There was no ≥ Grade 3 acute toxicity. Conclusions: Dosimetric coverage of the prostate and normal tissue sparing were adequate in patients with prostate volumes >50 mL. Higher pre-treatment AUA symptom scores increased the relative risk of Grade 2 acute urinary retention. However, the overall incidence of acute toxicity was acceptable in patients with large prostate volumes.

  8. Dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with high-dose-rate brachytherapy for large prostate volumes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, George; Strom, Tobin J.; Shrinath, Kushagra; Mellon, Eric A.; Fernandez, Daniel C.; Biagioli, Matthew C. [Department of Radiation Oncology, H. Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States); Wilder, Richard B., E-mail: mcbiagioli@yahoo.com [Cancer Treatment Centers of America, Newnan, GA (United States)

    2015-05-15

    Purpose: to evaluate dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with HDR brachytherapy for large prostate volumes. Materials and methods: one hundred and two prostate cancer patients with prostate volumes >50 mL (range: 5-29 mL) were treated with high-dose-rate (HDR) brachytherapy ± intensity modulated radiation therapy (IMRT) to 4,500 cGy in 25 daily fractions between 2009 and 2013. HDR brachytherapy monotherapy doses consisted of two 1,350-1,400 cGy fractions separated by 2-3 weeks, and HDR brachytherapy boost doses consisted of two 950-1,150 cGy fractions separated by 4 weeks. Twelve of 32 (38%) unfavorable intermediate risk, high risk, and very high risk patients received androgen deprivation therapy. Acute toxicity was graded according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4. Results: median follow-up was 14 months. Dosimetric goals were achieved in over 90% of cases. Three of 102 (3%) patients developed Grade 2 acute proctitis. No variables were significantly associated with Grade 2 acute proctitis. Seventeen of 102 (17%) patients developed Grade 2 acute urinary retention. American Urological Association (AUA) symptom score was the only variable significantly associated with Grade 2 acute urinary retention (p=0.04). There was no ≥ Grade 3 acute toxicity. Conclusions: dosimetric coverage of the prostate and normal tissue sparing were adequate in patients with prostate volumes >50 mL. Higher pre-treatment AUA symptom scores increased the relative risk of Grade 2 acute urinary retention. However, the overall incidence of acute toxicity was acceptable in patients with large prostate volumes. (author)

  9. Modelling lidar volume-averaging and its significance to wind turbine wake measurements

    Science.gov (United States)

    Meyer Forsting, A. R.; Troldborg, N.; Borraccino, A.

    2017-05-01

    Lidar velocity measurements need to be interpreted differently than conventional in-situ readings. A commonly ignored factor is “volume-averaging”, which refers to the fact that lidars do not sample at a single, distinct point but along the entire beam length. Especially in regions with large velocity gradients, like the rotor wake, this can be detrimental. Hence, an efficient algorithm mimicking lidar flow sampling is presented, which considers both pulsed and continuous-wave lidar weighting functions. The flow field around a 2.3 MW turbine is simulated using Detached Eddy Simulation in combination with an actuator line to test the algorithm and investigate the potential impact of volume-averaging. Volume-averaging is captured accurately even with very few points discretising the lidar beam. The difference between a lidar and a point measurement is greatest at the wake edges and increases from 30% one rotor diameter (D) downstream of the rotor to 60% at 3D.
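    The volume-averaging effect itself is easy to reproduce: weight the line-of-sight velocity along the beam and normalise. The Gaussian weight, focus distance, and probe length below are illustrative assumptions (real pulsed and continuous-wave weighting functions differ from this), but the sketch shows why a lidar and a point probe disagree most at sharp gradients such as wake edges:

```python
import math

def lidar_sample(u, focus, fwhm=30.0, points=7):
    """Weighted average of the velocity field u(x) along the beam,
    discretised with `points` samples over +/-3 sigma around the focus."""
    sigma = fwhm / 2.3548  # convert FWHM to standard deviation
    xs = [focus + sigma * (-3.0 + 6.0 * k / (points - 1)) for k in range(points)]
    w = [math.exp(-0.5 * ((x - focus) / sigma) ** 2) for x in xs]
    return sum(wi * u(x) for wi, x in zip(w, xs)) / sum(w)

# A sharp wake deficit: 2 m/s inside the wake, 8 m/s in the free stream.
wake = lambda x: 2.0 if abs(x - 100.0) < 10.0 else 8.0
point_value = wake(100.0)                # what an in-situ probe would read
lidar_value = lidar_sample(wake, 100.0)  # smeared toward the free stream
```

    In a uniform or linearly sheared flow the symmetric weights leave the reading unbiased; the bias appears only where the velocity varies nonlinearly over the probe volume, consistent with the paper's finding that the lidar/point difference peaks at the wake edges.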

  10. Determination of Sr-90 in rain water samples

    International Nuclear Information System (INIS)

    Lima, M.F.; Cunha, I.I.L.

    1988-01-01

    This work aims to establish a radiochemical method for the determination of Sr-90 in rain water samples, as a step in an environmental monitoring program for radioactive elements. The analysis includes preconcentration of the strontium diluted in a large-volume sample by precipitation of strontium as carbonate, separation of strontium from interfering elements (calcium, barium and rare earths), separation of strontium from yttrium, precipitation of the purified strontium and yttrium as carbonate and oxalate respectively, and counting of the Sr-90 and Y-90 activities in a low-background anticoincidence beta counter. (author) [pt

  11. Volume changes of extremely large and giant intracranial aneurysms after treatment with flow diverter stents

    Energy Technology Data Exchange (ETDEWEB)

    Carneiro, Angelo; Byrne, James V. [John Radcliffe Hospital, Oxford Neurovascular and Neuroradiology Research Unit, Nuffield Department of Surgical Sciences, Oxford (United Kingdom); Rane, Neil; Kueker, Wilhelm; Cellerini, Martino; Corkill, Rufus [John Radcliffe Hospital, Department of Neuroradiology, Oxford (United Kingdom)

    2014-01-15

    This study assessed volume changes of unruptured large and giant aneurysms (greatest diameter >20 mm) after treatment with flow diverter (FD) stents. Clinical audit of the cases treated in a single institution, over a 5-year period. Demographic and clinical data were retrospectively collected from the hospital records. Aneurysm volumes were measured by manual outlining at sequential slices using computerised tomography (CT) or magnetic resonance (MR) angiography data. The audit included eight patients (seven females) with eight aneurysms. Four aneurysms involved the cavernous segment of the internal carotid artery (ICA), three the supraclinoid ICA and one the basilar artery. Seven patients presented with signs and symptoms of mass effect and one with seizures. All but one aneurysm was treated with a single FD stent; six aneurysms were also coiled (either before or simultaneously with FD placement). Minimum follow-up time was 6 months (mean 20 months). At follow-up, three aneurysms decreased in size, three were unchanged and two increased. Both aneurysms that increased in size showed persistent endosaccular flow at follow-up MR; in one case, failure was attributed to suboptimal position of the stent; in the other case, it was attributed to persistence of a side branch originating from the aneurysm (similar to the endoleak phenomenon of aortic aneurysms). At follow-up, five aneurysms were completely occluded; none of these increased in volume. Complete occlusion of the aneurysms leads, in most cases, to its shrinkage. In cases of late aneurysm growth or regrowth, consideration should be given to possible endoleak as the cause. (orig.)

  12. A systematic investigation of sample diluents in modern supercritical fluid chromatography.

    Science.gov (United States)

    Desfontaine, Vincent; Tarafder, Abhijit; Hill, Jason; Fairchild, Jacob; Grand-Guillaume Perrenoud, Alexandre; Veuthey, Jean-Luc; Guillarme, Davy

    2017-08-18

    This paper focuses on the possibility of injecting large volumes (up to 10 μL) in ultra-high performance supercritical fluid chromatography (UHPSFC) under generic gradient conditions. Several injection and method parameters were individually evaluated (i.e. analyte concentration, injection volume, initial percentage of co-solvent in the gradient, nature of the weak needle-wash solvent, nature of the sample diluent, nature of the column and of the analyte). The most critical parameters were further investigated using a multivariate approach. The overall results suggested that several aprotic solvents, including methyl tert-butyl ether (MTBE), dichloromethane, acetonitrile and cyclopentyl methyl ether (CPME), were well adapted for the injection of large volumes in UHPSFC, while MeOH was generally the worst alternative. However, the nature of the stationary phase also had a strong impact, and some of these diluents did not perform equally on each column. This was due to a competition between the analyte and the diluent for adsorption on the stationary phase. This observation introduced the idea that the sample diluent should be chosen not only according to the analyte but also to the column chemistry, to limit the interactions between the diluent and the ligands. Other important characteristics of the "ideal" SFC sample diluent were finally highlighted. Aprotic solvents with low viscosity are preferable to avoid strong solvent effects and viscous fingering, respectively. In the end, the authors suggest that the choice of the sample diluent should be part of method development, as a function of the analyte and the selected stationary phase. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. A large volume uniform plasma generator for the experiments of electromagnetic wave propagation in plasma

    International Nuclear Information System (INIS)

    Yang Min; Li Xiaoping; Xie Kai; Liu Donglin; Liu Yanming

    2013-01-01

    A large volume uniform plasma generator is proposed for experiments on electromagnetic (EM) wave propagation in plasma, to reproduce a “blackout” phenomenon of long duration in an ordinary laboratory environment. The plasma generator achieves a controllable, approximately uniform plasma in a volume of 260 mm × 260 mm × 180 mm without magnetic confinement. The plasma is produced by glow discharge, and a special discharge structure is built to provide a steady, approximately uniform plasma environment in the electromagnetic wave propagation path without any other barriers. In addition, the electron density and luminosity distributions of the plasma under different discharge conditions were diagnosed and experimentally investigated. Both the electron density and the plasma uniformity are directly proportional to the input power and roughly inversely proportional to the gas pressure in the chamber. Furthermore, experiments on electromagnetic wave propagation in plasma were conducted in this generator. Blackout phenomena at the GPS signal frequency are observed under this system, and the measured attenuation curve is in reasonable agreement with the theoretical one, which suggests the effectiveness of the proposed method.
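    The blackout condition itself follows from the standard plasma-physics approximation f_p ≈ 8.98×10³ √nₑ Hz (nₑ in cm⁻³): a wave is cut off when its frequency falls below the plasma frequency. The sketch below applies this to the GPS L1 carrier to estimate the electron density a generator like this must reach; the formula and the L1 frequency are standard values, not figures from the paper:

```python
import math

GPS_L1_HZ = 1.57542e9  # GPS L1 carrier frequency

def plasma_frequency_hz(n_e_cm3):
    """Electron plasma frequency; n_e in electrons per cm^3."""
    return 8.98e3 * math.sqrt(n_e_cm3)

def critical_density_cm3(f_hz):
    """Electron density above which a wave of frequency f is cut off."""
    return (f_hz / 8.98e3) ** 2

n_crit = critical_density_cm3(GPS_L1_HZ)  # roughly 3e10 cm^-3 for L1
```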

  14. Psychometric Evaluation of the Thought–Action Fusion Scale in a Large Clinical Sample

    Science.gov (United States)

    Meyer, Joseph F.; Brown, Timothy A.

    2015-01-01

    This study examined the psychometric properties of the 19-item Thought–Action Fusion (TAF) Scale, a measure of maladaptive cognitive intrusions, in a large clinical sample (N = 700). An exploratory factor analysis (n = 300) yielded two interpretable factors: TAF Moral (TAF-M) and TAF Likelihood (TAF-L). A confirmatory bifactor analysis was conducted on the second portion of the sample (n = 400) to account for possible sources of item covariance using a general TAF factor (subsuming TAF-M) alongside the TAF-L domain-specific factor. The bifactor model provided an acceptable fit to the sample data. Results indicated that global TAF was more strongly associated with a measure of obsessive-compulsiveness than measures of general worry and depression, and the TAF-L dimension was more strongly related to obsessive-compulsiveness than depression. Overall, results support the bifactor structure of the TAF in a clinical sample and its close relationship to its neighboring obsessive-compulsiveness construct. PMID:22315482

  15. Psychometric evaluation of the thought-action fusion scale in a large clinical sample.

    Science.gov (United States)

    Meyer, Joseph F; Brown, Timothy A

    2013-12-01

    This study examined the psychometric properties of the 19-item Thought-Action Fusion (TAF) Scale, a measure of maladaptive cognitive intrusions, in a large clinical sample (N = 700). An exploratory factor analysis (n = 300) yielded two interpretable factors: TAF Moral (TAF-M) and TAF Likelihood (TAF-L). A confirmatory bifactor analysis was conducted on the second portion of the sample (n = 400) to account for possible sources of item covariance using a general TAF factor (subsuming TAF-M) alongside the TAF-L domain-specific factor. The bifactor model provided an acceptable fit to the sample data. Results indicated that global TAF was more strongly associated with a measure of obsessive-compulsiveness than measures of general worry and depression, and the TAF-L dimension was more strongly related to obsessive-compulsiveness than depression. Overall, results support the bifactor structure of the TAF in a clinical sample and its close relationship to its neighboring obsessive-compulsiveness construct.

  16. Recurrence interval analysis of trading volumes.

    Science.gov (United States)

    Ren, Fei; Zhou, Wei-Xing

    2010-06-01

    We study the statistical properties of the recurrence intervals τ between successive trading volumes exceeding a certain threshold q. The recurrence interval analysis is carried out for the 20 liquid Chinese stocks covering a period from January 2000 to May 2009, and two Chinese indices from January 2003 to April 2009. Similar to the recurrence interval distribution of the price returns, the tail of the recurrence interval distribution of the trading volumes follows a power-law scaling, and the results are verified by the goodness-of-fit tests using the Kolmogorov-Smirnov (KS) statistic, the weighted KS statistic and the Cramér-von Mises criterion. The measurements of the conditional probability distribution and the detrended fluctuation function show that both short-term and long-term memory effects exist in the recurrence intervals between trading volumes. We further study the relationship between trading volumes and price returns based on the recurrence interval analysis method. It is found that large trading volumes are more likely to occur following large price returns, and the comovement between trading volumes and price returns is more pronounced for large trading volumes.
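    The core quantity of the analysis — the recurrence interval τ between successive observations exceeding a threshold q — is straightforward to extract from a series. The toy volume series below is an illustration, not the Chinese stock data:

```python
def recurrence_intervals(series, q):
    """Gaps (in observations) between successive values exceeding q."""
    hits = [i for i, v in enumerate(series) if v > q]
    return [b - a for a, b in zip(hits, hits[1:])]

volumes = [1, 5, 2, 1, 6, 1, 1, 7, 2, 9]
taus = recurrence_intervals(volumes, q=4)  # exceedances at indices 1, 4, 7, 9
```

    The empirical distribution of τ collected this way, over a range of thresholds q, is what the paper fits to a power-law tail and tests with the KS and Cramér-von Mises statistics.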

  17. Relationship between haemoglobin concentration and packed cell volume in cattle blood samples

    Directory of Open Access Journals (Sweden)

    Paa-Kobina Turkson

    2015-02-01

    A convention adopted in medicine is to estimate haemoglobin (Hb) concentration as a third of the packed cell volume (PCV), or vice versa. The present research set out to determine whether a proportional relationship exists between PCV and Hb concentration in cattle blood samples, and to assess the validity of the convention of estimating Hb concentration as a third of PCV. A total of 440 cattle in Ghana from four breeds (Ndama, 110; West African Short Horn, 110; Zebu, 110; Sanga, 110) were bled for haematological analysis, specifically packed cell volume using the microhaematocrit technique and haemoglobin concentration using the cyanmethaemoglobin method. Means, standard deviations, standard errors of the mean and 95% confidence intervals were calculated. Trendline analyses generated linear regression equations from scatterplots. For all the cattle, a significant and consistent relationship (r = 0.74) was found between Hb concentration and PCV (%), expressed as Hb concentration (g/dL) = 0.28 PCV + 3.11. When the Hb concentration was estimated as a third of PCV, the relationship was expressed in linear regression as Hb concentration (g/dL) = 0.83 calculated Hb + 3.11. The difference in the means of determined (12.2 g/dL) and calculated (10.9 g/dL) Hb concentrations for all cattle was significant (p < 0.001), whereas the difference in the means of determined Hb and corrected calculated Hb was not significant. In conclusion, a simplified relationship of Hb (g/dL) = (0.3 × PCV) + 3 may provide a better estimate of Hb concentration from the PCV of cattle.
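The estimates compared in the study can be written directly from the equations reported in the abstract (function names are illustrative):

```python
def hb_conventional(pcv):
    """Conventional rule of thumb: Hb (g/dL) = PCV / 3."""
    return pcv / 3.0

def hb_regression(pcv):
    """Fitted relationship reported for all cattle: Hb = 0.28 PCV + 3.11."""
    return 0.28 * pcv + 3.11

def hb_simplified(pcv):
    """Simplified relationship proposed by the authors: Hb = (0.3 PCV) + 3."""
    return 0.3 * pcv + 3.0

# for a PCV of 30%, the three estimates differ noticeably
print(hb_conventional(30), round(hb_regression(30), 2), hb_simplified(30))  # -> 10.0 11.51 12.0
```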

  18. Recovery of diverse microbes in high turbidity surface water samples using dead-end ultrafiltration.

    Science.gov (United States)

    Mull, Bonnie; Hill, Vincent R

    2012-12-01

    Dead-end ultrafiltration (DEUF) has been reported to be a simple, field-deployable technique for recovering bacteria, viruses, and parasites from large-volume water samples for water quality testing and waterborne disease investigations. While DEUF has been reported for application to water samples having relatively low turbidity, little information is available regarding recovery efficiencies for this technique when applied to turbid water samples such as those commonly found in lakes and rivers. This study evaluated the effectiveness of a DEUF technique for recovering MS2 bacteriophage, enterococci, Escherichia coli, Clostridium perfringens, and Cryptosporidium parvum oocysts in surface water samples having elevated turbidity. Average recovery efficiencies for each study microbe across all turbidity ranges were: MS2 (66%), C. parvum (49%), enterococci (85%), E. coli (81%), and C. perfringens (63%). The recovery efficiencies for MS2 and C. perfringens exhibited an inversely proportional relationship with turbidity; however, no significant differences in recovery were observed for C. parvum, enterococci, or E. coli. Although ultrafilter clogging was observed, the DEUF method was able to process 100-L surface water samples at each turbidity level within 60 min. This study supports the use of the DEUF method for recovering a wide array of microbes in large-volume surface water samples having medium to high turbidity. Published by Elsevier B.V.

  19. Analysis of Three Compounds in Flos Farfarae by Capillary Electrophoresis with Large-Volume Sample Stacking

    Directory of Open Access Journals (Sweden)

    Hai-xia Yu

    2017-01-01

    The aim of this study was to develop a method combining online concentration with high-efficiency capillary electrophoresis separation to analyze and detect three compounds (rutin, hyperoside, and chlorogenic acid) in Flos Farfarae. To obtain good resolution and enrichment, several parameters were investigated, including the choice, pH, and concentration of the running buffer, the organic modifier, the temperature, and the separation voltage. The optimized conditions were as follows: a running buffer of 40 mM NaH2PO4-40 mM borax-30% v/v methanol (pH 9.0); hydrodynamic sample injection for up to 4 s at 0.5 psi; and an applied voltage of 20 kV. A diode-array detector was used, with the detection wavelength set at 364 nm. Based on peak area, marked improvements in selectivity and sensitivity were observed, with about 14-, 26-, and 5-fold enrichment achieved for rutin, hyperoside, and chlorogenic acid, respectively. The method was successfully applied to determine the three compounds in Flos Farfarae. The linear ranges were 20-400 µg/mL, 16.5-330 µg/mL, and 25-500 µg/mL, respectively, with regression coefficients of 0.9998, 0.9999, and 0.9991.

  20. A prototype splitter apparatus for dividing large catches of small fish

    Science.gov (United States)

    Stapanian, Martin A.; Edwards, William H.

    2012-01-01

    Due to financial and time constraints, it is often necessary in fisheries studies to divide large samples of fish and estimate total catch from the subsample. The subsampling procedure may involve potential human biases or may be difficult to perform in rough conditions. We present a prototype gravity-fed splitter apparatus for dividing large samples of small fish (30–100 mm TL). The apparatus features a tapered hopper with a sliding and removable shutter. The apparatus provides a comparatively stable platform for objectively obtaining subsamples, and it can be modified to accommodate different sizes of fish and different sample volumes. The apparatus is easy to build, inexpensive, and convenient to use in the field. To illustrate the performance of the apparatus, we divided three samples (total N = 2,000 fish) composed of four fish species. Our results indicated no significant bias in estimating either the number or proportion of each species from the subsample. Use of this apparatus or a similar apparatus can help to standardize subsampling procedures in large surveys of fish. The apparatus could be used for other applications that require dividing a large amount of material into one or more smaller subsamples.
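Scaling a subsample back up to the whole catch is a one-line calculation once the split fraction is known; the sketch below uses hypothetical counts and a hypothetical retained fraction purely to illustrate the estimation step described above:

```python
def estimate_totals(subsample_counts, split_fraction):
    """Estimate whole-catch numbers per species from a subsample,
    given the fraction of the catch retained by the splitter."""
    return {species: count / split_fraction
            for species, count in subsample_counts.items()}

# hypothetical example: one quarter of the catch retained by the splitter
subsample = {"emerald shiner": 120, "rainbow smelt": 45}
print(estimate_totals(subsample, 0.25))  # -> {'emerald shiner': 480.0, 'rainbow smelt': 180.0}
```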

  1. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    Science.gov (United States)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and, by extension, metadynamics, thus allowing simulations based on these methods to use similarly large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
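The paper's isokinetic Nosé-Hoover schemes are beyond a short sketch, but the underlying multiple time step idea (r-RESPA: cheap fast forces integrated with a small inner step, expensive slow forces applied as impulses at the outer step) can be illustrated for a single particle with a stiff and a soft harmonic force. All names and parameters here are illustrative, and no resonance-avoiding thermostat is included:

```python
def respa_step(x, v, f_fast, f_slow, dt_outer, n_inner, mass=1.0):
    """One r-RESPA step: half-impulse from the slow force, n_inner
    velocity-Verlet steps under the fast force, then another half-impulse."""
    dt = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / mass
    for _ in range(n_inner):
        v += 0.5 * dt * f_fast(x) / mass
        x += dt * v
        v += 0.5 * dt * f_fast(x) / mass
    v += 0.5 * dt_outer * f_slow(x) / mass
    return x, v

# stiff "bond" force and soft "environment" force on one coordinate
k_fast, k_slow = 100.0, 1.0
f_fast = lambda x: -k_fast * x
f_slow = lambda x: -k_slow * x

energy = lambda x, v: 0.5 * v * v + 0.5 * (k_fast + k_slow) * x * x
x, v = 1.0, 0.0
e0 = energy(x, v)
for _ in range(1000):
    x, v = respa_step(x, v, f_fast, f_slow, dt_outer=0.05, n_inner=10)
print(abs(energy(x, v) - e0) / e0 < 0.05)  # -> True (energy drift stays small)
```

The resonance barrier mentioned in the abstract appears when `dt_outer` approaches half the fast period; the isokinetic schemes in the paper are designed to lift precisely that limit.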

  2. Investigating sex differences in psychological predictors of snack intake among a large representative sample

    NARCIS (Netherlands)

    Adriaanse, M.A.; Evers, C.; Verhoeven, A.A.C.; de Ridder, D.T.D.

    It is often assumed that there are substantial sex differences in eating behaviour (e.g. women are more likely to be dieters or emotional eaters than men). The present study investigates this assumption in a large representative community sample while incorporating a comprehensive set of

  3. Volume and Surface-Enhanced Volume Negative Ion Sources

    International Nuclear Information System (INIS)

    Stockli, M P

    2013-01-01

    H⁻ volume sources and, especially, caesiated H⁻ volume sources are important ion sources for generating high-intensity proton beams, which in turn generate large quantities of other particles. This chapter discusses the physics and technology of the volume production and the caesium-enhanced (surface) production of H⁻ ions. Starting with Bacal's discovery of H⁻ volume production, the chapter briefly recounts the development of some H⁻ sources that capitalized on this process to significantly increase the production of H⁻ beams. Another significant increase was achieved in the 1990s by adding caesiated surfaces to supplement the volume-produced ions with surface-produced ions, as illustrated with other H⁻ sources. Finally, the focus turns to some of the experience gained when such a source was successfully ramped up in H⁻ output and in duty factor to support the generation of 1 MW proton beams for the Spallation Neutron Source. (author)

  4. PVR: Patch-to-Volume Reconstruction for Large Area Motion Correction of Fetal MRI.

    Science.gov (United States)

    Alansary, Amir; Rajchl, Martin; McDonagh, Steven G; Murgasova, Maria; Damodaram, Mellisa; Lloyd, David F A; Davidson, Alice; Rutherford, Mary; Hajnal, Joseph V; Rueckert, Daniel; Kainz, Bernhard

    2017-10-01

    In this paper, we present a novel method for the correction of motion artifacts that are present in fetal magnetic resonance imaging (MRI) scans of the whole uterus. Contrary to current slice-to-volume registration (SVR) methods, requiring an inflexible anatomical enclosure of a single investigated organ, the proposed patch-to-volume reconstruction (PVR) approach is able to reconstruct a large field of view of non-rigidly deforming structures. It relaxes rigid motion assumptions by introducing a specific amount of redundant information that is exploited with parallelized patchwise optimization, super-resolution, and automatic outlier rejection. We further describe and provide an efficient parallel implementation of PVR allowing its execution within reasonable time on commercially available graphics processing units, enabling its use in the clinical practice. We evaluate PVR's computational overhead compared with standard methods and observe improved reconstruction accuracy in the presence of affine motion artifacts compared with conventional SVR in synthetic experiments. Furthermore, we have evaluated our method qualitatively and quantitatively on real fetal MRI data subject to maternal breathing and sudden fetal movements. We evaluate peak-signal-to-noise ratio, structural similarity index, and cross correlation with respect to the originally acquired data and provide a method for visual inspection of reconstruction uncertainty. We further evaluate the distance error for selected anatomical landmarks in the fetal head, as well as calculating the mean and maximum displacements resulting from automatic non-rigid registration to a motion-free ground truth image. These experiments demonstrate a successful application of PVR motion compensation to the whole fetal body, uterus, and placenta.

  5. Dynamics of acoustically levitated disk samples.

    Science.gov (United States)

    Xie, W J; Wei, B

    2004-10-01

    The acoustic levitation force on disk samples and the dynamics of large water drops in a planar standing wave are studied by solving the acoustic scattering problem with the boundary element method. The dependence of the levitation force amplitude on the equivalent radius R of the disks deviates markedly from the R³ law predicted by King's theory, and a larger force can be obtained for thin disks. When the disk aspect ratio γ is larger than a critical value γ* (approximately 1.9) and the disk radius a is smaller than the critical value a*(γ), the levitation force per unit volume of the sample increases as the disk is enlarged. The optimum acoustic field for stable levitation of a large water drop is obtained by adjusting the reflector-emitter interval H slightly above the resonant interval H(n). The simulation shows that the drop is flattened and the central parts of its top and bottom surfaces become concave as the sound pressure level increases, which agrees with the experimental observation. The main frequencies of the shape oscillation under different sound pressures are slightly larger than the Rayleigh frequency because of the large shape deformation. The simulated translational frequencies of the vertical vibration under normal gravity agree with the theoretical analysis.

  6. Legacy sample disposition project. Volume 2: Final report

    International Nuclear Information System (INIS)

    Gurley, R.N.; Shifty, K.L.

    1998-02-01

    This report describes the legacy sample disposition project at the Idaho Engineering and Environmental Laboratory (INEEL), which assessed Site-wide facilities/areas to locate legacy samples and owner organizations and then characterized and dispositioned these samples. This project resulted from an Idaho Department of Environmental Quality inspection of selected areas of the INEEL in January 1996, which identified some samples at the Test Reactor Area and Idaho Chemical Processing Plant that had not been characterized and dispositioned according to Resource Conservation and Recovery Act (RCRA) requirements. The objective of the project was to manage legacy samples in accordance with all applicable environmental and safety requirements. A systems engineering approach was used throughout the project, which included collecting the legacy sample information and developing a system for amending and retrieving the information. All legacy samples were dispositioned by the end of 1997. Closure of the legacy sample issue was achieved through these actions

  7. Soil sample preparation using microwave digestion for uranium analysis

    International Nuclear Information System (INIS)

    Mohagheghi, Amir H.; Preston, Rose; Akbarzadeh, Mansoor; Bakthiar, Steven

    2000-01-01

    A new sample preparation procedure has been developed for the digestion of soil samples for uranium analysis. The technique employs a microwave oven digestion system to digest the sample and prepare it for separation chemistry and analysis. The method significantly reduces the volume of acids used, eliminates a large fraction of acid vapor emissions, and speeds up the analysis time. The samples are analyzed by four separate techniques: gamma spectrometry, alpha spectroscopy using the open digestion method, kinetic phosphorescence analysis (KPA) using open digestion, and KPA using the microwave digestion technique. The results from the various analytical methods are compared and used to confirm the validity of the new procedure. The details of the preparation technique, along with its benefits, are discussed.

  8. Technical Note: New methodology for measuring viscosities in small volumes characteristic of environmental chamber particle samples

    Directory of Open Access Journals (Sweden)

    L. Renbaum-Wolff

    2013-01-01

    Herein, a method for the determination of viscosities of small sample volumes is introduced, with important implications for the viscosity determination of particle samples from environmental chambers (used to simulate atmospheric conditions). The amount of sample needed is < 1 μL, and the technique is capable of determining viscosities (η) ranging between 10⁻³ and 10³ Pascal seconds (Pa s) in samples that cover a range of chemical properties, with real-time relative humidity and temperature control; hence, the technique should be well suited for determining the viscosities, under atmospherically relevant conditions, of particles collected from environmental chambers. In this technique, supermicron particles are first deposited on an inert hydrophobic substrate. Then, insoluble beads (~1 μm in diameter) are embedded in the particles. Next, a flow of gas is introduced over the particles, which generates a shear stress on the particle surfaces. The sample responds to this shear stress by generating internal circulations, which are quantified with an optical microscope by monitoring the movement of the beads. The rate of internal circulation is shown to be a function of particle viscosity but independent of the particle material for a wide range of organic and organic-water samples. A calibration curve is constructed from the experimental data that relates the rate of internal circulation to particle viscosity, and this calibration curve is successfully used to predict viscosities in multicomponent organic mixtures.

  9. The core collapse supernova rate from 24 years of data of the Large Volume Detector

    Science.gov (United States)

    Bruno, G.; Fulgione, W.; Molinario, A.; Vigorito, C.; LVD Collaboration

    2017-09-01

    The Large Volume Detector (LVD) at the INFN Laboratori Nazionali del Gran Sasso, Italy, is a 1 kt liquid scintillator neutrino observatory mainly designed to study low-energy neutrinos from Gravitational Stellar Collapses (GSC), with 100% detection efficiency over the entire Galaxy. Here we summarize the results of the search for supernova neutrino bursts over the full data set, lasting from June 1992 to May 2016, for a total live time of 8211 days. In the absence of a positive observation, either in standalone mode or in coincidence with other experiments, we establish an upper limit on the rate of GSC events in the Milky Way: 0.1 yr⁻¹ at 90% C.L.

  10. Sample preparation and analysis of large 238PuO2 and ThO2 spheres

    International Nuclear Information System (INIS)

    Wise, R.L.; Selle, J.E.

    1975-01-01

    A program was initiated to determine the density gradient across a large spherical ²³⁸PuO₂ sample produced by vacuum hot pressing. Due to the high thermal output of the ceramic, a thin section was necessary to prevent overheating of the plastic mount. Techniques were developed for cross-sectioning, mounting, grinding, and polishing of the sample. The polished samples were then analyzed on a quantitative image analyzer to determine the density as a function of location across the sphere. The techniques for indexing, analyzing, and reducing the data are described. Typical results obtained on a ThO₂ simulant sphere are given.

  11. High failure rates after (131)I therapy in Graves hyperthyroidism patients with large thyroid volumes, high iodine uptake, and high iodine turnover.

    Science.gov (United States)

    de Jong, Jeroen A F; Verkooijen, Helena M; Valk, Gerlof D; Zelissen, Pierre M J; de Keizer, Bart

    2013-06-01

    The objective of this study was to identify patient characteristics positively and independently associated with ¹³¹I-iodide treatment failure in a large cohort of patients with Graves hyperthyroidism treated with a calculated "standard" activity of either 3.7 MBq/mL (0.1 mCi) or 7.4 MBq/mL (0.2 mCi) of thyroid volume. Data on 385 consecutive patients were prospectively collected. Clinical treatment outcome up to 1 year was analyzed in relation to thyroid volume, 5- and 24-hour ¹³¹I uptake, the 5/24-hour ¹³¹I uptake ratio, and the administered activity of radioiodine. Overall treatment results were hypothyroidism in 46%, euthyroidism in 29%, and recurrent hyperthyroidism in 26% of patients. Thyroid volume (P < 0.001), the 5/24-hour uptake ratio (P < 0.001), and the 5- and 24-hour uptakes alone (P < 0.001 and P = 0.002, respectively) were significantly associated with therapy outcome. Patients with a combination of a thyroid volume greater than 50 mL and a 5/24-hour uptake ratio of 0.8 or greater showed treatment failure in 70% and 42% (3.7 MBq/mL, n = 20; and 7.4 MBq/mL, n = 41, respectively). Thyroid volume and the 5/24-hour uptake ratio were positively and independently associated with recurrent hyperthyroidism (odds ratio [OR], 5.3; 95% confidence interval [CI], 2.39-11.76; and OR, 2.97; 95% CI, 1.59-5.59, respectively). Higher activities of 7.4 MBq/mL ¹³¹I were associated with a lower risk of treatment failure (OR, 0.34; 95% CI, 0.18-0.62). Large thyroid volumes and high 5/24-hour uptake ratios are positively and independently associated with recurrent hyperthyroidism following ¹³¹I therapy in Graves hyperthyroidism. Higher success rates can be achieved when these poor prognostic factors are taken into account. In consequence, such patients should be treated with activities greater than 7.4 MBq/mL.

  12. The Complete Local Volume Groups Sample - I. Sample selection and X-ray properties of the high-richness subsample

    Science.gov (United States)

    O'Sullivan, Ewan; Ponman, Trevor J.; Kolokythas, Konstantinos; Raychaudhury, Somak; Babul, Arif; Vrtilek, Jan M.; David, Laurence P.; Giacintucci, Simona; Gitti, Myriam; Haines, Chris P.

    2017-12-01

    We present the Complete Local-Volume Groups Sample (CLoGS), a statistically complete optically selected sample of 53 groups within 80 Mpc. Our goal is to combine X-ray, radio and optical data to investigate the relationship between member galaxies, their active nuclei and the hot intra-group medium (IGM). We describe sample selection, define a 26-group high-richness subsample of groups containing at least four optically bright (log LB ≥ 10.2 LB⊙) galaxies, and report the results of XMM-Newton and Chandra observations of these systems. We find that 14 of the 26 groups are X-ray bright, possessing a group-scale IGM extending at least 65 kpc and with luminosity >10⁴¹ erg s⁻¹, while a further three groups host smaller galaxy-scale gas haloes. The X-ray bright groups have masses in the range M500 ≃ 0.5-5 × 10¹³ M⊙, based on system temperatures of 0.4-1.4 keV, and X-ray luminosities in the range 2-200 × 10⁴¹ erg s⁻¹. We find that ∼53-65 per cent of the X-ray bright groups have cool cores, a somewhat lower fraction than found by previous archival surveys. Approximately 30 per cent of the X-ray bright groups show evidence of recent dynamical interactions (mergers or sloshing), and ∼35 per cent of their dominant early-type galaxies host active galactic nuclei with radio jets. We find no groups with unusually high central entropies, as predicted by some simulations, and confirm that CLoGS is in principle capable of detecting such systems. We identify three previously unrecognized groups, and find that they are either faint (LX,R500 < 10⁴² erg s⁻¹) with no concentrated cool core, or highly disturbed. This leads us to suggest that ∼20 per cent of X-ray bright groups in the local universe may still be unidentified.

  13. The Kinematics of the Permitted C ii λ 6578 Line in a Large Sample of Planetary Nebulae

    Energy Technology Data Exchange (ETDEWEB)

    Richer, Michael G.; Suárez, Genaro; López, José Alberto; García Díaz, María Teresa, E-mail: richer@astrosen.unam.mx, E-mail: gsuarez@astro.unam.mx, E-mail: jal@astrosen.unam.mx, E-mail: tere@astro.unam.mx [Instituto de Astronomía, Universidad Nacional Autónoma de México, Ensenada, Baja California (Mexico)

    2017-03-01

    We present spectroscopic observations of the C ii λ 6578 permitted line for 83 lines of sight in 76 planetary nebulae at high spectral resolution, most of them obtained with the Manchester Echelle Spectrograph on the 2.1 m telescope at the Observatorio Astronómico Nacional on the Sierra San Pedro Mártir. We study the kinematics of the C ii λ 6578 permitted line with respect to other permitted and collisionally excited lines. Statistically, we find that the kinematics of the C ii λ 6578 line are not those expected if this line arises from the recombination of C²⁺ ions or the fluorescence of C⁺ ions in ionization equilibrium in a chemically homogeneous nebular plasma, but instead its kinematics are those appropriate for a volume more internal than expected. The planetary nebulae in this sample have well-defined morphology and are restricted to a limited range in Hα line widths (no large values) compared to their counterparts in the Milky Way bulge; both these features could be interpreted as the result of young nebular shells, an inference that is also supported by nebular modeling. Concerning the long-standing discrepancy between chemical abundances inferred from permitted and collisionally excited emission lines in photoionized nebulae, our results imply that multiple plasma components occur commonly in planetary nebulae.

  14. Determination of environmental levels of 239,240Pu, 241Am, 137Cs, and 90Sr in large volume sea water samples

    International Nuclear Information System (INIS)

    Sutton, D.C.; Calderon, G.; Rosa, W.

    1976-06-01

    A method is reported for the determination of environmental levels of ²³⁹,²⁴⁰Pu and ²⁴¹Am in approximately 60-liter samples of seawater. ¹³⁷Cs and ⁹⁰Sr were also separated and determined from the same samples. The samples were collected at the sea surface and at various depths in the oceans through the facilities of the Woods Hole Oceanographic Institution. Plutonium and americium were separated from the seawater by iron hydroxide scavenging, then treated with a mixture of nitric, hydrochloric, and perchloric acids. A series of anion exchange separations was used to remove interferences and purify plutonium and americium; each was then electroplated on platinum disks and measured by solid-state alpha particle spectrometry. The overall chemical yields averaged 62 ± 9 and 69 ± 14 percent for the ²³⁶Pu and ²⁴³Am tracers, respectively. Following the iron hydroxide scavenge of the transuranics, cesium was removed from the acidified seawater matrix by adsorption onto ammonium phosphomolybdate. Cesium carrier and ¹³⁷Cs isolation was effected by ion exchange, and precipitations were made using chloroplatinic acid. The samples were weighed to determine overall chemical yield, then beta counted. Cesium recoveries averaged 75 ± 5 percent. After cesium was removed from the seawater matrix, the samples were neutralized with sodium hydroxide, and ammonium carbonate was added to precipitate the ⁸⁵Sr tracer and the mixed alkaline earth carbonates. Strontium was separated as the nitrate and scavenged by chromate and hydroxide precipitations. Yttrium-90 was allowed to build up for two weeks, then milked and precipitated as the oxalate, weighed, and beta counted. The overall chemical yields of the ⁸⁵Sr tracer averaged 84 ± 16 percent. The recovery of the yttrium oxalate precipitates averaged 96 ± 3 percent.

  15. Application of Conventional and K0-Based Internal Monostandard NAA Using Reactor Neutrons for Compositional Analysis of Large Samples

    International Nuclear Information System (INIS)

    Reddy, A.V.R.; Acharya, R.; Swain, K. K.; Pujari, P.K.

    2018-01-01

    Large sample neutron activation analysis (LSNAA) work was carried out for samples of coal, uranium ore, stainless steel, ancient and new clay potteries, dross, and a clay pottery replica from Peru, using low-flux, highly thermalized irradiation sites. Large as well as non-standard geometry samples (1 g - 0.5 kg) were irradiated using the thermal column (TC) facility of the Apsara reactor as well as the graphite reflector position of the critical facility (CF) at Bhabha Atomic Research Centre, Mumbai. Small (10 - 500 mg) samples were also irradiated at the core position of the Apsara reactor, the pneumatic carrier facility (PCF) of the Dhruva reactor, and the pneumatic fast transfer facility (PFTS) of the KAMINI reactor. Irradiation positions were characterized using an indium flux monitor for TC and CF, whereas multiple monitors were used at the other positions. Radioactive assay was carried out using high-resolution gamma ray spectrometry. The k₀-based internal monostandard NAA (IM-NAA) method was used to determine elemental concentration ratios with respect to Na in coal and uranium ore samples, Sc in pottery samples, and Fe in stainless steel. In situ relative detection efficiency for each irradiated sample was obtained using γ rays of activation products in the required energy range. Representative sample sizes were arrived at for coal and uranium ore from the plots of La/Na ratios as a function of the mass of the sample. For the stainless steel sample of SS 304L, the absolute concentrations were calculated from concentration ratios by a mass balance approach, since all the major elements (Fe, Cr, Ni and Mn) were amenable to NAA. Concentration ratios obtained by IM-NAA were used for a provenance study of 30 clay potteries obtained from excavated Buddhist sites of AP, India. The La to Ce concentration ratios were used for preliminary grouping, and concentration ratios of 15 elements with respect to Sc were used in statistical cluster analysis for confirmation of the grouping. 
Concentrations of Au and Ag were determined in not so

  16. Implication on the core collapse supernova rate from 21 years of data of the Large Volume Detector

    CERN Document Server

    Agafonova, N Y; Antonioli, P; Ashikhmin, V V; Badino, G.; Bari, G; Bertoni, R; Bressan, E; Bruno, G; Dadykin, V L; Dobrynina, E A; Enikeev, R I; Fulgione, W; Galeotti, P; Garbini, M; Ghia, P L; Giusti, P; Gomez, F; Kemp, E; Malgin, A S; Molinario, A; Persiani, R; Pless, I A; Porta, A; Ryasny, V G; Ryazhskaya, O G; Saavedra, O; Sartorelli, G; Shakiryanova, I R; Selvi, M; Trinchero, G C; Vigorito, C; Yakushev, V F; Zichichi, A

    2015-01-01

    The Large Volume Detector (LVD) has been continuously taking data since 1992 at the INFN Gran Sasso National Laboratory. LVD is sensitive to neutrino bursts from gravitational stellar collapses with full detection probability over the Galaxy. We have searched for neutrino bursts in LVD data taken in 7335 days of operation. No evidence of neutrino signals has been found between June 1992 and December 2013. The 90% C.L. upper limit on the rate of core-collapse and failed supernova explosions out to distances of 25 kpc is found to be 0.114/y.

  17. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    Science.gov (United States)

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  18. Effect of crowd size on patient volume at a large, multipurpose, indoor stadium.

    Science.gov (United States)

    De Lorenzo, R A; Gray, B C; Bennett, P C; Lamparella, V J

    1989-01-01

    A prediction of patient volume expected at "mass gatherings" is desirable in order to provide optimal on-site emergency medical care. While several methods of predicting patient loads have been suggested, a reliable technique has not been established. This study examines the frequency of medical emergencies at the Syracuse University Carrier Dome, a 50,500-seat indoor stadium. Patient volume and level of care at collegiate basketball and football games, as well as rock concerts, over a 7-year period were examined and tabulated. This information was analyzed using simple regression and nonparametric statistical methods to determine the level of correlation between crowd size and patient volume. These analyses demonstrated no statistically significant increase in patient volume with increasing crowd size for basketball and football events. There was a small but statistically significant increase in patient volume with increasing crowd size for concerts. A comparison of similar crowd sizes for each of the three event types showed that patient frequency is greatest for concerts and smallest for basketball. The study suggests that crowd size alone has only a minor influence on patient volume at any given event. Structuring medical services based solely on expected crowd size, without considering other influences such as event type and duration, may give poor results.

  19. Distributed database kriging for adaptive sampling (D2KAS)

    International Nuclear Information System (INIS)

    Roehm, Dominic; Pavel, Robert S.; Barros, Kipton; Rouet-Leduc, Bertrand; McPherson, Allen L.; Germann, Timothy C.; Junghans, Christoph

    2015-01-01

    We present an adaptive sampling method supplemented by a distributed database and a prediction method for multiscale simulations using the Heterogeneous Multiscale Method. A finite-volume scheme integrates the macro-scale conservation laws for elastodynamics, which are closed by momentum and energy fluxes evaluated at the micro-scale. In the original approach, molecular dynamics (MD) simulations are launched for every macro-scale volume element. Our adaptive sampling scheme replaces a large fraction of costly micro-scale MD simulations with fast table lookup and prediction. The cloud database Redis provides the plain table lookup, and with locality-aware hashing we gather input data for our prediction scheme. For the latter we use kriging, which estimates an unknown value and its uncertainty (error) at a specific location in parameter space by using weighted averages of the neighboring points. We find that our adaptive scheme significantly improves simulation performance, by a factor of 2.5 to 25, while retaining high accuracy for various choices of the algorithm parameters.
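    Kriging as described, a weighted average of neighboring points that also yields an uncertainty estimate, can be sketched in a few lines. The Gaussian covariance and its hyperparameters below are illustrative assumptions, not the settings used in D2KAS:

```python
import numpy as np

def kriging_predict(X, y, x_star, length=1.0, sigma2=1.0, nugget=1e-10):
    """Estimate the value and uncertainty at x_star as a weighted average of
    neighboring points (simple kriging / GP regression with an assumed
    Gaussian covariance; hyperparameters are illustrative)."""
    def cov(a, b):
        d = np.subtract.outer(a, b)
        return sigma2 * np.exp(-0.5 * (d / length) ** 2)

    K = cov(X, X) + nugget * np.eye(len(X))    # covariance among known points
    k = cov(X, np.atleast_1d(x_star)).ravel()  # covariance to the query point
    w = np.linalg.solve(K, k)                  # kriging weights
    mean = w @ y                               # weighted average of neighbors
    var = sigma2 - k @ w                       # prediction uncertainty (error)
    return mean, max(var, 0.0)

X = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 1.0, 0.0])
m, v = kriging_predict(X, y, 1.0)  # query at a known point: low uncertainty
```

Querying at a known point reproduces the stored value with near-zero variance, which is exactly the property that lets a database lookup replace an MD run when nearby results exist.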

  20. Plasma response to electron energy filter in large volume plasma device

    International Nuclear Information System (INIS)

    Sanyasi, A. K.; Awasthi, L. M.; Mattoo, S. K.; Srivastava, P. K.; Singh, S. K.; Singh, R.; Kaw, P. K.

    2013-01-01

    An electron energy filter (EEF) is embedded in the Large Volume Plasma Device plasma for carrying out studies on the excitation of plasma turbulence by a gradient in electron temperature (ETG), described in the paper of Mattoo et al. [S. K. Mattoo et al., Phys. Rev. Lett. 108, 255007 (2012)]. In this paper, we report results on the response of the plasma to the EEF. It is shown that inhomogeneity in the magnetic field of the EEF switches on several physical phenomena resulting in plasma regions with different characteristics, including a plasma region free from energetic electrons, suitable for the study of ETG turbulence. Specifically, we report that localized structures of plasma density, potential, electron temperature, and plasma turbulence are excited in the EEF plasma. It is shown that the structures of electron temperature and potential are created by the energy dependence of electron transport in the filter region. On the other hand, although the structure of plasma density has its origin in particle transport, two distinct steps of the density structure emerge from the dominance of collisionality in the source-EEF region and of Bohm diffusion in the EEF-target region. It is argued, and experimental evidence is provided, that a drift-like flute (Rayleigh-Taylor) instability exists in the EEF plasma.

  1. High-throughput analysis using non-depletive SPME: challenges and applications to the determination of free and total concentrations in small sample volumes.

    Science.gov (United States)

    Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz

    2018-01-18

    In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.

  2. Total and regional brain volumes in a population-based normative sample from 4 to 18 years: the NIH MRI Study of Normal Brain Development.

    Science.gov (United States)

    2012-01-01

    Using a population-based sampling strategy, the National Institutes of Health (NIH) Magnetic Resonance Imaging Study of Normal Brain Development compiled a longitudinal normative reference database of neuroimaging and correlated clinical/behavioral data from a demographically representative sample of healthy children and adolescents aged newborn through early adulthood. The present paper reports brain volume data for 325 children, ages 4.5-18 years, from the first cross-sectional time point. Measures included volumes of whole-brain gray matter (GM) and white matter (WM), left and right lateral ventricles, frontal, temporal, parietal and occipital lobe GM and WM, subcortical GM (thalamus, caudate, putamen, and globus pallidus), cerebellum, and brainstem. Associations with cross-sectional age, sex, family income, parental education, and body mass index (BMI) were evaluated. Key observations are: 1) age-related decreases in lobar GM most prominent in parietal and occipital cortex; 2) age-related increases in lobar WM, greatest in occipital, followed by the temporal lobe; 3) age-related trajectories predominantly curvilinear in females, but linear in males; and 4) small systematic associations of brain tissue volumes with BMI but not with IQ, family income, or parental education. These findings constitute a normative reference on regional brain volumes in children and adolescents.

  3. Optimization of sampling parameters for standardized exhaled breath sampling.

    Science.gov (United States)

    Doran, Sophie; Romano, Andrea; Hanna, George B

    2017-09-05

    The lack of standardization of breath sampling is a major contributing factor to the poor repeatability of results and hence represents a barrier to the adoption of breath tests in clinical practice. On-line and bag breath sampling have advantages but do not suit multicentre clinical studies, whereas storage and robust transport are essential for the conduct of wide-scale studies. Several devices have been developed to control sampling parameters and to concentrate volatile organic compounds (VOCs) onto thermal desorption (TD) tubes and subsequently transport those tubes for laboratory analysis. We conducted three experiments to investigate (i) the fraction of breath sampled (whole vs. lower expiratory exhaled breath); (ii) breath sample volume (125, 250, 500 and 1000 ml) and (iii) breath sample flow rate (400, 200, 100 and 50 ml/min). The target VOCs were acetone and potential volatile biomarkers for oesophago-gastric cancer belonging to the aldehyde, fatty acid and phenol chemical classes. We also examined the collection execution time and the impact of environmental contamination. The experiments showed that the use of exhaled breath-sampling devices requires the selection of optimum sampling parameters. Increasing the sample volume improved the levels of VOCs detected. However, the influence of the fraction of exhaled breath and the flow rate depends on the target VOCs measured. The concentration of potential volatile biomarkers for oesophago-gastric cancer was not significantly different between the whole and lower airway exhaled breath. While the recovery of phenols and acetone from TD tubes was lower when breath sampling was performed at a higher flow rate, other VOCs were not affected. A dedicated 'clean air supply' overcomes the contamination from ambient air, but the breath collection device itself can be a source of contaminants. In clinical studies using VOCs to diagnose gastro-oesophageal cancer, the optimum parameters are 500 ml sample volume

  4. Cosmological implications of a large complete quasar sample.

    Science.gov (United States)

    Segal, I E; Nicoll, J F

    1998-04-28

    Objective and reproducible determinations of the probabilistic significance levels of the deviations between theoretical cosmological prediction and direct model-independent observation are made for the Large Bright Quasar Sample [Foltz, C., Chaffee, F. H., Hewett, P. C., MacAlpine, G. M., Turnshek, D. A., et al. (1987) Astron. J. 94, 1423-1460]. The cosmologies considered are the Expanding Universe model as represented by the Friedmann-Lemaître cosmology with parameters q_0 = 0, Lambda = 0, denoted C1, and chronometric cosmology (no relevant adjustable parameters), denoted C2. The mean and the dispersion of the apparent magnitudes and the slope of the apparent magnitude-redshift relation are the directly observed statistics predicted. The C1 predictions of these cosmology-independent quantities are deviant by as much as 11 sigma from direct observation; none of the C2 predictions deviate by more than 2 sigma. The C1 deviations may be reconciled with theory by the hypothesis of quasar "evolution," which, however, appears incapable of being substantiated through direct observation. The excellent quantitative agreement of the C1 deviations with those predicted by C2 without adjustable parameters for the results of analysis predicated on C1 indicates that the evolution hypothesis may well be a theoretical artifact.

  5. A study of diabetes mellitus within a large sample of Australian twins

    DEFF Research Database (Denmark)

    Condon, Julianne; Shaw, Joanne E; Luciano, Michelle

    2008-01-01

    Twin studies of diabetes mellitus can help elucidate genetic and environmental factors in etiology and can provide valuable biological samples for testing functional hypotheses, for example using expression and methylation studies of discordant pairs. We searched the volunteer Australian Twin Registry (19,387 pairs) for twins with diabetes using disease checklists from nine different surveys conducted from 1980-2000. After follow-up questionnaires to the twins and their doctors to confirm diagnoses, we eventually identified 46 pairs where one or both had type 1 diabetes (T1D), 113 pairs with type 2 diabetes (T2D), 41 female pairs with gestational diabetes (GD), 5 pairs with impaired glucose tolerance (IGT) and one pair with MODY. Heritabilities of T1D, T2D and GD were all high, but our samples did not have the power to detect effects of shared environment unless they were very large.

  6. Is Business Failure Due to Lack of Effort? Empirical Evidence from a Large Administrative Sample

    NARCIS (Netherlands)

    Ejrnaes, M.; Hochguertel, S.

    2013-01-01

    Does insurance provision reduce entrepreneurs' effort to avoid business failure? We exploit unique features of the voluntary Danish unemployment insurance (UI) scheme, that is available to the self-employed. Using a large sample of self-employed individuals, we estimate the causal effect of

  7. Numerical Study for a Large Volume Droplet on the Dual-rough Surface: Apparent Contact Angle, Contact Angle Hysteresis and Transition Barrier.

    Science.gov (United States)

    Dong, Jian; Jin, Yanli; Dong, He; Liu, Jiawei; Ye, Senbin

    2018-06-14

    The profile, apparent contact angle (ACA), contact angle hysteresis (CAH) and wetting-state transition energy barrier (WSTEB) are important static and dynamic properties of a large volume droplet on a hierarchical surface. Understanding them can provide important insights into functional surfaces and promote their application in the corresponding areas. In this paper, we established three theoretical models (Model 1, Model 2 and Model 3) and corresponding numerical methods, obtained by free energy minimization and a nonlinear optimization algorithm, to predict the profile, ACA, CAH and WSTEB of a large volume droplet on a horizontal regular dual-rough surface. By taking into account gravity, the energy barrier on the contact circle, and the dual heterogeneous structures and their roughness on the surface, the models are more universal and accurate than previous models. The predictions of the models were shown to be in good agreement with the results from experiment and the literature. The models are promising as novel design approaches for functional surfaces, which are frequently applied in microfluidic chips, water self-catchment systems and dropwise condensation heat transfer systems.
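    For orientation, the classical single-scale limits that such dual-rough models generalize are the Wenzel and Cassie-Baxter relations for the apparent contact angle. A sketch of both (these neglect gravity, hysteresis and the dual-scale structure treated in the paper):

```python
import math

def wenzel_aca(theta_deg, r):
    """Wenzel relation: cos(theta*) = r * cos(theta), roughness ratio r >= 1."""
    c = max(-1.0, min(1.0, r * math.cos(math.radians(theta_deg))))
    return math.degrees(math.acos(c))

def cassie_baxter_aca(theta_deg, f):
    """Cassie-Baxter relation: cos(theta*) = f * (cos(theta) + 1) - 1,
    with f the solid fraction in contact with the droplet (0 < f <= 1)."""
    c = f * (math.cos(math.radians(theta_deg)) + 1.0) - 1.0
    return math.degrees(math.acos(c))

# Roughness amplifies an intrinsically hydrophobic angle in both regimes:
print(wenzel_aca(110.0, 1.5))        # larger than the intrinsic 110 degrees
print(cassie_baxter_aca(110.0, 0.2)) # air pockets push the angle higher still
```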

  8. In-situ high resolution particle sampling by large time sequence inertial spectrometry

    International Nuclear Information System (INIS)

    Prodi, V.; Belosi, F.

    1990-09-01

    In situ sampling is always preferred, when possible, because of the artifacts that can arise when the aerosol has to flow through long sampling lines. On the other hand, the amount of possible losses can be calculated with some confidence only when the size distribution can be measured with a sufficient precision and the losses are not too large. This makes it desirable to sample directly in the vicinity of the aerosol source or containment. High temperature sampling devices with a detailed aerodynamic separation are extremely useful to this purpose. Several measurements are possible with the inertial spectrometer (INSPEC), but not with cascade impactors or cyclones. INSPEC - INertial SPECtrometer - has been conceived to measure the size distribution of aerosols by separating the particles while airborne according to their size and collecting them on a filter. It consists of a channel of rectangular cross-section with a 90 degree bend. Clean air is drawn through the channel, with a thin aerosol sheath injected close to the inner wall. Due to the bend, the particles are separated according to their size, leaving the original streamline by a distance which is a function of particle inertia and resistance, i.e. of aerodynamic diameter. The filter collects all the particles of the same aerodynamic size at the same distance from the inlet, in a continuous distribution. INSPEC particle separation at high temperature (up to 800 C) has been tested with Zirconia particles as calibration aerosols. The feasibility study has been concerned with resolution and time sequence sampling capabilities under high temperature (700 C)

  9. Green sample preparation for liquid chromatography and capillary electrophoresis of anionic and cationic analytes.

    Science.gov (United States)

    Wuethrich, Alain; Haddad, Paul R; Quirino, Joselito P

    2015-04-21

    A sample preparation device for the simultaneous enrichment and separation of cationic and anionic analytes was designed and implemented in an eight-channel configuration. The device is based on the use of an electric field to transfer the analytes from a large volume of sample into small volumes of electrolyte that was suspended into two glass micropipettes using a conductive hydrogel. This simple, economical, fast, and green (no organic solvent required) sample preparation scheme was evaluated using cationic and anionic herbicides as test analytes in water. The analytical figures of merit and ecological aspects were evaluated against the state-of-the-art sample preparation, solid-phase extraction. A drastic reduction in both sample preparation time (94% faster) and resources (99% less consumables used) was observed. Finally, the technique in combination with high-performance liquid chromatography and capillary electrophoresis was applied to analysis of quaternary ammonium and phenoxypropionic acid herbicides in fortified river water as well as drinking water (at levels relevant to Australian guidelines). The presented sustainable sample preparation approach could easily be applied to other charged analytes or adopted by other laboratories.

  10. Field sampling, preparation procedure and plutonium analyses of large freshwater samples

    International Nuclear Information System (INIS)

    Straelberg, E.; Bjerk, T.O.; Oestmo, K.; Brittain, J.E.

    2002-01-01

    This work is part of an investigation of the mobility of plutonium in freshwater systems containing humic substances. A well-defined bog-stream system located in the catchment area of a subalpine lake, Oevre Heimdalsvatn, Norway, is being studied. During the summer of 1999, six water samples were collected from the tributary stream Lektorbekken and the lake itself. However, the analyses showed that the plutonium concentration was below the detection limit in all the samples. Therefore renewed sampling at the same sites was carried out in August 2000. The results so far are in agreement with previous analyses from the Heimdalen area. However, 100 times higher concentrations are found in the lowlands in the eastern part of Norway. The reason for this is not understood, but may be caused by differences in the concentrations of humic substances and/or the fact that the mountain areas are covered with snow for a longer period of time every year. (LN)

  11. IN SITU NON-INVASIVE SOIL CARBON ANALYSIS: SAMPLE SIZE AND GEOSTATISTICAL CONSIDERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    WIELOPOLSKI, L.

    2005-04-01

    I discuss a new approach for quantitative carbon analysis in soil based on inelastic neutron scattering (INS). Although this INS method is not simple, it offers critical advantages not available with other newly emerging modalities. The key advantages of the INS system include the following: (1) It is a non-destructive method, i.e., no samples of any kind are taken. A neutron generator (NG) placed above the ground irradiates the soil, stimulating carbon characteristic gamma-ray emission that is counted by a detection system also placed above the ground. (2) The INS system can undertake multielemental analysis, thus expanding its usefulness. (3) It can be used in either static or scanning modes. (4) The volume sampled by the INS method is large, with a large footprint; when operating in a scanning mode, the sampled volume is continuous. (5) Except for a moderate initial cost of about $100,000 for the system, no additional expenses are required for its operation over two to three years, after which the NG has to be replenished with a new tube at an approximate cost of $10,000, regardless of the number of sites analyzed. In light of these characteristics, the INS system appears invaluable for monitoring changes in the carbon content in the field. For this purpose no calibration is required; by establishing a carbon index, changes in carbon yield can be followed over time in exactly the same location, thus giving a percent change. On the other hand, with calibration, it can be used to determine the carbon stock in the ground, thus estimating the soil's carbon inventory. However, this requires revising the standard practices for deciding upon the number of sites required to attain a given confidence level, in particular for the purposes of upward scaling. Then, geostatistical considerations should be incorporated to account properly for the averaging effects of the large volumes sampled by the INS system; this would require revising standard practices in the field for determining the number of spots to

  12. Small sample whole-genome amplification

    Science.gov (United States)

    Hara, Christine; Nguyen, Christine; Wheeler, Elizabeth; Sorensen, Karen; Arroyo, Erin; Vrankovich, Greg; Christian, Allen

    2005-11-01

    Many challenges arise when trying to amplify and analyze human samples collected in the field due to limitations in sample quantity, and contamination of the starting material. Tests such as DNA fingerprinting and mitochondrial typing require a certain sample size and are carried out in large volume reactions; in cases where insufficient sample is present, whole genome amplification (WGA) can be used. WGA allows very small quantities of DNA to be amplified in a way that enables subsequent DNA-based tests to be performed. A limiting step to WGA is sample preparation. To minimize the necessary sample size, we have developed two modifications of WGA: the first allows for an increase in amplified product from small, nanoscale, purified samples with the use of carrier DNA, while the second is a single-step method for cleaning and amplifying samples all in one column. Conventional DNA cleanup involves binding the DNA to silica, washing away impurities, and then releasing the DNA for subsequent testing. We have eliminated losses associated with incomplete sample release, thereby decreasing the required amount of starting template for DNA testing. Both techniques address the limitations of sample size by providing ample copies of genomic samples. Carrier DNA, included in our WGA reactions, can be used when amplifying samples with the standard purification method, or can be used in conjunction with our single-step DNA purification technique to potentially further decrease the amount of starting sample necessary for future forensic DNA-based assays.

  13. A topological analysis of large-scale structure, studied using the CMASS sample of SDSS-III

    International Nuclear Information System (INIS)

    Parihar, Prachi; Gott, J. Richard III; Vogeley, Michael S.; Choi, Yun-Young; Kim, Juhan; Kim, Sungsoo S.; Speare, Robert; Brownstein, Joel R.; Brinkmann, J.

    2014-01-01

    We study the three-dimensional genus topology of large-scale structure using the northern region of the CMASS Data Release 10 (DR10) sample of the SDSS-III Baryon Oscillation Spectroscopic Survey. We select galaxies with redshift 0.452 < z < 0.625 and with a stellar mass M_stellar > 10^11.56 M_☉. We study the topology at two smoothing lengths: R_G = 21 h^-1 Mpc and R_G = 34 h^-1 Mpc. The genus topology studied at the R_G = 21 h^-1 Mpc scale results in the highest genus amplitude observed to date. The CMASS sample yields a genus curve that is characteristic of one produced by Gaussian random phase initial conditions. The data thus support the standard model of inflation where random quantum fluctuations in the early universe produced Gaussian random phase initial conditions. Modest deviations in the observed genus from random phase are as expected from shot noise effects and the nonlinear evolution of structure. We suggest the use of a fitting formula motivated by perturbation theory to characterize the shift and asymmetries in the observed genus curve with a single parameter. We construct 54 mock SDSS CMASS surveys along the past light cone from the Horizon Run 3 (HR3) N-body simulations, where gravitationally bound dark matter subhalos are identified as the sites of galaxy formation. We study the genus topology of the HR3 mock surveys with the same geometry and sampling density as the observational sample and find the observed genus topology to be consistent with ΛCDM as simulated by the HR3 mock samples. We conclude that the topology of the large-scale structure in the SDSS CMASS sample is consistent with cosmological models having primordial Gaussian density fluctuations growing in accordance with general relativity to form galaxies in massive dark matter halos.
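    For reference, the genus curve of a Gaussian random field follows the standard template g(nu) = A (1 - nu^2) exp(-nu^2 / 2), where nu is the density threshold in units of the standard deviation and the amplitude A depends on the smoothed power spectrum; observed curves are compared against this shape. A sketch with an arbitrary normalization:

```python
import numpy as np

def gaussian_genus(nu, A=1.0):
    """Genus-per-unit-volume template for a Gaussian random field:
    g(nu) = A * (1 - nu**2) * exp(-nu**2 / 2).
    The amplitude A is left as a free normalization here."""
    nu = np.asarray(nu, dtype=float)
    return A * (1.0 - nu**2) * np.exp(-nu**2 / 2.0)

# The curve peaks at the median density (nu = 0, sponge-like topology),
# crosses zero at nu = +/-1, and is negative in the meatball/void regimes.
nu = np.linspace(-3.0, 3.0, 7)
g = gaussian_genus(nu)
```

Shifts and asymmetries of a measured genus curve relative to this symmetric template are what the single-parameter fitting formula mentioned above is designed to capture.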

  14. Gene coexpression measures in large heterogeneous samples using count statistics.

    Science.gov (United States)

    Wang, Y X Rachel; Waterman, Michael S; Huang, Haiyan

    2014-11-18

    With the advent of high-throughput technologies making large-scale gene expression data readily available, developing appropriate computational tools to process these data and distill insights into systems biology has been an important part of the "big data" challenge. Gene coexpression is one of the earliest techniques developed that is still widely in use for functional annotation, pathway analysis, and, most importantly, the reconstruction of gene regulatory networks, based on gene expression data. However, most coexpression measures do not specifically account for local features in expression profiles. For example, it is very likely that the patterns of gene association may change or only exist in a subset of the samples, especially when the samples are pooled from a range of experiments. We propose two new gene coexpression statistics based on counting local patterns of gene expression ranks to take into account the potentially diverse nature of gene interactions. In particular, one of our statistics is designed for time-course data with local dependence structures, such as time series coupled over a subregion of the time domain. We provide asymptotic analysis of their distributions and power, and evaluate their performance against a wide range of existing coexpression measures on simulated and real data. Our new statistics are fast to compute, robust against outliers, and show comparable and often better general performance.
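    The idea of counting local rank patterns can be made concrete with a toy statistic: count the sliding windows in which two expression profiles share the same within-window ranking. This is a simplified illustration of the counting approach, not the exact W statistics proposed in the paper:

```python
import numpy as np

def local_rank_match_count(x, y, w=3):
    """Count length-w sliding windows where profiles x and y have the
    same within-window expression ranking (rank-based, so robust to
    outliers and monotone rescaling)."""
    x, y = np.asarray(x), np.asarray(y)
    count = 0
    for i in range(len(x) - w + 1):
        rx = np.argsort(np.argsort(x[i:i + w]))  # ranks within the window
        ry = np.argsort(np.argsort(y[i:i + w]))
        count += int(np.array_equal(rx, ry))
    return count

x = [1.0, 2.0, 3.0, 1.5, 0.5]
y = [10.0, 20.0, 30.0, 15.0, 5.0]   # monotone transform of x
print(local_rank_match_count(x, y))  # -> 3: every window's pattern matches
```

Because the count is accumulated window by window, an association that exists only over a subregion of a time course still contributes, which is the local-dependence property motivating the paper's statistics.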

  15. Gaussian vs. Bessel light-sheets: performance analysis in live large sample imaging

    Science.gov (United States)

    Reidt, Sascha L.; Correia, Ricardo B. C.; Donnachie, Mark; Weijer, Cornelis J.; MacDonald, Michael P.

    2017-08-01

    Lightsheet fluorescence microscopy (LSFM) has rapidly progressed in the past decade from an emerging technology into an established methodology. This progress has largely been driven by its suitability to developmental biology, where it is able to give excellent spatial-temporal resolution over relatively large fields of view with good contrast and low phototoxicity. In many respects it is superseding confocal microscopy. However, it is no magic bullet and still struggles to image deeply in more highly scattering samples. Many solutions to this challenge have been presented, including, Airy and Bessel illumination, 2-photon operation and deconvolution techniques. In this work, we show a comparison between a simple but effective Gaussian beam illumination and Bessel illumination for imaging in chicken embryos. Whilst Bessel illumination is shown to be of benefit when a greater depth of field is required, it is not possible to see any benefits for imaging into the highly scattering tissue of the chick embryo.

  16. A study of toxic emissions from a coal-fired power plant utilizing an ESP/Wet FGD system. Volume 1, Sampling, results, and special topics: Final report

    Energy Technology Data Exchange (ETDEWEB)

    1994-07-01

    This was one of a group of assessments of toxic emissions from coal-fired power plants, conducted for DOE-PETC in 1993 as mandated by the 1990 Clean Air Act. It is organized into 2 volumes; Volume 1 describes the sampling effort, presents the concentration data on toxic chemicals in several power plant streams, and reports the results of evaluations and calculations. The study involved solid, liquid, and gaseous samples from input, output, and process streams at Coal Creek Station Unit No. 1, Underwood, North Dakota (1100 MW mine-mouth plant burning lignite from the Falkirk mine located adjacent to the plant). This plant had an electrostatic precipitator and a wet scrubber flue gas desulfurization unit. Measurements were conducted on June 21--24, 26, and 27, 1993; chemicals measured were 6 major and 16 trace elements (including Hg, Cr, Cd, Pb, Se, As, Be, Ni), acids and corresponding anions (HCl, HF, chloride, fluoride, phosphate, sulfate), ammonia and cyanide, elemental C, radionuclides, VOCs, semivolatiles (incl. PAH, polychlorinated dioxins, furans), and aldehydes. Volume 2: Appendices includes process data log sheets, field sampling data sheets, uncertainty calculations, and quality assurance results.

  17. A Comparison of Soil-Water Sampling Techniques

    Science.gov (United States)

    Tindall, J. A.; Figueroa-Johnson, M.; Friedel, M. J.

    2007-12-01

    The representativeness of soil pore water extracted by suction lysimeters in ground-water monitoring studies is a problem that often confounds interpretation of measured data. Current soil water sampling techniques cannot identify the soil volume from which a pore water sample is extracted, whether macroscopic, microscopic, or preferential flowpath. This research was undertaken to compare values from suction lysimeter samples extracted from intact soil cores with samples obtained by direct extraction methods, to determine what portion of soil pore water is sampled by each method. Intact soil cores (30 centimeter (cm) diameter by 40 cm height) were extracted from two different sites - a sandy soil near Altamonte Springs, Florida and a clayey soil near Centralia in Boone County, Missouri. Isotopically labeled water (18O, analyzed by mass spectrometry) and bromide concentrations (from applied KBr, measured using ion chromatography) in water samples taken by suction lysimeters were compared with samples obtained by the direct extraction methods of centrifugation and azeotropic distillation. Water samples collected by direct extraction were about 0.25 ‰ more negative (depleted) than suction lysimeter values from the sandy soil and about 2-7 ‰ more negative from the well-structured clayey soil. Results indicate that the majority of soil water in well-structured soil is strongly bound to soil grain surfaces and is not easily sampled by suction lysimeters. In cases where a sufficient volume of water has passed through the soil profile and displaced previous pore water, suction lysimeters will collect a representative sample of soil pore water from the sampled depth interval. It is suggested that for stable isotope studies monitoring precipitation and soil water, suction lysimeters should be installed at shallow depths (10 cm). Samples should also be coordinated with precipitation events. The data also indicate that each extraction method be used to sample a different

  18. A Parallel, Finite-Volume Algorithm for Large-Eddy Simulation of Turbulent Flows

    Science.gov (United States)

    Bui, Trong T.

    1999-01-01

    A parallel, finite-volume algorithm has been developed for large-eddy simulation (LES) of compressible turbulent flows. This algorithm includes piecewise linear least-square reconstruction, trilinear finite-element interpolation, Roe flux-difference splitting, and second-order MacCormack time marching. Parallel implementation is done using the message-passing programming model. In this paper, the numerical algorithm is described. To validate the numerical method for turbulence simulation, LES of fully developed turbulent flow in a square duct is performed for a Reynolds number of 320 based on the average friction velocity and the hydraulic diameter of the duct. Direct numerical simulation (DNS) results are available for this test case, and the accuracy of this algorithm for turbulence simulations can be ascertained by comparing the LES solutions with the DNS results. The effects of grid resolution, upwind numerical dissipation, and subgrid-scale dissipation on the accuracy of the LES are examined. Comparison with DNS results shows that the standard Roe flux-difference splitting dissipation adversely affects the accuracy of the turbulence simulation. For accurate turbulence simulations, only 3-5 percent of the standard Roe flux-difference splitting dissipation is needed.
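The key numerical finding, that only 3-5 percent of the standard Roe dissipation should be retained for accurate LES, can be illustrated on the simplest model problem. This is my own reduction to 1D scalar linear advection, not the paper's compressible Navier-Stokes solver; `eps` scales the upwind dissipation term:

```python
def roe_flux_1d(u_left, u_right, a, eps=0.04):
    """Roe-type interface flux for u_t + a*u_x = 0.
    eps = 1.0 recovers the standard fully upwind Roe flux;
    eps ~ 0.03-0.05 keeps only a few percent of the dissipation,
    as the paper's LES results suggest."""
    central = 0.5 * a * (u_left + u_right)            # non-dissipative average
    dissipation = 0.5 * eps * abs(a) * (u_right - u_left)
    return central - dissipation

f_standard = roe_flux_1d(1.0, 0.0, a=1.0, eps=1.0)    # = a*u_left (pure upwind)
f_les      = roe_flux_1d(1.0, 0.0, a=1.0, eps=0.04)   # nearly central flux
```

With eps near zero the flux approaches the central average, minimizing numerical damping of the resolved turbulent fluctuations; the small residual dissipation keeps the scheme stable.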

  19. Translational and Brownian motion in laser-Doppler flowmetry of large tissue volumes

    International Nuclear Information System (INIS)

    Binzoni, T; Leung, T S; Seghier, M L; Delpy, D T

    2004-01-01

This study reports the derivation of a precise mathematical relationship between the different p-moments of the power spectrum of the photoelectric current obtained from a laser-Doppler flowmeter (LDF) and the red blood cell speed. Its main feature is that both the Brownian movement (defining the 'biological zero') and the translational movement are taken into account, clarifying the exact contribution of each to the LDF-derived signals. The derivation of the equations is based on quasi-elastic scattering theory and holds for multiple scattering (i.e., measurements in large tissue volumes and/or at very high red blood cell concentration). The paper also discusses why, experimentally, there exists a range in which the relationship between the first moment of the power spectrum and the average red blood cell speed may be considered 'linear', and which physiological determinants can result in nonlinearity. A correct way to subtract the biological zero from the LDF data is also proposed. The findings should help in the design of improved LDF instruments and in the interpretation of experimental data.
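Numerically, the p-moments of the photocurrent power spectrum and the biological-zero subtraction discussed above can be sketched as follows. This is a generic discrete-spectrum illustration under my own assumptions (trapezoidal integration, proportionality constants omitted), not the closed-form relations derived in the paper:

```python
def spectral_moment(freqs, power, p):
    """p-th moment of a power spectrum: integral of f**p * P(f) df,
    computed by the trapezoidal rule over a sampled spectrum."""
    total = 0.0
    for i in range(len(freqs) - 1):
        df = freqs[i + 1] - freqs[i]
        total += 0.5 * (freqs[i]**p * power[i]
                        + freqs[i + 1]**p * power[i + 1]) * df
    return total

def flow_index(freqs, power, power_bio_zero):
    """First-moment flow index with the 'biological zero' spectrum
    (e.g. recorded during arterial occlusion) subtracted before
    normalizing by the zeroth moment."""
    net = [p_i - b_i for p_i, b_i in zip(power, power_bio_zero)]
    return spectral_moment(freqs, net, 1) / spectral_moment(freqs, net, 0)

# flat toy spectrum from 0 to 100 Hz with zero biological background:
# the index reduces to the mean frequency, 50 Hz
freqs = [float(f) for f in range(101)]
idx = flow_index(freqs, [1.0] * 101, [0.0] * 101)
```

The ratio M1/M0 is the quantity that commercial LDF instruments report as perfusion; the paper's point is that subtracting the biological-zero spectrum, rather than a scalar offset, is the correct way to isolate the translational contribution.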

  20. Determination of tributyltin in environmental water matrices using stir bar sorptive extraction with in-situ derivatisation and large volume injection-gas chromatography-mass spectrometry.

    Science.gov (United States)

    Neng, N R; Santalla, R P; Nogueira, J M F

    2014-08-01

Stir bar sorptive extraction with in-situ derivatization using sodium tetrahydridoborate (NaBH4), followed by liquid desorption and large volume injection-gas chromatography-mass spectrometry detection under the selected ion monitoring mode (SBSE(NaBH4)in-situ-LD/LVI-GC-MS(SIM)), was successfully developed for the determination of tributyltin (TBT) in environmental water matrices. NaBH4 proved to be an effective and easy in-situ speciation agent for TBT in aqueous media, allowing the formation of adducts with enough stability and suitable polarity for SBSE analysis. Assays performed on water samples spiked at the 10.0 μg/L level yielded convenient recoveries (68.2 ± 3.0%), good accuracy, suitable precision (RSD < 9.0%), low detection limits (23 ng/L) and an excellent linear dynamic range (r² = 0.9999) from 0.1 to 170.0 μg/L under optimized experimental conditions. By using the standard addition method, the application of the present methodology to real surface water samples allowed very good performance at the trace level. The proposed methodology proved to be a feasible alternative for routine quality control analysis: easy to implement, reliable and sensitive enough to monitor TBT in environmental water matrices. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Large volume unresectable locally advanced non-small cell lung cancer: acute toxicity and initial outcome results with rapid arc

    Directory of Open Access Journals (Sweden)

    Fogliata Antonella

    2010-10-01

Background: To report acute toxicity, initial outcome results and planning therapeutic parameters in radiation treatment of advanced lung cancer (stage III) with volumetric modulated arcs using RapidArc (RA). Methods: Twenty-four consecutive patients were treated with RA. All had locally advanced non-small cell lung cancer, stage IIIA-IIIB, with large volumes (GTV: 299 ± 175 cm³, PTV: 818 ± 206 cm³). Dose prescription was 66 Gy in 33 fractions to the mean PTV. Delivery was performed with two partial arcs with a 6 MV photon beam. Results: From a dosimetric point of view, RA allowed us to respect most planning objectives on target volumes and organs at risk. In particular: for GTV, D1% = 105.6 ± 1.7%, D99% = 96.7 ± 1.8%, D5%-D95% = 6.3 ± 1.4%; contra-lateral lung mean dose was 13.7 ± 3.9 Gy; for spinal cord, D1% = 39.5 ± 4.0 Gy; for heart, V45Gy = 9.0 ± 7.0 Gy; for esophagus, D1% = 67.4 ± 2.2 Gy. Delivery time was 133 ± 7 s. At three months, partial remission > 50% was observed in 56% of patients. Acute toxicity at 3 months comprised grade 1 esophageal toxicity in 91% and grade 2 in 9% of patients; 18% presented grade 1 and 9% grade 2 pneumonia; no grade 3 acute toxicity was observed. The short follow-up does not allow assessment of local control and progression-free survival. Conclusions: RA proved to be a safe and advantageous treatment modality for NSCLC with large volumes. Long-term observation of patients is needed to assess outcome and late toxicity.

  2. Waardenburg syndrome: Novel mutations in a large Brazilian sample.

    Science.gov (United States)

    Bocángel, Magnolia Astrid Pretell; Melo, Uirá Souto; Alves, Leandro Ucela; Pardono, Eliete; Lourenço, Naila Cristina Vilaça; Marcolino, Humberto Vicente Cezar; Otto, Paulo Alberto; Mingroni-Netto, Regina Célia

    2018-06-01

    This paper deals with the molecular investigation of Waardenburg syndrome (WS) in a sample of 49 clinically diagnosed probands (most from southeastern Brazil), 24 of them having the type 1 (WS1) variant (10 familial and 14 isolated cases) and 25 being affected by the type 2 (WS2) variant (five familial and 20 isolated cases). Sequential Sanger sequencing of all coding exons of PAX3, MITF, EDN3, EDNRB, SOX10 and SNAI2 genes, followed by CNV detection by MLPA of PAX3, MITF and SOX10 genes in selected cases revealed many novel pathogenic variants. Molecular screening, performed in all patients, revealed 19 causative variants (19/49 = 38.8%), six of them being large whole-exon deletions detected by MLPA, seven (four missense and three nonsense substitutions) resulting from single nucleotide substitutions (SNV), and six representing small indels. A pair of dizygotic affected female twins presented the c.430delC variant in SOX10, but the mutation, imputed to gonadal mosaicism, was not found in their unaffected parents. At least 10 novel causative mutations, described in this paper, were found in this Brazilian sample. Copy-number-variation detected by MLPA identified the causative mutation in 12.2% of our cases, corresponding to 31.6% of all causative mutations. In the majority of cases, the deletions were sporadic, since they were not present in the parents of isolated cases. Our results, as a whole, reinforce the fact that the screening of copy-number-variants by MLPA is a powerful tool to identify the molecular cause in WS patients. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  3. Chemical Method of Urine Volume Measurement

    Science.gov (United States)

    Petrack, P.

    1967-01-01

    A system has been developed and qualified as flight hardware for the measurement of micturition volumes voided by crewmen during Gemini missions. This Chemical Urine Volume Measurement System (CUVMS) is used for obtaining samples of each micturition for post-flight volume determination and laboratory analysis for chemical constituents of physiological interest. The system is versatile with respect to volumes measured, with a capacity beyond the largest micturition expected to be encountered, and with respect to mission duration of inherently indefinite length. The urine sample is used for the measurement of total micturition volume by a tracer dilution technique, in which a fixed, predetermined amount of tritiated water is introduced and mixed into the voided urine, and the resulting concentration of the tracer in the sample is determined with a liquid scintillation spectrometer. The tracer employed does not interfere with the analysis for the chemical constituents of the urine. The CUVMS hardware consists of a four-way selector valve in which an automatically operated tracer metering pump is incorporated, a collection/mixing bag, and tracer storage accumulators. The assembled system interfaces with a urine receiver at the selector valve inlet, sample bags which connect to the side of the selector valve, and a flexible hose which carries the excess urine to the overboard drain connection. Results of testing have demonstrated system volume measurement accuracy within the specification limits of +/-5%, and operating reliability suitable for system use aboard the GT-7 mission, in which it was first used.
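The tracer dilution computation at the heart of the CUVMS is simple: the voided volume follows from the known amount of tracer divided by its measured concentration in the mixed sample. A sketch with invented numbers (the actual system used tritiated water counted by liquid scintillation spectrometry):

```python
def micturition_volume(tracer_amount, sample_concentration, tracer_volume=0.0):
    """Total voided volume by tracer dilution: known tracer amount
    (e.g. in dpm of tritium) divided by its concentration measured in
    the well-mixed sample, minus the small added tracer volume."""
    return tracer_amount / sample_concentration - tracer_volume

# hypothetical figures: 1.0e6 dpm of tracer mixed into the void; the
# sample counts 2.5e3 dpm/mL; 1 mL of tracer solution was added
v_mL = micturition_volume(1.0e6, 2.5e3, tracer_volume=1.0)   # 399.0 mL
```

Because only the concentration ratio matters, the method is insensitive to how much excess urine is discarded overboard, which is why it suits a system with no fixed collection capacity.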

  4. Tracing the trajectory of skill learning with a very large sample of online game players.

    Science.gov (United States)

    Stafford, Tom; Dewar, Michael

    2014-02-01

    In the present study, we analyzed data from a very large sample (N = 854,064) of players of an online game involving rapid perception, decision making, and motor responding. Use of game data allowed us to connect, for the first time, rich details of training history with measures of performance from participants engaged for a sustained amount of time in effortful practice. We showed that lawful relations exist between practice amount and subsequent performance, and between practice spacing and subsequent performance. Our methodology allowed an in situ confirmation of results long established in the experimental literature on skill acquisition. Additionally, we showed that greater initial variation in performance is linked to higher subsequent performance, a result we link to the exploration/exploitation trade-off from the computational framework of reinforcement learning. We discuss the benefits and opportunities of behavioral data sets with very large sample sizes and suggest that this approach could be particularly fecund for studies of skill acquisition.

  5. Effect, Feasibility, and Clinical Relevance of Cell Enrichment in Large Volume Fat Grafting: A Systematic Review.

    Science.gov (United States)

    Rasmussen, Bo Sonnich; Lykke Sørensen, Celine; Vester-Glowinski, Peter Viktor; Herly, Mikkel; Trojahn Kølle, Stig-Frederik; Fischer-Nielsen, Anne; Drzewiecki, Krzysztof Tadeusz

    2017-07-01

Large volume fat grafting is limited by unpredictable volume loss; therefore, methods of improving graft retention have been developed. Fat graft enrichment with either stromal vascular fraction (SVF) cells or adipose tissue-derived stem/stromal cells (ASCs) has been investigated in several animal and human studies, and significantly improved graft retention has been reported. Improvement of graft retention and the feasibility of these techniques are equally important in evaluating the clinical relevance of cell enrichment. We conducted a systematic search of PubMed to identify studies on fat graft enrichment that used either SVF cells or ASCs; only studies reporting volume assessment were included. A total of 38 articles (15 human and 23 animal) were included to investigate the effects of cell enrichment on graft retention as well as the feasibility and clinical relevance of cell-enriched fat grafting. Improvements in graft retention, the SVF to fat (SVF:fat) ratio, and the ASC concentration used for enrichment were emphasized. We proposed an increased retention rate greater than 1.5-fold relative to nonenriched grafts and a maximum SVF:fat ratio of 1:1 as the thresholds for clinical relevance and feasibility, respectively. Nine studies fulfilled these criteria, of which six used ASCs for enrichment. We found no convincing evidence of a clinically relevant effect of SVF enrichment in humans. ASC enrichment has shown promising results in enhancing graft retention, but additional clinical trials are needed to substantiate this claim and to determine the optimal concentration of SVF cells/ASCs for enrichment. Level of Evidence: 4. © 2017 The American Society for Aesthetic Plastic Surgery, Inc.

  6. Imaging a Large Sample with Selective Plane Illumination Microscopy Based on Multiple Fluorescent Microsphere Tracking

    Science.gov (United States)

    Ryu, Inkeon; Kim, Daekeun

    2018-04-01

A typical selective plane illumination microscopy (SPIM) image size is limited by the field of view, which is a characteristic of the objective lens. If an image larger than the imaging area of the sample is to be obtained, image stitching, which combines step-scanned images into a single panoramic image, is required. However, accurately registering the step-scanned images is very difficult because the SPIM system uses a customized sample mount in which uncertainties in the translational and rotational motions exist. In this paper, an image registration technique based on multiple fluorescent microsphere tracking is proposed, which quantifies the constellation of, and measures the distances between, at least two fluorescent microspheres embedded in the sample. Image stitching results are demonstrated for optically cleared large tissue with various staining methods. Compensation for the effect of the sample rotation that occurs during translational motion of the sample mount is also discussed.

  7. The Brief Negative Symptom Scale (BNSS): Independent validation in a large sample of Italian patients with schizophrenia.

    Science.gov (United States)

    Mucci, A; Galderisi, S; Merlotti, E; Rossi, A; Rocca, P; Bucci, P; Piegari, G; Chieffi, M; Vignapiano, A; Maj, M

    2015-07-01

    The Brief Negative Symptom Scale (BNSS) was developed to address the main limitations of the existing scales for the assessment of negative symptoms of schizophrenia. The initial validation of the scale by the group involved in its development demonstrated good convergent and discriminant validity, and a factor structure confirming the two domains of negative symptoms (reduced emotional/verbal expression and anhedonia/asociality/avolition). However, only relatively small samples of patients with schizophrenia were investigated. Further independent validation in large clinical samples might be instrumental to the broad diffusion of the scale in clinical research. The present study aimed to examine the BNSS inter-rater reliability, convergent/discriminant validity and factor structure in a large Italian sample of outpatients with schizophrenia. Our results confirmed the excellent inter-rater reliability of the BNSS (the intraclass correlation coefficient ranged from 0.81 to 0.98 for individual items and was 0.98 for the total score). The convergent validity measures had r values from 0.62 to 0.77, while the divergent validity measures had r values from 0.20 to 0.28 in the main sample (n=912) and in a subsample without clinically significant levels of depression and extrapyramidal symptoms (n=496). The BNSS factor structure was supported in both groups. The study confirms that the BNSS is a promising measure for quantifying negative symptoms of schizophrenia in large multicenter clinical studies. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  8. Large-Animal Biventricular Working Heart Perfusion System with Low Priming Volume-Comparison between in vivo and ex vivo Cardiac Function.

    Science.gov (United States)

    Abicht, Jan-Michael; Mayr, Tanja Axinja Jelena; Jauch, Judith; Guethoff, Sonja; Buchholz, Stefan; Reichart, Bruno; Bauer, Andreas

    2018-01-01

Existing large-animal, ex vivo, cardiac perfusion models are restricted in their ability to establish an ischemia/reperfusion condition as seen in cardiac surgery or transplantation. Other working heart systems only challenge one ventricle or require a substantially larger priming volume. We describe a novel biventricular cardiac perfusion system with reduced priming volume. Juvenile pig hearts were cardiopleged, explanted, and reperfused ex vivo after 150 minutes of cold ischemia. Autologous whole blood was used as perfusate (minimal priming volume 350 mL). After 15 minutes of Langendorff perfusion, the system was switched into a biventricular working mode (WM) and studied for 3 hours. During reperfusion, complete unloading of both ventricles and constant-pressure coronary perfusion were achieved. During working mode perfusion, the preload and afterload pressures of both ventricles were controlled within the targeted physiologic range. Functional parameters such as the left ventricular work index were reduced in the ex vivo working mode (in vivo: 787 ± 186 vs. 1 h WM: 498 ± 66 mm Hg·mL/g·min). The preparation remained stable over the 3 hours of working mode perfusion, while functional and blood parameters were easily accessible. Moreover, because of the minimal priming volume, the novel ex vivo cardiac perfusion circuit allows for autologous perfusion, using the limited amount of blood available from the organ-donating animal. Georg Thieme Verlag KG Stuttgart · New York.

  9. Albumin infusion in patients undergoing large-volume paracentesis: a meta-analysis of randomized trials.

    Science.gov (United States)

    Bernardi, Mauro; Caraceni, Paolo; Navickis, Roberta J; Wilkes, Mahlon M

    2012-04-01

    Albumin infusion reduces the incidence of postparacentesis circulatory dysfunction among patients with cirrhosis and tense ascites, as compared with no treatment. Treatment alternatives to albumin, such as artificial colloids and vasoconstrictors, have been widely investigated. The aim of this meta-analysis was to determine whether morbidity and mortality differ between patients receiving albumin versus alternative treatments. The meta-analysis included randomized trials evaluating albumin infusion in patients with tense ascites. Primary endpoints were postparacentesis circulatory dysfunction, hyponatremia, and mortality. Eligible trials were sought by multiple methods, including computer searches of bibliographic and abstract databases and the Cochrane Library. Results were quantitatively combined under a fixed-effects model. Seventeen trials with 1,225 total patients were included. There was no evidence of heterogeneity or publication bias. Compared with alternative treatments, albumin reduced the incidence of postparacentesis circulatory dysfunction (odds ratio [OR], 0.39; 95% confidence interval [CI], 0.27-0.55). Significant reductions in that complication by albumin were also shown in subgroup analyses versus each of the other volume expanders tested (e.g., dextran, gelatin, hydroxyethyl starch, and hypertonic saline). The occurrence of hyponatremia was also decreased by albumin, compared with alternative treatments (OR, 0.58; 95% CI, 0.39-0.87). In addition, mortality was lower in patients receiving albumin than alternative treatments (OR, 0.64; 95% CI, 0.41-0.98). This meta-analysis provides evidence that albumin reduces morbidity and mortality among patients with tense ascites undergoing large-volume paracentesis, as compared with alternative treatments investigated thus far. Copyright © 2011 American Association for the Study of Liver Diseases.
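The fixed-effects model used above combines trials by inverse-variance weighting of the log odds ratios. A minimal sketch with invented 2x2 tables (not the trials included in this meta-analysis):

```python
import math

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled odds ratio with 95% CI.
    Each table is (events_treat, n_treat, events_ctrl, n_ctrl)."""
    num = den = 0.0
    for a, n1, c, n2 in tables:
        b, d = n1 - a, n2 - c
        log_or = math.log((a * d) / (b * c))
        var = 1/a + 1/b + 1/c + 1/d        # Woolf variance of the log OR
        w = 1.0 / var
        num += w * log_or
        den += w
    pooled = num / den                     # weighted mean of log ORs
    se = math.sqrt(1.0 / den)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

# two invented trials in which the albumin arm has fewer events
or_hat, (lo, hi) = pooled_or([(10, 100, 20, 100), (8, 80, 15, 80)])
```

An upper confidence limit below 1.0, as in the review's OR 0.39 (95% CI 0.27-0.55) for postparacentesis circulatory dysfunction, is what marks the pooled effect as statistically significant.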

  10. Curvature computation in volume-of-fluid method based on point-cloud sampling

    Science.gov (United States)

    Kassar, Bruno B. M.; Carneiro, João N. E.; Nieckele, Angela O.

    2018-01-01

    This work proposes a novel approach to compute interface curvature in multiphase flow simulation based on Volume of Fluid (VOF) method. It is well documented in the literature that curvature and normal vector computation in VOF may lack accuracy mainly due to abrupt changes in the volume fraction field across the interfaces. This may cause deterioration on the interface tension forces estimates, often resulting in inaccurate results for interface tension dominated flows. Many techniques have been presented over the last years in order to enhance accuracy in normal vectors and curvature estimates including height functions, parabolic fitting of the volume fraction, reconstructing distance functions, coupling Level Set method with VOF, convolving the volume fraction field with smoothing kernels among others. We propose a novel technique based on a representation of the interface by a cloud of points. The curvatures and the interface normal vectors are computed geometrically at each point of the cloud and projected onto the Eulerian grid in a Front-Tracking manner. Results are compared to benchmark data and significant reduction on spurious currents as well as improvement in the pressure jump are observed. The method was developed in the open source suite OpenFOAM® extending its standard VOF implementation, the interFoam solver.
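A minimal 2D version of the per-point geometric step can be sketched as follows, assuming each interface point and its two neighbors in the cloud are known: the circle through three points has curvature κ = 4·area/(|a||b||c|). The paper works on 3D point clouds projected back to the Eulerian grid; this sketch covers only the local curvature estimate:

```python
import math

def curvature_3pt(p0, p1, p2):
    """Signed curvature at p1 from the circumscribed circle through
    three neighboring interface points: kappa = 4*area / (|a|*|b|*|c|)."""
    ax, ay = p1[0] - p0[0], p1[1] - p0[1]
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    cx, cy = p2[0] - p0[0], p2[1] - p0[1]
    cross = ax * by - ay * bx                      # 2 * signed triangle area
    denom = math.hypot(ax, ay) * math.hypot(bx, by) * math.hypot(cx, cy)
    return 2.0 * cross / denom                     # = 4*area / (|a||b||c|)

# three points sampled on a circle of radius 2 -> curvature 1/R = 0.5
pts = [(2 * math.cos(t), 2 * math.sin(t)) for t in (0.0, 0.1, 0.2)]
k = curvature_3pt(*pts)
```

Because the estimate uses point positions directly rather than derivatives of the abruptly varying volume fraction field, it avoids the main source of error the abstract identifies.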

  11. Efficient inference of population size histories and locus-specific mutation rates from large-sample genomic variation data.

    Science.gov (United States)

    Bhaskar, Anand; Wang, Y X Rachel; Song, Yun S

    2015-02-01

    With the recent increase in study sample sizes in human genetics, there has been growing interest in inferring historical population demography from genomic variation data. Here, we present an efficient inference method that can scale up to very large samples, with tens or hundreds of thousands of individuals. Specifically, by utilizing analytic results on the expected frequency spectrum under the coalescent and by leveraging the technique of automatic differentiation, which allows us to compute gradients exactly, we develop a very efficient algorithm to infer piecewise-exponential models of the historical effective population size from the distribution of sample allele frequencies. Our method is orders of magnitude faster than previous demographic inference methods based on the frequency spectrum. In addition to inferring demography, our method can also accurately estimate locus-specific mutation rates. We perform extensive validation of our method on simulated data and show that it can accurately infer multiple recent epochs of rapid exponential growth, a signal that is difficult to pick up with small sample sizes. Lastly, we use our method to analyze data from recent sequencing studies, including a large-sample exome-sequencing data set of tens of thousands of individuals assayed at a few hundred genic regions. © 2015 Bhaskar et al.; Published by Cold Spring Harbor Laboratory Press.

  12. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  13. Fluidic sampling

    International Nuclear Information System (INIS)

    Houck, E.D.

    1992-01-01

This paper covers the development of the fluidic sampler and its testing in a fluidic transfer system. The major findings are as follows. Fluidic jet samplers can dependably produce unbiased samples of acceptable volume. The fluidic transfer system with a fluidic sampler in-line will transfer water to a net lift of 37.2--39.9 feet at an average rate of 0.02--0.05 gpm (77--192 cc/min). The fluidic sample system circulation rate compares very favorably with the normal 0.016--0.026 gpm (60--100 cc/min) circulation rate that is commonly produced for this lift and solution with the jet-assisted airlift sample system normally used at ICPP. The volume of the sample taken with a fluidic sampler depends on the motive pressure to the fluidic sampler, the sample bottle size, and the fluidic sampler jet characteristics. The fluidic sampler should be supplied with fluid at a motive pressure of 140--150 percent of the peak vacuum-producing motive pressure for the jet in the sampler. Fluidic transfer systems should be operated by emptying a full pumping chamber to nearly empty or empty during the pumping cycle; this maximizes the solution transfer rate.

  14. Insufficient evidence of benefit regarding mortality due to albumin substitution in HCC-free cirrhotic patients undergoing large volume paracentesis.

    Science.gov (United States)

    Kütting, Fabian; Schubert, Jens; Franklin, Jeremy; Bowe, Andrea; Hoffmann, Vera; Demir, Muenevver; Pelc, Agnes; Nierhoff, Dirk; Töx, Ulrich; Steffen, Hans-Michael

    2017-02-01

    Current guidelines for clinical practice recommend the infusion of human albumin after large volume paracentesis. After inspecting the current evidence behind this recommendation, we decided to conduct a systematic review and meta-analysis in order to address the effect of albumin on mortality and morbidity in the context of large volume paracentesis. We performed a comprehensive search of large databases and abstract books of conference proceedings up to March 15th 2016 for randomized controlled trials, testing the infusion of human albumin against alternatives (vs no treatment, vs plasma expanders; vs vasoconstrictors) in HCC-free patients suffering from cirrhosis. We analyzed these trials with regard to mortality, changes in plasma renin activity (PRA), hyponatremia, renal impairment, recurrence of ascites with consequential re-admission into hospital and additional complications. We employed trial sequential analysis in order to calculate the number of patients required in controlled trials to be able to determine a statistically significant advantage of the administration of one agent over another with regard to mortality. We were able to include 21 trials totaling 1277 patients. While the administration of albumin prevents a rise in PRA as well as hyponatremia, no improvement in strong clinical endpoints such as mortality could be demonstrated. Trial sequential analysis showed that at least 1550 additional patients need to be recruited into RCTs and analyzed with regard to this question in order to detect or disprove a 25% mortality effect. There is insufficient evidence that the infusion of albumin after LVP significantly lowers mortality in HCC-free patients with advanced liver disease. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
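The required information size reported by trial sequential analysis is, at its core, a conventional two-proportion sample-size computation. A sketch under my own illustrative assumptions (a 20% baseline mortality risk and standard alpha/power; the authors' actual TSA parameters are not stated in the abstract):

```python
import math
from statistics import NormalDist

def n_per_group(p_ctrl, rrr, alpha=0.05, power=0.80):
    """Patients per arm needed to detect a relative risk reduction
    `rrr` from a control-arm event risk `p_ctrl`, using the normal
    approximation with a two-sided significance level `alpha`."""
    p_trt = p_ctrl * (1.0 - rrr)
    z_a = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    z_b = NormalDist().inv_cdf(power)
    var = p_ctrl * (1 - p_ctrl) + p_trt * (1 - p_trt)
    return math.ceil((z_a + z_b) ** 2 * var / (p_ctrl - p_trt) ** 2)

# detecting a 25% relative mortality reduction from a 20% baseline risk
n = n_per_group(0.20, 0.25)   # roughly 900 patients per arm
```

Roughly 1800 patients in total under these invented parameters, which makes the review's conclusion plausible: with only 1277 patients accrued across 21 heterogeneous trials, a 25% mortality effect can be neither detected nor excluded.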

  15. Sample cell for powder x-ray diffraction at up to 500 bars and 200 deg. C

    International Nuclear Information System (INIS)

    Jupe, Andrew C.; Wilkinson, Angus P.

    2006-01-01

A low cost sample cell for powder diffraction at high pressure and temperature that employs either sapphire or steel pressure tubes is described. The cell can be assembled rapidly, facilitating the study of chemically reacting systems, and it provides good control of both pressure and temperature in a regime where diamond anvil cells and multianvil apparatus cannot be used. The design provides a relatively large sample volume, making it suitable for the study of quite large grain size materials, such as hydrating cement slurries. However, relatively high-energy X-rays are needed to penetrate the pressure tube.

  16. Recent results of the investigation of a micro-fluidic sampling chip and sampling system for hot cell aqueous processing streams

    International Nuclear Information System (INIS)

    Tripp, J.; Smith, T.; Law, J.

    2013-01-01

A Fuel Cycle Research and Development project has investigated an innovative sampling method that could evolve into the next generation sampling and analysis system for metallic elements present in aqueous processing streams. Initially, sampling technologies were evaluated, and micro-fluidic sampling chip technology was selected and tested. A conceptual design for a fully automated microcapillary-based system was completed and a robotic automated sampling system was fabricated. The mechanical and sampling operation of the completed sampling system was investigated. Different sampling volumes were tested; the 10 μl volume produced data with much smaller relative standard deviations than the 2 μl volume. In addition, the production of a less expensive, mass-produced sampling chip was investigated to avoid chip reuse, thus increasing sampling reproducibility and accuracy. The micro-fluidic-based robotic sampling system's mechanical elements were tested to ensure analytical reproducibility and optimum robotic handling of the micro-fluidic sampling chips. (authors)

  17. Low Energy Neutrino Astronomy in the future large-volume liquid-scintillator detector LENA

    International Nuclear Information System (INIS)

    Wurm, Michael; Feilitzsch, F V; Goeger-Neff, M; Lewke, T; Undagoitia, T Marrodan; Oberauer, L; Potzel, W; Todor, S; Winter, J

    2008-01-01

The recent successes in neutrino physics prove that liquid-scintillator detectors make it possible to combine high energy resolution, efficient means of background reduction, and a large detection volume. In the planned LENA (Low Energy Neutrino Astronomy) experiment, a target mass of 50 kt will enable the investigation of a variety of terrestrial and astrophysical neutrino sources. The high-statistics spectroscopy of geoneutrinos, solar neutrinos and supernova neutrinos will provide new insights into the heat production processes of Earth and Sun, and the workings of a gravitational collapse. The same measurements will also probe neutrino properties such as oscillation parameters and the mass hierarchy. A first spectroscopic measurement of the low flux of the diffuse supernova neutrino background is within the sensitivity of the LENA detector. Finally, a lifetime limit of several 10^34 years can be set on the proton decay into a kaon and an anti-neutrino, testing the predictions of SUSY theory. The present contribution includes a review of the scientific studies performed in the last years as well as a report on currently ongoing R and D activities.

  18. Low Energy Neutrino Astronomy in the future large-volume liquid-scintillator detector LENA

    Energy Technology Data Exchange (ETDEWEB)

    Wurm, Michael; Feilitzsch, F V; Goeger-Neff, M; Lewke, T; Undagoitia, T Marrodan; Oberauer, L; Potzel, W; Todor, S; Winter, J [E15 Chair for Astroparticle Physics, Technische Universität München, Physik Department, James-Franck-Str., D-85748 Garching (Germany)

    2008-11-01

    The recent successes in neutrino physics prove that liquid-scintillator detectors combine high energy resolution, efficient means of background reduction, and a large detection volume. In the planned LENA (Low Energy Neutrino Astronomy) experiment, a target mass of 50 kt will enable the investigation of a variety of terrestrial and astrophysical neutrino sources. High-statistics spectroscopy of geoneutrinos, solar neutrinos, and supernova neutrinos will provide new insights into the heat-production processes of the Earth and Sun and into the workings of a gravitational collapse. The same measurements will also probe neutrino properties such as the oscillation parameters and the mass hierarchy. A first spectroscopic measurement of the low flux of the diffuse supernova neutrino background is within the sensitivity of the LENA detector. Finally, a lifetime limit of several 10^34 years can be set on the proton decay into a kaon and an anti-neutrino, testing the predictions of SUSY theories. The present contribution includes a review of the scientific studies performed in recent years as well as a report on currently ongoing R and D activities.

  19. Psychometric Properties of the Penn State Worry Questionnaire for Children in a Large Clinical Sample

    Science.gov (United States)

    Pestle, Sarah L.; Chorpita, Bruce F.; Schiffman, Jason

    2008-01-01

    The Penn State Worry Questionnaire for Children (PSWQ-C; Chorpita, Tracey, Brown, Collica, & Barlow, 1997) is a 14-item self-report measure of worry in children and adolescents. Although the PSWQ-C has demonstrated favorable psychometric properties in small clinical and large community samples, this study represents the first psychometric…

  20. Evaluation of the AGCU Expressmarker 16 and 22 PCR Amplification Kits Using Biological Samples Applied to FTA Micro Cards in Reduced Volume Direct PCR Amplification Reactions

    Directory of Open Access Journals (Sweden)

    Samantha J Ogden

    2015-01-01

    Full Text Available This study evaluated the performance of the Wuxi AGCU ScienTech Incorporation (HuiShan, Wuxi, China) AGCU Expressmarker 16 (EX16) and 22 (EX22) short tandem repeat (STR) amplification kits in reduced reaction volumes using direct polymerase chain reaction (PCR) amplification workflows. The commercially available PowerPlex® 21 (PP21) System (Promega, Wisconsin, USA), which follows similar direct workflows, was used as a reference. Anticoagulated blood applied to chemically impregnated FTA™ Micro Cards (GE Healthcare UK Limited, Amersham Place, Little Chalfont, Buckinghamshire, HP7 9NA, UK) was used to represent a complex biological sample. Allelic concordance, first-pass success rate, average peak heights, heterozygous peak height ratios (HPHRs), and intracolor and intercolor peak height balance were determined. In reduced-volume PCR reactions, the performance of both the EX16 and EX22 STR amplification kits was comparable to that of the PP21 System, and this level of performance was maintained at PCR reaction volumes 40% of those recommended. The EX22 and PP21 System kits possess comparable overlapping genome coverage. A concordance analysis compared the performance of the EX16 and EX22 kits using human blood applied to FTA™ Micro Cards in combination with full, half, and reduced PCR reaction volumes, with the PP21 System (Promega) as the reference kit. Where appropriate, the distributions of data were assessed using the Shapiro-Wilk test. For normally distributed data, statistics were calculated using analysis of variance (ANOVA), and for nonparametric data the Wilcoxon

  1. Sample preparation for accelerator mass spectrometry at the University of Washington

    International Nuclear Information System (INIS)

    Grootes, P.M.; Stuiver, M.; Farwell, G.W.; Schmidt, F.H.

    1981-01-01

    The adaptation of the University of Washington FN tandem Van de Graaff to accelerator mass spectrometry (AMS), as well as some of the results obtained, are described in another paper in this volume (Farwell et al., 1981). Here we discuss our experiences in preparing carbon and beryllium samples that give large and stable ion beams when used in our Extrion cesium sputter source with an inverted cesium beam geometry

  2. Analysis of reflection-peak wavelengths of sampled fiber Bragg gratings with large chirp.

    Science.gov (United States)

    Zou, Xihua; Pan, Wei; Luo, Bin

    2008-09-10

    The reflection-peak wavelengths (RPWs) in the spectra of sampled fiber Bragg gratings with large chirp (SFBGs-LC) are theoretically investigated. Such RPWs are divided into two parts, the RPWs of equivalent uniform SFBGs (U-SFBGs) and the wavelength shift caused by the large chirp in the grating period (CGP). We propose a quasi-equivalent transform to deal with the CGP. That is, the CGP is transferred into quasi-equivalent phase shifts to directly derive the Fourier transform of the refractive index modulation. Then, in the case of both the direct and the inverse Talbot effect, the wavelength shift is obtained from the Fourier transform. Finally, the RPWs of SFBGs-LC can be achieved by combining the wavelength shift and the RPWs of equivalent U-SFBGs. Several simulations are shown to numerically confirm these predicted RPWs of SFBGs-LC.

  3. Automated Sample Preparation for Radiogenic and Non-Traditional Metal Isotopes: Removing an Analytical Barrier for High Sample Throughput

    Science.gov (United States)

    Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.

    2014-05-01

    MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces this laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths) but also very flexible, and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carryover, for samples in a variety of different matrices have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ as a powerful tool in radiogenic and non-traditional isotope research.

  4. Intralesional and metastatic heterogeneity in malignant melanomas demonstrated by stereologic estimates of nuclear volume

    DEFF Research Database (Denmark)

    Sørensen, Flemming Brandt; Erlandsen, M

    1990-01-01

    Regional variability of nuclear 3-dimensional size can be estimated objectively using point-sampled intercepts obtained from different, defined zones within individual neoplasms. In the present study, stereologic estimates of the volume-weighted mean nuclear volume, nuclear vv, within peripheral...... on average larger in the peripheral zones of primary melanomas, than nuclear vv in central zones (2p = 6.7 x 10(-4), whereas no zonal differences were demonstrated in metastatic lesions (2p = 0.21). A marked intraindividual variation was demonstrated between primary and corresponding secondary melanomas (2p...... melanomas showed large interindividual variation. This finding emphasizes that unbiased estimates of nuclear vv are robust to regional heterogeneity of nuclear volume and thus suitable for purposes of objective, quantitative malignancy grading of melanomas....
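The point-sampled-intercept technique named in this abstract has a compact estimator: the volume-weighted mean nuclear volume is π/3 times the mean cubed intercept length (the Gundersen-Jensen estimator). A minimal sketch with hypothetical intercept measurements:

```python
# Volume-weighted mean nuclear volume from point-sampled intercepts:
# vv = (pi/3) * mean(l^3). Intercept lengths below are hypothetical.
import math

def vv_from_intercepts(lengths_um):
    """Volume-weighted mean nuclear volume (um^3) from intercept lengths (um)."""
    return (math.pi / 3.0) * sum(l ** 3 for l in lengths_um) / len(lengths_um)

intercepts = [5.1, 6.3, 4.8, 7.0, 5.5]  # measured intercept lengths in um
print(round(vv_from_intercepts(intercepts), 1))  # ~210.0
```

Because the cubing weights long intercepts heavily, the estimator is sensitive to the largest nuclei, which is what makes it useful for malignancy grading.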

  5. Enrichment of circulating tumor cells from a large blood volume using leukapheresis and elutriation: proof of concept.

    Science.gov (United States)

    Eifler, Robert L; Lind, Judith; Falkenhagen, Dieter; Weber, Viktoria; Fischer, Michael B; Zeillinger, Robert

    2011-03-01

    The aim of this study was to determine the applicability of a sequential process using leukapheresis, elutriation, and fluorescence-activated cell sorting (FACS) to enrich and isolate circulating tumor cells from a large blood volume to allow further molecular analysis. Mononuclear cells were collected from 10 L of blood by leukapheresis, to which carboxyfluorescein succinimidyl ester prelabeled CaOV-3 tumor cells were spiked at a ratio of 26 to 10⁶ leukocytes. Elutriation separated the spiked leukapheresates primarily by cell size into distinct fractions, and leukocytes and tumor cells, characterized as carboxyfluorescein succinimidyl ester positive, EpCAM positive and CD45 negative events, were quantified by flow cytometry. Tumor cells were isolated from the last fraction using FACS or anti-EpCAM coupled immunomagnetic beads, and their recovery and purity determined by fluorescent microscopy and real-time PCR. Leukapheresis collected 13.5 x 10⁹ mononuclear cells with 87% efficiency. In total, 53 to 78% of spiked tumor cells were pre-enriched in the last elutriation fraction among 1.6 x 10⁹ monocytes. Flow cytometry predicted a circulating tumor cell purity of ~90% giving an enrichment of 100,000-fold following leukapheresis, elutriation, and FACS, where CaOV-3 cells were identified as EpCAM positive and CD45 negative events. FACS confirmed this purity. Alternatively, immunomagnetic bead adsorption recovered 10% of tumor cells with a median purity of 3.5%. This proof of concept study demonstrated that elutriation and FACS following leukapheresis are able to enrich and isolate tumor cells from a large blood volume for molecular characterization. Copyright © 2010 International Clinical Cytometry Society.

  6. Recent Trends in Microextraction Techniques Employed in Analytical and Bioanalytical Sample Preparation

    Directory of Open Access Journals (Sweden)

    Abuzar Kabir

    2017-12-01

    Full Text Available Sample preparation has been recognized as a major step in the chemical analysis workflow. As such, substantial efforts have been made in recent years to simplify the overall sample preparation process. Major focuses of these efforts have included miniaturization of the extraction device; minimizing or eliminating toxic and hazardous organic solvent consumption; eliminating sample pre-treatment and post-treatment steps; reducing the sample volume requirement; reducing the extraction equilibrium time; and maximizing extraction efficiency. All these improved attributes are congruent with the Green Analytical Chemistry (GAC) principles. Classical sample preparation techniques such as solid phase extraction (SPE) and liquid-liquid extraction (LLE) are being rapidly replaced with emerging miniaturized and environmentally friendly techniques such as Solid Phase Micro Extraction (SPME), Stir bar Sorptive Extraction (SBSE), Micro Extraction by Packed Sorbent (MEPS), Fabric Phase Sorptive Extraction (FPSE), and Dispersive Liquid-Liquid Micro Extraction (DLLME). In addition to the development of many new generic extraction sorbents in recent years, a large number of molecularly imprinted polymers (MIPs) created using different template molecules have also enriched the large cache of microextraction sorbents. The application of nanoparticles as high-performance extraction sorbents has undoubtedly elevated the extraction efficiency and method sensitivity of modern chromatographic analyses to a new level. Combining magnetic nanoparticles with many microextraction sorbents has opened up new possibilities for extracting target analytes from sample matrices containing high levels of matrix interferents. The aim of the current review is to critically audit the progress of microextraction techniques in recent years, which has indisputably transformed analytical chemistry practices, from biological and therapeutic drug monitoring to the environmental field; from foods to phyto

  7. Superwind Outflows in Seyfert Galaxies? : Large-Scale Radio Maps of an Edge-On Sample

    Science.gov (United States)

    Colbert, E.; Gallimore, J.; Baum, S.; O'Dea, C.

    1995-03-01

    Large-scale galactic winds (superwinds) are commonly found flowing out of the nuclear region of ultraluminous infrared and powerful starburst galaxies. Stellar winds and supernovae from the nuclear starburst provide the energy to drive these superwinds. The outflowing gas escapes along the rotation axis, sweeping up and shock-heating clouds in the halo, which produces optical line emission, radio synchrotron emission, and X-rays. These features can most easily be studied in edge-on systems, so that the wind emission is not confused by that from the disk. We have begun a systematic search for superwind outflows in Seyfert galaxies. In an earlier optical emission-line survey, we found extended minor axis emission and/or double-peaked emission line profiles in >~30% of the sample objects. We present here large-scale (6cm VLA C-config) radio maps of 11 edge-on Seyfert galaxies, selected (without bias) from a distance-limited sample of 23 edge-on Seyferts. These data have been used to estimate the frequency of occurrence of superwinds. Preliminary results indicate that four (36%) of the 11 objects observed and six (26%) of the 23 objects in the distance-limited sample have extended radio emission oriented perpendicular to the galaxy disk. This emission may be produced by a galactic wind blowing out of the disk. Two (NGC 2992 and NGC 5506) of the nine objects for which we have both radio and optical data show good evidence for a galactic wind in both datasets. We suggest that galactic winds occur in >~30% of all Seyferts. A goal of this work is to find a diagnostic that can be used to distinguish between large-scale outflows that are driven by starbursts and those that are driven by an AGN. The presence of starburst-driven superwinds in Seyferts, if established, would have important implications for the connection between starburst galaxies and AGN.

  8. Environmental report 1995. Volume 2

    International Nuclear Information System (INIS)

    Harrach, R.J.; Failor, R.A.; Gallegos, G.M.

    1996-09-01

    This is Volume 2 of the Lawrence Livermore National Laboratory's (LLNL's) annual Environmental Report 1995. This volume is intended to support summary data from Volume 1 and is essentially a detailed data report that provides additional data points, where applicable. Some summary data are also included in Volume 2, and more detailed accounts are given of sample collection and analytical methods. Volume 2 includes information in eight chapters on monitoring of air, air effluent, sewage, surface water, ground water, soil and sediment, vegetation and foodstuff, and environmental radiation, as well as three chapters on ground water protection, compliance self-monitoring and quality assurance

  9. Evaluation of sampling inhalable PM10 particulate matter (≤ 10 μm) using co-located high volume samplers

    International Nuclear Information System (INIS)

    Rajoy, R R S; Dias, J W C; Rego, E C P; Netto, A D Pereira

    2015-01-01

    This paper presents the results of the determination of the concentrations of atmospheric particulate matter ≤ 10 μm (PM10) collected simultaneously by six PM10 high-volume samplers from two different manufacturers installed at the same location. Fifteen 24-h samples were obtained with each sampler at a selected urban area of Rio de Janeiro city. The concentration of PM10 ranged between 10.73 and 54.04 μg m⁻³. The samplers were considered comparable to each other, as the adopted methodology presented good repeatability
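The quantity reported by a high-volume sampler is simply the filter mass gain divided by the volume of air drawn through it over the run. A minimal sketch of that calculation, with hypothetical mass and flow values (not taken from the paper):

```python
# PM10 concentration from a high-volume sampler: filter mass gain
# divided by the total air volume sampled. Numbers are hypothetical.

def pm10_concentration(mass_gain_g: float, flow_m3_min: float, hours: float) -> float:
    """Return the PM10 concentration in micrograms per cubic metre."""
    volume_m3 = flow_m3_min * 60 * hours        # total air volume sampled
    return mass_gain_g * 1e6 / volume_m3        # g -> ug, divided by m^3

# Example: 0.0350 g collected at 1.13 m^3/min over a 24 h run
print(round(pm10_concentration(0.0350, 1.13, 24.0), 2))  # 21.51
```

The result falls in the 10.73-54.04 μg m⁻³ range the study reports for urban Rio de Janeiro.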

  10. Blink and it's done: Interactive queries on very large data

    OpenAIRE

    Agarwal, Sameer; Iyer, Anand P.; Panda, Aurojit; Mozafari, Barzan; Stoica, Ion; Madden, Samuel R.

    2012-01-01

    In this demonstration, we present BlinkDB, a massively parallel, sampling-based approximate query processing framework for running interactive queries on large volumes of data. The key observation in BlinkDB is that one can make reasonable decisions in the absence of perfect answers. BlinkDB extends the Hive/HDFS stack and can handle the same set of SPJA (selection, projection, join and aggregate) queries as supported by these systems. BlinkDB provides real-time answers along with statistical...
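The core idea behind sampling-based approximate query processing can be illustrated with a minimal sketch (illustrative only, not BlinkDB code): an aggregate query is answered from a uniform sample of the data and reported together with a confidence interval.

```python
# Sampling-based approximate aggregation: estimate the mean of a large
# "table" from a small uniform sample and attach a 95% confidence interval.
import random
import statistics

random.seed(42)
data = [random.gauss(100.0, 15.0) for _ in range(200_000)]  # full table

sample = random.sample(data, 5_000)                         # 2.5% sample
est_mean = statistics.fmean(sample)
stderr = statistics.stdev(sample) / len(sample) ** 0.5
print(f"mean ~ {est_mean:.2f} +/- {1.96 * stderr:.2f} (95% CI)")
```

Scanning 2.5% of the rows gives an answer within a fraction of a unit of the true mean; this accuracy/latency trade-off is what lets such systems return real-time answers on large data.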

  11. Application of k0-based internal monostandard NAA for large sample analysis of clay pottery. As a part of inter comparison exercise

    International Nuclear Information System (INIS)

    Acharya, R.; Dasari, K.B.; Pujari, P.K.; Swain, K.K.; Shinde, A.D.; Reddy, A.V.R.

    2014-01-01

    As a part of an inter-comparison exercise of an IAEA Coordinated Research Project on large sample neutron activation analysis, a large, non-standard-geometry pottery replica (obtained from Peru) was analyzed by k0-based internal monostandard neutron activation analysis (IM-NAA). Two large sub-samples (0.40 and 0.25 kg) were irradiated at the graphite reflector position of the AHWR Critical Facility at BARC, Trombay, Mumbai, India. Small samples (100-200 mg) were also analyzed by IM-NAA for comparison. The radioactive assay was carried out using a 40% relative efficiency HPGe detector. To examine the homogeneity of the sample, counting was also carried out using an X-Z rotary scanning unit. The in situ relative detection efficiency was evaluated using gamma rays of the activation products in the irradiated sample in the energy range of 122-2,754 keV. Elemental concentration ratios with respect to Na of small (100 mg) as well as large (15 and 400 g) samples were used to check the homogeneity of the samples. Concentration ratios of 18 elements, namely K, Sc, Cr, Mn, Fe, Co, Zn, As, Rb, Cs, La, Ce, Sm, Eu, Yb, Lu, Hf and Th, with respect to Na (the internal monostandard) were calculated using IM-NAA. Absolute concentrations were arrived at for both large and small samples using the Na concentration obtained from the relative method of NAA. The percentage combined uncertainties at the ±1s confidence limit on the determined values were in the range of 3-9%. Two IAEA reference materials, SL-1 and SL-3, were analyzed by IM-NAA to evaluate the accuracy of the method. (author)
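The final step described above, converting element/Na concentration ratios into absolute concentrations once Na is known from a relative NAA measurement, is a simple scaling. A minimal sketch with hypothetical numbers (the paper's actual ratios and Na value are not reproduced here):

```python
# Internal-monostandard scaling: the absolute concentration of element X
# is its IM-NAA ratio to Na multiplied by the independently measured
# Na concentration. Values below are hypothetical.

def absolute_conc(ratio_to_na: float, na_conc_ppm: float) -> float:
    """Absolute concentration (ppm) from the ratio to the Na monostandard."""
    return ratio_to_na * na_conc_ppm

# Example: an Fe/Na concentration ratio of 3.2 with Na measured at 8500 ppm
print(absolute_conc(3.2, 8500.0))  # 27200.0
```

Because every element is referenced to the same internal Na signal, geometry- and flux-dependent factors largely cancel in the ratios, which is what makes the approach suitable for large, non-standard samples.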

  12. Uranium in US surface, ground, and domestic waters. Volume 2

    International Nuclear Information System (INIS)

    Drury, J.S.; Reynolds, S.; Owen, P.T.; Ross, R.H.; Ensminger, J.T.

    1981-04-01

    The report Uranium in US Surface, Ground, and Domestic Waters comprises four volumes. Volumes 2, 3, and 4 contain data characterizing the location, sampling date, type, use, and uranium concentrations of 89,994 individual samples presented in tabular form. The tabular data in volumes 2, 3, and 4 are summarized in volume 1 in narrative form and with maps and histograms

  13. Large parallel volumes of finite and compact sets in d-dimensional Euclidean space

    DEFF Research Database (Denmark)

    Kampf, Jürgen; Kiderlen, Markus

    The r-parallel volume V(C_r) of a compact subset C in d-dimensional Euclidean space is the volume of the set C_r of all points of Euclidean distance at most r > 0 from C. According to Steiner's formula, V(C_r) is a polynomial in r when C is convex. For finite sets C satisfying a certain geometric
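As a concrete instance of the polynomial behaviour mentioned above, the planar case of Steiner's formula reads:

```latex
% Steiner's formula for a convex body C \subset \mathbb{R}^2 with
% area A(C) and perimeter L(C): the r-parallel volume is a polynomial in r.
V(C_r) = A(C) + L(C)\,r + \pi r^{2}, \qquad r > 0.
```

In dimension d the same statement holds with a degree-d polynomial whose coefficients are, up to constants, the intrinsic volumes of C; for non-convex C the polynomial property can fail, which is the setting the abstract investigates.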

  14. Surface sampling concentration and reaction probe

    Science.gov (United States)

    Van Berkel, Gary J; Elnaggar, Mariam S

    2013-07-16

    A method of analyzing a chemical composition of a specimen is described. The method can include providing a probe comprising an outer capillary tube and an inner capillary tube disposed co-axially within the outer capillary tube, where the inner and outer capillary tubes define a solvent capillary and a sampling capillary in fluid communication with one another at a distal end of the probe; contacting a target site on a surface of a specimen with a solvent in fluid communication with the probe; maintaining a plug volume proximate a solvent-specimen interface, wherein the plug volume is in fluid communication with the probe; draining plug sampling fluid from the plug volume through the sampling capillary; and analyzing a chemical composition of the plug sampling fluid with an analytical instrument. A system for performing the method is also described.

  15. Neurocognitive impairment in a large sample of homeless adults with mental illness.

    Science.gov (United States)

    Stergiopoulos, V; Cusi, A; Bekele, T; Skosireva, A; Latimer, E; Schütz, C; Fernando, I; Rourke, S B

    2015-04-01

    This study examines neurocognitive functioning in a large, well-characterized sample of homeless adults with mental illness and assesses demographic and clinical factors associated with neurocognitive performance. A total of 1500 homeless adults with mental illness enrolled in the At Home Chez Soi study completed neuropsychological measures assessing speed of information processing, memory, and executive functioning. Sociodemographic and clinical data were also collected. Linear regression analyses were conducted to examine factors associated with neurocognitive performance. Approximately half of our sample met criteria for psychosis, major depressive disorder, and alcohol or substance use disorder, and nearly half had experienced severe traumatic brain injury. Overall, 72% of participants demonstrated cognitive impairment, including deficits in processing speed (48%), verbal learning (71%) and recall (67%), and executive functioning (38%). The overall statistical model explained 19.8% of the variance in the neurocognitive summary score, with reduced neurocognitive performance associated with older age, lower education, first language other than English or French, Black or Other ethnicity, and the presence of psychosis. Homeless adults with mental illness experience impairment in multiple neuropsychological domains. Much of the variance in our sample's cognitive performance remains unexplained, highlighting the need for further research in the mechanisms underlying cognitive impairment in this population. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Detecting Boosted Dark Matter from the Sun with Large Volume Neutrino Detectors

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Joshua; /SLAC; Cui, Yanou; /Perimeter Inst. Theor. Phys.; Zhao, Yue; /Stanford U., ITP /Stanford U., Phys. Dept.

    2015-04-02

    We study novel scenarios where thermal dark matter (DM) can be efficiently captured in the Sun and annihilate into boosted dark matter. In models with semi-annihilating DM, where DM has a non-minimal stabilization symmetry, or in models with a multi-component DM sector, annihilations of DM can give rise to stable dark sector particles with moderate Lorentz boosts. We investigate both of these possibilities, presenting concrete models as proofs of concept. Both scenarios can yield viable thermal relic DM with masses O(1)-O(100) GeV. Taking advantage of the energetic proton recoils that arise when the boosted DM scatters off matter, we propose a detection strategy which uses large volume terrestrial detectors, such as those designed to detect neutrinos or proton decays. In particular, we propose a search for proton tracks pointing towards the Sun. We focus on signals at Cherenkov-radiation-based detectors such as Super-Kamiokande (SK) and its upgrade Hyper-Kamiokande (HK). We find that with spin-dependent scattering as the dominant DM-nucleus interaction at low energies, boosted DM can leave detectable signals at SK or HK, with sensitivity comparable to DM direct detection experiments while being consistent with current constraints. Our study provides a new search path for DM sectors with non-minimal structure.

  17. Crystal structure and thermoelectric properties of clathrate, Ba{sub 8}Ni{sub 3.5}Si{sub 42.0}: Small cage volume and large disorder of the guest atom

    Energy Technology Data Exchange (ETDEWEB)

    Roudebush, John H., E-mail: jhr@princeton.edu [Department of Chemistry, University of California, One Shields Ave., Davis, CA 95616 (United States); Orellana, Mike [Department of Chemistry, University of California, One Shields Ave., Davis, CA 95616 (United States); Bux, Sabah [Thermal Energy Conversion Technologies Group, Jet Propulsion Laboratory/California Institute of Technology, Pasadena, CA 91109 (United States); Yi Tanghong; Kauzlarich, Susan M. [Department of Chemistry, University of California, One Shields Ave., Davis, CA 95616 (United States)

    2012-08-15

    Samples with the type-I clathrate composition Ba{sub 8}Ni{sub x}Si{sub 46-x} have been synthesized and their structure and thermoelectric properties characterized. Microprobe analysis indicates the Ni incorporation to be 2.62{<=}x{<=}3.53. The x=3.5 phase crystallizes in the type-I clathrate structure (space group: Pm-3n) with a lattice parameter of 10.2813(3) A. The refined composition was Ba{sub 8}Ni{sub 3.5}Si{sub 42.0}, with small vacancies, 0.4 and 0.5 atoms per formula unit, at the 2a and 6c sites, respectively. The position of the Ba2 atom in the large cage was modeled using a 4-fold split position (24j site), displaced 0.18 A from the cage center (6d site). The volume of the large cage is calculated to be 146 A{sup 3}, smaller than other clathrates with similar cation displacement. The sample shows n-type behavior with a maximum of -50 {mu}V/K at 823 K, above which the Seebeck coefficient decreases, suggesting mixed carriers. Lattice thermal conductivity, {kappa}{sub l}, is 55 mW/K above 600 K. - Graphical abstract: Seebeck coefficient and resistivity of the type-I clathrate Ba{sub 8}Ni{sub 3.5}Si{sub 41.0}. The structure shows a large displacement of the Ba cation in the large cage (6c site). Highlights: Black-Right-Pointing-Pointer Crystal structure of Ba{sub 8}Ni{sub 3.5}Si{sub 41.0} reported. Black-Right-Pointing-Pointer Vacancies at the 2a and 6c sites. Black-Right-Pointing-Pointer Large disorder of the Ba guest atom, 0.18 A from the cage center. Black-Right-Pointing-Pointer Structure is compared to Ba{sub 8}Si{sub 46} and other type-I clathrates. Black-Right-Pointing-Pointer Max Seebeck of -50.7 {mu}V/K at 798.4 K, thermal conductivity {approx}55 mW/K.

  18. Multivariate volume visualization through dynamic projections

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Shusen [Univ. of Utah, Salt Lake City, UT (United States); Wang, Bei [Univ. of Utah, Salt Lake City, UT (United States); Thiagarajan, Jayaraman J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bremer, Peer -Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pascucci, Valerio [Univ. of Utah, Salt Lake City, UT (United States)

    2014-11-01

    We propose a multivariate volume visualization framework that tightly couples dynamic projections with a high-dimensional transfer function design for interactive volume visualization. We assume that the complex, high-dimensional data in the attribute space can be well-represented through a collection of low-dimensional linear subspaces, and embed the data points in a variety of 2D views created as projections onto these subspaces. Through dynamic projections, we present animated transitions between different views to help the user navigate and explore the attribute space for effective transfer function design. Our framework not only provides a more intuitive understanding of the attribute space but also allows the design of the transfer function under multiple dynamic views, which is more flexible than being restricted to a single static view of the data. For large volumetric datasets, we maintain interactivity during the transfer function design via intelligent sampling and scalable clustering. As a result, using examples in combustion and climate simulations, we demonstrate how our framework can be used to visualize interesting structures in the volumetric space.
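The basic operation behind such 2D views, projecting each high-dimensional attribute vector onto a low-dimensional linear subspace, can be sketched as follows (a toy illustration with hypothetical data, not the authors' implementation):

```python
# Projection of a high-dimensional attribute vector onto a 2D view:
# the dot products with two (assumed orthonormal) basis vectors give
# the point's coordinates in that view. Data are hypothetical.

def project_2d(point, u, v):
    """Project a high-dimensional point onto the plane spanned by u and v."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(point, u), dot(point, v))

# Example: a 4-attribute voxel projected onto two axis-aligned directions
p = (0.2, 0.8, 0.5, 0.1)
u = (1.0, 0.0, 0.0, 0.0)
v = (0.0, 1.0, 0.0, 0.0)
print(project_2d(p, u, v))  # (0.2, 0.8)
```

Animating between different (u, v) pairs is what produces the dynamic transitions between views that the framework uses for transfer function design.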

  19. Floating substructure flexibility of large-volume 10MW offshore wind turbine platforms in dynamic calculations

    International Nuclear Information System (INIS)

    Borg, Michael; Hansen, Anders Melchior; Bredmose, Henrik

    2016-01-01

    Designing floating substructures for the next generation of 10MW and larger wind turbines has introduced new challenges in capturing relevant physical effects in dynamic simulation tools. In achieving technically and economically optimal floating substructures, structural flexibility may increase to the extent that it becomes relevant to include in addition to the standard rigid body substructure modes which are typically described through linear radiation-diffraction theory. This paper describes a method for the inclusion of substructural flexibility in aero-hydro-servo-elastic dynamic simulations for large-volume substructures, including wave-structure interactions, to form the basis of deriving sectional loads and stresses within the substructure. The method is applied to a case study to illustrate the implementation and relevance. It is found that the flexible mode is significantly excited in an extreme event, indicating an increase in predicted substructure internal loads. (paper)

  20. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    Science.gov (United States)

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for the standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for the generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
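The point-counting step mentioned above is usually paired with the Cavalieri principle: organ volume is estimated from the number of regular-grid points hitting the organ on equidistant sections, multiplied by the area each point represents and the section spacing. A minimal sketch with hypothetical counts:

```python
# Cavalieri volume estimate by point counting: estimated volume equals
# (total grid points hitting the organ) x (area per point) x (section
# spacing). All numbers below are hypothetical.

def cavalieri_volume(points_hit, area_per_point_mm2: float,
                     section_spacing_mm: float) -> float:
    """Estimated volume (mm^3) from per-section grid-point counts."""
    return sum(points_hit) * area_per_point_mm2 * section_spacing_mm

# Example: grid-point counts on 5 equidistant sections,
# 4 mm^2 per grid point, sections 2 mm apart
hits = [12, 30, 41, 28, 9]
print(cavalieri_volume(hits, 4.0, 2.0))  # 960.0
```

The estimator is unbiased regardless of organ shape, which is why it suits the systematic random sampling schemes described in the guidelines.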

  1. Large sample NAA of a pottery replica utilizing thermal neutron flux at AHWR critical facility and X-Z rotary scanning unit

    International Nuclear Information System (INIS)

    Acharya, R.; Dasari, K.B.; Pujari, P.K.; Swain, K.K.; Shinde, A.D.; Reddy, A.V.R.

    2013-01-01

    Large sample neutron activation analysis (LSNAA) of a clay pottery replica from Peru was carried out using the low-neutron-flux graphite reflector position of the Advanced Heavy Water Reactor (AHWR) critical facility. This work was taken up as part of an inter-comparison exercise under the IAEA CRP on LSNAA of archaeological objects. The irradiated large-size sample, placed on an X-Z rotary scanning unit, was assayed using a 40% relative efficiency HPGe detector. The k0-based internal monostandard NAA (IM-NAA) in conjunction with the in situ relative detection efficiency was used to calculate concentration ratios of 12 elements with respect to Na. Analyses of both small and large size samples were carried out to check homogeneity and to arrive at absolute concentrations. (author)
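
    The core of the internal monostandard idea can be sketched as a ratio calculation; the peak areas, k0 factors and efficiencies below are hypothetical placeholders, and a real IM-NAA computation also requires saturation, decay and counting-time corrections that are omitted here.

```python
# Simplified sketch of an IM-NAA concentration ratio: element x relative to
# the in-sample monostandard (Na), from gamma-peak areas, k0 factors and the
# in-situ relative detection efficiency. All values are hypothetical.

def concentration_ratio(area_x, area_na, k0_x, k0_na, eff_x, eff_na):
    """c_x / c_Na ~ (A_x / A_Na) * (k0_Na / k0_x) * (eff_Na / eff_x)."""
    return (area_x / area_na) * (k0_na / k0_x) * (eff_na / eff_x)

ratio = concentration_ratio(area_x=1.5e4, area_na=3.0e5,
                            k0_x=0.46, k0_na=4.68e-2,
                            eff_x=0.012, eff_na=0.020)
print(ratio)
```

    With an independently determined Na concentration, such ratios convert directly into absolute concentrations, which is how the record describes arriving at absolute values from the small-sample analyses.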

  2. Development of a methodology for the detection of Ra226 in large volumes of water by gamma spectrometry; modification and validation of the method for detection and quantification of Ra226 in small volumes of water by alpha spectrometry, used by the Centro de Investigacion en Ciencias Atomicas, Nucleares y Moleculares (CICANUM, UCR)

    International Nuclear Information System (INIS)

    Molina Porras, Arnold

    2011-01-01

    A test method for quantifying the specific activity of Ra-226 in water by alpha spectrometry has been validated. CICANUM has used this method as part of the proposed harmonization of methods under ARCAL (IAEA). The method is based on a first separation and preconcentration of Ra-226 by coprecipitation with MnO2 and subsequent microprecipitation as Ba(Ra)SO4. Samples were prepared and then counted by alpha spectrometry. In parallel, a radium sampling methodology for large volumes of water was tested, using acrylic fibers impregnated with manganese(IV) oxide to determine the amount of Ra-226 present by gamma spectrometry. Small-scale tests determined that the best way to prepare the fiber is the reference method found in the literature, using an oven at 60 degrees Celsius. (author)

  3. Bigger is better! Hippocampal volume and declarative memory performance in healthy young men.

    Science.gov (United States)

    Pohlack, Sebastian T; Meyer, Patric; Cacciaglia, Raffaele; Liebscher, Claudia; Ridder, Stephanie; Flor, Herta

    2014-01-01

    The importance of the hippocampus for declarative memory processes is firmly established. Nevertheless, the issue of a correlation between declarative memory performance and hippocampal volume in healthy subjects remains controversial. The aim of the present study was to investigate this relationship in more detail. For this purpose, 50 healthy young male participants performed the California Verbal Learning Test. Hippocampal volume was assessed by manual segmentation of high-resolution 3D magnetic resonance images. We found a significant positive correlation between putatively hippocampus-dependent memory measures, such as short-delay retention, long-delay retention and discriminability, and percent hippocampal volume. No significant correlation was found with measures related to executive processes. In addition, percent amygdala volume was not related to any of these measures. Our data extend previous findings from studies of brain-damaged individuals to a large and homogeneous sample of healthy young subjects and are important for theories on the neural basis of episodic memory.

  4. Distribution of dead wood volume and mass in mediterranean Fagus sylvatica L. forests in Northern Iberian Peninsula. Implications for field sampling inventory

    Energy Technology Data Exchange (ETDEWEB)

    Herrero, C.; Monleon, V.J.; Gómez, N.; Bravo, F.

    2016-07-01

    Aim of the study: The aim of this study was to 1) estimate the amount of dead wood in managed beech (Fagus sylvatica L.) stands in the northern Iberian Peninsula and 2) evaluate the most appropriate volume equation and the optimal transect length for sampling downed wood. Area of study: The study area is the Aralar Forest in Navarra (Northern Iberian Peninsula). Material and methods: The amount of dead wood by component (downed logs, snags, stumps and fine woody debris) was inventoried in 51 plots across a chronosequence of stand ages (0-120 years old). Main results: The average volume and biomass of dead wood were 24.43 m3 ha-1 and 7.65 Mg ha-1, respectively. This amount changed with stand development stage [17.14 m3 ha-1 in seedling stage; 34.09 m3 ha-1 in pole stage; 22.54 m3 ha-1 in mature stage and 24.27 m3 ha-1 in regular stands in the regeneration stage], although the differences were not statistically significant for coarse woody debris. However, forest management influenced the amount of dead wood, because the proportion of mass in the different components and the decay stage depended on the time since the last thinning. The formula based on the intersection diameter resulted in the smallest coefficient of variation out of seven log-volume formulae. Thus, the intersection diameter is the preferred method because it gives unbiased estimates, has the greatest precision and is the easiest to implement in the field. Research highlights: The amount of dead wood, and in particular of snags, was significantly lower than that in reserved forests. The results of this study showed that sampling effort should be directed towards increasing the number of transects, instead of increasing transect length or collecting additional piece diameters that do not increase the accuracy or precision of DWM volume estimation. (Author)
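
    The intersection-diameter approach is in the spirit of Van Wagner's line-intersect estimator, V = pi^2 * sum(d_i^2) / (8 * L), which gives volume per unit area from the diameters d_i of pieces crossed by a transect of length L. A sketch under that assumption, with made-up diameters and transect length:

```python
import math

# Van Wagner line-intersect estimator for downed-wood volume.
# Diameters are measured where each piece crosses the transect line;
# the numbers below are illustrative, not the study's data.

def lis_volume_per_ha(diameters_cm, transect_length_m):
    """Downed-wood volume (m^3/ha) from intersection diameters (cm)."""
    sum_d2 = sum((d / 100.0) ** 2 for d in diameters_cm)   # cm -> m
    vol_per_m2 = math.pi ** 2 * sum_d2 / (8.0 * transect_length_m)
    return vol_per_m2 * 10_000.0                           # per m^2 -> per ha

vol = lis_volume_per_ha([10.0, 15.0, 8.0], transect_length_m=100.0)
print(round(vol, 3))  # ~4.799 m^3/ha
```

    The estimator's variance scales with the inverse of total transect length, which is why the study's recommendation to add transects (rather than lengthen one) improves precision.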

  5. Development of rigor mortis is not affected by muscle volume.

    Science.gov (United States)

    Kobayashi, M; Ikegaya, H; Takase, I; Hatanaka, K; Sakurada, K; Iwase, H

    2001-04-01

    There is a hypothesis suggesting that rigor mortis progresses more rapidly in small muscles than in large muscles. We measured rigor mortis as tension determined isometrically in rat musculus erector spinae that had been cut into muscle bundles of various volumes. The muscle volume did not influence either the progress or the resolution of rigor mortis, which contradicts the hypothesis. Differences in pre-rigor load on the muscles influenced the onset and resolution of rigor mortis in a few pairs of samples, but did not influence the time taken for rigor mortis to reach its full extent after death. Moreover, the progress of rigor mortis in this muscle was biphasic; this may reflect the early rigor of red muscle fibres and the late rigor of white muscle fibres.

  6. Evaluation of single and two-stage adaptive sampling designs for estimation of density and abundance of freshwater mussels in a large river

    Science.gov (United States)

    Smith, D.R.; Rogala, J.T.; Gray, B.R.; Zigler, S.J.; Newton, T.J.

    2011-01-01

    Reliable estimates of abundance are needed to assess consequences of proposed habitat restoration and enhancement projects on freshwater mussels in the Upper Mississippi River (UMR). Although there is general guidance on sampling techniques for population assessment of freshwater mussels, the actual performance of sampling designs can depend critically on the population density and spatial distribution at the project site. To evaluate various sampling designs, we simulated sampling of populations, which varied in density and degree of spatial clustering. Because of logistics and costs of large river sampling and spatial clustering of freshwater mussels, we focused on adaptive and non-adaptive versions of single and two-stage sampling. The candidate designs performed similarly in terms of precision (CV) and probability of species detection for fixed sample size. Both CV and species detection were determined largely by density, spatial distribution and sample size. However, designs did differ in the rate that occupied quadrats were encountered. Occupied units had a higher probability of selection using adaptive designs than conventional designs. We used two measures of cost: sample size (i.e. number of quadrats) and distance travelled between the quadrats. Adaptive and two-stage designs tended to reduce distance between sampling units, and thus performed better when distance travelled was considered. Based on the comparisons, we provide general recommendations on the sampling designs for the freshwater mussels in the UMR, and presumably other large rivers.
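
    The kind of simulation the authors describe (sampling a spatially clustered population and checking estimator behaviour) can be sketched in a toy form. The clustering model and all numbers below are invented for illustration, not the UMR data or the study's designs.

```python
import random

# Toy design-evaluation simulation: build a spatially clustered "mussel"
# population on a 1-D grid of quadrats, then check that simple random
# sampling of quadrats recovers the true mean density on average.

random.seed(42)

def make_population(n_cells=1000, n_patches=5, patch_size=20, max_count=8):
    """Mostly empty quadrats with a few dense patches of mussels."""
    counts = [0] * n_cells
    for _ in range(n_patches):
        start = random.randrange(n_cells - patch_size)
        for i in range(start, start + patch_size):
            counts[i] += random.randint(0, max_count)
    return counts

def srs_estimate(counts, n_sample):
    """Mean density from a simple random sample of quadrats."""
    return sum(random.sample(counts, n_sample)) / n_sample

pop = make_population()
true_density = sum(pop) / len(pop)
estimates = [srs_estimate(pop, 50) for _ in range(500)]
mean_est = sum(estimates) / len(estimates)
print(true_density, mean_est)  # unbiased design: the two should be close
```

    Extending the sketch to adaptive or two-stage designs would add a rule for sampling neighbours of occupied quadrats, which is where the encounter-rate and travel-distance differences reported in the study come from.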

  7. Plasma properties in a large-volume, cylindrical and asymmetric radio-frequency capacitively coupled industrial-prototype reactor

    International Nuclear Information System (INIS)

    Lazović, Saša; Puač, Nevena; Spasić, Kosta; Malović, Gordana; Petrović, Zoran Lj; Cvelbar, Uroš; Mozetič, Miran; Radetić, Maja

    2013-01-01

    We have developed a large-volume, low-pressure cylindrical plasma reactor with a size that matches industrial reactors for the treatment of textiles. It was shown that it efficiently produces plasmas with only a small increase in power as compared with a similar reactor of 50 times smaller volume. Plasma generated at 13.56 MHz was stable against transition to streamers and capable of long-term continuous operation. An industrial-scale asymmetric cylindrical reactor of simple design and construction enabled good control over a wide range of active plasma species and ion concentrations. Detailed characterization of the discharge was performed using derivative, Langmuir and catalytic probes, which enabled determination of the optimal sets of plasma parameters necessary for successful industrial implementation and process control. Since neutral atomic oxygen plays a major role in many material processing applications, its spatial profile was measured using a nickel catalytic probe over a wide range of plasma parameters. The spatial profiles show diffusion profiles with particle production close to the powered electrode and significant wall losses due to surface recombination. Oxygen atom densities range from 10¹⁹ m⁻³ near the powered electrode to 10¹⁷ m⁻³ near the wall, while the ion concentrations change from 10¹⁶ m⁻³ to 10¹⁵ m⁻³ at the grounded chamber wall. (paper)

  8. Simulation of the radiography formation process from CT patient volume

    International Nuclear Information System (INIS)

    Bifulco, P.; Cesarelli, M.; Verso, E.; Roccasalva Firenze, M.; Sansone, M.; Bracale, M.

    1998-01-01

    The aim of this work is to develop an algorithm that simulates the radiographic image formation process using volumetric anatomical data of the patient, obtained from 3D diagnostic CT images. Many applications, including radiographically driven surgery, virtual reality in medicine, and radiologist teaching and training, may take advantage of such a technique. The algorithm has been designed to simulate generic radiographic equipment, arbitrarily oriented with respect to the patient. The simulated radiography is obtained by considering a discrete number of X-ray paths departing from the focus, passing through the patient volume and reaching the radiographic plane. To evaluate a generic pixel of the simulated radiography, the cumulative absorption along the corresponding X-ray is computed. To estimate the X-ray absorption at a generic point of the patient volume, 3D interpolation of the CT data has been adopted. The proposed technique is quite similar to those employed in ray tracing. A computer-designed test volume has been used to assess the reliability of the radiography simulation algorithm as a measuring tool. The error analysis shows that the accuracy achieved by the radiographic simulation algorithm is largely confined within the sampling step of the CT volume. (authors)
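
    The ray-summation idea can be illustrated with a toy digitally-reconstructed-radiograph sketch. For brevity it uses nearest-neighbour sampling rather than the trilinear interpolation the authors adopted, and the attenuation volume and geometry are entirely synthetic.

```python
# Toy radiograph simulation: for each detector pixel, sample attenuation
# along the ray from the X-ray focus to the pixel and accumulate it.
# The volume is a synthetic dense sphere in air; all geometry is made up.

def make_volume(n=16):
    """Synthetic attenuation volume: a denser sphere inside air."""
    vol = [[[0.0] * n for _ in range(n)] for _ in range(n)]
    c = (n - 1) / 2.0
    for z in range(n):
        for y in range(n):
            for x in range(n):
                if (x - c) ** 2 + (y - c) ** 2 + (z - c) ** 2 <= (n / 4) ** 2:
                    vol[z][y][x] = 1.0
    return vol

def cast_ray(vol, focus, pixel, n_steps=200):
    """Mean attenuation sampled along the segment focus -> pixel."""
    n = len(vol)
    total = 0.0
    for i in range(n_steps):
        t = i / (n_steps - 1)
        x = focus[0] + t * (pixel[0] - focus[0])
        y = focus[1] + t * (pixel[1] - focus[1])
        z = focus[2] + t * (pixel[2] - focus[2])
        xi, yi, zi = int(round(x)), int(round(y)), int(round(z))
        if 0 <= xi < n and 0 <= yi < n and 0 <= zi < n:
            total += vol[zi][yi][xi]  # nearest-neighbour lookup
    return total / n_steps

vol = make_volume()
focus = (8.0, 8.0, -40.0)                         # source in front of volume
central = cast_ray(vol, focus, (8.0, 8.0, 40.0))  # passes through the sphere
edge = cast_ray(vol, focus, (-20.0, 8.0, 40.0))   # misses the volume entirely
print(central, edge)
```

    Repeating `cast_ray` over a grid of detector pixels yields the simulated radiograph; replacing the rounding with trilinear interpolation recovers the sub-voxel accuracy the paper reports.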

  9. Rapid surface sampling and archival record system

    Energy Technology Data Exchange (ETDEWEB)

    Barren, E.; Penney, C.M.; Sheldon, R.B. [GE Corporate Research and Development Center, Schenectady, NY (United States)] [and others]

    1995-10-01

    A number of contamination sites exist in this country where the area and volume of material to be remediated are very large, approaching or exceeding 10⁶ m² and 10⁶ m³. Typically, only a small fraction of this material is actually contaminated. In such cases there is a strong economic motivation to test the material with a sufficient density of measurements to identify which portions are uncontaminated, so that they can either be left in place or be disposed of as uncontaminated waste. Unfortunately, since contamination often varies rapidly from position to position, this procedure can involve upwards of one million measurements per site. The situation is complicated further in many cases by the difficulties of sampling porous surfaces, such as concrete. This report describes a method for sampling concrete in which an immediate distinction can be made between contaminated and uncontaminated surfaces. Sample acquisition and analysis will be automated.

  10. Comparison of Statistically Modeled Contaminated Soil Volume Estimates and Actual Excavation Volumes at the Maywood FUSRAP Site - 13555

    Energy Technology Data Exchange (ETDEWEB)

    Moore, James [U.S. Army Corps of Engineers - New York District 26 Federal Plaza, New York, New York 10278 (United States); Hays, David [U.S. Army Corps of Engineers - Kansas City District 601 E. 12th Street, Kansas City, Missouri 64106 (United States); Quinn, John; Johnson, Robert; Durham, Lisa [Argonne National Laboratory, Environmental Science Division 9700 S. Cass Ave., Argonne, Illinois 60439 (United States)

    2013-07-01

    As part of the ongoing remediation process at the Maywood Formerly Utilized Sites Remedial Action Program (FUSRAP) properties, Argonne National Laboratory (Argonne) assisted the U.S. Army Corps of Engineers (USACE) New York District by providing contaminated soil volume estimates for the main site area, much of which is fully or partially remediated. As part of the volume estimation process, an initial conceptual site model (ICSM) was prepared for the entire site that captured existing information (with the exception of soil sampling results) pertinent to the possible location of surface and subsurface contamination above cleanup requirements. This ICSM was based on historical anecdotal information, aerial photographs, and the logs from several hundred soil cores that identified the depth of fill material and the depth to bedrock under the site. Specialized geostatistical software developed by Argonne was used to update the ICSM with historical sampling results and down-hole gamma survey information for hundreds of soil core locations. The updating process yielded both a best guess estimate of contamination volumes and a conservative upper bound on the volume estimate that reflected the estimate's uncertainty. Comparison of model results to actual removed soil volumes was conducted on a parcel-by-parcel basis. Where sampling data density was adequate, the actual volume matched the model's average or best guess results. Where contamination was un-characterized and unknown to the model, the actual volume exceeded the model's conservative estimate. Factors affecting volume estimation were identified to assist in planning further excavations. (authors)

  11. Site Environmental Report for 1998 Volume II

    International Nuclear Information System (INIS)

    Ruggieri, Michael

    1999-01-01

    Volume II of the Site Environmental Report for 1998 is provided by Ernest Orlando Lawrence Berkeley National Laboratory as a supplemental appendix to the report printed in Volume I. Volume II contains the environmental monitoring and sampling data used to generate summary results in the main report for routine and non-routine activities at the Laboratory (except for groundwater sampling data, which may be found in the reports referred to in Chapter 6). Data presented in the tables are given in International System of Units (SI) units of measure.

  12. "Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation"

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2011-09-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to utilize these resources appropriately. Indeed, users of one popular dataset were generally found not to have modeled their analyses to take account of the complex sample (Johnson & Elliott, 1998), even when publishing in highly regarded journals. It is well known that failure to appropriately model the complex sample can substantially bias the results of the analysis. Examples presented in this paper highlight the risk of errors of inference and mis-estimation of parameters arising from failure to analyze these data sets appropriately.
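
    One concrete consequence of ignoring the design is an understated standard error. Kish's weight-based approximation of the design effect gives a quick sense of the size of the correction; the weights below are hypothetical.

```python
# Kish's approximate design effect from unequal survey weights:
# DEFF = n * sum(w^2) / (sum(w))^2, effective n = n / DEFF, and
# standard errors grow by sqrt(DEFF). Equal weights give DEFF = 1.
# The weights below are hypothetical illustration values.

def kish_deff(weights):
    """Design effect due to weight variability (1.0 for equal weights)."""
    n = len(weights)
    return n * sum(w * w for w in weights) / sum(weights) ** 2

def effective_n(weights):
    """Effective sample size after accounting for unequal weighting."""
    return len(weights) / kish_deff(weights)

weights = [1.0, 1.0, 2.0, 4.0]
print(kish_deff(weights), effective_n(weights))  # 1.375, ~2.91
```

    This captures only the weighting component; clustering and stratification effects require design-aware variance estimators (e.g. Taylor linearization or replicate weights), which is the paper's broader point.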

  13. Large scale sample management and data analysis via MIRACLE

    DEFF Research Database (Denmark)

    Block, Ines; List, Markus; Pedersen, Marlene Lemvig

    Reverse-phase protein arrays (RPPAs) allow sensitive quantification of relative protein abundance in thousands of samples in parallel. In recent years the technology has advanced based on improved methods and protocols concerning sample preparation and printing, antibody selection, optimization of staining conditions and mode of signal analysis. However, sample management and data analysis still pose challenges because of the high number of samples, sample dilutions, customized array patterns, and the various programs necessary for array construction and data processing. We developed a comprehensive and user-friendly web application called MIRACLE (MIcroarray R-based Analysis of Complex Lysate Experiments), which bridges the gap between sample management and array analysis by conveniently keeping track of the sample information from lysate preparation, through array construction, to signal analysis.

  14. Examining gray matter structure associated with academic performance in a large sample of Chinese high school students

    OpenAIRE

    Song Wang; Ming Zhou; Taolin Chen; Xun Yang; Guangxiang Chen; Meiyun Wang; Qiyong Gong

    2017-01-01

    Achievement in school is crucial for students to be able to pursue successful careers and lead happy lives in the future. Although many psychological attributes have been found to be associated with academic performance, the neural substrates of academic performance remain largely unknown. Here, we investigated the relationship between brain structure and academic performance in a large sample of high school students via structural magnetic resonance imaging (S-MRI) using voxel-based morphometry.

  15. On Assumptions in Development of a Mathematical Model of Thermo-gravitational Convection in the Large Volume Process Tanks Taking into Account Fermentation

    Directory of Open Access Journals (Sweden)

    P. M. Shkapov

    2015-01-01

    The paper provides a mathematical model of thermo-gravitational convection in a large-volume vertical cylinder. Heat is removed from the product via the cooling jacket at the top of the cylinder. We suppose that a laminar fluid motion takes place. The model is based on the Navier-Stokes equations, the equation of heat transfer through the wall, and the heat transfer equation. A peculiarity of the process in large-volume tanks is the spatial distribution of the physical parameters, which was taken into account when constructing the model. The model corresponds to the process of beer wort fermentation in cylindrical-conical tanks (CCT). The CCT volume is divided into three zones, and model equations were obtained for each zone. The first zone has an annular cross-section and is limited in height by the cooling jacket. In this zone the heat flow from the cooling jacket to the product is uppermost. The model equation of the first zone describes the process of heat transfer through the wall and is a linear inhomogeneous partial differential equation that is solved analytically. For the description of the second and third zones a number of engineering assumptions were made. The fluid is considered Newtonian, viscous and incompressible. Convective motion is considered in the Boussinesq approximation. The effect of viscous dissipation is not considered. The topology of the fluid motion is similar to cylindrical Poiseuille flow. The second-zone model consists of the Navier-Stokes equations in cylindrical coordinates in a simplified form, together with the heat equation in the liquid layer. The volume occupied by the upward convective flow constitutes the third zone. The convective flows do not mix and do not exchange heat. At the start of the process the medium has the same temperature and zero initial velocity in the whole volume, which allows us to specify the initial conditions for the process. The paper shows the
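
    The first-zone equation is solved analytically in the paper. As a generic illustration of the underlying conduction problem only (not the authors' model), an explicit finite-difference treatment of 1-D transient heat conduction through a wall layer might look like this; the geometry and material parameters are hypothetical.

```python
# Explicit finite-difference sketch of 1-D transient conduction through a
# wall, with the coolant side held at 2 C and the product side at 20 C.
# All parameters below are hypothetical illustration values.

def heat_step(temps, alpha, dx, dt):
    """One explicit Euler step of dT/dt = alpha * d2T/dx2, ends held fixed."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + r * (temps[i + 1] - 2.0 * temps[i] + temps[i - 1])
    return new

temps = [2.0] + [20.0] * 9          # coolant-side node, then wall nodes
for _ in range(500):                # 500 steps of 10 s each
    temps = heat_step(temps, alpha=1.4e-7, dx=0.002, dt=10.0)
print(temps[1])  # converges to the linear steady profile: 2 + 18/9 = 4.0
```

    The analytic solution the paper derives plays the role of this numerical relaxation in closed form, coupled to the convection model in the other two zones.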

  16. Sample vial inserts: A better approach for sampling heterogeneous slurry samples in the SRS Defense Waste Processing Facility

    International Nuclear Information System (INIS)

    Coleman, C.J.; Goode, S.R.

    1996-01-01

    A convenient and effective new approach for analyzing DWPF samples involves the use of inserts with volumes of 1.5-3 ml placed in the neck of 14 ml sample vials. The inserts have rims that conform to the rim of the vials, so that they sit straight and stable in the vial. The DWPF tank sampling system fills the pre-weighed insert rather than the entire vial, so the vial functions only as the insert holder. The shielded-cell operator then removes the vial cap and decants the insert containing the sample into a plastic bottle, crucible, etc., for analysis. Inert materials such as Teflon, plastic, and zirconium are used for the insert, so it is unnecessary to separate the insert from the sample for most analyses. The key advantage of using inserts to take DWPF samples, versus filling sample vials, is that it provides a convenient and almost foolproof way of obtaining and handling small volumes of slurry samples in a shielded cell without corrupting the sample. Since the insert allows the entire sample to be analyzed, this approach eliminates the errors inherent in subsampling the heterogeneous slurries that comprise DWPF samples. Slurry samples can then be analyzed with confidence. Analysis times are dramatically reduced by eliminating the drying and vitrification steps normally used to produce a homogeneous solid sample. Direct dissolution and elemental analysis of slurry samples are achieved in 8 hours or less, compared with 40 hours for analysis of vitrified slurry samples. Comparison of samples taken in inserts versus full vials indicates that the insert does not significantly affect sample composition.

  17. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme-value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
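
    The Pareto-to-Fréchet link underlying this analysis can be checked with a quick Monte Carlo sketch on synthetic data (not the earthquake catalogue): for Pareto-tailed variables, block maxima normalized by n**(1/alpha) follow the Fréchet law F(x) = exp(-x**-alpha).

```python
import math
import random

# Monte Carlo check of the extreme-value limit: maxima of blocks of
# Pareto-distributed variables (P(X > x) = x**-alpha, x >= 1), normalized
# by n**(1/alpha), should follow the Frechet law. Synthetic data only.

random.seed(1)
alpha, block, n_blocks = 1.5, 1000, 2000

def pareto(alpha):
    return random.random() ** (-1.0 / alpha)   # inverse-CDF sampling

norm = block ** (1.0 / alpha)                  # normalizing sequence a_n
maxima = [max(pareto(alpha) for _ in range(block)) / norm
          for _ in range(n_blocks)]

x = 1.0
empirical = sum(m <= x for m in maxima) / n_blocks
frechet = math.exp(-x ** -alpha)               # exp(-1) ~ 0.368 at x = 1
print(empirical, frechet)                      # the two should be close
```

    The same comparison at several values of x traces out the full Fréchet CDF; fitting alpha from data is the harder statistical problem the paper addresses with exceedance distributions.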

  18. Hippocampal volumes are important predictors for memory function in elderly women

    Directory of Open Access Journals (Sweden)

    Adolfsdottir Steinunn

    2009-08-01

    Background: Normal aging involves a decline in cognitive function that has been shown to correlate with volumetric change in the hippocampus and with genetic variability in the APOE gene. In the present study we utilize 3D MR imaging, genetic analysis and assessment of verbal memory function to investigate relationships between these factors in a sample of 170 healthy volunteers (age range 46-77 years). Methods: Brain morphometric analysis was performed with the automated segmentation workflow implemented in FreeSurfer. The APOE genotype was determined with polymerase chain reaction (PCR) on DNA from whole blood. All individuals were subjected to extensive neuropsychological testing, including the California Verbal Learning Test-II (CVLT). To obtain robust and easily interpretable relationships between explanatory variables and verbal memory function, we applied the recent method of conditional inference trees in addition to scatterplot matrices and simple pairwise linear least-squares regression analysis. Results: APOE genotype had no significant impact on the CVLT results (scores on long-delay free recall, CVLT-LD) or the ICV-normalized hippocampal volumes. Hippocampal volumes were found to decrease with age, and a right-larger-than-left hippocampal asymmetry was also found. These findings are in accordance with previous studies. CVLT-LD score was shown to correlate with hippocampal volume. Multivariate conditional inference analysis showed that gender and left hippocampal volume largely dominated predictive values for CVLT-LD scores in our sample. Left hippocampal volume dominated predictive values for females but not for males. APOE genotype did not alter the model significantly, and age only partly influenced the results. Conclusion: Gender and left hippocampal volume are the main predictors of verbal memory function in normal aging. APOE genotype did not affect the results in any part of our analysis.

  19. Crowdsourcing for large-scale mosquito (Diptera: Culicidae) sampling

    Science.gov (United States)

    Sampling a cosmopolitan mosquito (Diptera: Culicidae) species throughout its range is logistically challenging and extremely resource intensive. Mosquito control programmes and regional networks operate at the local level and often conduct sampling activities across much of North America. A method f...

  20. The Effect of Stock Return Sequences on Trading Volumes

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2017-10-01

    The present study explores the effect of the gambler's fallacy on stock trading volumes. I hypothesize that if a stock's price rises (falls) during a number of consecutive trading days, then the gambler's fallacy may cause at least some of the investors to expect that the stock's price "has" to subsequently fall (rise), and thus to increase their willingness to sell (buy) the stock, resulting in a stronger degree of disagreement between the investors and a higher-than-usual stock trading volume on the first day when the stock's price indeed falls (rises). Employing a large sample of daily price and trading volume data, I document that following relatively long sequences of same-sign stock returns, on the days when the sign is reversed, the trading activity in the respective stocks is abnormally high. Moreover, average abnormal trading volumes gradually and significantly increase with the length of the preceding return sequence. The effect is slightly more pronounced following sequences of negative stock returns, and remains significant after controlling for other potentially influential factors, including contemporaneous and lagged actual and absolute stock returns, historical stock returns and volatilities, and company-specific events, such as earnings announcements and dividend payments.
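
    The event definition used in the study (the first day on which a long run of same-sign returns reverses) can be sketched as follows, with toy return and volume series rather than the study's data.

```python
# Flag "reversal days" that end a run of min_run or more same-sign returns,
# then compare their trading volume with the overall average.
# The return and volume series below are toy data.

def reversal_days(returns, min_run=3):
    """Indices of days whose return sign breaks a run of >= min_run days."""
    days = []
    run = 1
    for i in range(1, len(returns)):
        if (returns[i] > 0) == (returns[i - 1] > 0):
            run += 1
        else:
            if run >= min_run:
                days.append(i)  # first day the sign flips after a long run
            run = 1
    return days

returns = [0.01, 0.02, 0.01, -0.015, 0.005, -0.01, -0.02, -0.01, -0.005, 0.02]
volumes = [100, 110, 105, 160, 95, 100, 105, 110, 100, 170]

days = reversal_days(returns)
avg_all = sum(volumes) / len(volumes)
avg_reversal = sum(volumes[i] for i in days) / len(days)
print(days, avg_reversal, avg_all)  # reversal-day volume vs overall volume
```

    In the actual study, raw volume would be replaced by an abnormal-volume measure (volume relative to the stock's own recent history) before comparing reversal days with other days.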