WorldWideScience

Sample records for resolution ptr-tof quantification

  1. Analysis of high mass resolution PTR-TOF mass spectra from 1,3,5-trimethylbenzene (TMB) environmental chamber experiments

    Science.gov (United States)

    Müller, M.; Graus, M.; Wisthaler, A.; Hansel, A.; Metzger, A.; Dommen, J.; Baltensperger, U.

    2012-01-01

    A series of 1,3,5-trimethylbenzene (TMB) photo-oxidation experiments was performed in the 27-m3 Paul Scherrer Institute environmental chamber under various NOx conditions. A University of Innsbruck prototype high resolution Proton Transfer Reaction Time-of-Flight Mass Spectrometer (PTR-TOF) was used for measurements of gas and particulate phase organics. The gas phase mass spectrum displayed ~200 ion signals during the TMB photo-oxidation experiments. Molecular formulas CmHnNoOp were determined and ion signals were separated and grouped according to their C, O and N numbers. This made it possible to determine the time evolution of the O:C ratio and of the average carbon oxidation state (OSC) of the reaction mixture. Both quantities were compared with Master Chemical Mechanism (MCMv3.1) simulations. The O:C ratio in the particle phase was about twice the O:C ratio in the gas phase. Average carbon oxidation states of secondary organic aerosol (SOA) samples (OSC,SOA) were in the range of −0.34 to −0.31, in agreement with expected average carbon oxidation states of fresh SOA (OSC = −0.5 to 0).
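
    As a worked illustration of the bulk quantities mentioned above, the short Python sketch below computes the signal-weighted O:C and H:C ratios and the approximate average carbon oxidation state (OSC ≈ 2·O:C − H:C) from assigned ion formulas. The function name and the example formulas are hypothetical; the abstract does not provide the authors' code.

    ```python
    # Minimal sketch (not the authors' code): estimating O:C, H:C and the average
    # carbon oxidation state OS_C ~ 2*(O:C) - (H:C) from PTR-TOF ion assignments.
    # Ion signals are assumed to be given as (n_C, n_H, n_O, signal) tuples for the
    # neutral formulas (i.e. after removing the extra proton from VOC.H+).

    def mixture_oxidation_state(ions):
        """ions: iterable of (n_C, n_H, n_O, signal) for neutral molecules."""
        c = sum(nc * s for nc, nh, no, s in ions)
        h = sum(nh * s for nc, nh, no, s in ions)
        o = sum(no * s for nc, nh, no, s in ions)
        o_to_c = o / c
        h_to_c = h / c
        os_c = 2.0 * o_to_c - h_to_c   # standard approximation, ignores N and S
        return o_to_c, h_to_c, os_c

    # Hypothetical example: 1,3,5-TMB (C9H12) plus an oxidised product C9H12O3
    print(mixture_oxidation_state([(9, 12, 0, 100.0), (9, 12, 3, 40.0)]))
    ```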

  2. Eddy covariance VOC emission and deposition fluxes above grassland using PTR-TOF

    Directory of Open Access Journals (Sweden)

    T. M. Ruuskanen

    2011-01-01

    Eddy covariance (EC) is the preferable technique for flux measurements since it is the only direct flux determination method. It requires a continuum of high time resolution measurements (e.g. 5–20 Hz). For volatile organic compounds (VOC), soft ionization via proton transfer reaction has proven to be a quantitative method for real-time mass spectrometry; here we use a proton transfer reaction time-of-flight mass spectrometer (PTR-TOF) for 10 Hz EC measurements of full mass spectra up to m/z 315. The mass resolution of the PTR-TOF enabled the identification of chemical formulas and separation of oxygenated and hydrocarbon species exhibiting the same nominal mass. We determined 481 ion mass peaks from ambient air concentrations above a managed, temperate mountain grassland in Neustift, Stubai Valley, Austria. During harvesting we found significant fluxes of 18 compounds distributed over 43 ions, including protonated parent compounds, as well as their isotopes and fragments and VOC-H+ – water clusters. The dominant BVOC fluxes were methanol, acetaldehyde, ethanol, hexenal and other C6 leaf wound compounds, acetone, acetic acid, monoterpenes and sesquiterpenes.

    The smallest reliable fluxes we determined were less than 0.1 nmol m−2 s−1, as in the case of sesquiterpene emissions from freshly cut grass. Terpenoids, including mono- and sesquiterpenes, were also deposited to the grassland before and after the harvesting. During cutting, total VOC emission fluxes up to 200 nmolC m−2 s−1 were measured. Methanol emissions accounted for half of the emissions of oxygenated VOCs and a third of the carbon of all measured VOC emissions during harvesting.
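
    The flux values quoted above come from eddy covariance, i.e. the covariance of vertical wind and concentration fluctuations at 10 Hz. The sketch below is a minimal, generic illustration of that calculation with a simple lag search; the variable names and the synthetic data are assumptions, not the authors' processing code.

    ```python
    # Minimal sketch (assumed variable names): an eddy covariance flux from 10 Hz
    # time series of vertical wind w and a VOC mixing ratio c, with the lag found
    # by maximising the absolute cross-covariance.
    import numpy as np

    def ec_flux(w, c, fs=10.0, max_lag_s=5.0):
        """Return (flux, lag_s); flux has units of [c] * m s-1."""
        w = np.asarray(w) - np.mean(w)
        c = np.asarray(c) - np.mean(c)
        max_lag = int(max_lag_s * fs)
        lags = range(-max_lag, max_lag + 1)
        covs = [np.mean(w[max(0, -k):len(w) - max(0, k)] *
                        c[max(0, k):len(c) - max(0, -k)]) for k in lags]
        best = int(np.argmax(np.abs(covs)))
        return covs[best], lags[best] / fs

    # Synthetic demo: a tracer signal that lags the wind by 0.5 s
    rng = np.random.default_rng(0)
    w = rng.normal(size=36000)                      # one hour at 10 Hz
    c = np.roll(0.2 * w, 5) + rng.normal(scale=0.5, size=36000)
    print(ec_flux(w, c))
    ```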

  3. Field performance and identification capability of the Innsbruck PTR-TOF

    Science.gov (United States)

    Graus, M.; Müller, M.; Hansel, A.

    2009-04-01

    Over the last one and a half decades Proton Transfer Reaction Mass Spectrometry (PTR-MS) [1, 2] has gained recognition as a fast on-line sensor for monitoring volatile organic compounds (VOC) in the atmosphere. Sample collection is very straightforward, and the fact that no pre-concentration is needed is of particular advantage for compounds that are notoriously difficult to pre-concentrate and/or analyze by gas chromatographic (GC) methods. Its ionization method is very versatile, i.e. all compounds that undergo exothermic proton transfer with hydronium ions - and most VOCs do so - are readily ionized, producing quasi-molecular ions VOC.H+. In the quasi-molecular ion the elemental composition of the analyte compound is conserved, which allows, in combination with some background knowledge of the sample, conclusions about the identity of that compound. De Gouw and Warneke (2007) [3] summarized the applicability of PTR-MS in atmospheric chemistry but also pointed out shortcomings in its identification capabilities. Goldstein and Galbally (2007) [4] addressed the multitude of VOCs potentially present in the atmosphere and emphasized the gas-phase-to-aerosol partitioning of organic compounds (volatile and semi-volatile) as a function of carbon-chain length and oxygen-containing functional groups. In collaboration with Ionicon and assisted by TOFWERK we developed a PTR time-of-flight (PTR-TOF) instrument that allows for the identification of the atomic composition of oxygenated hydrocarbons by exact-mass determination. A detection limit in the low pptv range was achieved at a time resolution of one minute; the one-second detection limit is in the sub-ppbv range. In 2008 the Innsbruck PTR-TOF was field-deployed in the icebreaker- and helicopter-based Arctic Summer Cloud Ocean Study (ASCOS) to characterize the organic trace gas composition of the High Arctic atmosphere. During the six-week field campaign the PTR-TOF was run without problems even under harsh conditions in

  4. Biogenic volatile organic compound analyses by PTR-TOF-MS: Calibration, humidity effect and reduced electric field dependency.

    Science.gov (United States)

    Pang, Xiaobing

    2015-06-01

    Green leaf volatiles (GLVs) emitted by plants after stress or damage induction are a major part of biogenic volatile organic compounds (BVOCs). Proton transfer reaction time-of-flight mass spectrometry (PTR-TOF-MS) is a high-resolution and sensitive technique for in situ GLV analyses, although its performance is strongly influenced by humidity and the reduced electric field. In this study the influence of gas humidity and the effect of reduced field (E/N) were examined in addition to measuring calibration curves for the GLVs. Calibration curves measured for seven of the GLVs in dry air were linear, with sensitivities ranging from 5 to 10 ncps/ppbv (normalized counts per second per part per billion by volume). The sensitivities for most GLV analyses were found to increase by between 20% and 35% when the humidity of the sample gas was raised from 0% to 70% relative humidity (RH) at 21°C, with the exception of (E)-2-hexenol. Product ion branching ratios were also affected by humidity, with the relative abundance of the protonated molecular ions and higher-mass fragment ions increasing with humidity. The effect of reduced field (E/N) on the fragmentation of GLVs was examined in the drift tube of the PTR-TOF-MS. The structurally similar GLVs are acutely susceptible to fragmentation following ionization and the fragmentation patterns are highly dependent on E/N. Overall, the measured fragmentation patterns contain sufficient information to permit at least partial separation and identification of the isomeric GLVs by comparing their fragmentation patterns at high and low E/N. Copyright © 2015. Published by Elsevier B.V.
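
    A calibration sensitivity in ncps/ppbv, as reported above, converts a normalized ion count rate into a mixing ratio. The sketch below illustrates that conversion with a humidity-dependent sensitivity interpolated between calibration points; all numbers are placeholders chosen within the range quoted in the abstract, not measured values.

    ```python
    # Minimal sketch (hypothetical numbers): converting a normalised PTR-TOF signal
    # (ncps) to a mixing ratio (ppbv) using a sensitivity interpolated in relative
    # humidity, reflecting the humidity dependence reported above.
    import numpy as np

    rh_points = np.array([0.0, 35.0, 70.0])       # % RH of the calibration gas
    sens_points = np.array([6.0, 6.8, 7.6])       # ncps/ppbv, assumed values

    def mixing_ratio_ppbv(ncps, rh):
        sensitivity = np.interp(rh, rh_points, sens_points)
        return ncps / sensitivity

    print(mixing_ratio_ppbv(ncps=42.0, rh=50.0))  # ppbv at 50% RH
    ```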

  5. Analytical detection of explosives and illicit, prescribed and designer drugs using proton transfer reaction time-of-flight mass spectrometry (PTR-TOF-MS)

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Bishu; Petersson, Fredrik; Juerschik, Simone [Institut fuer Ionenphysik und Angewandte Physik, Universitaet Innsbruck, Technikerstr. 25, 6020 Innsbruck (Austria); Sulzer, Philipp; Jordan, Alfons [IONICON Analytik GmbH, Eduard-Bodem-Gasse 3, 6020 Innsbruck (Austria); Maerk, Tilmann D. [Institut fuer Ionenphysik und Angewandte Physik, Universitaet Innsbruck, Technikerstr. 25, 6020 Innsbruck (Austria); IONICON Analytik GmbH, Eduard-Bodem-Gasse 3, 6020 Innsbruck (Austria); Watts, Peter; Mayhew, Chris A. [School of Physics and Astronomy, University of Birmingham, Edgbaston, Birmingham B15 4TT (United Kingdom)

    2011-07-01

    This work demonstrates the extremely favorable features of Proton Transfer Reaction Time-of-Flight Mass Spectrometry (PTR-TOF-MS) for the detection and identification of solid explosives, chemical warfare agent simulants and illicit, prescribed and designer drugs in real time. Here, we report the use of PTR-TOF for the detection of explosives (e.g., trinitrotoluene, trinitrobenzene) and illicit, prescribed and designer drugs (e.g., ecstasy, morphine, heroin, ethcathinone, 2C-D). For all substances, the protonated parent ion (as we used H3O+ as the reagent ion) could be detected, providing a high level of confidence in their identification since the high mass resolution allows compounds having the same nominal mass to be separated. We varied the reduced electric field E/N from 90 to 220 Td (1 Td = 10^-17 V cm^2). This allowed us to study fragmentation pathways as a function of E/N. For a few compounds rather unusual E/N dependencies were also discovered.
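
    The reduced electric field E/N quoted in Townsend follows directly from the drift voltage, the drift-tube length and the gas number density. The sketch below shows that standard calculation; the drift-tube parameters in the example are typical illustrative values, not those of the instrument described above.

    ```python
    # Minimal sketch: the reduced electric field E/N of a PTR drift tube in
    # Townsend (1 Td = 1e-17 V cm^2), computed from drift voltage, drift-tube
    # length and the buffer-gas number density. Parameter values are illustrative.
    def reduced_field_td(u_drift_v, length_cm, p_mbar, t_kelvin):
        k_b = 1.380649e-23                                   # J/K
        n_cm3 = (p_mbar * 100.0) / (k_b * t_kelvin) * 1e-6   # molecules per cm^3
        e_field = u_drift_v / length_cm                      # V/cm
        return e_field / n_cm3 / 1e-17                       # Td

    print(reduced_field_td(u_drift_v=600.0, length_cm=9.6, p_mbar=2.3, t_kelvin=333.0))
    ```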

  6. Does the novel fast-GC coupled with PTR-TOF-MS allow a significant advancement in detecting VOC emissions from plants?

    Czech Academy of Sciences Publication Activity Database

    Pallozzi, E.; Guidolotti, G.; Ciccioli, P.; Brilli, F.; Feil, S.; Calfapietra, Carlo

    2016-01-01

    Vol. 216, January 2016, pp. 232-240. ISSN 0168-1923. R&D Projects: GA MŠk(CZ) LD13031; GA MŠk(CZ) LO1415. Institutional support: RVO:67179843. Keywords: VOC; gas chromatography; time of flight; PTR-TOF-MS; Quercus; Eucalyptus. Subject RIV: EH - Ecology, Behaviour. Impact factor: 3.887, year: 2016

  7. Overview of VOC emissions and chemistry from PTR-TOF-MS measurements during the SusKat-ABC campaign: high acetaldehyde, isoprene and isocyanic acid in wintertime air of the Kathmandu Valley

    Science.gov (United States)

    Sarkar, Chinmoy; Sinha, Vinayak; Kumar, Vinod; Rupakheti, Maheswar; Panday, Arnico; Mahata, Khadak S.; Rupakheti, Dipesh; Kathayat, Bhogendra; Lawrence, Mark G.

    2016-03-01

    The Kathmandu Valley in Nepal suffers from severe wintertime air pollution. Volatile organic compounds (VOCs) are key constituents of air pollution, though their specific role in the valley is poorly understood due to insufficient data. During the SusKat-ABC (Sustainable Atmosphere for the Kathmandu Valley-Atmospheric Brown Clouds) field campaign conducted in Nepal in the winter of 2012-2013, a comprehensive study was carried out to characterise the chemical composition of ambient Kathmandu air, including the determination of speciated VOCs, by deploying a proton transfer reaction time-of-flight mass spectrometer (PTR-TOF-MS) - the first such deployment in South Asia. In the study, 71 ion peaks (for which measured ambient concentrations exceeded the 2σ detection limit) were detected in the PTR-TOF-MS mass scan data, highlighting the chemical complexity of ambient air in the valley. Of the 71 species, 37 were found to have campaign average concentrations greater than 200 ppt and were identified based on their spectral characteristics, ambient diel profiles and correlation with specific emission tracers as a result of the high mass resolution (m/Δm > 4200) and temporal resolution (1 min) of the PTR-TOF-MS. The concentration ranking in the average VOC mixing ratios during our wintertime deployment was acetaldehyde (8.8 ppb) > methanol (7.4 ppb) > acetone + propanal (4.2 ppb) > benzene (2.7 ppb) > toluene (1.5 ppb) > isoprene (1.1 ppb) > acetonitrile (1.1 ppb) > C8-aromatics (~1 ppb) > furan (~0.5 ppb) > C9-aromatics (0.4 ppb). Distinct diel profiles were observed for the nominal isobaric compounds isoprene (m/z = 69.070) and furan (m/z = 69.033). Comparison with wintertime measurements from several locations elsewhere in the world showed mixing ratios of acetaldehyde (~9 ppb), acetonitrile (~1 ppb) and isoprene (~1 ppb) to be among the highest reported to date. Two "new" ambient compounds, namely formamide (m/z = 46.029) and acetamide (m/z

  8. Eddy covariance emission and deposition flux measurements using proton transfer reaction – time of flight – mass spectrometry (PTR-TOF-MS): comparison with PTR-MS measured vertical gradients and fluxes

    NARCIS (Netherlands)

    Park, J.H.; Goldstein, A.H.; Timkovsky, J.; Fares, S.; Weber, R.; Karlik, J.; Holzinger, R.

    2013-01-01

    During summer 2010, a proton transfer reaction – time of flight – mass spectrometer (PTR-TOF-MS) and a quadrupole proton transfer reaction mass spectrometer (PTR-MS) were deployed simultaneously for one month in an orange orchard in the Central Valley of California to collect continuous data

  9. Overview of VOC emissions and chemistry from PTR-TOF-MS measurements during the SusKat-ABC campaign: high acetaldehyde, isoprene and isocyanic acid in wintertime air of the Kathmandu Valley

    Directory of Open Access Journals (Sweden)

    C. Sarkar

    2016-03-01

    The Kathmandu Valley in Nepal suffers from severe wintertime air pollution. Volatile organic compounds (VOCs) are key constituents of air pollution, though their specific role in the valley is poorly understood due to insufficient data. During the SusKat-ABC (Sustainable Atmosphere for the Kathmandu Valley–Atmospheric Brown Clouds) field campaign conducted in Nepal in the winter of 2012–2013, a comprehensive study was carried out to characterise the chemical composition of ambient Kathmandu air, including the determination of speciated VOCs, by deploying a proton transfer reaction time-of-flight mass spectrometer (PTR-TOF-MS) – the first such deployment in South Asia. In the study, 71 ion peaks (for which measured ambient concentrations exceeded the 2σ detection limit) were detected in the PTR-TOF-MS mass scan data, highlighting the chemical complexity of ambient air in the valley. Of the 71 species, 37 were found to have campaign average concentrations greater than 200 ppt and were identified based on their spectral characteristics, ambient diel profiles and correlation with specific emission tracers as a result of the high mass resolution (m/Δm > 4200) and temporal resolution (1 min) of the PTR-TOF-MS. The concentration ranking in the average VOC mixing ratios during our wintertime deployment was acetaldehyde (8.8 ppb) > methanol (7.4 ppb) > acetone + propanal (4.2 ppb) > benzene (2.7 ppb) > toluene (1.5 ppb) > isoprene (1.1 ppb) > acetonitrile (1.1 ppb) > C8-aromatics (~1 ppb) > furan (~0.5 ppb) > C9-aromatics (0.4 ppb). Distinct diel profiles were observed for the nominal isobaric compounds isoprene (m/z = 69.070) and furan (m/z = 69.033). Comparison with wintertime measurements from several locations elsewhere in the world showed mixing ratios of acetaldehyde (~9 ppb), acetonitrile (~1 ppb) and isoprene

  10. Evaluation of an on-line methodology for measuring volatile organic compounds (VOC) fluxes by eddy-covariance with a PTR-TOF-Qi-MS

    Science.gov (United States)

    Loubet, Benjamin; Buysse, Pauline; Lafouge, Florence; Ciuraru, Raluca; Decuq, Céline; Zurfluh, Olivier

    2017-04-01

    Field-scale flux measurements of volatile organic compounds (VOC) are essential for improving our knowledge of VOC emissions from ecosystems. Many VOCs are emitted from and deposited to ecosystems. Crops, which represent more than 50% of French terrestrial surfaces, are especially poorly characterised. In this study, we evaluate a new on-line methodology for measuring VOC fluxes by eddy covariance with a PTR-Qi-TOF-MS. Measurements were performed at the ICOS FR-GRI site over a crop using a 30 m long, high-flow-rate sampling line and an ultrasonic anemometer. A LabVIEW program was specially designed for acquisition and on-line covariance calculation: whole mass spectra (240,000 channels) were acquired on-line at 10 Hz and stored in a temporary memory. Every 5 minutes, the spectra were mass-calibrated and normalized by the primary ion peak integral at 10 Hz. The mass spectral peaks were then retrieved from the 5-min averaged spectra by removing the baseline, determining the resolution and using a multiple-peak detection algorithm. In order to optimize the peak detection algorithm for the covariance, we determined the covariances as the integrals of the peaks of the vertical-air-velocity-fluctuation weighted-averaged spectra. In other words, we calculate ⟨w′ · Sp′(t − lag)⟩, where w is the vertical component of the air velocity, Sp is the spectrum, t is time, lag is the decorrelation lag time and ⟨ ⟩ denotes an average. The lag time was determined as the decorrelation time between w and the primary ion (at mass 21.022), which integrates the contribution of all reactions of VOC and water with the primary ion. Our algorithm was evaluated by comparing the exchange velocity of water vapor measured by an open-path absorption spectroscopy instrument and the water cluster measured with the PTR-Qi-TOF-MS. The influence of the algorithm parameters and lag determination is discussed. This study was supported by the ADEME-CORTEA COV3ER project (http://www6.inra.fr/cov3er).
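
    The covariance described above can be written compactly as a lagged covariance of each spectral channel with the vertical wind fluctuation. The sketch below is a minimal NumPy illustration of that step, expressed as ⟨w′(t − lag)·Sp′(t)⟩ (equivalent up to the sign convention of the lag); the array names, the synthetic data and the fixed lag are assumptions, and in practice the lag would be taken from the primary ion as described above.

    ```python
    # Minimal sketch (not the authors' LabVIEW code): the covariance of every
    # spectral channel with the vertical wind fluctuation at a single lag.
    import numpy as np

    def lagged_covariance(w, spectra, lag_samples):
        """w: (n_t,) vertical wind; spectra: (n_t, n_channels) 10 Hz spectra.
        Returns <w'(t - lag) * Sp'(t)> per channel; lag_samples > 0 means the
        spectra lag the wind (e.g. sampling-line delay)."""
        w_prime = w - w.mean()
        sp_prime = spectra - spectra.mean(axis=0)
        if lag_samples > 0:
            w_prime, sp_prime = w_prime[:-lag_samples], sp_prime[lag_samples:]
        elif lag_samples < 0:
            w_prime, sp_prime = w_prime[-lag_samples:], sp_prime[:lag_samples]
        return (w_prime[:, None] * sp_prime).mean(axis=0)

    # Synthetic demo: 5 min of 10 Hz data, 4 channels, channel 2 carries a flux
    rng = np.random.default_rng(1)
    w = rng.normal(size=3000)
    spectra = rng.normal(size=(3000, 4))
    spectra[:, 2] += 0.5 * np.roll(w, 3)      # spectrum delayed by 3 samples
    print(lagged_covariance(w, spectra, lag_samples=3))
    ```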

  11. Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.

    Science.gov (United States)

    Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H

    2014-08-01

    This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy are also explored.

  12. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Udo D. [Yale Univ., New Haven, CT (United States). Dept. of Mechanical Engineering and Materials Science; Altman, Eric I. [Yale Univ., New Haven, CT (United States). Dept. of Chemical and Environmental Engineering

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3DAFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectron volts in force/energy. From this experimental platform, we further expanded by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  13. Quantification of resolution in multiplanar reconstructions for digital breast tomosynthesis

    Science.gov (United States)

    Vent, Trevor L.; Acciavatti, Raymond J.; Kwon, Young Joon; Maidment, Andrew D. A.

    2016-03-01

    Multiplanar reconstruction (MPR) in digital breast tomosynthesis (DBT) allows tomographic images to be portrayed in various orientations. We have conducted research to determine the resolution of tomosynthesis MPR. We built a phantom that houses a star test pattern to measure resolution. This phantom provides three rotational degrees of freedom. The design consists of two hemispheres with longitudinal and latitudinal grooves that reference angular increments. When joined together, the hemispheres form a dome that sits inside a cylindrical encasement. The cylindrical encasement contains reference notches to match the longitudinal and latitudinal grooves that guide the phantom's rotations. With this design, any orientation of the star-pattern can be analyzed. Images of the star-pattern were acquired using a DBT mammography system at the Hospital of the University of Pennsylvania. Images taken were reconstructed and analyzed by two different methods. First, the maximum visible frequency (in line pairs per millimeter) of the star test pattern was measured. Then, the contrast was calculated at a fixed spatial frequency. These analyses confirm that resolution decreases with tilt relative to the breast support. They also confirm that resolution in tomosynthesis MPR is dependent on object orientation. Current results verify that the existence of super-resolution depends on the orientation of the frequency; the direction parallel to x-ray tube motion shows super-resolution. In conclusion, this study demonstrates that the direction of the spatial frequency relative to the motion of the x-ray tube is a determinant of resolution in MPR for DBT.

  14. Quantification of upland thermokarst features with high resolution remote sensing

    International Nuclear Information System (INIS)

    Belshe, E F; Schuur, E A G; Grosse, G

    2013-01-01

    Climate-induced changes to permafrost are altering high latitude landscapes in ways that could increase the vulnerability of the vast soil carbon pools of the region. Permafrost thaw is temporally dynamic and spatially heterogeneous because, in addition to the thickening of the active layer, localized thermokarst features form when ice-rich permafrost thaws and the ground subsides. Thermokarst produces a diversity of landforms and alters the physical environment in dynamic ways. To estimate potential changes to the carbon cycle it is imperative to quantify the size and distribution of thermokarst landforms. By performing a supervised classification on a high resolution IKONOS image, we detected and mapped small, irregular thermokarst features occurring within an upland watershed in discontinuous permafrost of Interior Alaska. We found that 12% of the Eight Mile Lake (EML) watershed has undergone thermokarst, predominantly in valleys where tussock tundra resides. About 35% of the 3.7 km² tussock tundra class has likely transitioned to thermokarst. These landscape level changes created by permafrost thaw at EML have important implications for ecosystem carbon cycling because thermokarst features are forming in carbon-rich areas and are altering the hydrology in ways that increase seasonal thawing of the soil. (letter)

  15. Quantification of Iodine-123-FP-CIT SPECT with a resolution-independent method

    International Nuclear Information System (INIS)

    Dobbeleir, A.A.; Ham, H.R.; Hambye, A.E.; Vervaet, A.M.

    2005-01-01

    Accurate quantification of small-sized objects by SPECT is hampered by the partial volume effect. The present work evaluates the magnitude of this phenomenon with Iodine-123 in phantom studies, and presents a resolution-independent method to quantify striatal I-123 FP-CIT uptake in patients. First, five syringes with internal diameters varying between 9 and 29 mm and an anthropomorphic striatal phantom were filled with known concentrations of Iodine-123 and imaged by SPECT using different collimators and radii of rotation. Data were processed with and without scatter correction. From the measured activities, calibration factors were calculated for each specific collimator. Then a resolution-independent method for FP-CIT quantification using large regions of interest was developed and validated in 34 human studies (controls and patients) acquired in 2 different hospitals, by comparing its results to those obtained by a semi-quantitative striatal-to-occipital analysis. Taking the injected activity and decay into account, the measured counts/volume could be converted into absolute tracer concentrations. For the fan-beam, high-resolution and medium-energy collimators, the measured maximum activity in comparison to the 29 mm-diameter syringe was respectively 38%, 16% and 9% for the 9 mm-diameter syringe and 82%, 80% and 30% for the 16 mm syringe, and was not significantly modified after scatter correction. For the anthropomorphic phantom, the error in measurement in % of the true concentration ranged between 0.3-9.5% and was collimator dependent. Medium-energy collimators yielded the most homogeneous results. In the human studies, inter-observer variability was 11.4% for the striatal-to-occipital ratio and 3.1% for the resolution-independent method, with correlation coefficients >0.8 between both. The resolution-independent method was 89%-sensitive and 100%-specific to separate the patients without and with abnormal FP-CIT uptake (accuracy: 94%). Also the

  16. Quantification of steroid hormones in human serum by liquid chromatography-high resolution tandem mass spectrometry.

    Science.gov (United States)

    Matysik, Silke; Liebisch, Gerhard

    2017-12-01

    A limited specificity is inherent to immunoassays for steroid hormone analysis. To improve selectivity, mass spectrometric analysis of steroid hormones by liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been introduced in the clinical laboratory over the past years, usually with low mass resolution triple-quadrupole instruments or, more recently, with high resolution mass spectrometry (HR-MS). Here we introduce liquid chromatography-high resolution tandem mass spectrometry (LC-MS/HR-MS) to further increase the selectivity of steroid hormone quantification. Application of HR-MS demonstrates an enhanced selectivity compared to low mass resolution. Separation of isobaric interferences reduces background noise and avoids overestimation. Samples were prepared by automated liquid-liquid extraction with MTBE. The LC-MS/HR-MS method using a quadrupole-Orbitrap analyzer includes eight steroid hormones, i.e. androstenedione, corticosterone, cortisol, cortisone, 11-deoxycortisol, 17-hydroxyprogesterone, progesterone, and testosterone. It has a run time of 5.3 min and was validated according to the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA) guidelines. For most of the analytes, coefficients of variation were 10% or lower and LOQs were determined significantly below 1 ng/ml. Full product ion spectra including accurate masses substantiate compound identification by matching their masses and ratios with authentic standards. In summary, quantification of steroid hormones by LC-MS/HR-MS is applicable for clinical diagnostics and also holds promise for highly selective quantification of other small molecules. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Myocardial Infarction Area Quantification using High-Resolution SPECT Images in Rats

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luciano Fonseca Lemos de [Divisão de Cardiologia, Departamento de Clínica Médica, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Mejia, Jorge [Faculdade de Medicina de São José do Rio Preto, São José do Rio Preto, SP (Brazil); Carvalho, Eduardo Elias Vieira de; Lataro, Renata Maria; Frassetto, Sarita Nasbine [Divisão de Cardiologia, Departamento de Clínica Médica, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Fazan, Rubens Jr.; Salgado, Hélio Cesar [Departamento de Fisiologia, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil); Galvis-Alonso, Orfa Yineth [Faculdade de Medicina de São José do Rio Preto, São José do Rio Preto, SP (Brazil); Simões, Marcus Vinícius, E-mail: msimoes@fmrp.usp.br [Divisão de Cardiologia, Departamento de Clínica Médica, Faculdade de Medicina de Ribeirão Preto, Universidade de São Paulo, Ribeirão Preto, SP (Brazil)

    2013-07-15

    Imaging techniques enable in vivo sequential assessment of the morphology and function of animal organs in experimental models. We developed a device for high-resolution single photon emission computed tomography (SPECT) imaging based on an adapted pinhole collimator. The aim was to determine the accuracy of this system for quantification of myocardial infarct area in rats. Thirteen male Wistar rats (250 g) underwent experimental myocardial infarction by occlusion of the left coronary artery. After 4 weeks, SPECT images were acquired 1.5 hours after intravenous injection of 555 MBq of 99mTc-Sestamibi. The tomographic reconstruction was performed by using specially developed software based on the Maximum Likelihood algorithm. The analysis of the data included the correlation between the area of perfusion defects detected by scintigraphy and the extent of myocardial fibrosis assessed by histology. The images showed a high target organ/background ratio with adequate visualization of the left ventricular walls and cavity. All animals presenting infarction areas were correctly identified by the perfusion images. There was no difference between the infarct area as measured by SPECT (21.1 ± 21.2%) and by histology (21.7 ± 22.0%; p=0.45). There was a strong correlation between individual values of the area of infarction measured by these two methods. The developed system presented adequate spatial resolution and high accuracy for the detection and quantification of myocardial infarction areas, constituting a low-cost and versatile option for high-resolution SPECT imaging of small rodents.

  18. Robust high-resolution quantification of time signals encoded by in vivo magnetic resonance spectroscopy

    Science.gov (United States)

    Belkić, Dževad; Belkić, Karen

    2018-01-01

    This paper on molecular imaging emphasizes improving specificity of magnetic resonance spectroscopy (MRS) for early cancer diagnostics by high-resolution data analysis. Sensitivity of magnetic resonance imaging (MRI) is excellent, but specificity is insufficient. Specificity is improved with MRS by going beyond morphology to assess the biochemical content of tissue. This is contingent upon accurate data quantification of diagnostically relevant biomolecules. Quantification is spectral analysis which reconstructs chemical shifts, amplitudes and relaxation times of metabolites. Chemical shifts inform on electronic shielding of resonating nuclei bound to different molecular compounds. Oscillation amplitudes in time signals retrieve the abundance of MR sensitive nuclei whose number is proportional to metabolite concentrations. Transverse relaxation times, the reciprocal of decay probabilities of resonances, arise from spin-spin coupling and reflect local field inhomogeneities. In MRS single voxels are used. For volumetric coverage, multi-voxels are employed within a hybrid of MRS and MRI called magnetic resonance spectroscopic imaging (MRSI). Common to MRS and MRSI is encoding of time signals and subsequent spectral analysis. Encoded data do not provide direct clinical information. Spectral analysis of time signals can yield the quantitative information, of which metabolite concentrations are the most clinically important. This information is equivocal with standard data analysis through the non-parametric, low-resolution fast Fourier transform and post-processing via fitting. By applying the fast Padé transform (FPT) with high-resolution, noise suppression and exact quantification via quantum mechanical signal processing, advances are made, presented herein, focusing on four areas of critical public health importance: brain, prostate, breast and ovarian cancers.

  19. Quantification and spatial distribution of salicylic acid in film tablets using FT-Raman mapping with multivariate curve resolution

    OpenAIRE

    Haslet Eksi-Kocak; Sibel Ilbasmis Tamer; Sebnem Yilmaz; Merve Eryilmaz; Ismail Hakkı Boyaci; Ugur Tamer

    2018-01-01

    In this study, we proposed a rapid and sensitive method for quantification and spatial distribution of salicylic acid in film tablets using FT-Raman spectroscopy with multivariate curve resolution (MCR). For this purpose, the constituents of film tablets were identified by using FT-Raman spectroscopy, and then eight different concentrations of salicylic acid tablets were visualized by Raman mapping. MCR was applied to mapping data to expose the active pharmaceutical ingredients in the presenc...

  20. Radiologist agreement on the quantification of bronchiectasis by high-resolution computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Milene Carneiro Barbosa de, E-mail: milenebrito7@gmail.com [Clinica da Imagem do Tocantins, Araguaia, TO (Brazil); Ota, Mauricio Kenji [Fundacao Instituto de Pesquisa e Estudos de Diagnostico por Imagem (FIDI), Sao Paulo, SP (Brazil); Leitao Filho, Fernando Sergio Studart [Universidade de Fortaleza (UNIFOR), Fortaleza, CE (Brazil); Meirelles, Gustavo de Souza Portes [Grupo Fleury, Sao Paulo, SP (Brazil)

    2017-01-15

    Objective: To evaluate radiologist agreement on the quantification of bronchiectasis by high-resolution computed tomography (HRCT). Materials and Methods: The HRCT scans of 43 patients with bronchiectasis were analyzed by two radiologists, who used a scoring system to grade the findings. Kappa (κ) values and overall agreement were calculated. Results: For the measurement and appearance of bronchiectasis, the interobserver agreement was moderate (κ = 0.45 and κ = 0.43, respectively), as was the intraobserver agreement (κ = 0.54 and κ = 0.47, respectively). Agreement on the presence of mucous plugging was fair, for central distribution (overall interobserver agreement of 68.3% and κ = 0.39 for intraobserver agreement) and for peripheral distribution (κ = 0.34 and κ = 0.35 for interobserver and intraobserver agreement, respectively). The agreement was also fair for peribronchial thickening (κ = 0.21 and κ = 0.30 for interobserver and intraobserver agreement, respectively). There was fair interobserver and intraobserver agreement on the detection of opacities (κ = 0.39 and 71.9%, respectively), ground-glass attenuation (64.3% and κ = 0.24, respectively), and cysts/bullae (κ = 0.47 and κ = 0.44, respectively). Qualitative analysis of the HRCT findings of bronchiectasis and the resulting individual patient scores showed that there was an excellent correlation between the observers (intraclass correlation coefficient of 0.85 and 0.81 for interobserver and intraobserver agreement, respectively). Conclusion: In the interpretation of HRCT findings of bronchiectasis, radiologist agreement appears to be fair. In our final analysis of the findings using the proposed score, we observed excellent interobserver and intraobserver agreement. (author)

  1. Radiologist agreement on the quantification of bronchiectasis by high-resolution computed tomography

    International Nuclear Information System (INIS)

    Brito, Milene Carneiro Barbosa de; Ota, Mauricio Kenji; Leitao Filho, Fernando Sergio Studart; Meirelles, Gustavo de Souza Portes

    2017-01-01

    Objective: To evaluate radiologist agreement on the quantification of bronchiectasis by high-resolution computed tomography (HRCT). Materials and Methods: The HRCT scans of 43 patients with bronchiectasis were analyzed by two radiologists, who used a scoring system to grade the findings. Kappa (κ) values and overall agreement were calculated. Results: For the measurement and appearance of bronchiectasis, the interobserver agreement was moderate (κ = 0.45 and κ = 0.43, respectively), as was the intraobserver agreement (κ = 0.54 and κ = 0.47, respectively). Agreement on the presence of mucous plugging was fair, for central distribution (overall interobserver agreement of 68.3% and κ = 0.39 for intraobserver agreement) and for peripheral distribution (κ = 0.34 and κ = 0.35 for interobserver and intraobserver agreement, respectively). The agreement was also fair for peribronchial thickening (κ = 0.21 and κ = 0.30 for interobserver and intraobserver agreement, respectively). There was fair interobserver and intraobserver agreement on the detection of opacities (κ = 0.39 and 71.9%, respectively), ground-glass attenuation (64.3% and κ = 0.24, respectively), and cysts/bullae (κ = 0.47 and κ = 0.44, respectively). Qualitative analysis of the HRCT findings of bronchiectasis and the resulting individual patient scores showed that there was an excellent correlation between the observers (intraclass correlation coefficient of 0.85 and 0.81 for interobserver and intraobserver agreement, respectively). Conclusion: In the interpretation of HRCT findings of bronchiectasis, radiologist agreement appears to be fair. In our final analysis of the findings using the proposed score, we observed excellent interobserver and intraobserver agreement. (author)
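
    The interobserver statistics quoted in these two records are Cohen's kappa values. The sketch below shows the standard kappa calculation for two readers; the rating labels and data are illustrative only, not the study's ratings.

    ```python
    # Minimal sketch: Cohen's kappa for two readers scoring the same findings,
    # the chance-corrected agreement statistic quoted in the abstract.
    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        n = len(ratings_a)
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        expected = sum(freq_a[k] * freq_b[k] for k in freq_a) / (n * n)
        return (observed - expected) / (1.0 - expected)

    reader1 = ["mild", "moderate", "severe", "mild", "moderate", "mild"]
    reader2 = ["mild", "moderate", "moderate", "mild", "severe", "mild"]
    print(round(cohens_kappa(reader1, reader2), 2))   # prints 0.45
    ```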

  2. Histamine quantification in human plasma using high resolution accurate mass LC-MS technology.

    Science.gov (United States)

    Laurichesse, Mathieu; Gicquel, Thomas; Moreau, Caroline; Tribut, Olivier; Tarte, Karin; Morel, Isabelle; Bendavid, Claude; Amé-Thomas, Patricia

    2016-01-01

    Histamine (HA) is a small amine playing an important role in anaphylactic reactions. In order to identify and quantify HA in a plasma matrix, different methods have been developed but they present several disadvantages. Here, we developed an alternative method using liquid chromatography coupled with an ultra-high resolution and accurate mass instrument, Q Exactive™ (Thermo Fisher) (LCHRMS). The method includes a protein precipitation of plasma samples spiked with HA-d4 as internal standard (IS). LC separation was performed on a C18 Accucore column (100 × 2.1 mm, 2.6 μm) using a mobile phase containing nonafluoropentanoic acid (3 nM) and acetonitrile with 0.1% (v/v) formic acid in gradient mode. Separation of analytes was obtained within 10 min. Analysis was performed in full scan mode and targeted MS2 mode using a 5 ppm mass window. Ion transitions monitored for targeted MS2 mode were m/z 112.0869 > 95.0607 for HA and m/z 116.1120 > 99.0855 for HA-d4. Calibration curves were obtained by adding standard calibration dilutions at 1 to 180 nM in TrisBSA. Elution of HA and IS occurred at 4.1 min. The method was validated over a range of concentrations from 1 nM to 100 nM. The intra- and inter-run precisions were <15% for quality controls. Human plasma samples from 30 patients were analyzed by LCHRMS, and the results were highly correlated with those obtained using the gold standard radioimmunoassay (RIA) method. Overall, we demonstrate here that LCHRMS is a sensitive method for histamine quantification in human plasma, suitable for routine use in medical laboratories. In addition, LCHRMS is less time-consuming than RIA, avoids the use of radioactivity, and could therefore be considered as an alternative quantitative method. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
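
    Quantification against a deuterated internal standard, as described above, reduces to fitting the analyte/IS response ratio of calibrators against nominal concentration and inverting the fit for unknowns. The sketch below illustrates this; the calibration concentrations and area ratios are invented placeholders within the 1-180 nM range mentioned in the abstract.

    ```python
    # Minimal sketch (illustrative numbers): isotope-dilution quantification -
    # fit the HA / HA-d4 peak-area ratio of calibrators against concentration,
    # then back-calculate unknown samples from the fitted line.
    import numpy as np

    cal_conc = np.array([1, 5, 20, 60, 120, 180], dtype=float)     # nM, assumed
    cal_ratio = np.array([0.012, 0.060, 0.24, 0.71, 1.45, 2.18])   # assumed ratios

    slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

    def quantify(sample_ratio):
        return (sample_ratio - intercept) / slope                  # nM

    print(round(quantify(0.95), 1))   # plasma sample with an area ratio of 0.95
    ```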

  3. Quantification of errors induced by temporal resolution on Lagrangian particles in an eddy-resolving model

    Science.gov (United States)

    Qin, Xuerong; van Sebille, Erik; Sen Gupta, Alexander

    2014-04-01

    Lagrangian particle tracking within ocean models is an important tool for the examination of ocean circulation, ventilation timescales and connectivity and is increasingly being used to understand ocean biogeochemistry. Lagrangian trajectories are obtained by advecting particles within velocity fields derived from hydrodynamic ocean models. For studies of ocean flows on scales ranging from mesoscale up to basin scales, the temporal resolution of the velocity fields should ideally not be more than a few days to capture the high frequency variability that is inherent in mesoscale features. However, in reality, the model output is often archived at much lower temporal resolutions. Here, we quantify the differences in the Lagrangian particle trajectories embedded in velocity fields of varying temporal resolution. Particles are advected from 3-day to 30-day averaged fields in a high-resolution global ocean circulation model. We also investigate whether adding lateral diffusion to the particle movement can compensate for the reduced temporal resolution. Trajectory errors reveal the expected degradation of accuracy in the trajectory positions when decreasing the temporal resolution of the velocity field. Divergence timescales associated with averaging velocity fields up to 30 days are faster than the intrinsic dispersion of the velocity fields but slower than the dispersion caused by the interannual variability of the velocity fields. In experiments focusing on the connectivity along major currents, including western boundary currents, the volume transport carried between two strategically placed sections tends to increase with increased temporal averaging. Simultaneously, the average travel times tend to decrease. Based on these two bulk measured diagnostics, Lagrangian experiments that use temporal averaging of up to nine days show no significant degradation in the flow characteristics for a set of six currents investigated in more detail. The addition of random
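
    Lagrangian trajectories are obtained by numerically integrating particle positions through the velocity field; degrading the temporal resolution of that field changes the trajectories, which is the effect the study quantifies. The sketch below shows a generic fourth-order Runge-Kutta advection step in an idealised, analytically defined 2D flow and compares it with a crude time-frozen stand-in for an averaged field; it illustrates the mechanism only and is not the authors' model setup.

    ```python
    # Minimal sketch: RK4 advection of one particle in a time-dependent 2D flow,
    # and the trajectory separation caused by replacing the flow with a
    # time-frozen field (a crude stand-in for a temporally averaged field).
    import numpy as np

    def velocity(x, y, t):
        # Idealised, analytically defined eddy field (not model output)
        return np.cos(y + 0.5 * np.sin(t)), np.sin(x - 0.5 * np.sin(t))

    def rk4_step(x, y, t, dt, vel):
        k1 = vel(x, y, t)
        k2 = vel(x + 0.5 * dt * k1[0], y + 0.5 * dt * k1[1], t + 0.5 * dt)
        k3 = vel(x + 0.5 * dt * k2[0], y + 0.5 * dt * k2[1], t + 0.5 * dt)
        k4 = vel(x + dt * k3[0], y + dt * k3[1], t + dt)
        return (x + dt * (k1[0] + 2*k2[0] + 2*k3[0] + k4[0]) / 6,
                y + dt * (k1[1] + 2*k2[1] + 2*k3[1] + k4[1]) / 6)

    def trajectory(vel, steps=500, dt=0.05):
        x, y = 0.1, 0.2
        for i in range(steps):
            x, y = rk4_step(x, y, i * dt, dt, vel)
        return x, y

    frozen = lambda x, y, t: velocity(x, y, 0.0)   # degraded temporal resolution
    xa, ya = trajectory(velocity)
    xb, yb = trajectory(frozen)
    print("trajectory separation:", np.hypot(xa - xb, ya - yb))
    ```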

  4. Visual quantification of diffuse emphysema with Sakai's method and high-resolution chest CT

    International Nuclear Information System (INIS)

    Feuerstein, I.M.; McElvaney, N.G.; Simon, T.R.; Hubbard, R.C.; Crystal, R.G.

    1990-01-01

    This paper determines the accuracy and efficacy of visual quantitation for a diffuse form of pulmonary emphysema with high-resolution CT (HRCT). Twenty-five adult patients with symptomatic emphysema due to α1-antitrypsin deficiency prospectively underwent HRCT with 1.5-mm sections, a high-spatial-resolution algorithm, and targeted reconstruction. Photography was performed with narrow lung windows to accentuate diffuse emphysema. Emphysema was then scored using a modification of Sakai's extent and severity scoring method. The scans were all scored by the same blinded observer. Pulmonary function testing (PFT), including diffusing capacity measurement, was performed in all patients. Results were statistically correlated with the use of regression analysis.

  5. High-resolution quantification of atmospheric CO2 mixing ratios in the Greater Toronto Area, Canada

    Science.gov (United States)

    Pugliese, Stephanie C.; Murphy, Jennifer G.; Vogel, Felix R.; Moran, Michael D.; Zhang, Junhua; Zheng, Qiong; Stroud, Craig A.; Ren, Shuzhan; Worthy, Douglas; Broquet, Gregoire

    2018-03-01

    Many stakeholders are seeking methods to reduce carbon dioxide (CO2) emissions in urban areas, but reliable, high-resolution inventories are required to guide these efforts. We present the development of a high-resolution CO2 inventory available for the Greater Toronto Area and surrounding region in Southern Ontario, Canada (area of ~2.8 × 10⁵ km², 26 % of the province of Ontario). The new SOCE (Southern Ontario CO2 Emissions) inventory is available at the 2.5 × 2.5 km spatial and hourly temporal resolution and characterizes emissions from seven sectors: area, residential natural-gas combustion, commercial natural-gas combustion, point, marine, on-road, and off-road. To assess the accuracy of the SOCE inventory, we developed an observation-model framework using the GEM-MACH chemistry-transport model run on a high-resolution grid with 2.5 km grid spacing coupled to the Fossil Fuel Data Assimilation System (FFDAS) v2 inventories for anthropogenic CO2 emissions and the European Centre for Medium-Range Weather Forecasts (ECMWF) land carbon model C-TESSEL for biogenic fluxes. A run using FFDAS for the Southern Ontario region was compared to a run in which its emissions were replaced by the SOCE inventory. Simulated CO2 mixing ratios were compared against in situ measurements made at four sites in Southern Ontario - Downsview, Hanlan's Point, Egbert and Turkey Point - in 3 winter months, January-March 2016. Model simulations had better agreement with measurements when using the SOCE inventory emissions versus other inventories, quantified using a variety of statistics such as correlation coefficient, root-mean-square error, and mean bias. Furthermore, when run with the SOCE inventory, the model had improved ability to capture the typical diurnal pattern of CO2 mixing ratios, particularly at the Downsview, Hanlan's Point, and Egbert sites. In addition to improved model-measurement agreement, the SOCE inventory offers a sectoral breakdown of emissions

  6. High-resolution quantification of atmospheric CO2 mixing ratios in the Greater Toronto Area, Canada

    Directory of Open Access Journals (Sweden)

    S. C. Pugliese

    2018-03-01

    Many stakeholders are seeking methods to reduce carbon dioxide (CO2) emissions in urban areas, but reliable, high-resolution inventories are required to guide these efforts. We present the development of a high-resolution CO2 inventory available for the Greater Toronto Area and surrounding region in Southern Ontario, Canada (area of ~2.8 × 10⁵ km², 26 % of the province of Ontario). The new SOCE (Southern Ontario CO2 Emissions) inventory is available at the 2.5 × 2.5 km spatial and hourly temporal resolution and characterizes emissions from seven sectors: area, residential natural-gas combustion, commercial natural-gas combustion, point, marine, on-road, and off-road. To assess the accuracy of the SOCE inventory, we developed an observation–model framework using the GEM-MACH chemistry–transport model run on a high-resolution grid with 2.5 km grid spacing coupled to the Fossil Fuel Data Assimilation System (FFDAS) v2 inventories for anthropogenic CO2 emissions and the European Centre for Medium-Range Weather Forecasts (ECMWF) land carbon model C-TESSEL for biogenic fluxes. A run using FFDAS for the Southern Ontario region was compared to a run in which its emissions were replaced by the SOCE inventory. Simulated CO2 mixing ratios were compared against in situ measurements made at four sites in Southern Ontario – Downsview, Hanlan's Point, Egbert and Turkey Point – in 3 winter months, January–March 2016. Model simulations had better agreement with measurements when using the SOCE inventory emissions versus other inventories, quantified using a variety of statistics such as correlation coefficient, root-mean-square error, and mean bias. Furthermore, when run with the SOCE inventory, the model had improved ability to capture the typical diurnal pattern of CO2 mixing ratios, particularly at the Downsview, Hanlan's Point, and Egbert sites. In addition to improved model–measurement agreement, the SOCE inventory offers a
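
    The model-measurement agreement statistics named in both records (correlation coefficient, root-mean-square error, mean bias) are standard; the sketch below computes them for a pair of simulated and observed CO2 series. The numbers are placeholders, not SOCE or FFDAS output.

    ```python
    # Minimal sketch: model-measurement comparison statistics for simulated vs
    # observed CO2 mixing ratios. Arrays are illustrative placeholders.
    import numpy as np

    def comparison_stats(model, obs):
        model, obs = np.asarray(model, float), np.asarray(obs, float)
        r = np.corrcoef(model, obs)[0, 1]
        rmse = np.sqrt(np.mean((model - obs) ** 2))
        bias = np.mean(model - obs)
        return {"r": r, "rmse_ppm": rmse, "mean_bias_ppm": bias}

    obs = [412.1, 418.5, 425.0, 430.2, 421.7, 415.3]      # ppm, illustrative
    model = [410.8, 419.9, 423.1, 433.0, 420.2, 416.0]
    print(comparison_stats(model, obs))
    ```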

  7. Quantification of tidal inlet morphodynamics using high-resolution MBES and LiDAR

    DEFF Research Database (Denmark)

    Ernstsen, Verner Brandbyge; Lefebvre, Alice; Fraccascia, Serena

    -bathymetric surveys using high-resolution red and green Light Detection And Ranging (LiDAR). Detailed digital elevation models with a grid cell size of 1 m x 1 m were generated and analysed geomorphometrically. The analyses reveal a main ebb-directed net sand transport in the main channel; however, due...... to the geometry of the main channel, displaying a confluent meander bend, confined areas in the main channel are characterised by an opposite-directed net sand transport. In the inter-tidal areas the main net sand transport is flood-directed. However, also here the analyses reveal the existence of oblique second...... is transported from the inlet channel to the intertidal flat. Therefore, in addition to the typical main sand transport directions with net export in the inlet channel and net import over the adjacent inter-tidal flats, these investigations suggest an exchange and possible recirculation of sand between the inlet...

  8. The Hestia Project: High Spatial Resolution Fossil Fuel Carbon Dioxide Emissions Quantification at Hourly Scale in Indianapolis, USA

    Science.gov (United States)

    Zhou, Y.; Gurney, K. R.

    2009-12-01

    In order to advance the scientific understanding of carbon exchange with the land surface and contribute to sound, quantitatively-based U.S. climate change policy, quantification of greenhouse gas emissions drivers at fine spatial and temporal scales is essential. Quantification of fossil fuel CO2 emissions, the primary greenhouse gas, has become a key component of cost-effective CO2 emissions mitigation options and a carbon trading system. Called the ‘Hestia Project’, this pilot study generated CO2 emissions estimates at high spatial resolution and hourly scale for the greater Indianapolis region in the USA through the use of air quality and traffic monitoring data, remote sensing, GIS, and building energy modeling. The CO2 emissions were constructed from three data source categories: area, point, and mobile. For the area source emissions, we developed an energy consumption model using DOE/EIA survey data on building characteristics and energy consumption. With the Vulcan Project’s county-level CO2 emissions and simulated building energy consumption, we quantified the CO2 emissions for each individual building by allocating Vulcan emissions to roughly 50,000 structures in Indianapolis. The temporal pattern of CO2 emissions in each individual building was developed based on temporal patterns of energy consumption. The point source emissions were derived from the EPA National Emissions Inventory data and effluent monitoring of electricity-producing facilities. The mobile source CO2 emissions were estimated at the month/county scale using the Mobile6 combustion model and the National Mobile Inventory Model database. The month/county scale mobile source CO2 emissions were downscaled to the “native” spatial resolution of road segments every hour using a GIS road atlas and traffic monitoring data. The result is shown in Figure 1. The resulting urban-scale inventory can serve as a baseline of current CO2 emissions and should be of immediate use to

  9. Quantification of the Arrhythmogenic Effects of Spontaneous Atrial Extrasystole Using High-Resolution Epicardial Mapping.

    Science.gov (United States)

    Teuwen, Christophe P; Kik, Charles; van der Does, Lisette J M E; Lanters, Eva A H; Knops, Paul; Mouws, Elisabeth M J P; Bogers, Ad J J C; de Groot, Natasja M S

    2018-01-01

    Atrial extrasystoles (AES) can initiate atrial fibrillation. However, the impact of spontaneous AES on intra-atrial conduction is unknown. The aims of this study were to examine conduction disorders provoked by AES and to correlate these conduction differences with patient characteristics, mapping locations, and type of AES. High-resolution epicardial mapping (electrodes N=128 or N=192; interelectrode distance, 2 mm) of the entire atrial surface was performed in patients (N=164; 69.5% male; age 67.2±10.5 years) undergoing open-chest cardiac surgery. AES were classified as premature, aberrant, or prematurely aberrant. Conduction delay and conduction block were quantified during sinus rhythm and AES and subsequently compared. Median incidence of conduction delay and conduction block during sinus rhythm was 1.2% (interquartile, 0%-2.3%) and 0.4% (interquartile, 0%-2.1%). In comparison, the median incidence of conduction delay and conduction block during 339 AES was respectively 2.8% (interquartile, 1.3%-4.6%) and 2.2% (interquartile, 0.3%-5.1%) and differed between the types of AES (prematurely aberrant > aberrant > premature). The degree of prematurity was not associated with a higher incidence of conduction disorders (P > 0.05). In contrast, a higher degree of aberrancy was associated with a higher incidence of conduction disorders; AES emerging as epicardial breakthrough provoked most conduction disorders (P ≥ 0.002). AES caused most conduction disorders in patients with diabetes mellitus and left atrial dilatation (P < 0.05). Intraoperative high-resolution epicardial mapping showed that conduction disorders are mainly provoked by prematurely aberrant AES, particularly in patients with left atrial dilation and diabetes mellitus or emerging as epicardial breakthrough. © 2017 American Heart Association, Inc.

  10. Characterizing, measuring, and utilizing the resolution of CT imagery for improved quantification of fine-scale features

    Energy Technology Data Exchange (ETDEWEB)

    Ketcham, Richard A., E-mail: ketcham@jsg.utexas.edu; Hildebrandt, Jordan

    2014-04-01

    Quantitative results extracted from computed tomographic (CT) data sets should be the same across resolutions and between different instruments and laboratory groups. Despite the proliferation of scanners and data processing methods and tools, and scientific studies utilizing them, relatively little emphasis has been given to ensuring that these results are comparable or reproducible. This issue is particularly pertinent when the features being imaged and measured are of the same order of size as the data voxels, as is often the case with fracture apertures, pore throats, and cell walls. We have created a tool that facilitates quantification of the spatial resolution of CT data via its point-spread function (PSF), in which the user draws a traverse across a sharp interface between two materials and a Gaussian PSF is fitted to the blurring across that interface. Geometric corrections account for voxel shape and the angle of the traverse to the interface, which does not need to be orthogonal. We use the tool to investigate a series of grid phantoms scanned at varying conditions and observe how the PSF varies within and between slices. The PSF increases with increasing radial distance within slices, and can increase tangentially with increasing radial distance in CT data sets acquired with relatively few projections. The PSF between CT slices is similar to that within slices when a 2-D detector is used, but is much sharper when the data are acquired one slice at a time with a collimated linear detector array. The capability described here can be used not only to calibrate processing algorithms that use deconvolution operations, but it can also help evaluate scans on a routine basis within and between CT research groups, and with respect to the features within the imagery that are being measured.
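
    Fitting a Gaussian PSF to the blurring across a sharp two-material interface, as described above, is equivalent to fitting an error-function edge profile. The sketch below illustrates that fit on a synthetic traverse; it assumes SciPy is available and is not the authors' tool.

    ```python
    # Minimal sketch: estimating a Gaussian PSF from a CT line profile across a
    # sharp interface by fitting an error function (the Gaussian step response).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erf

    def edge_model(x, low, high, x0, sigma):
        return low + 0.5 * (high - low) * (1.0 + erf((x - x0) / (sigma * np.sqrt(2))))

    # Synthetic traverse: 0.1 mm voxels, true PSF sigma = 0.25 mm, plus noise
    x = np.arange(0, 6, 0.1)
    truth = edge_model(x, 1000.0, 3000.0, 3.0, 0.25)
    profile = truth + np.random.default_rng(2).normal(scale=20.0, size=x.size)

    params, _ = curve_fit(edge_model, x, profile, p0=[900, 3100, 2.8, 0.2])
    print("fitted PSF sigma (mm):", round(params[3], 3))
    ```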

  11. StatSTEM: An efficient approach for accurate and precise model-based quantification of atomic resolution electron microscopy images

    Energy Technology Data Exchange (ETDEWEB)

    De Backer, A.; Bos, K.H.W. van den [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Van den Broek, W. [AG Strukturforschung/Elektronenmikroskopie, Institut für Physik, Humboldt-Universität zu Berlin, Newtonstraße 15, 12489 Berlin (Germany); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Van Aert, S., E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium)

    2016-12-15

    An efficient model-based estimation algorithm is introduced to quantify the atomic column positions and intensities from atomic resolution (scanning) transmission electron microscopy ((S)TEM) images. This algorithm uses the least squares estimator on image segments containing individual columns, fully accounting for overlap between neighbouring columns and enabling the analysis of a large field of view. For this algorithm, the accuracy and precision with which the atomic column positions and scattering cross-sections can be estimated from annular dark field (ADF) STEM images have been investigated. The highest attainable precision is reached even for low dose images. Furthermore, the advantages of the model-based approach taking into account overlap between neighbouring columns are highlighted. This is done for the estimation of the distance between two neighbouring columns as a function of their distance and for the estimation of the scattering cross-section, which is compared to the integrated intensity from a Voronoi cell. To provide end users with this well-established quantification method, a user-friendly program, StatSTEM, has been developed, which is freely available under a GNU public license. - Highlights: • An efficient model-based method for quantitative electron microscopy is introduced. • Images are modelled as a superposition of 2D Gaussian peaks. • Overlap between neighbouring columns is taken into account. • Structure parameters can be obtained with the highest precision and accuracy. • StatSTEM, a user-friendly program (GNU public license), is developed.
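
    The core of the model-based approach described above is least-squares fitting of Gaussian peaks to image segments around individual columns. The sketch below fits a single 2D Gaussian to a synthetic column as a minimal illustration; it is not StatSTEM and ignores overlap between neighbouring columns.

    ```python
    # Minimal sketch (not StatSTEM): least-squares fit of one 2D Gaussian to an
    # image segment around a single atomic column, recovering its position.
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_2d(coords, amplitude, x0, y0, sigma, offset):
        x, y = coords
        return (offset + amplitude *
                np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))).ravel()

    # Synthetic 15x15 pixel segment containing one column plus Poisson noise
    yy, xx = np.mgrid[0:15, 0:15]
    truth = gaussian_2d((xx, yy), 200.0, 7.3, 6.8, 1.6, 10.0).reshape(15, 15)
    img = np.random.default_rng(3).poisson(truth).astype(float)

    popt, _ = curve_fit(gaussian_2d, (xx, yy), img.ravel(),
                        p0=[150.0, 7.0, 7.0, 2.0, 5.0])
    print("column position (x, y):", round(popt[1], 2), round(popt[2], 2))
    ```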

  12. Impact of attenuation correction strategies on the quantification of High Resolution Research Tomograph PET studies

    International Nuclear Information System (INIS)

    Velden, Floris H P van; Kloet, Reina W; Berckel, Bart N M van; Molthoff, Carla F M; Jong, Hugo W A M de; Lammertsma, Adriaan A; Boellaard, Ronald

    2008-01-01

    In this study, the quantitative accuracy of different attenuation correction strategies presently available for the High Resolution Research Tomograph (HRRT) was investigated. These attenuation correction methods differ in the reconstruction and processing (segmentation) algorithms used for generating a μ-image from measured 2D transmission scans, an intermediate step in the generation of 3D attenuation correction factors. Available methods are maximum-a-posteriori reconstruction (MAP-TR), unweighted OSEM (UW-OSEM) and NEC-TR, which transforms sinogram values back to their noise equivalent counts (NEC) to restore the Poisson distribution. All methods can be applied with or without μ-image segmentation; MAP-TR, however, uses a μ-histogram as a prior during reconstruction. All possible strategies were evaluated using phantoms of various sizes, simulating preclinical and clinical situations. Furthermore, effects of emission contamination of the transmission scan on the accuracy of the various attenuation correction strategies were studied. Finally, the accuracy of the various attenuation correction strategies and their relative impact on the reconstructed activity concentration (AC) were evaluated using small animal and human brain studies. For small structures, MAP-TR with human brain priors showed smaller differences in μ-values for transmission scans with and without emission contamination (<8%) than the other methods (<26%). In addition, it showed the best agreement with the true AC (deviation <4.5%). A specific prior designed to take into account the presence of small animal fixation devices only very slightly improved AC precision to 4.3%. All methods scaled μ-values of a large homogeneous phantom to within 4% of the water peak, but MAP-TR provided the most accurate AC after reconstruction. However, for clinical data MAP-TR using the default prior settings overestimated the thickness of the skull, resulting in overestimations of μ-values in regions near the skull and thus in incorrect AC.

  13. Evaluation of global fine-resolution precipitation products and their uncertainty quantification in ensemble discharge simulations

    Science.gov (United States)

    Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.

    2016-02-01

    The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and that GSMAP-MVK+ clearly outperforms TRMM3B42 in terms of relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute to discharge uncertainty with a magnitude similar to that of the hydrological models themselves.
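    For readers unfamiliar with the skill scores listed above, a minimal sketch of how relative bias, NSE, RMSE and the correlation coefficient are commonly computed follows; the exact formulations used in the study may differ.

```python
# Common definitions of the evaluation metrics named in the abstract; these
# are standard textbook formulations, not necessarily the study's exact ones.
import numpy as np

def skill_scores(sim, obs):
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    rb = (sim.sum() - obs.sum()) / obs.sum()                                  # relative bias
    nse = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)    # Nash-Sutcliffe
    rmse = np.sqrt(np.mean((sim - obs) ** 2))                                 # root mean square error
    cc = np.corrcoef(sim, obs)[0, 1]                                          # correlation coefficient
    return {"RB": rb, "NSE": nse, "RMSE": rmse, "CC": cc}

print(skill_scores(sim=[1.1, 2.0, 2.9, 4.2], obs=[1.0, 2.0, 3.0, 4.0]))
```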

  14. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    Science.gov (United States)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and the novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to the FTMS- and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization that uses only 8 μL of plasma, a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  15. Comparison of high-resolution ultrasonic resonator technology and Raman spectroscopy as novel process analytical tools for drug quantification in self-emulsifying drug delivery systems.

    Science.gov (United States)

    Stillhart, Cordula; Kuentz, Martin

    2012-02-05

    Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules that were filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while the LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Liquid chromatography with diode array detection and multivariate curve resolution for the selective and sensitive quantification of estrogens in natural waters.

    Science.gov (United States)

    Pérez, Rocío L; Escandar, Graciela M

    2014-07-04

    Following the green analytical chemistry principles, an efficient strategy involving second-order data provided by liquid chromatography (LC) with diode array detection (DAD) was applied for the simultaneous determination of estriol, 17β-estradiol, 17α-ethinylestradiol and estrone in natural water samples. After a simple pre-concentration step, LC-DAD matrix data were rapidly obtained (in less than 5 min) with a chromatographic system operating isocratically. Applying a second-order calibration algorithm based on multivariate curve resolution with alternating least-squares (MCR-ALS), successful resolution was achieved in the presence of sample constituents that strongly coelute with the analytes. The flexibility of this multivariate model allowed the quantification of the four estrogens in tap, mineral, underground and river water samples. Limits of detection in the range between 3 and 13 ng L(-1), and relative prediction errors from 2 to 11% were achieved. Copyright © 2014 Elsevier B.V. All rights reserved.
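    The core of MCR-ALS is an alternating least-squares decomposition of the data matrix into concentration and spectral profiles under constraints such as non-negativity. The sketch below shows that bare-bones idea on synthetic LC-DAD-like data; real MCR-ALS implementations add further constraints and better initial estimates, and the variable names here are illustrative.

```python
# Bare-bones MCR-ALS illustration: alternately solve D ~= C @ S.T for the
# concentration (C) and spectral (S) profiles, with non-negativity enforced by
# clipping. Closure, unimodality, purest-variable initialisation, etc. used by
# full MCR-ALS toolboxes are omitted.
import numpy as np

def mcr_als(D, n_components, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    S = rng.random((D.shape[1], n_components))                 # initial spectra guess
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S.T), 0.0, None)        # LS solve for C given S
        S = np.clip(D.T @ np.linalg.pinv(C.T), 0.0, None)      # LS solve for S given C
        S /= np.linalg.norm(S, axis=0, keepdims=True) + 1e-12  # fix the scale ambiguity
    return C, S

# Synthetic data: two coeluting peaks with overlapping spectra
t = np.linspace(0.0, 1.0, 60)[:, None]
C_true = np.hstack([np.exp(-((t - 0.40) / 0.08) ** 2), np.exp(-((t - 0.55) / 0.08) ** 2)])
wl = np.linspace(0.0, 1.0, 80)
S_true = np.vstack([np.exp(-((wl - 0.3) / 0.1) ** 2), np.exp(-((wl - 0.5) / 0.1) ** 2)]).T
D = C_true @ S_true.T
C_hat, S_hat = mcr_als(D, n_components=2)
print(C_hat.shape, S_hat.shape)   # (60, 2) (80, 2)
```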

  17. Direct nuclear magnetic resonance identification and quantification of geometric isomers of conjugated linoleic acid in milk lipid fraction without derivatization steps: Overcoming sensitivity and resolution barriers

    International Nuclear Information System (INIS)

    Tsiafoulis, Constantinos G.; Skarlas, Theodore; Tzamaloukas, Ouranios; Miltiadou, Despoina; Gerothanassis, Ioannis P.

    2014-01-01

    Highlights: • The first NMR quantification of four geometric 18:2 CLA isomers has been achieved. • Sensitivity and resolution NMR barriers have been overcome. • Selective suppression and a reduced 13C spectral width have been utilized. • The method is applied to the milk lipid fraction without derivatization steps. • The method is selective and sensitive, with very good analytical characteristics. - Abstract: We report the first successful direct and unequivocal identification and quantification of the four minor geometric (9-cis, 11-trans) 18:2, (9-trans, 11-cis) 18:2, (9-cis, 11-cis) 18:2 and (9-trans, 11-trans) 18:2 conjugated linoleic acid (CLA) isomers in lipid fractions of lyophilized milk samples with the combined use of 1D 1H NMR, 2D 1H-1H TOCSY and 2D 1H-13C HSQC NMR. The significant sensitivity barrier has been overcome by selective suppression of the major resonances, the equilibrium magnetization of the -(CH2)n- 1H spins being over 10^4 times greater than that of the 1H spins of the conjugated bonds of the CLA isomers. The resolution barrier has been overcome by using a reduced 13C spectral width in the 2D 1H-13C HSQC experiment. The assignment was confirmed by spiking experiments with CLA standard compounds, and the method does not require any derivatization steps for the lipid fraction. The proposed method is selective and sensitive, and compares favorably with the GC-MS method of analysis.

  18. Improved algorithm for computerized detection and quantification of pulmonary emphysema at high-resolution computed tomography (HRCT)

    Science.gov (United States)

    Tylen, Ulf; Friman, Ola; Borga, Magnus; Angelhed, Jan-Erik

    2001-05-01

    Emphysema is characterized by destruction of lung tissue with development of small or large holes within the lung. These areas have Hounsfield unit (HU) values approaching -1000. It is possible to detect and quantify such areas using a simple density mask technique. The edge enhancement reconstruction algorithm, gravity, and motion of the heart and vessels during scanning cause artefacts, however. The purpose of our work was to construct an algorithm that detects such image artefacts and corrects for them. The first step is to apply inverse filtering to the image, removing much of the effect of the edge enhancement reconstruction algorithm. The next step is computation of the antero-posterior density gradient caused by gravity, and correction for it. Motion artefacts are corrected for in a third step by use of normalized averaging, thresholding and region growing. Twenty volunteers were investigated, 10 with slight emphysema and 10 without. Using the simple density mask technique it was not possible to separate persons with disease from those without. Our algorithm improved the separation of the two groups considerably. The algorithm needs further refinement, but may form a basis for further development of methods for computerized diagnosis and quantification of emphysema by HRCT.
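    The density mask the authors build on is simply the fraction of lung voxels below a fixed low-density threshold. A minimal sketch follows; the threshold of -950 HU and the toy data are assumptions, and the artefact corrections described in the abstract (inverse filtering, gravity-gradient removal, normalized averaging) are not reproduced.

```python
# Simple density-mask sketch: percentage of lung voxels at or below an
# emphysema threshold (assumed -950 HU here). The abstract's correction steps
# are deliberately omitted.
import numpy as np

def emphysema_index(hu_volume, lung_mask, threshold_hu=-950):
    """Percentage of lung-mask voxels at or below the density threshold."""
    return 100.0 * np.mean(hu_volume[lung_mask] <= threshold_hu)

# Toy volume: ~10% of "lung" voxels near -990 HU, the rest around -850 HU.
rng = np.random.default_rng(1)
vol = rng.normal(-850.0, 30.0, size=(20, 64, 64))
vol[:2] = rng.normal(-990.0, 10.0, size=(2, 64, 64))
mask = np.ones(vol.shape, dtype=bool)
print(f"emphysema index: {emphysema_index(vol, mask):.1f}%")
```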

  19. Quantification of the fluorine containing drug 5-fluorouracil in cancer cells by GaF molecular absorption via high-resolution continuum source molecular absorption spectrometry

    International Nuclear Information System (INIS)

    Krüger, Magnus; Huang, Mao-Dong; Becker-Roß, Helmut; Florek, Stefan; Ott, Ingo; Gust, Ronald

    2012-01-01

    The development of high-resolution continuum source molecular absorption spectrometry has made the quantification of fluorine feasible by measuring its molecular absorption as gallium monofluoride (GaF). Using this new technique, we developed a graphite furnace method, with 5-fluorouracil (5-FU) as the example compound, to quantify fluorine in organic molecules. The effect of 5-FU on the generation of the diatomic GaF molecule was investigated. The experimental conditions, such as the gallium nitrate amount, temperature program, interfering anions (represented as the corresponding acids) and calibration for the determination of 5-FU in standard solution and in cellular matrix samples, were investigated and optimized. The sample matrix showed no effect on the sensitivity of the GaF molecular absorption. A simple calibration curve using an inorganic sodium fluoride solution can conveniently be used for the calibration. The described method is sensitive, and the achievable limit of detection is 0.23 ng of 5-FU. In order to establish the concept of “fluorine as a probe in medicinal chemistry”, an exemplary application was selected, in which the developed method was successfully demonstrated by performing cellular uptake studies of 5-FU in human colon carcinoma cells. - Highlights: ► Development of HR-CS MAS for quantification of fluorine bound to organic molecules ► Measurement as the molecular absorption of gallium monofluoride ► Quantification of organic-bound fluorine in biological material ► The concept of “fluorine as a probe in medicinal chemistry” could be established

  20. Quantification of the fluorine containing drug 5-fluorouracil in cancer cells by GaF molecular absorption via high-resolution continuum source molecular absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Krueger, Magnus [Freie Universitaet Berlin, Institut fuer Pharmazie, Pharmazeutische Chemie, Koenigin-Luise-Str. 2-4, 14195 Berlin (Germany); Huang, Mao-Dong; Becker-Ross, Helmut; Florek, Stefan [Leibniz Institut fuer Analytische Wissenschaften, ISAS-e.V., Department Berlin, Albert-Einstein-Str. 9, 12489 Berlin (Germany); Ott, Ingo [Technische Universitaet Carolo Wilhelmina zu Braunschweig, Institut fuer Medizinische und Pharmazeutische Chemie, Beethovenstr. 55, 38106 Braunschweig (Germany); Gust, Ronald, E-mail: ronald.gust@uibk.ac.at [Universitaet Innsbruck, Institut fuer Pharmazie, Pharmazeutische Chemie, Innrain 80/82, 6020 Innsbruck (Austria)

    2012-03-15

    The development of high-resolution continuum source molecular absorption spectrometry has made the quantification of fluorine feasible by measuring its molecular absorption as gallium monofluoride (GaF). Using this new technique, we developed a graphite furnace method, with 5-fluorouracil (5-FU) as the example compound, to quantify fluorine in organic molecules. The effect of 5-FU on the generation of the diatomic GaF molecule was investigated. The experimental conditions, such as the gallium nitrate amount, temperature program, interfering anions (represented as the corresponding acids) and calibration for the determination of 5-FU in standard solution and in cellular matrix samples, were investigated and optimized. The sample matrix showed no effect on the sensitivity of the GaF molecular absorption. A simple calibration curve using an inorganic sodium fluoride solution can conveniently be used for the calibration. The described method is sensitive, and the achievable limit of detection is 0.23 ng of 5-FU. In order to establish the concept of 'fluorine as a probe in medicinal chemistry', an exemplary application was selected, in which the developed method was successfully demonstrated by performing cellular uptake studies of 5-FU in human colon carcinoma cells. - Highlights: ► Development of HR-CS MAS for quantification of fluorine bound to organic molecules ► Measurement as the molecular absorption of gallium monofluoride ► Quantification of organic-bound fluorine in biological material ► The concept of 'fluorine as a probe in medicinal chemistry' could be established.

  1. Identification and quantification of fumonisin A1, A2, and A3 in corn by high-resolution liquid chromatography-orbitrap mass spectrometry.

    Science.gov (United States)

    Tamura, Masayoshi; Mochizuki, Naoki; Nagatomi, Yasushi; Harayama, Koichi; Toriba, Akira; Hayakawa, Kazuichi

    2015-02-16

    Three compounds, hypothesized to be fumonisin A1 (FA1), fumonisin A2 (FA2), and fumonisin A3 (FA3), were detected in a corn sample contaminated with mycotoxins by high-resolution liquid chromatography-Orbitrap mass spectrometry (LC-Orbitrap MS). One of them had previously been identified as FA1, synthesized by the acetylation of fumonisin B1 (FB1), and a method for its quantification had been established. Herein, we identified the two remaining compounds as FA2 and FA3, which are the acetylated forms of fumonisin B2 (FB2) and fumonisin B3 (FB3), respectively. Moreover, we examined a method for the simultaneous analysis of FA1, FA2, FA3, FB1, FB2, and FB3. The corn samples were prepared by extraction using a QuEChERS kit and purification using a multifunctional cartridge. The linearity, recovery, repeatability, limit of detection, and limit of quantification of the method were >0.99, 82.9%-104.6%, 3.7%-9.5%, 0.02-0.60 μg/kg, and 0.05-1.98 μg/kg, respectively. The simultaneous analysis of the six fumonisins revealed that FA1, FA2, and FA3 were present in all corn samples contaminated with FB1, FB2, and FB3. The results suggest that corn marketed for consumption can be considered to be contaminated with both the fumonisin B-series and the fumonisin A-series. This report presents the first identification and quantification of FA1, FA2, and FA3 in corn samples.

  2. Identification and accurate quantification of structurally related peptide impurities in synthetic human C-peptide by liquid chromatography-high resolution mass spectrometry.

    Science.gov (United States)

    Li, Ming; Josephs, Ralf D; Daireaux, Adeline; Choteau, Tiphaine; Westwood, Steven; Wielgosz, Robert I; Li, Hongmei

    2018-06-04

    Peptides are an increasingly important group of biomarkers and pharmaceuticals. The accurate purity characterization of peptide calibrators is critical for the development of reference measurement systems for laboratory medicine and quality control of pharmaceuticals. The peptides used for these purposes are increasingly produced through peptide synthesis. Various approaches (for example mass balance, amino acid analysis, qNMR, and nitrogen determination) can be applied to accurately value assign the purity of peptide calibrators. However, all purity assessment approaches require a correction for structurally related peptide impurities in order to avoid biases. Liquid chromatography coupled to high resolution mass spectrometry (LC-hrMS) has become the key technique for the identification and accurate quantification of structurally related peptide impurities in intact peptide calibrator materials. In this study, LC-hrMS-based methods were developed and validated in-house for the identification and quantification of structurally related peptide impurities in a synthetic human C-peptide (hCP) material, which served as a study material for an international comparison looking at the competencies of laboratories to perform peptide purity mass fraction assignments. More than 65 impurities were identified, confirmed, and accurately quantified by using LC-hrMS. The total mass fraction of all structurally related peptide impurities in the hCP study material was estimated to be 83.3 mg/g with an associated expanded uncertainty of 3.0 mg/g (k = 2). The calibration hierarchy concept used for the quantification of individual impurities is described in detail.

  3. Advancing the quantification of humid tropical forest cover loss with multi-resolution optical remote sensing data: Sampling & wall-to-wall mapping

    Science.gov (United States)

    Broich, Mark

    Humid tropical forest cover loss is threatening the sustainability of ecosystem goods and services as vast forest areas are rapidly cleared for industrial scale agriculture and tree plantations. Despite the importance of humid tropical forest in the provision of ecosystem services and economic development opportunities, the spatial and temporal distribution of forest cover loss across large areas is not well quantified. Here I improve the quantification of humid tropical forest cover loss using two remote sensing-based methods: sampling and wall-to-wall mapping. In all of the presented studies, the integration of coarse spatial, high temporal resolution data with moderate spatial, low temporal resolution data enables advances in quantifying forest cover loss in the humid tropics. Imagery from the Moderate Resolution Imaging Spectroradiometer (MODIS) is used as the source of coarse spatial resolution, high temporal resolution data, and imagery from the Landsat Enhanced Thematic Mapper Plus (ETM+) sensor is used as the source of moderate spatial, low temporal resolution data. In a first study, I compare the precision of different sampling designs for the Brazilian Amazon using the annual deforestation maps derived by the Brazilian Space Agency for reference. I show that sampling designs can provide reliable deforestation estimates; furthermore, sampling designs guided by MODIS data can provide more efficient estimates than the systematic design used for the United Nations Food and Agriculture Organization Forest Resource Assessment 2010. Sampling approaches, such as the one demonstrated, are viable in regions where data limitations, such as cloud contamination, limit exhaustive mapping methods. Cloud-contaminated regions experiencing high rates of change include Insular Southeast Asia, specifically Indonesia and Malaysia. Due to persistent cloud cover, forest cover loss in Indonesia has only been mapped at a 5-10 year interval using photo interpretation of single

  4. Improved Precision and Accuracy of Quantification of Rare Earth Element Abundances via Medium-Resolution LA-ICP-MS.

    Science.gov (United States)

    Funderburg, Rebecca; Arevalo, Ricardo; Locmelis, Marek; Adachi, Tomoko

    2017-11-01

    Laser ablation ICP-MS enables streamlined, high-sensitivity measurements of rare earth element (REE) abundances in geological materials. However, many REE isotope mass stations are plagued by isobaric interferences, particularly from diatomic oxides and argides. In this study, we compare REE abundances quantitated from mass spectra collected with low-resolution (m/Δm = 300 at 5% peak height) and medium-resolution (m/Δm = 2500) mass discrimination. A wide array of geological samples was analyzed, including USGS and NIST glasses ranging from mafic to felsic in composition, with NIST 610 employed as the bracketing calibrating reference material. The medium-resolution REE analyses are shown to be significantly more accurate and precise (at the 95% confidence level) than low-resolution analyses, particularly in samples characterized by low (ICP-MS methods, particularly those relying on mass analyzers that do not offer tuneable mass-resolution and/or collision cell technologies that can reduce oxide and/or argide formation.

  5. Quantification of endogenous and exogenous protein expressions of Na,K-ATPase with super-resolution PALM/STORM imaging.

    Science.gov (United States)

    Bernhem, Kristoffer; Blom, Hans; Brismar, Hjalmar

    2018-01-01

    Transient transfection of fluorescent fusion proteins is a key enabling technology in fluorescence microscopy for spatio-temporally mapping cellular protein distributions. Transient transfection of proteins may, however, bypass normal regulation of expression, leading to overexpression artefacts such as mislocalization and excess amounts. In this study we investigate the use of STORM and PALM microscopy to quantitatively monitor endogenous and exogenous protein expression. Through incorporation of an N-terminal hemagglutinin epitope into an mMaple3-fused Na,K-ATPase (α1 isoform), we analyze the spatial and quantitative changes of plasma membrane Na,K-ATPase localization during competitive transient expression. Quantification of plasma membrane protein density revealed a time-dependent increase of Na,K-ATPase, but no increase in the size of protein clusters. Results show that after 41 h of transfection, the total plasma membrane density of Na,K-ATPase increased by 63% while the endogenous contribution was reduced by 16%.

  6. Identification and quantification of the main organic components of vinegars by high resolution 1H NMR spectroscopy

    International Nuclear Information System (INIS)

    Caligiani, A.; Acquotti, D.; Palla, G.; Bocchi, V.

    2007-01-01

    A detailed analysis of the proton high-field NMR spectra of vinegars (in particular of Italian balsamic vinegars) is reported. A large number of organic substances belonging to different classes, such as carbohydrates, alcohols, organic acids, volatile compounds and amino acids, were assigned. The possibility of quantification of the substances identified in the whole vinegar sample, without extraction or pre-concentration steps, was also tested. The validity of the data was demonstrated in terms of precision, accuracy, repeatability and inter-day reproducibility. The effects of the most critical experimental parameters (sample concentration, water suppression and relaxation time) on the analytical response were also discussed. The 1H NMR results were compared with those obtained by traditional techniques (GC-MS, titrations), and good correlations were obtained. The results showed that 1H NMR with water suppression allows a rapid, simultaneous determination of carbohydrates (glucose and fructose), organic acids (acetic, formic, lactic, malic, citric, succinic and tartaric acids), alcohols and polyols (ethanol, acetoin, 2,3-butanediol, hydroxymethylfurfural), and volatile substances (ethyl acetate) in vinegar samples. In contrast, amino acid determination without sample pre-concentration proved problematic. The 1H NMR method proposed was applied to different samples of vinegars, allowing, in particular, the discrimination of vinegars and balsamic vinegars

  7. Methyl jasmonate-induced emission of biogenic volatiles is biphasic in cucumber: a high-resolution analysis of dose dependence.

    Science.gov (United States)

    Jiang, Yifan; Ye, Jiayan; Li, Shuai; Niinemets, Ülo

    2017-07-20

    Methyl jasmonate (MeJA) is a key airborne elicitor activating jasmonate-dependent signaling pathways, including induction of stress-related volatile emissions, but how the magnitude and timing of these emissions scale with MeJA dose is not known. Treatments with exogenous MeJA concentrations ranging from mild (0.2 mM) to lethal (50 mM) were used to investigate quantitative relationships among MeJA dose and the kinetics and magnitude of volatile release in Cucumis sativus by combining high-resolution measurements with a proton-transfer reaction time-of-flight mass spectrometer (PTR-TOF-MS) and GC-MS. The results highlighted biphasic kinetics of elicitation of volatiles. The early phase, peaking in 0.1-1 h after the MeJA treatment, was characterized by emissions of lipoxygenase (LOX) pathway volatiles and methanol. In the subsequent phase, starting in 6-12 h and reaching a maximum in 15-25 h after the treatment, secondary emissions of LOX compounds as well as emissions of monoterpenes and sesquiterpenes were elicited. For both phases, the maximum emission rates and total integrated emissions increased with applied MeJA concentration. Furthermore, the rates of induction and decay, and the duration of emission bursts were positively, and the timing of emission maxima were negatively associated with MeJA dose for LOX compounds and terpenoids, except for the duration of the first LOX burst. These results demonstrate major effects of MeJA dose on the kinetics and magnitude of volatile response, underscoring the importance of biotic stress severity in deciphering the downstream events of biological impacts. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  8. eMethylsorb: electrochemical quantification of DNA methylation at CpG resolution using DNA-gold affinity interactions.

    Science.gov (United States)

    Sina, Abu Ali Ibn; Howell, Sidney; Carrascosa, Laura G; Rauf, Sakandar; Shiddiky, Muhammad J A; Trau, Matt

    2014-11-07

    We report a simple electrochemical method referred to as "eMethylsorb" for the detection of DNA methylation. The method relies on the base dependent affinity interaction of DNA with gold. The methylation status of DNA is quantified by monitoring the electrochemical current as a function of the relative adsorption level of bisulphite treated DNA samples onto a bare gold electrode. This method can successfully distinguish methylated and unmethylated epigenotypes at single CpG resolution.

  9. Patient-specific quantification of image quality: An automated method for measuring spatial resolution in clinical CT images

    Energy Technology Data Exchange (ETDEWEB)

    Sanders, Jeremiah, E-mail: jeremiah.sanders@duke.edu [Medical Physics Graduate Program, Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Clinical Imaging Physics Group, Duke University, Durham, North Carolina 27710 (United States); Hurwitz, Lynne [Department of Radiology, Duke University, Durham, North Carolina 27710 (United States); Samei, Ehsan [Medical Physics Graduate Program, Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Clinical Imaging Physics Group, Duke University, Durham, North Carolina 27710 and Departments of Physics, Biomedical Engineering, Electrical and Computer Engineering, Duke University, Durham, North Carolina 27710 (United States)

    2016-10-15

    Purpose: To develop and validate an automated technique for evaluating the spatial resolution characteristics of clinical computed tomography (CT) images. Methods: Twenty-one chest and abdominopelvic clinical CT datasets were examined in this study. An algorithm was developed to extract a CT resolution index (RI), analogous to the modulation transfer function, from clinical CT images by measuring the edge-spread function (ESF) across the patient’s skin. A polygon mesh of the air-skin boundary was created. The faces of the mesh were then used to measure the ESF across the air-skin interface. The ESF was differentiated to obtain the line-spread function (LSF), and the LSF was Fourier transformed to obtain the RI. The algorithm’s ability to detect the radial dependence of the RI was investigated. RIs measured with the proposed method were compared with a conventional phantom-based method across two reconstruction algorithms (FBP and iterative) using the spatial frequency at 50% RI, f{sub 50}, as the metric for comparison. Three reconstruction kernels were investigated for each reconstruction algorithm. Finally, an observer study was conducted to determine if observers could visually perceive the differences in the measured blurriness of images reconstructed with a given reconstruction method. Results: RI measurements performed with the proposed technique exhibited the expected dependencies on the image reconstruction. The measured f{sub 50} values increased with harder kernels for both FBP and iterative reconstruction. Furthermore, the proposed algorithm was able to detect the radial dependence of the RI. Patient-specific measurements of the RI were comparable to the phantom-based technique, but the patient data exhibited a large spread in the measured f{sub 50}, indicating that some datasets were blurrier than others even when the projection data were reconstructed with the same reconstruction algorithm and kernel. Results from the observer study substantiated this finding.
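    The ESF-to-LSF-to-RI chain described in the Methods can be illustrated on a synthetic one-dimensional edge profile; this sketch assumes simple numerical differentiation and an FFT, and omits the skin-mesh extraction, windowing and averaging of the actual method.

```python
# Sketch of the ESF -> LSF -> resolution index (RI) chain on a synthetic edge.
# The 0.8 mm Gaussian blur and 0.2 mm sampling are illustrative assumptions.
import numpy as np
from scipy.special import erf

def resolution_index(esf, spacing_mm):
    """Differentiate the edge-spread function, then Fourier-transform the
    line-spread function; returns spatial frequencies (cycles/mm) and RI."""
    lsf = np.gradient(esf, spacing_mm)
    lsf = lsf / (lsf.sum() * spacing_mm)              # normalise LSF area to 1
    ri = np.abs(np.fft.rfft(lsf)) * spacing_mm        # continuous-FT scaling
    freqs = np.fft.rfftfreq(lsf.size, d=spacing_mm)
    return freqs, ri / ri[0]                          # so that RI(0) = 1

def f50(freqs, ri):
    """Spatial frequency where the RI first drops to 50%."""
    return freqs[np.argmax(ri <= 0.5)]

x = np.arange(-20.0, 20.0, 0.2)
esf = 0.5 * (1.0 + erf(x / (0.8 * np.sqrt(2.0))))     # edge blurred by 0.8 mm sigma
freqs, ri = resolution_index(esf, spacing_mm=0.2)
print(f"f50 ~ {f50(freqs, ri):.2f} cycles/mm")        # ~0.23 cycles/mm for this blur
```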

  10. Quantification of uncertainty associated with United States high resolution fossil fuel CO2 emissions: updates, challenges and future plans

    Science.gov (United States)

    Gurney, K. R.; Chandrasekaran, V.; Mendoza, D. L.; Geethakumar, S.

    2010-12-01

    The Vulcan Project has estimated United States fossil fuel CO2 emissions at the hourly time scale and at spatial scales below the county level for the year 2002. Vulcan is built from a wide variety of observational data streams including regulated air pollutant emissions reporting, traffic monitoring, energy statistics, and US census data. In addition to these data sets, Vulcan relies on a series of modeling assumptions and constructs to interpolate in space and time and to transform non-CO2 reporting into an estimate of CO2 combustion emissions. The recent version 2.0 of the Vulcan inventory has produced advances in a number of categories, with particular emphasis on improved temporal structure. Onroad transportation emissions now draw on roughly 5000 automated traffic count monitors, allowing for much improved diurnal and weekly time structure. Though the inventory shows excellent agreement with independent national-level CO2 emissions estimates, uncertainty quantification has been a challenging task given the large number of data sources and numerous modeling assumptions. However, we have now accomplished a complete uncertainty estimate across all the Vulcan economic sectors and will present uncertainty estimates as a function of space, time, sector and fuel. We find that, like the underlying distribution of CO2 emissions themselves, the uncertainty is also strongly lognormal, with high uncertainty associated with a relatively small number of locations. These locations typically are locations reliant upon coal combustion as the dominant CO2 source. We will also compare and contrast Vulcan fossil fuel CO2 emissions estimates against estimates built from DOE fuel-based surveys at the state level. We conclude that much of the difference between the Vulcan inventory and DOE statistics is not due to biased estimation but to mechanistic differences between supply and demand and in where and when combustion occurs.

  11. Structure Annotation and Quantification of Wheat Seed Oxidized Lipids by High-Resolution LC-MS/MS.

    Science.gov (United States)

    Riewe, David; Wiebach, Janine; Altmann, Thomas

    2017-10-01

    Lipid oxidation is a process ubiquitous in life, but the direct and comprehensive analysis of oxidized lipids has been limited by available analytical methods. We applied high-resolution liquid chromatography-mass spectrometry (LC-MS) and tandem mass spectrometry (MS/MS) to quantify oxidized lipids (glycerides, fatty acids, phospholipids, lysophospholipids, and galactolipids) and implemented a platform-independent, high-throughput-amenable analysis pipeline for the high-confidence annotation and acyl composition analysis of oxidized lipids. Lipid contents of 90 different naturally aged wheat (Triticum aestivum) seed stocks were quantified in an untargeted high-resolution LC-MS experiment, resulting in 18,556 quantitative mass-to-charge ratio features. In a post hoc liquid chromatography-tandem mass spectrometry experiment, high-resolution MS/MS spectra (5 mDa accuracy) were recorded for 8,957 out of 12,080 putatively monoisotopic features of the LC-MS data set. A total of 353 nonoxidized and 559 oxidized lipids with up to four additional oxygen atoms were annotated based on the accurate mass recordings (1.5 ppm tolerance) of the LC-MS data set and filtering procedures. MS/MS spectra available for 828 of these annotations were analyzed by translating experimentally known fragmentation rules of lipids into the fragmentation of oxidized lipids. This led to the identification of 259 nonoxidized and 365 oxidized lipids by both accurate mass and MS/MS spectra and to the determination of acyl compositions for 221 nonoxidized and 295 oxidized lipids. Analysis of 15-year aged wheat seeds revealed increased lipid oxidation and hydrolysis in seeds stored in ambient versus cold conditions. © 2017 The author(s). All Rights Reserved.
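    The accurate-mass annotation step (matching measured m/z values to candidate lipid formulas within a ppm tolerance) can be sketched as follows; the candidate masses and names are placeholders, not the study's database.

```python
# Sketch of ppm-tolerance accurate-mass matching. The candidate list below is
# illustrative only; real pipelines generate candidates from elemental formulas
# and adduct rules.
def annotate(measured_mz, candidates, ppm_tol=1.5):
    """candidates: dict mapping annotation name -> theoretical m/z."""
    hits = []
    for name, theo in candidates.items():
        ppm_error = (measured_mz - theo) / theo * 1e6
        if abs(ppm_error) <= ppm_tol:
            hits.append((name, round(ppm_error, 2)))
    return hits

candidates = {
    "PC 34:2 [M+H]+": 758.5694,      # non-oxidized species (illustrative mass)
    "PC 34:2;O [M+H]+": 774.5643,    # one additional oxygen (oxidized form)
}
print(annotate(774.5639, candidates, ppm_tol=1.5))   # matches the oxidized species
```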

  12. Simultaneous quantification of amino acids and Amadori products in foods through ion-pairing liquid chromatography-high-resolution mass spectrometry.

    Science.gov (United States)

    Troise, Antonio Dario; Fiore, Alberto; Roviello, Giovanni; Monti, Simona Maria; Fogliano, Vincenzo

    2015-01-01

    The formation of Amadori products (APs) is the first key step of the Maillard reaction. Only a few papers have dealt with the simultaneous quantitation of amino acids and the corresponding APs (1-amino-1-deoxy-2-ketoses). Chromatographic separation of APs is affected by several drawbacks, mainly related to their poor retention in conventional reversed phase separations. In this paper, a method for the simultaneous quantification of amino acids and their respective APs was developed, combining high-resolution mass spectrometry with ion-pairing liquid chromatography. The limit of detection was 0.1 ng/mL for tryptophan, valine and arginine, while the limit of quantification ranged from 2 to 5 ng/mL according to the specific sensitivity of each analyte. The relative standard deviation was lower than 10% and the coefficient of correlation was higher than 0.99 for each calibration curve. The method was applied to milk, milk-based products, and raw and processed tomato. Among the analyzed products, the most abundant amino acid was glutamic acid (16,646.89 ± 1,385.40 µg/g) and the most abundant AP was fructosyl-arginine in tomato puree (774.82 ± 10.01 µg/g). The ease of sample preparation, coupled with the analytical performance of the proposed method, makes it possible to use the pattern of free amino acids and corresponding APs to evaluate the quality of raw food as well as the extent of thermal treatment in different food products.

  13. Chemometrics resolution and quantification power evaluation: Application on pharmaceutical quaternary mixture of Paracetamol, Guaifenesin, Phenylephrine and p-aminophenol

    Science.gov (United States)

    Yehia, Ali M.; Mohamed, Heba M.

    2016-01-01

    Three advanced chemometric-assisted spectrophotometric methods, namely Concentration Residuals Augmented Classical Least Squares (CRACLS), Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS) and Principal Component Analysis-Artificial Neural Networks (PCA-ANN), were developed, validated and benchmarked against PLS calibration to resolve the severely overlapped spectra and simultaneously determine Paracetamol (PAR), Guaifenesin (GUA) and Phenylephrine (PHE) in their ternary mixture and in the presence of p-aminophenol (AP), the main degradation product and synthesis impurity of Paracetamol. The analytical performance of the proposed methods was described by percentage recoveries, root mean square error of calibration and standard error of prediction. The four multivariate calibration methods could be used directly, without any preliminary separation step, and were successfully applied to pharmaceutical formulation analysis, showing no interference from excipients.

  14. Filtering high resolution hyperspectral imagery and analyzing it for quantification of water quality parameters and aquatic vegetation

    Science.gov (United States)

    Pande-Chhetri, Roshan

    High resolution hyperspectral imagery (airborne or ground-based) is gaining momentum as a useful analytical tool in various fields including agriculture and aquatic systems. These images are often contaminated with stripes and noise, resulting in a lower signal-to-noise ratio, especially in aquatic regions where the signal is naturally low. This research investigates effective methods for filtering high spatial resolution hyperspectral imagery and the use of the imagery in water quality parameter estimation and aquatic vegetation classification. The striping pattern of the hyperspectral imagery is non-parametric and difficult to filter. In this research, a de-striping algorithm based on wavelet analysis and adaptive Fourier domain normalization was examined. The result of this algorithm was found superior to other available algorithms and yielded the highest peak signal-to-noise ratio improvement. The algorithm was implemented on individual image bands and on selected bands of the Maximum Noise Fraction (MNF) transformed images. The results showed that image filtering in the MNF domain was efficient and produced the best results. The study investigated methods of analyzing hyperspectral imagery to estimate water quality parameters and to map aquatic vegetation in case-2 waters. Ground-based hyperspectral imagery was analyzed to determine chlorophyll-a (Chl-a) concentrations in aquaculture ponds. Two-band and three-band indices were implemented and the effect of using submerged reflectance targets was evaluated. Laboratory-measured values were found to be in strong correlation with two-band and three-band spectral indices computed from the hyperspectral image. Coefficient of determination (R2) values were found to be 0.833 and 0.862 without submerged targets, and stronger values of 0.975 and 0.982 were obtained using submerged targets. Airborne hyperspectral images were used to detect and classify aquatic vegetation in a black river estuarine system. Image normalization for water
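    The two-band and three-band reflectance indices mentioned above are typically simple band ratios or Gitelson-style combinations regressed against laboratory Chl-a values. The sketch below uses common red-edge band positions and made-up pond data as assumptions, since the dissertation's exact bands and data are not given here.

```python
# Hedged sketch: a two-band ratio and a Gitelson-style three-band index
# regressed against laboratory chlorophyll-a. Band positions (~665, ~705,
# ~740 nm) and all numbers are illustrative assumptions.
import numpy as np

def two_band(r_nir, r_red):
    return r_nir / r_red

def three_band(r1, r2, r3):
    return (1.0 / r1 - 1.0 / r2) * r3

R665 = np.array([0.030, 0.026, 0.022, 0.018, 0.015])   # reflectance near 665 nm
R705 = np.array([0.040, 0.041, 0.042, 0.043, 0.044])   # reflectance near 705 nm
R740 = np.array([0.035, 0.036, 0.037, 0.038, 0.039])   # reflectance near 740 nm
chla = np.array([20.0, 35.0, 55.0, 80.0, 110.0])       # lab Chl-a (ug/L)

idx = three_band(R665, R705, R740)
slope, intercept = np.polyfit(idx, chla, 1)
pred = slope * idx + intercept
r2 = 1.0 - np.sum((chla - pred) ** 2) / np.sum((chla - chla.mean()) ** 2)
print(f"three-band index R^2 = {r2:.3f}")
```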

  15. Windowed direct exponential curve resolution quantification of nuclear magnetic resonance spectroscopy with applications to amniotic fluid metabonomics

    International Nuclear Information System (INIS)

    Botros, L.L.

    2007-01-01

    This thesis presents a quantitative protocol of proton nuclear magnetic resonance (1H NMR) that allows the determination of human amniotic fluid metabolite concentrations, which are then used in a metabonomic study to establish patient health during gestation. 1H NMR free induction decays (FIDs) of 258 human amniotic fluid samples are acquired on a 500 MHz spectrometer. Quantitative analysis methods in both the frequency and time domains are carried out and compared. Frequency-domain analysis is accomplished by integration of the metabolite peaks before and after the inclusion of a known standard addition of alanine. Time-domain analysis is accomplished by the direct exponential curve resolution algorithm (DECRA). Both techniques are assessed by application to calibration biological solutions and a simulated data set. The DECRA method proves to be a more accurate and precise route for quantitative analysis, and is included in the developed protocol. Well-defined peaks of various components are visible in the frequency-domain 1H NMR spectra, including lactate, alanine, acetate, citrate, choline, glycine, and glucose. All are quantified with the proposed protocol. Statistical t-tests and notched box-and-whisker plots are used to compare means of metabolite concentrations for diabetic and normal patients. Glucose, glycine, and choline are all found to correlate with gestational diabetes mellitus early in gestation. With further development, time-domain quantitative 1H NMR has potential to become a robust diagnostic tool for gestational health. (author)
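    One common way a known alanine standard addition is used for quantification in 1H NMR is to calibrate the spectrometer response per proton per unit concentration from the increase in the alanine peak area, and then convert other peak areas to concentrations. The sketch below illustrates that idea with made-up areas and proton counts; the thesis' exact protocol may differ.

```python
# Hedged sketch of quantification via a single alanine standard addition in
# 1H NMR. All numbers, and the use of the 3-proton CH3 doublet, are
# illustrative assumptions.
def response_factor(area_before, area_after, added_mM, n_protons=3):
    """Spectrometer response per proton per mM, from the alanine addition."""
    return (area_after - area_before) / (added_mM * n_protons)

def concentration(peak_area, n_protons, k):
    """Concentration (mM) of a metabolite whose integrated peak covers n_protons."""
    return peak_area / (n_protons * k)

k = response_factor(area_before=120.0, area_after=220.0, added_mM=2.0)
print(concentration(peak_area=300.0, n_protons=2, k=k))   # ~9 mM for this toy peak
```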

  16. Windowed direct exponential curve resolution quantification of nuclear magnetic resonance spectroscopy with applications to amniotic fluid metabonomics

    Energy Technology Data Exchange (ETDEWEB)

    Botros, L.L

    2007-07-01

    This thesis presents a quantitative protocol of proton nuclear magnetic resonance ({sup 1}H NMR) that allows the determination of human amniotic fluid metabolite concentrations, which are then used in a metabonomic study to establish patient health during gestation. {sup 1}H NMR free induction decays (FIDs) of 258 human amniotic fluid samples are acquired on a 500 MHz spectrometer. Quantitative analysis methods in both the frequency and time domains are carried out and compared. Frequency-domain analysis is accomplished by integration of the metabolite peaks before and after the inclusion of a known standard addition of alanine. Time-domain analysis is accomplished by the direct exponential curve resolution algorithm (DECRA). Both techniques are assessed by application to calibration biological solutions and a simulated data set. The DECRA method proves to be a more accurate and precise route for quantitative analysis, and is included in the developed protocol. Well-defined peaks of various components are visible in the frequency-domain {sup 1}H NMR spectra, including lactate, alanine, acetate, citrate, choline, glycine, and glucose. All are quantified with the proposed protocol. Statistical t-tests and notched box-and-whisker plots are used to compare means of metabolite concentrations for diabetic and normal patients. Glucose, glycine, and choline are all found to correlate with gestational diabetes mellitus early in gestation. With further development, time-domain quantitative {sup 1}H NMR has potential to become a robust diagnostic tool for gestational health. (author)

  17. Detection and quantification of phenolic compounds in olive oil by high resolution {sup 1}H nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Christophoridou, Stella [NMR Laboratory, Department of Chemistry, University of Crete, P.O. Box 2208, Voutes, 71003 Heraklion, Crete (Greece); Dais, Photis [NMR Laboratory, Department of Chemistry, University of Crete, P.O. Box 2208, Voutes, 71003 Heraklion, Crete (Greece)], E-mail: dais@chemistry.uoc.gr

    2009-02-09

    High resolution {sup 1}H NMR spectroscopy has been employed as a versatile and rapid method to analyze the polar fraction of extra virgin olive oils containing various classes of phenolic compounds. The strategy for identification of phenolic compounds is based on the NMR chemical shifts of a large number of model compounds assigned by using two-dimensional (2D) NMR spectroscopy. Furthermore, 2D NMR was applied to phenolic extracts in an attempt to discover additional phenolic compounds. The {sup 1}H NMR methodology was successful in detecting simple phenols, such as p-coumaric acid, vanillic acid, homovanillyl alcohol, vanillin, free tyrosol, and free hydroxytyrosol, the flavones apigenin and luteolin, the lignans (+) pinoresinol, (+) 1-acetoxypinoresinol and syringaresinol, two isomers of the aldehydic form of oleuropein and ligstroside, the dialdehydic form of oleuropein and ligstroside lacking a carboxymethyl group, and finally total hydroxytyrosol and total tyrosol, reflecting the total amounts of free and esterified hydroxytyrosol and tyrosol, respectively. The absolute amount of each phenolic constituent was determined in the polar fraction by using anhydrous 1,3,5-triazine as an internal standard.

  18. Detection and quantification of phenolic compounds in olive oil by high resolution 1H nuclear magnetic resonance spectroscopy

    International Nuclear Information System (INIS)

    Christophoridou, Stella; Dais, Photis

    2009-01-01

    High resolution 1H NMR spectroscopy has been employed as a versatile and rapid method to analyze the polar fraction of extra virgin olive oils containing various classes of phenolic compounds. The strategy for identification of phenolic compounds is based on the NMR chemical shifts of a large number of model compounds assigned by using two-dimensional (2D) NMR spectroscopy. Furthermore, 2D NMR was applied to phenolic extracts in an attempt to discover additional phenolic compounds. The 1H NMR methodology was successful in detecting simple phenols, such as p-coumaric acid, vanillic acid, homovanillyl alcohol, vanillin, free tyrosol, and free hydroxytyrosol, the flavones apigenin and luteolin, the lignans (+) pinoresinol, (+) 1-acetoxypinoresinol and syringaresinol, two isomers of the aldehydic form of oleuropein and ligstroside, the dialdehydic form of oleuropein and ligstroside lacking a carboxymethyl group, and finally total hydroxytyrosol and total tyrosol, reflecting the total amounts of free and esterified hydroxytyrosol and tyrosol, respectively. The absolute amount of each phenolic constituent was determined in the polar fraction by using anhydrous 1,3,5-triazine as an internal standard.

  19. Quantification of the fluorine containing drug 5-fluorouracil in cancer cells by GaF molecular absorption via high-resolution continuum source molecular absorption spectrometry

    Science.gov (United States)

    Krüger, Magnus; Huang, Mao-Dong; Becker-Roß, Helmut; Florek, Stefan; Ott, Ingo; Gust, Ronald

    The development of high-resolution continuum source molecular absorption spectrometry has made the quantification of fluorine feasible by measuring its molecular absorption as gallium monofluoride (GaF). Using this new technique, we developed a graphite furnace method, with 5-fluorouracil (5-FU) as the example compound, to quantify fluorine in organic molecules. The effect of 5-FU on the generation of the diatomic GaF molecule was investigated. The experimental conditions, such as the gallium nitrate amount, temperature program, interfering anions (represented as the corresponding acids) and calibration for the determination of 5-FU in standard solution and in cellular matrix samples, were investigated and optimized. The sample matrix showed no effect on the sensitivity of the GaF molecular absorption. A simple calibration curve using an inorganic sodium fluoride solution can conveniently be used for the calibration. The described method is sensitive, and the achievable limit of detection is 0.23 ng of 5-FU. In order to establish the concept of "fluorine as a probe in medicinal chemistry", an exemplary application was selected, in which the developed method was successfully demonstrated by performing cellular uptake studies of 5-FU in human colon carcinoma cells.

  20. Liquid chromatography with high resolution mass spectrometry for identification of organic contaminants in fish fillet: screening and quantification assessment using two scan modes for data acquisition.

    Science.gov (United States)

    Munaretto, Juliana S; May, Marília M; Saibt, Nathália; Zanella, Renato

    2016-07-22

    This study proposed a strategy to identify and quantify 182 organic contaminants from different chemical classes, for instance pesticides, veterinary drugs and personal care products, in fish fillet using liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (LC-QToF/MS). For this purpose, two different scan methods (full scan and all ions MS/MS) were evaluated to assess the best option for screening analysis in spiked fish fillet samples. In general, full scan acquisition was found to be more reliable (84%) for automatic identification and quantification when compared to all ions MS/MS, with 72% of the compounds detected. Additionally, a qualitative automatic search showed a mass accuracy error below 5 ppm for 77% of the compounds in full scan mode compared to only 52% in all ions MS/MS scan. However, all ions MS/MS provides fragmentation information for the target compounds. Undoubtedly, structural information for a wide number of compounds can be obtained using high resolution mass spectrometry (HRMS), but it is necessary to assess it thoroughly in order to choose the best scan mode. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; phenomena that bias quantification; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scatter, partial volume effect, motion, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: achievable accuracy, know-how, precautions, beyond the activity measurement

  2. Identification and quantification of the main isoflavones and other phytochemicals in soy based nutraceutical products by liquid chromatography-orbitrap high resolution mass spectrometry.

    Science.gov (United States)

    López-Gutiérrez, Noelia; Romero-González, Roberto; Garrido Frenich, Antonia; Martínez Vidal, José Luis

    2014-06-27

    The specific phytochemical composition of soy nutritional supplements is usually not labelled. Hence, 12 dietary supplements were analyzed in order to detect and identify the main phytochemicals present in these samples, using a database containing 60 compounds. Ultra-high performance liquid chromatography coupled to single-stage Orbitrap high resolution mass spectrometry (UHPLC-Orbitrap-MS) was used. Two consecutive extractions, using a mixture of methanol:water (80:20, v/v) as the extraction solvent, were employed, followed by two dilutions (10 or 100 times, depending on the concentration of the components in the sample) with a mixture of an aqueous solution of ammonium acetate 30 mM:methanol (50:50, v/v). The method was validated, obtaining adequate recovery and precision values. Limits of detection (LODs) and quantification (LOQs) were calculated, ranging from 2 to 150 μg L(-1). Isoflavones were the predominant components present in the analyzed supplements, with values higher than 93% of the total amount of phytochemicals in all cases. The aglycones (genistein, daidzein, glycitein and biochanin A) as well as their three conjugated β-glucoside forms (genistin, daidzin and glycitin) were detected and quantified, with daidzein being the isoflavone detected at the highest concentration in 8 out of 12 samples, with values ranging from 684 to 35,970 mg kg(-1), whereas biochanin A was detected at very low concentrations, ranging from 18 to 50 mg kg(-1). Moreover, other phytochemicals such as flavones, flavonols, flavanones and phenolic acids were also detected and quantified. Copyright © 2014 Elsevier B.V. All rights reserved.
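    The LOD/LOQ values quoted above are typically derived from calibration curves; one common convention takes 3.3 and 10 times the residual standard deviation over the slope. The sketch below shows that convention with made-up calibration data, as the paper's exact criterion is not stated here.

```python
# Sketch of a common LOD/LOQ estimate from a calibration curve (3.3*sd/slope
# and 10*sd/slope). Calibration points are illustrative, not the paper's data.
import numpy as np

conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])        # standard concentrations (ug/L)
resp = np.array([1.1e4, 2.0e4, 5.2e4, 9.9e4, 2.02e5])  # instrument responses (peak areas)

slope, intercept = np.polyfit(conc, resp, 1)
residual_sd = np.std(resp - (slope * conc + intercept), ddof=2)
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"LOD ~ {lod:.1f} ug/L, LOQ ~ {loq:.1f} ug/L")
```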

  3. High resolution systematic digital histological quantification of cardiac fibrosis and adipose tissue in phospholamban p.Arg14del mutation associated cardiomyopathy.

    Directory of Open Access Journals (Sweden)

    Johannes M I H Gho

    Myocardial fibrosis can lead to heart failure and act as a substrate for cardiac arrhythmias. In dilated cardiomyopathy diffuse interstitial reactive fibrosis can be observed, whereas arrhythmogenic cardiomyopathy is characterized by fibrofatty replacement, predominantly in the right ventricle. The p.Arg14del mutation in the phospholamban (PLN) gene has been associated with dilated cardiomyopathy and recently also with arrhythmogenic cardiomyopathy. The aim of the present study is to determine the exact pattern of fibrosis and fatty replacement in PLN p.Arg14del mutation positive patients, with a novel method for high resolution systematic digital histological quantification of fibrosis and fatty tissue in cardiac tissue. Transversal mid-ventricular slices (n = 8) from whole hearts were collected from patients with the PLN p.Arg14del mutation (age 48±16 years; 4 (50%) male). An in-house developed open source MATLAB script was used for digital analysis of Masson's trichrome stained slides (http://sourceforge.net/projects/fibroquant/). Slides were divided into trabecular, inner and outer compact myocardium. Per region the percentage of connective tissue, cardiomyocytes and fatty tissue was quantified. In PLN p.Arg14del mutation associated cardiomyopathy, myocardial fibrosis is predominantly present in the left posterolateral wall and to a lesser extent in the right ventricular wall, whereas fatty changes are more pronounced in the right ventricular wall. No difference in the distribution pattern of fibrosis and adipocytes was observed between patients with a clinically predominantly dilated and those with an arrhythmogenic cardiomyopathy phenotype. In the future, this novel method for quantifying fibrosis and fatty tissue can be used to assess cardiac fibrosis and fatty tissue in animal models and a broad range of human cardiomyopathies.
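    The published script (FibroQuant, linked above) is a MATLAB tool; as a rough illustration of the per-pixel classification idea behind trichrome quantification, a crude colour-threshold sketch follows. Thresholds, names and the toy image are assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the FibroQuant MATLAB script): classify pixels
# of a Masson's trichrome image into collagen (blue-dominant), myocardium
# (red-dominant) and near-white fat/background, and report area percentages.
import numpy as np

def tissue_fractions(rgb):
    """rgb: H x W x 3 float array scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    background = (r > 0.85) & (g > 0.85) & (b > 0.85)    # white: fat vacuoles / empty space
    collagen = (~background) & (b > r + 0.05)             # blue-dominant pixels
    myocyte = (~background) & (~collagen)
    total = rgb.shape[0] * rgb.shape[1]
    classes = {"fibrosis": collagen, "myocardium": myocyte, "fat/background": background}
    return {name: 100.0 * m.sum() / total for name, m in classes.items()}

# Toy 2 x 2 image: one blue, one red and two near-white pixels
img = np.array([[[0.20, 0.30, 0.80], [0.80, 0.30, 0.20]],
                [[0.95, 0.95, 0.95], [0.90, 0.90, 0.92]]])
print(tissue_fractions(img))   # {'fibrosis': 25.0, 'myocardium': 25.0, 'fat/background': 50.0}
```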

  4. GHG emissions quantification at high spatial and temporal resolution at urban scale: the case of the town of Sassari (NW Sardinia - Italy)

    Science.gov (United States)

    Sanna, Laura; Ferrara, Roberto; Zara, Pierpaolo; Duce, Pierpaolo

    2014-05-01

    The European Union has set as a priority the fight against climate change related to greenhouse gas releases. The largest share of these emissions comes from human activities in urban areas, which account for more than 70% of the world's emissions, and several local governments intend to support the European strategic policies by understanding which crucial sectors drive GHG emissions in their city. Planning for mitigation actions at the community scale starts with the compilation of a GHG inventory that, among a wide range of measurement tools, provides information on the current status of GHG emissions across a specific jurisdiction. In the framework of a regional project for the quantitative estimate of the net exchange of CO2 (emissions and sinks) at the municipal level in Sardinia, the town of Sassari represents a pilot site where a spatially and temporally high resolution GHG emissions inventory is built in line with European and international standard protocols to establish a baseline for tracking emission trends. The specific purpose of this accurate accounting is to obtain an appropriate allocation of CO2 and other GHG emissions at the fine building and hourly scale. The aim is to test the direct measurements needed to enable the construction of future scenarios of these emissions and to assess possible strategies to reduce their impact. The key element of the methodology used to construct this GHG emissions inventory is the Global Protocol for Community-Scale Greenhouse Gas Emissions (GPC) (March 2012), which identifies four main types of emission sources: (i) Stationary Units, (ii) Mobile Units, (iii) Waste, and (iv) Industrial Process and Product Use Emissions. The development of the GHG emissions account in Sassari consists of the collection of a range of alternative data sources (primary data, IPCC emission factors, national and local statistics, etc.) selected on the basis of relevance and completeness criteria, performed for 2010, as baseline year, using
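
    The core bookkeeping of such an inventory is multiplying activity data by emission factors and aggregating by GPC source category. The sketch below shows that step only, with placeholder categories, activity amounts and emission factors; it does not reproduce the Sassari inventory or the GPC's full reporting rules.

```python
# Minimal GHG inventory bookkeeping: emissions = activity data x emission factor,
# aggregated by GPC-style source category. All numbers below are placeholders.
activity_data = {            # (category, fuel/activity): annual amount
    ("stationary", "natural_gas_m3"): 1.2e7,
    ("mobile", "diesel_l"): 5.4e6,
    ("waste", "landfilled_t"): 3.0e4,
}
emission_factors = {         # kg CO2-eq per unit of activity (illustrative values)
    "natural_gas_m3": 1.9,
    "diesel_l": 2.7,
    "landfilled_t": 450.0,
}

totals = {}
for (category, activity), amount in activity_data.items():
    totals[category] = totals.get(category, 0.0) + amount * emission_factors[activity]

for category, kg_co2eq in sorted(totals.items()):
    # 1 kt CO2-eq = 1e6 kg
    print(f"{category:>10}: {kg_co2eq / 1e6:8.1f} kt CO2-eq")
```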

  5. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    DEFF Research Database (Denmark)

    Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for di...... can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MS(ALL) technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  6. Multiplexed data independent acquisition (MSX-DIA) applied by high resolution mass spectrometry improves quantification quality for the analysis of histone peptides.

    Science.gov (United States)

    Sidoli, Simone; Fujiwara, Rina; Garcia, Benjamin A

    2016-08-01

    We present the MS-based application of the innovative, although scarcely exploited, multiplexed data-independent acquisition (MSX-DIA) for the analysis of histone PTMs. Histones are a gold standard for complexity in MS-based proteomics, due to their large number of combinatorial modifications, which lead to isobaric peptides after proteolytic digestion. DIA has thus gained popularity for this purpose, as it allows MS/MS-based quantification without upfront assay development. In this work, we evaluated the performance of traditional DIA versus MSX-DIA in terms of MS/MS spectra quality, instrument scan rate and quantification precision using histones from HeLa cells. We used an MS/MS isolation window of 10 and 6 m/z for DIA and MSX-DIA, respectively. Four MS/MS scans were multiplexed for MSX-DIA. Although MSX-DIA was programmed to perform two-fold more MS/MS events than traditional DIA, it acquired on average ∼5% more full MS scans, indicating an even faster scan rate. The results highlighted an overall decrease of background ion signals using MSX-DIA, and we illustrate specific examples where peptides of different precursor masses were co-fragmented by DIA but not by MSX-DIA. Taken together, MSX-DIA thus proved to be a more favorable method for histone analysis in data-independent mode. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Estimation and quantification of mangrove forest extent by using different spatial resolution satellite data for the sandspit area of Karachi coast

    International Nuclear Information System (INIS)

    Saeed, U.; Daud, A.; Ashraf, S.; Mahmood, A.

    2006-01-01

    Mangrove forest is an integral part of the inter-tidal zone of the coastal environment, extending throughout the tropics and subtropics of the world. In Pakistan, remote-sensing data has been used extensively for area estimation of mangrove forests over the last thirty years. Previous studies used medium resolution satellite data for the area estimation of mangrove forests, which revealed discrepancies in recognizing subtle variations of land-cover features in the satellite imagery. The current study examines the classification techniques employed for area estimation using high and medium resolution satellite imagery. To study the effects of spatial resolution on classification results, data from three different satellites were used: Quickbird, TERRA and Landsat. The thematic map derived from Quickbird data comprised the largest number of land cover classes, with a definite zone of mangroves extending from regeneration to mature canopies. The total estimated mangrove extent was 370 ha, with 57.45, 125.9, 180.89, and 5.35 ha of tall, medium, small, and new recruitment mangrove plants respectively. Mangrove area estimations from the thematic maps derived from TERRA and Landsat satellite data showed a gradual increase in the mangrove extent, from 390.95 ha to 417.92 ha. This increase in area is indicative of the fact that some of the land-cover classes may have been misclassified and hence added to the area under mangrove forests. This study also showed that high-resolution satellite data can be used to identify different height zones of mangrove forests, along with an accurate delineation of classes like salt bushes and algae, which could not be classified otherwise. (author)
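
    The area figures quoted above ultimately come from counting classified pixels and multiplying by each sensor's pixel footprint. A minimal sketch of that conversion is given below; the class codes, the random example map and the assumed 2.4 m Quickbird multispectral pixel size are illustrative only.

```python
import numpy as np

def class_areas_ha(classified, pixel_size_m):
    """Convert a classified raster (integer class codes) into per-class
    area in hectares: pixel count x pixel footprint / 10,000 m2 per ha."""
    pixel_area_m2 = pixel_size_m ** 2
    codes, counts = np.unique(classified, return_counts=True)
    return {int(c): n * pixel_area_m2 / 1e4 for c, n in zip(codes, counts)}

# Hypothetical 4-class map (0 = water, 1 = mangrove, 2 = mudflat, 3 = salt bush)
rng = np.random.default_rng(0)
quickbird_map = rng.integers(0, 4, size=(1000, 1000))   # ~2.4 m multispectral pixels assumed
print(class_areas_ha(quickbird_map, pixel_size_m=2.4))
```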

  8. Improving the spatial and temporal resolution with quantification of uncertainty and errors in earth observation data sets using Data Interpolating Empirical Orthogonal Functions methodology

    Science.gov (United States)

    El Serafy, Ghada; Gaytan Aguilar, Sandra; Ziemba, Alexander

    2016-04-01

    There is an increasing use of process-based models in the investigation of ecological systems and scenario predictions. The accuracy and quality of these models are improved when they are run with high spatial and temporal resolution data sets. However, ecological data can often be difficult to collect, which manifests itself through irregularities in the spatial and temporal domain of these data sets. Through the use of the Data INterpolating Empirical Orthogonal Functions (DINEOF) methodology, earth observation products can be improved to have full spatial coverage within the desired domain as well as increased temporal resolution to daily and weekly time steps, which are frequently required by process-based models [1]. The DINEOF methodology results in a degree of error being affixed to the refined data product. In order to determine the degree of error introduced through this process, the suspended particulate matter and chlorophyll-a data from MERIS are used with DINEOF to produce high resolution products for the Wadden Sea. These new data sets are then compared with in-situ and other data sources to determine the error. Artificial cloud cover scenarios are also conducted in order to substantiate the findings from the MERIS data experiments. Secondly, the accuracy of DINEOF is explored to evaluate the variance of the methodology. The degree of accuracy is combined with the overall error produced by the methodology and reported in an assessment of the quality of DINEOF when applied to resolution refinement of chlorophyll-a and suspended particulate matter in the Wadden Sea. References [1] Sirjacobs, D.; Alvera-Azcárate, A.; Barth, A.; Lacroix, G.; Park, Y.; Nechad, B.; Ruddick, K.G.; Beckers, J.-M. (2011). Cloud filling of ocean colour and sea surface temperature remote sensing products over the Southern North Sea by the Data Interpolating Empirical Orthogonal Functions methodology. J. Sea Res. 65(1): 114-130. dx.doi.org/10.1016/j.seares.2010.08.002
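
    DINEOF fills gaps by iteratively reconstructing the data matrix from a truncated set of EOFs (via SVD), with cross-validation normally used to choose the number of modes. The sketch below is a bare-bones version of that iteration, without mean removal or the cross-validation step, shown only to make the mechanics concrete; it is not the operational DINEOF implementation.

```python
import numpy as np

def dineof_fill(X, n_modes=3, tol=1e-5, max_iter=200):
    """Simplified DINEOF-style gap filling of a space x time matrix X
    containing NaNs: initialise gaps, reconstruct with a truncated SVD,
    re-insert the reconstruction at the gaps, and iterate to convergence.
    (Real DINEOF also removes means and cross-validates n_modes.)"""
    X = np.array(X, dtype=float)
    mask = np.isnan(X)
    filled = np.where(mask, 0.0, X)          # initial guess for the gaps
    for _ in range(max_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        recon = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes]
        delta = np.max(np.abs(recon[mask] - filled[mask])) if mask.any() else 0.0
        filled[mask] = recon[mask]
        if delta < tol:
            break
    return filled

# Synthetic demo: a low-rank field with 20% of values missing
rng = np.random.default_rng(1)
truth = np.outer(np.sin(np.linspace(0, 3, 50)), np.cos(np.linspace(0, 5, 80)))
data = truth.copy()
data[rng.random(truth.shape) < 0.2] = np.nan
print("max abs error at gaps:", np.max(np.abs(dineof_fill(data, 2) - truth)))
```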

  9. Identification/quantification of free and bound phenolic acids in peel and pulp of apples (Malus domestica) using high resolution mass spectrometry (HRMS).

    Science.gov (United States)

    Lee, Jihyun; Chan, Bronte Lee Shan; Mitchell, Alyson E

    2017-01-15

    Free and bound phenolic acids were measured in the pulp and peel of four varieties of apples using high resolution mass spectrometry. Twenty-five phenolic acids were identified, including 8 hydroxybenzoic acids, 11 hydroxycinnamic acids, 5 hydroxyphenylacetic acids, and 1 hydroxyphenylpropanoic acid. Several phenolics are tentatively identified for the first time in apples, including methyl gallate, ethyl gallate, hydroxyphenylacetic acid, three phenylacetic acid isomers, 3-(4-hydroxyphenyl)propionic acid, and homoveratric acid. With the exception of chlorogenic and caffeic acid, most phenolic acids were quantified for the first time in apples. Significant varietal differences were observed, and phenolic acid levels were higher in the pulp as compared to the apple peel (dry weight) in all varieties. Coumaroylquinic, protocatechuic, 4-hydroxybenzoic, vanillic and t-ferulic acids were present in free forms. With the exception of chlorogenic acid, all other phenolic acids were present only in bound forms. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Multi-allergen quantification of fining-related egg and milk proteins in white wines by high-resolution mass spectrometry.

    Science.gov (United States)

    Monaci, Linda; Losito, Ilario; De Angelis, Elisabetta; Pilolli, Rosa; Visconti, Angelo

    2013-09-15

    A method based on high-resolution mass spectrometry was developed for the simultaneous determination of fining agents containing potentially allergenic milk (casein) and egg-white (lysozyme and ovalbumin) proteins, added to commercial white wines at sub-ppm levels. Selected tryptic peptides were used as quantitative markers. An evaluation of protein digestion yields was also performed by implementing the 15N-valine-labelled analogues of the best peptide markers identified for αS1-casein and ovalbumin. The method was based on the combination of ultrafiltration (UF) of protein-containing wines, tryptic digestion of the dialyzed wine extracts, and liquid chromatography/high resolution mass spectrometry (LC/HRMS) analysis of the tryptic digests. Peptides providing the most intense electrospray ionization (ESI)-MS response were chosen as quantitative markers of the proteins under investigation. Six-point calibrations were performed by adding caseinate and egg-white powder, in the concentration range between 0.25 and 10 µg/mL, to an allergen-free white wine. The peptides LTEWTSSNVMEER, GGLEPINFQTAADQAR and ELINSWVESQTNGIIR were highlighted as the best markers for ovalbumin, GTDVQAWIR and NTDGSTDYGILQINSR for lysozyme, and YLGYLEQLLR, GPFPIIV and FFVAPFPEVFGK for caseinate. Limits of detection (LODs) ranged from 0.4 to 1.1 µg/mL. The developed method is suited for assessing the simultaneous presence of allergenic milk and egg proteins characterizing egg white and caseinate, the fining agents typically employed for wine clarification. The LODs of the method enable the detection of sub-ppm concentrations of residual fining agents that could represent a potential risk for allergic consumers. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Quantification of PAHs and oxy-PAHs on airborne particulate matter in Chiang Mai, Thailand, using gas chromatography high resolution mass spectrometry

    Science.gov (United States)

    Walgraeve, Christophe; Chantara, Somporn; Sopajaree, Khajornsak; De Wispelaere, Patrick; Demeestere, Kristof; Van Langenhove, Herman

    2015-04-01

    An analytical method using gas chromatography high resolution mass spectrometry was developed for the determination of 16 polycyclic aromatic hydrocarbons (PAHs) and 12 oxygenated PAHs (4 diketones, 3 ketones, 4 aldehydes and one anhydride) on atmospheric particulate matter with an aerodynamic diameter less than 10 μm (PM10). The magnetic sector mass spectrometer was run in multiple ion detection mode (MID) with a mass resolution above 10 000 (10% valley definition), allowing selective accurate-mass detection of the characteristic ions of the target analytes. Instrumental detection limits between 0.04 pg and 1.34 pg were obtained for the PAHs, whereas for the oxy-PAHs they ranged between 0.08 pg and 2.13 pg. Pressurized liquid extraction using dichloromethane was evaluated, and excellent recoveries ranging between 87% and 98% for the PAHs and between 74% and 110% for 10 oxy-PAHs were obtained when the optimum extraction temperature of 150 °C was applied. The developed method was finally used to determine PAH and oxy-PAH concentration levels in particulate matter samples collected in the wet season at 4 different locations in Chiang Mai, Thailand (n = 72). This study brings forward the first concentration levels of oxy-PAHs in Thailand. The medians of the summed PAH and oxy-PAH concentrations were 3.4 ng/m3 and 1.1 ng/m3 respectively, which shows the importance of the group of the oxy-PAHs as PM10 constituents. High molecular weight PAHs contributed the most to the ∑PAHs. For example, benzo[ghi]perylene was responsible for 30-44% of the ∑PAHs. The highest contribution to ∑oxy-PAHs came from 1,8-naphthalic anhydride (26-78%), followed by anthracene-9,10-dione (4-27%) and 7H-benzo[de]anthracene-7-one (6-26%). Indications of the degradation of PAHs and/or formation of oxy-PAHs were observed.

  12. A refined, rapid and reproducible high resolution melt (HRM)-based method suitable for quantification of global LINE-1 repetitive element methylation

    Directory of Open Access Journals (Sweden)

    Tse M Yat

    2011-12-01

    Full Text Available Abstract Background The methylation of DNA is recognized as a key mechanism in the regulation of genomic stability, and evidence for its role in the development of cancer is accumulating. LINE-1 methylation status represents a surrogate measure of genome-wide methylation. Findings Using high resolution melt (HRM) curve analysis technology, we have established an in-tube assay that is linear (r > 0.9986) with a high amplification efficiency (90-105%), capable of discriminating between participant samples with small differences in methylation, and suitable for quantifying a wide range of LINE-1 methylation levels (0-100%), including the biologically relevant range of 50-90% expected in human DNA. We have optimized this procedure to use 2 μg of starting DNA and 2 ng of bisulfite-converted DNA for each PCR reaction. Intra- and inter-assay coefficients of variation were 1.44% and 0.49%, respectively, supporting the high reproducibility and precision of this approach. Conclusions In summary, this is a completely linear, quantitative HRM PCR method developed for the measurement of LINE-1 methylation. This cost-efficient, refined and reproducible assay can be performed using minimal amounts of starting DNA. These features make our assay suitable for high throughput analysis of multiple samples from large population-based studies.

  13. In vivo quantification of plant starch reserves at micrometer resolution using X-ray microCT imaging and machine learning.

    Science.gov (United States)

    Earles, J Mason; Knipfer, Thorsten; Tixier, Aude; Orozco, Jessica; Reyes, Clarissa; Zwieniecki, Maciej A; Brodersen, Craig R; McElrone, Andrew J

    2018-03-08

    Starch is the primary energy storage molecule used by most terrestrial plants to fuel respiration and growth during periods of limited to no photosynthesis, and its depletion can drive plant mortality. Destructive techniques at coarse spatial scales exist to quantify starch, but these techniques face methodological challenges that can lead to uncertainty about the lability of tissue-specific starch pools and their role in plant survival. Here, we demonstrate how X-ray microcomputed tomography (microCT) and a machine learning algorithm can be coupled to quantify plant starch content in vivo, repeatedly and nondestructively over time in grapevine stems (Vitis spp.). Starch content estimated for xylem axial and ray parenchyma cells from microCT images was correlated strongly with enzymatically measured bulk-tissue starch concentration on the same stems. After validating our machine learning algorithm, we then characterized the spatial distribution of starch concentration in living stems at micrometer resolution, and identified starch depletion in live plants under experimental conditions designed to halt photosynthesis and starch production, initiating the drawdown of stored starch pools. Using X-ray microCT technology for in vivo starch monitoring should enable novel research directed at resolving the spatial and temporal patterns of starch accumulation and depletion in woody plant species. No claim to original US Government works New Phytologist © 2018 New Phytologist Trust.
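
    The record does not specify the classifier or features used; as a generic illustration of coupling greyscale microCT data with machine learning, the sketch below trains a random forest on simple per-voxel intensity features and reports a starch volume fraction. The training volume, labels and feature choice are all hypothetical.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def voxel_features(volume):
    """Simple per-voxel features: raw greyscale plus local mean and
    local variance, stacked as an (n_voxels, 3) feature matrix."""
    local_mean = ndimage.uniform_filter(volume, size=3)
    local_var = ndimage.uniform_filter(volume ** 2, size=3) - local_mean ** 2
    return np.stack([volume, local_mean, local_var], axis=-1).reshape(-1, 3)

# Hypothetical labelled training sub-volume (1 = starch-containing voxel, 0 = other)
rng = np.random.default_rng(1)
train_vol = rng.random((20, 20, 20)).astype(np.float32)
train_lab = (train_vol > 0.7).astype(int)          # placeholder labels

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(voxel_features(train_vol), train_lab.ravel())

# Predict starch voxels in a new scan and report the starch volume fraction
test_vol = rng.random((20, 20, 20)).astype(np.float32)
pred = clf.predict(voxel_features(test_vol)).reshape(test_vol.shape)
print("starch volume fraction:", pred.mean())
```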

  14. Quantification in positron emission mammography (PEM) with planar detectors: contrast resolution measurements using a custom breast phantom and novel spherical hot-spots

    Science.gov (United States)

    Murthy, K.; Jolly, D.; Aznar, M.; Thompson, C. J.; Sciascia, P.; Loutfi, A.; Lisbona, R.; Gagnon, J. H.

    1999-12-01

    The authors have previously demonstrated that their Positron Emission Mammography-1 (PEM-1) system can successfully detect small (water. The heated solution is poured into spherical molds which are separated upon congealing to yield robust wall-less radioactive hot-spots. The hot-spots were uniform to within 1-5 parts in 100. Less than 0.1% of the total hot-spot activity leaked into the background in 30 minutes. Contrast resolution experiments have been performed with 12 mm and 16 mm diameter hot-spots in the breast phantom containing water with various amounts of background activity. In both cases, the observed contrast values agree well with the ideal values. In the case of the 12 mm hot-spot with a 350-650 keV energy window, image contrast differed from the ideal by an average of 11%. The image contrast for 12 mm hot-spot improved by 40% and the number of detected events decreased by 35% when the low energy threshold was increased from 300 keV to 450 keV.

  15. 4D imaging and quantification of pore structure modifications inside natural building stones by means of high resolution X-ray CT.

    Science.gov (United States)

    Dewanckele, J; De Kock, T; Boone, M A; Cnudde, V; Brabant, L; Boone, M N; Fronteau, G; Van Hoorebeke, L; Jacobs, P

    2012-02-01

    Weathering processes have been studied in detail for many natural building stones. The most commonly used analytical techniques in these studies are thin-section petrography, SEM, XRD and XRF. Most of these techniques are valuable for chemical and mineralogical analysis of the weathering patterns. However, to obtain crucial quantitative information on structural evolution, such as porosity changes and growth of weathering crusts as a function of time, non-destructive techniques become necessary. In this study, a Belgian historical calcareous sandstone, the Lede stone, was exposed to gaseous SO2 under wet surface conditions according to the European Standard NBN EN 13919 (2003). Before, during and after the strong acid test, high resolution X-ray tomography was performed to visualize gypsum crust formation and to yield a better insight into the effects of gaseous SO2 on the pore modification in 3D. The tomographic scans were taken at the Centre for X-ray Tomography at Ghent University (UGCT). With the aid of image analysis, partial porosity changes were calculated at different stages of the process. Increasing porosity was observed visually and quantitatively below the newly formed superficial layer of gypsum crystals. In some cases micro-cracks and dissolution zones were detected on the grain boundaries of quartz. By using Morpho+, an in-house developed image analysis program, radial porosity, partial porosity, the ratio of open and closed porosity and the equivalent diameter of individual pore structures were calculated. The results obtained in this study are promising for a better understanding of gypsum weathering mechanisms, porosity changes and patterns on natural building stones in four dimensions. Copyright © 2011 Elsevier B.V. All rights reserved.
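
    Porosity figures in this kind of study are computed from segmented (binarized) CT volumes. The sketch below shows total porosity and a per-slice porosity profile from a boolean pore mask, as a simplified stand-in for what a dedicated tool such as Morpho+ does; the example volume is synthetic.

```python
import numpy as np

def porosity(pore_mask):
    """Total porosity of a segmented volume: pore voxels / all voxels."""
    return pore_mask.mean()

def partial_porosity_per_slice(pore_mask, axis=0):
    """Porosity profile along one axis (e.g. depth below the exposed
    surface), useful for tracking dissolution below a gypsum crust."""
    other_axes = tuple(i for i in range(pore_mask.ndim) if i != axis)
    return pore_mask.mean(axis=other_axes)

# Hypothetical segmented volume: True = pore, False = solid
rng = np.random.default_rng(2)
pores = rng.random((100, 64, 64)) < 0.12
print("total porosity:", porosity(pores))
print("porosity of first 5 depth slices:", partial_porosity_per_slice(pores)[:5])
```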

  16. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182

  17. SU-G-IeP3-07: High-Resolution, High-Sensitivity Imaging and Quantification of Intratumoral Distributions of Gold Nanoparticles Using a Benchtop L-Shell XRF Imaging System

    Energy Technology Data Exchange (ETDEWEB)

    Manohar, N; Diagaradjane, P; Krishnan, S; Cho, S [UT MD Anderson Cancer Center, Houston, TX (United States); Reynoso, F [UT MD Anderson Cancer Center, Houston, TX (United States); Washington University School of Medicine, St. Louis, MO (United States)

    2016-06-15

    Purpose: To demonstrate the ability to perform high-resolution imaging and quantification of sparse distributions of gold nanoparticles (GNPs) within ex vivo tumor samples using a highly-sensitive benchtop L-shell x-ray fluorescence (XRF) imaging system. Methods: An optimized L-shell XRF imaging system was assembled using a tungsten-target x-ray source (operated at 62 kVp and 45 mA). The x-rays were filtered (copper: 0.08 mm & aluminum: 0.04 mm) and collimated (lead: 5 cm thickness, 3 cm aperture diameter) into a cone-beam in order to irradiate small samples or objects. A collimated (stainless steel: 4 cm thickness, 2 mm aperture diameter) silicon drift detector, capable of 2D translation, was placed at 90° with respect to the beam to acquire XRF/scatter spectra from regions of interest. Spectral processing involved extracting XRF signal from background, followed by attenuation correction using a Compton scatter-based normalization algorithm. Calibration phantoms with water/GNPs (0 and 0.00001–10 mg/cm{sup 3}) were used to determine the detection limit of the system at a 10-second acquisition time. The system was then used to map the distribution of GNPs within a 12×11×2 mm{sup 3} slice excised from the center of a GNP-loaded ex vivo murine tumor sample; a total of 110 voxels (2.65×10{sup −3} cm{sup 3}) were imaged with 1.3-mm spatial resolution. Results: The detection limit of the current cone-beam benchtop L-shell XRF system was 0.003 mg/cm{sup 3} (3 ppm). Intratumoral GNP concentrations ranging from 0.003 mg/cm{sup 3} (3 ppm) to a maximum of 0.055 mg/cm{sup 3} (55 ppm) and average of 0.0093 mg/cm{sup 3} (9.3 ppm) were imaged successfully within the ex vivo tumor slice. Conclusion: The developed cone-beam benchtop L-shell XRF imaging system can immediately be used for imaging of ex vivo tumor samples containing low concentrations of GNPs. With minor finetuning/optimization, the system can be directly adapted for performing routine preclinical in vivo

  18. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  19. Biomass Burning: Major Uncertainties, Advances, and Opportunities

    Science.gov (United States)

    Yokelson, R. J.; Stockwell, C.; Veres, P. R.; Hatch, L. E.; Barsanti, K. C.; Liu, X.; Huey, L. G.; Ryerson, T. B.; Dibb, J. E.; Wisthaler, A.; Müller, M.; Alvarado, M. J.; Kreidenweis, S. M.; Robinson, A. L.; Toon, O. B.; Peischl, J.; Pollack, I. B.

    2014-12-01

    Domestic and open biomass burning are poorly-understood, major influences on Earth's atmosphere composed of countless individual fires that (along with their products) are difficult to quantify spatially and temporally. Each fire is a minimally-controlled complex phenomenon producing a diverse suite of gases and aerosols that experience many different atmospheric processing scenarios. New lab, airborne, and space-based observations along with model and algorithm development are significantly improving our knowledge of biomass burning. Several campaigns provided new detailed emissions profiles for previously undersampled fire types; including wildfires, cooking fires, peat fires, and agricultural burning; which may increase in importance with climate change and rising population. Multiple campaigns have better characterized black and brown carbon and used new instruments such as high resolution PTR-TOF-MS and 2D-GC/TOF-MS to improve quantification of semi-volatile precursors to aerosol and ozone. The aerosol evolution and formation of PAN and ozone, within hours after emission, have now been measured extensively. The NASA DC-8 sampled smoke before and after cloud-processing in two campaigns. The DC-8 performed continuous intensive sampling of a wildfire plume from the source in California to Canada probing multi-day aerosol and trace gas aging. Night-time plume chemistry has now been measured in detail. Fire inventories are being compared and improved, as is modeling of mass transfer between phases and sub-grid photochemistry for global models.

  20. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  1. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  3. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  4. Simultaneous quantification of eight organic acid components in Artemisia capillaris Thunb (Yinchen) extract using high-performance liquid chromatography coupled with diode array detection and high-resolution mass spectrometry

    Directory of Open Access Journals (Sweden)

    Fangjun Yu

    2018-04-01

    Full Text Available We aim to determine the chemical constituents of Yinchen extract and Yinchen herbs using high-performance liquid chromatography coupled with diode array detection and high-resolution mass spectrometry. The method was developed to analyze eight organic acid components of Yinchen extract (neochlorogenic acid, chlorogenic acid, cryptochlorogenic acid, caffeic acid, 1,3-dicaffeoylquinic acid, 3,4-dicaffeoylquinic acid, 3,5-dicaffeoylquinic acid and 4,5-dicaffeoylquinic acid). The separation was conducted using an Agilent TC-C18 column with acetonitrile – 0.2% formic acid solution as the mobile phases under gradient elution. The analytical method was fully validated in terms of linearity, sensitivity, precision, repeatability and recovery, and was subsequently applied to the quantitative assessment of Yinchen extracts and Yinchen herbs. In addition, the changes of the selected markers were studied when Yinchen herbs were decocted in water, and isomerization occurred between the chlorogenic acids. The proposed method enables both qualitative and quantitative analyses and could be developed as a new tool for the quality evaluation of Yinchen extract and Yinchen herbs. The changes of the selected markers during the water decoction process could provide novel insight when studying the link between substances and drug efficacy. Keywords: Artemisia capillaris Thunb (Yinchen) extract, Quality control, Organic acid, Transformation pathways, High-performance liquid chromatography

  5. Quantification in dynamic and small-animal positron emission tomography

    NARCIS (Netherlands)

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  6. Structure elucidation and quantification of impurities formed between 6-aminocaproic acid and the excipients citric acid and sorbitol in an oral solution using high-resolution mass spectrometry and nuclear magnetic resonance spectroscopy.

    Science.gov (United States)

    Schou-Pedersen, Anne Marie V; Cornett, Claus; Nyberg, Nils; Østergaard, Jesper; Hansen, Steen Honoré

    2015-03-25

    Concentrated solutions containing 6-aminocaproic acid and the excipients citric acid and sorbitol have been studied at temperatures of 50°C, 60°C, 70°C and 80°C as well as at 20°C. It has previously been reported that the commonly employed citric acid is a reactive excipient, and it is therefore important to thoroughly investigate a possible reaction between 6-aminocaproic acid and citric acid. The current study revealed the formation of 3-hydroxy-3,4-dicarboxy-butanamide-N-hexanoic acid between 6-aminocaproic acid and citric acid by high-resolution mass spectrometry (HRMS) and nuclear magnetic resonance spectroscopy (NMR). Less than 0.03% of 6-aminocaproic acid was converted to 3-hydroxy-3,4-dicarboxy-butanamide-N-hexanoic acid after 30 days of storage at 80°C. Degradation products of 6-aminocaproic acid were also observed after storage at the applied temperatures, e.g., dimer, trimer and cyclized 6-aminocaproic acid, i.e., caprolactam. No reaction products between D-sorbitol and 6-aminocaproic acid could be observed. 3-Hydroxy-3,4-dicarboxy-butanamide-N-hexanoic acid, dimer and caprolactam were also observed after storage at 20°C for 3 months. The findings imply that an oral solution of 6-aminocaproic acid is relatively stable at 20°C at the pH values 4.00 and 5.00 as suggested in the USP for oral formulations. Compliance with the ICH guideline Q3B is expected. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. SU-D-9A-02: Relative Effects of Threshold Choice and Spatial Resolution Modeling On SUV and Volume Quantification in F18-FDG PET Imaging of Anal Cancer Patients

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, F [Duke University Medical Center, Durham, NC (United States); Shandong Cancer Hospital and Insititute, Jinan, Shandong (China); Bowsher, J; Palta, M; Czito, B; Willett, C; Yin, F [Duke University Medical Center, Durham, NC (United States)

    2014-06-01

    Purpose: PET imaging with F18-FDG is utilized for treatment planning, treatment assessment, and prognosis. A region of interest (ROI) encompassing the tumor may be determined on the PET image, often by a threshold T on the PET standard uptake values (SUVs). Several studies have shown prognostic value for relevant ROI properties including maximum SUV value (SUVmax), metabolic tumor volume (MTV), and total glycolytic activity (TGA). The choice of threshold T may affect mean SUV value (SUVmean), MTV, and TGA. Recently spatial resolution modeling (SRM) has been introduced on many PET systems. SRM may also affect these ROI properties. The purpose of this work is to investigate the relative influence of SRM and threshold choice T on SUVmean, MTV, TGA, and SUVmax. Methods: For 9 anal cancer patients, 18F-FDG PET scans were performed prior to treatment. PET images were reconstructed by 2 iterations of Ordered Subsets Expectation Maximization (OSEM), with and without SRM. ROI contours were generated by 5 different SUV threshold values T: 2.5, 3.0, 30%, 40%, and 50% of SUVmax. Paired-samples t tests were used to compare SUVmean, MTV, and TGA (a) for SRM on versus off and (b) between each pair of threshold values T. SUVmax was also compared for SRM on versus off. Results: For almost all (57/60) comparisons of 2 different threshold values, SUVmean, MTV, and TGA showed statistically significant variation. For comparison of SRM on versus off, there were no statistically significant changes in SUVmax and TGA, but there were statistically significant changes in MTV for T=2.5 and T=3.0 and in SUVmean for all T. Conclusion: The near-universal statistical significance of threshold choice T suggests that, regarding harmonization across sites, threshold choice may be a greater concern than choice of SRM. However, broader study is warranted, e.g. other iterations of OSEM should be considered.

  8. SU-D-9A-02: Relative Effects of Threshold Choice and Spatial Resolution Modeling On SUV and Volume Quantification in F18-FDG PET Imaging of Anal Cancer Patients

    International Nuclear Information System (INIS)

    Zhao, F; Bowsher, J; Palta, M; Czito, B; Willett, C; Yin, F

    2014-01-01

    Purpose: PET imaging with F18-FDG is utilized for treatment planning, treatment assessment, and prognosis. A region of interest (ROI) encompassing the tumor may be determined on the PET image, often by a threshold T on the PET standard uptake values (SUVs). Several studies have shown prognostic value for relevant ROI properties including maximum SUV value (SUVmax), metabolic tumor volume (MTV), and total glycolytic activity (TGA). The choice of threshold T may affect mean SUV value (SUVmean), MTV, and TGA. Recently spatial resolution modeling (SRM) has been introduced on many PET systems. SRM may also affect these ROI properties. The purpose of this work is to investigate the relative influence of SRM and threshold choice T on SUVmean, MTV, TGA, and SUVmax. Methods: For 9 anal cancer patients, 18F-FDG PET scans were performed prior to treatment. PET images were reconstructed by 2 iterations of Ordered Subsets Expectation Maximization (OSEM), with and without SRM. ROI contours were generated by 5 different SUV threshold values T: 2.5, 3.0, 30%, 40%, and 50% of SUVmax. Paired-samples t tests were used to compare SUVmean, MTV, and TGA (a) for SRM on versus off and (b) between each pair of threshold values T. SUVmax was also compared for SRM on versus off. Results: For almost all (57/60) comparisons of 2 different threshold values, SUVmean, MTV, and TGA showed statistically significant variation. For comparison of SRM on versus off, there were no statistically significant changes in SUVmax and TGA, but there were statistically significant changes in MTV for T=2.5 and T=3.0 and in SUVmean for all T. Conclusion: The near-universal statistical significance of threshold choice T suggests that, regarding harmonization across sites, threshold choice may be a greater concern than choice of SRM. However, broader study is warranted, e.g. other iterations of OSEM should be considered
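
    The ROI properties discussed in this record follow directly from thresholding the SUV volume. The sketch below computes SUVmax, SUVmean, MTV and TGA for either an absolute threshold (e.g. SUV 2.5) or a relative one (e.g. 40% of SUVmax); the SUV array and the assumed 4 mm voxel size are hypothetical, and no spatial resolution modeling is involved.

```python
import numpy as np

def roi_metrics(suv, voxel_volume_ml, threshold, relative=False):
    """SUVmax, SUVmean, metabolic tumor volume (MTV, mL) and total
    glycolytic activity (TGA = SUVmean * MTV) for a thresholded ROI."""
    suv_max = float(suv.max())
    cutoff = threshold * suv_max if relative else threshold
    roi = suv >= cutoff
    mtv = roi.sum() * voxel_volume_ml
    suv_mean = float(suv[roi].mean()) if roi.any() else 0.0
    return {"SUVmax": suv_max, "SUVmean": suv_mean, "MTV_ml": mtv,
            "TGA": suv_mean * mtv}

# Hypothetical SUV sub-volume around a lesion, 4x4x4 mm voxels (0.064 mL each)
rng = np.random.default_rng(3)
suv = rng.gamma(shape=2.0, scale=1.5, size=(40, 40, 40))
print(roi_metrics(suv, 0.064, threshold=2.5))                   # absolute SUV 2.5
print(roi_metrics(suv, 0.064, threshold=0.40, relative=True))   # 40% of SUVmax
```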

  9. Quantification Scope Ambiguity Resolution: Evidence from Persian and English

    Science.gov (United States)

    Asadollahfam, Hassan; Lotfi, Ahmad Reza

    2010-01-01

    This study investigates the interpretation of scopally ambiguous sentences containing noun phrases with double quantified constituents from a processing perspective. The questions this study tried to answer were: whether or not the preferred interpretation for doubly quantified ambiguous sentences in English was influenced by English learners' L1…

  10. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and finally the accident sequence quantification. In the PSA, the accident sequence quantification is to calculate the core damage frequency, importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires an efficient computer code because it takes a long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform the accident sequence quantification with KIRAP. (author). 6 refs

  11. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and finally the accident sequence quantification. In the PSA, the accident sequence quantification is to calculate the core damage frequency, importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires an efficient computer code because it takes a long computation time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform the accident sequence quantification with KIRAP. (author). 6 refs.
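
    Downstream of the cut set generator, accident sequence quantification commonly evaluates minimal cut sets with the rare-event approximation: the sequence frequency is approximated by the sum over cut sets of the product of basic event probabilities. The sketch below illustrates that calculation with invented event names and probabilities; it is not KIRAP code and ignores success branches, common-cause treatment and uncertainty propagation.

```python
# Rare-event approximation for accident sequence quantification:
# frequency ~= sum over minimal cut sets of the product of basic event
# probabilities. Illustrative numbers only, not a KIRAP model.
basic_events = {
    "IE_LOSP": 2.0e-2,        # initiating event frequency (/yr)
    "DG_A_FAILS": 3.0e-2,
    "DG_B_FAILS": 3.0e-2,
    "BATTERY_DEPLETED": 1.0e-3,
    "OPERATOR_FAILS": 5.0e-2,
}
minimal_cut_sets = [
    ["IE_LOSP", "DG_A_FAILS", "DG_B_FAILS", "BATTERY_DEPLETED"],
    ["IE_LOSP", "DG_A_FAILS", "DG_B_FAILS", "OPERATOR_FAILS"],
]

def quantify(cut_sets, probs):
    total = 0.0
    contributions = []
    for cs in cut_sets:
        p = 1.0
        for event in cs:
            p *= probs[event]
        contributions.append((tuple(cs), p))
        total += p
    return total, contributions

cdf, per_cut_set = quantify(minimal_cut_sets, basic_events)
print(f"core damage frequency ~ {cdf:.2e} /yr")
for cs, p in per_cut_set:                 # simple importance by cut set contribution
    print(f"  {p / cdf:6.1%}  {' * '.join(cs)}")
```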

  12. Comparison of quantification methods for the analysis of polychlorinated alkanes using electron capture negative ionization mass spectrometry.

    NARCIS (Netherlands)

    Rusina, T.; Korytar, P.; de Boer, J.

    2011-01-01

    Four quantification methods for short-chain chlorinated paraffins (SCCPs) or polychlorinated alkanes (PCAs) using gas chromatography electron capture negative ionisation low resolution mass spectrometry (GC-ECNI-LRMS) were investigated. The method based on visual comparison of congener group

  13. Comparison of quantification methods for the analysis of polychlorinated alkanes using electron capture negative ionisation mass spectrometry

    NARCIS (Netherlands)

    Rusina, T.; Korytar, P.; Boer, de J.

    2011-01-01

    Four quantification methods for short-chain chlorinated paraffins (SCCPs) or polychlorinated alkanes (PCAs) using gas chromatography electron capture negative ionisation low resolution mass spectrometry (GC-ECNI-LRMS) were investigated. The method based on visual comparison of congener group

  14. Resolution propositions

    International Nuclear Information System (INIS)

    2003-05-01

    The purpose of this text is to put before the meeting a resolution concerning the use of weapons made of depleted uranium. It reviews the use of depleted uranium by France during the Gulf war and other recent conflicts. The resolution gives the strictest recommendations regarding the potential health and environmental risks associated with the use of this kind of weapon. (N.C.)

  15. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available Verb aspect, alternations and quantification. In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that the verb aspect pairs are different lexical units with different (although related) meanings, different argument structure (reflecting categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes resulting in the derivation of perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus the Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of the lexical quantification by means of verbal prefixes is the quantified verb phrase, and the scope remains constant in all derived alternations. The paper concerns the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  16. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  17. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman who makes predictions concerning relatively high probability events with a large data base to the phenomenological expert who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work

  18. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    Full Text Available There is a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record history, document signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnosis for many neurological disorders. The introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification in neurology has become essential, both in practice and in research methodology. While this aspect is widely acknowledged, there is limited access to a comprehensive document pertaining to measurements in neurology. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  19. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography has become the diagnostic imaging exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining the ratio of the injured area to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
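
    The report only outlines the masks, density filters and morphological operators used. A widely used simplification for emphysema on HRCT is a density mask, i.e. the fraction of lung voxels below a fixed HU threshold (often around -950 HU) after light morphological clean-up; the sketch below assumes that approach and synthetic data, and is not the authors' MATLAB algorithm.

```python
import numpy as np
from scipy import ndimage

def emphysema_fraction(hu_slice, lung_mask, threshold_hu=-950):
    """Percentage of the lung ROI below an emphysema density threshold
    (classic density-mask approach), after a small morphological opening
    to suppress isolated noisy voxels."""
    low_density = (hu_slice < threshold_hu) & lung_mask
    cleaned = ndimage.binary_opening(low_density, structure=np.ones((3, 3)))
    return 100.0 * cleaned.sum() / max(lung_mask.sum(), 1)

# Hypothetical HU slice and lung ROI mask
rng = np.random.default_rng(4)
hu = rng.normal(-820, 90, size=(512, 512))
lungs = np.zeros((512, 512), dtype=bool)
lungs[100:400, 80:430] = True
print(f"emphysema index: {emphysema_fraction(hu, lungs):.1f}% of lung area")
```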

  20. Gap Resolution

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-25

    Gap Resolution is a software package that was developed to improve Newbler genome assemblies by automating the closure of sequence gaps caused by repetitive regions in the DNA. This is done by performing the following steps: 1) Identify and distribute the data for each gap into sub-projects. 2) Assemble the data associated with each sub-project using a secondary assembler, such as Newbler or PGA. 3) Determine whether any gaps are closed after reassembly, and either design fakes (consensus of the closed gap) for those that closed or lab experiments for those that require additional data. The software requires as input a genome assembly produced by the Newbler assembler provided by Roche and 454 data containing paired-end reads.

  1. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  2. SPECT quantification: a review of the different correction methods with compton scatter, attenuation and spatial deterioration effects

    International Nuclear Information System (INIS)

    Groiselle, C.; Rocchisani, J.M.; Moretti, J.L.; Dreuille, O. de; Gaillard, J.F.; Bendriem, B.

    1997-01-01

    SPECT quantification: a review of the different correction methods for Compton scatter, attenuation and spatial deterioration effects. The improvement of gamma-cameras and of acquisition and reconstruction software opens new perspectives in terms of image quantification in nuclear medicine. In order to meet the challenge, numerous works have been undertaken in recent years to correct for the different physical phenomena that prevent an exact estimation of the radioactivity distribution. The main phenomena that have to be taken into account are scatter, attenuation and resolution. In this work, the authors present the physical basis of each issue, its consequences on quantification and the main methods proposed to correct for them. (authors)

  3. DNA imaging and quantification using chemi-luminescent probes

    International Nuclear Information System (INIS)

    Dorner, G.; Redjdal, N.; Laniece, P.; Siebert, R.; Tricoire, H.; Valentin, L.

    1999-01-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid nitrogen cooled CCD, the system achieves sensitivities down to 10 fg/mm2 of labelled DNA over a surface area of 25 x 25 cm2 with a sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors)

  4. Quantification of Hydroxyl Radical reactivity in the urban environment using the Comparative Reactivity Method (CRM)

    Science.gov (United States)

    Panchal, Rikesh; Monks, Paul

    2015-04-01

    Hydroxyl (OH) radicals play an important role in 'cleansing' the atmosphere of many pollutants, such as NOx, CH4 and various VOCs, through oxidation. To measure the reactivity of OH, both the sinks and sources of OH need to be quantified, and currently the overall sinks of OH do not seem to be fully constrained. In order to measure the total loss rate of OH in an ambient air sample, all OH-reactive species must be considered and their concentrations and reaction rate coefficients with OH known. Using the method pioneered by Sinha and Williams at the Max Planck Institute Mainz, the Comparative Reactivity Method (CRM), which directly quantifies total OH reactivity in ambient air without the need to consider the concentrations of the individual species within the sample that can react with OH, has been developed and applied in an urban setting. The CRM measures the concentration of a reactive species that is present only at low concentrations in ambient air, in this case pyrrole, flowing through a reaction vessel and detected using Proton Transfer Reaction - Mass Spectrometry (PTR-MS). The poster will show a newly developed and tested PTR-TOF-MS system for CRM. The correction regime to account for the influence of the varying humidity between ambient air and clean air on the pyrrole signal will be detailed. Further, examination of the sensitivity of the PTR-MS as a function of relative humidity and of the H3O+(H2O) (m/z=37) cluster ion allows correction for the humidity variation between the clean humid air entering the reaction vessel and ambient air. NO, present within ambient air, is also a potential interference and can cause recycling of OH, resulting in an overestimation of OH reactivity. Tests have been conducted on the effects of varying NO concentrations on OH reactivity, and a correction factor was determined for application to data when sampling ambient air. Finally, field tests in the urban environment at the University of Leicester will be shown
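
    For reference, the CRM converts the three measured pyrrole levels, C1 (no OH), C2 (OH present, clean air) and C3 (OH present, ambient air), into a total OH reactivity via the commonly used expression R = k(pyrrole+OH) * C1 * (C3 - C2) / (C1 - C3). The sketch below implements that expression with an approximate rate coefficient and illustrative mixing ratios; the humidity and NO corrections described above are applied separately and are not included here.

```python
K_PYR_OH = 1.2e-10  # cm3 molecule-1 s-1, approximate pyrrole + OH rate coefficient

def crm_reactivity(c1, c2, c3, k=K_PYR_OH):
    """Total OH reactivity (s-1) from the standard CRM expression:
    R = k * C1 * (C3 - C2) / (C1 - C3), with C1/C2/C3 in molecules cm-3.
    Humidity and NO-recycling corrections must be applied separately."""
    if c1 <= c3:
        raise ValueError("C1 must exceed C3 for the CRM expression to be valid")
    return k * c1 * (c3 - c2) / (c1 - c3)

ppb = 2.46e10  # molecules cm-3 per ppbv at ~1 atm and 25 C (approximate)
# Illustrative pyrrole levels of 70, 20 and 30 ppbv for C1, C2 and C3
print(f"R_OH ~ {crm_reactivity(70 * ppb, 20 * ppb, 30 * ppb):.1f} s-1")
```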

  5. DNA imaging and quantification using chemi-luminescent probes; Imagerie et quantification d'ADN par chimiluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Dorner, G; Redjdal, N; Laniece, P; Siebert, R; Tricoire, H; Valentin, L [Groupe I.P.B., Experimental Research Division, Inst. de Physique Nucleaire, Paris-11 Univ., 91 - Orsay (France)

    1999-11-01

    During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm² of labelled DNA over a surface area of 25 x 25 cm² with sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratios. Double labelling was performed to verify quantification with radioactive probes. (authors) 1 fig.

  6. Tentacle: distributed quantification of genes in metagenomes.

    Science.gov (United States)

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.

  7. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
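
    As an illustration of the multi-resolution (multilevel Monte Carlo) idea mentioned in this record, the sketch below combines corrections between successive model levels in a telescoping sum. Here sample_level is a hypothetical user-supplied function returning paired fine/coarse evaluations of the quantity of interest; it is not part of any library cited in the record.

```python
import numpy as np

def mlmc_estimate(sample_level, levels, n_samples):
    """Multilevel Monte Carlo estimator (minimal sketch).

    sample_level(l, n): returns n paired samples (fine, coarse) of the quantity
    of interest at level l; at the coarsest level the coarse sample is zero.
    levels, n_samples: matching sequences of level indices and sample counts.
    """
    total = 0.0
    for level, n in zip(levels, n_samples):
        fine, coarse = sample_level(level, n)
        total += np.mean(np.asarray(fine) - np.asarray(coarse))  # level correction
    return total  # telescoping sum approximates the fine-level expectation
```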

  8. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR, which is affected by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  9. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  10. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  11. 'Motion frozen' quantification and display of myocardial perfusion gated SPECT

    International Nuclear Information System (INIS)

    Slomka, P.J.; Hurwitz, G.A.; Baddredine, M.; Baranowski, J.; Aladl, U.E.

    2002-01-01

    Aim: Gated SPECT imaging incorporates both functional and perfusion information of the left ventricle (LV). However, perfusion data are confounded by the effect of ventricular motion. Most existing quantification paradigms simply add all gated frames and then proceed to extract the perfusion information from static images, discarding the effects of cardiac motion. In an attempt to improve the reliability and accuracy of cardiac SPECT quantification we propose to eliminate the LV motion prior to the perfusion quantification via an automated image warping algorithm. Methods: A pilot series of 14 male and 11 female gated stress SPECT images acquired with 8 time bins was co-registered to the coordinates of the 3D normal templates. Subsequently, the LV endo- and epicardial 3D points (300-500) were identified on end-systolic (ES) and end-diastolic (ED) frames, defining the ES-ED motion vectors. A nonlinear image warping algorithm (thin-plate spline) was then applied to warp the end-systolic frame onto the end-diastolic frame using the corresponding ES-ED motion vectors. The remaining 6 intermediate frames were also transformed to the ED coordinates using fractions of the motion vectors. Such warped images were then summed to provide the LV perfusion image in the ED phase but with counts from the full cycle. Results: The identification of the ED/ES corresponding points was successful in all cases. The corrected displacement between ED and ES images was up to 25 mm. The summed images had the appearance of the ED frames but were much less noisy since all the counts had been used. The spatial resolution of such images appeared higher than that of summed gated images, especially in the female scans. These 'motion frozen' images could be displayed and quantified as regular non-gated tomograms including the polar map paradigm. Conclusions: This image processing technique may improve the effective image resolution of summed gated myocardial perfusion images used for
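
    The warping step described in this record can be sketched with an off-the-shelf thin-plate-spline interpolator. The sketch below is an illustrative approximation under stated assumptions (corresponding ED/ES surface points are already available as arrays, intermediate frames are scaled by a phase fraction); it is not the authors' implementation, and evaluating the spline over the full voxel grid may be slow for large volumes.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates

def warp_frame_to_ed(frame, ed_points, es_points, phase_fraction):
    """Warp one gated frame toward the end-diastolic (ED) coordinates.

    ed_points, es_points: (N, 3) corresponding LV surface points on the ED and
    ES frames; phase_fraction scales the ED->ES motion for intermediate frames.
    """
    displacement = (es_points - ed_points) * phase_fraction
    tps = RBFInterpolator(ed_points, displacement, kernel='thin_plate_spline')
    grid = np.indices(frame.shape).reshape(3, -1).T.astype(float)
    coords = (grid + tps(grid)).T            # sample the frame at displaced voxels
    return map_coordinates(frame, coords, order=1).reshape(frame.shape)

# Warped frames from all gates can then be summed to give an ED-phase image
# containing counts from the full cardiac cycle.
```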

  12. Identificação e quantificação de voláteis de café através de cromatografia gasosa de alta resolução / espectrometria de massas empregando um amostrador automático de "headspace"; Identification and quantification of coffee volatile components by high-resolution gas chromatography/mass spectrometry using a headspace automatic sampler

    Directory of Open Access Journals (Sweden)

    Leonardo César AMSTALDEN

    2001-01-01

    Full Text Available Employing an automatic headspace sampler, the headspaces of three commercial brands of ground roasted coffee were qualitatively and quantitatively analyzed for the volatile compounds responsible for the aroma by gas chromatography / mass spectrometry. Since the methodology did not involve aroma isolation or concentration, the natural proportions of the volatiles were maintained, providing a more accurate picture of the flavor composition and simplifying sample preparation. The automatic sampler also allowed good resolution of the chromatographic peaks without cryofocusing the samples at the head of the column during injection, reducing analysis time. Ninety-one compounds were identified, while some known coffee volatiles, such as dimethyl sulphide, methional and furfuryl mercaptan, were not detected. The more concentrated volatiles could be quantified with the aid of two internal standards. The technique proved viable for both the characterization and the quantification of coffee volatiles.

  13. Pore REconstruction and Segmentation (PORES) method for improved porosity quantification of nanoporous materials

    Energy Technology Data Exchange (ETDEWEB)

    Van Eyndhoven, G., E-mail: geert.vaneyndhoven@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Kurttepeli, M. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van Oers, C.J.; Cool, P. [Laboratory of Adsorption and Catalysis, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, NL-1090 GB Amsterdam (Netherlands); Mathematical Institute, Universiteit Leiden, Niels Bohrweg 1, NL-2333 CA Leiden (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-01-15

    Electron tomography is currently a versatile tool to investigate the connection between the structure and properties of nanomaterials. However, a quantitative interpretation of electron tomography results is still far from straightforward. Especially accurate quantification of pore-space is hampered by artifacts introduced in all steps of the processing chain, i.e., acquisition, reconstruction, segmentation and quantification. Furthermore, most common approaches require subjective manual user input. In this paper, the PORES algorithm “POre REconstruction and Segmentation” is introduced; it is a tailor-made, integral approach, for the reconstruction, segmentation, and quantification of porous nanomaterials. The PORES processing chain starts by calculating a reconstruction with a nanoporous-specific reconstruction algorithm: the Simultaneous Update of Pore Pixels by iterative REconstruction and Simple Segmentation algorithm (SUPPRESS). It classifies the interior region to the pores during reconstruction, while reconstructing the remaining region by reducing the error with respect to the acquired electron microscopy data. The SUPPRESS reconstruction can be directly plugged into the remaining processing chain of the PORES algorithm, resulting in accurate individual pore quantification and full sample pore statistics. The proposed approach was extensively validated on both simulated and experimental data, indicating its ability to generate accurate statistics of nanoporous materials. - Highlights: • An electron tomography reconstruction/segmentation method for nanoporous materials. • The method exploits the porous nature of the scanned material. • Validated extensively on both simulation and real data experiments. • Results in increased image resolution and improved porosity quantification.

  14. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between the uptake of GBNs at a single cell level and cell viability status.

  15. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted upon activation of luminescent luciferase. The photons measured with this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, an analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that yields a linear response of the measured light signal to measurement time. We detected the luminescence signal using a lab-made animal light imaging system (ALIS) and two different kinds of light sources. One is a set of three bacterial light-emitting sources containing different numbers of bacteria. The other is a set of three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to the measurement time was constant even when different light sources were applied. The quantification function for linear response could be applicable to the standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment by presenting the linear response of constant light-emitting sources to measurement time
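
    A minimal numerical sketch of the linear-response idea in this record: for a steady source, integrated photon counts should grow linearly with exposure time, so the count-to-time ratio is constant. The measurement values below are hypothetical and only illustrate the check.

```python
import numpy as np

# Hypothetical series: integrated photon counts at several exposure times (steady source).
times_s = np.array([10.0, 20.0, 40.0, 80.0])
counts = np.array([1.1e4, 2.2e4, 4.3e4, 8.6e4])

slope, intercept = np.polyfit(times_s, counts, 1)  # linear-response fit
rate = counts / times_s                            # should be ~constant for a steady source
print(f"photon rate ~ {rate.mean():.0f} counts/s, fitted slope = {slope:.0f} counts/s")
```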

  16. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  17. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data

    International Nuclear Information System (INIS)

    Zeng, Ying-Xu; Mjøs, Svein Are; David, Fabrice P.A.; Schmid, Adrien W.

    2016-01-01

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. - Highlights: • A flexible strategy for analyzing MS and LC-MS data of lipid molecules is proposed. • Isotope distribution spectra of theoretically possible compounds were generated. • High resolution MS and LC-MS data were resolved by least squares spectral resolution. • The method proposed compounds that are likely to occur in the analyzed samples. • The proposed compounds matched results from manual interpretation of fragment spectra.
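
    The least-squares resolution step described above can be illustrated with a non-negative least-squares fit of an observed spectrum against a library of theoretical isotope-distribution patterns. This is a generic sketch under stated assumptions (spectra already binned onto a common m/z grid); it is not the authors' algorithm.

```python
import numpy as np
from scipy.optimize import nnls

def resolve_spectrum(observed, library):
    """Resolve an observed spectrum as a non-negative combination of candidates.

    observed: (n_mz,) intensity vector on a common m/z grid
    library:  (n_mz, n_compounds) theoretical isotope patterns, one per column
    Returns estimated abundances per candidate compound and the residual norm.
    """
    abundances, residual_norm = nnls(np.asarray(library), np.asarray(observed))
    return abundances, residual_norm
```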

  18. Extension of least squares spectral resolution algorithm to high-resolution lipidomics data

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Ying-Xu [Department of Chemistry, University of Bergen, PO Box 7803, N-5020 Bergen (Norway); Mjøs, Svein Are, E-mail: svein.mjos@kj.uib.no [Department of Chemistry, University of Bergen, PO Box 7803, N-5020 Bergen (Norway); David, Fabrice P.A. [Bioinformatics and Biostatistics Core Facility, School of Life Sciences, Ecole Polytechnique Fédérale de Lausanne (EPFL) and Swiss Institute of Bioinformatics (SIB), Lausanne (Switzerland); Schmid, Adrien W. [Proteomics Core Facility, Ecole Polytechnique Fédérale de Lausanne (EPFL), 1015 Lausanne (Switzerland)

    2016-03-31

    Lipidomics, which focuses on the global study of molecular lipids in biological systems, has been driven tremendously by technical advances in mass spectrometry (MS) instrumentation, particularly high-resolution MS. This requires powerful computational tools that handle the high-throughput lipidomics data analysis. To address this issue, a novel computational tool has been developed for the analysis of high-resolution MS data, including the data pretreatment, visualization, automated identification, deconvolution and quantification of lipid species. The algorithm features the customized generation of a lipid compound library and mass spectral library, which covers the major lipid classes such as glycerolipids, glycerophospholipids and sphingolipids. Next, the algorithm performs least squares resolution of spectra and chromatograms based on the theoretical isotope distribution of molecular ions, which enables automated identification and quantification of molecular lipid species. Currently, this methodology supports analysis of both high and low resolution MS as well as liquid chromatography-MS (LC-MS) lipidomics data. The flexibility of the methodology allows it to be expanded to support more lipid classes and more data interpretation functions, making it a promising tool in lipidomic data analysis. - Highlights: • A flexible strategy for analyzing MS and LC-MS data of lipid molecules is proposed. • Isotope distribution spectra of theoretically possible compounds were generated. • High resolution MS and LC-MS data were resolved by least squares spectral resolution. • The method proposed compounds that are likely to occur in the analyzed samples. • The proposed compounds matched results from manual interpretation of fragment spectra.

  19. Determination of uranium in urine - Measurement of isotope ratios and quantification by use of inductively coupled plasma mass spectrometry

    NARCIS (Netherlands)

    Krystek, Petra; Ritsema, R.

    2002-01-01

    For the analysis of uranium in urine, determination of the isotope ratio and quantification were investigated by high-resolution inductively coupled plasma mass spectrometry (HR ICP-MS). The instrument used (ThermoFinniganMAT ELEMENT2) is a single-collector MS and, therefore, a stable sample-introduction

  20. Fluxes of biogenic volatile organic compounds measured and modelled above a Norway spruce forest

    Science.gov (United States)

    Juráň, Stanislav; Fares, Silvano; Pallozzi, Emanuele; Guidolotti, Gabriele; Savi, Flavia; Alivernini, Alessandro; Calfapietra, Carlo; Večeřová, Kristýna; Křůmal, Kamil; Večeřa, Zbyněk; Cudlín, Pavel; Urban, Otmar

    2016-04-01

    Fluxes of biogenic volatile organic compounds (BVOCs) were investigated in a Norway spruce forest at Bílý Kříž in the Beskydy Mountains of the Czech Republic during summer 2014. A proton-transfer-reaction time-of-flight mass spectrometer (PTR-TOF-MS, Ionicon Analytik, Austria) was coupled with an eddy-covariance system. Additionally, an Inverse Lagrangian Transport Model was used to derive fluxes from the concentration gradients of various monoterpenes previously absorbed into n-heptane by a wet effluent diffusion denuder, with subsequent quantification by gas chromatography with mass spectrometric detection. The modelled data cover one day in each of three years with different climatic conditions and previous precipitation patterns. The MEGAN model was run over the whole dataset using the measured basal emission factor to simulate monoterpene fluxes. The highest fluxes measured by eddy covariance were recorded during the noon hours and were dominated by monoterpenes and isoprene. The Inverse Lagrangian Transport Model suggests that the most abundant monoterpene fluxes were α- and β-pinene. Principal component analysis revealed dependencies of individual monoterpene fluxes on air temperature and particularly on global radiation; however, these dependencies were monoterpene specific. Relationships of monoterpene fluxes with CO2 flux and relative air humidity were found to be negative. The MEGAN model, when correlated with the eddy-covariance PTR-TOF-MS measurements, shows particular differences, which will be shown and discussed. Bi-directional fluxes of oxygenated short-chain volatiles (methanol, formaldehyde, acetone, acetaldehyde, formic acid, acetic acid, methyl vinyl ketone, methacrolein, and methyl ethyl ketone) were recorded by PTR-TOF-MS. Volatiles of anthropogenic origin, such as benzene and toluene, were likely transported from the most benzene-polluted region in Europe - the city of Ostrava and the adjacent part of Poland around Katowice, where metallurgical and coal-mining industries are located. Those were accumulated during
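
    The core eddy-covariance calculation mentioned in this record reduces to the covariance between vertical wind speed and the (PTR-TOF-MS) concentration time series over an averaging block. The sketch below assumes synchronized 10 Hz series and omits the usual corrections (coordinate rotation, detrending, lag adjustment, density/WPL); it is an illustration, not the authors' processing chain.

```python
import numpy as np

def eddy_covariance_flux(w, c, sampling_hz=10.0, block_minutes=30):
    """Eddy-covariance flux as cov(w', c') over one averaging block.

    w: vertical wind speed (m s-1), c: scalar concentration (e.g. nmol m-3),
    both sampled at sampling_hz and already synchronized.
    Returns the flux in units of c times m s-1 (e.g. nmol m-2 s-1).
    """
    n = int(sampling_hz * 60 * block_minutes)
    w_block, c_block = np.asarray(w[:n]), np.asarray(c[:n])
    w_prime = w_block - w_block.mean()   # fluctuations about the block mean
    c_prime = c_block - c_block.mean()
    return np.mean(w_prime * c_prime)
```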

  1. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  2. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  3. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  4. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease's syndromes in regard to this crop. The result of these ..... parison of treatments such as cultivars or control measures and ..... Vascular discoloration and stem necrosis. 2.

  5. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques that employ matrix factorizations incur a cubic cost

  6. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
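
    Step (i) of the pipeline described above (local enhancement of filaments by Hessian filtering) can be approximated with an off-the-shelf Hessian-eigenvalue tubeness filter from scikit-image; the tensor-voting and delineation steps are not covered by this sketch, and the scale values are illustrative assumptions.

```python
import numpy as np
from skimage.filters import sato

def enhance_filaments(volume, scales=(1, 2, 3)):
    """Enhance bright, tube-like (filamentous) structures in a 2D or 3D image.

    volume: ndarray of tomogram intensities
    scales: Gaussian scales (in voxels) at which the Hessian is evaluated
    Returns a filament-enhanced image of the same shape.
    """
    vol = np.asarray(volume, dtype=float)
    # Sato tubeness filter: responds to bright ridges when black_ridges=False.
    return sato(vol, sigmas=scales, black_ridges=False)
```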

  8. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For the modeling we used the TAU code, developed at DLR, Germany.

  9. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

    Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods that use empirical calibration against a standard to determine a calibration coefficient and thus quantify the activity of nuclear materials are of no use for non-reproducible, complex, one-of-a-kind nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another, leading to a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on the operator's background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment or knowledge of the internal package configuration. This method combines a global stochastic approach which uses, among others, surrogate models to simulate the gamma attenuation behaviour, a Bayesian approach which considers conditional probability densities of the problem inputs, and Markov chain Monte Carlo (MCMC) algorithms which solve the inverse problem, with the gamma-ray emission spectrum of the radionuclides and the outer dimensions of the objects of interest. The methodology is being tested to quantify actinide activity in different kinds of matrices, compositions and source configurations that are standard in terms of actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
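
    The Bayesian/MCMC ingredient of this methodology can be illustrated with a generic random-walk Metropolis sampler over the uncertain package parameters. Everything in the sketch is a stand-in under stated assumptions: log_posterior is a hypothetical function combining a surrogate gamma-attenuation forward model and the priors; this is not the CEA implementation.

```python
import numpy as np

def metropolis_hastings(log_posterior, x0, n_steps=10000, step=0.1, rng=None):
    """Random-walk Metropolis sampler over uncertain inputs (minimal sketch).

    log_posterior(x): log of prior times likelihood for parameter vector x
    x0: starting parameter vector; step: proposal standard deviation
    Returns the chain of sampled parameter vectors.
    """
    rng = rng or np.random.default_rng()
    x = np.asarray(x0, dtype=float)
    lp = log_posterior(x)
    chain = []
    for _ in range(n_steps):
        proposal = x + step * rng.standard_normal(x.shape)
        lp_new = log_posterior(proposal)
        if np.log(rng.random()) < lp_new - lp:   # accept with Metropolis probability
            x, lp = proposal, lp_new
        chain.append(x.copy())
    return np.array(chain)
```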

  10. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  11. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  12. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  13. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important to assess TB evolution and treatment and to compare different treatments. However, precise quantification is not feasible given the number of CT scans that would be required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm for the computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)

  14. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Science.gov (United States)

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  15. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Directory of Open Access Journals (Sweden)

    Pramod K Avti

    Full Text Available In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  16. Super-resolution

    DEFF Research Database (Denmark)

    Nasrollahi, Kamal; Moeslund, Thomas B.

    2014-01-01

    Super-resolution, the process of obtaining one or more high-resolution images from one or more low-resolution observations, has been a very attractive research topic over the last two decades. It has found practical applications in many real world problems in different fields, from satellite...

  17. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Full Text Available Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction, extrapolation and forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience - often personal - of the researcher, and this scholarly endeavour is, therefore, not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourist industry.

  18. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well
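
    A schematic of the parameter-selection idea in this record: with the theoretical bias and variance expressions in hand, the ensemble mean-squared error can be evaluated over candidate regularization parameters and the minimizer selected. Here bias_fn and var_fn are hypothetical stand-ins for the paper's closed-form expressions, not functions provided by the authors.

```python
import numpy as np

def select_regularization(betas, bias_fn, var_fn):
    """Pick the regularization parameter minimizing the ensemble MSE.

    EMSE(beta) = bias(beta)**2 + variance(beta) for the ROI activity estimate;
    betas is an iterable of candidate regularization parameters.
    """
    betas = np.asarray(list(betas), dtype=float)
    emse = np.array([bias_fn(b) ** 2 + var_fn(b) for b in betas])
    return betas[int(np.argmin(emse))], emse
```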

  19. Bone histomorphometric quantification by X-ray phase contrast and transmission 3D SR microcomputed tomography

    International Nuclear Information System (INIS)

    Nogueira, L.P.; Pinheiro, C.J.G.; Braz, D.; Oliveira, L.F.; Barroso, R.C.

    2008-01-01

    Full text: Conventional histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed tomography is a noninvasive technique which can be used to evaluate histomorphometric indices. In this technique, the output 3D images are used to quantify the whole sample, in contrast to the conventional approach, in which quantification is performed on 2D slices and extrapolated to the 3D case. In the search for better resolution and visualization of soft tissues, the X-ray phase contrast imaging technique was developed. The objective of this work was to perform histomorphometric quantification of human cancellous bone using 3D synchrotron X-ray computed microtomography with two distinct techniques, transmission and phase contrast, in order to compare the results and evaluate the viability of applying the same quantification methodology to both techniques. All experiments were performed at the ELETTRA Synchrotron Light Laboratory in Trieste (Italy). MicroCT data sets were collected using the CT set-up on the SYRMEP (Synchrotron Radiation for Medical Physics) beamline. Results showed that there is a better correlation between the histomorphometric parameters of the two techniques when morphological filters are used. However, when these filters are used, some important information given by phase contrast is lost and must be explored with new quantification techniques

  20. ANL high resolution injector

    International Nuclear Information System (INIS)

    Minehara, E.; Kutschera, W.; Hartog, P.D.; Billquist, P.

    1985-01-01

    The ANL (Argonne National Laboratory) high-resolution injector has been installed to obtain higher mass resolution and higher preacceleration, and to utilize effectively the full mass range of ATLAS (Argonne Tandem Linac Accelerator System). Preliminary results of the first beam test are reported briefly. The design and performance, in particular a high-mass-resolution magnet with aberration compensation, are discussed. 7 refs., 5 figs., 2 tabs

  1. Serendipity: Global Detection and Quantification of Plant Stress

    Science.gov (United States)

    Schimel, D.; Verma, M.; Drewry, D.

    2016-12-01

    Detecting and quantifying plant stress is a grand challenge for remote sensing, and is important both for understanding climate impacts on ecosystems broadly and for early-warning systems supporting food security. The long record from moderate-resolution sensors providing frequent data has allowed phenology to be used to detect stress in forest and agroecosystems, but this approach can fail or give ambiguous results when stress occurs during later phases of growth and in high-leaf-area systems. The recent recognition that greenhouse gas satellites such as GOSAT and OCO-2 observe Solar-Induced Fluorescence (SIF) has added a new and complementary tool for the quantification of stress, but algorithms to detect and quantify stress using SIF are in their infancy. Here we report new results showing a more complex response of SIF to stress, obtained by evaluating spaceborne SIF against in situ eddy covariance data. The response observed is as predicted by theory, and shows that SIF, used in conjunction with moderate-resolution remote sensing, can detect and likely quantify stress by indexing the nonlinear part of the SIF-GPP relationship using the photochemical reflectance index and remotely observed light absorption. There are several exciting opportunities on the near horizon for the implementation of SIF, together with synergistic measurements such as PRI and evapotranspiration, which suggest the next few years will be a golden age for global ecology. Advancing the science and associated algorithms now is essential to fully exploiting the next wave of missions.

  2. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low-frequency resonant modes, density functional theory (DFT) was adopted for the theoretical calculations. It was found that the collective THz-frequency motions were determined by the intramolecular and intermolecular hydrogen-bond interactions. Moreover, the quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectral fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)

  3. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  4. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  5. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
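
    As a rough illustration of colour thresholding for objective quantification, the sketch below measures the area percentage of a stain by thresholding hue and saturation in HSV space using scikit-image. The hue window and saturation cut-off are hypothetical and stain-specific, and the input image is a placeholder rather than data from the paper.

        # Minimal sketch: colour thresholding of an RGB micrograph in HSV space to
        # measure the area percentage of a (here, reddish-brown) reaction product.
        # The hue window and saturation cut-off are illustrative and stain-specific.
        import numpy as np
        from skimage import color

        def stained_area_percent(rgb, hue_window=(0.02, 0.12), min_saturation=0.3):
            hsv = color.rgb2hsv(rgb)
            mask = ((hsv[..., 0] >= hue_window[0]) & (hsv[..., 0] <= hue_window[1])
                    & (hsv[..., 1] >= min_saturation))
            return 100.0 * mask.mean()

        rgb = np.random.rand(512, 512, 3)   # placeholder; load a real micrograph in practice
        print(stained_area_percent(rgb))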

  6. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor that exists when the control parameter is zero.
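
    For readers unfamiliar with Recurrence Quantification Analysis, the sketch below computes its simplest measure, the recurrence rate, from a time-delay-embedded scalar series using NumPy. The embedding parameters and threshold are illustrative only; measures such as determinism, which the transition detection typically relies on, would additionally require diagonal-line statistics.

        # Minimal sketch of a recurrence-rate computation for a scalar time series,
        # using time-delay embedding; all parameters are illustrative only.
        import numpy as np

        def recurrence_rate(x, dim=3, delay=1, eps=None):
            """Recurrence rate of a time series after time-delay embedding."""
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            if eps is None:
                eps = 0.1 * dists.max()        # 10% of the maximum phase-space diameter
            recurrence_matrix = dists < eps
            return recurrence_matrix.mean()

        # Example: a periodic signal recurs far more often than white noise.
        t = np.linspace(0, 20 * np.pi, 600)
        print(recurrence_rate(np.sin(t)), recurrence_rate(np.random.randn(600)))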

  7. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  8. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  10. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be quantified spectrally using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
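
    A minimal sketch of the kind of single-band calibration implied by the study (reflectance at 695 nm regressed against THC content) is given below; the reflectance and THC values are hypothetical placeholders, not data from the paper, and the full study uses stepwise multivariate regression rather than a single predictor.

        # Minimal sketch: single-band linear calibration of THC content against
        # reflectance at 695 nm. All values are hypothetical placeholders.
        import numpy as np

        reflectance_695 = np.array([0.31, 0.28, 0.24, 0.21, 0.18, 0.15])  # measured reflectance
        thc_percent = np.array([0.4, 0.9, 1.6, 2.2, 2.9, 3.5])            # reference THC content (wt%)

        slope, intercept = np.polyfit(reflectance_695, thc_percent, deg=1)
        predicted = slope * reflectance_695 + intercept
        ss_res = np.sum((thc_percent - predicted) ** 2)
        ss_tot = np.sum((thc_percent - thc_percent.mean()) ** 2)
        print(f"THC% = {slope:.2f} * R695 + {intercept:.2f}  (R^2 = {1 - ss_res / ss_tot:.3f})")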

  11. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  12. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  13. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  14. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  15. Quantification and localization of mast cells in periapical lesions.

    Science.gov (United States)

    Mahita, V N; Manjunatha, B S; Shah, R; Astekar, M; Purohit, S; Kovvuru, S

    2015-01-01

    Periapical lesions occur in response to chronic irritation of periapical tissue, generally resulting from an infected root canal. The specific etiological agents of induction, the participating cell populations and the growth factors associated with maintenance and resolution of periapical lesions are incompletely understood. Among the cells found in periapical lesions, mast cells have been implicated in the inflammatory mechanism, but their numbers and possible role in periapical granulomas and radicular cysts remain unclear. This study therefore examines the localization and quantification of mast cells in periapical granuloma and radicular cyst. A total of 30 previously diagnosed cases from the department of oral pathology, 15 periapical granulomas and 15 radicular cysts, were selected for the study together with their case details. The gender distribution was 8 males (53.3%) and 7 females (46.7%) in the periapical granuloma cases and 10 males (66.7%) and 5 females (33.3%) in the radicular cyst cases. The statistical analysis used was the unpaired t-test. The mean mast cell counts in the subepithelial and deeper connective tissue of periapical granulomas were 12.40 (0.99%) and 7.13 (0.83%), respectively. The mean mast cell counts in the subepithelial and deeper connective tissue of radicular cysts were 17.64 (1.59%) and 12.06 (1.33%), respectively; the difference was statistically significant. No statistically significant difference was noted between males and females. Mast cells were more numerous in radicular cysts. Based on the concept that mast cells play a critical role in the induction of inflammation, it is logical to use therapeutic agents that alter mast cell function and secretion to thwart inflammation at its earliest phases. These findings suggest a possible role of mast cells in the pathogenesis of periapical lesions.

  16. The impact of reconstruction method on the quantification of DaTSCAN images

    Energy Technology Data Exchange (ETDEWEB)

    Dickson, John C.; Erlandsson, Kjell; Hutton, Brian F. [UCLH NHS Foundation Trust and University College London, Institute of Nuclear Medicine, London (United Kingdom); Tossici-Bolt, Livia [Southampton University Hospitals NHS Trust, Department of Medical Physics, Southampton (United Kingdom); Sera, Terez [University of Szeged, Department of Nuclear Medicine and Euromedic Szeged, Szeged (Hungary); Varrone, Andrea [Psychiatry Section and Stockholm Brain Institute, Karolinska Institute, Department of Clinical Neuroscience, Stockholm (Sweden); Tatsch, Klaus [EANM/European Network of Excellence for Brain Imaging, Vienna (Austria)

    2010-01-15

    Reconstruction of DaTSCAN brain studies using OS-EM iterative reconstruction offers better image quality and more accurate quantification than filtered back-projection. However, reconstruction must proceed for a sufficient number of iterations to achieve stable and accurate data. This study assessed the impact of the number of iterations on the image quantification, comparing the results of the iterative reconstruction with filtered back-projection data. A striatal phantom filled with {sup 123}I using striatal to background ratios between 2:1 and 10:1 was imaged on five different gamma camera systems. Data from each system were reconstructed using OS-EM (which included depth-independent resolution recovery) with various combinations of iterations and subsets to achieve up to 200 EM-equivalent iterations and with filtered back-projection. Using volume of interest analysis, the relationships between image reconstruction strategy and quantification of striatal uptake were assessed. For phantom filling ratios of 5:1 or less, significant convergence of measured ratios occurred close to 100 EM-equivalent iterations, whereas for higher filling ratios, measured uptake ratios did not display a convergence pattern. Assessment of the count concentrations used to derive the measured uptake ratio showed that nonconvergence of low background count concentrations caused peaking in higher measured uptake ratios. Compared to filtered back-projection, OS-EM displayed larger uptake ratios because of the resolution recovery applied in the iterative algorithm. The number of EM-equivalent iterations used in OS-EM reconstruction influences the quantification of DaTSCAN studies because of incomplete convergence and possible bias in areas of low activity due to the nonnegativity constraint in OS-EM reconstruction. Nevertheless, OS-EM using 100 EM-equivalent iterations provides the best linear discriminatory measure to quantify the uptake in DaTSCAN studies. (orig.)
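
    As an illustration of the volume-of-interest quantification referred to above, the sketch below computes a background-corrected striatal uptake ratio from mean VOI count concentrations. The numbers are hypothetical and the study's exact VOI definitions and ratio convention may differ; this is only the arithmetic step.

        # Minimal sketch: background-corrected striatal uptake ratio from mean
        # volume-of-interest count concentrations (counts per voxel).
        # The numbers are hypothetical; the study's exact convention may differ.
        def uptake_ratio(striatal_mean, background_mean):
            return (striatal_mean - background_mean) / background_mean

        print(uptake_ratio(striatal_mean=52.0, background_mean=10.4))  # -> 4.0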

  17. High-resolution investigations of edge effects in neutron imaging

    International Nuclear Information System (INIS)

    Strobl, M.; Kardjilov, N.; Hilger, A.; Kuehne, G.; Frei, G.; Manke, I.

    2009-01-01

    Edge enhancement is the main effect measured by the so-called inline or propagation-based neutron phase contrast imaging method. The effect was originally explained by diffraction, and high spatial coherence has been claimed to be a necessary precondition. However, edge enhancement has also been found in conventional imaging with high resolution. In such cases the effects can produce artefacts and hinder quantification. In this letter the edge effects at cylindrically shaped samples and long straight edges are studied in detail. The enhancement can be explained by refraction and total reflection. Using high-resolution imaging, where spatial resolutions better than 50 μm could be achieved, refraction and total reflection peaks - similar to diffraction patterns - could be separated and distinguished.

  18. Bank Resolution in Europe

    DEFF Research Database (Denmark)

    N. Gordon, Jeffery; Ringe, Georg

    2015-01-01

    Bank resolution is a key pillar of the European Banking Union. This column argues that the current structure of large EU banks is not conducive to an effective and unbiased resolution procedure. The authors would require systemically important banks to reorganise into a ‘holding company’ structure, where the parent company holds unsecured term debt sufficient to cover losses at its operating financial subsidiaries. This would facilitate a ‘single point of entry’ resolution procedure, minimising the risk of creditor runs and destructive ring-fencing by national regulators.

  19. A novel quantification method of pantaprazole sodium monohydrate in sesquihydrate by thermogravimetric analyzer.

    Science.gov (United States)

    Reddy, V Ranga; Rajmohan, M Anantha; Shilpa, R Laxmi; Raut, Dilip M; Naveenkumar, Kolla; Suryanarayana, M V; Mathad, Vijayavitthal T

    2007-04-11

    Studies were conducted to demonstrate the applicability of the thermogravimetric analyzer as a tool for the quantification of pantaprazole sodium monohydrate in the sesquihydrate. Thermal (DSC, TGA), crystallographic (PXRD) and spectroscopic (FT-IR) techniques were used for the characterization of the polymorphs. Thermogravimetric analysis (TGA) was explored with high-resolution dynamic (Hi-Res-dynamic) and high-resolution modulated (Hi-Res-modulated) test procedures to quantify the hydrate polymorphic mixtures. The two polymorphic forms exhibited significant differences and good resolution in the second derivative thermogram generated by the Hi-Res-modulated test procedure. Thus, TGA with the Hi-Res-modulated test procedure was chosen for the quantification of the monohydrate in the sesquihydrate. The calibration plot was constructed from known mixtures of the two polymorphs by plotting the peak area of the second derivative thermogram against the weight percent of monohydrate. Using this novel approach, a 1 wt% limit of detection (LOD) was achieved. The polymorphic purity results obtained by TGA with the Hi-Res-modulated test procedure were in good agreement with the results predicted by FT-IR and comparable with the actual values of the known polymorphic mixtures. The Hi-Res-modulated TGA technique is very simple, and the analysis is easy to perform.

  20. Feasibility Study for Applicability of the Wavelet Transform to Code Accuracy Quantification

    International Nuclear Information System (INIS)

    Kim, Jong Rok; Choi, Ki Yong

    2012-01-01

    One purpose of the assessment process for large thermal-hydraulic system codes is to verify their quality by comparing code predictions against experimental data. This process is essential for reliable safety analysis of nuclear power plants. Extensive experimental programs have been conducted in order to support the development and validation activities of best estimate thermal-hydraulic codes. So far, the Fast Fourier Transform Based Method (FFTBM) has been widely used for quantification of prediction accuracy, despite its limitation that it provides no time resolution for local events. As alternative options, several time-windowing methods (running average, short-time Fourier transform, etc.) can be utilized, but such methods also have the limitation of a fixed resolution. This limitation can be overcome by a wavelet transform, because the resolution of the wavelet transform effectively varies in the time-frequency plane depending on the choice of basis functions, which are not necessarily sinusoidal. In this study, the feasibility of a new code accuracy quantification methodology using the wavelet transform is pursued.
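
    A minimal sketch of a wavelet-based accuracy figure is given below: the continuous wavelet transform of the code-minus-experiment error is compared, scale by scale, with that of the experimental signal, loosely mirroring the FFTBM average-amplitude idea while retaining time-frequency localisation. This is an illustration under simple assumptions using the PyWavelets package, not the methodology of the study, and both signals are synthetic.

        # Minimal sketch: localise code-vs-experiment discrepancies in the
        # time-frequency plane with a continuous wavelet transform (PyWavelets).
        # The two signals are hypothetical stand-ins for an experimental trace and
        # a thermal-hydraulic code prediction of the same quantity.
        import numpy as np
        import pywt

        t = np.linspace(0.0, 100.0, 2000)                            # s
        experiment = np.sin(0.5 * t) + 0.2 * np.sin(4.0 * t) * (t > 60)
        prediction = np.sin(0.5 * t)                                 # code misses the late fast transient

        error = prediction - experiment
        scales = np.arange(1, 128)
        coef_err, _ = pywt.cwt(error, scales, "morl", sampling_period=t[1] - t[0])
        coef_exp, _ = pywt.cwt(experiment, scales, "morl", sampling_period=t[1] - t[0])

        # Illustrative per-scale accuracy figure: mean absolute wavelet coefficient
        # of the error, normalised by that of the experiment (smaller is better).
        aa_per_scale = np.abs(coef_err).mean(axis=1) / np.abs(coef_exp).mean(axis=1)
        print(aa_per_scale.max())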

  1. High Resolution Elevation Contours

    Data.gov (United States)

    Minnesota Department of Natural Resources — This dataset contains contours generated from high resolution data sources such as LiDAR. Generally speaking, this data has a contour interval of 2 feet or less.

  2. Ultra high resolution tomography

    Energy Technology Data Exchange (ETDEWEB)

    Haddad, W.S.

    1994-11-15

    Recent work and results on ultra high resolution three dimensional imaging with soft x-rays will be presented. This work is aimed at determining microscopic three dimensional structure of biological and material specimens. Three dimensional reconstructed images of a microscopic test object will be presented; the reconstruction has a resolution on the order of 1000 Å in all three dimensions. Preliminary work with biological samples will also be shown, and the experimental and numerical methods used will be discussed.

  3. High resolution positron tomography

    International Nuclear Information System (INIS)

    Brownell, G.L.; Burnham, C.A.

    1982-01-01

    The limits of spatial resolution in practical positron tomography are examined. The four factors that limit spatial resolution are: positron range; small angle deviation; detector dimensions and properties; statistics. Of these factors, positron range may be considered the fundamental physical limitation since it is independent of instrument properties. The other factors are to a greater or lesser extent dependent on the design of the tomograph

  4. Scalable Resolution Display Walls

    KAUST Repository

    Leigh, Jason; Johnson, Andrew; Renambot, Luc; Peterka, Tom; Jeong, Byungil; Sandin, Daniel J.; Talandis, Jonas; Jagodic, Ratko; Nam, Sungwon; Hur, Hyejung; Sun, Yiwen

    2013-01-01

    This article will describe the progress since 2000 on research and development in 2-D and 3-D scalable resolution display walls that are built from tiling individual lower resolution flat panel displays. The article will describe approaches and trends in display hardware construction, middleware architecture, and user-interaction design. The article will also highlight examples of use cases and the benefits the technology has brought to their respective disciplines. © 1963-2012 IEEE.

  5. Joint optimization of collimator and reconstruction parameters in SPECT imaging for lesion quantification

    International Nuclear Information System (INIS)

    McQuaid, Sarah J; Southekal, Sudeepti; Kijewski, Marie Foley; Moore, Stephen C

    2011-01-01

    Obtaining the best possible task performance using reconstructed SPECT images requires optimization of both the collimator and reconstruction parameters. The goal of this study is to determine how to perform this optimization, namely whether the collimator parameters can be optimized solely from projection data, or whether reconstruction parameters should also be considered. In order to answer this question, and to determine the optimal collimation, a digital phantom representing a human torso with 16 mm diameter hot lesions (activity ratio 8:1) was generated and used to simulate clinical SPECT studies with parallel-hole collimation. Two approaches to optimizing the SPECT system were then compared in a lesion quantification task: sequential optimization, where collimation was optimized on projection data using the Cramer–Rao bound, and joint optimization, which simultaneously optimized collimator and reconstruction parameters. For every condition, quantification performance in reconstructed images was evaluated using the root-mean-squared-error of 400 estimates of lesion activity. Compared to the joint-optimization approach, the sequential-optimization approach favoured a poorer resolution collimator, which, under some conditions, resulted in sub-optimal estimation performance. This implies that inclusion of the reconstruction parameters in the optimization procedure is important in obtaining the best possible task performance; in this study, this was achieved with a collimator resolution similar to that of a general-purpose (LEGP) collimator. This collimator was found to outperform the more commonly used high-resolution (LEHR) collimator, in agreement with other task-based studies, using both quantification and detection tasks.
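
    The estimation-performance figure used above, the root-mean-squared error over repeated estimates of lesion activity, is straightforward to compute; a minimal sketch with hypothetical numbers follows (the true activity, estimate spread and count of 400 replicates mirror the description only illustratively).

        # Minimal sketch: root-mean-squared error of repeated lesion-activity
        # estimates against the known true activity (all numbers are hypothetical).
        import numpy as np

        def rmse(estimates, true_value):
            estimates = np.asarray(estimates, dtype=float)
            return np.sqrt(np.mean((estimates - true_value) ** 2))

        estimates = np.random.normal(loc=7.6, scale=1.1, size=400)  # 400 repeated estimates
        print(rmse(estimates, true_value=8.0))                      # captures both bias and variance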

  6. Resolution 1540 (2004) overview

    International Nuclear Information System (INIS)

    Kasprzyk, N.

    2013-01-01

    This series of slides presents Resolution 1540, its features and its status of implementation. Resolution 1540 is a response to the risk that non-State actors may acquire, develop or traffic in weapons of mass destruction and their means of delivery. It was adopted unanimously by the U.N. Security Council on 28 April 2004. Resolution 1540 deals with the three kinds of weapons of mass destruction (nuclear, chemical and biological weapons) as well as 'related materials'. The resolution implies three sets of obligations: first, to provide no support to non-State actors concerning weapons of mass destruction; secondly, to adopt national laws that prohibit non-State actors from dealing with weapons of mass destruction; and thirdly, to enforce domestic controls to prevent the proliferation of nuclear, chemical or biological weapons and their means of delivery. Four working groups operated by the 1540 Committee have been established: Implementation (coordinator: Germany); Assistance (coordinator: France); International cooperation (interim coordinator: South Africa); and Transparency and media outreach (coordinator: USA). Although the status of implementation of the resolution has continued to improve since 2004, much work remains to be done and the gravity of the threat remains considerable. (A.C.)

  7. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets, as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional, complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  8. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
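
    A minimal sketch of ensemble-based error estimation is shown below: the binding free energy is reported as the mean over independent replicas with a bootstrap 95% interval. The per-replica values are hypothetical, and the procedure only illustrates the general idea of deriving uncertainty from replica spread, not the authors' protocol.

        # Minimal sketch: ensemble-based uncertainty for a binding free energy,
        # using the mean over independent replicas and a bootstrap 95% interval.
        # The per-replica values (kcal/mol) are hypothetical.
        import numpy as np

        replica_dG = np.array([-7.9, -8.4, -7.2, -8.8, -7.6, -8.1, -7.4, -8.6])

        rng = np.random.default_rng(0)
        boot_means = [rng.choice(replica_dG, size=replica_dG.size, replace=True).mean()
                      for _ in range(10_000)]
        low, high = np.percentile(boot_means, [2.5, 97.5])
        print(f"dG = {replica_dG.mean():.2f} kcal/mol, 95% CI [{low:.2f}, {high:.2f}]")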

  9. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  10. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of the quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. Ex-vessel debris coolability is evaluated through an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event is considered. The DET headings, selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel, are also discussed.

  11. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  12. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background: There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities...

  13. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for the optimization of acquisition parameters. In this study an image gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact extent is then quantified, as an image-area percentage, by an automated cross-entropy thresholding method. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
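
    A minimal sketch of the gradient-plus-threshold idea is given below, using the Sobel gradient magnitude and Li's minimum cross-entropy threshold from scikit-image and reporting the artifact extent as an image-area percentage. The input slice is a placeholder, and details such as masking the implant itself or matching the paper's exact thresholding scheme are omitted.

        # Minimal sketch of gradient-based artifact quantification: gradient magnitude
        # -> automated cross-entropy (Li) threshold -> artifact extent as % image area.
        # 'slice_image' is a hypothetical 2D MR magnitude image (float array).
        import numpy as np
        from skimage import filters

        def artifact_area_percent(slice_image):
            grad = filters.sobel(slice_image)      # gradient magnitude captures abrupt signal changes
            thresh = filters.threshold_li(grad)    # minimum cross-entropy threshold
            artifact_mask = grad > thresh
            return 100.0 * artifact_mask.mean()

        slice_image = np.random.rand(256, 256)     # placeholder; use a phantom slice in practice
        print(artifact_area_percent(slice_image))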

  14. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  15. Improved radioimmunoassay for urinary Tamm-Horsfall glycoprotein. Investigation and resolution of factors affecting its quantification

    Energy Technology Data Exchange (ETDEWEB)

    Dawney, A B.St.J.; Thornley, C; Cattell, W R [Saint Bartholomew' s Hospital, London (UK)

    1982-09-15

    A rapid, specific radioimmunoassay has been used to measure Tamm-Horsfall glycoprotein (TH glycoprotein) in urine, and the method is described. The apparent concentration increased with increasing dilution of urine in water, reaching a plateau at 1 in 20. This increase was greater the higher the osmolality and TH glycoprotein concentration and the lower the pH of the original sample. The apparent concentration of TH glycoprotein in neat or diluted urine was not affected by freezing or by storage at 4°C or room temperature for at least 2 days. A physiological range for the urinary excretion rate was established as 22-56 mg/24 h (considerably higher than the amount present in serum), based on samples from 29 individuals with normal renal function, as defined by their creatinine clearance. There was no significant correlation between serum concentrations of TH glycoprotein and its urinary excretion rate, nor between urinary excretion rate and creatinine clearance.

  16. Quantification of climate change effects on extreme precipitation used for high resolution hydrologic design

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten

    2012-01-01

    are studied, all based on output from historical rain series of the present climate and output from Regional Climate Models. Two models are applied, one being based on an extreme value model, the Partial Duration Series Approach, and the other based on a stochastic rainfall generator model. Finally...

  17. Method for resolution and quantification of components of the non-photochemical quenching (qN)

    Czech Academy of Sciences Publication Activity Database

    Roháček, Karel

    2010-01-01

    Roč. 105, č. 2 (2010), s. 101-113 ISSN 0166-8595 R&D Projects: GA AV ČR IAA600960716 Institutional research plan: CEZ:AV0Z50510513 Keywords : Dark relaxation * Fluorescence quenching * Multi-exponential regression Subject RIV: BO - Biophysics Impact factor: 2.410, year: 2010

  18. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR.

    Science.gov (United States)

    Daems, Devin; Peeters, Bernd; Delport, Filip; Remans, Tony; Lammertyn, Jeroen; Spasic, Dragana

    2017-07-31

    Abstract : Accurate identification and quantification of allergens is key in healthcare, biotechnology and food quality and safety. Celery ( Apium graveolens ) is one of the most important elicitors of food allergic reactions in Europe. Currently, the golden standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM-0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd ). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement ( R ² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen induced healthcare problems.

  20. Histomorphometric quantification of human pathological bones from synchrotron radiation 3D computed microtomography

    International Nuclear Information System (INIS)

    Nogueira, Liebert P.; Braz, Delson

    2011-01-01

    Conventional bone histomorphometry is an important method for quantitative evaluation of bone microstructure. X-ray computed microtomography is a noninvasive technique, which can be used to evaluate histomorphometric indices in trabecular bone (BV/TV, BS/BV, Tb.N, Tb.Th, Tb.Sp). In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional approach, in which quantification is performed on 2D slices and extrapolated to the 3D case. In this work, histomorphometric quantification using synchrotron 3D X-ray computed microtomography was performed on pathological samples of human bone. Samples of human bones were cut into small blocks (8 mm x 8 mm x 10 mm) with a precision saw and then imaged. The computed microtomographies were obtained at the SYRMEP (Synchrotron Radiation for MEdical Physics) beamline, at the ELETTRA synchrotron radiation facility (Italy). The obtained 3D images yielded excellent resolution and details of intra-trabecular bone structures, including marrow present inside trabeculae. Histomorphometric quantification was compared to the literature as well. (author)
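
    For orientation, the sketch below shows how the two simplest 3D histomorphometric indices, BV/TV and BS/BV, can be computed directly from a binarised microCT volume (voxel counting plus a marching-cubes surface, via NumPy and scikit-image). The toy volume and voxel size are hypothetical, and no claim is made that this matches the authors' quantification software.

        # Minimal sketch: direct 3D histomorphometry on a binarised microCT volume.
        # BV/TV is the bone volume fraction; BS/BV uses a marching-cubes surface.
        # 'bone' is a hypothetical boolean array (True = bone voxel), isotropic voxels.
        import numpy as np
        from skimage import measure

        def bv_tv(bone):
            return bone.mean()                                   # bone voxels / total voxels

        def bs_bv(bone, voxel_size_mm):
            verts, faces, _, _ = measure.marching_cubes(bone.astype(np.uint8), level=0.5,
                                                        spacing=(voxel_size_mm,) * 3)
            surface = measure.mesh_surface_area(verts, faces)    # mm^2
            volume = bone.sum() * voxel_size_mm ** 3             # mm^3
            return surface / volume                              # mm^-1

        bone = np.zeros((64, 64, 64), dtype=bool)
        bone[16:48, 16:48, 16:48] = True                         # toy 'trabecula'
        print(bv_tv(bone), bs_bv(bone, voxel_size_mm=0.009))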

  1. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are therefore limited to counting a low number of foci per cell (5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed high-throughput instrumentation and a computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside the nuclei was imaged in 3D with submicron resolution using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to the 2D techniques.
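
    A minimal sketch of extended-maxima-based focus counting is given below (Gaussian smoothing, h-maxima transform, connected-component labelling, using scikit-image and SciPy). The smoothing sigma, maxima depth h and synthetic test volume are illustrative only and are not the authors' pipeline.

        # Minimal sketch: count foci in a 3D nucleus image with an extended-maxima
        # (h-maxima) transform followed by connected-component labelling.
        # 'nucleus' is a hypothetical 3D intensity array restricted to one nucleus.
        import numpy as np
        from scipy import ndimage
        from skimage import filters, morphology

        def count_foci(nucleus, h=0.1):
            smoothed = filters.gaussian(nucleus, sigma=1)   # suppress shot noise
            maxima = morphology.h_maxima(smoothed, h)       # extended maxima of depth >= h
            _, n_foci = ndimage.label(maxima)
            return n_foci

        rng = np.random.default_rng(1)
        nucleus = rng.random((32, 64, 64)) * 0.05            # dim background
        for z, y, x in rng.integers([4, 8, 8], [28, 56, 56], size=(12, 3)):
            nucleus[z-1:z+2, y-2:y+3, x-2:x+3] += 1.0        # 12 synthetic foci
        print(count_foci(nucleus))                           # expected ~12 (close foci may merge)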

  2. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals

    Science.gov (United States)

    Crooks, Kevin R.; Burdett, Christopher L.; Theobald, David M.; King, Sarah R. B.; Rondinini, Carlo; Boitani, Luigi

    2017-01-01

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world’s terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world’s terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation. PMID:28673992

  3. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    International Nuclear Information System (INIS)

    Wagner, Ryan; Raman, Arvind; Moon, Robert; Pratt, Jon; Shaw, Gordon

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7–20 GPa. A key result is that multiple replicates of force–distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.
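
    A minimal sketch of how such calibration uncertainties can be propagated by Monte Carlo sampling is given below. The model function, nominal values and spreads are hypothetical placeholders; the actual contact-mechanics model and uncertainty budget are those of the paper, so this only illustrates the propagation step.

        # Minimal sketch: Monte Carlo propagation of AFM calibration uncertainties
        # into a derived elastic modulus. 'modulus_model' is a placeholder for the
        # contact-mechanics model actually used; all spreads are illustrative.
        import numpy as np

        def modulus_model(sensitivity_nm_per_V, stiffness_N_per_m, zpiezo_scale):
            # Placeholder: a simple monotonic function of the calibration factors.
            return 8.0e9 * (sensitivity_nm_per_V / 50.0) * (stiffness_N_per_m / 40.0) / zpiezo_scale

        rng = np.random.default_rng(2)
        n = 100_000
        E = modulus_model(
            rng.normal(50.0, 5.0, n),   # photodiode sensitivity (dominant, systematic)
            rng.normal(40.0, 2.0, n),   # cantilever stiffness
            rng.normal(1.00, 0.02, n),  # Z-piezo calibration
        )
        print(np.mean(E), np.percentile(E, [2.5, 97.5]))  # mean and 95% interval (Pa)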

  4. High resolution solar observations

    International Nuclear Information System (INIS)

    Title, A.

    1985-01-01

    Currently there is a world-wide effort to develop optical technology required for large diffraction limited telescopes that must operate with high optical fluxes. These developments can be used to significantly improve high resolution solar telescopes both on the ground and in space. When looking at the problem of high resolution observations it is essential to keep in mind that a diffraction limited telescope is an interferometer. Even a 30 cm aperture telescope, which is small for high resolution observations, is a big interferometer. Meter class and above diffraction limited telescopes can be expected to be very unforgiving of inattention to details. Unfortunately, even when an earth based telescope has perfect optics there are still problems with the quality of its optical path. The optical path includes not only the interior of the telescope, but also the immediate interface between the telescope and the atmosphere, and finally the atmosphere itself

  5. Resolution and termination

    Directory of Open Access Journals (Sweden)

    Adina FOLTIŞ

    2012-01-01

    Full Text Available The resolution, the termination and the reduction of labour conscription are regulated by articles 1549-1554 of the new Civil Code, which represents the common law in this matter. We consider that the new regulation does not conclusively clarify whether liability must exist in order to invoke resolution, a condition that was inferred under the previous regulation from the fact that the absence of liability shifts the question of non-performance into the domain of fortuitous impossibility of execution, a situation in which the resolution of the contract is no longer at issue, but rather the risk it implies.

  6. Quantification of fossil fuel CO2 at the building/street level for large US cities

    Science.gov (United States)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans on a global, integrated, carbon monitoring system (CMS). A space/time explicit emissions data product can act as both a verification and planning system. It can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high resolution (e.g., building and road link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort toward a CMS. A complete data product has been built for the city of Indianapolis and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements, collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into both improving our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g., combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  7. High resolution drift chambers

    International Nuclear Information System (INIS)

    Va'vra, J.

    1985-07-01

    High precision drift chambers capable of achieving resolutions of 50 μm or better are discussed. In particular, we compare so-called cool and hot gases, various charge collection geometries and several timing techniques, and we also discuss some systematic problems. We also present what we would consider an "ultimate" design of the vertex chamber. 50 refs., 36 figs., 6 tabs

  8. Quantification of silver nanoparticle uptake and distribution within individual human macrophages by FIB/SEM slice and view.

    Science.gov (United States)

    Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea

    2017-03-21

    Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy-based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach that combines precise quantification of NPs in individual cells with high-resolution imaging of their intracellular distribution, based on focused ion beam/scanning electron microscopy (FIB/SEM) slice and view approaches. We quantified cellular uptake of 75 nm diameter citrate stabilized silver NPs (Ag 75 Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice and view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates; only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results by using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice and view is currently the only one that allows quantification of the absolute dose of silver NPs in individual cells and, at the same time, assessment of their intracellular distribution at high resolution. We therefore propose to use FIB/SEM slice and view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.

  9. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase, n = 9; 0.5 U elastase, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high-resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
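
    As a rough illustration of aerated-lung-volume quantification, the sketch below counts voxels inside a lung mask that fall below an 'aerated' intensity threshold and converts the count to a volume. The threshold, voxel size and placeholder arrays are hypothetical, and the study's algorithm includes automated segmentation steps not shown here.

        # Minimal sketch: aerated lung volume from a reconstructed microCT stack.
        # Voxels inside the lung mask with intensity below an 'aerated' threshold are
        # counted and converted to a volume. Threshold and mask are study-specific.
        import numpy as np

        def aerated_volume_mm3(ct, lung_mask, aerated_threshold_hu=-300.0, voxel_mm3=0.035**3):
            aerated = (ct < aerated_threshold_hu) & lung_mask
            return aerated.sum() * voxel_mm3

        ct = np.random.normal(-500, 150, size=(200, 200, 200))   # placeholder HU-like volume
        lung_mask = np.ones_like(ct, dtype=bool)                 # placeholder lung segmentation
        print(aerated_volume_mm3(ct, lung_mask))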

  10. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.
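
    The calibration-free character of dPCR comes from Poisson statistics: the target concentration follows directly from the fraction of positive partitions. A minimal sketch of that calculation is given below; the partition volume, counts and dilution factor are hypothetical placeholders.

        # Minimal sketch: digital PCR concentration from the fraction of positive
        # partitions via Poisson statistics (copies per microlitre of reaction mix).
        import math

        def dpcr_copies_per_ul(n_positive, n_total, partition_volume_nl=0.85, dilution_factor=1.0):
            p = n_positive / n_total
            lam = -math.log(1.0 - p)             # mean copies per partition
            copies_per_nl = lam / partition_volume_nl
            return copies_per_nl * 1000.0 * dilution_factor

        print(dpcr_copies_per_ul(n_positive=4200, n_total=17000))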

  11. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  12. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures which are available now and of the kinds of experimental situations and analytical problems they address. The last point is extended by the description of our own development of the fundamental parameter method, which renders the inclusion of nonparallel beam geometries possible. Finally, open problems for the quantification procedures are discussed.

  13. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for different information using fulltext search engines such as Google, Yahoo!, or Seznam. The web presentation operators are trying, with the help of different optimization techniques, to get to the top places in the results of fulltext search engines. This is where Search Engine Optimization and Search Engine Marketing become highly important, because normal users usually try links only on the first few pages of the fulltext search engine results for certain keywords, and in catalogs they primarily use links placed hierarchically higher in each category. Key to success is the application of optimization methods which deal with the issue of keywords, structure and quality of content, domain names, individual sites, and the quantity and reliability of backward links. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire web site consists. If the web presentation operators want to have an overview of their documents and of the web site as a whole, it is appropriate to quantify these positions in a specific way, depending on specific key words. The quantification of the competitive value of documents serves this purpose and consequently establishes the global competitive value of a web site. Quantification of competitive values is performed on a specific fulltext search engine. For each fulltext search engine the results can be, and often are, different. According to reports published by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The whole procedure of quantification of competitive values is the same in all cases; however, the initial step, the analysis of keywords, depends on the choice of the fulltext search engine.

  14. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which provide potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
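
    Real-time PCR quantification of the kind reviewed above is usually based on a standard curve relating the quantification cycle (Cq) to the logarithm of input DNA. The following sketch illustrates that generic calculation; the dilution series, Cq values and efficiency formula shown are textbook illustrations, not data from the review.

        import numpy as np

        def fit_standard_curve(log10_conc, cq):
            """Fit Cq = slope*log10(conc) + intercept for a qPCR dilution series."""
            slope, intercept = np.polyfit(log10_conc, cq, 1)
            efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency
            return slope, intercept, efficiency

        def quantify(cq_sample, slope, intercept):
            """Back-calculate the concentration of an unknown from its Cq value."""
            return 10 ** ((cq_sample - intercept) / slope)

        # illustrative 10-fold dilution series of a DNA standard
        log10_conc = np.array([4, 3, 2, 1, 0], dtype=float)   # log10(pg/uL)
        cq = np.array([18.1, 21.5, 24.9, 28.2, 31.6])
        slope, intercept, eff = fit_standard_curve(log10_conc, cq)
        print(f"efficiency ~ {eff:.2f}, unknown ~ {quantify(26.0, slope, intercept):.1f} pg/uL")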

  15. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
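
    The combination of calibration and planar uncertainties described above is a first-order uncertainty-propagation step. The snippet below is only a generic sketch of such propagation through a simplified two-camera reconstruction formula; the geometry, nominal values and uncertainties are invented for illustration and do not reproduce the paper's framework.

        import numpy as np

        def propagate_uncertainty(f, x, u_x, eps=1e-6):
            """First-order (Taylor) uncertainty propagation with a numerical Jacobian.

            f   : function mapping an input vector to an output vector
            x   : nominal input values (e.g. planar displacements, calibration terms)
            u_x : standard uncertainties of the inputs, assumed independent
            """
            x = np.asarray(x, dtype=float)
            y0 = np.atleast_1d(f(x))
            J = np.zeros((y0.size, x.size))
            for i in range(x.size):
                dx = np.zeros_like(x)
                dx[i] = eps * max(1.0, abs(x[i]))
                J[:, i] = (np.atleast_1d(f(x + dx)) - y0) / dx[i]
            u_y = np.sqrt((J ** 2) @ (np.asarray(u_x, dtype=float) ** 2))
            return y0, u_y

        # toy reconstruction: out-of-plane velocity from two camera projections,
        # w = (u1 - u2) / (tan(a1) - tan(a2))   (simplified stereo geometry)
        def w_component(p):
            u1, u2, a1, a2 = p
            return (u1 - u2) / (np.tan(a1) - np.tan(a2))

        vals, uncs = propagate_uncertainty(w_component,
                                           x=[1.20, 0.95, np.deg2rad(35), np.deg2rad(-35)],
                                           u_x=[0.05, 0.05, 0.002, 0.002])
        print(vals, uncs)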

  16. Collaborative framework for PIV uncertainty quantification: the experimental database

    International Nuclear Information System (INIS)

    Neal, Douglas R; Sciacchitano, Andrea; Scarano, Fulvio; Smith, Barton L

    2015-01-01

    The uncertainty quantification of particle image velocimetry (PIV) measurements has recently become a topic of great interest as shown by the recent appearance of several different methods within the past few years. These approaches have different working principles, merits and limitations, which have been speculated upon in subsequent studies. This paper reports a unique experiment that has been performed specifically to test the efficacy of PIV uncertainty methods. The case of a rectangular jet, as previously studied by Timmins et al (2012) and Wilson and Smith (2013b), is used. The novel aspect of the experiment is simultaneous velocity measurements using two different time-resolved PIV systems and a hot-wire anemometry (HWA) system. The first PIV system, called the PIV measurement system (‘PIV-MS’), is intended for nominal measurements of which the uncertainty is to be evaluated. It is based on a single camera and features a dynamic velocity range (DVR) representative of typical PIV experiments. The second PIV system, called the ‘PIV-HDR’ (high dynamic range) system, features a significantly higher DVR obtained with a higher digital imaging resolution. The hot-wire is placed in close proximity to the PIV measurement domain. The three measurement systems were carefully set to simultaneously measure the flow velocity at the same time and location. The comparison between the PIV-HDR system and the HWA provides an estimate of the measurement precision of the reference velocity for evaluation of the instantaneous error in the measurement system. The discrepancy between the PIV-MS and the reference data provides the measurement error, which is later used to assess the different uncertainty quantification methods proposed in the literature. A detailed comparison of the uncertainty estimation methods based on the present datasets is presented in a second paper from Sciacchitano et al (2015). Furthermore, this database offers the potential to be used for

  17. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage or skin burns are the most commonly encountered type of trauma in civilian and military communities. In addition, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures but without affecting the surrounding, healthy tissue. Further, the extended pain sensation induced by thermal damage has also caused great problems for burn patients. Thus, it is of great importance to quantify the thermal damage in skin tissue. In this paper, the available models and experimental methods for quantification of thermal damage in skin tissue are discussed.

  18. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  19. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, which include the Robobee. We developed the high-fidelity model for the actuator using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  20. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)

  1. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital image and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscopy calibration, standard density filters, a digital CCD camera, and image analysis software for quantitative applications. Apart from all system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. The protocols shown here for human DNA ploidy determination and quantification of nuclear and chromosomal DNA content in plants could be used as described, or adapted for other studies.

  2. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
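
    Adjoint-based uncertainty quantification of this kind typically combines relative sensitivity coefficients with a nuclear-data covariance matrix through the first-order "sandwich" rule. The sketch below illustrates that rule with invented numbers; it is not the MCNP6 implementation described in the record.

        import numpy as np

        def sandwich_uncertainty(sensitivities, covariance):
            """Relative variance of a response via the first-order 'sandwich' rule:
                var_rel(R) = S^T C S
            where S holds relative sensitivities (dR/R per dX/X) and C is the
            relative covariance matrix of the underlying data parameters."""
            S = np.asarray(sensitivities, dtype=float)
            C = np.asarray(covariance, dtype=float)
            return float(S @ C @ S)

        # illustrative two-parameter example (e.g. two cross sections)
        S = [0.8, -0.3]                         # relative sensitivities of the response
        C = [[1.0e-4, 2.0e-5],                  # relative covariances (~1-2% std dev)
             [2.0e-5, 4.0e-4]]
        var = sandwich_uncertainty(S, C)
        print(f"relative std dev ~ {np.sqrt(var) * 100:.2f} %")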

  3. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  4. Composition quantification of electron-transparent samples by backscattered electron imaging in scanning electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Müller, E., E-mail: erich.mueller@kit.edu; Gerthsen, D.

    2017-02-15

    The contrast of backscattered electron (BSE) images in scanning electron microscopy (SEM) depends on material parameters which can be exploited for composition quantification if some information on the material system is available. As an example, the In-concentration in thin InxGa1−xAs layers embedded in a GaAs matrix is analyzed in this work. The spatial resolution of the technique is improved by using thin electron-transparent specimens instead of bulk samples. Although the BSEs are detected in a comparably small angular range by an annular semiconductor detector, the image intensity can be evaluated to determine the composition and local thickness of the specimen. The measured intensities are calibrated within one single image to eliminate the influence of the detection and amplification system. Quantification is performed by comparison of experimental and calculated data. Instead of using time-consuming Monte-Carlo simulations, an analytical model is applied for BSE-intensity calculations which considers single electron scattering and electron diffusion. - Highlights: • Sample thickness and composition are quantified by backscattered electron imaging. • A thin sample is used to achieve spatial resolution of few nanometers. • Calculations are carried out with a time-saving electron diffusion model. • Small differences in atomic number and density detected at low electron energies.

  5. Quantification of Impact of Orbital Drift on Inter-Annual Trends in AVHRR NDVI Data

    Directory of Open Access Journals (Sweden)

    Jyoteshwar R. Nagol

    2014-07-01

    Full Text Available The Normalized Difference Vegetation Index (NDVI) time-series data derived from the Advanced Very High Resolution Radiometer (AVHRR) have been extensively used for studying inter-annual dynamics of global and regional vegetation. However, there can be significant uncertainties in the data due to incomplete atmospheric correction and orbital drift of the satellites through their active life. Access to location specific quantification of uncertainty is crucial for appropriate evaluation of the trends and anomalies. This paper provides per pixel quantification of orbital drift related spurious trends in the Long Term Data Record (LTDR) AVHRR NDVI data product. The magnitude and direction of the spurious trends was estimated by direct comparison with data from the MODerate resolution Imaging Spectrometer (MODIS) Aqua instrument, which has stable inter-annual sun-sensor geometry. The maps show presence of both positive as well as negative spurious trends in the data. After application of the BRDF correction, an overall decrease in positive trends and an increase in number of pixels with negative spurious trends were observed. The mean global spurious inter-annual NDVI trend before and after BRDF correction was 0.0016 and −0.0017 respectively. The research presented in this paper gives valuable insight into the magnitude of orbital drift related trends in the AVHRR NDVI data as well as the degree to which it is being rectified by the MODIS BRDF correction algorithm used by the LTDR processing stream.
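
    As a rough illustration of how a per-pixel spurious trend can be isolated by comparison against a drift-free reference such as MODIS, the following sketch fits linear trends to two co-located NDVI time series and differences them; the series and numbers are synthetic and the approach is a simplification of the paper's method.

        import numpy as np

        def ndvi(nir, red):
            """Normalized Difference Vegetation Index, shown here for reference."""
            return (nir - red) / (nir + red)

        def spurious_trend(avhrr_series, modis_series, years):
            """Per-pixel spurious NDVI trend, estimated as the linear trend of the
            AVHRR series minus the trend of a drift-free reference series (here
            MODIS), both expressed in NDVI units per year."""
            t_avhrr = np.polyfit(years, avhrr_series, 1)[0]
            t_modis = np.polyfit(years, modis_series, 1)[0]
            return t_avhrr - t_modis

        years = np.arange(2003, 2011)
        modis = 0.55 + 0.001 * (years - 2003)        # stable reference series
        avhrr = modis + 0.004 * (years - 2003)       # drift-affected series
        print(spurious_trend(avhrr, modis, years))   # ~0.004 NDVI per year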

  6. Atomic force microscopy applied to the quantification of nano-precipitates in thermo-mechanically treated microalloyed steels

    Energy Technology Data Exchange (ETDEWEB)

    Renteria-Borja, Luciano [Instituto Tecnologico de Morelia, Av. Tecnologico No. 1500, Lomas de Santiaguito, 58120 Morelia (Mexico); Hurtado-Delgado, Eduardo, E-mail: hurtado@itmorelia.edu.mx [Instituto Tecnologico de Morelia, Av. Tecnologico No. 1500, Lomas de Santiaguito, 58120 Morelia (Mexico); Garnica-Gonzalez, Pedro [Instituto Tecnologico de Morelia, Av. Tecnologico No. 1500, Lomas de Santiaguito, 58120 Morelia (Mexico); Dominguez-Lopez, Ivan; Garcia-Garcia, Adrian Luis [Centro de Investigacion en Ciencia Aplicada y Tecnologia Avanzada-IPN Unidad Queretaro, Cerro Blanco No. 141, Colinas del Cimatario, 76090 Queretaro (Mexico)

    2012-07-15

    Quantification of nanometer-size precipitates in microalloyed steels has been traditionally performed using transmission electron microscopy (TEM), in spite of its complicated sample preparation procedures, prone to preparation errors and sample perturbation. In contrast to TEM procedures, atomic force microscopy (AFM) is performed on the as-prepared specimen, with sample preparation requirements similar to those for optical microscopy (OM), rendering three-dimensional representations of the sample surface with vertical resolution of a fraction of a nanometer. In AFM, contrast mechanisms are directly related to surface properties such as topography, adhesion, and stiffness, among others. Chemical etching was performed using 0.5% nital, at time intervals between 4 and 20 s, in 4 s steps, until reaching the desired surface finish. For the present application, an average surface-roughness peak-height below 200 nm was sought. Quantification results of nanometric precipitates were obtained from the statistical analysis of AFM images of the microstructure developed by microalloyed Nb and V-Mo steels. Topography and phase contrast AFM images were used for quantification. The results obtained using AFM are consistent with similar TEM reports. - Highlights: • We quantified nanometric precipitates in Nb and V-Mo microalloyed steels using AFM. • Microstructures of the thermo-mechanically treated microalloyed steels were used. • Topography and phase contrast AFM images were used for quantification. • AFM results are comparable with traditionally obtained TEM measurements.

  7. HPCE quantification of 5-methyl-2'-deoxycytidine in genomic DNA: methodological optimization for chestnut and other woody species.

    Science.gov (United States)

    Hasbún, Rodrigo; Valledor, Luís; Rodríguez, José L; Santamaria, Estrella; Ríos, Darcy; Sanchez, Manuel; Cañal, María J; Rodríguez, Roberto

    2008-01-01

    Quantification of deoxynucleosides using micellar high-performance capillary electrophoresis (HPCE) is an efficient, fast and inexpensive evaluation method for genomic DNA methylation. This approach has been demonstrated to be more sensitive and specific than other methods for the quantification of DNA methylation content. However, effective detection and quantification of 5-methyl-2'-deoxycytidine depend on the sample characteristics. Previous works have revealed that in most woody species, the quality and quantity of extracted RNA-free DNA suitable for analysis by means of HPCE vary among species of the same genus, among tissues taken from the same tree, and within the same tissue depending on the season of the year. The aim of this work is to establish a quantification method of genomic DNA methylation that lends itself to use in different Castanea sativa Mill. materials, and in other angiosperm and gymnosperm woody species. Using a DNA extraction kit based on a silica membrane has increased the resolving capacity of the method. Under these conditions, different organs or tissues of angiosperms and gymnosperms can be analyzed, regardless of their state of development. We emphasize the importance of samples free of nucleosides, although, in the contrary case, the method ensures the effective separation of deoxynucleosides and identification of 5-methyl-2'-deoxycytidine.
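
    Global DNA methylation estimated from HPCE electropherograms ultimately reduces to the ratio of the 5-methyl-2'-deoxycytidine peak to the total deoxycytidine signal. The sketch below shows that standard calculation with invented peak areas; detector response corrections and the exact integration procedure of the paper are omitted.

        def global_methylation_percent(area_5mdC, area_dC):
            """Global DNA methylation from HPCE peak areas of the two deoxycytidine
            species, expressed as the fraction of methylated cytosines:
                % 5mdC = 100 * 5mdC / (5mdC + dC)
            Peak areas are assumed to be already corrected for detector response."""
            return 100.0 * area_5mdC / (area_5mdC + area_dC)

        # illustrative peak areas from an electropherogram
        print(global_methylation_percent(area_5mdC=1250.0, area_dC=4120.0))  # ~23.3 %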

  8. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Song [Biological Sciences Division; Shi, Tujin [Biological Sciences Division; Fillmore, Thomas L. [Biological Sciences Division; Schepmoes, Athena A. [Biological Sciences Division; Brewer, Heather [Biological Sciences Division; Gao, Yuqian [Biological Sciences Division; Song, Ehwang [Biological Sciences Division; Wang, Hui [Biological Sciences Division; Rodland, Karin D. [Biological Sciences Division; Qian, Wei-Jun [Biological Sciences Division; Smith, Richard D. [Biological Sciences Division; Liu, Tao [Biological Sciences Division

    2017-08-11

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.

  9. High resolution data acquisition

    Science.gov (United States)

    Thornton, Glenn W.; Fuller, Kenneth R.

    1993-01-01

    A high resolution event interval timing system measures short time intervals such as occur in high energy physics or laser ranging. Timing is provided from a clock (38) pulse train (37) and analog circuitry (44) for generating a triangular wave (46) synchronously with the pulse train (37). The triangular wave (46) has an amplitude and slope functionally related to the time elapsed during each clock pulse in the train. A converter (18, 32) forms a first digital value of the amplitude and slope of the triangle wave at the start of the event interval and a second digital value of the amplitude and slope of the triangle wave at the end of the event interval. A counter (26) counts the clock pulse train (37) during the interval to form a gross event interval time. A computer (52) then combines the gross event interval time and the first and second digital values to output a high resolution value for the event interval.
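
    The timing scheme described above combines a coarse clock count with a fine, sub-period interpolation derived from the triangular wave. The snippet below illustrates only the final combination step, with an arbitrary gating convention and example numbers; the analog interpolation itself is not modeled.

        def event_interval(clock_period_ns, gross_counts, start_fraction, stop_fraction):
            """Combine a coarse clock count with fine sub-period interpolation.

            gross_counts   : clock periods counted between the edges preceding the
                             start and stop events
            start_fraction : position of the start event within its clock period (0..1)
            stop_fraction  : position of the stop event within its clock period (0..1)
            The exact gating convention varies between designs; this one is illustrative.
            """
            return (gross_counts + stop_fraction - start_fraction) * clock_period_ns

        # 100 MHz clock (10 ns period), 42 whole periods plus sub-period fractions
        print(event_interval(10.0, 42, start_fraction=0.30, stop_fraction=0.85))  # 425.5 ns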

  10. High resolution photoelectron spectroscopy

    International Nuclear Information System (INIS)

    Arko, A.J.

    1988-01-01

    Photoelectron Spectroscopy (PES) covers a very broad range of measurements, disciplines, and interests. As the next generation light source, the FEL will result in improvements over the undulator that are larger than the undulator improvements over bending magnets. The combination of high flux and high inherent resolution will result in several orders of magnitude gain in signal to noise over measurements using synchrotron-based undulators. The latter still require monochromators. Their resolution is invariably strongly energy-dependent so that in the regions of interest for many experiments (hν > 100 eV) they will not have a resolving power much over 1000. In order to study some of the interesting phenomena in actinides (heavy fermions e.g.) one would need resolving powers of 10^4 to 10^5. These values are only reachable with the FEL.

  11. Particle detector spatial resolution

    International Nuclear Information System (INIS)

    Perez-Mendez, V.

    1992-01-01

    Method and apparatus for producing separated columns of scintillation layer material, for use in detection of X-rays and high energy charged particles with improved spatial resolution is disclosed. A pattern of ridges or projections is formed on one surface of a substrate layer or in a thin polyimide layer, and the scintillation layer is grown at controlled temperature and growth rate on the ridge-containing material. The scintillation material preferentially forms cylinders or columns, separated by gaps conforming to the pattern of ridges, and these columns direct most of the light produced in the scintillation layer along individual columns for subsequent detection in a photodiode layer. The gaps may be filled with a light-absorbing material to further enhance the spatial resolution of the particle detector. 12 figs

  12. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  13. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formula and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Full Text Available Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is done based on the comparison of seed images with geometric figures. Quantification of seed shape is a useful tool in plant description for phenotypic characterization and taxonomic analysis. J index gives the percent of similarity of the image of a seed with a geometric figure and it is useful in taxonomy for the study of relationships between plant groups. Geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios and the outline of the Fibonacci spiral. The images of seeds have been compared with these figures and values of J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures like the ovoid, ellipse or the Fibonacci spiral, may be a feature in the basal clades of taxonomic groups.

  15. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation that is imposed by medical instruments such as guide wires, catheters and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to be elicitors for stent graft failures and a reason for the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to only include local non-rigid deformation and therefore eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields are conducted on 15 patient datasets from endovascular aortic repair (EVAR) treatment. A visual assessment of the registration results and evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist who are all experts in EVAR procedures.

  16. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
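
    The empirical relation quoted in the record can be written as a single ratio. The sketch below implements one possible reading of it, taking "relative to mock" to mean a fractional change; that interpretation and all numbers are assumptions made here purely for illustration.

        def estimated_virus_count(n_virus, n_mock, v_debye_virus, v_debye_mock):
            """Empirical relation described in the record: the virus count tracks the
            absolute ratio of the relative change in dopant concentration to the
            relative change in Debye volume (virus suspension vs. mock control).
            Reading 'relative to mock' as a fractional change is an assumption here."""
            d_conc = (n_virus - n_mock) / n_mock
            d_vol = (v_debye_virus - v_debye_mock) / v_debye_mock
            return abs(d_conc / d_vol)

        # purely illustrative numbers, not measurements from the study
        print(estimated_virus_count(n_virus=1.08e15, n_mock=1.00e15,
                                    v_debye_virus=9.2e-16, v_debye_mock=9.6e-16))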

  17. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed as TBM on CT. As case-matching, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT with end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, quantified by CT, could be new diagnostic indicators of TBM.

  18. Total space in resolution

    Czech Academy of Sciences Publication Activity Database

    Bonacina, I.; Galesi, N.; Thapen, Neil

    2016-01-01

    Vol. 45, No. 5 (2016), pp. 1894-1909, ISSN 0097-5397. R&D Projects: GA ČR GBP202/12/G061. EU Projects: European Commission (XE) 339691 - FEALORA. Institutional support: RVO:67985840. Keywords: total space * resolution * random CNFs * proof complexity. Subject RIV: BA - General Mathematics. Impact factor: 1.433, year: 2016. http://epubs.siam.org/doi/10.1137/15M1023269

  19. High resolution (transformers.

    Science.gov (United States)

    Garcia-Souto, Jose A; Lamela-Rivera, Horacio

    2006-10-16

    A novel fiber-optic interferometric sensor is presented for vibrations measurements and analysis. In this approach, it is shown applied to the vibrations of electrical structures within power transformers. A main feature of the sensor is that an unambiguous optical phase measurement is performed using the direct detection of the interferometer output, without external modulation, for a more compact and stable implementation. High resolution of the interferometric measurement is obtained with this technique (transformers are also highlighted.

  20. ALTERNATIVE DISPUTE RESOLUTION

    Directory of Open Access Journals (Sweden)

    Mihaela Irina IONESCU

    2016-05-01

    Full Text Available Alternative dispute resolution (ADR) includes dispute resolution processes and techniques that act as a means for disagreeing parties to come to an agreement short of litigation. It is a collective term for the ways that parties can settle disputes, with (or without) the help of a third party. Despite historic resistance to ADR by many popular parties and their advocates, ADR has gained widespread acceptance among both the general public and the legal profession in recent years. In fact, some courts now require some parties to resort to ADR of some type before permitting the parties' cases to be tried. The rising popularity of ADR can be explained by the increasing caseload of traditional courts, the perception that ADR imposes fewer costs than litigation, a preference for confidentiality, and the desire of some parties to have greater control over the selection of the individual or individuals who will decide their dispute. Directive 2013/11/EU of the European Parliament and of the Council on alternative dispute resolution for consumer disputes and amending Regulation (EC) No 2006/2004 and Directive 2009/22/EC (hereinafter „Directive 2013/11/EU”) aims to ensure a high level of consumer protection and the proper functioning of the internal market by ensuring that complaints against traders can be submitted by consumers on a voluntary basis to alternative dispute resolution entities which are independent, impartial, transparent, effective, simple, quick and fair. Directive 2013/11/EU establishes harmonized quality requirements for entities applying an alternative dispute resolution procedure (hereinafter "ADR entity") to provide the same protection and the same rights to consumers in all Member States. In addition, the present study attempts to present broadly how all of this is transposed into Romanian legislation.

  1. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification ... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant...

  2. Quantification in Kabiye: a linguistic approach | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  3. Low-resolution simulations of vesicle suspensions in 2D

    Science.gov (United States)

    Kabacaoğlu, Gökberk; Quaife, Bryan; Biros, George

    2018-03-01

    Vesicle suspensions appear in many biological and industrial applications. These suspensions are characterized by rich and complex dynamics of vesicles due to their interaction with the bulk fluid, and their large deformations and nonlinear elastic properties. Many existing state-of-the-art numerical schemes can resolve such complex vesicle flows. However, even when using provably optimal algorithms, these simulations can be computationally expensive, especially for suspensions with a large number of vesicles. These high computational costs can limit the use of simulations for parameter exploration, optimization, or uncertainty quantification. One way to reduce the cost is to use low-resolution discretizations in space and time. However, it is well-known that simply reducing the resolution results in vesicle collisions, numerical instabilities, and often in erroneous results. In this paper, we investigate the effect of a number of algorithmic empirical fixes (which are commonly used by many groups) in an attempt to make low-resolution simulations more stable and more predictive. Based on our empirical studies for a number of flow configurations, we propose a scheme that attempts to integrate these fixes in a systematic way. This low-resolution scheme is an extension of our previous work [51,53]. Our low-resolution correction algorithms (LRCA) include anti-aliasing and membrane reparametrization for avoiding spurious oscillations in vesicles' membranes, adaptive time stepping and a repulsion force for handling vesicle collisions and, correction of vesicles' area and arc-length for maintaining physical vesicle shapes. We perform a systematic error analysis by comparing the low-resolution simulations of dilute and dense suspensions with their high-fidelity, fully resolved, counterparts. We observe that the LRCA enables both efficient and statistically accurate low-resolution simulations of vesicle suspensions, while it can be 10× to 100× faster.

  4. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolites' concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), with good results have been presented lately, but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, which is solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with metabolite peaks' overlapping, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology, in artificial MRS data, by establishing it as a generic metabolite quantification procedure.
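
    To make the optimization view of metabolite quantification concrete, the following sketch fits the amplitudes of two overlapping Lorentzian peaks to a synthetic spectrum with a very small evolutionary optimizer. The lineshape, operators and parameters are deliberately simplistic assumptions and do not reproduce the GA configurations studied in the paper.

        import numpy as np
        rng = np.random.default_rng(0)

        def lorentzian(x, pos, width):
            return width**2 / ((x - pos)**2 + width**2)

        def model(x, amplitudes, positions, widths):
            """Spectrum as a sum of Lorentzian metabolite peaks with known positions
            and widths; only the amplitudes (concentrations) are fitted here."""
            return sum(a * lorentzian(x, p, w)
                       for a, p, w in zip(amplitudes, positions, widths))

        def ga_fit(x, spectrum, positions, widths,
                   pop_size=60, generations=200, sigma=0.1):
            """Tiny evolutionary optimizer (truncation selection + Gaussian mutation)."""
            n = len(positions)
            pop = rng.uniform(0.0, 2.0, size=(pop_size, n))
            def cost(a):
                return np.sum((model(x, a, positions, widths) - spectrum) ** 2)
            for _ in range(generations):
                scores = np.array([cost(ind) for ind in pop])
                parents = pop[np.argsort(scores)[:pop_size // 2]]        # selection
                children = parents + rng.normal(0, sigma, parents.shape)  # mutation
                pop = np.vstack([parents, np.abs(children)])
            return pop[np.argmin([cost(ind) for ind in pop])]

        # synthetic two-peak spectrum with overlapping resonances plus noise
        x = np.linspace(0, 10, 400)
        true_amp, positions, widths = [1.0, 0.6], [4.0, 4.8], [0.5, 0.5]
        spectrum = model(x, true_amp, positions, widths) + rng.normal(0, 0.01, x.size)
        print(ga_fit(x, spectrum, positions, widths))   # close to [1.0, 0.6]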

  5. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolites' concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), with good results have been presented lately, but have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, which is solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, while two GA configurations are applied on artificial data. Moreover, the introduced quantification technique deals with metabolite peaks' overlapping, a considerably difficult situation occurring under real conditions. Appropriate experiments have proved the efficiency of the introduced methodology, in artificial MRS data, by establishing it as a generic metabolite quantification procedure

  6. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  7. Diagnosis of hearing impairment by high resolution CT scanning of inner ear anomalies

    International Nuclear Information System (INIS)

    Murata, Kiyotaka; Isono, Michio; Ohta, Fumihiko

    1988-01-01

    High resolution CT scanning of the temporal bone in our clinic has provided a more detailed radiological classification of inner ear anomalies than before. The statistical analysis of inner ear malformations based on the theory of quantification II has produced discriminant equations for the measurable diagnosis of hearing impairment and development of the inner ear. This analysis may make it possible to diagnose total and partial deafness on ipsi- and contralateral sides. (author)

  8. Development of computational algorithms for quantification of pulmonary structures; Desenvolvimento de algoritmos computacionais para quantificacao de estruturas pulmonares

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A., E-mail: marceladeoliveira@ig.com.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Hospital das Clinicas. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2012-12-15

    High-resolution computed tomography has become the diagnostic imaging exam most commonly used for the evaluation of the sequelae of Paracoccidioidomycosis. The subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms in the MATLAB® computing environment that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists of selecting a region of interest (ROI) and, by the use of masks, density filters and morphological operators, obtaining a quantification of the injured area relative to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)

  9. Noninvasive Quantification of Retinal Microglia Using Widefield Autofluorescence Imaging.

    Science.gov (United States)

    Kokona, Despina; Schneider, Nadia; Giannakaki-Zimmermann, Helena; Jovanovic, Joel; Ebneter, Andreas; Zinkernagel, Martin

    2017-04-01

    To validate widefield autofluorescence (AF) in vivo imaging of the retina in mice expressing green fluorescent protein (gfp) in microglia, and to monitor retinal microglia reconstitution in vivo after lethal irradiation and bone marrow transplantation. Transgenic Cx3cr1gfp/gfp and wildtype Balb/c mice were used in this study. A confocal scanning laser ophthalmoscope was used for AF imaging with a 55° and a widefield 102° lens. Intrasession reproducibility was assessed for each lens. To investigate reconstitution in vivo, bone marrow from Cx3cr1gfp/gfp mice was used to rescue lethally irradiated wildtype mice. Data were compared to confocal microscopy of retinal flat mounts. Both the 55° and the 102° lens produced high resolution images of retinal microglia with similar microglia density. However, compared to the 55° lens, the widefield 102° lens captured approximately 3.6 times more microglia cells (1515 ± 123 cells versus 445 ± 76 cells [mean ± SD], for 102° and 55°, respectively, P < 0.001). No statistical difference in the number of gfp positive cells within corresponding areas was observed within the same imaging session. Imaging of microglia reconstitution showed a similar time course compared to flat mount preparations with an excellent correlation between microglia cell numbers in AF and gfp-stained flat mounts (R = 0.92, P < 0.0001). Widefield AF imaging of mice with gfp expressing microglia can be used to quantify retinal microglia. In vivo microglia counts corresponded very well with ex vivo counts on retinal flat mounts. As such, AF imaging can largely replace ex vivo quantification.

  10. Bank Resolution in Europe

    DEFF Research Database (Denmark)

    N. Gordon, Jeffery; Ringe, Georg

    2015-01-01

    This chapter argues that the work of the European Banking Union remains incomplete in one important respect, the structural re-organization of large European financial firms that would make “resolution” of a systemically important financial firm a credible alternative to bail-out or some other sort of taxpayer assistance. A holding company structure in which the public parent holds unsecured term debt sufficient to cover losses at an operating financial subsidiary would facilitate a “Single Point of Entry” resolution procedure that would minimize knock-on effects from the failure of a systemically...

  11. Bank Resolution in Europe

    DEFF Research Database (Denmark)

    Gordon, Jeffrey N.; Ringe, Georg

    This chapter argues that the work of the European Banking Union remains incomplete in one important respect, the structural re-organization of large European financial firms that would make “resolution” of a systemically important financial firm a credible alternative to bail-out or some other sort of taxpayer assistance. A holding company structure in which the public parent holds unsecured term debt sufficient to cover losses at an operating financial subsidiary would facilitate a “Single Point of Entry” resolution procedure that would minimize knock-on effects from the failure of a systemically...

  12. High resolution backscattering instruments

    International Nuclear Information System (INIS)

    Coldea, R.

    2001-01-01

    The principles of operation of indirect-geometry time-of-flight spectrometers are presented, including IRIS at the ISIS spallation neutron source. The key features that make these types of spectrometers ideally suited for low-energy spectroscopy are: high energy resolution over a wide dynamic range, and simultaneous measurement over a large momentum transfer range provided by the wide angular detector coverage. To exemplify these features, single-crystal experiments on the spin dynamics in the two-dimensional frustrated quantum magnet Cs2CuCl4 are discussed. (R.P.)

  13. Failure Diameter Resolution Study

    Energy Technology Data Exchange (ETDEWEB)

    Menikoff, Ralph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-12-19

    Previously the SURFplus reactive burn model was calibrated for the TATB-based explosive PBX 9502. The calibration was based on fitting Pop plot data, the failure diameter and the limiting detonation speed, and curvature effect data for small curvature. The model failure diameter is determined utilizing 2-D simulations of an unconfined rate stick to find the minimum diameter for which a detonation wave propagates. Here we examine the effect of mesh resolution on an unconfined rate stick with a diameter (10 mm) slightly greater than the measured failure diameter (8 to 9 mm).

  14. Conflict management and resolution.

    Science.gov (United States)

    Harolds, Jay; Wood, Beverly P

    2006-03-01

    When people work collaboratively, conflict will always arise. Understanding the nature and source of conflict and its progression and stages, resolution, and outcome is a vital aspect of leadership. Causes of conflict include the miscomprehension of communication, emotional issues, personal history, and values. When the difference is understood and the resultant behavior properly addressed, most conflict can be settled in a way that provides needed change in an organization and interrelationships. There are serious consequences of avoiding or mismanaging disagreements. Informed leaders can effectively prevent destructive conflicts.

  15. The super-resolution debate

    Science.gov (United States)

    Won, Rachel

    2018-05-01

    In the quest for nanoscopy with super-resolution, consensus from the imaging community is that super-resolution is not always needed and that scientists should choose an imaging technique based on their specific application.

  16. Resolution Enhancement of Multilook Imagery

    Energy Technology Data Exchange (ETDEWEB)

    Galbraith, Amy E. [Univ. of Arizona, Tucson, AZ (United States)

    2004-07-01

    This dissertation studies the feasibility of enhancing the spatial resolution of multi-look remotely-sensed imagery using an iterative resolution enhancement algorithm known as Projection Onto Convex Sets (POCS). A multi-angle satellite image modeling tool is implemented, and simulated multi-look imagery is formed to test the resolution enhancement algorithm. Experiments are done to determine the optimal configuration and number of multi-angle low-resolution images needed for a quantitative improvement in the spatial resolution of the high-resolution estimate. The important topic of aliasing is examined in the context of the POCS resolution enhancement algorithm performance. In addition, the extension of the method to multispectral sensor images is discussed and an example is shown using multispectral confocal fluorescence imaging microscope data. Finally, the remote sensing issues of atmospheric path radiance and directional reflectance variations are explored to determine their effect on the resolution enhancement performance.
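
    As a rough illustration of the kind of data-consistency projection used in POCS-based super-resolution, the following sketch assumes a simplified observation model (integer-factor block averaging of a shifted high-resolution scene); the function name, shift handling and initialization are illustrative choices, not the dissertation's actual implementation.

```python
import numpy as np

def pocs_superres(lowres_images, shifts, factor, n_iter=50):
    """POCS super-resolution for a simplified model: each low-res pixel is the
    mean of a factor x factor block of the high-res scene, after an integer
    pixel shift (dy, dx) in high-res coordinates."""
    h, w = lowres_images[0].shape
    x = np.zeros((h * factor, w * factor))
    for img in lowres_images:                       # crude initial estimate
        x += np.kron(img, np.ones((factor, factor)))
    x /= len(lowres_images)

    for _ in range(n_iter):
        for img, (dy, dx) in zip(lowres_images, shifts):
            aligned = np.roll(x, (-dy, -dx), axis=(0, 1))
            for i in range(h):
                for j in range(w):
                    block = aligned[i*factor:(i+1)*factor, j*factor:(j+1)*factor]
                    # adding the residual uniformly is the orthogonal projection
                    # onto {images whose block mean equals the observed pixel}
                    block += img[i, j] - block.mean()
            x = np.roll(aligned, (dy, dx), axis=(0, 1))
    return x
```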

  17. Advanced modeling in positron emission tomography using Monte Carlo simulations for improving reconstruction and quantification

    International Nuclear Information System (INIS)

    Stute, Simon

    2010-01-01

    Positron Emission Tomography (PET) is a medical imaging technique that plays a major role in oncology, especially using 18F-Fluoro-Deoxyglucose. However, PET images suffer from a modest spatial resolution and from high noise. As a result, there is still no consensus on how tumor metabolically active volume and tumor uptake should be characterized. In the meantime, research groups keep producing new methods for such characterizations that need to be assessed. A Monte Carlo simulation based method has been developed to produce simulated PET images of patients suffering from cancer, indistinguishable from clinical images, and for which all parameters are known. The method uses high resolution PET images from patient acquisitions, from which the physiological heterogeneous activity distribution can be modeled. It was shown that the performance of quantification methods on such highly realistic simulated images is significantly lower and more variable than using simple phantom studies. Fourteen different quantification methods were also compared in realistic conditions using a group of such simulated patients. In addition, the proposed method was extended to simulate serial PET scans in the context of patient monitoring, including a modeling of the tumor changes, as well as the variability over time of non-tumoral physiological activity distribution. Monte Carlo simulations were also used to study the detection probability inside the crystals of the tomograph. A model of the crystal response was derived and included in the system matrix involved in tomographic reconstruction. The resulting reconstruction method was compared with other sophisticated methods for modeling the detector response in the image space, proposed in the literature. We demonstrated the superiority of the proposed method over equivalent approaches on simulated data, and illustrated its robustness on clinical data. For the same noise level, it is possible to reconstruct PET images offering a

  18. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using financial information of these companies during the years 2005-2010. To carry out the work, quantification methods of managerial skills are applied to CNFR NAVROM SA Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management and turnover.

  19. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets are characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
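
    For readers unfamiliar with RQA, the sketch below computes two of the standard measures mentioned above (recurrence rate and determinism) from a delay-embedded return series; the embedding parameters, threshold and synthetic price series are illustrative assumptions, not the settings used in the study.

```python
import numpy as np

def rqa_measures(x, dim=3, delay=1, eps=0.1, l_min=2):
    """Recurrence rate (REC) and determinism (DET) from a scalar series via a
    thresholded distance matrix of delay-embedded states. The line of identity
    is kept for simplicity."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * delay
    states = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=2)
    rp = (dist <= eps).astype(int)
    rec = rp.sum() / rp.size

    diag_points = 0                        # recurrence points on lines >= l_min
    for k in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diagonal(rp, offset=k), 0):
            if v:
                run += 1
            else:
                if run >= l_min:
                    diag_points += run
                run = 0
    det = diag_points / rp.sum() if rp.sum() else 0.0
    return rec, det

# synthetic log-price series and its returns, purely for demonstration
prices = 100.0 * np.exp(np.cumsum(0.01 * np.random.randn(500)))
returns = np.diff(np.log(prices))
print(rqa_measures(returns, dim=3, delay=1, eps=0.5 * returns.std()))
```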

  20. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  1. Quantification practices in the nuclear industry

    International Nuclear Information System (INIS)

    1986-01-01

    In this chapter the quantification of risk practices adopted by the nuclear industries in Germany, Britain and France are examined as representative of the practices adopted throughout Europe. From this examination a number of conclusions are drawn about the common features of the practices adopted. In making this survey, the views expressed in the report of the Task Force on Safety Goals/Objectives appointed by the Commission of the European Communities, are taken into account. For each country considered, the legal requirements for presentation of quantified risk assessment as part of the licensing procedure are examined, and the way in which the requirements have been developed for practical application are then examined. (author)

  2. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the \

  3. Kinetic quantification of plyometric exercise intensity.

    Science.gov (United States)

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
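
    The outcome variables listed above can be derived directly from the vertical force-time record of a force platform. The sketch below shows one common way to do this for a single jump; the 10 N airborne threshold, the assumption that the record starts from quiet standing, and the function name are illustrative choices, not the authors' exact processing.

```python
import numpy as np

G = 9.81  # m/s^2

def jump_kinetics(force, fs, body_mass):
    """Take-off, airborne and landing metrics from a vertical ground reaction
    force record (N) sampled at fs Hz, assuming one flight phase and a record
    that begins with the subject standing still on the platform."""
    force = np.asarray(force, float)
    airborne = np.flatnonzero(force < 10.0)        # ~zero force = subject in the air
    takeoff, landing = airborne[0], airborne[-1] + 1
    flight_time = (landing - takeoff) / fs
    jump_height = G * flight_time ** 2 / 8.0       # from projectile motion
    net = force[:takeoff] - body_mass * G          # net force during push-off
    velocity = np.cumsum(net) / (fs * body_mass)   # impulse-momentum relation
    return {
        "flight_time_s": flight_time,
        "jump_height_m": jump_height,
        "peak_grf_takeoff_N": float(np.max(force[:takeoff])),
        "peak_grf_landing_N": float(np.max(force[landing:])),
        "peak_power_W": float(np.max(force[:takeoff] * velocity)),
    }
```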

  4. Quantification of heterogeneity observed in medical images

    International Nuclear Information System (INIS)

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity

  5. Quantification of heterogeneity observed in medical images.

    Science.gov (United States)

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  6. Lesion detection and quantification performance of the Tachyon-I time-of-flight PET scanner: phantom and human studies

    Science.gov (United States)

    Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S.; Moses, William W.; Qi, Jinyi

    2018-03-01

    The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1–1.3 over the TOF 500 ps and 1.5–1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.

  7. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are how resource conflicts are currently resolved as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  8. Resolution of praziquantel.

    Directory of Open Access Journals (Sweden)

    Michael Woelfle

    2011-09-01

    Full Text Available BACKGROUND: Praziquantel remains the drug of choice for the worldwide treatment and control of schistosomiasis. The drug is synthesized and administered as a racemate. Use of the pure active enantiomer would be desirable since the inactive enantiomer is associated with side effects and is responsible for the extremely bitter taste of the pill. METHODOLOGY/PRINCIPAL FINDINGS: We have identified two resolution approaches toward the production of praziquantel as a single enantiomer. One approach starts with commercially available praziquantel and involves a hydrolysis to an intermediate amine, which is resolved with a derivative of tartaric acid. This method was discovered through an open collaboration on the internet. The second method, identified by a contract research organisation, employs a different intermediate that may be resolved with tartaric acid itself. CONCLUSIONS/SIGNIFICANCE: Both resolution procedures identified show promise for the large-scale, economically viable production of praziquantel as a single enantiomer for a low price. Additionally, they may be employed by laboratories for the production of smaller amounts of enantiopure drug for research purposes that should be useful in, for example, elucidation of the drug's mechanism of action.

  9. High resolution hadron calorimetry

    International Nuclear Information System (INIS)

    Wigmans, R.

    1987-01-01

    The components that contribute to the signal of a hadron calorimeter and the factors that affect its performance are discussed, concentrating on two aspects: energy resolution and signal linearity. Both are decisively dependent on the relative response to the electromagnetic and the non-electromagnetic shower components, the e/h signal ratio, which should be equal to 1.0 for optimal performance. The factors that determine the value of this ratio are examined. The calorimeter performance is crucially determined by its response to the abundantly present soft neutrons in the shower. The presence of a considerable fraction of hydrogen atoms in the active medium is essential for achieving the best possible results. Firstly, this allows one to tune e/h to the desired value by choosing the appropriate sampling fraction. And secondly, the efficient neutron detection via recoil protons in the readout medium itself reduces considerably the effect of fluctuations in binding energy losses at the nuclear level, which dominate the intrinsic energy resolution. Signal equalization, or compensation (e/h = 1.0), does not seem to be a property unique to 238U, but can also be achieved with lead and probably even iron absorbers. 21 refs.; 19 figs

  10. Objective Tuning of Model Parameters in CAM5 Across Different Spatial Resolutions

    Science.gov (United States)

    Bulaevskaya, V.; Lucas, D. D.

    2014-12-01

    Parameterizations of physical processes in climate models are highly dependent on the spatial and temporal resolution and must be tuned for each resolution under consideration. At high spatial resolutions, objective methods for parameter tuning are computationally prohibitive. Our work has focused on calibrating parameters in the Community Atmosphere Model 5 (CAM5) for three spatial resolutions: 1, 2, and 4 degrees. Using perturbed-parameter ensembles and uncertainty quantification methodology, we have identified input parameters that minimize discrepancies of energy fluxes simulated by CAM5 across the three resolutions and with respect to satellite observations. We are also beginning to exploit the parameter-resolution relationships to objectively tune parameters in a high-resolution version of CAM5 by leveraging cheaper, low-resolution simulations and statistical models. We will present our approach to multi-resolution climate model parameter tuning, as well as the key findings. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344 and was supported from the DOE Office of Science through the Scientific Discovery Through Advanced Computing (SciDAC) project on Multiscale Methods for Accurate, Efficient, and Scale-Aware Models of the Earth System.

  11. Profiling of modified nucleosides from ribonucleic acid digestion by supercritical fluid chromatography coupled to high resolution mass spectrometry.

    Science.gov (United States)

    Laboureur, Laurent; Guérineau, Vincent; Auxilien, Sylvie; Yoshizawa, Satoko; Touboul, David

    2018-02-16

    A method based on supercritical fluid chromatography coupled to high resolution mass spectrometry for the profiling of canonical and modified nucleosides was optimized and compared to classical reverse-phase liquid chromatography in terms of separation, number of detected modified nucleosides and sensitivity. Limits of detection and quantification were measured using a statistical method, and quantifications of twelve nucleosides of a tRNA digest from E. coli are in good agreement with previously reported data. Results highlight the complementarity of both separation techniques to cover the largest view of nucleoside modifications for forthcoming epigenetic studies. Copyright © 2017 Elsevier B.V. All rights reserved.
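
    The abstract does not state which statistical method was used for the detection and quantification limits; a common choice is the calibration-curve approach (LOD = 3.3 s/slope, LOQ = 10 s/slope, with s the residual standard deviation of the regression), sketched here with purely hypothetical calibration data.

```python
import numpy as np

def lod_loq(concentrations, responses):
    """Limits of detection/quantification from a linear calibration curve,
    using the estimates LOD = 3.3*s/slope and LOQ = 10*s/slope, where s is
    the standard deviation of the regression residuals."""
    conc = np.asarray(concentrations, float)
    resp = np.asarray(responses, float)
    slope, intercept = np.polyfit(conc, resp, 1)
    residuals = resp - (slope * conc + intercept)
    s = residuals.std(ddof=2)           # two fitted parameters
    return 3.3 * s / slope, 10.0 * s / slope

# hypothetical nucleoside standard calibration, 0-100 nM
conc = [0, 5, 10, 25, 50, 100]
resp = [0.1, 5.2, 10.1, 24.8, 50.5, 99.7]
print(lod_loq(conc, resp))
```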

  12. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as computational simulation codes to which they are applied.

  13. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role for the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and the quantified value of an entity's trust is consistent with the behavior of the entity.
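
    A generic fuzzy comprehensive evaluation step of the kind described can be written in a few lines; the trust factors, weights, membership matrix and grade values below are hypothetical placeholders, not the paper's actual model.

```python
import numpy as np

def fuzzy_trust(weights, membership, grade_values):
    """Fuzzy comprehensive evaluation: combine a factor weight vector W with a
    membership matrix R (factors x evaluation grades) using the weighted
    average operator B = W . R, then defuzzify against numeric grade values."""
    W = np.asarray(weights, float)
    R = np.asarray(membership, float)
    B = W @ R                    # membership of the entity in each trust grade
    B = B / B.sum()              # normalize
    return float(B @ np.asarray(grade_values, float)), B

# three hypothetical trust factors (e.g. transaction success, recommendations,
# interaction history) rated against four grades: high, medium, low, untrusted
W = [0.5, 0.3, 0.2]
R = [[0.6, 0.3, 0.1, 0.0],
     [0.4, 0.4, 0.2, 0.0],
     [0.2, 0.5, 0.2, 0.1]]
grades = [1.0, 0.7, 0.4, 0.0]
score, B = fuzzy_trust(W, R, grades)
print(score, B)
```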

  14. High Time Resolution Astrophysics

    CERN Document Server

    Phelan, Don; Shearer, Andrew

    2008-01-01

    High Time Resolution Astrophysics (HTRA) is an important new window to the universe and a vital tool in understanding a range of phenomena from diverse objects and radiative processes. This importance is demonstrated in this volume with the description of a number of topics in astrophysics, including quantum optics, cataclysmic variables, pulsars, X-ray binaries and stellar pulsations to name a few. Underlining this science foundation, technological developments in both instrumentation and detectors are described. These instruments and detectors combined cover a wide range of timescales and can measure fluxes, spectra and polarisation. These advances make it possible for HTRA to make a big contribution to our understanding of the Universe in the next decade.

  15. Lexical ambiguity resolution

    Energy Technology Data Exchange (ETDEWEB)

    Small, S.; Cottrell, G.; Tanenhaus, M.

    1987-01-01

    This book collects much of the best research currently available on the problem of lexical ambiguity resolution in the processing of human language. When taken out of context, sentences are usually ambiguous. When actually uttered in a dialogue or written in text, these same sentences often have unique interpretations. The inherent ambiguity of isolated sentences becomes obvious in the attempt to write a computer program to understand them. Different views have emerged on the nature of context and the mechanisms by which it directs unambiguous understanding of words and sentences. These perspectives are represented and discussed. The eighteen original papers form a valuable source book for cognitive scientists in AI, psycholinguistics, neuropsychology, or theoretical linguistics.

  16. High resolution ultrasonic densitometer

    International Nuclear Information System (INIS)

    Dress, W.B.

    1983-01-01

    The velocity of torsional stress pulses in an ultrasonic waveguide of non-circular cross section is affected by the temperature and density of the surrounding medium. Measurements of the transit times of acoustic echoes from the ends of a sensor section are interpreted as the level, density, and temperature of the fluid environment surrounding that section. This paper examines methods of making these measurements to obtain high-resolution, temperature-corrected absolute and relative density and level determinations of the fluid. Possible applications include on-line process monitoring, a hand-held density probe for battery charge state indication, and precise inventory control for such diverse fluids as uranium salt solutions in accountability storage and gasoline in service station storage tanks.

  17. Gamma camera based Positron Emission Tomography: a study of the viability on quantification

    International Nuclear Information System (INIS)

    Pozzo, Lorena

    2005-01-01

    Positron Emission Tomography (PET) is a Nuclear Medicine imaging modality for diagnostic purposes. Pharmaceuticals labeled with positron emitters are used and images which represent the in vivo biochemical process within tissues can be obtained. The positron/electron annihilation photons are detected in coincidence and this information is used for object reconstruction. Presently, there are two types of systems available for this imaging modality: the dedicated systems and those based on gamma camera technology. In this work, we utilized PET/SPECT systems, which also allow for the traditional Nuclear Medicine studies based on single photon emitters. There are inherent difficulties which affect quantification of activity and other indices. They are related to the Poisson nature of radioactivity, to radiation interactions with patient body and detector, noise due to the statistical nature of these interactions and of all the detection processes, as well as the patient acquisition protocols. Corrections are described in the literature and not all of them are implemented by the manufacturers: scatter, attenuation, random, decay, dead time, spatial resolution, and others related to the properties of each piece of equipment. The goal of this work was to assess these methods adopted by two manufacturers, as well as the influence of some technical characteristics of PET/SPECT systems on the estimation of SUV. Data from a set of phantoms were collected in 3D mode by one camera and in 2D by the other. We concluded that quantification is viable in PET/SPECT systems, including the estimation of SUVs. This is only possible if, apart from the above-mentioned corrections, the camera is well tuned and coefficients for sensitivity normalization and partial volume corrections are applied. We also verified that the shapes of the sources used for obtaining these factors play a role in the final results and should be dealt with carefully in clinical quantification. Finally, the choice of the region

  18. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    Science.gov (United States)

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
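
    The random-positioning test described above can be approximated with a simple Monte Carlo permutation scheme; the sketch below works on 2D point coordinates and is a simplified stand-in for the simulation tool used in the protocol (input names and the contact criterion are assumptions).

```python
import numpy as np

def colocalization_test(pos_a, pos_b, allowed_positions, contact_radius,
                        n_sim=1000, seed=None):
    """Test whether type-B cells contact type-A cells more often than expected
    by chance. pos_a, pos_b and allowed_positions are (n, 2) arrays of x/y
    coordinates; allowed_positions samples the tissue area where B cells could
    plausibly sit (e.g. all segmented cell positions inside the bone marrow)."""
    rng = np.random.default_rng(seed)
    pos_a, pos_b = np.asarray(pos_a, float), np.asarray(pos_b, float)
    allowed_positions = np.asarray(allowed_positions, float)

    def n_contacts(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return int((d.min(axis=0) <= contact_radius).sum())   # B cells touching any A

    observed = n_contacts(pos_a, pos_b)
    null = np.empty(n_sim, int)
    for i in range(n_sim):
        idx = rng.choice(len(allowed_positions), size=len(pos_b), replace=False)
        null[i] = n_contacts(pos_a, allowed_positions[idx])
    p_value = (np.sum(null >= observed) + 1) / (n_sim + 1)
    return observed, float(null.mean()), p_value
```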

  19. Contribution to the development of an absolute quantification method in Single Photon Emission Tomography of the brain

    International Nuclear Information System (INIS)

    Dinis-De-Almeida, Pedro-Miguel

    1999-01-01

    Recent technical advances in SPECT have focused on the use of transmission imaging and on the development of new iterative algorithms for attenuation correction. These new tools can be coupled to approaches which compensate for scattering and spatial resolution, in order to quantify the radioactive concentration values in vivo. The main objective of this work was to investigate a quantification method of radioactivity uptake in small cerebral structures using SPECT. This method was based on the correction of attenuation using transmission data. Compton events were estimated and subtracted by positioning a lower energy window. Spatial resolution effects have been corrected using Fourier deconvolution. The radiation dose received by patients during transmission scans was evaluated using anthropomorphic phantoms and suitable dosimeters. A preliminary evaluation of the quantification method was carried out using an anthropomorphic head phantom. In a second phase, in vivo acquisitions were performed in baboon. The values of the percent injected doses per millilitre of tissue in baboon striata were compared under similar experimental conditions using SPECT and PET radiotracers specific for the D2 dopamine receptors. Experiments carried with anthropomorphic phantoms have indicated that the clinical use of transmission scans in SPECT is not limited by radiation doses. Measurements have demonstrated that attenuation dramatically affects quantification in brain SPECT. This effect can be corrected using a map of linear attenuation coefficients obtained through transmission scans and an iterative reconstruction algorithm. After correcting for attenuation, scatter and spatial resolution effects, the accuracy of activity concentration values measurement in the 'striata' of phantom is greatly improved. Results obtained in vivo show that the percent injected doses per millilitre of tissue can be measured with errors similar to those found in PET. This work demonstrates

  20. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  1. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  2. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  3. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  4. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  5. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2015-01-01

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving

  6. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  7. The value of serum Hepatitis B surface antigen quantification in ...

    African Journals Online (AJOL)

    The value of serum Hepatitis B surface antigen quantification in determining viral activity in chronic Hepatitis B virus infection. ... of CHB and also higher in hepatitis e antigen positive patients compared to hepatitis e antigen negative patients.

  8. an expansion of the aboveground biomass quantification model for ...

    African Journals Online (AJOL)

    Research Note BECVOL 3: an expansion of the aboveground biomass quantification model for ... African Journal of Range and Forage Science ... encroachment and estimation of food to browser herbivore species, was proposed during 1989.

  9. 1H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own...... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical 1H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results......Proton magnetic resonance spectroscopy (1H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of 1H-MRS metabolite quantification...

  10. A simple method of digitizing analog scintigrams for quantification and digital archiving

    International Nuclear Information System (INIS)

    Schramm, M.; Kaempfer, B.; Wolf, H.; Clausen, M.; Wendhausen, H.; Henze, E.

    1993-01-01

    This study was undertaken to evaluate a quick, reliable and cheap method of digitizing analog scintigrams. 40 whole-body bone scintigrams were obtained simultaneously in analog and genuine digital format. The analog scans on X-ray film were then digitized secondarily by three different methods: 300 dpi flatbed scanning, high-resolution camera scanning and camcorder recording. A simple exposure approach using a light box, a cheap camcorder, a PC and image grabber hard- and software proved to be optimal. Visual interpretation showed no differences in clinical findings when comparing the analog images with their secondarily digitized counterparts. To test the possibility of quantification, 126 equivalent ROIs were drawn both in the genuine digital and the secondarily digitized images. Comparing the ROI count to whole-body count percentage of the corresponding ROIs showed the correlation to be linear. The evaluation of phantom studies showed the linear correlation to be true within a wide activity range. Thus, secondary digitalization of analog scintigrams is an easy, cheap and reliable method of archiving images and allows secondary digital quantification. (orig.) [de
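
    The quantification check described above (ROI counts expressed as a percentage of whole-body counts, compared between the genuine digital and the secondarily digitized images) is straightforward to reproduce; the numbers below are made up for illustration.

```python
import numpy as np

def roi_percentages(roi_counts, whole_body_counts):
    """Express each ROI count as a percentage of the whole-body counts."""
    return 100.0 * np.asarray(roi_counts, float) / float(whole_body_counts)

# hypothetical paired measurements for the same ROIs in both image formats
digital   = roi_percentages([1200, 850, 430, 300, 95], 52000)
digitized = roi_percentages([1150, 820, 450, 290, 100], 50500)

slope, intercept = np.polyfit(digital, digitized, 1)
r = np.corrcoef(digital, digitized)[0, 1]
print(slope, intercept, r)   # slope near 1 and r near 1 indicate linear agreement
```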

  11. Impact of polymeric membrane filtration of oil sands process water on organic compounds quantification.

    Science.gov (United States)

    Moustafa, Ahmed M A; Kim, Eun-Sik; Alpatova, Alla; Sun, Nian; Smith, Scott; Kang, Seoktae; Gamal El-Din, Mohamed

    2014-01-01

    The interaction between organic fractions in oil sands process-affected water (OSPW) and three polymeric membranes with varying hydrophilicity (nylon, polyvinylidene fluoride and polytetrafluoroethylene) at different pHs was studied to evaluate the impact of filtration on the quantification of acid-extractable fraction (AEF) and naphthenic acids (NAs). Four functional groups predominated in OSPW (amine, phosphoryl, carboxyl and hydroxyl) as indicated by the linear programming method. The nylon membranes were the most hydrophilic and exhibited the lowest AEF removal at pH of 8.7. However, the adsorption of AEF on the membranes increased as the pH of OSPW decreased due to hydrophobic interactions between the membrane surfaces and the protonated molecules. The use of ultra pressure liquid chromatography-high resolution mass spectrometry (UPLC/HRMS) showed insignificant adsorption of NAs on the tested membranes at pH 8.7. However, 26±2.4% adsorption of NAs was observed at pH 5.3 following the protonation of NAs species. For the nylon membrane, excessive carboxylic acids in the commercial NAs caused the formation of negatively charged assisted hydrogen bonds, resulting in increased adsorption at pH 8.2 (25%) as compared to OSPW (0%). The use of membranes for filtration of soluble compounds from complex oily wastewaters before quantification analysis of AEF and NAs should be examined prior to application.

  12. [A simple method of digitizing analog scintigrams for quantification and digital archiving].

    Science.gov (United States)

    Schramm, M; Kämpfer, B; Wolf, H; Clausen, M; Wendhausen, H; Henze, E

    1993-02-01

    This study was undertaken to evaluate a quick, reliable and cheap method of digitizing analog scintigrams. 40 whole-body bone scintigrams were obtained simultaneously in analog and genuine digital format. The analog scans on x-ray film were then digitized secondarily by three different methods: 300 dpi flat-bed scanning, high-resolution camera scanning and camcorder recording. A simple exposure approach using a light box, a cheap camcorder, a PC and image grabber hard- and software proved to be optimal. Visual interpretation showed no differences in clinical findings when comparing the analog images with their secondarily digitized counterparts. To test the possibility of quantification, 126 equivalent ROIs were drawn both in the genuine digital and the secondarily digitized images. Comparing the ROI count to whole-body count percentage of the corresponding ROIs showed the correlation to be linear. The evaluation of phantom studies showed the linear correlation to be true within a wide activity range. Thus, secondary digitalization of analog scintigrams is an easy, cheap and reliable method of archiving images and allows secondary digital quantification.

  13. Cardiac chamber quantification using magnetic resonance imaging at 7 Tesla - a pilot study

    International Nuclear Information System (INIS)

    Knobelsdorff-Brenkenhoff, Florian von; Schulz-Menger, Jeanette; Frauenrath, Tobias; Hezel, Fabian; Prothmann, Marcel; Dieringer, Matthias A.; Niendorf, Thoralf; Renz, Wolfgang; Kretschel, Kerstin

    2010-01-01

    Interest in cardiovascular magnetic resonance (CMR) at 7 T is motivated by the expected increase in spatial and temporal resolution, but the method is technically challenging. We examined the feasibility of cardiac chamber quantification at 7 T. A stack of short axes covering the left ventricle was obtained in nine healthy male volunteers. At 1.5 T, steady-state free precession (SSFP) and fast gradient echo (FGRE) cine imaging with 7 mm slice thickness (STH) were used. At 7 T, FGRE with 7 mm and 4 mm STH were applied. End-diastolic volume, end-systolic volume, ejection fraction and mass were calculated. All 7 T examinations provided excellent blood/myocardium contrast for all slice directions. No significant difference was found regarding ejection fraction and cardiac volumes between SSFP at 1.5 T and FGRE at 7 T, while volumes obtained from FGRE at 1.5 T were underestimated. Cardiac mass derived from FGRE at 1.5 and 7 T was larger than obtained from SSFP at 1.5 T. Agreement of volumes and mass between SSFP at 1.5 T and FGRE improved for FGRE at 7 T when combined with an STH reduction to 4 mm. This pilot study demonstrates that cardiac chamber quantification at 7 T using FGRE is feasible and agrees closely with SSFP at 1.5 T. (orig.)
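
    Independently of field strength and sequence, the chamber indices named above (end-diastolic volume, end-systolic volume, ejection fraction, mass) are usually obtained from the short-axis stack by summation of disks; a minimal sketch with hypothetical contour areas is shown below (the 1.05 g/mL myocardial density is a standard assumption).

```python
def lv_quantification(endo_areas_ed, endo_areas_es, epi_areas_ed, slice_thickness_mm):
    """Summation-of-disks quantification from a short-axis stack: endocardial and
    epicardial areas in mm^2 per slice, slice thickness in mm; returns end-diastolic
    and end-systolic volumes (mL), ejection fraction (%) and myocardial mass (g)."""
    to_ml = slice_thickness_mm / 1000.0                 # mm^2 * mm -> mm^3 -> mL
    edv = sum(endo_areas_ed) * to_ml
    esv = sum(endo_areas_es) * to_ml
    ef = 100.0 * (edv - esv) / edv
    myo_ml = (sum(epi_areas_ed) - sum(endo_areas_ed)) * to_ml
    mass = 1.05 * myo_ml                                # myocardial density ~1.05 g/mL
    return edv, esv, ef, mass

# hypothetical 10-slice stack with 7 mm slices
print(lv_quantification(
    endo_areas_ed=[900, 1100, 1200, 1250, 1250, 1200, 1100, 950, 700, 400],
    endo_areas_es=[400, 550, 650, 700, 700, 650, 550, 450, 300, 150],
    epi_areas_ed=[2300, 2500, 2600, 2650, 2650, 2600, 2500, 2350, 2100, 1800],
    slice_thickness_mm=7.0))
```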

  14. Environmental Systems Conflict Resolution

    Science.gov (United States)

    Hipel, K. W.

    2017-12-01

    The Graph Model for Conflict Resolution (GMCR) is applied to a real-life groundwater contamination dispute to demonstrate how one can realistically model and analyze the controversy in order to obtain an enhanced understanding and strategic insights for permitting one to make informed decisions. This highly divisive conflict is utilized to explain a rich range of inherent capabilities of GMCR, as well as worthwhile avenues for extensions, which make GMCR a truly powerful decision technology for addressing challenging conflict situations. For instance, a flexible preference elicitation method called option prioritization can be employed to obtain the relative preferences of each decision maker (DM) in the dispute over the states or scenarios which can occur, based upon preference statements regarding the options or courses of actions available to the DMs. Solution concepts, reflecting the way a chess player thinks in terms of moves and counter-moves, are defined to mirror the ways humans may behave under conflict, varying from short to long term thinking. After ascertaining the best outcome that a DM can achieve on his or her own in a conflict, coalition analysis algorithms are available to check if a DM can fare even better via cooperating with others. The ability of GMCR to take into account emotions, strength of preference, attitudes, misunderstandings (referred to as hypergames), and uncertain preferences (unknown, fuzzy, grey and probabilistic) greatly broadens its scope of applicability. Techniques for tracing how a conflict can evolve over time from a status quo state to a final specified outcome, as well as how to handle hierarchical structures, such as when a central government interacts with its provinces or states, further enforces the comprehensive nature of GMCR. Within ongoing conflict research mimicking how physical systems are analyzed, methods for inverse engineering of preferences are explained for determining the preferences required by one or

  15. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang, Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2009-01-01

    International audience; A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit at different points. It took into account heat exchange of the representative elemental volume, metabolism heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  16. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit......
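
    Velocity-mapped flow in the ascending aorta is typically converted to a regurgitant volume and fraction by separating the forward (systolic) and backward (diastolic) parts of the flow curve; the short sketch below illustrates that step only, with a toy flow curve, and is not the authors' processing pipeline.

```python
import numpy as np

def regurgitation(flow_ml_per_s, dt_s):
    """Regurgitant volume (mL) and regurgitant fraction (%) from one cardiac
    cycle of an aortic flow curve in mL/s: forward flow is the positive part
    of the curve, regurgitant flow the negative (diastolic backward) part."""
    flow = np.asarray(flow_ml_per_s, float)
    forward = flow[flow > 0].sum() * dt_s
    backward = -flow[flow < 0].sum() * dt_s
    return backward, 100.0 * backward / forward

# toy flow curve: strong systolic forward flow, smaller diastolic backward flow
t = np.arange(0, 0.8, 0.02)
flow = np.where(t < 0.3, 400 * np.sin(np.pi * t / 0.3), -60.0)
print(regurgitation(flow, dt_s=0.02))
```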

  17. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating the fire PSA (APS Fire; it also covers floods and earthquakes). With the application, fire scenarios are quantified at the plant, integrating the tasks performed during the APS Fire. This paper describes the main features of the program that allow quantification of an APS Fire. (Author)

  18. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    Alves, A.F.F.; Miranda, J.R.A.; Pina, D.R.

    2013-01-01

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, due to the greater vulnerability of children to the late somatic and genetic effects of radiation exposure compared to adults. In Brazil, head trauma is estimated to account for 18% of the causes of death in the 1-5 year age group, and radiography is the primary diagnostic test for the detection of skull fracture. Knowing that image quality is essential to ensure the identification of anatomical structures and to minimize errors of diagnostic interpretation, this paper proposes the development and construction of homogeneous skull phantoms for the age group of 1-5 years. The homogeneous phantoms were constructed using the classification and quantification of the tissues present in the skulls of pediatric patients. In this procedure, computational algorithms implemented in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary data obtained from measurements show that, between the ages of 1 and 5 years, an average anteroposterior diameter of the pediatric skull region of 145.73 ± 2.97 mm can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a patient-equivalent phantom (PEP) arrangement. After their construction, the phantoms will be used for image and dose optimization in pediatric computed radiography examination protocols.

  19. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
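
    Picking up on the bootstrap idea described above, the sketch below resamples concentration-response points and refits a simple two-parameter Hill curve to obtain percentile confidence intervals; the model, starting values, data and 95% interval are illustrative choices, not the ToxCast pipeline itself.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50):
    """Two-parameter Hill model with unit slope (a simplifying assumption)."""
    return top * conc / (ac50 + conc)

def bootstrap_hill_ci(conc, resp, n_boot=1000, rng=None):
    """Case-resampling bootstrap confidence intervals for Hill parameters
    fitted to concentration-response data."""
    rng = np.random.default_rng(rng)
    conc, resp = np.asarray(conc, float), np.asarray(resp, float)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(conc), len(conc))
        try:
            p, _ = curve_fit(hill, conc[idx], resp[idx],
                             p0=[resp.max(), np.median(conc)], maxfev=2000)
            estimates.append(p)
        except RuntimeError:
            continue    # skip resamples where the fit does not converge
    estimates = np.array(estimates)
    lo, hi = np.percentile(estimates, [2.5, 97.5], axis=0)
    return {"top_ci": (lo[0], hi[0]), "ac50_ci": (lo[1], hi[1])}

# hypothetical concentration-response data (uM, % activity)
conc = np.array([0.001, 0.01, 0.1, 1, 10, 100])
resp = np.array([2, 5, 15, 48, 85, 97])
print(bootstrap_hill_ci(conc, resp, n_boot=500, rng=1))
```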

  20. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  1. Quantification of the vocal folds’ dynamic displacements

    International Nuclear Information System (INIS)

    Hernández-Montes, María del Socorro; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-01-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ∼100–1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues. (paper)

  2. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
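
    A minimal sketch of the underlying construction, assuming simple delay embedding and a fixed recurrence threshold, is shown below; the descriptor series, embedding parameters and threshold are illustrative, and the published measure for tracking curved and disrupted traces is not reproduced here.

```python
import numpy as np

def embed(x, dim, tau):
    """Time-delay embedding of a 1-D descriptor series into state space."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def cross_recurrence_plot(x, y, dim=3, tau=2, eps=0.5):
    """Binary cross recurrence plot between two series for a fixed threshold."""
    X, Y = embed(x, dim, tau), embed(y, dim, tau)
    dist = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    return (dist < eps).astype(np.uint8)

# Toy example: a descriptor series and a time-shifted, noisier "rendition"
rng = np.random.default_rng(1)
t = np.linspace(0, 20 * np.pi, 800)
original = np.sin(t) + 0.1 * rng.normal(size=t.size)
cover = np.sin(t + 0.8) + 0.1 * rng.normal(size=t.size)

crp = cross_recurrence_plot(original, cover)
# Diagonal traces in the CRP mark stretches where the two trajectories follow
# each other; the overall recurrence rate is the crudest similarity score.
print("CRP shape:", crp.shape, " recurrence rate:", round(float(crp.mean()), 3))
```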

  3. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  4. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
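
    The two quantities at the core of the argument, the Bayesian post-test probability and its information-theoretic uncertainty, can be computed directly. The sketch below uses illustrative sensitivity, specificity and pre-test values, not figures from the paper.

```python
import numpy as np

def post_test_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' rule: update the disease probability after a test result."""
    if positive:
        num = sensitivity * pretest
        den = num + (1.0 - specificity) * (1.0 - pretest)
    else:
        num = (1.0 - sensitivity) * pretest
        den = num + specificity * (1.0 - pretest)
    return num / den

def binary_entropy(p):
    """Diagnostic uncertainty in bits: 1 bit = maximal uncertainty (p = 0.5)."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Illustrative numbers (not taken from the paper)
pretest = 0.20
post = post_test_probability(pretest, sensitivity=0.90, specificity=0.85)
print(f"post-test probability: {post:.2f}")
print(f"uncertainty before: {binary_entropy(pretest):.2f} bits, "
      f"after: {binary_entropy(post):.2f} bits")

# A range of pre-test probabilities propagates to a range of post-test values
for p0 in (0.10, 0.20, 0.30):
    print(p0, "->", round(post_test_probability(p0, 0.90, 0.85), 2))
```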

  5. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. We have developed a practical AutoLISP program on PC AutoCAD to quantify the acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with a normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial CT scanning. The slices were prepared with a fixed coordinate system and in contiguous sections of 5 mm thickness. The contours of the cartilage of each section were digitized into a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that hips with a total coverage ratio greater than 80%, an anterior coverage ratio greater than 75% and a posterior coverage ratio greater than 80% can be categorized as normal. Polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  6. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  7. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, which employ matrix factorizations, incur a cubic cost that quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling in massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance of 73% of the theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
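
    A minimal sketch of the first ingredient, stochastic estimation of the diagonal of an inverse via probe vectors and iterative solves, is given below; it uses a small dense matrix and SciPy's conjugate gradient solver purely for illustration, without the mixed-precision refinement or parallel BLAS 3 machinery described above.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)

# Small SPD "covariance-like" matrix, purely for illustration
n = 200
B = rng.normal(size=(n, n))
A = B @ B.T + n * np.eye(n)

def estimate_diag_inv(A, n_probes=200):
    """Stochastic (Hutchinson-style) estimate of diag(A^{-1}).

    Each Rademacher probe v contributes v * (A^{-1} v); averaging over probes
    converges to the diagonal. The solves use an iterative method (CG here)
    instead of a matrix factorization, echoing the approach described above.
    """
    n = A.shape[0]
    op = LinearOperator((n, n), matvec=lambda x: A @ x)
    acc = np.zeros(n)
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        x, info = cg(op, v)              # one linear solve per probe
        acc += v * x
    return acc / n_probes

approx = estimate_diag_inv(A)
exact = np.diag(np.linalg.inv(A))
print("mean relative error:", float(np.mean(np.abs(approx - exact) / exact)))
```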

  8. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. The standardless algorithms are quite faster than the methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and also they help to solve the problem of current variation, for example, in equipments with cold field emission gun. Due to significant advances in the accuracy achieved during the last years, product of the successive efforts made to improve the description of generation, absorption and detection of X-rays, the standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark, that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  9. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina Greese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example for de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches towards the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.

  10. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G

    2009-01-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  11. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can be different according to the method used and according to the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without the knowledge of the true covariance matrix. The full approach is illustrated on the 85Rb nucleus evaluations and the results are then used for a discussion on scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
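
    For two zero-mean Gaussians the Kullback-Leibler distance between a covariance estimate and a reference covariance has a closed form, which the sketch below evaluates for sample covariances of increasing sample size; the matrices are synthetic stand-ins, not nuclear data evaluations.

```python
import numpy as np

def kl_gauss(sigma0, sigma1):
    """Kullback-Leibler divergence KL(N(0, sigma0) || N(0, sigma1)) between
    two zero-mean Gaussians, used here as a distance between a covariance
    estimate and a reference covariance."""
    k = sigma0.shape[0]
    inv1 = np.linalg.inv(sigma1)
    _, logdet0 = np.linalg.slogdet(sigma0)
    _, logdet1 = np.linalg.slogdet(sigma1)
    return 0.5 * (np.trace(inv1 @ sigma0) - k + logdet1 - logdet0)

rng = np.random.default_rng(0)

# "True" covariance and estimates obtained from different sample sizes
# (illustrative stand-ins for different estimation methods)
k = 5
L = rng.normal(size=(k, k))
truth = L @ L.T + k * np.eye(k)

def sample_cov(n):
    x = rng.multivariate_normal(np.zeros(k), truth, size=n)
    return np.cov(x, rowvar=False)

for n in (20, 200, 2000):
    est = sample_cov(n)
    print(f"n={n:5d}  KL(estimate || truth) = {kl_gauss(est, truth):.4f}")
```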

  12. Quantification of water penetration into concrete through cracks by neutron radiography

    International Nuclear Information System (INIS)

    Kanematsu, M.; Maruyama, I.; Noguchi, T.; Iikura, H.; Tsuchiya, N.

    2009-01-01

    Improving the durability of concrete structures is one of the ways to contribute to the sustainable development of society, and it has also become a crucial issue from an environmental viewpoint. It is well known that moisture behavior in reinforced concrete is linked to phenomena such as cement hydration, volume change and cracking caused by drying shrinkage, rebar corrosion and water leakage that affect the durability of concrete. In this research, thermal neutron radiography (TNR) was applied for the visualization and quantification of water penetration into concrete through cracks. It is clearly confirmed that TNR can visualize the water behavior in and near horizontal and vertical cracks and can quantify the diffusion rate and concentration distribution of moisture with high spatial and temporal resolution. On detailed analysis, it is observed that water penetrates through the crack immediately after pouring, and that its migration speed and distribution depend on the moisture condition in the concrete.

  13. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    Science.gov (United States)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
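
    The non-intrusive idea, treating the simulator as a black box and only choosing where to evaluate it, can be illustrated with a scalar model and one Gaussian input; the sketch below compares Monte Carlo sampling with a deterministic Gauss-Hermite rule. The model function and sample sizes are illustrative, not taken from the study.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

# A scalar "simulation output" depending on one uncertain input xi ~ N(0, 1).
# (Illustrative surrogate for a CFD quantity of interest.)
def model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi ** 2

# 1) Monte Carlo integration
rng = np.random.default_rng(0)
samples = model(rng.normal(size=100_000))
mc_mean, mc_var = samples.mean(), samples.var()

# 2) Deterministic rule: Gauss-Hermite quadrature (probabilists' weights)
nodes, weights = hermegauss(10)
weights = weights / np.sqrt(2.0 * np.pi)        # normalize to a standard normal pdf
vals = model(nodes)
q_mean = np.sum(weights * vals)
q_var = np.sum(weights * (vals - q_mean) ** 2)

print(f"Monte Carlo : mean={mc_mean:.5f}  var={mc_var:.5f}")
print(f"Quadrature  : mean={q_mean:.5f}  var={q_var:.5f}")
```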

  14. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity.

    Science.gov (United States)

    Zhong, Qing; Rüschoff, Jan H; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J; Rupp, Niels J; Fankhauser, Christian; Buhmann, Joachim M; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C; Jochum, Wolfram; Wild, Peter J

    2016-04-07

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility.

  15. Characterising non-linear dynamics in nocturnal breathing patterns of healthy infants using recurrence quantification analysis.

    Science.gov (United States)

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2013-05-01

    Breathing dynamics vary between infant sleep states, and are likely to exhibit non-linear behaviour. This study applied the non-linear analytical tool recurrence quantification analysis (RQA) to 400-breath-interval periods of REM and N-REM sleep, and then across the night using an overlapping moving window. The RQA variables differed between sleep states, with REM radius 150% greater than N-REM radius, and REM laminarity 79% greater than N-REM laminarity. RQA allowed the observation of temporal variations in non-linear breathing dynamics across a night's sleep at 30 s resolution, and provides a basis for quantifying changes in complex breathing dynamics with physiology and pathology. Copyright © 2013 Elsevier Ltd. All rights reserved.
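
    A simplified sketch of such a moving-window analysis is given below: a recurrence matrix is built for each window of breath intervals and a basic laminarity measure (the fraction of recurrent points on vertical lines) is reported. The embedding parameters, radius and synthetic series are illustrative, and the clinical preprocessing is omitted.

```python
import numpy as np

def recurrence_matrix(x, dim=3, tau=1, radius=0.3):
    """Binary recurrence matrix of a (normalized) breath-interval segment."""
    n = len(x) - (dim - 1) * tau
    X = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return dist < radius

def laminarity(R, vmin=2):
    """Fraction of recurrent points lying on vertical lines of length >= vmin."""
    lam = 0
    for col in R.T:
        run = 0
        for point in np.append(col, False):       # sentinel flushes the last run
            if point:
                run += 1
            else:
                if run >= vmin:
                    lam += run
                run = 0
    total = int(R.sum())
    return lam / total if total else 0.0

rng = np.random.default_rng(2)
# Synthetic breath-interval series (seconds); not infant data
breaths = 1.0 + 0.15 * np.sin(np.arange(2000) / 25.0) + 0.05 * rng.normal(size=2000)

# Overlapping moving window of 400 breath intervals, advanced by 200
for start in range(0, len(breaths) - 400 + 1, 200):
    seg = breaths[start:start + 400]
    R = recurrence_matrix((seg - seg.mean()) / seg.std())
    print(f"window at breath {start:4d}: RR={R.mean():.3f}  LAM={laminarity(R):.3f}")
```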

  16. Tele-AAC Resolution

    Directory of Open Access Journals (Sweden)

    Kate Anderson

    2012-12-01

    Full Text Available Approximately 1.3% of all people, or about 4 million Americans, cannot rely on their natural speech to meet their daily communication needs. Telepractice offers a potentially cost-effective service delivery mechanism to provide clinical AAC services at a distance to the benefit of underserved populations in the United States and worldwide. Tele-AAC is a unique cross-disciplinary clinical service delivery model that requires expertise in both telepractice and augmentative and alternative communication (AAC) systems. The Tele-AAC Working Group of the 2012 ISAAC Research Symposium therefore drafted a resolution underscoring the importance of identifying and characterizing the unique opportunities and constraints of Tele-AAC in all aspects of service delivery. These include, but are not limited to: needs assessments; implementation planning; device/system procurement, set-up and training; quality assurance, client progress monitoring, and follow-up service delivery. Tele-AAC, like other telepractice applications, requires adherence to the ASHA Code of Ethics and other policy documents, and state, federal, and international laws, as well as a competent technological infrastructure. The Working Group recommends that institutions of higher education and professional organizations provide training in Tele-AAC service provision. In addition, research and development are needed to create validity measures across Tele-AAC practices (i.e., assessment, implementation, and consultation); determine the communication competence levels achieved by Tele-AAC users; discern stakeholders’ perceptions of Tele-AAC services (e.g., acceptability and viability); maximize Tele-AAC’s capacity to engage multiple team members in AAC assessment and ongoing service; identify the limitations and barriers of Tele-AAC provision; and develop potential solutions.

  17. Quantification of arterial plaque and lumen density with MDCT

    International Nuclear Information System (INIS)

    Paul, Narinder S.; Blobel, Joerg; Kashani, Hany; Rice, Murray; Ursani, Ali

    2010-01-01

    Purpose: This study aimed to derive a mathematical correction function in order to normalize CT number measurements for small-volume arterial plaque and small vessel-mimicking objects imaged with multidetector CT (MDCT). Methods: A commercially available calcium plaque phantom (QRM GmbH, Moehrendorf, Germany) and a custom-built cardiovascular phantom were scanned with 320- and 64-row MDCT scanners. The calcium hydroxyapatite plaque phantom contained objects 0.5-5.0 mm in diameter with known nominal CT attenuation values ranging from 50 to 800 HU. The cardiovascular phantom contained vessel-mimicking objects 1.0-5.0 mm in diameter with different contrast media. Both phantoms were scanned using clinical protocols for CT angiography and images were reconstructed with different filter kernels. The measured CT number (HU) and diameter of each object were analyzed on three clinical postprocessing workstations. From the resultant data, a mathematical formula was derived, based on the absorption function exp(-μ*d), to describe the relation between measured CT numbers and object diameters. Results: The percentage reduction in measured CT number (HU) for the group of selected filter kernels, apparent during CT angiography, depends only on the object size (plaque or vessel diameter). The derived formula, of the form 1 - c*exp(-a*d^b), showed a reduction in CT number for objects between 0.5 and 5 mm in diameter, with the asymptote reaching background noise for small objects with diameters nearing the CT in-plane resolution (0.35 mm). No reduction was observed for objects with diameters equal to or larger than 5 mm. Conclusions: A clear mathematical relationship exists between object diameter and the reduction in measured CT number in HU. This function is independent of exposure parameters and of the inherent attenuation properties of the objects studied. Future developments include the incorporation of this mathematical model function into quantification software in order to automatically
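
    Fitting the stated functional form to measured-to-nominal CT number ratios, and using it to correct a measurement, might look like the sketch below; the diameter/ratio data points and the correction step are synthetic placeholders, not values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hu_ratio(d, a, b, c):
    """Measured-to-nominal CT number ratio vs object diameter d (mm),
    using the functional form 1 - c*exp(-a*d^b) quoted above."""
    return 1.0 - c * np.exp(-a * d ** b)

# Synthetic calibration points (illustrative, not the study's measurements)
diam = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])      # mm
ratio = np.array([0.30, 0.55, 0.72, 0.83, 0.94, 0.98, 1.00])

(a, b, c), _ = curve_fit(hu_ratio, diam, ratio, p0=[1.0, 1.0, 1.0],
                         bounds=([0.0, 0.1, 0.0], [10.0, 5.0, 2.0]))
print(f"fitted correction: 1 - {c:.2f}*exp(-{a:.2f}*d^{b:.2f})")

# Correcting a measured CT number for a small object of known diameter
measured_hu, d_obj = 250.0, 1.5
corrected_hu = measured_hu / hu_ratio(d_obj, a, b, c)
print(f"corrected CT number: {corrected_hu:.0f} HU")
```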

  18. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and for studying their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked for the respective time length. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage by cells at different time points. An uptake of 85 % by HCT 116 cells is observed after 24 hours incubation at NW-to-cell ratios of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine-learning based time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and another cell line derived from the cervix carcinoma, HeLa. It has thus the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell types.

  19. Quantification of water in hydrous ringwoodite

    Directory of Open Access Journals (Sweden)

    Sylvia-Monique Thomas

    2015-01-01

    Full Text Available Ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth’s mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78180 to 158880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum for a natural hydrous ringwoodite inclusion in diamond from the study of Pearson et al. (2014) indicates the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming near-maximum amounts of H2O for this sample from the transition zone.

  20. Super-resolution Phase Tomography

    KAUST Repository

    Depeursinge, Christian; Cotte, Yann; Toy, Fatih; Jourdain, Pascal; Boss, Daniel; Marquet, Pierre; Magistretti, Pierre J.

    2013-01-01

    Digital Holographic Microscopy (DHM) yields reconstructed complex wavefields. It allows synthesizing the aperture of a virtual microscope up to 2π, offering super-resolution phase images. Live images of micro-organisms and neurons with resolution less than 100 nm are presented.

  1. Resolution analysis by random probing

    NARCIS (Netherlands)

    Fichtner, Andreas; van Leeuwen, T.

    2015-01-01

    We develop and apply methods for resolution analysis in tomography, based on stochastic probing of the Hessian or resolution operators. Key properties of our methods are (i) low algorithmic complexity and easy implementation, (ii) applicability to any tomographic technique, including full‐waveform

  2. Super-resolution Phase Tomography

    KAUST Repository

    Depeursinge, Christian

    2013-04-21

    Digital Holographic Microscopy (DHM) yields reconstructed complex wavefields. It allows synthesizing the aperture of a virtual microscope up to 2π, offering super-resolution phase images. Live images of micro-organisms and neurons with resolution less than 100 nm are presented.

  3. "Planar" Tautologies Hard for Resolution

    DEFF Research Database (Denmark)

    Dantchev, Stefan; Riis, Søren

    2001-01-01

    We prove exponential lower bounds on the resolution proofs of some tautologies, based on rectangular grid graphs. More specifically, we show a 2^Ω(n) lower bound for any resolution proof of the mutilated chessboard problem on a 2n×2n chessboard as well as for the Tseitin tautology (G. Tseitin, 196...

  4. Resolution function in neutron diffractometry

    International Nuclear Information System (INIS)

    Popa, N.

    1987-01-01

    The resolution function in neutron diffractometry is defined by generalizing the resolution function formerly formulated for the double-axis neutron spectrometer. A polemical discussion is raised concerning an approach to this function existing in the literature. The present approach is made concrete for the DN-2 time-of-flight diffractometer installed at the IBR-2 reactor.

  5. Validation of tumor protein marker quantification by two independent automated immunofluorescence image analysis platforms

    Science.gov (United States)

    Peck, Amy R; Girondo, Melanie A; Liu, Chengbao; Kovatich, Albert J; Hooke, Jeffrey A; Shriver, Craig D; Hu, Hai; Mitchell, Edith P; Freydin, Boris; Hyslop, Terry; Chervoneva, Inna; Rui, Hallgeir

    2016-01-01

    Protein marker levels in formalin-fixed, paraffin-embedded tissue sections traditionally have been assayed by chromogenic immunohistochemistry and evaluated visually by pathologists. Pathologist scoring of chromogen staining intensity is subjective and generates low-resolution ordinal or nominal data rather than continuous data. Emerging digital pathology platforms now allow quantification of chromogen or fluorescence signals by computer-assisted image analysis, providing continuous immunohistochemistry values. Fluorescence immunohistochemistry offers greater dynamic signal range than chromogen immunohistochemistry, and combined with image analysis holds the promise of enhanced sensitivity and analytic resolution, and consequently more robust quantification. However, commercial fluorescence scanners and image analysis software differ in features and capabilities, and claims of objective quantitative immunohistochemistry are difficult to validate as pathologist scoring is subjective and there is no accepted gold standard. Here we provide the first side-by-side validation of two technologically distinct commercial fluorescence immunohistochemistry analysis platforms. We document highly consistent results by (1) concordance analysis of fluorescence immunohistochemistry values and (2) agreement in outcome predictions both for objective, data-driven cutpoint dichotomization with Kaplan–Meier analyses or employment of continuous marker values to compute receiver-operating curves. The two platforms examined rely on distinct fluorescence immunohistochemistry imaging hardware, microscopy vs line scanning, and functionally distinct image analysis software. Fluorescence immunohistochemistry values for nuclear-localized and tyrosine-phosphorylated Stat5a/b computed by each platform on a cohort of 323 breast cancer cases revealed high concordance after linear calibration, a finding confirmed on an independent 382 case cohort, with concordance correlation coefficients >0
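
    Concordance after linear calibration, as used above, can be checked with Lin's concordance correlation coefficient; the sketch below applies it to simulated paired platform readings, which are illustrative rather than the study's fluorescence immunohistochemistry values.

```python
import numpy as np

def concordance_correlation(x, y):
    """Lin's concordance correlation coefficient between paired measurements,
    e.g. fluorescence IHC values from two image-analysis platforms."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

rng = np.random.default_rng(3)
platform_a = rng.lognormal(mean=2.0, sigma=0.5, size=323)
# Platform B: a linearly shifted version of A plus measurement noise
platform_b = 1.1 * platform_a + 0.2 + rng.normal(scale=0.5, size=323)

print(f"CCC (raw)        : {concordance_correlation(platform_a, platform_b):.3f}")
# Linear calibration of B onto A's scale before assessing concordance
slope, intercept = np.polyfit(platform_b, platform_a, 1)
calibrated_b = slope * platform_b + intercept
print(f"CCC (calibrated) : {concordance_correlation(platform_a, calibrated_b):.3f}")
```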

  6. Electron microscopy at atomic resolution

    Energy Technology Data Exchange (ETDEWEB)

    Gronsky, R.

    1983-11-01

    The direct imaging of atomic structure in solids has become increasingly easier to accomplish with modern transmission electron microscopes, many of which have an information retrieval limit near 0.2 nm point resolution. Achieving better resolution, particularly with any useful range of specimen tilting, requires a major design effort. This presentation describes the new Atomic Resolution Microscope (ARM), recently put into operation at the Lawrence Berkeley Laboratory. Capable of 0.18 nm or better interpretable resolution over a voltage range of 400 kV to 1000 kV with ±40° biaxial specimen tilting, the ARM features a number of new electron-optical and microprocessor-control designs. These are highlighted, and its atomic resolution performance demonstrated for a selection of inorganic crystals.

  7. Electron microscopy at atomic resolution

    International Nuclear Information System (INIS)

    Gronsky, R.

    1983-11-01

    The direct imaging of atomic structure in solids has become increasingly easier to accomplish with modern transmission electron microscopes, many of which have an information retrieval limit near 0.2 nm point resolution. Achieving better resolution, particularly with any useful range of specimen tilting, requires a major design effort. This presentation describes the new Atomic Resolution Microscope (ARM), recently put into operation at the Lawrence Berkeley Laboratory. Capable of 0.18 nm or better interpretable resolution over a voltage range of 400 kV to 1000 kV with ±40° biaxial specimen tilting, the ARM features a number of new electron-optical and microprocessor-control designs. These are highlighted, and its atomic resolution performance demonstrated for a selection of inorganic crystals.

  8. Highest Resolution Gaspra Mosaic

    Science.gov (United States)

    1992-01-01

    This picture of asteroid 951 Gaspra is a mosaic of two images taken by the Galileo spacecraft from a range of 5,300 kilometers (3,300 miles), some 10 minutes before closest approach on October 29, 1991. The Sun is shining from the right; phase angle is 50 degrees. The resolution, about 54 meters/pixel, is the highest for the Gaspra encounter and is about three times better than that in the view released in November 1991. Additional images of Gaspra remain stored on Galileo's tape recorder, awaiting playback in November. Gaspra is an irregular body with dimensions about 19 x 12 x 11 kilometers (12 x 7.5 x 7 miles). The portion illuminated in this view is about 18 kilometers (11 miles) from lower left to upper right. The north pole is located at upper left; Gaspra rotates counterclockwise every 7 hours. The large concavity on the lower right limb is about 6 kilometers (3.7 miles) across, the prominent crater on the terminator, center left, about 1.5 kilometers (1 mile). A striking feature of Gaspra's surface is the abundance of small craters. More than 600 craters, 100-500 meters (330-1650 feet) in diameter are visible here. The number of such small craters compared to larger ones is much greater for Gaspra than for previously studied bodies of comparable size such as the satellites of Mars. Gaspra's very irregular shape suggests that the asteroid was derived from a larger body by nearly catastrophic collisions. Consistent with such a history is the prominence of groove-like linear features, believed to be related to fractures. These linear depressions, 100-300 meters wide and tens of meters deep, are in two crossing groups with slightly different morphology, one group wider and more pitted than the other. Grooves had previously been seen only on Mars's moon Phobos, but were predicted for asteroids as well. Gaspra also shows a variety of enigmatic curved depressions and ridges in the terminator region at left. The Galileo project, whose primary mission is the

  9. Gaspra - Highest Resolution Mosaic

    Science.gov (United States)

    1992-01-01

    This picture of asteroid 951 Gaspra is a mosaic of two images taken by the Galileo spacecraft from a range of 5,300 kilometers (3,300 miles), some 10 minutes before closest approach on October 29, 1991. The Sun is shining from the right; phase angle is 50 degrees. The resolution, about 54 meters/pixel, is the highest for the Gaspra encounter and is about three times better than that in the view released in November 1991. Additional images of Gaspra remain stored on Galileo's tape recorder, awaiting playback in November. Gaspra is an irregular body with dimensions about 19 x 12 x 11 kilometers (12 x 7.5 x 7 miles). The portion illuminated in this view is about 18 kilometers (11 miles) from lower left to upper right. The north pole is located at upper left; Gaspra rotates counterclockwise every 7 hours. The large concavity on the lower right limb is about 6 kilometers (3.7 miles) across, the prominent crater on the terminator, center left, about 1.5 kilometers (1 mile). A striking feature of Gaspra's surface is the abundance of small craters. More than 600 craters, 100-500 meters (330-1650 feet) in diameter are visible here. The number of such small craters compared to larger ones is much greater for Gaspra than for previously studied bodies of comparable size such as the satellites of Mars. Gaspra's very irregular shape suggests that the asteroid was derived from a larger body by nearly catastrophic collisions. Consistent with such a history is the prominence of groove-like linear features, believed to be related to fractures. These linear depressions, 100-300 meters wide and tens of meters deep, are in two crossing groups with slightly different morphology, one group wider and more pitted than the other. Grooves had previously been seen only on Mars's moon Phobos, but were predicted for asteroids as well. Gaspra also shows a variety of enigmatic curved depressions and ridges in the terminator region at left. The Galileo project, whose primary mission is the

  10. Resolution enhancement techniques in microscopy

    Science.gov (United States)

    Cremer, Christoph; Masters, Barry R.

    2013-05-01

    We survey the history of resolution enhancement techniques in microscopy and their impact on current research in biomedicine. Often these techniques are labeled superresolution, or enhanced resolution microscopy, or light-optical nanoscopy. First, we introduce the development of diffraction theory in its relation to enhanced resolution; then we explore the foundations of resolution as expounded by the astronomers and the physicists and describe the conditions for which they apply. Then we elucidate Ernst Abbe's theory of optical image formation in the microscope, and its experimental verification and dissemination to the worldwide microscope communities. Second, we describe and compare the early techniques that can enhance the resolution of the microscope. Third, we present the historical development of various techniques that substantially enhance the optical resolution of the light microscope. These enhanced resolution techniques in their modern form constitute an active area of research with seminal applications in biology and medicine. Our historical survey of the field of resolution enhancement uncovers many examples of reinvention, rediscovery, and independent invention and development of similar proposals, concepts, techniques, and instruments. Attribution of credit is therefore confounded by the fact that for understandable reasons authors stress the achievements from their own research groups and sometimes obfuscate their contributions and the prior art of others. In some cases, attribution of credit is also made more complex by the fact that long term developments are difficult to allocate to a specific individual because of the many mutual connections often existing between sometimes fiercely competing, sometimes strongly collaborating groups. Since applications in biology and medicine have been a major driving force in the development of resolution enhancing approaches, we focus on the contribution of enhanced resolution to these fields.

  11. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
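
    In practice an externally standardized kinetic analysis reduces to reading unknowns off a standard curve of threshold cycle against log copy number; the sketch below shows that step with made-up calibrator values, not data from the described amelogenin assay.

```python
import numpy as np

# External standard curve: threshold cycle (Ct) against log10(copy number).
# The calibrator values are illustrative, not from the assay described above.
log_copies = np.log10([1e1, 1e2, 1e3, 1e4, 1e5])
ct_standards = np.array([35.1, 31.8, 28.4, 25.0, 21.7])

# Linear fit Ct = slope*log10(N0) + intercept
slope, intercept = np.polyfit(log_copies, ct_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency
print(f"slope={slope:.2f}, PCR efficiency={efficiency:.1%}")

def copies_from_ct(ct):
    """Estimate the nuclear DNA copy number of an unknown sample from its Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"sample with Ct 29.5 ~ {copies_from_ct(29.5):.0f} copies")
```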

  12. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  13. Advancement in PET quantification using 3D-OP-OSEM point spread function reconstruction with the HRRT

    Energy Technology Data Exchange (ETDEWEB)

    Varrone, Andrea; Sjoeholm, Nils; Gulyas, Balazs; Halldin, Christer; Farde, Lars [Karolinska Hospital, Karolinska Institutet, Department of Clinical Neuroscience, Psychiatry Section and Stockholm Brain Institute, Stockholm (Sweden); Eriksson, Lars [Karolinska Hospital, Karolinska Institutet, Department of Clinical Neuroscience, Psychiatry Section and Stockholm Brain Institute, Stockholm (Sweden); Siemens Molecular Imaging, Knoxville, TN (United States); University of Stockholm, Department of Physics, Stockholm (Sweden)

    2009-10-15

    Image reconstruction including the modelling of the point spread function (PSF) is an approach improving the resolution of the PET images. This study assessed the quantitative improvements provided by the implementation of the PSF modelling in the reconstruction of the PET data using the High Resolution Research Tomograph (HRRT). Measurements were performed on the NEMA-IEC/2001 (Image Quality) phantom for image quality and on an anthropomorphic brain phantom (STEPBRAIN). PSF reconstruction was also applied to PET measurements in two cynomolgus monkeys examined with [18F]FE-PE2I (dopamine transporter) and with [11C]MNPA (D2 receptor), and in one human subject examined with [11C]raclopride (D2 receptor). PSF reconstruction increased the recovery coefficient (RC) in the NEMA phantom by 11-40% and the grey to white matter ratio in the STEPBRAIN phantom by 17%. PSF reconstruction increased binding potential (BPND) in the striatum and midbrain by 14 and 18% in the [18F]FE-PE2I study, and striatal BPND by 6 and 10% in the [11C]MNPA and [11C]raclopride studies. PSF reconstruction improved quantification by increasing the RC and thus reducing the partial volume effect. This method provides improved conditions for PET quantification in clinical studies with the HRRT system, particularly when targeting receptor populations in small brain structures. (orig.)

  14. Super-resolution biomolecular crystallography with low-resolution data.

    Science.gov (United States)

    Schröder, Gunnar F; Levitt, Michael; Brunger, Axel T

    2010-04-22

    X-ray diffraction plays a pivotal role in the understanding of biological systems by revealing atomic structures of proteins, nucleic acids and their complexes, with much recent interest in very large assemblies like the ribosome. As crystals of such large assemblies often diffract weakly (resolution worse than 4 Å), we need methods that work at such low resolution. In macromolecular assemblies, some of the components may be known at high resolution, whereas others are unknown: current refinement methods fail as they require a high-resolution starting structure for the entire complex. Determining the structure of such complexes, which are often of key biological importance, should be possible in principle as the number of independent diffraction intensities at a resolution better than 5 Å generally exceeds the number of degrees of freedom. Here we introduce a method that adds specific information from known homologous structures but allows global and local deformations of these homology models. Our approach uses the observation that local protein structure tends to be conserved as sequence and function evolve. Cross-validation with R(free) (the free R-factor) determines the optimum deformation and influence of the homology model. For test cases at 3.5-5 Å resolution with known structures at high resolution, our method gives significant improvements over conventional refinement in the model as monitored by coordinate accuracy, the definition of secondary structure and the quality of electron density maps. For re-refinements of a representative set of 19 low-resolution crystal structures from the Protein Data Bank, we find similar improvements. Thus, a structure derived from low-resolution diffraction data can have quality similar to a high-resolution structure. Our method is applicable to the study of weakly diffracting crystals using X-ray micro-diffraction as well as data from new X-ray light sources. Use of homology information is not restricted to X

  15. Energy resolution of scintillation detectors

    Energy Technology Data Exchange (ETDEWEB)

    Moszyński, M., E-mail: M.Moszynski@ncbj.gov.pl; Syntfeld-Każuch, A.; Swiderski, L.; Grodzicka, M.; Iwanowska, J.; Sibczyński, P.; Szczęśniak, T.

    2016-01-01

    According to current knowledge, the non-proportionality of the light yield of scintillators appears to be a fundamental limitation of energy resolution. A good energy resolution is of great importance for most applications of scintillation detectors. Thus, its limitations are discussed below; these arise from the non-proportional response of scintillators to gamma rays and electrons, which is of crucial importance to the intrinsic energy resolution of crystals. The important influence of Landau fluctuations and the scattering of secondary electrons (δ-rays) on intrinsic resolution is pointed out here. The study on undoped NaI and CsI at liquid nitrogen temperature with a light readout by avalanche photodiodes strongly suggests that the non-proportionality of many crystals is not their intrinsic property and may be improved by selective co-doping. Finally, several observations collected over the last 15 years on the influence of the slow components of light pulses on energy resolution suggest that more complex processes are taking place in the scintillators. This was observed with CsI(Tl), CsI(Na), ZnSe(Te), and undoped NaI at liquid nitrogen temperature and, finally, for NaI(Tl) at temperatures reduced below 0 °C. A common conclusion from these observations is that the highest energy resolution, and particularly the intrinsic resolution, measured with scintillators characterized by two or more components of the light pulse decay is obtained when the spectrometry equipment integrates the whole light of all components. In contrast, the slow components observed in many other crystals degrade the intrinsic resolution. In the limiting case, afterglow could also be considered as a very slow component that spoils the energy resolution. The aim of this work is to summarize all of the above observations by looking for their origin.

  16. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in building construction. This is done through one of the most relevant environmental impact indicators associated with the weight per m2 of construction: the energy consumption resulting from the manufacturing process of the materials used in building construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the building materials, subsystems and construction elements with the greatest impact. Hence, we will be able to observe the influence the built surface has on the environment. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable to other building types and geographical areas. Furthermore, the model may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the different building types defined in this study.

    The research carried out in the present work proposes the generation of a model for quantifying energy consumption in building construction, through one of the most relevant environmental impact indicators associated with the weight per m2 of construction: the energy consumption derived from the manufacturing process of the construction materials used in building. The practical application of the proposed model to different building typologies in Seville will provide information on the construction materials, subsystems and construction elements with the greatest impact, making it possible to visualize the influence of the built surface area on the environmental impact generated. The results obtained are intended to serve as a reference for the scientific community, providing numerical data
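
    A sketch of the kind of quantification model described, summing material quantities per m2 of built surface weighted by manufacturing-energy coefficients, is given below; the bill of materials and the coefficients are illustrative placeholders, not values from the study.

```python
# Minimal sketch of an embodied-energy quantification per m2 of construction.
# Material quantities (kg per m2 built) and energy coefficients (MJ per kg)
# are illustrative placeholders, not values from the study.
bill_of_materials = {          # kg of material per m2 of built surface
    "concrete": 580.0,
    "steel": 32.0,
    "ceramic brick": 110.0,
    "mortar": 85.0,
}
energy_coefficients = {        # MJ of manufacturing energy per kg
    "concrete": 1.1,
    "steel": 24.0,
    "ceramic brick": 2.9,
    "mortar": 1.3,
}

embodied = {m: q * energy_coefficients[m] for m, q in bill_of_materials.items()}
total = sum(embodied.values())

for material, mj in sorted(embodied.items(), key=lambda kv: -kv[1]):
    print(f"{material:14s} {mj:8.1f} MJ/m2  ({mj / total:5.1%})")
print(f"{'total':14s} {total:8.1f} MJ/m2")
```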

  17. Quantification of methane emissions from danish landfills

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Mønster, Jacob; Kjeldsen, Peter

    2013-01-01

    Whole-landfill methane emission was quantified using a tracer technique that combines controlled tracer gas release from the landfill with time-resolved concentration measurements downwind of the landfill using a mobile high-resolution analytical instrument. Methane emissions from 13 Danish landfills varied between 2.6 and 60.8 kg CH4 h–1. The highest methane emission was measured at the largest (in terms of disposed waste amounts) of the 13 landfills, whereas the lowest methane emissions (2.6-6.1 kg CH4 h–1) were measured at the older and smaller landfills. At two of the sites, which had gas collection, emission measurements showed that the gas collection systems only collected between 30 and 50% of the methane produced (assuming that the produced methane equalled the sum of the emitted methane and the collected methane). Significant methane emissions were observed from disposed shredder waste...
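
    The tracer technique reduces to a ratio of plume-integrated concentrations scaled by the known tracer release rate and the molar masses; the sketch below applies that calculation to synthetic transect data, with an assumed acetylene tracer and illustrative numbers.

```python
import numpy as np

# Synthetic plume transect downwind of the landfill: CH4 and tracer mixing
# ratios above background (ppb) sampled at evenly spaced points along the road.
# All numbers, and the choice of acetylene as tracer, are illustrative.
x = np.linspace(0.0, 600.0, 121)                       # metres along the transect
ch4_ppb = 400.0 * np.exp(-((x - 300.0) / 90.0) ** 2)
tracer_ppb = 12.0 * np.exp(-((x - 310.0) / 95.0) ** 2)

Q_TRACER = 2.0                                         # known tracer release, kg/h
M_CH4, M_TRACER = 16.04, 26.04                         # molar masses, g/mol

# Tracer dilution: the methane emission equals the tracer release rate times
# the ratio of plume-integrated mixing ratios, converted to a mass basis.
# With evenly spaced samples the spacing cancels, so plain sums suffice.
ratio = ch4_ppb.sum() / tracer_ppb.sum()
ch4_emission = Q_TRACER * ratio * (M_CH4 / M_TRACER)

print(f"estimated whole-landfill CH4 emission: {ch4_emission:.1f} kg CH4/h")
```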

  18. Strain quantification in epitaxial thin films

    International Nuclear Information System (INIS)

    Cushley, M

    2008-01-01

    Strain arising in epitaxial thin films can be beneficial in some cases but devastating in others. By altering the lattice parameters, strain may give a thin film properties hitherto unseen in the bulk material. On the other hand, heavily strained systems are prone to develop lattice defects in order to relieve the strain, which can cause device failure or, at least, a decrease in functionality. Using convergent beam electron diffraction (CBED) and high-resolution transmission electron microscopy (HRTEM), it is possible to determine local strains within a material. By comparing the results from CBED and HRTEM experiments, it is possible to gain a complete view of a material, including the strain and any lattice defects present. As well as looking at how the two experimental techniques differ from each other, I will also look at how results from different image analysis algorithms compare. Strain in Si/SiGe samples and BST/SRO/MgO capacitor structures will be discussed.

  19. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification which is traditionally performed by hepatic biopsy. There are many studies that have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not been reached yet regarding the technique or the possibility to reproduce the same method of calculus in different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification

  20. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    A liquid chromatography-electrospray-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode was applied for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, an LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  1. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification which is traditionally performed by hepatic biopsy. There are many studies that have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not been reached yet regarding the technique or the possibility to reproduce the same method of calculus in different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  2. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction and chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm² synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm² (median). The mast cells constituted 0.8% of all the cell profiles...

  3. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure for periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires neither programming effort nor sophisticated machinery to solve. For heterostructures with fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga₀.₅Al₀.₅As heterostructures and to build its subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
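
    The EQC itself is not reproduced in this record, so the sketch below only illustrates the general idea of locating allowed energies from a real trigonometric/hyperbolic condition. It uses a generic Kronig-Penney-type dispersion relation for a periodic well/barrier structure (an assumption, not the paper's EQC), and all material parameters are illustrative.

```python
# Hedged sketch: locate miniband edges of a periodic heterostructure from a
# generic Kronig-Penney-type condition (not the paper's EQC) by root finding.
import numpy as np
from scipy.optimize import brentq

hbar = 1.054571817e-34      # J*s
m0 = 9.1093837015e-31       # kg; effective-mass factors omitted for brevity
eV = 1.602176634e-19
a, b = 5e-9, 2e-9           # well and barrier widths (illustrative)
V0 = 0.3 * eV               # barrier height (illustrative)

def f(E):
    """Kronig-Penney function: cos(q(a+b)) = f(E); allowed bands have |f| <= 1."""
    k = np.sqrt(2 * m0 * E) / hbar
    kap = np.sqrt(2 * m0 * (V0 - E)) / hbar
    return (np.cos(k * a) * np.cosh(kap * b)
            + (kap**2 - k**2) / (2 * k * kap) * np.sin(k * a) * np.sinh(kap * b))

# Scan energies below the barrier, bracket the |f(E)| = 1 crossings (band
# edges) and refine each one with Brent's method.
energies = np.linspace(1e-3, 0.999, 2000) * V0
g = np.abs(f(energies)) - 1.0
edges = [brentq(lambda E: abs(f(E)) - 1.0, energies[i], energies[i + 1])
         for i in range(len(g) - 1) if g[i] * g[i + 1] < 0]
print([round(E / eV, 4) for E in edges])   # miniband edges in eV
```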

  4. The cerebellum mediates conflict resolution.

    Science.gov (United States)

    Schweizer, Tom A; Oriet, Chris; Meiran, Nachshon; Alexander, Michael P; Cusimano, Michael; Stuss, Donald T

    2007-12-01

    Regions within the frontal and parietal cortex have been implicated as important neural correlates for cognitive control during conflict resolution. Despite the extensive reciprocal connectivity between the cerebellum and these putatively critical cortical areas, a role for the cerebellum in conflict resolution has never been identified. We used a task-switching paradigm that separates processes related to task-set switching and the management of response conflict independent of motor processing. Eleven patients with chronic, focal lesions to the cerebellum and 11 healthy controls were compared. Patients were slower and less accurate in conditions involving conflict resolution. In the absence of response conflict, however, task-switching abilities were not impaired in our patients. The cerebellum may play an important role in coordinating with other areas of cortex to modulate active response states. These results are the first demonstration of impaired conflict resolution following cerebellar lesions in the presence of an intact prefrontal cortex.

  5. House passes resolution on occupation

    Index Scriptorium Estoniae

    2005-01-01

    On the adoption by the US House of Representatives of a resolution demanding that Russia acknowledge the occupation of the Baltic states, an effort led by Congressman John Shimkus, who is of Lithuanian descent. See also the text of the resolution, "House Concurrent Resolution 128", p. 14.

  6. Conflict Resolution for Contrasting Cultures.

    Science.gov (United States)

    Clarke, Clifford C.; Lipp, G. Douglas

    1998-01-01

    A seven-step process can help people from different cultures understand each other's intentions and perceptions so they can work together harmoniously: problem identification, problem clarification, cultural exploration, organizational exploration, conflict resolution, impact assessment, and organizational integration. (JOW)

  7. EPA Alternative Dispute Resolution Contacts

    Science.gov (United States)

    The success of EPA's ADR efforts depends on a network of talented and experienced professionals in Headquarters offices and EPA Regions. For Agency-wide ADR information, please contact the Conflict Prevention and Resolution Center.

  8. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL⁻¹ humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL⁻¹. The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested the copy number of archaeal 16S rRNA genes was 1.04×10³ copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10² copies/g-sediment, which was most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in accurate quantifications of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
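
    Digital PCR derives absolute copy numbers from the fraction of positive partitions via Poisson statistics. The sketch below shows that standard conversion; the partition count and partition volume are illustrative assumptions, not values from the cited platform.

```python
# Hedged sketch of the Poisson counting that underlies absolute quantification
# in microfluidic digital PCR. Partition count and volume are illustrative.
import math

def dpcr_copies_per_ul(n_positive, n_total, partition_volume_nl):
    """Absolute target concentration (copies/uL) from a digital PCR run."""
    p = n_positive / n_total          # fraction of positive partitions
    lam = -math.log(1.0 - p)          # mean copies per partition (Poisson)
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000.0     # nL -> uL

# Example: 300 positive out of 765 chambers of 6 nL each (made-up numbers).
print(round(dpcr_copies_per_ul(300, 765, 6.0), 1))
```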

  9. High temperature liquid chromatography hyphenated with ESI-MS and ICP-MS detection for the structural characterization and quantification of halogen containing drug metabolites

    International Nuclear Information System (INIS)

    Vlieger, Jon S.B. de; Giezen, Mark J.N.; Falck, David; Tump, Cornelis; Heuveln, Fred van; Giera, Martin; Kool, Jeroen; Lingeman, Henk; Wieling, Jaap; Honing, Maarten; Irth, Hubertus; Niessen, Wilfried M.A.

    2011-01-01

    Highlights: → Hyphenation of high temperature liquid chromatography to ICP-MS and ESI-MS. → Structural characterization of kinase inhibitor metabolites with high resolution MSⁿ experiments. → Quantification of drug metabolites with ICP-MS based on iodine detection. → Significant changes in ESI-MS response after small structural changes. - Abstract: In this paper we describe the hyphenation of high temperature liquid chromatography with ICP-MS and ESI-MS for the characterization of halogen-containing drug metabolites. The use of temperature gradients up to 200 °C enabled the separation of metabolites with low organic modifier content. This specific property allowed the use of detection methods, such as ICP-MS, that suffer from (significant) changes in analyte response factors as a function of the organic modifier content. Metabolites of two kinase inhibitors (SB-203580-Iodo and MAPK inhibitor VIII) produced by bacterial cytochrome P450 BM3 mutants and human liver microsomes were identified based on high resolution MSⁿ data. Quantification was done using their normalized and element-specific response in the ICP-MS. The importance of these kinds of quantification strategies is stressed by the observation that a difference in the position of a single oxygen atom in a structure can greatly affect its response in ESI-MS and UV detection.

  10. Estimating uncertainty in resolution tests

    CSIR Research Space (South Africa)

    Goncalves, DP

    2006-05-01

    Full Text Available frequencies yields a biased estimate, and we provide an improved estimator. An application illustrates how the derived results can be incorporated into a larger uncertainty analysis. © 2006 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.2202914] Subject terms: resolution testing; USAF 1951 test target; resolution uncertainty. Paper 050404R received May 20, 2005; revised manuscript received Sep. 2, 2005; accepted for publication Sep. 9, 2005; published online May 10, 2006.

  11. Uncertainty Quantification of the Reverse Taylor Impact Test and Localized Asynchronous Space-Time Algorithm

    Science.gov (United States)

    Subber, Waad; Salvadori, Alberto; Lee, Sangmin; Matous, Karel

    2017-06-01

    The reverse Taylor impact is a common experiment to investigate the dynamical response of materials at high strain rates. To better understand the physical phenomena and to provide a platform for code validation and Uncertainty Quantification (UQ), a co-designed simulation and experimental paradigm is investigated. For validation under uncertainty, quantities of interest (QOIs) within subregions of the computational domain are introduced. For such simulations where regions of interest can be identified, the computational cost for UQ can be reduced by confining the random variability within these regions of interest. This observation inspired us to develop an asynchronous space and time computational algorithm with localized UQ. In the region of interest, high resolution space and time discretization schemes are used for a stochastic model. Outside the region of interest, low spatial and temporal resolutions are allowed for a stochastic model with a low dimensional representation of uncertainty. The model is exercised on linear elastodynamics and shows potential for reducing the UQ computational cost. Although we consider wave propagation in solids, the proposed framework is general and can be used for fluid flow problems as well. Department of Energy, National Nuclear Security Administration (PSAAP-II).

  12. Picowatt Resolution Calorimetry for Micro and Nanoscale Energy Transport Studies

    Science.gov (United States)

    Sadat, Seid H.

    Precise quantification of energy transport is key to obtaining insights into a wide range of phenomena across various disciplines including physics, chemistry, biology and engineering. This thesis describes technical advancements in heat-flow calorimetry which enable measurement of energy transport at micro and nanoscales with picowatt resolution. I have developed two types of microfabricated calorimeter devices and demonstrated single digit picowatt resolution at room temperature. Both devices incorporate two distinct features: an active area isolated by a thermal conductance (G_Th) of less than 1 µW/K, and a high resolution thermometer with temperature resolution (ΔT_res) in the microkelvin regime. These features enable measurements of heat currents (q) with picowatt resolution (q = G_Th × ΔT_res). In the first device the active area is suspended via silicon nitride beams with excellent thermal isolation (~600 nW/K) and a bimaterial cantilever (BMC) thermometer with a temperature resolution of ~6 µK. Taken together, this design enabled calorimetric measurements with 4 pW resolution. In the second device, the BMC thermometry technique is replaced by a high-resolution resistance thermometry scheme. A detailed noise analysis of resistance thermometers, confirmed by experimental data, enabled me to correctly predict the resolution of different measurement schemes and propose techniques to achieve an order of magnitude improvement in the resolution of resistive thermometers. By incorporating resistance thermometers with a temperature resolution of ~30 µK, combined with a thermal isolation of ~150 nW/K, I demonstrated an all-electrical calorimeter device with a resolution of ~5 pW. Finally, I used these calorimeters to study Near-Field Radiative Heat Transfer (NF-RHT). Using these devices, we studied, for the first time, the effect of film thickness on the NF-RHT between two dielectric surfaces. We showed that even a very thin film (~50 nm) of silicon
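
    The quoted picowatt figures follow directly from the relation q = G_Th × ΔT_res. A minimal sketch of that arithmetic, using the conductances and temperature resolutions stated in the abstract, is shown below.

```python
# Minimal sketch of the resolution estimate q_min = G_Th * dT_res quoted above,
# with the conductance and temperature-resolution values from the abstract.
def power_resolution_pw(g_th_nw_per_k, dt_res_uk):
    """Smallest resolvable heat current in picowatts."""
    g = g_th_nw_per_k * 1e-9      # W/K
    dt = dt_res_uk * 1e-6         # K
    return g * dt * 1e12          # W -> pW

print(power_resolution_pw(600.0, 6.0))   # BMC device: ~3.6 pW (quoted ~4 pW)
print(power_resolution_pw(150.0, 30.0))  # resistive device: ~4.5 pW (quoted ~5 pW)
```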

  13. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  14. 2D histomorphometric quantification from 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, Inaya; Oliveira, Luis Fernando de; Lopes, Ricardo T.; Jesus, Edgar Francisco O. de; Alves, Jose Marcos

    2002-01-01

    In the present article, preliminary results are presented showing the application of the three-dimensional computerized microtomography technique (3D-μCT) to bone tissue characterization, through histomorphometric quantification based on stereological concepts. Two samples of human bone were prepared and submitted to the tomographic system, a radiographic system with a microfocus X-ray tube. Through the three processes of acquisition, reconstruction and quantification, it was possible to obtain good results, coherent with literature data. The next step is to compare these results with those obtained by the conventional method, that is, conventional histomorphometry. (author)

  15. Single Image Super Resolution via Sparse Reconstruction

    NARCIS (Netherlands)

    Kruithof, M.C.; Eekeren, A.W.M. van; Dijk, J.; Schutte, K.

    2012-01-01

    High resolution sensors are required for recognition purposes. Low resolution sensors, however, are still widely used. Software can be used to increase the resolution of such sensors. One way of increasing the resolution of the images produced is using multi-frame super resolution algorithms.

  16. Quantification and Negation in Event Semantics

    Directory of Open Access Journals (Sweden)

    Lucas Champollion

    2010-12-01

    Full Text Available Recently, it has been claimed that event semantics does not go well together with quantification, especially if one rejects syntactic, LF-based approaches to quantifier scope. This paper shows that such fears are unfounded, by presenting a simple, variable-free framework which combines a Neo-Davidsonian event semantics with a type-shifting based account of quantifier scope. The main innovation is that the event variable is bound inside the verbal denotation, rather than at sentence level by existential closure. Quantifiers can then be interpreted in situ. The resulting framework combines the strengths of event semantics and type-shifting accounts of quantifiers and thus does not force the semanticist to posit either a default underlying word order or a syntactic LF-style level. It is therefore well suited for applications to languages where word order is free and quantifier scope is determined by surface order. As an additional benefit, the system leads to a straightforward account of negation, which has also been claimed to be problematic for event-based frameworks.

  17. HPLC for simultaneous quantification of total ceramide, glucosylceramide, and ceramide trihexoside concentrations in plasma

    NARCIS (Netherlands)

    Groener, Johanna E. M.; Poorthuis, Ben J. H. M.; Kuiper, Sijmen; Helmond, Mariette T. J.; Hollak, Carla E. M.; Aerts, Johannes M. F. G.

    2007-01-01

    BACKGROUND: Simple, reproducible assays are needed for the quantification of sphingolipids, ceramide (Cer), and sphingoid bases. We developed an HPLC method for simultaneous quantification of total plasma concentrations of Cer, glucosylceramide (GlcCer), and ceramide trihexoside (CTH). METHODS:

  18. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  19. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  20. Atomic resolution imaging of YAlO{sub 3}: Ce in the chromatic and spherical aberration corrected PICO electron microscope

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Lei [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons, Jülich-Aachen Research Alliance (JARA), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Barthel, Juri [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons, Jülich-Aachen Research Alliance (JARA), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); Central Facility for Electron Microscopy, RWTH Aachen University, 52074 Aachen (Germany); Jia, Chun-Lin [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons, Jülich-Aachen Research Alliance (JARA), Forschungszentrum Jülich GmbH, 52425 Jülich (Germany); School of Electronic and Information Engineering and State Key Laboratory for Mechanical Behaviour of Materials, Xi' an Jiaotong University, Xi' an 710049 (China); Urban, Knut W., E-mail: k.urban@fz-juelich.de [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons, Jülich-Aachen Research Alliance (JARA), Forschungszentrum Jülich GmbH, 52425 Jülich, (Germany); School of Electronic and Information Engineering and State Key Laboratory for Mechanical Behaviour of Materials, Xi' an Jiaotong University, Xi' an 710049 (China)

    2017-05-15

    Highlights: • First-time resolution of 57 pm atom separations by HRTEM with 200 keV electrons. • Quantification of the image spread by absolute matching of experiment and simulation. • An information limit of 52 pm is deduced from the determined image spread. • Substantial deviations from the bulk structure are observed for the ultra-thin sample. - Abstract: The application of combined chromatic and spherical aberration correction in high-resolution transmission electron microscopy enables a significant improvement of the spatial resolution down to 50 pm. We demonstrate that such a resolution can be achieved in practice at 200 kV. Diffractograms of images of gold nanoparticles on amorphous carbon demonstrate corresponding information transfer. The Y atom pairs in [010] oriented yttrium orthoaluminate are successfully imaged together with the Al and the O atoms. Although the 57 pm pair separation is clearly demonstrated, separations between 55 pm and 80 pm are measured. This observation is tentatively attributed to structural relaxations and surface reconstruction in the very thin samples used. Quantification of the resolution-limiting effective image spread is achieved based on an absolute match between experimental and simulated image intensity distributions.

  1. Conflict Prevention and Resolution Center (CPRC)

    Science.gov (United States)

    The Conflict Prevention and Resolution Center is EPA's primary resource for services and expertise in the areas of consensus-building, collaborative problem solving, alternative dispute resolution, and environmental collaboration and conflict resolution.

  2. Quantification of patterns of regional cardiac metabolism

    International Nuclear Information System (INIS)

    Lear, J.L.; Ackermann, R.F.

    1990-01-01

    To quantitatively map and compare patterns of regional cardiac metabolism with greater spatial resolution than is possible with positron emission tomography (PET), the authors developed autoradiographic techniques for use with combinations of radiolabeled fluorodeoxyglucose (FDG), glucose (GLU), and acetate (ACE) and applied the techniques to normal rats. Kinetic models were developed to compare GLU-based oxidative glucose metabolism with FDG-based total glucose metabolism (oxidative plus anaerobic) and to compare ACE-based overall oxidative metabolism with FDG-based total glucose metabolism. GLU-based metabolism generally paralleled FDG-based metabolism, but divergence occurred in certain structures such as the papillary muscles, where FDG-based metabolism was much greater. ACE-based metabolism also generally paralleled FDG-based metabolism, but again, the papillary muscles had relatively greater FDG-based metabolism. These discrepancies between FDG-based metabolism and GLU- or ACE-based metabolism suggest the presence of high levels of anaerobic glycolysis. Thus, the study indicates that anaerobic glycolysis, in addition to occurring in ischemic or stunned myocardium (as has been shown in recent PET studies), occurs normally in specific cardiac regions, despite the presence of abundant oxygen

  3. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jü rgen; Ravasi, Timothy

    2016-01-01

    novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  4. Quantification of the sequestration of indium 111 labelled platelets

    International Nuclear Information System (INIS)

    Najean, Y.; Picard, N.; Dufour, V.; Rain, J.D.

    1988-01-01

    A simple method is proposed for accurate quantification of the splenic and/or hepatic sequestration of ¹¹¹In-labelled platelets. It could allow a better prediction of the efficacy of splenectomy in idiopathic thrombocytopenic purpura. [fr]

  5. Data-driven Demand Response Characterization and Quantification

    DEFF Research Database (Denmark)

    Le Ray, Guillaume; Pinson, Pierre; Larsen, Emil Mahler

    2017-01-01

    Analysis of load behavior in demand response (DR) schemes is important to evaluate the performance of participants. Very few real-world experiments have been carried out and quantification and characterization of the response is a difficult task. Nevertheless it will be a necessary tool for portf...

  6. MRI-based quantification of brain damage in cerebrovascular disorders

    NARCIS (Netherlands)

    de Bresser, J.H.J.M.

    2011-01-01

    Brain diseases can lead to diverse structural abnormalities that can be assessed on magnetic resonance imaging (MRI) scans. These abnormalities can be quantified by (semi-)automated techniques. The studies described in this thesis aimed to optimize and apply cerebral quantification techniques in

  7. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  8. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.
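
    The abstract describes fitting an analytical spectrum model to an experimental spectrum by minimizing quadratic differences over the model parameters. The sketch below shows that parameter-optimization idea with a deliberately simplified model (Gaussian characteristic lines on a linear background); it is an assumption for illustration, not the POEMA model or its underlying physics.

```python
# Hedged sketch of least-squares parameter optimization against a spectrum.
# The Gaussian-plus-linear-background model and all numbers are assumptions.
import numpy as np
from scipy.optimize import least_squares

def model(params, energy):
    a0, a1, *lines = params                    # background + (area, centre, width) per line
    spec = a0 + a1 * energy
    for area, centre, width in zip(lines[0::3], lines[1::3], lines[2::3]):
        spec += area * np.exp(-0.5 * ((energy - centre) / width) ** 2)
    return spec

def fit_spectrum(energy, counts, p0):
    # Minimize the quadratic differences between model and measured counts.
    res = least_squares(lambda p: model(p, energy) - counts, p0)
    return res.x

# Synthetic example: one characteristic line at 1.74 keV on a weak background.
rng = np.random.default_rng(0)
E = np.linspace(1.0, 3.0, 400)
truth = [5.0, 1.0, 120.0, 1.74, 0.06]
counts = model(truth, E) + rng.normal(0, 2.0, E.size)
print(np.round(fit_spectrum(E, counts, p0=[1.0, 0.5, 80.0, 1.7, 0.1]), 3))
```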

  9. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the t...

  10. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  11. Enhancement of Electroluminescence (EL) image measurements for failure quantification methods

    DEFF Research Database (Denmark)

    Parikh, Harsh; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    Enhanced-quality images are necessary for EL image analysis and failure quantification. A method is proposed which relates image quality to more accurate failure detection in solar panels using the electroluminescence (EL) imaging technique. The goal of the paper is to determine the most...

  12. Investigation on feasibility of recurrence quantification analysis for ...

    African Journals Online (AJOL)

    The RQA parameters such as percent recurrence (REC), trapping time (TT), percent laminarity (LAM) and entropy (ENT), and also the recurrence plots color patterns for different flank wear, can be used in detecting insert wear in face milling. Keywords: milling, flank wear, recurrence plot, recurrence quantification analysis.

  13. Quantification and presence of human ancient DNA in burial place ...

    African Journals Online (AJOL)

    Quantification and presence of human ancient DNA in burial place remains of Turkey using real time polymerase chain reaction. ... A published real-time PCR assay, which allows for the combined analysis of nuclear or ancient DNA and mitochondrial DNA, was modified. This approach can be used for recovering DNA from ...

  14. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and limits of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
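
    LOD and LOQ values of this kind are commonly derived from the calibration curve as 3.3σ/S and 10σ/S (the ICH convention), where S is the calibration slope and σ the residual standard deviation. The sketch below assumes that convention and uses made-up calibration data; the paper's exact procedure is not stated in this record.

```python
# Hedged sketch of an ICH-style LOD/LOQ estimate from a calibration line.
# Calibration data are made up; the paper's actual procedure may differ.
import numpy as np

def lod_loq(conc_mg_ml, response):
    x = np.asarray(conc_mg_ml, dtype=float)
    y = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)          # linear calibration fit
    sigma = (y - (slope * x + intercept)).std(ddof=2)  # residual std. deviation
    return 3.3 * sigma / slope, 10.0 * sigma / slope

conc = [0.05, 0.1, 0.25, 0.5, 1.0]   # mg/mL standards (illustrative)
area = [0.9, 2.1, 5.2, 10.3, 20.8]   # detector response (illustrative)
lod, loq = lod_loq(conc, area)
print(f"LOD = {lod:.3f} mg/mL, LOQ = {loq:.3f} mg/mL")
```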

  15. Real-Time PCR for Universal Phytoplasma Detection and Quantification

    DEFF Research Database (Denmark)

    Christensen, Nynne Meyn; Nyskjold, Henriette; Nicolaisen, Mogens

    2013-01-01

    Currently, the most efficient detection and precise quantification of phytoplasmas is by real-time PCR. Compared to nested PCR, this method is less sensitive to contamination and is less work intensive. Therefore, a universal real-time PCR method will be valuable in screening programs and in other...

  16. Double-layer Tablets of Lornoxicam: Validation of Quantification ...

    African Journals Online (AJOL)

    Double-layer Tablets of Lornoxicam: Validation of Quantification Method, In vitro Dissolution and Kinetic Modelling. ... Satisfactory results were obtained; all the tablet formulations met compendial requirements. The slowest drug release rate was obtained with tablet cores based on PVP K90 (1.21 mg%·h⁻¹).

  17. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael

    2007-01-01

    A tracer method was successfully used for quantification of the whole methane (CH4) emission from Fakse landfill. By using two different tracers the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on site...

  18. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based...

  19. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  20. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)
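
    The correction equations themselves are not reproduced in this record. The sketch below only illustrates the general idea of relating a quantitative value (here T1) to core temperature with a fitted linear model and projecting it to 37 °C; the linear form and all numbers are assumptions, not the study's parameters.

```python
# Hedged sketch of a temperature correction of quantitative MR values.
# Linear model and example data are assumptions for illustration only.
import numpy as np

def fit_temperature_model(temps_c, t1_ms):
    """Fit T1(temp) = a*temp + b for one tissue type; returns (a, b)."""
    return np.polyfit(temps_c, t1_ms, 1)

def correct_to_37c(t1_measured_ms, temp_measured_c, a):
    """Shift a measured T1 to its expected value at 37 degrees C."""
    return t1_measured_ms + a * (37.0 - temp_measured_c)

# Illustrative data: a tissue T1 increasing with core temperature.
temps = [12, 18, 24, 30, 36]
t1 = [390, 420, 455, 480, 510]
a, b = fit_temperature_model(temps, t1)
print(round(correct_to_37c(430.0, 20.0, a), 1))   # value projected to 37 C
```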

  1. Quantification of microbial quality and safety in minimally processed foods

    NARCIS (Netherlands)

    Zwietering, M.H.

    2002-01-01

    To find a good equilibrium between the quality and the margin of safety of minimally processed foods, various hurdles are often used. Quantification of the kinetics should be used to approach optimal processing and to select the main aspects. Due to many factors of which the exact quantitative effect is

  2. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  3. Machine Learning for Quantification of Small Vessel Disease Imaging Biomarkers

    NARCIS (Netherlands)

    Ghafoorian, M.

    2018-01-01

    This thesis is devoted to developing fully automated methods for quantification of small vessel disease imaging bio-markers, namely WMHs and lacunes, using various machine learning/deep learning and computer vision techniques. The rest of the thesis is organized as follows: Chapter 2 describes

  4. Direct quantification of nickel in stainless steels by spectrophotometry

    International Nuclear Information System (INIS)

    Singh, Ritu; Raut, Vaibhavi V.; Jeyakumar, S.; Ramakumar, K.L.

    2007-01-01

    A spectrophotometric method based on the Ni-DMG complex for the quantification of nickel in steel samples without employing any prior separation is reported in the present study. The interfering ions are masked by suitable complexing agents and the method was extended to real samples after validating with BCS and Euro steel standards. (author)

  5. Development of a competitive PCR assay for the quantification of ...

    African Journals Online (AJOL)

    ONOS

    2010-01-25

    Jan 25, 2010 ... quantification of total Escherichia coli DNA in water. Omar Kousar Banu, Barnard ... Thereafter the product was ligated into the pGEM®T-easy cloning ... agarose gel using the High Pure PCR product purification kit (Roche®) ...

  6. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2016-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...

  7. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  8. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  9. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  10. Identification and quantification of phytochemicals in nutraceutical products from green tea by UHPLC-Orbitrap-MS.

    Science.gov (United States)

    López-Gutiérrez, Noelia; Romero-González, Roberto; Plaza-Bolaños, Patricia; Martínez Vidal, José Luis; Garrido Frenich, Antonia

    2015-04-15

    A method has been developed and validated for the simultaneous detection and quantification of phytochemicals in nutraceutical products obtained from green tea. For that purpose, ultra-high performance liquid chromatography coupled to single-stage Orbitrap high resolution mass spectrometry (UHPLC-Orbitrap-MS) has been used. A database containing 37 compounds has been used for the detection and identification of the target compounds. The developed methodology was based on solid-liquid extraction, using a mixture of methanol:H2O (80:20, v/v, pH 4), followed by dilution (10 times) with a mixture of ammonium acetate:methanol (50:50, v/v). Chromatographic conditions were optimised and full scan accurate mass data acquisition using electrospray ionisation in positive and negative ion mode was used. Moreover, all-ion fragmentation mode was used to get information on fragment ions, and they were used for identification purposes. The developed method was validated, obtaining repeatability (intra-day) and inter-day precision values (expressed as relative standard deviation, RSD) lower than 16% and 20%, respectively. Lower limits were also evaluated and limits of detection (LODs) ranged from 1 to 50 μg L⁻¹, while limits of quantification (LOQs) ranged from 2 to 150 μg L⁻¹. Recovery was performed at five levels and it ranged from 70% to 109%. Finally, this method was used to evaluate the phytochemical content in 10 samples (tablets or capsules), showing concentrations of (+)-catechin, (-)-epicatechin, gallic acid, (-)-gallocatechin and quercetin-3-O-rutinoside ranging from 258 (C6) to 10,729 (C6) mg kg⁻¹. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Quantification of dextrose in model solution by 1H MR spectroscopy at 1.5T

    International Nuclear Information System (INIS)

    Lee, Kyung Hee; Cho, Soon Gu; Kim, Hyung Jin; Suh, Chang Hae; Kim, Yong Seong; Lee, Jung Hee

    2002-01-01

    To evaluate the feasibility of proton magnetic resonance spectroscopy (¹H-MRS) using a 1.5T magnetic resonance (MR) imager for quantification of the contents of model solutions. We prepared model solutions of dextrose + water and dextrose + water + ethanol at dextrose concentrations of 0.01% to 50% and 0.01% to 20%, respectively. Using these solutions and a 1.5T MR imager together with a high-resolution nuclear magnetic resonance (NMR) spectroscope, we calculated the ratios of dextrose to water peak, (dextrose + ethanol) to water peak, and (dextrose + ethanol) to ethanol peak, as seen on MR and NMR spectra, analysing the relationships between dextrose concentration and the ratios of peaks, and between the ratios of the peaks seen on MR spectra and those seen on NMR spectra. The relationships between dextrose concentration and the ratios of dextrose to water peak, (dextrose + ethanol) to water peak and (dextrose + ethanol) to ethanol peak, as seen on MR spectra, were statistically significant, and there was good linear regression. There was also close correlation between the ratios of the peaks observed on MR and NMR spectra. The results depict the quantification of dextrose concentration according to the ratios of spectral peaks obtained by proton MRS at 1.5T. Using proton MRS at 1.5T, and on the basis of the ratios of spectral peaks, it was possible to quantify the concentration of dextrose in model solutions of dextrose + water and dextrose + water + ethanol. The results of this study suggest that for quantifying the contents of biofluids, the use of low-tesla ¹H-MRS is feasible.

  12. Supercharging by m-NBA Improves ETD-Based Quantification of Hydroxyl Radical Protein Footprinting

    Science.gov (United States)

    Li, Xiaoyan; Li, Zixuan; Xie, Boer; Sharp, Joshua S.

    2015-08-01

    Hydroxyl radical protein footprinting (HRPF) is an MS-based technique for analyzing protein structure based on measuring the oxidation of amino acid side chains by hydroxyl radicals diffusing in solution. Spatial resolution of HRPF is limited by the smallest portion of the protein for which oxidation amounts can be accurately quantitated. Previous work has shown electron transfer dissociation (ETD) to be the most reliable method for quantifying the amount of oxidation of each amino acid side chain in a mixture of peptide oxidation isomers, but efficient ETD requires high peptide charge states, which limits its applicability for HRPF. Supercharging reagents have been used to enhance peptide charge state for ETD analysis, but previous work has shown supercharging reagents to enhance charge state differently for different peptide sequences; it is currently unknown if different oxidation isomers will experience different charge enhancement effects. Here, we report the effect of m-nitrobenzyl alcohol (m-NBA) on the ETD-based quantification of peptide oxidation. The addition of m-NBA to both a defined mixture of synthetic isomeric oxidized peptides and Robo-1 protein subjected to HRPF increased the abundance of higher charge state ions, improving our ability to perform efficient ETD of the mixture. No differences in the reported quantitation by ETD were noted in the presence or absence of m-NBA, indicating that all oxidation isomers were charge-enhanced to a similar extent. These results indicate the utility of m-NBA for residue-level quantification of peptide oxidation in HRPF and other applications.

  13. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant, CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Events and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 005 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-component dependencies, and the treatment of common cause failures. The identification of the main human actions considered, along with the possible recovery actions included, is also presented. The development of the data base and the assumptions and limitations in the data base are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. In this volume, the source and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also presented. Finally, the main results and conclusions of the Internal Event Analysis for Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03×10⁻⁵ per year for internal events. The most dominant accident sequences found are the transients involving the loss of offsite power, the station blackout accidents, and the anticipated transients without SCRAM (ATWS). (Author)

  14. High angular resolution at LBT

    Science.gov (United States)

    Conrad, A.; Arcidiacono, C.; Bertero, M.; Boccacci, P.; Davies, A. G.; Defrere, D.; de Kleer, K.; De Pater, I.; Hinz, P.; Hofmann, K. H.; La Camera, A.; Leisenring, J.; Kürster, M.; Rathbun, J. A.; Schertl, D.; Skemer, A.; Skrutskie, M.; Spencer, J. R.; Veillet, C.; Weigelt, G.; Woodward, C. E.

    2015-12-01

    High angular resolution from ground-based observatories stands as a key technology for advancing planetary science. In the window between the angular resolution achievable with 8-10 meter class telescopes, and the 23-to-40 meter giants of the future, LBT provides a glimpse of what the next generation of instruments providing higher angular resolution will provide. We present the first resolved images of an Io eruption site taken from the ground: images of Io's Loki Patera taken with Fizeau imaging at the 22.8 meter LBT [Conrad, et al., AJ, 2015]. We will also present preliminary analysis of two data sets acquired during the 2015 opposition: L-band fringes at Kurdalagon and an occultation of Loki and Pele by Europa (see figure). The light curves from this occultation will yield an order of magnitude improvement in spatial resolution along the path of ingress and egress. We will conclude by providing an overview of the overall benefit of recent and future advances in angular resolution for planetary science.

  15. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library that uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) is no longer accompanied by a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
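
    As a minimal illustration of the Bayes/MCMC sample-selection idea, the sketch below runs a random-walk Metropolis sampler for a single toy source parameter against synthetic data. It is a stand-in under stated assumptions, not the QUESO library or the authors' teleseismic forward model.

```python
# Hedged sketch: random-walk Metropolis sampling of a single toy "slip"
# parameter. The forward model, noise level and prior are assumptions.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)

def forward(slip):                        # toy forward model: scaled wavelet
    return slip * np.sin(2 * np.pi * 0.5 * t) * np.exp(-0.3 * t)

observed = forward(2.0) + rng.normal(0, 0.1, t.size)   # synthetic "data"

def log_posterior(slip, sigma=0.1):
    if slip <= 0 or slip > 10:            # flat prior on (0, 10]
        return -np.inf
    misfit = observed - forward(slip)
    return -0.5 * np.sum((misfit / sigma) ** 2)

samples, current = [], 1.0
for _ in range(5000):                     # Metropolis accept/reject loop
    proposal = current + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(current):
        current = proposal
    samples.append(current)

post = np.array(samples[1000:])           # discard burn-in
print(round(post.mean(), 3), round(post.std(), 3))   # posterior mean ~ 2.0
```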

  16. High-resolution satellite imagery is an important yet underutilized resource in conservation biology.

    Science.gov (United States)

    Boyle, Sarah A; Kennedy, Christina M; Torres, Julio; Colman, Karen; Pérez-Estigarribia, Pastor E; de la Sancha, Noé U

    2014-01-01

    Technological advances and increasing availability of high-resolution satellite imagery offer the potential for more accurate land cover classifications and pattern analyses, which could greatly improve the detection and quantification of land cover change for conservation. Such remotely-sensed products, however, are often expensive and difficult to acquire, which prohibits or reduces their use. We tested whether imagery of high spatial resolution (≤5 m) differs from lower-resolution imagery (≥30 m) in performance and extent of use for conservation applications. To assess performance, we classified land cover in a heterogeneous region of Interior Atlantic Forest in Paraguay, which has undergone recent and dramatic human-induced habitat loss and fragmentation. We used 4 m multispectral IKONOS and 30 m multispectral Landsat imagery and determined the extent to which resolution influenced the delineation of land cover classes and patch-level metrics. Higher-resolution imagery more accurately delineated cover classes, identified smaller patches, retained patch shape, and detected narrower, linear patches. To assess extent of use, we surveyed three conservation journals (Biological Conservation, Biotropica, Conservation Biology) and found limited application of high-resolution imagery in research, with only 26.8% of land cover studies analyzing satellite imagery, and of these studies only 10.4% used imagery ≤5 m resolution. Our results suggest that high-resolution imagery is warranted yet under-utilized in conservation research, but is needed to adequately monitor and evaluate forest loss and conversion, and to delineate potentially important stepping-stone fragments that may serve as corridors in a human-modified landscape. Greater access to low-cost, multiband, high-resolution satellite imagery would therefore greatly facilitate conservation management and decision-making.

  17. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    Science.gov (United States)

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.

    2012-01-01

Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study presents the development of quantification tools including MR-based AC for quantification in combined MR/PET for brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [¹¹C]PIB was acquired using a high-resolution research tomography (HRRT) PET. MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET.
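
    The classification step above uses a modified fuzzy C-means scheme. The sketch below shows only the textbook fuzzy C-means update rules on synthetic 1-D MR intensities; the intensity distributions and the attenuation note are illustrative assumptions, not the authors' modified algorithm.

```python
# Minimal sketch of standard fuzzy C-means on 1-D intensities (three tissue classes).
import numpy as np

rng = np.random.default_rng(1)
# Synthetic voxel intensities standing in for CSF, gray matter, and white matter.
x = np.concatenate([rng.normal(30, 5, 500), rng.normal(80, 5, 500), rng.normal(120, 5, 500)])

m, n_clusters = 2.0, 3
u = rng.random((n_clusters, x.size))
u /= u.sum(axis=0)                                       # random initial memberships

for _ in range(100):
    um = u ** m
    c = (um @ x) / um.sum(axis=1)                        # update cluster centers
    d = np.abs(x[None, :] - c[:, None]) + 1e-12          # distances to centers
    inv = d ** (-2.0 / (m - 1.0))
    u = inv / inv.sum(axis=0)                            # update soft memberships

labels = np.argmax(u, axis=0)                            # hard labels from soft memberships
print("cluster centers (sorted):", np.sort(np.round(c, 1)))
# In the paper's pipeline, each resulting tissue class would then be assigned an
# attenuation coefficient (brain soft tissues are all close to water at 511 keV,
# roughly 0.096 cm^-1) to build the MR-based attenuation map.
```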

  18. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    Energy Technology Data Exchange (ETDEWEB)

    Fei, Baowei, E-mail: bfei@emory.edu [Department of Radiology and Imaging Sciences, Emory University School of Medicine, 1841 Clifton Road Northeast, Atlanta, Georgia 30329 (United States); Department of Biomedical Engineering, Emory University and Georgia Institute of Technology, Atlanta, Georgia 30322 (United States); Department of Mathematics and Computer Sciences, Emory University, Atlanta, Georgia 30322 (United States); Yang, Xiaofeng; Nye, Jonathon A.; Raghunath, Nivedita; Votaw, John R. [Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia 30329 (United States); Aarsvold, John N. [Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia 30329 (United States); Nuclear Medicine Service, Atlanta Veterans Affairs Medical Center, Atlanta, Georgia 30033 (United States); Cervo, Morgan; Stark, Rebecca [The Medical Physics Graduate Program in the George W. Woodruff School, Georgia Institute of Technology, Atlanta, Georgia 30332 (United States); Meltzer, Carolyn C. [Department of Radiology and Imaging Sciences, Emory University School of Medicine, Atlanta, Georgia 30329 (United States); Department of Neurology and Department of Psychiatry and Behavior Sciences, Emory University School of Medicine, Atlanta, Georgia 30322 (United States)

    2012-10-15

Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study presents the development of quantification tools including MR-based AC for quantification in combined MR/PET for brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [¹¹C]PIB was acquired using a high-resolution research tomography (HRRT) PET. MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET.

  19. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 and Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Barnhart, Huiman [Department of Biostatistics and Bioinformatics, Duke University, Durham, North Carolina 27705 (United States); Richard, Samuel [Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 and Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Robins, Marthony [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Colsher, James [Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Samei, Ehsan [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Department of Biomedical Engineering, and Department of Electronic and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States)

    2013-11-15

Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables. Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.
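
    As a reading aid for the accuracy and precision terminology used above, the sketch below computes a percent bias (accuracy) and a repeatability coefficient (precision) from hypothetical repeated volume measurements of the 9.5 mm nodule; the measurement values and the choice of summary statistics are illustrative, not those of the study.

```python
# Minimal sketch: summarizing accuracy and precision from repeated volume measurements.
import numpy as np

true_volume = 4.0 / 3.0 * np.pi * (9.5 / 2.0) ** 3          # mm^3, 9.5 mm sphere
measured = np.array([455.1, 448.7, 462.3, 450.9, 458.0])    # hypothetical repeat scans (mm^3)

bias_pct = 100.0 * (measured.mean() - true_volume) / true_volume   # accuracy: percent bias
repeatability = 1.96 * np.sqrt(2.0) * measured.std(ddof=1)         # precision: 95% repeatability coefficient

print(f"true volume      : {true_volume:.1f} mm^3")
print(f"percent bias     : {bias_pct:+.1f} %")
print(f"repeatability RC : {repeatability:.1f} mm^3")
```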

  20. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    International Nuclear Information System (INIS)

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Raghunath, Nivedita; Votaw, John R.; Aarsvold, John N.; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.

    2012-01-01

Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study presents the development of quantification tools including MR-based AC for quantification in combined MR/PET for brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [¹¹C]PIB was acquired using a high-resolution research tomography (HRRT) PET. MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET.

  1. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).

    Science.gov (United States)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-11-01

Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.

  2. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    International Nuclear Information System (INIS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-01-01

Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables. Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.

  3. Section on High Resolution Optical Imaging (HROI)

    Data.gov (United States)

    Federal Laboratory Consortium — The Section on High Resolution Optical Imaging (HROI) develops novel technologies for studying biological processes at unprecedented speed and resolution. Research...

  4. High-resolution mass spectrometry in toxicology: current status and future perspectives.

    Science.gov (United States)

    Maurer, H H; Meyer, Markus R

    2016-09-01

    This paper reviews high-resolution mass spectrometry (HRMS) approaches using time-of-flight or Orbitrap techniques for research and application in various toxicology fields, particularly in clinical toxicology and forensic toxicology published since 2013 and referenced in PubMed. In the introduction, an overview on applications of HRMS in various toxicology fields is given with reference to current review articles. Papers concerning HRMS in metabolism, screening, and quantification of pharmaceuticals, drugs of abuse, and toxins in human body samples are critically reviewed. Finally, a discussion on advantages as well as limitations and future perspectives of these methods is included.

  5. High-resolution MR imaging of urethra for incontinence by means of intracavitary surface coils

    International Nuclear Information System (INIS)

    Yang, A.; Mostwin, J.L.; Genadry, R.; Yang, S.S.

    1991-01-01

Urinary incontinence is a major medical problem affecting millions of older women. This paper demonstrates the use of dynamic MR imaging in noninvasive quantification of prolapse in all three pelvic compartments. In this exhibit we use high-resolution MR imaging with intracavitary (intravaginal, intrarectal) and surface/intracavitary coils to diagnose intrinsic urethral pathology that prevents opening (dysuria) or coaptation (incontinence). Normal anatomy, congenital abnormalities (pelvic floor defects, hypoplasia), acquired abnormalities (periurethral cyst/diverticulum, tumor, hypertrophy), and operative failure as causes of incontinence (postoperative scarring, misplacement/dehiscence of sutures and flaps) are shown. We demonstrate a novel method for MR cine voiding cystourethrography. Technical factors and applications are discussed.

  6. Determination of the usage factor of components after cyclic loading using high-resolution microstructural investigations

    International Nuclear Information System (INIS)

    Seibold, A.; Scheibe, A.; Assmann, H.D.

    1989-01-01

The usage factor can be derived from the quantification of the structure changes and the allocation of the microstructural state to the fatigue curves of the component materials. Using the example of the low alloy fine grain structural steel 20 MnMoNi 5 5 (annealed structure), the relationship between microstructure and the number of load cycles is shown in the form of a calibration curve. By high resolution structural investigation, the usage factor can be determined to be n = N/N_B ≈ 0.5 under a given vibration stress. Only a small volume sample is required for the electron microscope examination. (orig./DG)

  7. Resolution improvement of brain PET images using prior information from MRI: clinical application on refractory epilepsy

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesus; Tsoumpas, Charalampos; Aguiar, Pablo; Cortes, Julia; Urdaneta, Jesus Lopez

    2015-01-01

An important limitation of clinical Positron Emission Tomography (PET) for early diagnosis of neurological diseases is its low resolution. This is particularly important when evaluating diseases related to small hypometabolisms, such as epilepsy. In recent years, new hybrid systems combining PET with Magnetic Resonance (MR) have been increasingly used for several different clinical applications. One of the advantages of MR is the production of high spatial resolution images, and a potential application of PET-MR imaging is the improvement of PET resolution using MR information. A potential advantage of resolution recovery of PET images is the enhancement of contrast, delivering at the same time better detectability of small lesions or hypometabolic areas and more accurate quantification over these areas. Recently, Shidahara et al (2009) proposed a new method using wavelet transforms in order to produce PET images with higher resolution. We optimised Shidahara's method (SFS-RR) to take into account possible shortcomings in the particular clinical datasets, and applied it to a group of patients diagnosed with refractory epilepsy. FDG-PET and MRI images were acquired sequentially and then co-registered using software tools. A complete evaluation of the PET/MR images was performed before and after the correction, including different parameters related to PET quantification, such as atlas-based metabolism asymmetry coefficients and Statistical Parametric Mapping results compared to a database of 87 healthy subjects. Furthermore, an experienced physician analyzed the non-corrected and corrected images in order to evaluate improvements in detectability on visual inspection. Clinical outcome was used as the gold standard. SFS-RR was shown to have a positive impact on the clinical diagnosis of small hypometabolisms. New lesions were detected, providing additional clinically relevant information on visual inspection. SPM sensitivity for the detection of small

  8. Resolution improvement of brain PET images using prior information from MRI: clinical application on refractory epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Silva-Rodríguez, Jesus [Instituto de Investigaciones Sanitarias (IDIS), Santiago de Compostela (Spain); Tsoumpas, Charalampos [University of Leeds, Leeds (United Kingdom); Aguiar, Pablo; Cortes, Julia [Nuclear Medicine Department, University Hospital (CHUS), Santiago de Compostela (Spain); Urdaneta, Jesus Lopez [Instituto de Investigaciones Sanitarias (IDIS), Santiago de Compostela (Spain)

    2015-05-18

An important limitation of clinical Positron Emission Tomography (PET) for early diagnosis of neurological diseases is its low resolution. This is particularly important when evaluating diseases related to small hypometabolisms, such as epilepsy. In recent years, new hybrid systems combining PET with Magnetic Resonance (MR) have been increasingly used for several different clinical applications. One of the advantages of MR is the production of high spatial resolution images, and a potential application of PET-MR imaging is the improvement of PET resolution using MR information. A potential advantage of resolution recovery of PET images is the enhancement of contrast, delivering at the same time better detectability of small lesions or hypometabolic areas and more accurate quantification over these areas. Recently, Shidahara et al (2009) proposed a new method using wavelet transforms in order to produce PET images with higher resolution. We optimised Shidahara's method (SFS-RR) to take into account possible shortcomings in the particular clinical datasets, and applied it to a group of patients diagnosed with refractory epilepsy. FDG-PET and MRI images were acquired sequentially and then co-registered using software tools. A complete evaluation of the PET/MR images was performed before and after the correction, including different parameters related to PET quantification, such as atlas-based metabolism asymmetry coefficients and Statistical Parametric Mapping results compared to a database of 87 healthy subjects. Furthermore, an experienced physician analyzed the non-corrected and corrected images in order to evaluate improvements in detectability on visual inspection. Clinical outcome was used as the gold standard. SFS-RR was shown to have a positive impact on the clinical diagnosis of small hypometabolisms. New lesions were detected, providing additional clinically relevant information on visual inspection. SPM sensitivity for the detection of small

  9. Berkeley High-Resolution Ball

    International Nuclear Information System (INIS)

    Diamond, R.M.

    1984-10-01

Criteria for a high-resolution γ-ray system are discussed. Desirable properties are high resolution, good response function, and moderate solid angle so as to achieve not only double- but triple-coincidences with good statistics. The Berkeley High-Resolution Ball involved the first use of bismuth germanate (BGO) as an anti-Compton shield for Ge detectors. The resulting compact shield permitted rather close packing of 21 detectors around a target. In addition, a small central BGO ball gives the total γ-ray energy and multiplicity, as well as the angular pattern of the γ rays. The 21-detector array is nearly complete, and the central ball has been designed, but not yet constructed. First results taken with 9 detector modules are shown for the nucleus ¹⁵⁶Er. The complex decay scheme indicates a transition from collective rotation (prolate shape) to single-particle states (possibly oblate) near spin 30 ℏ, and has other interesting features.

  10. High resolution optical DNA mapping

    Science.gov (United States)

    Baday, Murat

Many types of diseases including cancer and autism are associated with copy-number variations in the genome. Most of these variations could not be identified with existing sequencing and optical DNA mapping methods. We have developed a multi-color super-resolution technique, with potential for high throughput and low cost, which allows us to recognize more of these variations. Our technique has achieved a 10-fold improvement in the resolution of optical DNA mapping. Using a 180 kb BAC clone as a model system, we resolved dense patterns from 108 fluorescent labels of two different colors representing two different sequence-motifs. Overall, a detailed DNA map with 100 bp resolution was achieved, which has the potential to reveal detailed information about genetic variation and to facilitate medical diagnosis of genetic disease.

  11. KINOFORM LENSES - TOWARD NANOMETER RESOLUTION.

    Energy Technology Data Exchange (ETDEWEB)

    STEIN, A.; EVANS-LUTTERODT, K.; TAYLOR, A.

    2004-10-23

While hard x-rays have wavelengths in the nanometer and sub-nanometer range, the ability to focus them is limited by the quality of sources and optics, and not by the wavelength. A few options, including reflective (mirrors), diffractive (zone plates), and refractive (CRLs), are available, each with their own limitations. Here we present our work with kinoform lenses, which are refractive lenses with all material causing redundant 2π phase shifts removed to reduce the absorption problems inherently limiting the resolution of refractive lenses. By stacking kinoform lenses together, the effective numerical aperture, and thus the focusing resolution, can be increased. The present status of kinoform lens fabrication and testing at Brookhaven is presented as well as future plans toward achieving nanometer resolution.

  12. Microhemodynamic parameters quantification from intravital microscopy videos

    International Nuclear Information System (INIS)

    Ortiz, Daniel; Cabrales, Pedro; Briceño, Juan Carlos

    2014-01-01

Blood flow and blood–endothelium interactions are associated with the genesis of cardiovascular diseases. Therefore, quantitative analysis of blood flow dynamics at the microcirculation level is of special interest. Regulatory mechanisms mediated by blood flow have been studied in detail using in vitro approaches. However, these mechanisms have not been fully validated in vivo due to technical limitations that arise when quantifying microhemodynamics with the required level of detail. Intravital microscopy combined with high-speed video recordings has been used for the analysis of blood flow in small blood vessels of chronic and acute experimental tissue preparations. This tool can be used to study the interaction between the flowing blood and the vessel walls of arterioles and venules with sufficient temporal and spatial resolution. Our objective was to develop a simple and robust cross-correlation algorithm for the automatic analysis of high-speed video recordings of microcirculatory blood flow. The algorithm was validated using in vitro and in vivo systems. Results indicate that the algorithm's ability to estimate the velocity of local red blood cells as a function of blood vessel radius is highly accurate. They thereby suggest that the algorithm could be used to explore dynamic changes in blood flow under different experimental conditions including a wide range of flow rates and hematocrit levels. The algorithm can also be used to measure volumetric flow rates, radial velocity profiles, wall shear rate, and wall shear stress. Several applications are presently explored, including the analysis of velocity profiles in the branches of arterial bifurcations. This work demonstrates the robustness of the cross-correlation technique in various flow conditions and elucidates its potential application for in vivo determination of blood flow dynamics in the microcirculation. (paper)
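
    The core of the described method is cross-correlation of successive high-speed frames to estimate red blood cell displacement. The sketch below shows the basic idea on a synthetic 1-D intensity profile; the pixel size, frame rate, and signal are invented placeholders rather than the authors' implementation.

```python
# Minimal sketch: estimate the frame-to-frame shift of an intensity profile by
# cross-correlation, then convert it to a velocity with frame rate and pixel size.
import numpy as np

rng = np.random.default_rng(2)
n = 256
profile = rng.normal(size=n)                 # intensity pattern of passing red cells
true_shift = 7                               # pixels between consecutive frames
frame1 = profile
frame2 = np.roll(profile, true_shift) + 0.1 * rng.normal(size=n)

# Full cross-correlation; the lag of the peak is the displacement estimate.
corr = np.correlate(frame2 - frame2.mean(), frame1 - frame1.mean(), mode="full")
lags = np.arange(-n + 1, n)
shift_px = lags[np.argmax(corr)]

pixel_size_um = 0.8                          # hypothetical spatial calibration
frame_rate_hz = 2000.0                       # hypothetical high-speed camera rate
velocity_um_s = shift_px * pixel_size_um * frame_rate_hz
print(f"estimated shift: {shift_px} px -> velocity ≈ {velocity_um_s:.0f} µm/s")
```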

  13. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-09-01

In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches of atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use a Eulerian chemical transport model CMAQ and a Lagrangian Particle Dispersion Model - FLEXPART-WRF. These two models share the same WRF
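
    One of the prototyping studies mentioned above uses a Polynomial Chaos Expansion as a cheap surrogate of the transport model for MCMC. The sketch below fits a one-dimensional Hermite (probabilists') expansion by least squares to a stand-in forward model; the model, input distribution, and expansion degree are assumptions for illustration only.

```python
# Minimal sketch of a polynomial chaos surrogate for one standard-normal input.
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(3)

def expensive_forward_model(xi):
    # Stand-in for a regional transport simulation evaluated at parameter xi.
    return np.exp(0.3 * xi) + 0.1 * np.sin(2.0 * xi)

# Training design: a modest number of model runs.
xi_train = rng.standard_normal(64)
y_train = expensive_forward_model(xi_train)

degree = 6
A = He.hermevander(xi_train, degree)            # design matrix of He_0 .. He_6
coeffs, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def surrogate(xi):
    # Cheap evaluation used in place of the full model during sampling.
    return He.hermevander(np.atleast_1d(xi), degree) @ coeffs

xi_test = rng.standard_normal(10000)
err = surrogate(xi_test) - expensive_forward_model(xi_test)
print(f"surrogate RMS error over test samples: {np.sqrt(np.mean(err**2)):.2e}")
```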

  14. Singularity resolution in quantum gravity

    International Nuclear Information System (INIS)

    Husain, Viqar; Winkler, Oliver

    2004-01-01

    We examine the singularity resolution issue in quantum gravity by studying a new quantization of standard Friedmann-Robertson-Walker geometrodynamics. The quantization procedure is inspired by the loop quantum gravity program, and is based on an alternative to the Schroedinger representation normally used in metric variable quantum cosmology. We show that in this representation for quantum geometrodynamics there exists a densely defined inverse scale factor operator, and that the Hamiltonian constraint acts as a difference operator on the basis states. We find that the cosmological singularity is avoided in the quantum dynamics. We discuss these results with a view to identifying the criteria that constitute 'singularity resolution' in quantum gravity

  15. USI A-43 resolution positions

    International Nuclear Information System (INIS)

    1983-04-01

NUREG-0869 comprises the following documents: Proposed Regulatory Guide 1.82, Revision 1, Sump for Emergency Core Cooling and Containment Spray Systems; The Value-Impact Statement for USI A-43, Containment Emergency Sump Performance; and Background and Summary of Minutes of Meetings of the Committee to Review Generic Requirements Regarding Unresolved Safety Issue A-43 Resolution. The report has been assembled to facilitate obtaining comment feedback on the position developed for resolution of USI A-43. There are no licensing requirements contained in NUREG-0869, and it should be clearly noted that this for-comment report will not be used as interim requirements

  16. Connecticut church passes genetics resolution.

    Science.gov (United States)

    Culliton, B J

    1984-11-09

    The Connecticut Conference of the United Church of Christ, which represents the largest Protestant denomination in the state, has passed a resolution affirming an ethical duty to do research on human gene therapy and is planning to form local church groups to study the scientific and ethical issues involved. The resolution is intended to counter an earlier one proposed by Jeremy Rifkin to ban all efforts at engineering specific traits into the human germline. The Rifkin proposal had been endorsed by a large number of religious leaders, including the head of the U.S. United Church of Christ, but was subsequently characterized by many of the church leaders as overly restrictive.

  17. Requirements on high resolution detectors

    Energy Technology Data Exchange (ETDEWEB)

    Koch, A. [European Synchrotron Radiation Facility, Grenoble (France)

    1997-02-01

For a number of microtomography applications X-ray detectors with a spatial resolution of 1 μm are required. This high spatial resolution will influence and degrade other parameters of secondary importance like detective quantum efficiency (DQE), dynamic range, linearity and frame rate. This note summarizes the most important arguments, for and against those detector systems which could be considered. This article discusses the mutual dependencies between the various figures which characterize a detector, and tries to give some ideas on how to proceed in order to improve present technology.

  18. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
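
    A small numerical illustration of two figures reported above, the copies-to-IU conversion factor (0.62) and the average 0.54 log₁₀ offset between the assays; the input viral loads below are hypothetical examples, not study data.

```python
# Minimal sketch: converting copies/mL to IU/mL and applying the reported log10 offset.
copies_per_ml = 25_000.0                 # hypothetical Real-Q result in copies/mL
iu_per_copy = 0.62                       # conversion factor reported for the Real-Q assay
iu_per_ml = copies_per_ml * iu_per_copy

log10_real_q = 4.2                       # hypothetical Real-Q load, log10 copies/mL
log10_artus = log10_real_q - 0.54        # average offset between the two assays in the study
print(f"{copies_per_ml:.0f} copies/mL ≈ {iu_per_ml:.0f} IU/mL")
print(f"expected artus reading ≈ {10 ** log10_artus:.0f} copies/mL")
```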

  19. Resolution 1540, ten years on

    International Nuclear Information System (INIS)

    Hautecouverture, Benjamin

    2014-06-01

Adopted on 28 April 2004 by the United Nations Security Council under Chapter VII of the UN Charter, Resolution 1540 is a composite tool that was hitherto unprecedented. To recap, States are bound to 'refrain from providing any form of support to non-State actors that attempt to develop, acquire, manufacture, possess, transport, transfer or use nuclear, chemical or biological weapons and their means of delivery' (par. 1), and to prohibit and prevent non-State actors from the aforementioned through 'appropriate and effective' (par. 2,3) legal, judiciary, and administrative means. A Committee was established to which States had to submit a first report outlining the steps 'they have taken or intend to take to implement this resolution' (par. 4). This Committee was initially established for two years and has been regularly renewed since, and its mandate was extended in 2011 for ten years. It is not a surveillance mechanism. Finally, with the aim of remedying difficulties that certain States may experience in implementing the Resolution, 'States in a position to do so' are invited to offer assistance (par. 7). The level of the application of Resolution 1540 was originally based on a delicate three-pronged balance of obligation, good will, and partnership. It is not a matter of singling out certain States before the rest of the international community; the aim was to avoid the exercise being limited to the submission of national reports and instead to initiate a dynamic. The wager was a risky one. Ten years on, 90% of UN member States have submitted one or several implementation reports. 170 States and 50 international and regional organisations have taken part in outreach and implementation support events. Whatever quantitative or qualitative conclusions can be reached, we should continue to promote the Resolution's universal adoption, and to ensure that the implementation of its provisions is undertaken in a lasting manner, taking account of the national

  20. RESOLUTION

    CERN Multimedia

    STAFF ASSOCIATION

    2010-01-01

    Research without a budget = Europe without a future !   Noting that the CERN Management has submitted to the Member States for the Finance Committee meeting on 25th August 2010 a budget for 2011 and a medium-term plan (MTP) for the period 2012-2015; Deploring the fact that, on the Member States’ request, this plan proposes a reduction of resources of 478 million Swiss francs over the period 2011–2015, compared to the initial proposal by the Management, which corresponded even then to the minimum needed to exploit the machines and experiments; Recalling that, following a decision by Council in 1996, CERN has suffered an annual budget cut of 100 million Swiss francs; Considering that this approach equates to an abandonment by the Member States of the European Union of a policy agreed upon in Barcelona in 2003 to invest 3% of their GDP in R&D by 2010, and today they can barely manage 1.85%; Considering that these budget cuts imposed on CERN compromise not on...

  1. PIV uncertainty quantification by image matching

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Wieneke, Bernhard

    2013-01-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087–105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the
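
    The error estimation step described above reduces to simple statistics over the disparity vectors inside an interrogation window: the mean indicates a systematic error and the scatter the random error. The sketch below illustrates this on synthetic disparity data; the bias, scatter, and the way the two components are combined are illustrative assumptions, not the paper's exact estimator.

```python
# Minimal sketch: systematic and random error estimates from particle disparity vectors.
import numpy as np

rng = np.random.default_rng(4)
n_particles = 40                                       # matched particle pairs in one window
true_bias = np.array([0.08, -0.03])                    # px, synthetic systematic mismatch
disparity = true_bias + 0.15 * rng.normal(size=(n_particles, 2))   # px, synthetic disparities

bias = disparity.mean(axis=0)                          # systematic error estimate (px)
scatter = disparity.std(axis=0, ddof=1)                # dispersion of the disparities (px)
random_err = scatter / np.sqrt(n_particles)            # uncertainty of the mean (px)

# One simple way to combine the two contributions into a single uncertainty value.
combined = np.sqrt(bias**2 + random_err**2)
print("bias (px):    ", np.round(bias, 3))
print("random (px):  ", np.round(random_err, 3))
print("combined (px):", np.round(combined, 3))
```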

  2. African Journal on Conflict Resolution

    African Journals Online (AJOL)

The African Journal on Conflict Resolution (AJCR) publishes the writings of a wide range of African and international authors in the field, but emphasis has deliberately been kept on African writers and the thinking emerging from African universities, colleges and organisations. Other websites associated with this Journal: ...

  3. Fractional charge resolution in music

    International Nuclear Information System (INIS)

    Romero, J.L.; Brady, F.P.; Christie, B.

    1984-01-01

    Recent results obtained with MUSIC (MUltiple Sampling Ionization Chamber) for La and Ar beams at the Bevalac show resolutions better than ΔZ(FWHM) = 0.3 e. These results suggest the use of MUSIC in future ultrarelativistic heavy ion collisions

  4. Entity resolution for uncertain data

    NARCIS (Netherlands)

    Ayat, N.; Akbarinia, R.; Afsarmanesh, H.; Valduriez, P.

    2012-01-01

Entity resolution (ER), also known as duplicate detection or record matching, is the problem of identifying the tuples that represent the same real world entity. In this paper, we address the problem of ER for uncertain data, which we call ERUD. We propose two different approaches for the ERUD problem.

  5. Picosecond resolution programmable delay line

    International Nuclear Information System (INIS)

    Suchenek, Mariusz

    2009-01-01

The note presents an implementation of a programmable delay line for digital signals. The tested circuit has a subnanosecond delay range programmable with a resolution of picoseconds. Implementation of the circuit was based on low-cost components, easily available on the market. (technical design note)

  6. Spatial resolution in visual memory.

    Science.gov (United States)

    Ben-Shalom, Asaf; Ganel, Tzvi

    2015-04-01

    Representations in visual short-term memory are considered to contain relatively elaborated information on object structure. Conversely, representations in earlier stages of the visual hierarchy are thought to be dominated by a sensory-based, feed-forward buildup of information. In four experiments, we compared the spatial resolution of different object properties between two points in time along the processing hierarchy in visual short-term memory. Subjects were asked either to estimate the distance between objects or to estimate the size of one of the objects' features under two experimental conditions, of either a short or a long delay period between the presentation of the target stimulus and the probe. When different objects were referred to, similar spatial resolution was found for the two delay periods, suggesting that initial processing stages are sensitive to object-based properties. Conversely, superior resolution was found for the short, as compared with the long, delay when features were referred to. These findings suggest that initial representations in visual memory are hybrid in that they allow fine-grained resolution for object features alongside normal visual sensitivity to the segregation between objects. The findings are also discussed in reference to the distinction made in earlier studies between visual short-term memory and iconic memory.

  7. High resolution metric imaging payload

    Science.gov (United States)

    Delclaud, Y.

    2017-11-01

Alcatel Space Industries has become Europe's leader in the field of high and very high resolution optical payloads, in the framework of earth observation systems able to provide military governments with metric images from space. This leadership allowed ALCATEL to propose for the export market, within a French collaboration framework, a complete space-based system for metric observation.

  8. Improved quantification for local regions of interest in preclinical PET imaging

    Science.gov (United States)

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-09-01

In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g. ⁶⁸Ga and ¹²⁴I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with a PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides ¹⁸F, ⁶⁸Ga and ¹²⁴I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the ‘spillover contamination’, which causes inaccurate quantification of lesions in the immediate neighborhood of large, ‘hot’ sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For ¹⁸F, the contrast was improved from 3.0 to 4.0 in hot lesions (when the true ratio was 4.0) and from 0.16 to 0.06 in cold lesions (true ratio = 0.0), when using the LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity.
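
    The 'simple inversion problem' mentioned above can be pictured as a small least-squares fit: blurring each segmented tissue mask with the scanner PSF yields basis functions whose coefficients are the tissue activities. The 1-D sketch below is a toy illustration under assumed geometry, PSF width, and activities, not the published LPA implementation.

```python
# Minimal 1-D sketch: recover per-tissue activities from a blurred profile
# using blurred segmentation masks as basis functions.
import numpy as np

n = 200
masks = np.zeros((2, n))
masks[0, 60:90] = 1.0                      # "lesion" tissue from the CT/MR segmentation
masks[1, :] = 1.0 - masks[0]               # background tissue

true_activity = np.array([4.0, 1.0])       # kBq/mL, lesion vs background (invented)

def blur(profile, sigma_px=4.0):
    # Gaussian PSF applied by direct convolution (stand-in for the scanner resolution).
    half = int(4 * sigma_px)
    k = np.exp(-0.5 * (np.arange(-half, half + 1) / sigma_px) ** 2)
    k /= k.sum()
    return np.convolve(profile, k, mode="same")

pet = blur(true_activity @ masks) + 0.05 * np.random.default_rng(5).normal(size=n)

A = np.stack([blur(m) for m in masks], axis=1)      # blurred tissue basis functions
est, *_ = np.linalg.lstsq(A, pet, rcond=None)
print("estimated activities:", np.round(est, 2))    # close to [4.0, 1.0]
```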

  9. High resolution tomographic instrument development

    International Nuclear Information System (INIS)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational

  10. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-08-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  11. High resolution tomographic instrument development

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    Our recent work has concentrated on the development of high-resolution PET instrumentation reflecting in part the growing importance of PET in nuclear medicine imaging. We have developed a number of positron imaging instruments and have the distinction that every instrument has been placed in operation and has had an extensive history of application for basic research and clinical study. The present program is a logical continuation of these earlier successes. PCR-I, a single ring positron tomograph was the first demonstration of analog coding using BGO. It employed 4 mm detectors and is currently being used for a wide range of biological studies. These are of immense importance in guiding the direction for future instruments. In particular, PCR-II, a volume sensitive positron tomograph with 3 mm spatial resolution has benefited greatly from the studies using PCR-I. PCR-II is currently in the final stages of assembly and testing and will shortly be placed in operation for imaging phantoms, animals and ultimately humans. Perhaps the most important finding resulting from our previous study is that resolution and sensitivity must be carefully balanced to achieve a practical high resolution system. PCR-II has been designed to have the detection characteristics required to achieve 3 mm resolution in human brain under practical imaging situations. The development of algorithms by the group headed by Dr. Chesler is based on a long history of prior study including his joint work with Drs. Pelc and Reiderer and Stearns. This body of expertise will be applied to the processing of data from PCR-II when it becomes operational.

  12. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    Science.gov (United States)

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise, linear (r² > 0.99 over the range 0.01-1384 fmol/μL), and sensitive. Using LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies. © 2017 The Authors. Clinical & Experimental Allergy published by John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  13. Very High Spectral Resolution Imaging Spectroscopy: the Fluorescence Explorer (FLEX) Mission

    Science.gov (United States)

    Moreno, Jose F.; Goulas, Yves; Huth, Andreas; Middleton, Elizabeth; Miglietta, Franco; Mohammed, Gina; Nedbal, Ladislav; Rascher, Uwe; Verhoef, Wouter; Drusch, Matthias

    2016-01-01

    The Fluorescence Explorer (FLEX) mission has recently been selected as the 8th Earth Explorer by the European Space Agency (ESA). It will be the first mission specifically designed to measure vegetation fluorescence emission from space, making use of very high spectral resolution imaging spectroscopy techniques. Vegetation fluorescence is the best proxy for actual vegetation photosynthesis that can be measured from space, allowing an improved quantification of vegetation carbon assimilation and vegetation stress conditions, and thus having key relevance for global mapping of ecosystem dynamics and for aspects related to agricultural production and food security. The FLEX mission carries the FLORIS spectrometer, with a spectral resolution in the range of 0.3 nm, and is designed to fly in tandem with Copernicus Sentinel-3 in order to provide all the necessary spectral/angular information to disentangle emitted fluorescence from reflected radiance, and to allow proper interpretation of the observed fluorescence spatial and temporal dynamics.

  14. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
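
    As a rough, self-contained illustration of the idea (not the authors' implementation), the sketch below estimates the fraction of a bounded input box that satisfies a target constraint by plain Monte Carlo sampling, after first shrinking the box with a hand-coded interval argument so that samples are not wasted on a region that can never contain solutions; the constraint, the bounds, and the narrowing step are all hypothetical.

```python
import random

# Hypothetical target constraint over two bounded floating-point inputs.
def constraint(x, y):
    return x * x + y > 1.5 and y < 0.8

def estimate_fraction(bounds, n_samples=100_000, seed=0):
    """Plain Monte Carlo estimate of the fraction of the input box
    satisfying the constraint (stand-in for the statistical part)."""
    rng = random.Random(seed)
    (x_lo, x_hi), (y_lo, y_hi) = bounds
    hits = sum(
        constraint(rng.uniform(x_lo, x_hi), rng.uniform(y_lo, y_hi))
        for _ in range(n_samples)
    )
    return hits / n_samples

# Very crude stand-in for interval constraint propagation: the clause y < 0.8
# lets us shrink the y-interval before sampling.
full_box = ((-2.0, 2.0), (-2.0, 2.0))
narrowed_box = ((-2.0, 2.0), (-2.0, 0.8))

p_narrowed = estimate_fraction(narrowed_box)
# Rescale to the full box: P(full) = P(narrowed) * vol(narrowed) / vol(full).
vol_ratio = (0.8 - (-2.0)) / (2.0 - (-2.0))
print("estimated fraction over full box:", p_narrowed * vol_ratio)
```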

  15. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  16. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene Klærke; Gesmar, Henrik

    2001-01-01

    Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice-selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001.

  17. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  18. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain over scintillation counting in sensitivity for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose scintillation experiments. Disposable equipment, materials and surfaces are used to control this contamination. Quantification of attomole amounts of labeled substances is possible through these techniques.

  19. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document will be that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible.
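
    For orientation, the dependence adjustments usually quoted in this context are the closed-form THERP equations of Swain and Guttmann; the snippet below reproduces those commonly published formulas on the assumption that they are the scheme referred to here, and the basic error probability used in the example is a placeholder.

```python
def conditional_hep(basic_hep, dependence_level):
    """Conditional human error probability for a task given failure on the
    immediately preceding task, using the commonly published THERP
    dependence equations (zero/low/moderate/high/complete dependence)."""
    p = basic_hep
    formulas = {
        "zero": p,
        "low": (1 + 19 * p) / 20,
        "moderate": (1 + 6 * p) / 7,
        "high": (1 + p) / 2,
        "complete": 1.0,
    }
    return formulas[dependence_level]

# Example: a task with a basic HEP of 0.003 following a failed task.
for level in ("zero", "low", "moderate", "high", "complete"):
    print(level, round(conditional_hep(0.003, level), 4))
```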

  20. Metering error quantification under voltage and current waveform distortion

    Science.gov (United States)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion results in metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined energy metering error is an important subject of study. In this paper, after comparing theoretical metering values with actually recorded values under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error, a comprehensive error analysis method suitable for renewable generation and nonlinear loads is presented. The proposed method has been verified by simulation.

  1. CT quantification of pleuropulmonary lesions in severe thoracic trauma

    International Nuclear Information System (INIS)

    Kunisch-Hoppe, M.; Bachmann, G.; Weimar, B.; Bauer, T.; Rau, W.S.; Hoppe, M.; Zickmann, B.

    1997-01-01

    Purpose: Computed quantification of the extent of pleuropulmonary trauma by CT and comparison with conventional chest X-ray - impact on therapy and correlation with mechanical ventilation support and clinical outcome. Method: In a prospective trial, 50 patients with clinically suspected blunt chest trauma were evaluated using CT and conventional chest X-ray. The computed quantification of ventilated lung provided by CT volumetry was correlated with the subsequent artificial respiration parameters and the clinical outcome. Results: We found a high correlation between CT volumetry and artificial ventilation concerning maximal pressures and inspiratory oxygen concentration (FiO2, Goris score) (r=0.89, Pearson). The graduation of thoracic trauma correlated highly with the duration of mechanical ventilation (r=0.98, Pearson). Especially with regard to atelectases and lung contusions, CT is superior to conventional chest X-ray; only 32% and 43%, respectively, were identified by conventional chest X-ray. (orig./AJ)

  2. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption onto solids, a reliable and sufficiently fast technique for the detection and quantification of uranyl is needed. In this work, we therefore set out to quantify uranyl in the presence of citric acid by modifying the UV-Vis-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that enhances the fluorescence of the uranyl ion while avoiding the quenching produced by the organic acids. (Author)

  3. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  4. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are nowadays commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Metabolite profiling and quantification of phytochemicals in potato extracts using ultra-high-performance liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Chong, Esther Swee Lan; McGhie, Tony K; Heyes, Julian A; Stowell, Kathryn M

    2013-12-01

    Potatoes contain a diverse range of phytochemicals which have been suggested to have health benefits. Metabolite profiling and quantification were conducted on plant extracts made from a white potato cultivar and 'Urenika', a purple potato cultivar traditionally consumed by New Zealand Maori. There is limited published information regarding the metabolite profile of Solanum tuberosum cultivar 'Urenika'. Using ultra-high-performance liquid chromatography-mass spectrometry (UHPLC-MS), a total of 31 compounds were identified and quantified in the potato extracts. The majority of the compounds were identified for the first time in 'Urenika'. These compounds include several types of anthocyanins, hydroxycinnamic acid (HCA) derivatives, and hydroxycinnamic amides (HCAA). Six classes of compounds, namely organic acids, amino acids, HCA, HCAA, flavonols and glycoalkaloids, were present in both extracts but their quantities varied between the two extracts. The unknown plant metabolites in both potato extracts were assigned molecular formulae and identified with high confidence. Quantification of the metabolites was achieved using a number of appropriate standards. High-resolution mass spectrometry data critical for accurate identification of unknown phytochemicals were obtained and could be added to potato or plant metabolomic databases. © 2013 Society of Chemical Industry.

  6. In-line monitoring of cocrystallization process and quantification of carbamazepine-nicotinamide cocrystal using Raman spectroscopy and chemometric tools

    Science.gov (United States)

    Soares, Frederico L. F.; Carneiro, Renato L.

    2017-06-01

    A cocrystallization process may involve several molecular species, which are generally solid under ambient conditions. Thus, accurate monitoring of the different components that might appear during the reaction is necessary, as well as quantification of the final product. This work reports for the first time the synthesis of carbamazepine-nicotinamide cocrystal in aqueous media with full conversion. The reactions were monitored by Raman spectroscopy coupled with Multivariate Curve Resolution - Alternating Least Squares (MCR-ALS), and the quantification of the final product among its coformers was performed using Raman spectroscopy and Partial Least Squares (PLS) regression. The slurry reaction was carried out under four different conditions: room temperature, 40 °C, 60 °C and 80 °C. The slurry reaction at 80 °C enabled full conversion of the initial substrates into the cocrystal form, using water as solvent for a greener method. The use of MCR-ALS coupled with Raman spectroscopy made it possible to observe the main steps of the reactions, such as drug dissolution, nucleation and crystallization of the cocrystal. The PLS models gave mean errors of cross-validation around 2.0 (% wt/wt), and errors of validation between 2.5 and 8.2 (% wt/wt) for all components. These are good results, since the spectra of the cocrystal and of the physical mixture of the coformers share some similar peaks.
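
    A minimal sketch of the PLS quantification step, assuming scikit-learn is available and using synthetic spectra in place of the real Raman data; the preprocessing, noise level and number of latent variables are illustrative, not the authors' settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)

# Synthetic stand-in for Raman spectra: 40 mixtures, 500 wavenumber channels,
# with intensity roughly proportional to cocrystal weight fraction plus noise.
y = rng.uniform(0, 100, size=40)            # % wt/wt cocrystal
pure_component = np.exp(-0.5 * ((np.arange(500) - 250) / 20) ** 2)
X = np.outer(y / 100.0, pure_component) + 0.01 * rng.normal(size=(40, 500))

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions
rmsecv = np.sqrt(np.mean((y_cv - y) ** 2))
print(f"RMSECV: {rmsecv:.2f} % wt/wt")
```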

  7. Simultaneous quantification of carotenoids, retinol, and tocopherols in forage, bovine plasma, and milk: validation of a novel UPLC method

    Energy Technology Data Exchange (ETDEWEB)

    Chauveau-Duriot, B.; Doreau, M.; Noziere, P.; Graulet, B. [UR1213 Research Unit on Herbivores, INRA, Saint Genes Champanelle (France)

    2010-05-15

    Simultaneous quantification of various liposoluble micronutrients is not a new area of interest since these compounds participate in the nutritional quality of feeds that is largely explored in human, and also in animal diet. However, the development of related methods is still under concern, especially when the carotenoid composition is complex such as in forage given to ruminants or in lipid-rich matrices like milk. In this paper, an original method for simultaneous extraction and quantification of all carotenoids, vitamins E, and A in milk was proposed. Moreover, a new UPLC method allowing simultaneous determination of carotenoids and vitamins A and E in forage, plasma and milk, and separation of 23 peaks of carotenoids in forage was described. This UPLC method using a HSS T3 column and a gradient solvent system was compared to a previously published reverse-phase HPLC using two C18 columns in series and an isocratic solvent system. The UPLC method gave similar concentrations of carotenoids and vitamins A and E than the HPLC method. Moreover, UPLC allowed a better resolution for xanthophylls, especially lutein and zeaxanthin, for the three isomers of β-carotene (all-E-, 9Z- and 13Z-) and for vitamins A, an equal or better sensitivity according to gradient, and a better reproducibility of peak areas and retention times, but did not reduce the time required for analysis. (orig.)

  8. Simultaneous quantification of carotenoids, retinol, and tocopherols in forages, bovine plasma, and milk: validation of a novel UPLC method.

    Science.gov (United States)

    Chauveau-Duriot, B; Doreau, M; Nozière, P; Graulet, B

    2010-05-01

    Simultaneous quantification of various liposoluble micronutrients is not a new area of interest since these compounds participate in the nutritional quality of feeds that is largely explored in human, and also in animal diet. However, the development of related methods is still under concern, especially when the carotenoid composition is complex such as in forages given to ruminants or in lipid-rich matrices like milk. In this paper, an original method for simultaneous extraction and quantification of all carotenoids, vitamins E, and A in milk was proposed. Moreover, a new UPLC method allowing simultaneous determination of carotenoids and vitamins A and E in forage, plasma and milk, and separation of 23 peaks of carotenoids in forages was described. This UPLC method using a HSS T3 column and a gradient solvent system was compared to a previously published reverse-phase HPLC using two C18 columns in series and an isocratic solvent system. The UPLC method gave similar concentrations of carotenoids and vitamins A and E than the HPLC method. Moreover, UPLC allowed a better resolution for xanthophylls, especially lutein and zeaxanthin, for the three isomers of beta-carotene (all-E-, 9Z- and 13Z-) and for vitamins A, an equal or better sensitivity according to gradient, and a better reproducibility of peak areas and retention times, but did not reduce the time required for analysis.

  9. Stability Indicating HPLC Method for Simultaneous Quantification of Trihexyphenidyl Hydrochloride, Trifluoperazine Hydrochloride and Chlorpromazine Hydrochloride from Tablet Formulation

    Directory of Open Access Journals (Sweden)

    P. Shetti

    2010-01-01

    A new, simple, precise, rapid, selective and stability-indicating reversed-phase high performance liquid chromatographic (HPLC) method has been developed and validated for simultaneous quantification of trihexyphenidyl hydrochloride, trifluoperazine hydrochloride and chlorpromazine hydrochloride from a combined tablet formulation. The method is based on reverse-phase separation using a C-18 column (250×4.6 mm, 5 μm particle size). The separation is achieved using isocratic elution by methanol and ammonium acetate buffer (1% w/v, pH 6.5) in the ratio of 85:15 v/v, pumped at a flow rate of 1.0 mL/min, with UV detection at 215 nm. The column is maintained at 30 °C throughout the analysis. This method gives baseline resolution. The total run time is 15 min. Stability-indicating capability is established by forced degradation experiments. The method is validated for specificity, accuracy, precision and linearity as per International Conference on Harmonisation (ICH) guidelines. The method is accurate and linear for quantification of trihexyphenidyl hydrochloride, trifluoperazine hydrochloride and chlorpromazine hydrochloride between 5-15 μg/mL, 12.5-37.5 μg/mL and 62.5-187.5 μg/mL, respectively.

  10. Visualization and Quantification of Rotor Tip Vortices in Helicopter Flows

    Science.gov (United States)

    Kao, David L.; Ahmad, Jasim U.; Holst, Terry L.

    2015-01-01

    This paper presents an automated approach for effective extraction, visualization, and quantification of vortex core radii from Navier-Stokes simulations of a UH-60A rotor in forward flight. We adopt a scaled Q-criterion to determine vortex regions and then perform vortex core profiling in these regions to calculate vortex core radii. This method provides an efficient way of visualizing and quantifying the blade tip vortices. Moreover, the vortex core radii are displayed graphically in a plane.
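
    For orientation only, the snippet below computes the (unscaled) Q-criterion on a uniform grid of velocity components with NumPy; the scaling applied in the paper, the CFD grid handling, and the subsequent core profiling are not reproduced, and the test field is a synthetic solid-body rotation.

```python
import numpy as np

def q_criterion(u, v, w, dx=1.0, dy=1.0, dz=1.0):
    """Q = 0.5 * (||Omega||^2 - ||S||^2), where S and Omega are the symmetric
    and antisymmetric parts of the velocity gradient tensor. Positive Q marks
    rotation-dominated (vortex) regions."""
    grads = [np.gradient(f, dx, dy, dz) for f in (u, v, w)]
    G = np.array(grads)          # G[i][j] = d(u_i)/d(x_j), each a 3D field
    S2 = 0.0
    O2 = 0.0
    for i in range(3):
        for j in range(3):
            s = 0.5 * (G[i][j] + G[j][i])
            o = 0.5 * (G[i][j] - G[j][i])
            S2 = S2 + s * s
            O2 = O2 + o * o
    return 0.5 * (O2 - S2)

# Example on a tiny synthetic solid-body-rotation field (vortex along z).
n = 16
x, y, z = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n),
                      np.linspace(-1, 1, n), indexing="ij")
u, v, w = -y, x, np.zeros_like(x)
Q = q_criterion(u, v, w, dx=2/(n-1), dy=2/(n-1), dz=2/(n-1))
print("fraction of grid with Q > 0:", float((Q > 0).mean()))
```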

  11. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans Bühlmann; Pavel V. Shevchenko; Mario V. Wüthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of the internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is combining these data sources appropriately. In this paper we focus on the quantification of low-frequency high-impact losses exceeding some high threshold...

  12. Leishmania parasite detection and quantification using PCR-ELISA

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Vol. 5, No. 6 (2010), pp. 1074-1080. ISSN 1754-2189. R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009. Institutional research plan: CEZ:AV0Z50520514. Keywords: polymerase chain reaction * Leishmania major infection * parasite quantification. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 8.362, year: 2010

  13. Digital PCR for direct quantification of viruses without DNA extraction

    OpenAIRE

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2015-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration materials...

  14. Detection and quantification of Leveillula taurica growth in pepper leaves.

    Science.gov (United States)

    Zheng, Zheng; Nonomura, Teruo; Bóka, Károly; Matsuda, Yoshinori; Visser, Richard G F; Toyoda, Hideyoshi; Kiss, Levente; Bai, Yuling

    2013-06-01

    Leveillula taurica is an obligate fungal pathogen that causes powdery mildew disease on a broad range of plants, including important crops such as pepper, tomato, eggplant, onion, and cotton. The early stage of this disease is difficult to diagnose and the disease can easily spread unobserved, for example in pepper and tomato production fields and greenhouses. The objective of this study was to develop a method for detection and quantification of L. taurica biomass in pepper leaves, with special regard to the early stages of infection. We monitored the development of the disease to time the infection process on the leaf surface as well as inside the pepper leaves. The initial and final steps of the infection taking place on the leaf surface were consecutively observed using a dissecting microscope and a scanning electron microscope. The development of the intercellular mycelium in the mesophyll was followed by light and transmission electron microscopy. A pair of L. taurica-specific primers was designed based on the internal transcribed spacer sequence of L. taurica and used in a real-time polymerase chain reaction (PCR) assay to quantify the fungal DNA during infection. The specificity of this assay was confirmed by testing the primer pair with DNA from host plants and also from another powdery mildew species, Oidium neolycopersici, infecting tomato. A standard curve was obtained for absolute quantification of L. taurica biomass. In addition, we tested a relative quantification method using a plant gene as reference, and the obtained results were compared with visual disease index scoring. The real-time PCR assay for L. taurica provides a valuable tool for detection and quantification of this pathogen in breeding activities as well as in plant-microbe interaction studies.
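
    A minimal sketch of absolute quantification from a real-time PCR standard curve of the kind described above, assuming Ct values have already been exported; the dilution series, Ct numbers and the unknown sample value are made up for illustration.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured on a 10-fold dilution
# series of known pathogen DNA amounts (pg per reaction).
known_pg = np.array([1e4, 1e3, 1e2, 1e1, 1e0])
ct_std   = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

# Fit Ct = slope * log10(quantity) + intercept.
slope, intercept = np.polyfit(np.log10(known_pg), ct_std, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0      # amplification efficiency estimate
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")

# Absolute quantity of an unknown sample from its Ct value.
ct_unknown = 26.4
quantity_pg = 10 ** ((ct_unknown - intercept) / slope)
print(f"estimated fungal DNA: {quantity_pg:.1f} pg")
```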

  15. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation for the concentration uncertainties.
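
    To make the optimization idea concrete, the toy sketch below fits a crude spectrum model (two Gaussian lines on a linear background, stand-ins only) to synthetic data with SciPy's least-squares routine; POEMA's actual analytical function, its physical parameters, and the conversion of fitted intensities to concentrations are not reproduced here.

```python
import numpy as np
from scipy.optimize import least_squares

energy = np.linspace(0.5, 10.0, 1000)   # keV

def model(params, e):
    """Toy spectrum: linear bremsstrahlung background + two Gaussian lines.
    Peak positions/widths are placeholders (roughly Si Ka and Fe Ka)."""
    a_bg, b_bg, i1, i2 = params
    background = a_bg + b_bg * e
    line1 = i1 * np.exp(-0.5 * ((e - 1.74) / 0.06) ** 2)
    line2 = i2 * np.exp(-0.5 * ((e - 6.40) / 0.08) ** 2)
    return background + line1 + line2

# Synthetic "measured" spectrum with noise.
true = np.array([50.0, -2.0, 800.0, 300.0])
rng = np.random.default_rng(0)
measured = model(true, energy) + rng.normal(0, 5, energy.size)

# Minimize the quadratic differences between model and measurement.
fit = least_squares(lambda p: model(p, energy) - measured,
                    x0=np.array([10.0, 0.0, 100.0, 100.0]))
print("fitted peak intensities:", fit.x[2:])
```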

  16. Rapid quantification of biomarkers during kerogen microscale pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Stott, A.W.; Abbott, G.D. [Fossil Fuels and Environmental Geochemistry NRG, The University, Newcastle-upon-Tyne (United Kingdom)

    1995-02-01

    A rapid, reproducible method incorporating closed system microscale pyrolysis and thermal desorption-gas chromatography/mass spectrometry has been developed and applied to the quantification of sterane biomarkers released during pyrolysis of the Messel oil shale kerogen under confined conditions. This method allows a substantial experimental concentration-time data set to be collected at accurately controlled temperatures, due to the low thermal inertia of the microscale borosilicate glass reaction vessels, which facilitates kinetic studies of biomarker reactions during kerogen microscale pyrolysis

  17. 31P magnetic resonance fingerprinting for rapid quantification of creatine kinase reaction rate in vivo.

    Science.gov (United States)

    Wang, Charlie Y; Liu, Yuchi; Huang, Shuying; Griswold, Mark A; Seiberlich, Nicole; Yu, Xin

    2017-12-01

    The purpose of this work was to develop a 31P spectroscopic magnetic resonance fingerprinting (MRF) method for fast quantification of the chemical exchange rate between phosphocreatine (PCr) and adenosine triphosphate (ATP) via creatine kinase (CK). A 31P MRF sequence (CK-MRF) was developed to quantify the forward rate constant of ATP synthesis via CK (kfCK), the T1 relaxation time of PCr (T1PCr), and the PCr-to-ATP concentration ratio (MRPCr). The CK-MRF sequence used a balanced steady-state free precession (bSSFP)-type excitation with ramped flip angles and a unique saturation scheme sensitive to the exchange between PCr and γATP. Parameter estimation was accomplished by matching the acquired signals to a dictionary generated using the Bloch-McConnell equation. Simulation studies were performed to examine the susceptibility of the CK-MRF method to several potential error sources. The accuracy of nonlocalized CK-MRF measurements before and after an ischemia-reperfusion (IR) protocol was compared with the magnetization transfer (MT-MRS) method in rat hindlimb at 9.4 T (n = 14). The reproducibility of CK-MRF was also assessed by comparing CK-MRF measurements with both MT-MRS (n = 17) and four angle saturation transfer (FAST) (n = 7). Simulation results showed that CK-MRF quantification of kfCK was robust, with less than 5% error in the presence of model inaccuracies including dictionary resolution, metabolite T2 values, inorganic phosphate metabolism, and B1 miscalibration. Estimation of kfCK by CK-MRF (0.38 ± 0.02 s⁻¹ at baseline and 0.42 ± 0.03 s⁻¹ post-IR) showed strong agreement with MT-MRS (0.39 ± 0.03 s⁻¹ at baseline and 0.44 ± 0.04 s⁻¹ post-IR). kfCK estimation was also similar between CK-MRF and FAST (0.38 ± 0.02 s⁻¹ for CK-MRF and 0.38 ± 0.11 s⁻¹ for FAST). The coefficient of variation of the 20 s CK-MRF quantification of kfCK was 42% of that of the 150 s MT-MRS acquisition and 12% of that of the 20 s FAST acquisition.
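
    The dictionary-matching step common to MR fingerprinting methods can be written in a few lines; the sketch below matches a measured signal evolution to precomputed dictionary entries by normalized inner product and is generic, not specific to the 31P CK-MRF sequence, and the tiny dictionary and parameter tuples are made up.

```python
import numpy as np

def match_fingerprint(signal, dictionary, params):
    """Return the parameter set whose dictionary entry has the largest
    normalized inner product with the measured signal evolution.

    dictionary: (n_entries, n_timepoints) array of simulated signals
    params:     list of n_entries parameter tuples, e.g. (kfCK, T1PCr, MRPCr)
    """
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signal / np.linalg.norm(signal)
    best = int(np.argmax(d @ s))
    return params[best]

# Tiny illustrative dictionary (2 entries, 5 time points) with made-up values.
dictionary = np.array([[1.0, 0.8, 0.6, 0.5, 0.4],
                       [1.0, 0.6, 0.4, 0.3, 0.2]])
params = [(0.38, 3.0, 1.0), (0.50, 2.5, 0.9)]    # hypothetical (kfCK, T1PCr, MRPCr)
measured = np.array([1.0, 0.79, 0.61, 0.52, 0.41])
print("best match:", match_fingerprint(measured, dictionary, params))
```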

  18. Introducing AAA-MS, a rapid and sensitive method for amino acid analysis using isotope dilution and high-resolution mass spectrometry.

    Science.gov (United States)

    Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie

    2012-07-06

    Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.

  19. Spatial resolution in Micromegas detectors

    CERN Document Server

    Bayb, A; Giomataris, Ioanis; Zaccone, Henri; Bay, A; Perroud, Jean-Pierre; Ronga, F

    2001-01-01

    The performance of a telescope of Micromegas detectors has been studied in a pion beam at the CERN PS. With a gas filling of CF4 and 20% isobutane and with a strip pitch of 100 μm, an accuracy of 14 ± 3 μm on the spatial resolution has been measured at normal incidence. A simulation demonstrates that the resolution is limited by the size of the holes of the mesh of the detector and could be reduced to 11 μm in the same conditions with smaller holes. Even further improvement down to 8.5 μm is feasible for the same gas with an optimized 75 μm strip pitch. (5 refs).

  20. SPECT imaging with resolution recovery

    International Nuclear Information System (INIS)

    Bronnikov, A. V.

    2011-01-01

    Single-photon emission computed tomography (SPECT) is a method of choice for imaging spatial distributions of radioisotopes. Many applications of this method are found in the nuclear industry, medicine, and biomedical research. We study mathematical modeling of a micro-SPECT system by using a point-spread function (PSF) and implement an OSEM-based iterative algorithm for image reconstruction with resolution recovery. Unlike other known implementations of the OSEM algorithm, we apply an efficient computation scheme based on a useful approximation of the PSF, which ensures relatively fast computations. The proposed approach can be applied to data acquired with any type of collimator, including parallel-beam, fan-beam, cone-beam and pinhole collimators. Experimental results obtained with a micro-SPECT system demonstrate the high efficiency of resolution recovery. (authors)
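
    For context, the core of an MLEM/OSEM-style update can be sketched in a few lines; here a plain MLEM iteration is shown, with a toy system matrix standing in for projection plus detector response (PSF), and neither the subset ordering nor the paper's PSF approximation scheme is reproduced.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM update: x <- x / (A^T 1) * A^T ( y / (A x) ).
    A models projection including detector response; y are measured counts."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])            # sensitivity image
    for _ in range(n_iter):
        proj = A @ x
        proj = np.where(proj > 0, proj, 1e-12)  # guard against division by zero
        x = x / sens * (A.T @ (y / proj))
    return x

# Toy example: a 3-pixel "image" seen through a 4x3 system matrix.
A = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.1, 0.9],
              [0.3, 0.4, 0.3]])
x_true = np.array([5.0, 1.0, 3.0])
y = A @ x_true                                  # noise-free projections
print("reconstructed:", np.round(mlem(A, y), 2))
```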

  1. High Resolution Thermometry for EXACT

    Science.gov (United States)

    Panek, J. S.; Nash, A. E.; Larson, M.; Mulders, N.

    2000-01-01

    High Resolution Thermometers (HRTs) based on SQUID detection of the magnetization of a paramagnetic salt or a metal alloy have been commonly used for sub-nanokelvin temperature resolution in low temperature physics experiments. The main applications to date have been for temperature ranges near the lambda point of He-4 (2.177 K). These thermometers made use of materials such as Cu(NH4)2Br4·2H2O, GdCl3, or PdFe. None of these materials are suitable for EXACT, which will explore the region of the He-3/He-4 tricritical point at 0.87 K. The experiment requirements and properties of several candidate paramagnetic materials will be presented, as well as preliminary test results.

  2. High-resolution electron microscopy

    CERN Document Server

    Spence, John C H

    2013-01-01

    This new fourth edition of the standard text on atomic-resolution transmission electron microscopy (TEM) retains previous material on the fundamentals of electron optics and aberration correction, linear imaging theory (including wave aberrations to fifth order) with partial coherence, and multiple-scattering theory. Also preserved are updated earlier sections on practical methods, with detailed step-by-step accounts of the procedures needed to obtain the highest quality images of atoms and molecules using a modern TEM or STEM electron microscope. Applications sections have been updated - these include the semiconductor industry, superconductor research, solid state chemistry and nanoscience, and metallurgy, mineralogy, condensed matter physics, materials science and material on cryo-electron microscopy for structural biology. New or expanded sections have been added on electron holography, aberration correction, field-emission guns, imaging filters, super-resolution methods, ptychography, Ronchigrams, tomography...

  3. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose: Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods: In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures by simulating known 3D surgical changes within CMFapp. Results: Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Color maps of virtually simulated surgical displacements visualize the corresponding surface distances, precisely describing the location of changes, and difference vectors indicate the directionality and magnitude of the changes. Conclusions: SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control and enhance the follow-up and documentation of clinical cases. PMID:21161693

  4. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  5. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO): sunflower oil (SnF) and PRBO: safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods, including gas chromatographic, HPLC, ultrasonic velocity and methods based on physicochemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent parameters and the % PRBO proportion was selected as the independent parameter. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. The rice bran oil can easily be quantified in the blended oils based on the oryzanol content by HPLC even at a 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils, whereas the methods based on ultrasonic velocity, acoustic impedance and relative association showed initial promise in the quantification of rice bran oil. (Author) 23 refs.
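
    As an illustration of the regression-based quantification (not the published coefficients), the sketch below fits a straight line between oryzanol content and % PRBO in calibration blends and then predicts the rice bran oil share of an unknown blend; all numbers are hypothetical.

```python
import numpy as np

# Hypothetical calibration blends: % PRBO in the blend vs measured oryzanol
# content (mg/100 g oil). Oryzanol comes almost entirely from rice bran oil,
# so the relationship is close to linear.
prbo_percent = np.array([0, 20, 40, 60, 80, 100])
oryzanol     = np.array([2, 310, 620, 905, 1210, 1525])

slope, intercept = np.polyfit(oryzanol, prbo_percent, 1)

def estimate_prbo(oryzanol_measured):
    """Predict % PRBO from an oryzanol measurement using the fitted line."""
    return slope * oryzanol_measured + intercept

print(f"blend with 450 mg/100 g oryzanol ≈ {estimate_prbo(450):.1f} % PRBO")
```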

  6. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

    The RISP (Rare Isotope Science Project) plans to provide neutron-rich isotopes (RIs) and stable heavy ion beams. The accelerator is defined as a radiation production system according to the Nuclear Safety Law. Therefore, it needs strict operating procedures and safety assurance to prevent radiation exposure. In order to satisfy this condition, the potential risk of the accelerator needs to be evaluated from the design stage itself. Though some PSA research has been conducted for accelerators, most of it focuses not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed with event trees and a new event tree quantification methodology is derived. In this study, some initiating events, which may occur in the accelerator, are selected. Using the selected initiating events, accident scenarios of the accelerator facility are developed with event trees. These results can be used as basic data of the accelerator for future risk assessments. After analyzing the probability of each heading, it is possible to conduct the quantification and evaluate the significance of the accident result. Once accident scenarios for external events are developed, the risk assessment of the entire accelerator facility will be complete. To reduce the uncertainty of the event tree, reliable data can be produced via the presented quantification techniques.
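
    A minimal sketch of how an event-tree sequence is quantified once heading probabilities are available: multiply the initiating-event frequency by the branch probabilities along each path. The heading names and numerical values below are placeholders, not values from the RISP study.

```python
# Each sequence is a path through the event tree: the initiating-event
# frequency times success/failure probabilities of each heading.
initiating_frequency = 1.0e-2      # per year, placeholder
headings = {"beam_stop": 1.0e-3, "interlock": 5.0e-3, "shielding": 1.0e-2}

def sequence_frequency(failed):
    """Frequency of the sequence in which the headings listed in `failed`
    fail and all other headings succeed."""
    freq = initiating_frequency
    for name, p_fail in headings.items():
        freq *= p_fail if name in failed else (1.0 - p_fail)
    return freq

# Example: a release sequence in which both interlock and shielding fail.
print(f"{sequence_frequency({'interlock', 'shielding'}):.2e} per year")
```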

  7. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

    Background: Quantification of different types of cells is often needed for analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given in the form of a series of images. It is the main part of an automatic hepatocyte quantification tool that allows for the computation of the ratio between the number of proliferating HC nuclei and the total number of all HC nuclei for a series of images in one processing run. The processing pipeline allows us to obtain the desired and valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask, which is considered to be the ground truth, we achieve results with a sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction and can be applied to process entire stained sections.

  8. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  9. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. Firstly, regions of interest (membrane fragments) are identified in confocal microscopy images. Further, densitometric intensity profiles are extracted orthogonally to membrane fragments, following the direction from the plasma membrane to cytoplasm. Finally, several different quantitative descriptors were derived from the densitometric profiles and were compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for the translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.

  10. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA and bloodstains were made on filter paper. The samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given DNA extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA yields of the different extraction methods differed significantly. Complete typing of the sixteen STR loci was obtained in 90.56% of the samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best method for DNA extraction and STR profiling.

  11. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

    The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources that are essential for life. Therefore, it is necessary to evaluate the residual effect of these substances in water sources. A simple, affordable and accessible electrochemical method for paraquat and glyphosate quantification in water was developed. The study was conducted using Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, Ag/AgCl as the reference electrode, and platinum as the auxiliary electrode. The differential pulse voltammetry (DPV) methods for both compounds were validated. The methods were linear, with correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for paraquat and 40 and 50 mg/L for glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in the quantification of the analytes. In the samples tested, paraquat concentrations ranged from 0.011 to 1.572 mg/L and glyphosate concentrations from 0.201 to 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems for human health.
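
    One common way to obtain the detection and quantification limits quoted in such validations is the ICH convention of 3.3·σ/slope and 10·σ/slope from the calibration regression; the sketch below uses that convention with made-up calibration data and may differ in detail from the authors' actual procedure.

```python
import numpy as np

# Hypothetical DPV calibration: analyte concentration (mg/L) vs peak current (µA).
conc    = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
current = np.array([0.101, 0.198, 0.405, 0.810, 1.615])

slope, intercept = np.polyfit(conc, current, 1)
residuals = current - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # standard error of the regression

lod = 3.3 * sigma / slope              # limit of detection
loq = 10.0 * sigma / slope             # limit of quantification
r = np.corrcoef(conc, current)[0, 1]
print(f"r={r:.4f}, LOD={lod:.3f} mg/L, LOQ={loq:.3f} mg/L")
```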

  12. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    Science.gov (United States)

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis М.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

    Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer-automated quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with the newly developed algorithm for automatic quantification of brain atrophy. With the technique of polynomial curve fitting, the mathematical relationship between BPF and age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm³, (1 211 725 ± 122 077) mm³ and (95.3471 ± 2.3453)%, respectively, and those of atrophy subjects were (1 276 900 ± 125 180) mm³, (1 203 400 ± 117 760) mm³ and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was extremely significant (P < 0.05). The expression P(x) = -0.0008x² + 0.0193x + 96.9999 accurately described the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x² + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the mathematical relationship between BPF and age can be used as an objective criterion for automatic quantification of brain atrophy by computer. (authors)
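
    Using the two polynomials quoted in the abstract, the criterion can be expressed directly: compute the patient's BPF and compare it with the age-dependent lower limit of the normal 95% confidence interval. The sketch below assumes the brain and cranial volumes are already available from the segmentation step; the example volumes are hypothetical.

```python
def brain_parenchymal_fraction(brain_volume_mm3, cranial_volume_mm3):
    """BPF in percent: brain volume as a fraction of cranial volume."""
    return 100.0 * brain_volume_mm3 / cranial_volume_mm3

def normal_bpf_lower_limit(age_years):
    """Lower limit of the normal 95% CI reported in the abstract."""
    x = age_years
    return -0.0008 * x**2 + 0.0184 * x + 95.1090

def is_atrophic(brain_volume_mm3, cranial_volume_mm3, age_years):
    bpf = brain_parenchymal_fraction(brain_volume_mm3, cranial_volume_mm3)
    return bpf < normal_bpf_lower_limit(age_years)

# Hypothetical 60-year-old patient.
print(is_atrophic(1_172_000, 1_276_900, 60))
```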

  14. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results in an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content particularly between the root and the shaft of plucked hairs. Also large intra- and inter-individual variations were found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained regarding DNA content and variation in commonly analysed forensic evidence materials and this may guide the forensic scientist as to the best molecular biology approach for analysing various forensic evidence materials.

  15. OTTER, Resolution Style Theorem Prover

    International Nuclear Information System (INIS)

    McCune, W.W.

    2001-01-01

    1 - Description of program or function: OTTER (Other Techniques for Theorem-proving and Effective Research) is a resolution-style theorem-proving program for first-order logic with equality. OTTER includes the inference rules binary resolution, hyper-resolution, UR-resolution, and binary paramodulation. These inference rules take a small set of clauses and infer a clause. If the inferred clause is new and useful, it is stored and may become available for subsequent inferences. Other capabilities are conversion from first-order formulas to clauses, forward and back subsumption, factoring, weighting, answer literals, term ordering, forward and back demodulation, and evaluable functions and predicates. 2 - Method of solution: For its inference process OTTER uses the given-clause algorithm, which can be viewed as a simple implementation of the set-of-support strategy. OTTER maintains three lists of clauses: axioms, sos (set of support), and demodulators. OTTER is not automatic. Even after the user has encoded a problem into first-order logic or into clauses, the user must choose inference rules, set options to control the processing of inferred clauses, and decide which input formulae or clauses are to be in the initial set of support and which, if any, equalities are to be demodulators. If OTTER fails to find a proof, the user may try again with different initial conditions. 3 - Restrictions on the complexity of the problem - Maxima of: 5000 characters in an input string, 64 distinct variables in a clause, 51 characters in any symbol. The maxima can be changed by finding the appropriate definition in the header.h file, increasing the limit, and recompiling OTTER. There are a few constraints on the order of commands
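    The given-clause loop mentioned under "Method of solution" can be outlined in a few lines of Python; the sketch below is a generic skeleton with placeholder hooks for the inference rules, the subsumption test and clause selection, not OTTER's actual implementation.

        def given_clause_loop(axioms, sos, infer, is_redundant, is_empty, max_steps=10000):
            """Generic given-clause saturation loop (set-of-support style).

            axioms       -- usable clauses that are not in the set of support
            sos          -- initial set-of-support clauses
            infer        -- infer(given, usable) -> iterable of derived clauses
            is_redundant -- forward-subsumption test against already kept clauses
            is_empty     -- test for the empty clause (i.e. a refutation)
            """
            usable = list(axioms)
            for _ in range(max_steps):
                if not sos:
                    return "saturated without finding a proof"
                given = sos.pop(0)          # simplest selection: first-in, first-out
                usable.append(given)
                for derived in infer(given, usable):
                    if is_empty(derived):
                        return "proof found"
                    if not is_redundant(derived, usable + sos):
                        sos.append(derived)
            return "resource limit reached"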

  16. Exploring the chemistry of complex samples by tentative identification and semi-quantification: a food contact material case

    DEFF Research Database (Denmark)

    Pieke, Eelco Nicolaas; Smedsgaard, Jørn; Granby, Kit

    2017-01-01

    ... elucidation of a vast number of unknowns, of which only a fraction may be relevant. Here, we present an exploration and prioritization approach based on high resolution mass spectrometry. The method uses algorithm-based precursor/product-ion correlations on Quadrupole-Time of Flight (Q-TOF) MS/MS data to retrieve the most likely chemical match from a structure database. In addition, TOF-only data is used to estimate analyte concentration via semi-quantification. The method is demonstrated in recycled paper food contact material (FCM). Here, 585 chromatographic peaks were discovered, of which 117 were ... Overall, the described method is a valuable chemical exploration tool for non-identified substances, but also may be used as a preliminary prioritization tool for substances expected to have the highest health impact, for example in FCMs.

  17. Box graphs and resolutions I

    Directory of Open Access Journals (Sweden)

    Andreas P. Braun

    2016-04-01

    Box graphs succinctly and comprehensively characterize singular fibers of elliptic fibrations in codimension two and three, as well as flop transitions connecting these, in terms of representation theoretic data. We develop a framework that provides a systematic map between a box graph and a crepant algebraic resolution of the singular elliptic fibration, thus allowing an explicit construction of the fibers from a singular Weierstrass or Tate model. The key tool is what we call a fiber face diagram, which shows the relevant information of a (partial) toric triangulation and allows the inclusion of more general algebraic blowups. We show that each such diagram defines a sequence of weighted algebraic blowups, thus providing a realization of the fiber defined by the box graph in terms of an explicit resolution. We show this correspondence explicitly for the case of SU(5) by providing a map between box graphs and fiber faces, and thereby a sequence of algebraic resolutions of the Tate model, which realizes each of the box graphs.

  18. Patient motion effects on the quantification of regional myocardial blood flow with dynamic PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Hunter, Chad R. R. N.; Kemp, Robert A. de, E-mail: RAdeKemp@ottawaheart.ca [Physics Department, Room 3302 Herzberg Laboratories, Carleton University, 1125 Colonel By Drive, Ottawa, Ontario K1S 5B6, Canada and Cardiac Imaging, University of Ottawa Heart Institute, 40 Ruskin Street, Ottawa, Ontario K1Y 4W7 (Canada); Klein, Ran [Department of Nuclear Medicine, Ottawa Hospital, Civic Campus, 1053 Carling Avenue, Ottawa, Ontario K1Y 4E9 (Canada); Beanlands, Rob S. [Cardiac Imaging, University of Ottawa Heart Institute, 40 Ruskin Street, Ottawa, Ontario K1Y 4W7 (Canada)

    2016-04-15

    Purpose: Patient motion is a common problem during dynamic positron emission tomography (PET) scans for quantification of myocardial blood flow (MBF). The purpose of this study was to quantify the prevalence of body motion in a clinical setting and evaluate with realistic phantoms the effects of motion on blood flow quantification, including CT attenuation correction (CTAC) artifacts that result from PET–CT misalignment. Methods: A cohort of 236 sequential patients was analyzed for patient motion under resting and peak stress conditions by two independent observers. The presence of motion, affected time-frames, and direction of motion was recorded; discrepancy between observers was resolved by consensus review. Based on these results, patient body motion effects on MBF quantification were characterized using the digital NURBS-based cardiac-torso phantom, with characteristic time activity curves (TACs) assigned to the heart wall (myocardium) and blood regions. Simulated projection data were corrected for attenuation and reconstructed using filtered back-projection. All simulations were performed without noise added, and a single CT image was used for attenuation correction and aligned to the early- or late-frame PET images. Results: In the patient cohort, mild motion of 0.5 ± 0.1 cm occurred in 24% and moderate motion of 1.0 ± 0.3 cm occurred in 38% of patients. Motion in the superior/inferior direction accounted for 45% of all detected motion, with 30% in the superior direction. Anterior/posterior motion was predominant (29%) in the posterior direction. Left/right motion occurred in 24% of cases, with similar proportions in the left and right directions. Computer simulation studies indicated that errors in MBF can approach 500% for scans with severe patient motion (up to 2 cm). The largest errors occurred when the heart wall was shifted left toward the adjacent lung region, resulting in a severe undercorrection for attenuation of the heart wall. Simulations

  19. Patient motion effects on the quantification of regional myocardial blood flow with dynamic PET imaging

    International Nuclear Information System (INIS)

    Hunter, Chad R. R. N.; Kemp, Robert A. de; Klein, Ran; Beanlands, Rob S.

    2016-01-01

    Purpose: Patient motion is a common problem during dynamic positron emission tomography (PET) scans for quantification of myocardial blood flow (MBF). The purpose of this study was to quantify the prevalence of body motion in a clinical setting and evaluate with realistic phantoms the effects of motion on blood flow quantification, including CT attenuation correction (CTAC) artifacts that result from PET–CT misalignment. Methods: A cohort of 236 sequential patients was analyzed for patient motion under resting and peak stress conditions by two independent observers. The presence of motion, affected time-frames, and direction of motion was recorded; discrepancy between observers was resolved by consensus review. Based on these results, patient body motion effects on MBF quantification were characterized using the digital NURBS-based cardiac-torso phantom, with characteristic time activity curves (TACs) assigned to the heart wall (myocardium) and blood regions. Simulated projection data were corrected for attenuation and reconstructed using filtered back-projection. All simulations were performed without noise added, and a single CT image was used for attenuation correction and aligned to the early- or late-frame PET images. Results: In the patient cohort, mild motion of 0.5 ± 0.1 cm occurred in 24% and moderate motion of 1.0 ± 0.3 cm occurred in 38% of patients. Motion in the superior/inferior direction accounted for 45% of all detected motion, with 30% in the superior direction. Anterior/posterior motion was predominant (29%) in the posterior direction. Left/right motion occurred in 24% of cases, with similar proportions in the left and right directions. Computer simulation studies indicated that errors in MBF can approach 500% for scans with severe patient motion (up to 2 cm). The largest errors occurred when the heart wall was shifted left toward the adjacent lung region, resulting in a severe undercorrection for attenuation of the heart wall. Simulations

  20. Patient motion effects on the quantification of regional myocardial blood flow with dynamic PET imaging.

    Science.gov (United States)

    Hunter, Chad R R N; Klein, Ran; Beanlands, Rob S; deKemp, Robert A

    2016-04-01

    Patient motion is a common problem during dynamic positron emission tomography (PET) scans for quantification of myocardial blood flow (MBF). The purpose of this study was to quantify the prevalence of body motion in a clinical setting and evaluate with realistic phantoms the effects of motion on blood flow quantification, including CT attenuation correction (CTAC) artifacts that result from PET-CT misalignment. A cohort of 236 sequential patients was analyzed for patient motion under resting and peak stress conditions by two independent observers. The presence of motion, affected time-frames, and direction of motion was recorded; discrepancy between observers was resolved by consensus review. Based on these results, patient body motion effects on MBF quantification were characterized using the digital NURBS-based cardiac-torso phantom, with characteristic time activity curves (TACs) assigned to the heart wall (myocardium) and blood regions. Simulated projection data were corrected for attenuation and reconstructed using filtered back-projection. All simulations were performed without noise added, and a single CT image was used for attenuation correction and aligned to the early- or late-frame PET images. In the patient cohort, mild motion of 0.5 ± 0.1 cm occurred in 24% and moderate motion of 1.0 ± 0.3 cm occurred in 38% of patients. Motion in the superior/inferior direction accounted for 45% of all detected motion, with 30% in the superior direction. Anterior/posterior motion was predominant (29%) in the posterior direction. Left/right motion occurred in 24% of cases, with similar proportions in the left and right directions. Computer simulation studies indicated that errors in MBF can approach 500% for scans with severe patient motion (up to 2 cm). The largest errors occurred when the heart wall was shifted left toward the adjacent lung region, resulting in a severe undercorrection for attenuation of the heart wall. Simulations also indicated that the

  1. A novel super-resolution camera model

    Science.gov (United States)

    Shao, Xiaopeng; Wang, Yi; Xu, Jie; Wang, Lin; Liu, Fei; Luo, Qiuhua; Chen, Xiaodong; Bi, Xiangli

    2015-05-01

    Aiming to realize super-resolution (SR) reconstruction of single images and video, a super-resolution camera model is proposed to address the comparatively low resolution of images obtained by traditional cameras. To achieve this, a driving device such as a piezoelectric ceramic actuator is placed in the camera. By controlling the driving device, a set of continuous low-resolution (LR) images can be obtained and stored in real time, which reflects the randomness of the displacements and the real-time performance of the storage. The low-resolution image sequences carry different redundant information and particular prior information, so a super-resolution image can be restored faithfully and effectively. A sampling argument is used to derive the reconstruction principle of super resolution and to analyze the theoretically achievable improvement in resolution. A learning-based super-resolution algorithm is used to reconstruct single images, and a variational Bayesian algorithm is simulated to reconstruct the low-resolution images with random displacements; it models the unknown high-resolution image, the motion parameters and the unknown model parameters in one hierarchical Bayesian framework. Using sub-pixel registration, a super-resolution image of the scene can be reconstructed. Reconstruction results from 16 images show that this camera model can increase the image resolution by a factor of 2, obtaining higher-resolution images with currently available hardware.
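    A hypothetical baseline for fusing displaced low-resolution frames is shift-and-add reconstruction, sketched below in Python/NumPy; the known sub-pixel shifts are an assumption, and the learning-based and variational Bayesian reconstructions used in the paper are considerably more sophisticated.

        import numpy as np

        def shift_and_add_sr(lr_frames, shifts, factor=2):
            """Fuse registered low-resolution frames onto a finer grid.

            lr_frames -- list of 2-D arrays of identical shape
            shifts    -- list of (dy, dx) sub-pixel offsets in LR pixel units
            """
            h, w = lr_frames[0].shape
            acc = np.zeros((h * factor, w * factor))
            weight = np.zeros_like(acc)
            for frame, (dy, dx) in zip(lr_frames, shifts):
                # Map every LR sample to its registered position on the HR grid.
                ys = np.round((np.arange(h)[:, None] + dy) * factor).astype(int)
                xs = np.round((np.arange(w)[None, :] + dx) * factor).astype(int)
                yi, xi = np.broadcast_arrays(ys, xs)
                yi = np.clip(yi, 0, h * factor - 1)
                xi = np.clip(xi, 0, w * factor - 1)
                np.add.at(acc, (yi, xi), frame)
                np.add.at(weight, (yi, xi), 1.0)
            # Average the accumulated samples; unfilled HR pixels stay zero here,
            # whereas a real method would interpolate or regularize them.
            return np.where(weight > 0, acc / np.maximum(weight, 1e-12), 0.0)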

  2. High-resolution intravital microscopy.

    Directory of Open Access Journals (Sweden)

    Volker Andresen

    Cellular communication constitutes a fundamental mechanism of life, for instance by permitting transfer of information through synapses in the nervous system and by leading to activation of cells during the course of immune responses. Monitoring cell-cell interactions within living adult organisms is crucial in order to draw conclusions on their behavior with respect to the fate of cells, tissues and organs. Until now, there is no technology available that enables dynamic imaging deep within the tissue of living adult organisms at sub-cellular resolution, i.e. detection at the level of few protein molecules. Here we present a novel approach called multi-beam striped-illumination which applies for the first time the principle and advantages of structured-illumination, spatial modulation of the excitation pattern, to laser-scanning-microscopy. We use this approach in two-photon-microscopy--the most adequate optical deep-tissue imaging-technique. As compared to standard two-photon-microscopy, it achieves significant contrast enhancement and up to 3-fold improved axial resolution (optical sectioning) while photobleaching, photodamage and acquisition speed are similar. Its imaging depth is comparable to multifocal two-photon-microscopy and only slightly less than in standard single-beam two-photon-microscopy. Precisely, our studies within mouse lymph nodes demonstrated 216% improved axial and 23% improved lateral resolutions at a depth of 80 µm below the surface. Thus, we are for the first time able to visualize the dynamic interactions between B cells and immune complex deposits on follicular dendritic cells within germinal centers (GCs) of live mice. These interactions play a decisive role in the process of clonal selection, leading to affinity maturation of the humoral immune response. This novel high-resolution intravital microscopy method has a huge potential for numerous applications in neurosciences, immunology, cancer research and

  3. High-Resolution Intravital Microscopy

    Science.gov (United States)

    Andresen, Volker; Pollok, Karolin; Rinnenthal, Jan-Leo; Oehme, Laura; Günther, Robert; Spiecker, Heinrich; Radbruch, Helena; Gerhard, Jenny; Sporbert, Anje; Cseresnyes, Zoltan; Hauser, Anja E.; Niesner, Raluca

    2012-01-01

    Cellular communication constitutes a fundamental mechanism of life, for instance by permitting transfer of information through synapses in the nervous system and by leading to activation of cells during the course of immune responses. Monitoring cell-cell interactions within living adult organisms is crucial in order to draw conclusions on their behavior with respect to the fate of cells, tissues and organs. Until now, there is no technology available that enables dynamic imaging deep within the tissue of living adult organisms at sub-cellular resolution, i.e. detection at the level of few protein molecules. Here we present a novel approach called multi-beam striped-illumination which applies for the first time the principle and advantages of structured-illumination, spatial modulation of the excitation pattern, to laser-scanning-microscopy. We use this approach in two-photon-microscopy - the most adequate optical deep-tissue imaging-technique. As compared to standard two-photon-microscopy, it achieves significant contrast enhancement and up to 3-fold improved axial resolution (optical sectioning) while photobleaching, photodamage and acquisition speed are similar. Its imaging depth is comparable to multifocal two-photon-microscopy and only slightly less than in standard single-beam two-photon-microscopy. Precisely, our studies within mouse lymph nodes demonstrated 216% improved axial and 23% improved lateral resolutions at a depth of 80 µm below the surface. Thus, we are for the first time able to visualize the dynamic interactions between B cells and immune complex deposits on follicular dendritic cells within germinal centers (GCs) of live mice. These interactions play a decisive role in the process of clonal selection, leading to affinity maturation of the humoral immune response. This novel high-resolution intravital microscopy method has a huge potential for numerous applications in neurosciences, immunology, cancer research and developmental biology

  4. A multifractal approach to space-filling recovery for PET quantification

    Energy Technology Data Exchange (ETDEWEB)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O. [Comprehensive Cancer Imaging Centre, Imperial College London, Hammersmith Hospital, London W12 0NN (United Kingdom); Tsoumpas, Charalampos [Division of Medical Physics, University of Leeds, LS2 9JT (United Kingdom); Turkheimer, Federico E. [Department of Neuroimaging, Institute of Psychiatry, King’s College London, London SE5 8AF (United Kingdom)

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic ¹⁸F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical ¹⁸F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.
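    The space-filling idea can be conveyed by a plain box-counting estimate of the fractal dimension of a binary lesion mask, sketched below in Python; the multifractal index used in the study operates on grey-level PET data and differs in detail.

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16)):
            """Estimate the fractal dimension of a 2-D binary mask (assumed non-empty)."""
            counts = []
            for s in sizes:
                h, w = mask.shape
                trimmed = mask[: h - h % s, : w - w % s]          # make the box grid fit exactly
                boxes = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
                counts.append(boxes.any(axis=(1, 3)).sum())       # occupied boxes at this scale
            # Fit log(count) ~ -D * log(size); the negative slope is the dimension D.
            slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
            return -slope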

  5. Pore space quantification of carbonate rocks before-after supercritical CO2 interaction by optical image analysis

    Science.gov (United States)

    Berrezueta, Edgar; José Domínguez-Cuesta, María

    2017-04-01

    The aim of this research is to show an experimental application of an automated quantification process of optical porosity in thin sections. Petrographic studies using scanning electron microscopy, optical microscopy (OpM) and optical image analysis (OIA) could provide a reproducible pore characterization of carbonate rocks in applications related to the geological storage of CO2. This research is focused on i) the quantification of optical pores in a carbonate rock before and after interaction with supercritical CO2-rich brine (P ≈ 7.5 MPa and T ≈ 35 °C) and ii) the description of the process followed to guarantee the reproducibility of the OIA method on images acquired with a high-resolution scanner. Mineral images were acquired from thin sections using a high-resolution scanner (HRS). Digital images were geo-referenced using a geographic information system to ensure correct spatial correlation and superposition. The optical measures of porosity by image analysis on the carbonate thin sections showed an effective pore segmentation considering different cross-polarized light conditions (90°/0°; 120°/30°) and plane-polarized light conditions (90°/-) of the same petrographic scene. The pore characterization by OpM and OIA-HRS has allowed a preliminary approximation of pore evolution in carbonate rocks under supercritical CO2-rich brine. This study shows a fast, effective and reproducible methodology that allowed a preliminary characterization (changes in the pore network) of the samples studied. The procedure carried out could be applied to similar experimental injection tests.

  6. Detection and quantification of microparticles from different cellular lineages using flow cytometry. Evaluation of the impact of secreted phospholipase A2 on microparticle assessment.

    Science.gov (United States)

    Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric

    2015-01-01

    Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineages and are retrieved in biological fluids, microparticles appear as potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 group IIA, V and X, on microparticle quantification. We observed that if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.

  7. Identification and quantification of the hydrological impacts of imperviousness in urban catchments: a review.

    Science.gov (United States)

    Jacobson, Carol R

    2011-06-01

    Urbanisation produces numerous changes in the natural environments it replaces. The impacts include habitat fragmentation and changes to both the quality and quantity of the stormwater runoff, and result in changes to hydrological systems. This review integrates research in relatively diverse areas to examine how the impacts of urban imperviousness on hydrological systems can be quantified and modelled. It examines the nature of reported impacts of urbanisation on hydrological systems over four decades, including the effects of changes in imperviousness within catchments, and some inconsistencies in studies of the impacts of urbanisation. The distribution of imperviousness within urban areas is important in understanding the impacts of urbanisation and quantification requires detailed characterisation of urban areas. As a result most mapping of urban areas uses remote sensing techniques and this review examines a range of techniques using medium and high resolution imagery, including spectral unmixing. The third section examines the ways in which scientists and hydrological and environmental engineers model and quantify water flows in urban areas, the nature of hydrological models and methods for their calibration. The final section examines additional factors which influence the impact of impervious surfaces and some uncertainties that exist in current knowledge. Copyright © 2011 Elsevier Ltd. All rights reserved.
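    One of the remote sensing techniques mentioned above, spectral unmixing, can be illustrated by a least-squares sketch in Python; the endmember spectra, band count and abundance normalisation below are placeholder choices, not taken from any specific study in the review.

        import numpy as np

        def unmix(pixel, endmembers):
            """Least-squares abundance estimate per endmember, clipped and renormalised."""
            abundances, *_ = np.linalg.lstsq(endmembers.T, pixel, rcond=None)
            abundances = np.clip(abundances, 0.0, None)
            total = abundances.sum()
            return abundances / total if total > 0 else abundances

        # Placeholder endmembers: impervious, vegetation and soil spectra over 6 bands.
        endmembers = np.random.rand(3, 6)
        pixel = 0.7 * endmembers[0] + 0.3 * endmembers[1]
        print(unmix(pixel, endmembers))   # approximately [0.7, 0.3, 0.0]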

  8. Quantification of the activity of biomolecules in microarrays obtained by direct laser transfer.

    Science.gov (United States)

    Dinca, V; Ranella, A; Farsari, M; Kafetzopoulos, D; Dinescu, M; Popescu, A; Fotakis, C

    2008-10-01

    The direct-writing technique laser-induced forward transfer has been employed for the micro-array printing of liquid solutions of the enzyme horseradish peroxidase and the protein Titin on nitrocellulose solid surfaces. The effect of two UV laser pulse lengths, femtosecond and nanosecond, has been studied in relation to maintaining the activity of the transferred biomolecules. The quantification of the active biomolecules after transfer has been carried out using the Bradford assay, a quantitative colorimetric enzymatic assay and fluorescence techniques. Spectrophotometric measurements of the HRP and Titin activity as well as chromogenic and fluorescence assay studies have revealed a connection between the properties of the deposited, biologically active biomolecules, the experimental conditions and the target composition. The bioassays have shown that up to 78% of the biomolecules remained active after femtosecond laser transfer, while this value was reduced to 54% after nanosecond laser transfer. The addition of glycerol at percentages up to 70% in the solution to be transferred contributed to the stabilization of the micro-array patterns and the increase of their resolution.

  9. Quantification of egg proteome changes during fertilization in sterlet Acipenser ruthenus.

    Science.gov (United States)

    Niksirat, Hamid; Andersson, Liselotte; Golpour, Amin; Chupani, Latifeh; James, Peter

    2017-08-19

    Eggs of sterlet are discharged into the ambient aquatic environment, where egg activation and fertilization occur. The effects of different activation media, including freshwater and clay suspension, on egg protein abundances were quantified in sterlet Acipenser ruthenus. In-gel digestion and high-resolution mass spectrometry were used for label-free protein quantification in the eggs of five females. No significant (p > 0.05) difference was found between protein abundances in eggs activated with different media. However, the results showed significant differences relative to eggs used as control. The fact that the abundance of proteasome subunit alpha was significantly reduced only in eggs activated by clay suspension suggests that the activation medium can somehow intervene in protein regulation during fertilization. In conclusion, external fertilization in sturgeon eggs is accompanied by a substantial release of proteins into the external environment that may participate in the construction of a transient microenvironment around the egg for attraction and protection of spermatozoa to ensure the ensuing fertilization. Data are available via ProteomeXchange with identifier PXD006232. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Kinetics of Reactive Fronts in Porous Media: quantification through a laboratory experiment

    Science.gov (United States)

    De Anna, P.; Jimenez-Martinez, J.; Turuban, R.; Tabuteau, H.; Derrien, M.; Le Borgne, T.; Meheust, Y.

    2013-12-01

    The kinetics of reaction fronts in heterogeneous flows is tightly linked to the mixing dynamics governed by the combined action of stretching, diffusion and dispersion. Focusing on porous media flows, with a new experimental setup we show that the invading solute is organized into stretched lamellae, whose deformation and coalescence control the effective reaction kinetics of the mixing-limited bimolecular reaction A + B --> C. While classic advection-dispersion theory predicts a scaling of the cumulative product mass of C as t^(0.5), we observe two distinct kinetic regimes, one characterized by the stretching and the other by the coalescence of the invading lamellae, in which the mass of C scales faster than t^(0.5). The proposed experimental setup allows for direct quantification of mixing and reactive transport in porous media with high spatial resolution, at the pore scale. The analogous two-dimensional porous medium consists of a Hele-Shaw cell containing a single layer of cylindrical solid grains built by soft lithography. On the one hand, the measurement of the local, intra-pore, conservative concentration field is done using a fluorescent tracer. On the other hand, considering a fast bimolecular advection-dispersion reaction A + B --> C occurring as A displaces B, we quantify the reaction kinetics from the spatially resolved measurement of the pore-scale reaction rate, using a chemiluminescent reaction.

  11. Applying photoacoustics to quantification of melanin concentration in retinal pigment epithelium (Conference Presentation)

    Science.gov (United States)

    Shu, Xiao; Zhang, Hao F.; Liu, Wenzhong

    2016-03-01

    The melanin in the retinal pigment epithelium (RPE) protects the retina and other ocular tissues by photo-screening and by acting as an antioxidant and free radical scavenger. It helps maintain normal visual function since the human eye is subjected to lifelong high oxygen stress and photon exposure. Loss of RPE melanin weakens this protection mechanism and jeopardizes ocular health. A local decrease in RPE melanin concentration is believed to be both a cause and a sign of early-stage age-related macular degeneration (AMD), the leading blinding disease in the developed world. Current technology cannot quantitatively measure the RPE melanin concentration, which might be a promising marker in early AMD screening. Photoacoustic ophthalmoscopy (PAOM), as an emerging optical absorption-based imaging technology, can potentially be applied to measure the RPE melanin concentration if the dependence of the detectable photoacoustic (PA) signal amplitudes on the RPE melanin concentration is verified. In this study, we tested the feasibility of using the PA signal ratio between RPE melanin and nearby retinal blood vessels as an indicator of RPE melanin variation. A novel whole-eye optical model was designed and Monte Carlo modeling of light (MCML) was employed. We examined the influences on quantification of the PAOM axial resolution, the depth and diameter of the retinal blood vessel, and the RPE thickness. The results show that the scheme is robust to individual histological and illumination variations. This study suggests that PAOM is capable of quantitatively measuring the RPE melanin concentration in vivo.

  12. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvement in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
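    The control-variate idea behind such multi-fidelity estimators can be sketched as follows in Python; the model functions, sample sizes and input distribution are illustrative assumptions, not the PSAAP II solver.

        import numpy as np

        def multifidelity_mean(hf_model, lf_model, n_hf, n_lf, seed=0):
            """Control-variate estimate of E[hf_model(X)] helped by a cheap low-fidelity model."""
            rng = np.random.default_rng(seed)
            x_paired = rng.standard_normal(n_hf)               # shared inputs for paired HF/LF runs
            hf = hf_model(x_paired)
            lf = lf_model(x_paired)
            lf_extra = lf_model(rng.standard_normal(n_lf))     # extra cheap LF-only samples
            alpha = np.cov(hf, lf)[0, 1] / np.var(lf, ddof=1)  # control-variate weight
            lf_mean_all = np.concatenate([lf, lf_extra]).mean()
            return hf.mean() + alpha * (lf_mean_all - lf.mean())

        # Made-up models: the HF response adds a small nonlinear correction to the LF one.
        print(multifidelity_mean(lambda x: np.sin(x) + 0.1 * x**2, np.sin, n_hf=50, n_lf=5000))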

  13. Capillary gel electrophoresis for the quantification and purity determination of recombinant proteins in inclusion bodies.

    Science.gov (United States)

    Espinosa-de la Garza, Carlos E; Perdomo-Abúndez, Francisco C; Campos-García, Víctor R; Pérez, Néstor O; Flores-Ortiz, Luis F; Medina-Rivero, Emilio

    2013-09-01

    In this work, a high-resolution CGE method for quantification and purity determination of recombinant proteins was developed, involving a single-component inclusion bodies (IBs) solubilization solution. Different recombinant proteins expressed as IBs were used to show the method's capabilities, using recombinant interferon-β 1b as the model protein for method validation. Method linearity was verified in the range from 0.05 to 0.40 mg/mL and a determination coefficient (r²) of 0.99 was obtained. The LOQ and LOD were 0.018 and 0.006 mg/mL, respectively. The RSD for the protein content repeatability test was 2.29%. In addition, the RSD for the protein purity repeatability test was 4.24%. Method accuracy was higher than 90%. Specificity was confirmed, as the method was able to separate the recombinant interferon-β 1b monomer from other aggregates and impurities. Sample content and purity were demonstrated to be stable for up to 48 h. Overall, this method is suitable for the analysis of recombinant proteins in IBs according to the attributes established in the International Conference on Harmonisation guidelines. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantification of collagen distributions in rat hyaline and fibro cartilages based on second harmonic generation imaging

    Science.gov (United States)

    Zhu, Xiaoqin; Liao, Chenxi; Wang, Zhenyu; Zhuo, Shuangmu; Liu, Wenge; Chen, Jianxin

    2016-10-01

    Hyaline cartilage is a semitransparent tissue composed of proteoglycan and thicker type II collagen fibers, while fibrocartilage contains large bundles of type I collagen in addition to other territorial matrix components and chondrocytes. It is reported that the meniscus (fibrocartilage) has a greater capacity to regenerate and close a wound compared to articular cartilage (hyaline cartilage), and fibrocartilage often replaces the type II collagen-rich hyaline cartilage following trauma, leading to scar tissue composed of rigid type I collagen. The visualization and quantification of the collagen fibrillar meshwork is important for understanding the role of fibril reorganization during the healing process and how different types of cartilage contribute to wound closure. In this study, second harmonic generation (SHG) microscopy was applied to image articular and meniscus cartilage, and textural analyses were developed to quantify the collagen distribution. High-resolution images were achieved based on the SHG signal from collagen within fresh specimens, and detailed observations of tissue morphology and microstructural distribution were obtained without shrinkage or distortion. Textural analysis of SHG images confirmed that collagen in fibrocartilage is significantly coarser than collagen in hyaline cartilage (p < 0.01). Our results show that each type of cartilage has different structural features, which may significantly contribute to pathology when damaged. Our findings demonstrate that SHG microscopy holds potential as a clinically relevant diagnostic tool for imaging degenerative tissues or assessing wound repair following cartilage injury.

  15. Quantification and sensory studies of character impact odorants of different soybean lecithins.

    Science.gov (United States)

    Stephan, A; Steinhart, H

    1999-10-01

    Fifty-four potent odorants in standardized, hydrolyzed, and deoiled and hydrolyzed soybean lecithins were quantified by high-resolution gas chromatography/mass spectrometry (HRGC/MS). The characterization of their aroma impact was performed by calculation of nasal (n) and retronasal (r) odor activity values (OAVs). For this, the nasal and retronasal recognition thresholds of 18 odor-active compounds were determined in vegetable oil. The following compounds showed the highest nOAVs: 2,3-diethyl-5-methylpyrazine, methylpropanal, acetic acid, pentanoic acid, 2-ethyl-3,5-dimethylpyrazine, pentylpyridine, (Z)-1,5-octadien-3-one, 2-methylbutanal, and beta-damascenone. In addition to the compounds above, 1-octen-3-one, 1-nonen-3-one, and 3-methyl-2,4-nonandione showed potent rOAVs. The results of quantification and OAV calculation were confirmed by a model mixture of 25 impact odorants, which yielded a highly similar sensory profile to that of the original soybean lecithin. The sensory importance of pyrazines and free acids increased through enzymatic hydrolysis and decreased by the process of deoiling. The impact of unsaturated ketones on the lecithin aroma was not changed by either process.
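    The odor activity value used above is simply the ratio of a compound's concentration to its odor threshold in the same matrix; the Python sketch below uses hypothetical numbers, not the quantification data of the paper.

        def odor_activity_value(concentration, threshold):
            """OAV = concentration / odor threshold (same units, e.g. µg/kg oil)."""
            return concentration / threshold

        # Hypothetical concentrations and nasal thresholds in µg/kg oil.
        compounds = {
            "2,3-diethyl-5-methylpyrazine": (12.0, 0.5),
            "beta-damascenone": (3.0, 0.8),
        }
        for name, (conc, thr) in compounds.items():
            print(name, round(odor_activity_value(conc, thr), 1))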

  16. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    Science.gov (United States)

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

    Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids together with a battery of related non-polar and low-molecular-mass compounds may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to differing compositions of the analytical matrix and interfering compounds, and therefore proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that have recently been applied for quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low-resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, mainly based on experimental papers published within the last two years, in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning determination of hopanoids and related low-molecular-mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves

  17. Final Report. Evaluating the Climate Sensitivity of Dissipative Subgrid-Scale Mixing Processes and Variable Resolution in NCAR's Community Earth System Model

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-12-14

    The goals of this project were to (1) assess and quantify the sensitivity and scale-dependency of unresolved subgrid-scale mixing processes in NCAR’s Community Earth System Model (CESM), and (2) to improve the accuracy and skill of forthcoming CESM configurations on modern cubed-sphere and variable-resolution computational grids. The research thereby contributed to the description and quantification of uncertainties in CESM’s dynamical cores and their physics-dynamics interactions.

  18. Ultra-high resolution AMOLED

    Science.gov (United States)

    Wacyk, Ihor; Prache, Olivier; Ghosh, Amal

    2011-06-01

    AMOLED microdisplays continue to show improvement in resolution and optical performance, enhancing their appeal for a broad range of near-eye applications such as night vision, simulation and training, situational awareness, augmented reality, medical imaging, and mobile video entertainment and gaming. eMagin's latest development of an HDTV+ resolution technology integrates an OLED pixel of 3.2 × 9.6 microns in size on a 0.18 micron CMOS backplane to deliver significant new functionality as well as the capability to implement a 1920×1200 microdisplay in a 0.86" diagonal area. In addition to the conventional matrix addressing circuitry, the HDTV+ display includes a very low-power, low-voltage differential signaling (LVDS) serialized interface to minimize cable and connector size as well as electromagnetic emissions (EMI), an on-chip set of look-up tables for digital gamma correction, and a novel pulse-width modulation (PWM) scheme that together with the standard analog control provides a total dimming range of 0.05 cd/m² to 2000 cd/m² in the monochrome version. The PWM function also enables an impulse-drive mode of operation that significantly reduces motion artifacts in high-speed scene changes. An internal 10-bit DAC ensures that a full 256 gamma-corrected gray levels are available across the entire dimming range, resulting in a measured dynamic range exceeding 20 bits. This device has been successfully tested for operation at frame rates ranging from 30 Hz up to 85 Hz. This paper describes the operational features and detailed optical and electrical test results for the new AMOLED WUXGA resolution microdisplay.

  19. Interstellar scattering and resolution limitations

    International Nuclear Information System (INIS)

    Dennison, B.

    1987-01-01

    Density irregularities in both the interplanetary medium and the ionized component of the interstellar medium scatter radio waves, resulting in limitations on the achievable resolution. Interplanetary scattering (IPS) is weak for most observational situations, and in principle the resulting phase corruption can be corrected for when observing with sufficiently many array elements. Interstellar scattering (ISS), on the other hand, is usually strong at frequencies below about 8 GHz, in which case intrinsic structure information over a range of angular scales is irretrievably lost. With the earth-space baselines now planned, it will be possible to search directly for interstellar refraction, which is suspected of modulating the fluxes of background sources. 14 references

  20. Salivary Cytoprotective Proteins in Inflammation and Resolution during Experimental Gingivitis--A Pilot Study.

    Science.gov (United States)

    Aboodi, Guy M; Sima, Corneliu; Moffa, Eduardo B; Crosara, Karla T B; Xiao, Yizhi; Siqueira, Walter L; Glogauer, Michael

    2015-01-01

    The protective mechanisms that maintain periodontal homeostasis in gingivitis and prevent periodontal tissue destruction are poorly understood. The aim of this study was to identify changes in the salivary proteome during experimental gingivitis. We used oral neutrophil quantification and whole saliva (WS) proteomics to assess changes that occur in the inflammatory and resolution phases of gingivitis in healthy individuals. Oral neutrophils and WS samples were collected and clinical parameters measured on days 0, 7, 14, 21, 28, and 35. Increased oral neutrophil recruitment and salivary cytoprotective proteins increased progressively during inflammation and decreased in resolution. Oral neutrophil numbers in gingival inflammation and resolution correlated moderately with salivary β-globin, thioredoxin, and albumin and strongly with collagen alpha-1 and G-protein coupled receptor 98. Our results indicate that changes in salivary cytoprotective proteins in gingivitis are associated with a similar trend in oral neutrophil recruitment and clinical parameters. We found moderate to strong correlations between oral neutrophil numbers and levels of several salivary cytoprotective proteins both in the development of the inflammation and in the resolution of gingivitis. Our proteomics approach identified and relatively quantified specific cytoprotective proteins in this pilot study of experimental gingivitis; however, future and more comprehensive studies are needed to clearly identify and validate those protein biomarkers when gingivitis is active.

  1. Salivary Cytoprotective Proteins in Inflammation and Resolution during Experimental Gingivitis—A Pilot Study

    Science.gov (United States)

    Aboodi, Guy M.; Sima, Corneliu; Moffa, Eduardo B.; Crosara, Karla T. B.; Xiao, Yizhi; Siqueira, Walter L.; Glogauer, Michael

    2016-01-01

    Objective: The protective mechanisms that maintain periodontal homeostasis in gingivitis and prevent periodontal tissue destruction are poorly understood. The aim of this study was to identify changes in the salivary proteome during experimental gingivitis. Study design: We used oral neutrophil quantification and whole saliva (WS) proteomics to assess changes that occur in the inflammatory and resolution phases of gingivitis in healthy individuals. Oral neutrophils and WS samples were collected and clinical parameters measured on days 0, 7, 14, 21, 28, and 35. Results: Increased oral neutrophil recruitment and salivary cytoprotective proteins increased progressively during inflammation and decreased in resolution. Oral neutrophil numbers in gingival inflammation and resolution correlated moderately with salivary β-globin, thioredoxin, and albumin and strongly with collagen alpha-1 and G-protein coupled receptor 98. Conclusions: Our results indicate that changes in salivary cytoprotective proteins in gingivitis are associated with a similar trend in oral neutrophil recruitment and clinical parameters. Clinical relevance: We found moderate to strong correlations between oral neutrophil numbers and levels of several salivary cytoprotective proteins both in the development of the inflammation and in the resolution of gingivitis. Our proteomics approach identified and relatively quantified specific cytoprotective proteins in this pilot study of experimental gingivitis; however, future and more comprehensive studies are needed to clearly identify and validate those protein biomarkers when gingivitis is active. PMID:26779447

  2. On Radar Resolution in Coherent Change Detection.

    Energy Technology Data Exchange (ETDEWEB)

    Bickel, Douglas L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    It is commonly observed that resolution plays a role in coherent change detection, yet the relationship between resolution and coherent change detection performance has not been formally defined. In this document, we present an analytical method of evaluating this relationship using detection theory. Specifically, we examine the effect of resolution on receiver operating characteristic curves for coherent change detection.
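    A generic way to make this concrete is to compute the sample coherence of two complex SAR image windows and sweep a change-detection threshold to trace a receiver operating characteristic curve, as in the Python sketch below; this illustrates the detection-theory setting only and is not the report's analytical method.

        import numpy as np

        def sample_coherence(img1, img2):
            """Magnitude of the complex correlation coefficient over a pixel window."""
            num = np.abs(np.sum(img1 * np.conj(img2)))
            den = np.sqrt(np.sum(np.abs(img1) ** 2) * np.sum(np.abs(img2) ** 2))
            return num / den

        def roc_points(coh_changed, coh_unchanged, thresholds):
            """Probability of detection vs false alarm for 'declare change when coherence < t'."""
            return [(np.mean(coh_unchanged < t), np.mean(coh_changed < t)) for t in thresholds]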

  3. Likelihood Ratio Based Mixed Resolution Facial Comparison

    NARCIS (Netherlands)

    Peng, Y.; Spreeuwers, Lieuwe Jan; Veldhuis, Raymond N.J.

    2015-01-01

    In this paper, we propose a novel method for low-resolution face recognition. It is especially useful for a common situation in forensic search where faces of low resolution, e.g. on surveillance footage or in a crowd, must be compared to a high-resolution reference. This method is based on the

  4. Development of AMS high resolution injector system

    International Nuclear Information System (INIS)

    Bao Yiwen; Guan Xialing; Hu Yueming

    2008-01-01

    The Beijing HI-13 tandem accelerator AMS high-resolution injector system was developed. The high-resolution, energy-achromatic system consists of an electrostatic analyzer and a magnetic analyzer; its mass resolution can reach 600 and its transmission is better than 80%. (authors)

  5. Resolution on the program energy-climate

    International Nuclear Information System (INIS)

    2008-01-01

    This document presents the resolutions proposed in resolution proposal No. 1261 concerning the European Commission programme on energy policy and climate change. Twelve resolutions are presented, covering the development of energy sources, energy efficiency, the energy economy and carbon taxes. (A.L.B.)

  6. Efficient Quantification of Uncertainties in Complex Computer Code Results, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  7. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation mainly by using approximations. The conservatism of these approximations is also a source of quantification uncertainty. In this paper, exact MCS quantification methods based on the 'sum of disjoint products (SDP)' logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated.
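    The contrast between the exact inclusion-exclusion quantification and the usual first-order (rare-event) approximation can be sketched as follows in Python; the cut sets and basic-event probabilities in the example are illustrative, not taken from the Shin-Kori model.

        import math
        from itertools import combinations

        def top_event_exact(cut_sets, p):
            """Exact top-event probability from minimal cut sets by inclusion-exclusion."""
            total = 0.0
            for k in range(1, len(cut_sets) + 1):
                for combo in combinations(cut_sets, k):
                    events = set().union(*combo)
                    total += (-1) ** (k + 1) * math.prod(p[e] for e in events)
            return total

        def top_event_rare_event(cut_sets, p):
            """First-order approximation: sum of minimal cut set probabilities."""
            return sum(math.prod(p[e] for e in cs) for cs in cut_sets)

        cut_sets = [{"A", "B"}, {"B", "C"}]
        p = {"A": 0.1, "B": 0.2, "C": 0.1}
        print(top_event_exact(cut_sets, p), top_event_rare_event(cut_sets, p))  # 0.038 vs 0.040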

  8. Incremental Query Rewriting with Resolution

    Science.gov (United States)

    Riazanov, Alexandre; Aragão, Marcelo A. T.

    We address the problem of semantic querying of relational databases (RDB) modulo knowledge bases using very expressive knowledge representation formalisms, such as full first-order logic or its various fragments. We propose to use a resolution-based first-order logic (FOL) reasoner for computing schematic answers to deductive queries, with the subsequent translation of these schematic answers to SQL queries which are evaluated using a conventional relational DBMS. We call our method incremental query rewriting, because an original semantic query is rewritten into a (potentially infinite) series of SQL queries. In this chapter, we outline the main idea of our technique - using abstractions of databases and constrained clauses for deriving schematic answers, and provide completeness and soundness proofs to justify the applicability of this technique to the case of resolution for FOL without equality. The proposed method can be directly used with regular RDBs, including legacy databases. Moreover, we propose it as a potential basis for an efficient Web-scale semantic search technology.

  9. Microscopic resolution broadband dielectric spectroscopy

    International Nuclear Information System (INIS)

    Mukherjee, S; Watson, P; Prance, R J

    2011-01-01

    Results are presented for a non-contact measurement system capable of micron level spatial resolution. It utilises the novel electric potential sensor (EPS) technology, invented at Sussex, to image the electric field above a simple composite dielectric material. EP sensors may be regarded as analogous to a magnetometer and require no adjustments or offsets during either setup or use. The sample consists of a standard glass/epoxy FR4 circuit board, with linear defects machined into the surface by a PCB milling machine. The sample is excited with an a.c. signal over a range of frequencies from 10 kHz to 10 MHz, from the reverse side, by placing it on a conducting sheet connected to the source. The single sensor is raster scanned over the surface at a constant working distance, consistent with the spatial resolution, in order to build up an image of the electric field, with respect to the reference potential. The results demonstrate that both the surface defects and the internal dielectric variations within the composite may be imaged in this way, with good contrast being observed between the glass mat and the epoxy resin.

  10. Microscopic resolution broadband dielectric spectroscopy

    Science.gov (United States)

    Mukherjee, S.; Watson, P.; Prance, R. J.

    2011-08-01

    Results are presented for a non-contact measurement system capable of micron level spatial resolution. It utilises the novel electric potential sensor (EPS) technology, invented at Sussex, to image the electric field above a simple composite dielectric material. EP sensors may be regarded as analogous to a magnetometer and require no adjustments or offsets during either setup or use. The sample consists of a standard glass/epoxy FR4 circuit board, with linear defects machined into the surface by a PCB milling machine. The sample is excited with an a.c. signal over a range of frequencies from 10 kHz to 10 MHz, from the reverse side, by placing it on a conducting sheet connected to the source. The single sensor is raster scanned over the surface at a constant working distance, consistent with the spatial resolution, in order to build up an image of the electric field, with respect to the reference potential. The results demonstrate that both the surface defects and the internal dielectric variations within the composite may be imaged in this way, with good contrast being observed between the glass mat and the epoxy resin.

  11. Super Resolution Algorithm for CCTVs

    Science.gov (United States)

    Gohshi, Seiichi

    2015-03-01

    Recently, security cameras and CCTV systems have become an important part of our daily lives. The rising demand for such systems has created business opportunities in this field, especially in big cities. Analogue CCTV systems are being replaced by digital systems, and HDTV CCTV has become quite common. HDTV CCTV can achieve images with high contrast and decent quality if they are captured in daylight. However, images captured at night do not always have sufficient contrast and resolution because of poor lighting conditions. CCTV systems depend on infrared light at night to compensate for insufficient lighting, thereby producing monochrome images and videos. However, these images and videos do not have high contrast and are blurred. We propose a nonlinear signal processing technique that significantly improves the visual and image quality (contrast and resolution) of low-contrast infrared images. The proposed method enables the use of infrared cameras for various purposes, such as night shots and other poor-lighting environments.

  12. High-Resolution Mass Spectrometers

    Science.gov (United States)

    Marshall, Alan G.; Hendrickson, Christopher L.

    2008-07-01

    Over the past decade, mass spectrometry has been revolutionized by access to instruments of increasingly high mass-resolving power. For small molecules up to ~400 Da (e.g., drugs, metabolites, and various natural organic mixtures ranging from foods to petroleum), it is possible to determine elemental compositions (CcHhNnOoSsPp…) of thousands of chemical components simultaneously from accurate mass measurements (the same can be done up to 1000 Da if additional information is included). At higher mass, it becomes possible to identify proteins (including posttranslational modifications) from proteolytic peptides, as well as lipids, glycoconjugates, and other biological components. At even higher mass (~100,000 Da or higher), it is possible to characterize posttranslational modifications of intact proteins and to map the binding surfaces of large biomolecule complexes. Here we review the principles and techniques of the highest-resolution analytical mass spectrometers (time-of-flight and Fourier transform ion cyclotron resonance and orbitrap mass analyzers) and describe some representative high-resolution applications.
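    Elemental composition assignment from an accurate mass can be illustrated by a brute-force search over CcHhNnOoSs formulas within a ppm tolerance; in the Python sketch below the element ranges and the 1 ppm window are illustrative choices, while the monoisotopic masses are standard reference values.

        from itertools import product

        MONOISOTOPIC = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
                        "O": 15.9949146221, "S": 31.97207069}

        def composition_candidates(mass, tol_ppm=1.0, max_counts=(30, 60, 5, 10, 2)):
            """Return CcHhNnOoSs formulas whose monoisotopic mass matches within tol_ppm."""
            tol = mass * tol_ppm * 1e-6
            hits = []
            for c, h, n, o, s in product(*(range(m + 1) for m in max_counts)):
                m = (c * MONOISOTOPIC["C"] + h * MONOISOTOPIC["H"] + n * MONOISOTOPIC["N"]
                     + o * MONOISOTOPIC["O"] + s * MONOISOTOPIC["S"])
                if abs(m - mass) <= tol:
                    hits.append((f"C{c}H{h}N{n}O{o}S{s}", round(m, 6)))
            return hits

        # Example: neutral caffeine (C8H10N4O2) has a monoisotopic mass of about 194.0804 Da.
        print(composition_candidates(194.0804))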

  13. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is an essential part of obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time PCR-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experimental results. In this study, we examined how far DNA quantification results are altered in DNA samples containing a PCR inhibitor when using the Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. DNA quantity was mostly undetermined at HA concentrations of 4.8 ng/μl and higher. The CT values of the internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained for DNA samples with high IPC CT values. Thus, researchers should interpret DNA quantification results carefully. We additionally examined the effects of HA on STR amplification using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, a better understanding of the various effects of HA should help researchers recognize and handle samples containing HA.
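
    A small sketch of the kind of check implied here: estimate DNA quantity from a standard curve and flag the result when the internal PCR control CT is shifted well above its uninhibited range. The slope, intercept and IPC thresholds below are illustrative placeholders, not values from the kit documentation.

      def assess_quantification(sample_ct, ipc_ct, slope=-3.3, intercept=30.0,
                                ipc_normal_max=31.0, ipc_shift_warn=3.0):
          # Quantity (ng/uL) from a linear Ct-vs-log(quantity) standard curve;
          # the curve parameters and IPC limits are illustrative only.
          quantity = 10 ** ((sample_ct - intercept) / slope)
          if ipc_ct is None:
              status = "IPC undetermined: strong inhibition, result unreliable"
          elif ipc_ct > ipc_normal_max + ipc_shift_warn:
              status = "IPC strongly delayed: quantity likely underestimated"
          elif ipc_ct > ipc_normal_max:
              status = "IPC delayed: interpret with caution"
          else:
              status = "no inhibition indicated"
          return quantity, status

      print(assess_quantification(sample_ct=32.0, ipc_ct=36.8))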

  14. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    Science.gov (United States)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurately monitoring and mitigating methane emissions requires cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies and one adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we utilize two-stream deep Convolutional Networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (i.e., 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, binned into eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that two-stream ConvNets provide a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, results in an accuracy of 58.3%. The integration of the two streams gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm to different leak sources. Several analytic metrics, including the confusion matrix and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help to find and fix super-emitters, and improve the cost-effectiveness of leak detection and repair.
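
    A minimal late-fusion sketch of the two-stream idea: per-class scores from a spatial network over a still frame and from a temporal network over an optical-flow stack are averaged before the final decision. The network internals, weights and input shapes below are placeholders, not the authors' architecture.

      import numpy as np

      NUM_CLASSES = 8  # leak-size bins, 0 to 140 MCFH

      def spatial_scores(frame):
          # Placeholder for the spatial-stream ConvNet (texture, plume area, pattern)
          return np.random.rand(NUM_CLASSES)

      def temporal_scores(flow_stack):
          # Placeholder for the temporal-stream ConvNet fed with optical flow
          return np.random.rand(NUM_CLASSES)

      def predict_leak_class(frame, flow_stack, w_spatial=0.5, w_temporal=0.5):
          # Late fusion: weighted average of per-class scores from both streams
          scores = w_spatial * spatial_scores(frame) + w_temporal * temporal_scores(flow_stack)
          return int(np.argmax(scores))

      frame = np.zeros((240, 320))           # one still plume frame
      flow_stack = np.zeros((240, 320, 10))  # stacked optical-flow fields
      print(predict_leak_class(frame, flow_stack))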

  15. Quantification of breast arterial calcification using full field digital mammography

    International Nuclear Information System (INIS)

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-01-01

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women, since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium masses for the 5 and 9 cm thick phantoms were related by M = 0.964K - 0.288 mg (r = 0.997, SEE = 0.878 mg) and M = 1.004K + 0.324 mg (r = 0.994, SEE = 1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare, as long as a careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in the calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital mammography.
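
    For illustration, the calibration lines quoted in the abstract can be inverted to convert a measured calcium mass back into an estimate of the true mass for each phantom thickness; this sketch uses only the regression coefficients reported above.

      # Calibration lines from the abstract: M = a*K + b  (mg)
      CAL = {"5 cm": (0.964, -0.288), "9 cm": (1.004, 0.324)}

      def true_calcium_mass(measured_mg, thickness):
          # Invert M = a*K + b to estimate the known mass K from a measurement M
          a, b = CAL[thickness]
          return (measured_mg - b) / a

      print(true_calcium_mass(10.0, "5 cm"))  # ~10.67 mg
      print(true_calcium_mass(10.0, "9 cm"))  # ~9.64 mg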

  16. Development of the quantification procedures for in situ XRF analysis

    International Nuclear Information System (INIS)

    Kump, P.; Necemer, M.; Rupnik, P.

    2005-01-01

    For in situ XRF applications, two excitation systems (radioisotope and tube excited) and an X-ray spectrometer based on a Si-PIN detector were assembled and used. The radioisotope excitation system with an Am-241 source was assembled into a prototype of a compact XRF analyser, PEDUZO-01, which is also applicable to field work. The existing quantification software QAES (quantitative analysis of environmental samples) was assessed and found to be adequate for field work as well. The QAES software was also integrated into new software for the developed XRF analyser PEDUZO-01, which covers spectrum acquisition, spectrum analysis and quantification, and runs in the LabVIEW environment. To assess the Si-PIN based X-ray spectrometers and the QAES quantification software in field work, a comparison was made with results obtained by a standard Si(Li) based spectrometer. The results of this study prove that the use of this spectrometer is adequate for field work. This work was accepted for publication in X-Ray Spectrometry. The application of simple preparation of solid samples was studied in view of the analytical results obtained. It was established that, under certain conditions, the results do not differ greatly from those obtained with a homogenized sample pressed into a pellet. The influence of particle size and mineralogical effects on quantitative results was studied, and a simple sample preparation kit was proposed. Sample preparation for the analysis of water samples by precipitation with APDC and aerosol analysis using a dichotomous sampler were also adapted and used in the field work. An adequate sample preparation kit was proposed. (author)
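
    As a generic illustration of empirical-sensitivity XRF quantification (not the QAES algorithm itself), an element concentration can be estimated from the net peak count rate divided by an elemental sensitivity derived from standards; the sensitivities below are hypothetical.

      # Illustrative elemental sensitivities (counts per s per ug/g), from standards
      SENSITIVITY = {"Fe": 12.5, "Zn": 9.8, "Pb": 4.2}  # hypothetical values

      def concentration_ug_g(element, net_counts, live_time_s):
          # Simple empirical-sensitivity estimate; ignores the matrix and absorption
          # corrections that a full quantification package such as QAES would apply.
          rate = net_counts / live_time_s
          return rate / SENSITIVITY[element]

      print(concentration_ug_g("Fe", net_counts=25000, live_time_s=200))  # ~10 ug/g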

  17. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

    Full Text Available Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in the EU (European Union) countries, and food products with a MON 810 content higher than 0.9 % must be labelled. This study was aimed at the impact of food processing (temperature, pH and pressure) on DNA degradation and the quantification of the genetically modified maize MON 810. The transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing at high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented the DNA. The roughly two-order-of-magnitude difference between the species-specific gene content and the transgenic DNA content in the plant materials used led to false negative results in the quantification of transgenic DNA. Maize containing 4.2 % of the transgene appeared, after processing, to contain as little as 3.0 % (100 °C) and 1.9 % (121 °C, 0.1 MPa). A 2.1 % transgene content dropped to 1.0 % at 100 °C and to 0.6 % at 121 °C, 0.1 MPa. Under these conditions the apparent decrease in transgenic content was two to three times larger, a consequence of the unequal gene abundance. This disparity shows up as a considerable decrease in transgenic content, while the decrease in species-specific gene content remains unnoticed. Based on our findings we conclude that a high degree of processing might lead to false negative results in the quantification of the transgenic constituent. Determination of GMO content in processed foods may therefore lead to incorrect statements, and labelling in these cases could mislead consumers. doi:10.5219/212
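
    The bias described here can be illustrated numerically: GMO content is the ratio of transgene copies to species-specific reference gene copies, so when processing reduces the detectable transgene signal while the far more abundant reference signal appears unchanged, the measured percentage drops. The copy numbers below are illustrative, not data from the study.

      def gmo_percent(transgene_copies, reference_copies):
          # GMO content as the transgene / species-specific reference ratio, in percent
          return 100.0 * transgene_copies / reference_copies

      # Illustrative copy numbers: the species-specific reference gene is far more
      # abundant than the transgene
      before = gmo_percent(4200, 100000)             # 4.2 %
      # Severe processing: detectable transgene copies drop sharply while the
      # apparent reference signal stays effectively unchanged
      after = gmo_percent(4200 * 0.45, 100000 * 1.0)
      print(before, after)                            # 4.2 -> ~1.9 %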

  18. Simultaneous quantification of flavonoids and triterpenoids in licorice using HPLC.

    Science.gov (United States)

    Wang, Yuan-Chuen; Yang, Yi-Shan

    2007-05-01

    Numerous bioactive compounds are present in licorice (Glycyrrhizae Radix), including flavonoids and triterpenoids. In this study, a reversed-phase high-performance liquid chromatography (HPLC) method was developed for the simultaneous quantification of three flavonoids (liquiritin, liquiritigenin and isoliquiritigenin) and four triterpenoids (glycyrrhizin, 18alpha-glycyrrhetinic acid, 18beta-glycyrrhetinic acid and 18beta-glycyrrhetinic acid methyl ester) from licorice, and was further used to quantify these 7 compounds in 20 different licorice samples. Specifically, the reversed-phase HPLC was performed with a gradient mobile phase composed of 25 mM phosphate buffer (pH 2.5)-acetonitrile, with gradient elution steps as follows: 0 min, 100:0; 10 min, 80:20; 50 min, 70:30; 73 min, 50:50; 110 min, 50:50; 125 min, 20:80; 140 min, 20:80; peaks were detected at 254 nm. Using our technique, rather good specificity was obtained with regard to the separation of these seven compounds. The regression coefficients of the linear equations for the seven compounds lay between 0.9978 and 0.9992. The limits of detection and quantification lay in the ranges of 0.044-0.084 and 0.13-0.25 microg/ml, respectively. The relative recovery rates for the seven compounds lay between 96.63+/-2.43 and 103.55+/-2.77%. Coefficients of variation for intra-day and inter-day precision lay in the ranges of 0.20-1.84 and 0.28-1.86%, respectively. Based upon our validation results, this analytical technique is a convenient method for the simultaneous quantification of numerous bioactive compounds derived from licorice, featuring good quantification parameters, accuracy and precision.
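
    Limits of detection and quantification of this kind are commonly derived from the calibration curve as 3.3 and 10 times the residual standard deviation divided by the slope (an ICH-style convention); the sketch below uses illustrative calibration data, and the paper does not state which formula the authors applied.

      import numpy as np

      def lod_loq(conc, response):
          # LOD = 3.3*s/S and LOQ = 10*s/S, where s is the residual standard
          # deviation of a linear calibration and S is its slope.
          slope, intercept = np.polyfit(conc, response, 1)
          residuals = response - (slope * conc + intercept)
          s = residuals.std(ddof=2)
          return 3.3 * s / slope, 10.0 * s / slope

      conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])        # ug/ml, illustrative
      resp = np.array([10.2, 20.5, 40.1, 101.0, 199.8])  # peak areas, illustrative
      print(lod_loq(conc, resp))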

  19. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz may allow further improvements in the disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries
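
    Myocardial blood flow estimates of this kind typically come from fitting a compartment model to the dynamic PET data; as a generic illustration (not a specific vendor implementation), the sketch below integrates a one-tissue compartment model dCt/dt = K1*Ca(t) - k2*Ct(t) with a toy arterial input function and placeholder rate constants.

      import numpy as np

      def tissue_curve(t, ca, K1, k2):
          # One-tissue compartment model: dCt/dt = K1*Ca(t) - k2*Ct(t),
          # integrated with a simple Euler scheme on the sampled time grid.
          ct = np.zeros_like(t, dtype=float)
          for i in range(1, len(t)):
              dt = t[i] - t[i - 1]
              ct[i] = ct[i - 1] + dt * (K1 * ca[i - 1] - k2 * ct[i - 1])
          return ct

      t = np.linspace(0, 120, 241)               # seconds
      ca = np.exp(-((t - 20) / 10) ** 2) * 50.0  # toy arterial input function
      ct = tissue_curve(t, ca, K1=0.8, k2=0.15)  # K1 is the uptake constant related to flow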

  20. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT