WorldWideScience

Sample records for field level quantification

  1. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.
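The abstract does not reproduce the error model itself; one common form, shown here purely as an illustration, combines independent commission and omission terms in quadrature. All names and numbers below are hypothetical, not WMM specification values.

```python
import math

def total_field_uncertainty(commission, crustal_omission, external_omission):
    """Root-sum-square combination of independent error sources (in nT)."""
    return math.sqrt(commission**2 + crustal_omission**2 + external_omission**2)

# Hypothetical one-sigma contributions for a main-field component, in nanotesla
sigma_total = total_field_uncertainty(5.0, 150.0, 20.0)
```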

  2. Quantification of trace-level DNA by real-time whole genome amplification.

    Science.gov (United States)

    Kang, Min-Jung; Yu, Hannah; Kim, Sook-Kyung; Park, Sang-Ryoul; Yang, Inchul

    2011-01-01

    Quantification of trace amounts of DNA is a challenge in analytical applications where the concentration of a target DNA is very low or only limited amounts of samples are available for analysis. PCR-based methods including real-time PCR are highly sensitive and widely used for quantification of low-level DNA samples. However, ordinary PCR methods require at least one copy of a specific gene sequence for amplification and may not work for a sub-genomic amount of DNA. We suggest a real-time whole genome amplification method adopting the degenerate oligonucleotide primed PCR (DOP-PCR) for quantification of sub-genomic amounts of DNA. This approach enabled quantification of sub-picogram amounts of DNA independently of their sequences. When the method was applied to human placental DNA whose amount had been accurately determined by inductively coupled plasma-optical emission spectroscopy (ICP-OES), an accurate and stable quantification capability for DNA samples ranging from 80 fg to 8 ng was obtained. In blind tests of laboratory-prepared DNA samples, measurement accuracies of 7.4%, -2.1%, and -13.9% with analytical precisions around 15% were achieved for 400-pg, 4-pg, and 400-fg DNA samples, respectively. A similar quantification capability was also observed for other DNA species from calf, E. coli, and lambda phage. Therefore, when provided with an appropriate standard DNA, the suggested real-time DOP-PCR method can be used as a universal method for quantification of trace amounts of DNA.
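The paper's DOP-PCR calibration details are not given in the abstract; as an orientation, real-time quantification generally inverts a standard curve in which the threshold cycle (Ct) is linear in log10 of input DNA. A sketch with made-up calibration points:

```python
import numpy as np

# Hypothetical standard curve: Ct is linear in log10(input DNA in femtograms)
standards_fg = np.array([1e2, 1e3, 1e4, 1e5])
ct_values = np.array([28.76, 25.39, 22.02, 18.65])   # slope -3.37, intercept 35.5

slope, intercept = np.polyfit(np.log10(standards_fg), ct_values, 1)

def quantify_fg(ct):
    """Estimate input DNA (fg) from a measured threshold cycle."""
    return 10.0 ** ((ct - intercept) / slope)
```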

  3. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: a postmortem study.

    Science.gov (United States)

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q; Ducote, Justin L; Su, Min-Ying; Molloi, Sabee

    2013-12-01

    Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left-right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of bias field. The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left-right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left-right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction. The investigated CLIC method
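The CLIC implementation is not described in detail here; for orientation, a textbook fuzzy c-means pass on synthetic 1-D intensities (two well-separated clusters standing in for fatty and fibroglandular voxels) looks like this sketch, which is not the authors' code:

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, n_iter=50):
    """Minimal FCM on a 1-D intensity array: returns memberships and centroids."""
    # Initialise centroids at spread-out quantiles to avoid a degenerate start
    centers = np.quantile(x, np.linspace(0.25, 0.75, c))
    for _ in range(n_iter):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=0)                     # memberships sum to 1 per voxel
        um = u ** m
        centers = (um @ x) / um.sum(axis=1)    # membership-weighted centroids
    return u, centers

# Synthetic "fatty" (about 10) and "fibroglandular" (about 100) voxel intensities
x = np.concatenate([np.full(50, 10.0), np.full(50, 100.0)])
u, centers = fuzzy_c_means(x)
```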

  4. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: A postmortem study

    International Nuclear Information System (INIS)

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q.; Ducote, Justin L.; Su, Min-Ying; Molloi, Sabee

    2013-01-01

    Purpose: Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. Methods: T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left–right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of bias field. Results: The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left–right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left–right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction

  5. Cyclewise Operation of Printed MoS2 Transistor Biosensors for Rapid Biomolecule Quantification at Femtomolar Levels.

    Science.gov (United States)

    Ryu, Byunghoon; Nam, Hongsuk; Oh, Bo-Ram; Song, Yujing; Chen, Pengyu; Park, Younggeun; Wan, Wenjie; Kurabayashi, Katsuo; Liang, Xiaogan

    2017-02-24

    Field-effect transistors made from MoS2 and other emerging layered semiconductors have been demonstrated to be able to serve as ultrasensitive biosensors. However, such nanoelectronic sensors still suffer seriously from a series of challenges associated with the poor compatibility between electronic structures and liquid analytes. These challenges hinder the practical biosensing applications that demand rapid, low-noise, highly specific biomolecule quantification at femtomolar levels. To address such challenges, we study a cyclewise process for operating MoS2 transistor biosensors, in which a series of reagent fluids are delivered to the sensor in a time-sequenced manner and periodically set the sensor into four assay-cycle stages, including incubation, flushing, drying, and electrical measurement. Running multiple cycles of such an assay can acquire a time-dependent sensor response signal quantifying the reaction kinetics of analyte-receptor binding. This cyclewise detection approach can avoid the liquid-solution-induced electrochemical damage, screening, and nonspecific adsorption to the sensor and therefore improves the transistor sensor's durability, sensitivity, specificity, and signal-to-noise ratio. These advantages in combination with the inherent high sensitivity of MoS2 biosensors allow for rapid biomolecule quantification at femtomolar levels. We have demonstrated the cyclewise quantification of Interleukin-1β in pure and complex solutions (e.g., serum and saliva) with a detection limit of ∼1 fM and a total detection time of ∼23 min. This work leverages the superior properties of layered semiconductors for biosensing applications and advances the techniques toward realizing fast real-time immunoassay for low-abundance biomolecule detection.

  6. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Joachim, E-mail: Joachim.Berger@Monash.edu [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia); Sztal, Tamar; Currie, Peter D. [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia)

    2012-07-13

    Highlights: ► Report of an unbiased quantification of the birefringence of muscle of fish larvae. ► Quantification method readily identifies level of overall muscle damage. ► Compare zebrafish muscle mutants for level of phenotype severity. ► Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including dystrophin-deficient zebrafish mutants dmd that model Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.
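The densitometric step amounts to comparing the mean brightness of the muscle region against the dark background; a minimal sketch on synthetic data (not the authors' imaging pipeline):

```python
import numpy as np

def birefringence_score(image, muscle_mask, background_mask):
    """Background-subtracted mean brightness of the muscle region."""
    return float(image[muscle_mask].mean() - image[background_mask].mean())

# Synthetic polarised-light frame: bright trunk musculature on a dark field
frame = np.full((8, 8), 5.0)
muscle = np.zeros((8, 8), dtype=bool)
muscle[2:6, 1:7] = True
frame[muscle] = 180.0
score = birefringence_score(frame, muscle, ~muscle)
```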

  7. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    International Nuclear Information System (INIS)

    Berger, Joachim; Sztal, Tamar; Currie, Peter D.

    2012-01-01

    Highlights: ► Report of an unbiased quantification of the birefringence of muscle of fish larvae. ► Quantification method readily identifies level of overall muscle damage. ► Compare zebrafish muscle mutants for level of phenotype severity. ► Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including dystrophin-deficient zebrafish mutants dmd that model Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.

  8. Quantification of breast arterial calcification using full field digital mammography

    International Nuclear Information System (INIS)

    Molloi, Sabee; Xu Tong; Ducote, Justin; Iribarren, Carlos

    2008-01-01

    Breast arterial calcification is commonly detected on some mammograms. Previous studies indicate that breast arterial calcification is evidence of general atherosclerotic vascular disease and it may be a useful marker of coronary artery disease. It can potentially be a useful tool for assessment of coronary artery disease in women since mammography is widely used as a screening tool for early detection of breast cancer. However, there are currently no available techniques for quantification of calcium mass using mammography. The purpose of this study was to determine whether it is possible to quantify breast arterial calcium mass using standard digital mammography. An anthropomorphic breast phantom along with a vessel calcification phantom was imaged using a full field digital mammography system. Densitometry was used to quantify calcium mass. A calcium calibration measurement was performed at each phantom thickness and beam energy. The known (K) and measured (M) calcium mass on 5 and 9 cm thickness phantoms were related by M=0.964K-0.288 mg (r=0.997 and SEE=0.878 mg) and M=1.004K+0.324 mg (r=0.994 and SEE=1.32 mg), respectively. The results indicate that accurate calcium mass measurements can be made without correction for scatter glare as long as careful calcium calibration is made for each breast thickness. The results also indicate that composition variations and differences of approximately 1 cm between calibration phantom and breast thickness introduce only minimal error in calcium measurement. The uncertainty in magnification is expected to cause up to 5% and 15% error in calcium mass for 5 and 9 cm breast thicknesses, respectively. In conclusion, a densitometry technique for quantification of breast arterial calcium mass was validated using standard full field digital mammography. The results demonstrated the feasibility and potential utility of the densitometry technique for accurate quantification of breast arterial calcium mass using standard digital
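Using the reported 5 cm calibration line, a measured densitometry value can be converted back to calcium mass by inverting the fit. A sketch using the coefficients quoted above:

```python
def measured_mass(known_mg, slope=0.964, intercept=-0.288):
    """Reported 5 cm phantom calibration: M = 0.964*K - 0.288 mg."""
    return slope * known_mg + intercept

def calibrated_mass(measured_mg, slope=0.964, intercept=-0.288):
    """Invert the calibration line to recover calcium mass from a measurement."""
    return (measured_mg - intercept) / slope
```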

  9. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. Accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and allowed us to enhance conventional toxicology research by providing a direct correlation between GBN uptake at the single-cell level and cell viability status.

  10. Direct qPCR quantification using the Quantifiler® Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  11. Quantification of miRNAs by a simple and specific qPCR method

    DEFF Research Database (Denmark)

    Cirera Salicio, Susanna; Busk, Peter K.

    2014-01-01

    MicroRNAs (miRNAs) are powerful regulators of gene expression at posttranscriptional level and play important roles in many biological processes and in disease. The rapid pace of the emerging field of miRNAs has opened new avenues for development of techniques to quantitatively determine mi...... in miRNA quantification. Furthermore, the method is easy to perform with common laboratory reagents, which allows miRNA quantification at low cost....

  12. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    Science.gov (United States)

    Manzanares-Palenzuela, C Lorena; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
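As the abstract notes, GMO content is expressed as event-specific copies per taxon-specific copies, so the labeling decision reduces to a ratio against the 0.9% threshold. A sketch with hypothetical copy numbers:

```python
def gmo_percentage(event_copies, taxon_copies):
    """GMO content as event-specific per taxon-specific sequence copies, in %."""
    return 100.0 * event_copies / taxon_copies

# Hypothetical quantification result for one sample
content = gmo_percentage(event_copies=45, taxon_copies=5000)
needs_label = content >= 0.9          # EU mandatory-labeling threshold
```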

  13. Adaptive Quantification and Longitudinal Analysis of Pulmonary Emphysema with a Hidden Markov Measure Field Model

    Science.gov (United States)

    Häme, Yrjö; Angelini, Elsa D.; Hoffman, Eric A.; Barr, R. Graham; Laine, Andrew F.

    2014-01-01

    The extent of pulmonary emphysema is commonly estimated from CT images by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions in the lung and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the present model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was used to quantify emphysema on a cohort of 87 subjects, with repeated CT scans acquired over a time period of 8 years using different imaging protocols. The scans were acquired approximately annually, and the data set included a total of 365 scans. The results show that the emphysema estimates produced by the proposed method have very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. In addition, the generated emphysema delineations promise great advantages for regional analysis of emphysema extent and progression, possibly advancing disease subtyping. PMID:24759984
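The "standard approach" the authors compare against can be stated in a few lines: the emphysema index is the fraction of lung voxels below a fixed attenuation cutoff (often −950 HU for inspiratory CT). A sketch, not the paper's hidden Markov measure field model:

```python
import numpy as np

def emphysema_index(lung_hu, threshold=-950.0):
    """Fraction of lung voxels below the attenuation threshold."""
    return float((np.asarray(lung_hu) < threshold).mean())

# Ten synthetic voxel attenuations (HU), two of them below -950
voxels = [-980, -960, -940, -900, -870, -850, -820, -800, -780, -760]
index = emphysema_index(voxels)
```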

  14. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    This paper gives an example of the quantification of a severe accident phenomenological event. The analysis performed to assess the probability that debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with an assessment of the type of core/concrete attack that would occur. Ex-vessel debris coolability is evaluated as an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), supported by a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event. Also discussed are the DET headings selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel.
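The abstract gives no branch probabilities; generically, quantifying a CET event through a DET means multiplying conditional branch probabilities along each path and summing the paths that end in a given state. A sketch with invented numbers:

```python
# Hypothetical DET paths: (P(pressure state) * P(debris configuration), end state)
paths = [
    (0.6 * 0.9, "coolable"),
    (0.6 * 0.1, "not_coolable"),
    (0.4 * 0.3, "coolable"),
    (0.4 * 0.7, "not_coolable"),
]

p_coolable = sum(p for p, state in paths if state == "coolable")
p_total = sum(p for p, _ in paths)   # paths are exhaustive, so this should be 1
```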

  15. Detection and quantification limits: basic concepts, international harmonization, and outstanding ('low-level') issues

    International Nuclear Information System (INIS)

    Currie, L.A.

    2004-01-01

    A brief review is given of concepts, basic definitions, and terminology for metrological detection and quantification capabilities, representing harmonized recommendations and norms of the International Union of Pure and Applied Chemistry (IUPAC) and the International Organization for Standardization (ISO), respectively. Treatment of the (low-level) blank and variance function are discussed in some detail, together with special problems arising with detection decisions and the reporting of low-level data. Key references to the international documents follow, as well as specialized references addressing very low-level counting data, skewed environmental blank distributions, and multiple and multivariate detection decisions
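For a well-known blank with Gaussian noise and the default error rates α = β = 0.05, the harmonized quantities reduce to simple multiples of the blank standard deviation: the critical level LC ≈ 1.645σ0, the detection limit LD ≈ 3.29σ0, and the quantification limit LQ = 10σ0 (10% relative standard deviation). A sketch under those default assumptions:

```python
def currie_limits(sigma0):
    """Critical level, detection limit, and quantification limit
    (IUPAC/ISO defaults: alpha = beta = 0.05, well-known Gaussian blank)."""
    lc = 1.645 * sigma0    # decision threshold for "detected"
    ld = 3.29 * sigma0     # minimum amount detectable with 95% power
    lq = 10.0 * sigma0     # amount quantifiable at 10% relative std. dev.
    return lc, ld, lq

lc, ld, lq = currie_limits(2.0)   # e.g. a blank standard deviation of 2 counts
```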

  16. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

    International audience; A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of apple fruit of different points. It took into account, heat exchange of representative elemental volume, metabolism heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  17. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
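The band-selection step pairs each candidate wavelength's reflectance with measured THC content and keeps the strongest correlate. A synthetic sketch in which band index 2 plays the role of the 695 nm band (all data invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 20 samples x 5 spectral bands, with band 2 tracking THC content
thc = rng.uniform(0.1, 5.0, 20)                       # % THC (made up)
spectra = rng.normal(0.5, 0.05, (20, 5))
spectra[:, 2] = 0.8 - 0.05 * thc + rng.normal(0.0, 0.001, 20)

# Correlation analysis: choose the band most strongly correlated with THC
corrs = [abs(np.corrcoef(spectra[:, b], thc)[0, 1]) for b in range(spectra.shape[1])]
best_band = int(np.argmax(corrs))
```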

  18. Quantification of fossil fuel CO2 at the building/street level for large US cities

    Science.gov (United States)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated carbon monitoring system (CMS). A space/time-explicit emissions data product can act as both a verification and a planning system: it can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high resolution (e.g., building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort for a CMS. A complete data product has been built for the city of Indianapolis, and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start from the sector-specific Vulcan Project estimates at a mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into both improving our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g., combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  19. A HPLC method for the quantification of butyramide and acetamide at ppb levels in hydrogeothermal waters

    Energy Technology Data Exchange (ETDEWEB)

    Gracy Elias; Earl D. Mattson; Jessica E. Little

    2012-01-01

    A quantitative analytical method to determine butyramide (BA) and acetamide (AA) concentrations at low ppb levels in geothermal waters has been developed. The analytes are concentrated in a preparation step by evaporation and analyzed using HPLC-UV. Chromatographic separation is achieved isocratically with an RP C-18 column using a 30 mM phosphate buffer solution with 5 mM heptane sulfonic acid and methanol (98:2 ratio) as the mobile phase. Absorbance is measured at 200 nm. The limits of detection (LOD) for BA and AA were 2.0 μg L⁻¹ and 2.5 μg L⁻¹, respectively. The limits of quantification (LOQ) for BA and AA were 5.7 μg L⁻¹ and 7.7 μg L⁻¹, respectively, at the detection wavelength of 200 nm. Attaining these levels of quantification better allows these amides to be used as thermally reactive tracers in low-temperature hydrogeothermal systems.
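The abstract does not state how the LOD and LOQ were derived; a common ICH-style estimate from the calibration curve uses LOD = 3.3·s/S and LOQ = 10·s/S, where s is the response standard deviation and S the calibration slope. A sketch with invented numbers, not the paper's actual calibration:

```python
def ich_limits(response_sd, slope):
    """ICH Q2-style detection and quantification limits from a calibration fit."""
    lod = 3.3 * response_sd / slope
    loq = 10.0 * response_sd / slope
    return lod, loq

# Hypothetical UV calibration: sd of blank response, slope in AU per ug/L
lod, loq = ich_limits(response_sd=0.003, slope=0.005)
```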

  20. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  1. Level playing field with political tact

    International Nuclear Information System (INIS)

    Onderstal, S.; Appelman, M.

    2004-01-01

    Businesses, interest groups and policy administrators plead for a level playing field. However, those administrators interpret the level-playing-field notion in different ways and thus create confusion. In this article the level playing field is explained, and a framework is discussed by means of which the government can study policy problems in which the level playing field is of importance. [nl]

  2. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.

    Science.gov (United States)

    Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H

    2016-12-01

    Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach-recurrence quantification analysis (RQA)-via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
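The four indices named above are standard RQA outputs. A minimal sketch of how %REC and %DET are computed from a recurrence plot of a one-dimensional signal (the embedding dimension, delay, radius, and test signal are illustrative choices, not those of the study):

```python
import numpy as np

def rqa(signal, dim=3, delay=2, radius=0.1, lmin=2):
    """Minimal recurrence quantification sketch: %REC is the density of
    recurrent points in the recurrence plot; %DET is the fraction of
    recurrent points lying on diagonal lines of length >= lmin."""
    n = len(signal) - (dim - 1) * delay
    # time-delay embedding: rows are delayed copies, columns are states
    emb = np.array([signal[i:i + n] for i in range(0, dim * delay, delay)]).T
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rp = (dists <= radius).astype(int)
    np.fill_diagonal(rp, 0)                 # exclude the line of identity
    rec = rp.sum()
    det = 0
    for k in range(-(n - 1), n):            # scan every off-main diagonal
        if k == 0:
            continue
        run = 0
        for v in list(np.diagonal(rp, k)) + [0]:
            if v:
                run += 1
            else:
                if run >= lmin:
                    det += run
                run = 0
    prec = 100.0 * rec / (n * n - n)
    pdet = 100.0 * det / rec if rec else 0.0
    return prec, pdet

t = np.linspace(0, 8 * np.pi, 200)
prec, pdet = rqa(np.sin(t))   # periodic signal: %DET should be high
```

MAXLINE and TREND extend the same diagonal-line bookkeeping (longest line, and the drift of recurrence density away from the main diagonal, respectively).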

  3. Generalization of the cogeneration concept as field theory effects

    International Nuclear Information System (INIS)

    Forje, A.; Tiberiu, C.; Calugaru, A.; Carstea, O.; Dorobantu, G.; Barota, R.; Balan, N.; Mariam, G.; Udrea, E.

    1990-01-01

    This paper reports on reformulated notions of energy, action, geodesics and non-linearity. The information geodesic is defined as the pathway of perceptible and quantifiable signals emitted and received during the evolution of the conversion of a mass field in interaction with the energy field. The objective reality at the level of distances ranging between the limits of human ability of perception and quantification can be regarded as an interpenetrating complex of two fields, namely: a diffuse, extensive and continuous energy field with multiple manifestation possibilities, which is indirectly perceived and quantified through its interaction effects with the field of masses during their conversion; and a discrete, intensive and discontinuous field of masses, also showing multiple manifestation possibilities, which renders possible both the perception of this field and the quantification of its conversions as an effect of the interactions with the energy field

  4. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
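The Monte-Carlo style sampling experiments described above can be illustrated with a toy forward model (the response function, parameter names, and ranges below are invented stand-ins, not ISSM physics or the study's actual uncertainty bounds):

```python
import random

def ice_loss_toy(melt_scale, precip_anom):
    """Toy stand-in for an ice sheet forward run: century-scale sea
    level contribution (mm) as a made-up linear function of an ocean
    melt-rate scaling and a precipitation anomaly."""
    return 50.0 + 120.0 * melt_scale - 30.0 * precip_anom

random.seed(42)
samples = []
for _ in range(5000):
    melt = random.uniform(0.2, 1.5)    # assumed melt-scaling range
    precip = random.gauss(0.0, 0.3)    # assumed precipitation anomaly
    samples.append(ice_loss_toy(melt, precip))

samples.sort()
median = samples[len(samples) // 2]        # central estimate
p95 = samples[int(0.95 * len(samples))]    # upper-bound percentile
```

Replacing the toy function with an expensive simulator is what makes sampling design (and the sensitivity ranking of inputs) the hard part in practice.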

  5. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, drawing on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  6. Quantification of radionuclide uptake levels for primary bone tumors

    Directory of Open Access Journals (Sweden)

    Hasford Francis

    2015-04-01

    The purpose of the study is to quantify the level of uptake of administered radionuclide in primary bone tumors for patients undergoing bone scintigraphy. A retrospective study of scintigrams from 48 patients was performed in a nuclear medicine unit in Ghana to quantify the uptake levels of administered radiopharmaceuticals. Patients were administered activities ranging between 555 and 1110 MBq (15–30 mCi) and scanned on a Siemens e.cam SPECT system. Analyses of the scintigrams were performed with ImageJ software by drawing regions of interest (ROIs) over identified hot spots (pathologic sites). Nine skeletal parts, namely cranium, neck, shoulder, sacrum, sternum, vertebra, femur, ribcage, and knee, were considered in the study, which involved 96 identified primary tumors. Radionuclide uptakes were quantified in terms of the estimated counts of activity per patient for identified tumor sites. Average normalized counts of activity (nGMC) per patient ranged from 5.2759 ± 0.6590 cts/mm2/MBq for cranium tumors to 72.7569 ± 17.8786 cts/mm2/MBq for ribcage tumors. The differences in uptake levels could be attributed to different mechanisms of Tc-99m MDP uptake in different types of bones, which is directly related to blood flow and degree of osteoblastic activity. The overall normalized count of activity for the 96 identified tumors was estimated to be 23.0350 ± 19.5424 cts/mm2/MBq. The study revealed the highest uptake of activity in the ribcage and the least uptake in the cranium. Quantification of radionuclide uptake in tumors is important and recommended for assessing patients' response to therapy and doses to critical organs, and for diagnosing tumors.
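The normalization implied by the cts/mm2/MBq unit can be sketched directly; the ROI counts, area, and administered activity below are hypothetical values, not data from the study:

```python
def normalized_count(roi_counts, roi_area_mm2, administered_mbq):
    """Normalized count of activity (cts/mm^2/MBq) for one tumor ROI:
    raw ROI counts divided by ROI area and by administered activity,
    mirroring the normalization the abstract describes."""
    return roi_counts / roi_area_mm2 / administered_mbq

# hypothetical ROI drawn over a ribcage hot spot
nc = normalized_count(roi_counts=4.5e6, roi_area_mm2=110.0,
                      administered_mbq=740.0)
```

Dividing by administered activity is what makes uptake comparable across patients who received different doses.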

  7. Impact of some field factors on inhalation exposure levels to bitumen emissions during road paving operations.

    Science.gov (United States)

    Deygout, François; Auburtin, Guy

    2015-03-01

    Variability in occupational exposure levels to bitumen emissions has been observed during road paving operations. This is due to recurrent field factors impacting the level of exposure experienced by workers during paving. The present study was undertaken in order to quantify the impact of such factors. Pre-identified variables currently encountered in the field were monitored and recorded during paving surveys, which were conducted randomly and covered current applications performed by road crews. Multivariate variance analysis and regressions were then applied to the computerized field data. The statistical investigations were limited by the relatively small size of the study (36 data points). Nevertheless, the use of the step-wise regression tool enabled the quantification of the impact of several predictors despite the existing collinearity between variables. The two bitumen organic fractions (particulates and volatiles) are associated with different field factors. The process conditions (machinery used and delivery temperature) have a significant impact on the production of airborne particulates and explain up to 44% of the variability. This confirms the outcomes described by previous studies. The influence of the production factors is limited, though, and should be complemented by studying factors involving the worker, such as work style and the mix of tasks. The residual volatile compounds, being part of the bituminous binder and released during paving operations, control the volatile emissions; 73% of the encountered field variability is explained by the composition of the bitumen batch. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  8. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to
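The standard-curve quantification and PCR-efficiency comparison the abstract refers to rest on two textbook relations, sketched here with illustrative slope/intercept values (a slope of about -3.32 on a Cq-vs-log10(copies) curve corresponds to ~100% efficiency):

```python
def pcr_efficiency(slope):
    """Amplification efficiency from the slope of a Cq-vs-log10(copies)
    standard curve: E = 10^(-1/slope) - 1."""
    return 10 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq, slope, intercept):
    """Interpolate an unknown sample's copy number from the standard
    curve: Cq = slope * log10(copies) + intercept, solved for copies."""
    return 10 ** ((cq - intercept) / slope)

eff = pcr_efficiency(-3.32)   # close to 1.0, i.e. ~100% efficiency
copies = copies_from_cq(cq=25.0, slope=-3.32, intercept=38.0)
```

The abstract's point is that this interpolation is only valid when sample and reference amplify with similar efficiency; matrix inhibitors that depress the sample's efficiency bias the inferred copy number.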

  9. Identification of flow paths and quantification of return flow volumes and timing at field scale

    Science.gov (United States)

    Claes, N.; Paige, G. B.; Parsekian, A.

    2017-12-01

    Flood irrigation, which constitutes a large part of agricultural water use, accounts for a significant amount of the water that is diverted from western streams. Return flow, the portion of the water applied to irrigated areas that returns to the stream, is important for maintaining base flows in streams and the ecological function of riparian zones and wetlands hydrologically linked with streams. Prediction of the timing and volumes of return flow during and after flood irrigation poses a challenge due to the heterogeneity of pedogenic and soil physical factors that influence vadose zone processes. In this study, we quantify volumes of return flow and potential pathways in the subsurface through a vadose zone flow model that is informed by both hydrological and geophysical observations in a Bayesian setting. Using a Bayesian Markov chain Monte Carlo approach, we couple a two-dimensional vadose zone flow model with time-lapse ERT and borehole NMR datasets collected during and after flood irrigation experiments, together with soil physical lab analysis. The combination of synthetic models and field observations leads to flow path identification and allows for quantification of the volumes and timing, and associated uncertainties, of subsurface return flow that stems from flood irrigation. Quantifying the impact of soil heterogeneity enables us to translate these results to other sites and predict return flow under different soil physical settings. This is key when managing irrigation water resources and evaluating the predicted outcomes of different scenarios.
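The Bayesian MCMC coupling described above can be reduced to its skeleton: a Metropolis sampler that updates a soil parameter against observed data through a forward model. The forward model, parameter, and observation below are invented stand-ins, not the study's 2-D vadose zone model:

```python
import math
import random

def log_likelihood(ks, observed, noise_sd=0.05):
    """Gaussian log-likelihood of an (invented) observed drainage
    fraction given a conductivity-like parameter ks, using a toy
    forward model in place of the real vadose zone simulator."""
    predicted = 1.0 - math.exp(-ks)        # toy forward model
    return -0.5 * ((observed - predicted) / noise_sd) ** 2

random.seed(1)
observed = 0.6                             # hypothetical field observation
ks, ll = 1.0, log_likelihood(1.0, observed)
chain = []
for _ in range(20000):
    prop = ks + random.gauss(0.0, 0.2)     # random-walk proposal
    if prop > 0:                           # flat prior on ks > 0
        ll_prop = log_likelihood(prop, observed)
        if math.log(random.random()) < ll_prop - ll:   # Metropolis accept
            ks, ll = prop, ll_prop
    chain.append(ks)

# discard burn-in, then summarize the posterior
posterior_mean = sum(chain[5000:]) / len(chain[5000:])
```

In the real workflow each likelihood evaluation runs the flow model and compares against ERT/NMR data jointly, which is why the posterior also yields uncertainties on return-flow volumes and timing rather than a single best fit.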

  10. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  11. Circulating levels of 3-hydroxymyristate, a direct quantification of endotoxemia in non-infected cirrhotic patients.

    Science.gov (United States)

    Weil, Delphine; Pais de Barros, Jean-Paul; Mourey, Guillaume; Laheurte, Caroline; Cypriani, Benoit; Badet, Nicolas; Delabrousse, Eric; Grandclément, Emilie; Di Martino, Vincent; Saas, Philippe; Lagrost, Laurent; Thévenot, Thierry

    2018-06-22

    The quantification of lipopolysaccharide (LPS) in biological fluids is challenging. We aimed to measure plasma LPS concentration using a new method of direct quantification of 3-hydroxymyristate (3-HM), a lipid component of LPS, and to evaluate correlations between 3-HM and markers of liver function, endothelial activation, portal hypertension and enterocyte damage. Plasma from 90 non-infected cirrhotic patients (30 Child-Pugh [CP]-A, 30 CP-B, 30 CP-C) was prospectively collected. The concentration of 3-HM was determined by high-performance liquid chromatography coupled with mass spectrometry. 3-HM levels were higher in CP-C patients (CP-A/CP-B/CP-C: 68/70/103 ng/mL, p=0.005). Patients with severe acute alcoholic hepatitis (n=16; 113 vs 74 ng/mL, p=0.012), diabetic patients (n=22; 99 vs 70 ng/mL, p=0.028) and those not receiving beta-blockers (n=44; 98 vs 72 ng/mL, p=0.034) had higher levels of 3-HM. We observed a trend towards higher baseline levels of 3-HM in patients with hepatic encephalopathy (n=7; 144 vs 76 ng/mL, p=0.45) or SIRS (n=10; 106 vs 75 ng/mL, p=0.114). In multivariate analysis, high levels of 3-HM were associated with CP (OR=4.39; 95%CI=1.79-10.76) or MELD (OR=8.24; 95%CI=3.19-21.32) scores. Patients dying from liver insufficiency (n=6) during a 12-month follow-up had higher baseline levels of 3-HM (106 vs 75 ng/mL, p=0.089). In non-infected cirrhotic patients, higher 3-HM levels thus accompany impaired liver function, heavy alcohol consumption, diabetic status and non-use of beta-blockers, and a trend towards poorer outcome is also observed. The direct mass measurement of LPS using 3-HM appears reliable for detecting transient endotoxemia and promising for managing the follow-up of cirrhotic patients. This article is protected by copyright. All rights reserved.

  12. Stereotypical Escape Behavior in Caenorhabditis elegans Allows Quantification of Effective Heat Stimulus Level.

    Directory of Open Access Journals (Sweden)

    Kawai Leung

    2016-12-01

    A goal of many sensorimotor studies is to quantify the stimulus-behavioral response relation for specific organisms and specific sensory stimuli. This is especially important to do in the context of painful stimuli, since most animals in these studies cannot easily communicate to us their perceived levels of such noxious stimuli. Thus progress in studies of nociception and pain-like responses in animal models depends crucially on our ability to quantitatively and objectively infer the sensed levels of these stimuli from animal behaviors. Here we develop a quantitative model to infer the perceived level of heat stimulus from the stereotyped escape response of individual nematodes (Caenorhabditis elegans) stimulated by an IR laser. The model provides a method for quantification of analgesic-like effects of chemical stimuli or genetic mutations in C. elegans. We test ibuprofen-treated worms and a TRPV (transient receptor potential vanilloid) channel mutant, and we show that the perception of heat stimuli for the ibuprofen-treated worms is lower than for the wild-type. At the same time, our model shows that the mutation changes the worm's behavior beyond affecting the thermal sensory system. Finally, we determine the stimulus level that best distinguishes the analgesic-like effects and the minimum number of worms that allows for a statistically significant identification of these effects.

  13. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1) H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1) H-MRS metabolite quantification [...]. We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own [...] + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1) H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results [...]

  14. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross-entropy thresholding method. The proposed quantification method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.
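The gradient-then-threshold pipeline can be sketched on a synthetic phantom. Note the simplification: the paper selects the threshold automatically via cross-entropy, whereas this sketch uses a fixed threshold, and the 64x64 "phantom" with a signal-void disc is invented:

```python
import numpy as np

def artifact_area_percent(image, grad_thresh):
    """Quantify susceptibility-artifact extent as the percentage of
    image area whose gradient magnitude exceeds a threshold. A fixed
    threshold stands in for the paper's automated cross-entropy one."""
    gy, gx = np.gradient(image.astype(float))   # abrupt signal changes
    grad = np.hypot(gx, gy)
    mask = grad > grad_thresh
    return 100.0 * mask.sum() / mask.size

# synthetic phantom: uniform signal with a circular signal void
img = np.full((64, 64), 200.0)
yy, xx = np.mgrid[:64, :64]
img[(yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2] = 0.0
pct = artifact_area_percent(img, grad_thresh=20.0)
```

Only the void's rim has a large gradient, so the metric reports a small area percentage; a real pile-up artifact would add bright, high-gradient regions to the mask as well.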

  15. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess the variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was

  16. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  17. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms such as those developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.

  18. Quantification accuracy and partial volume effect in dependence of the attenuation correction of a state-of-the-art small animal PET scanner

    International Nuclear Information System (INIS)

    Mannheim, Julia G; Judenhofer, Martin S; Schmid, Andreas; Pichler, Bernd J; Tillmanns, Julia; Stiller, Detlef; Sossi, Vesna

    2012-01-01

    Quantification accuracy and partial volume effect (PVE) of the Siemens Inveon PET scanner were evaluated. The influence of transmission source activities (40 and 160 MBq) on the quantification accuracy and the PVE were determined. Dynamic range, object size and PVE for different sphere sizes, contrast ratios and positions in the field of view (FOV) were evaluated. The acquired data were reconstructed using different algorithms and correction methods. The activity level of the transmission source and the total emission activity in the FOV strongly influenced the attenuation maps. Reconstruction algorithms, correction methods, object size and location within the FOV had a strong influence on the PVE in all configurations. All evaluated parameters potentially influence the quantification accuracy. Hence, all protocols should be kept constant during a study to allow a comparison between different scans. (paper)

  19. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.

  20. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  1. Development of the quantification procedures for in situ XRF analysis

    International Nuclear Information System (INIS)

    Kump, P.; Necemer, M.; Rupnik, P.

    2005-01-01

For in situ XRF applications, two excitation systems (radioisotope and tube excited) and an X-ray spectrometer based on an Si-PIN detector were assembled and used. The radioisotope excitation system with an Am-241 source was assembled into a prototype of a compact XRF analyser, PEDUZO-01, which is also applicable to field work. The existing quantification software QAES (quantitative analysis of environmental samples) was assessed and found adequate for field work as well. The QAES software was also integrated into new software attached to the developed XRF analyser PEDUZO-01, which includes spectrum acquisition, spectrum analysis and quantification and runs in the LabVIEW environment. To assess the Si-PIN based X-ray spectrometers and the QAES quantification software in field work, a comparison was made with the results obtained by a standard Si(Li) based spectrometer. The results of this study prove that the use of this spectrometer is adequate for field work. This work was accepted for publication in X-Ray Spectrometry. A simple preparation procedure for solid samples was also studied in view of the analytical results obtained. It was established that, under defined conditions, the results do not differ greatly from those obtained with a homogenized sample pressed into a pellet. The influence of particle size and mineralogical effects on quantitative results was studied, and a simple sample preparation kit was proposed. Sample preparation for the analysis of water samples by precipitation with APDC, and aerosol analysis using a dichotomous sampler, were also adapted and used in the field work, and an adequate sample preparation kit was proposed. (author)

  2. Quantification by qPCR of Pathobionts in Chronic Periodontitis: Development of Predictive Models of Disease Severity at Site-Specific Level

    OpenAIRE

    Tomás, Inmaculada; Regueira-Iglesias, Alba; López, Maria; Arias-Bujanda, Nora; Novoa, Lourdes; Balsa-Castro, Carlos; Tomás, Maria

    2017-01-01

    Currently, there is little evidence available on the development of predictive models for the diagnosis or prognosis of chronic periodontitis based on the qPCR quantification of subgingival pathobionts. Our objectives were to: (1) analyze and internally validate pathobiont-based models that could be used to distinguish different periodontal conditions at site-specific level within the same patient with chronic periodontitis; (2) develop nomograms derived from predictive models. Subgingival pl...

  3. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

The present contribution reviews recent quantification work on atom displacements, atom site occupations and level of crystallinity in various systems, based on aberration corrected HR(S)TEM images. Depending on the case studied, picometer-range precisions for individual distances can be obtained, boundary widths determined at the unit cell level, or statistical evolutions of the fractions of ordered areas calculated. In all of these cases, these quantitative measures open new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at a ferroelastic twin boundary in CaTiO{sub 3}. • Quantification of kinks in a meandering ferroelectric domain wall in LiNbO{sub 3}. • Quantification of column occupation at an anti-phase boundary in Co-Pt. • Quantification of atom displacements at a twin boundary in Ni-Ti B19′ martensite.

  4. Method evaluation of Fusarium DNA extraction from mycelia and wheat for down-stream real-time PCR quantification and correlation to mycotoxin levels.

    Science.gov (United States)

    Fredlund, Elisabeth; Gidlund, Ann; Olsen, Monica; Börjesson, Thomas; Spliid, Niels Henrik Hytte; Simonsson, Magnus

    2008-04-01

Identification of Fusarium species by traditional methods requires specific skill and experience, and there is increasing interest in new molecular methods for the identification and quantification of Fusarium in food and feed samples. Real-time PCR with probe technology (TaqMan) can be used for the identification and quantification of several species of Fusarium in cereal grain samples. Several critical steps need to be considered when establishing a real-time PCR-based method for DNA quantification, including extraction of DNA from the samples. In this study, several DNA extraction methods were evaluated, including the DNeasy Plant Mini Spin Columns (Qiagen), the BioRobot EZ1 (Qiagen) with the DNeasy Blood and Tissue Kit (Qiagen), and the FastDNA Spin Kit for Soil (Qbiogene). Parameters such as DNA quality and stability, PCR inhibitors, and PCR efficiency were investigated. Our results showed that all methods gave good PCR efficiency (above 90%) and DNA stability, whereas the DNeasy Plant Mini Spin Columns in combination with sonication gave the best results with respect to Fusarium DNA yield. The modified DNeasy Plant Mini Spin protocol was used to analyse 31 wheat samples for the presence of F. graminearum and F. culmorum. The DNA level of F. graminearum could be correlated to the levels of DON (r² = 0.9) and ZEN (r² = 0.6), whereas no correlation was found between F. culmorum and DON/ZEN. This shows that F. graminearum, and not F. culmorum, was the main producer of DON in Swedish wheat during 2006.
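The correlation step reported above (an r² between F. graminearum DNA level and DON content) reduces to an ordinary least-squares fit; a minimal sketch follows, with entirely hypothetical numbers standing in for the study's measurements:

```python
def r_squared(x, y):
    """Coefficient of determination for a simple linear fit y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((yi - (a * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Hypothetical paired measurements: F. graminearum DNA (pg/mg) vs DON (ppb)
dna = [0.5, 1.2, 2.0, 3.1, 4.5]
don = [110, 240, 400, 590, 880]
print(round(r_squared(dna, don), 2))
```

A strong linear relation between the two series yields an r² near 1, mirroring the r² = 0.9 reported for DON.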

  5. A facile and sensitive method for quantification of cyclic nucleotide monophosphates in mammalian organs: basal levels of eight cNMPs and identification of 2',3'-cIMP.

    Science.gov (United States)

    Jia, Xin; Fontaine, Benjamin M; Strobel, Fred; Weinert, Emily E

    2014-12-12

A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) by LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems such as matrix effects from complex biological samples have been addressed and the protocol optimized accordingly. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMP levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways.

  6. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  7. The Quantification Process for the PRiME-U34i

    International Nuclear Information System (INIS)

    Hwang, Mee-Jeong; Han, Sang-Hoon; Yang, Joon-Eon

    2006-01-01

In this paper, we introduce the quantification process for the PRiME-U34i, the merged model of ETs (Event Trees) and FTs (Fault Trees) for the level 1 internal PSA of UCN 3 and 4. PRiME-U34i has one top event, so the quantification process is simpler than the previous one. In the past, we used a text file called a user file to control the quantification process; this user file is so complicated that it is difficult for a non-expert to understand. Moreover, in the past PSA the ETs and FTs were separate, whereas in PRiME-U34i they are merged, so the quantification process differs. This paper is composed of five sections. In section 2, we introduce the construction of the one-top model. Section 3 shows the quantification process used in the PRiME-U34i. Section 4 describes the post-processing, and the last section presents the conclusions.

  8. Direct quantification of human cytomegalovirus immediate-early and late mRNA levels in blood of lung transplant recipients by competitive nucleic acid sequence-based amplification

    NARCIS (Netherlands)

    Greijer, AE; Verschuuren, EAM; Harmsen, MC; Dekkers, CAJ; Adriaanse, HMA; The, TH; Middeldorp, JM

The dynamics of active human cytomegalovirus (HCMV) infection were monitored by competitive nucleic acid sequence-based amplification (NASBA) assays for quantification of IE1 (UL123) and pp67 (UL65) mRNA expression levels in the blood of patients after lung transplantation. RNA was isolated from 339

  9. Differentiated-effect shims for medium field levels and saturation

    International Nuclear Information System (INIS)

    Richie, A.

    1976-01-01

The arrangement of shims on the upstream and downstream ends of magnets may be based on the independent effects of variations in the geometric length and the degree of saturation at the edges of the poles. This technique can be used to match the bending strength of an accelerator's magnets at two field levels (medium and maximum fields) and thus reserve special procedures (mixing the laminations, local compensation for errors by arranging the magnets in an appropriate order) and special devices (for instance, correcting dipoles) solely for correcting bending strengths at low field levels. (Auth.)

  10. A Facile and Sensitive Method for Quantification of Cyclic Nucleotide Monophosphates in Mammalian Organs: Basal Levels of Eight cNMPs and Identification of 2',3'-cIMP

    Directory of Open Access Journals (Sweden)

    Xin Jia

    2014-12-01

A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) by LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems such as matrix effects from complex biological samples have been addressed and the protocol optimized accordingly. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMP levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways.

  11. Digital Quantification of Goldmann Visual Fields (GVF) as a Means for Genotype-Phenotype Comparisons and Detection of Progression in Retinal Degenerations

    Science.gov (United States)

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John; Jayasundera, Thiran

    2015-01-01

Purpose: To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. Methods: A novel method to digitally quantify GVFs using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; EngineerSupply, Lynchburg, Virginia, USA). GVFs from 20 eyes of 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Results: Overall, measurements using Adobe Photoshop were performed more rapidly than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intra-observer variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were close in age had similar qualitative and quantitative areas. Conclusions: Quantification of GVFs using Adobe Photoshop is quicker, more reliable, and less user-dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases. PMID:24664690

  12. Digital quantification of Goldmann visual fields (GVFs) as a means for genotype-phenotype comparisons and detection of progression in retinal degenerations.

    Science.gov (United States)

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John R; Jayasundera, Thiran

    2014-01-01

    To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. A novel method to digitally quantify GVFs using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; Engineer Supply, Lynchburg, Virginia, USA). GVFs from 20 eyes from 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Overall, measurements using Adobe Photoshop were more rapidly performed than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intraobserver variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were nearby in age had similar qualitative and quantitative areas. Quantification of GVFs using Adobe Photoshop is quicker, more reliable, and less user dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases.
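The planimetric step in both studies, turning a traced isopter outline into an area, reduces to the shoelace formula once the GVF contour is digitized as ordered vertex coordinates. A minimal sketch (the coordinates below are hypothetical, not patient data):

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon from (x, y) vertices in order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]   # wrap around to close the contour
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical digitized I4e isopter, in chart coordinates (e.g. pixels)
isopter = [(0, 0), (4, 0), (4, 3), (0, 3)]
print(polygon_area(isopter))  # 12.0 for this rectangle
```

Pixel areas from such a digitization can then be converted to square degrees with the chart's calibration factor.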

  13. Observation and quantification of the quantum dynamics of a strong-field excited multi-level system.

    Science.gov (United States)

    Liu, Zuoye; Wang, Quanjun; Ding, Jingjie; Cavaletto, Stefano M; Pfeifer, Thomas; Hu, Bitao

    2017-01-04

    The quantum dynamics of a V-type three-level system, whose two resonances are first excited by a weak probe pulse and subsequently modified by another strong one, is studied. The quantum dynamics of the multi-level system is closely related to the absorption spectrum of the transmitted probe pulse and its modification manifests itself as a modulation of the absorption line shape. Applying the dipole-control model, the modulation induced by the second strong pulse to the system's dynamics is quantified by eight intensity-dependent parameters, describing the self and inter-state contributions. The present study opens the route to control the quantum dynamics of multi-level systems and to quantify the quantum-control process.

  14. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons are attenuated by some light materials, such as hydrogen, but penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigating water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for testing the quantification technique. Transmission images of the test sample at different angles were acquired with a dedicated image-acquisition computer driving a rotary-table controller through an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  15. Evaluating 239Pu levels using field detectors

    International Nuclear Information System (INIS)

    Wahl, L.E.; Smith, W.J. II; Martin, B.

    1996-01-01

At Los Alamos National Laboratory, cleanup was planned at three septic tanks where surface soil in the outfall drainage areas was found to be contaminated with 239Pu. To meet budget and deadline constraints, a technique was developed that used field instruments to verify 239Pu soil contamination at levels less than 2.8 Bq/g, the established cleanup level. The drainage areas were surveyed using a low-energy gamma probe to identify likely areas of 239Pu contamination. Between 40 and 135 0.1-min gamma radiation measurements were obtained from each drainage area. From these data, locations were identified for subsequent screening for alpha radioactivity. Soil samples from between 11 and 18 locations at each drainage area were placed in petri dishes, dried, and counted for 10 minutes using an alpha probe. Alpha counts were then related to 239Pu concentrations using a curve developed from local soils containing known concentrations of 239Pu. Up to six soil samples from each drainage area, representing a range of alpha radioactivity levels, were sent for laboratory analysis of isotopic plutonium to confirm field measurement results. Analytical and field results correlated well at all but one outfall area. At this area, field measurements predicted more 239Pu than was measured in the laboratory, indicating the presence of another alpha-emitting radionuclide that might have been missed if only laboratory analyses for plutonium had been used. This technique, which combined a large number of gamma radioactivity measurements, a moderate number of alpha radioactivity measurements, and a few isotopic plutonium measurements, allowed quick and inexpensive comparison of 239Pu with the cleanup level
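The counts-to-concentration step described above can be sketched as a linear calibration against soil standards of known 239Pu activity. All numbers below are illustrative, not the study's actual calibration curve:

```python
def fit_line(counts, conc):
    """Least-squares slope and intercept from calibration standards."""
    n = len(counts)
    mx, my = sum(counts) / n, sum(conc) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(counts, conc))
         / sum((x - mx) ** 2 for x in counts))
    return a, my - a * mx

# Hypothetical standards: 10-min alpha counts vs known 239Pu conc. (Bq/g)
std_counts = [20, 60, 140, 300]
std_conc = [0.5, 1.5, 3.5, 7.5]
a, b = fit_line(std_counts, std_conc)

def counts_to_conc(c):
    """Convert a field alpha count to an estimated 239Pu concentration."""
    return a * c + b

# Screen a field sample against the 2.8 Bq/g cleanup level
sample = counts_to_conc(100)
print(sample, sample < 2.8)  # ≈ 2.5, below the 2.8 Bq/g cleanup level
```

In practice a confidence margin would be applied before declaring a sample below the cleanup level; the confirmatory laboratory analyses in the study serve exactly that role.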

  16. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    Agricultural Research Service 2011), which aim to improve consistency of field measurement and data collection for soil carbon sequestration and soil nitrous oxide fluxes. Often these national-level activity data and emissions factors are the basis for regional and smaller-scale applications. Such data are used for model-based estimates of changes in GHGs at a project or regional level (Olander et al 2011). To complement national data for regional-, landscape-, or field-level applications, new data are often collected through farmer knowledge or records and field sampling. Ideally such data could be collected in a standardized manner, perhaps through some type of crowd sourcing model to improve regional—and national—level data, as well as to improve consistency of locally collected data. Data can also be collected by companies working with agricultural suppliers and in country networks, within efforts aimed at understanding firm and product (supply-chain) sustainability and risks (FAO 2009). Such data may feed into various certification processes or reporting requirements from buyers. Unfortunately, this data is likely proprietary. A new process is needed to aggregate and share private data in a way that would not be a competitive concern so such data could complement or supplement national data and add value. A number of papers in this focus issue discuss issues surrounding quantification methods and systems at large scales, global and national levels, while others explore landscape- and field-scale approaches. A few explore the intersection of top-down and bottom-up data measurement and modeling approaches. 5. The agricultural greenhouse gas quantification project and ERL focus issue Important land management decisions are often made with poor or few data, especially in developing countries. Current systems for quantifying GHG emissions are inadequate in most low-income countries, due to a lack of funding, human resources, and infrastructure. 
Most non-Annex 1 countries

  17. Zero-field magnetic response functions in Landau levels

    Science.gov (United States)

    Gao, Yang; Niu, Qian

    2017-07-01

We present a fresh perspective on the Landau level quantization rule: by successively including zero-field magnetic response functions at zero temperature, such as the zero-field magnetization and susceptibility, Onsager’s rule can be corrected order by order. Such a perspective is further reinterpreted as a quantization of the semiclassical electron density in solids. Our theory not only reproduces Onsager’s rule at zeroth order and the Berry phase and magnetic moment correction at first order but also explains the nature of higher-order corrections in a universal way. In applications, those higher-order corrections are expected to curve the linear relation between the level index and the inverse of the magnetic field, as already observed in experiments. Our theory then provides a way to extract the correct value of the Berry phase as well as the magnetic susceptibility at zero temperature from Landau level fan diagrams in experiments. Moreover, it can be used theoretically to calculate Landau levels up to second-order accuracy for realistic models.
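Schematically, the order-by-order correction described here can be written as follows (symbols follow common semiclassical convention; the precise form of the higher-order coefficients is derived in the paper):

```latex
% Onsager's rule with successive zero-field response corrections.
% S(E_n) is the extremal cyclotron-orbit area in k-space at energy E_n.
\[
  S(E_n)\,\frac{\hbar}{2\pi e B}
  = n + \frac{1}{2}
  - \frac{\Gamma}{2\pi}   % first order: Berry phase and orbital magnetic moment
  + \mathcal{O}(B)        % second order: involves the zero-field susceptibility
\]
% The O(B) term curves the otherwise linear n versus 1/B fan diagram,
% which is how the corrections manifest in experiment.
```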

  18. New Detection Modality for Label-Free Quantification of DNA in Biological Samples via Superparamagnetic Bead Aggregation

    Science.gov (United States)

    Leslie, Daniel C.; Li, Jingyi; Strachan, Briony C.; Begley, Matthew R.; Finkler, David; Bazydlo, Lindsay L.; Barker, N. Scott; Haverstick, Doris; Utz, Marcel; Landers, James P.

    2012-01-01

    Combining DNA and superparamagnetic beads in a rotating magnetic field produces multiparticle aggregates that are visually striking, and enables label-free optical detection and quantification of DNA at levels in the picogram per microliter range. DNA in biological samples can be quantified directly by simple analysis of optical images of microfluidic wells placed on a magnetic stirrer without DNA purification. Aggregation results from DNA/bead interactions driven either by the presence of a chaotrope (a nonspecific trigger for aggregation) or by hybridization with oligonucleotides on functionalized beads (sequence-specific). This paper demonstrates quantification of DNA with sensitivity comparable to that of the best currently available fluorometric assays. The robustness and sensitivity of the method enable a wide range of applications, illustrated here by counting eukaryotic cells. Using widely available and inexpensive benchtop hardware, the approach provides a highly accessible low-tech microscale alternative to more expensive DNA detection and cell counting techniques. PMID:22423674

  19. Quantification of taurine in energy drinks using ¹H NMR.

    Science.gov (United States)

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed a distinct, well-separated taurine signal in ¹H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test at a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed by both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Given the high agreement with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR is a suitable method to quantify taurine in energy drinks.
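At its core, the PULCON step scales an external reference of known concentration by the integral ratio and the number of contributing protons. A simplified sketch, ignoring pulse-length and receiver-gain corrections that full PULCON includes; the integrals and concentrations below are hypothetical:

```python
def pulcon_conc(i_sample, i_ref, c_ref, n_sample, n_ref):
    """Estimate analyte concentration from the NMR integral ratio against an
    external reference of known concentration (simplified PULCON: assumes
    equal receiver gain, scan count, and 90-degree pulse calibration)."""
    return c_ref * (i_sample / i_ref) * (n_ref / n_sample)

# Hypothetical integrals: taurine CH2 triplet (2 protons) vs a reference
# singlet (3 protons) from a 10 mM external standard
c = pulcon_conc(i_sample=450.0, i_ref=300.0, c_ref=10.0, n_sample=2, n_ref=3)
print(c)  # 22.5 mM
```

Full PULCON multiplies in the ratio of 90° pulse lengths and receiver gains between the two experiments; the proton-count normalization shown here is the part that carries over unchanged.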

  20. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

Assuming an approximate effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC): a simple real equation, composed of trigonometric and hyperbolic functions, that requires no programming effort or sophisticated machinery to solve. For heterostructures of fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures and to build the subband point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
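The paper's own EQC is not reproduced here, but the "build the subband point by point" idea can be illustrated with a Kronig-Penney-style condition that likewise mixes trigonometric (well) and hyperbolic (barrier) terms. Parameters are hypothetical and in reduced units (ħ = m = 1), not the GaAs/GaAlAs values:

```python
import math

def F(E, V=0.3, a=1.0, b=0.4, m=1.0, hbar=1.0):
    """Kronig-Penney dispersion function for E < V: energies with |F(E)| <= 1
    belong to an allowed miniband."""
    k = math.sqrt(2 * m * E) / hbar            # wave number in the well
    kap = math.sqrt(2 * m * (V - E)) / hbar    # decay constant in the barrier
    return (math.cos(k * a) * math.cosh(kap * b)
            + (kap**2 - k**2) / (2 * k * kap)
            * math.sin(k * a) * math.sinh(kap * b))

# Build the subband "point by point": scan energies and keep allowed points
energies = [i * 0.001 for i in range(1, 300)]   # 0 < E < V
allowed = [E for E in energies if abs(F(E)) <= 1.0]
print(len(allowed) > 0)
```

Each retained energy is one point of the miniband; refining the scan step refines the computed band edges, exactly in the spirit of the point-by-point construction described in the abstract.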

  1. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR), the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft itself. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to trigger stent graft failure and cause the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The procedure comprises initial rigid alignment of the two abdominal scans, segmentation of the abdominal vessel trees, and automatic reduction of their centerline structures to a specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation, and thereby eliminate all remaining global rigid transformations. Finally, deformations at specified locations can be calculated from the resulting displacement fields. To evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from EVAR treatment. A visual assessment of the registration results and an evaluation of the deformation quantification were performed by two vascular surgeons and one interventional radiologist, all experts in EVAR procedures.
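The final step, reading deformation values at specified locations out of a dense displacement field, is straightforward once the registration has run; a minimal sketch (the field here is synthetic noise standing in for a registration result, not the authors' pipeline):

```python
import numpy as np

# Synthetic 3-D displacement field: one (dx, dy, dz) vector per voxel, in mm
rng = np.random.default_rng(0)
field = rng.normal(0.0, 1.5, size=(64, 64, 64, 3))

def deformation_at(field, points):
    """Euclidean displacement magnitude at integer voxel locations."""
    return [float(np.linalg.norm(field[tuple(p)])) for p in points]

# Hypothetical centerline sample points along the aorta
centerline = [(10, 20, 30), (11, 21, 31), (12, 22, 32)]
print(deformation_at(field, centerline))
```

A real pipeline would interpolate the field at sub-voxel centerline positions rather than index whole voxels, but the per-location magnitude readout is the same.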

  2. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL⁻¹ humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL⁻¹. The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, the copy number of archaeal 16S rRNA genes was 1.04×10³ copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10² copies/g-sediment, most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in the accurate quantification of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
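The absolute quantification behind digital PCR rests on Poisson statistics over the microfluidic partitions: if a fraction p of partitions scores positive, the mean copy number per partition is λ = −ln(1 − p). A minimal sketch, with hypothetical partition counts and partition volume:

```python
import math

def dpcr_copies_per_ul(n_positive, n_total, partition_nl):
    """Absolute target concentration (copies/µL) from digital PCR counts.
    Poisson correction: mean copies per partition lam = -ln(1 - p),
    where p is the fraction of positive partitions.
    partition_nl is the (assumed) partition volume in nanolitres."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / (partition_nl * 1e-3)   # nL -> µL

# Hypothetical run: 300 positives out of 765 chambers of 6 nL each
conc = dpcr_copies_per_ul(300, 765, 6.0)
print(round(conc, 1), "copies/µL")
```

The correction matters because a single partition can hold more than one copy; counting positives without it would bias the result low at high occupancy.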

  3. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.

  4. Level sets and extrema of random processes and fields

    CERN Document Server

    Azais, Jean-Marc

    2009-01-01

    A timely and comprehensive treatment of random field theory with applications across diverse areas of study Level Sets and Extrema of Random Processes and Fields discusses how to understand the properties of the level sets of paths as well as how to compute the probability distribution of its extremal values, which are two general classes of problems that arise in the study of random processes and fields and in related applications. This book provides a unified and accessible approach to these two topics and their relationship to classical theory and Gaussian processes and fields, and the most modern research findings are also discussed. The authors begin with an introduction to the basic concepts of stochastic processes, including a modern review of Gaussian fields and their classical inequalities. Subsequent chapters are devoted to Rice formulas, regularity properties, and recent results on the tails of the distribution of the maximum. Finally, applications of random fields to various areas of mathematics a...

  5. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO) with sunflower oil (SnF) and with safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods, including gas chromatography, HPLC, ultrasonic velocity and methods based on physicochemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent variables, with the %PRBO proportion as the independent variable. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. Rice bran oil can easily be quantified in blended oils based on the oryzanol content by HPLC, even at the 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level, whereas the methods based on ultrasonic velocity, acoustic impedance and relative association showed initial promise in the quantification of rice bran oil. (Author) 23 refs.
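The regression-based quantification described above amounts to fitting a calibration line between a marker (e.g. oryzanol content) and the known %PRBO, then inverting it for an unknown blend. The calibration values below are illustrative, not the study's data:

```python
import numpy as np

# Hypothetical calibration set: oryzanol content (%) measured for blends
# with known rice bran oil proportions; illustrative values only.
prbo_pct = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])
oryzanol = np.array([0.00, 0.32, 0.63, 0.95, 1.27, 1.58])

# Least-squares calibration line, then invert it to quantify an unknown blend
slope, intercept = np.polyfit(prbo_pct, oryzanol, 1)

def estimate_prbo(oryzanol_measured):
    """%PRBO predicted from a measured oryzanol content via the fitted line."""
    return (oryzanol_measured - intercept) / slope

est = estimate_prbo(0.79)   # mid-range unknown sample
print(f"estimated PRBO: {est:.1f}%")
```

The same inversion applies to any of the other markers (palmitic acid, ultrasonic velocity, iodine value) once its own regression equation is fitted.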

  6. System-Level Power Consumption Analysis of the Wearable Asthmatic Wheeze Quantification

    Directory of Open Access Journals (Sweden)

    Dinko Oletic

    2018-01-01

    Long-term quantification of asthmatic wheezing envisions an m-Health sensor system consisting of a smartphone and a body-worn wireless acoustic sensor. As both devices are power constrained, the main criterion guiding the system design comes down to minimizing power consumption while retaining sufficient respiratory sound classification accuracy (i.e., wheeze detection). Crucial for assessing the system-level power consumption is an understanding of the trade-off between the power cost of computationally intensive local processing and that of communication. Therefore, we analyze the power requirements of signal acquisition, processing, and communication in three typical operating scenarios: (1) streaming of the uncompressed respiratory signal to a smartphone for classification, (2) signal streaming utilizing compressive sensing (CS) for reduction of the data rate, and (3) respiratory sound classification onboard the wearable sensor. The study shows that the third scenario, featuring the lowest communication cost, enables the lowest total sensor system power consumption, ranging from 328 to 428 μW. In this scenario, the 32-bit ARM Cortex M3/M4 cores typically embedded within Bluetooth 4 SoC modules offer the optimal trade-off between onboard classification performance and consumption. On the other hand, the study confirms that CS enables the most power-efficient design of the wearable sensor (216 to 357 μW) in the compressed signal streaming of the second scenario. In that case, a single low-power ARM Cortex-A53 core is sufficient for simultaneous real-time CS reconstruction and classification on the smartphone, while keeping the total system power within the budget for uncompressed streaming.

  7. Anti-levitation of Landau levels in vanishing magnetic fields

    Science.gov (United States)

    Pan, W.; Baldwin, K. W.; West, K. W.; Pfeiffer, L. N.; Tsui, D. C.

    Soon after the discovery of the quantum Hall effects in two-dimensional electron systems, the question of the fate of the extended states in a Landau level in a vanishing magnetic (B) field arose. Many theoretical models have since been proposed, and experimental results remain inconclusive. In this talk, we report the experimental observation of anti-levitation behavior of Landau levels in vanishing B fields (down to as low as B ≈ 58 mT) in a high quality heterojunction insulated-gate field-effect transistor (HIGFET). We observed that, in the Landau fan diagram of electron density versus magnetic field, the positions of the magneto-resistance minima at Landau level fillings ν = 4, 5, 6 move below the "traditional" Landau level line to lower electron densities. This clearly differs from what was observed in earlier experiments, where in the same Landau fan plot the density moved up. Our result strongly supports the anti-levitation behavior predicted recently. Moreover, the even and odd Landau level filling states show quantitatively different behaviors in anti-levitation, suggesting that the exchange interactions, which are important at odd fillings, may play a role. SNL is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  8. Nickel quantification in serum by a validated sector-field inductively coupled plasma mass spectrometry method: Assessment of tentative reference values for an Italian population.

    Science.gov (United States)

    Bocca, Beatrice; Forte, Giovanni; Ronchi, Anna; Gaggeri, Raffaella; Alimonti, Alessandro; Minoia, Claudio

    2006-01-01

    The daily exposure to Ni from food, industrial processes, jewellery and coins makes the determination of Ni in human serum an important way to monitor health status in non-occupationally exposed subjects. To this end, a method based on sector-field inductively coupled plasma mass spectrometry was developed and validated. The limits of detection (LoD) and quantification (LoQ), sensitivity, linearity range, trueness, repeatability, within-laboratory reproducibility and robustness were the aspects considered in the validation process. The uncertainty associated with the measurements was also calculated, according to the Eurachem/Citac Guide. The method LoD and LoQ were 0.03 and 0.09 ng mL⁻¹, linearity extended over two orders of magnitude, trueness was -3.57%, and the repeatability and reproducibility showed relative standard deviations of 4.56% and 6.52%, respectively. The relative expanded uncertainty was 21.8% at the Ni levels found in the general population. The tentative reference value for serum Ni was 0.466 ± 0.160 ng mL⁻¹, with a related interval between 0.226 and 1.026 ng mL⁻¹. Copyright 2006 John Wiley & Sons, Ltd.
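LoD/LoQ figures of the kind reported here are conventionally derived from the standard deviation of replicate blanks and the calibration sensitivity (LoD = 3.3σ/slope, LoQ = 10σ/slope). A sketch with hypothetical blank replicates and an assumed unit slope:

```python
import statistics

# Hypothetical replicate blank signals (in ng/mL-equivalent units);
# illustrative values only, not the paper's data.
blanks = [0.010, 0.012, 0.008, 0.011, 0.009,
          0.010, 0.013, 0.009, 0.011, 0.010]

s_blank = statistics.stdev(blanks)   # standard deviation of the blank
slope = 1.0                          # assumed calibration sensitivity

# Common ICH-style convention for detection and quantification limits
lod = 3.3 * s_blank / slope
loq = 10.0 * s_blank / slope
print(f"LoD = {lod:.4f} ng/mL, LoQ = {loq:.4f} ng/mL")
```

By construction LoQ/LoD is fixed at 10/3.3 ≈ 3, which is roughly the ratio (0.09/0.03) seen in the validated method above.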

  9. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography, definition and challenges; quantification biasing phenomena. 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET). 3 - Synthesis: achievable accuracy, know-how, precautions, beyond the activity measurement.

  10. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it must combine all event tree and fault tree models, and it requires an efficient computer code because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs.

  11. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it must combine all event tree and fault tree models, and it requires an efficient computer code because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs
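The core of accident sequence quantification, combining event tree and fault tree models into minimal cut sets and summing their probabilities, can be sketched with the first-order (rare-event) approximation. The basic events and probabilities below are hypothetical and are not taken from KIRAP's models:

```python
# Hypothetical basic-event probabilities (per-demand / per-year, illustrative)
basic_events = {
    "IE": 1e-2,       # initiating event frequency
    "PUMP_A": 3e-3,   # failure of pump train A
    "PUMP_B": 3e-3,   # failure of pump train B
    "DG_FAIL": 5e-3,  # diesel generator failure
}

# Minimal cut sets produced by combining the event tree and fault tree logic
min_cut_sets = [
    ["IE", "PUMP_A", "PUMP_B"],   # both redundant pumps fail
    ["IE", "DG_FAIL"],            # support system failure
]

def quantify(cut_sets, probs):
    """Rare-event (first-order) approximation: sum of cut-set products.
    Valid when individual cut-set probabilities are small."""
    freq = 0.0
    for cs in cut_sets:
        p = 1.0
        for ev in cs:
            p *= probs[ev]
        freq += p
    return freq

cdf = quantify(min_cut_sets, basic_events)
print(f"core damage frequency ~ {cdf:.2e} per year")
```

Importance and uncertainty analyses then perturb the basic-event values and re-run this same quantification.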

  12. Crystalline Electric Field Levels in the Neodymium Monopnictides Determined by Neutron Spectroscopy

    DEFF Research Database (Denmark)

    Furrer, A.; Kjems, Jørgen; Vogt, O.

    1972-01-01

    Neutron inelastic scattering experiments have been carried out to determine the energies and widths of the crystalline electric field levels in the neodymium monopnictides NdP, NdAs, and NdSb. The energy level sequence is derived from the observed crystal field transition peak intensities, which are in good agreement with calculations based on elementary crystal field theory. The energy level widths are qualitatively discussed. It is found that the point-charge model cannot reproduce the crystal field levels satisfactorily.

  13. Tree-level correlations in the strong field regime

    Science.gov (United States)

    Gelis, François

    2017-09-01

    We consider the correlation function of an arbitrary number of local observables in quantum field theory, in situations where the field amplitude is large. Using a quasi-classical approximation (valid for a highly occupied initial mixed state, or for a coherent initial state if the classical dynamics has instabilities), we show that at tree level these correlations are dominated by fluctuations at the initial time. We obtain a general expression of the correlation functions in terms of the classical solution of the field equation of motion and its derivatives with respect to its initial conditions, that can be arranged graphically as the sum of labeled trees where the nodes are the individual observables, and the links are pairs of derivatives acting on them. For 3-point (and higher) correlation functions, there are additional tree-level terms beyond the quasi-classical approximation, generated by fluctuations in the bulk.

  14. NEW MODEL FOR QUANTIFICATION OF ICT DEPENDABLE ORGANIZATIONS RESILIENCE

    Directory of Open Access Journals (Sweden)

    Zora Arsovski

    2011-03-01

    Today's business environment demands highly reliable organizations in every segment in order to be competitive on the global market. Besides that, the ICT sector is becoming irreplaceable in many fields of business, from communication to complex systems for process control and production. To fulfill those requirements and to develop further, many organizations worldwide are implementing a business paradigm called organizational resilience. Although resilience is a well-known term in many fields of science, it is not well studied due to its complex nature. This paper deals with developing a new model for the assessment and quantification of the resilience of ICT-dependable organizations.

  15. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  16. Quantitative composition determination at the atomic level using model-based high-angle annular dark field scanning transmission electron microscopy

    International Nuclear Information System (INIS)

    Martinez, G.T.; Rosenauer, A.; De Backer, A.; Verbeeck, J.; Van Aert, S.

    2014-01-01

    High angle annular dark field scanning transmission electron microscopy (HAADF STEM) images provide sample information which is sensitive to the chemical composition. The image intensities indeed scale with the mean atomic number Z. To some extent, chemically different atomic column types can therefore be visually distinguished. However, in order to quantify the atomic column composition with high accuracy and precision, model-based methods are necessary. Therefore, an empirical incoherent parametric imaging model can be used of which the unknown parameters are determined using statistical parameter estimation theory (Van Aert et al., 2009, [1]). In this paper, it will be shown how this method can be combined with frozen lattice multislice simulations in order to evolve from a relative toward an absolute quantification of the composition of single atomic columns with mixed atom types. Furthermore, the validity of the model assumptions are explored and discussed. - Highlights: • A model-based method is extended from a relative toward an absolute quantification of chemical composition of single atomic columns from HAADF HRSTEM images. • The methodology combines statistical parameter estimation theory with frozen lattice multislice simulations to quantify chemical composition atomic column by atomic column. • Validity and limitations of this model-based method are explored and discussed. • Quantification results obtained for a complex structure show agreement with EDX refinement

  17. Determining the optimal number of individual samples to pool for quantification of average herd levels of antimicrobial resistance genes in Danish pig herds using high-throughput qPCR

    DEFF Research Database (Denmark)

    Clasen, Julie; Mellerup, Anders; Olsen, John Elmerdahl

    2016-01-01

    The primary objective of this study was to determine the minimum number of individual fecal samples to pool together in order to obtain a representative sample for herd level quantification of antimicrobial resistance (AMR) genes in a Danish pig herd, using a novel high-throughput qPCR assay...

  18. Tile-Level Annotation of Satellite Images Using Multi-Level Max-Margin Discriminative Random Field

    Directory of Open Access Journals (Sweden)

    Hong Sun

    2013-05-01

    This paper proposes a multi-level max-margin discriminative analysis (M3DA) framework, which takes both coarse and fine semantics into consideration, for the annotation of high-resolution satellite images. In order to generate more discriminative topic-level features, M3DA uses the maximum entropy discrimination latent Dirichlet allocation (MedLDA) model. Moreover, to improve the spatial coherence of visual words neglected by M3DA, a conditional random field (CRF) is employed to optimize the soft label field composed of multiple label posteriors. The M3DA framework makes it possible to combine word-level features (generated by support vector machines) and topic-level features (generated by MedLDA) via the bag-of-words representation. The experimental results on high-resolution satellite images demonstrate that the proposed method not only obtains suitable semantic interpretation, but also improves the annotation performance by taking into account multi-level semantics and contextual information.

  19. DSM bidding - what field are we leveling, anyway? Or how do you level the field without killing the crop

    International Nuclear Information System (INIS)

    Siebens, C.W.

    1993-01-01

    Since the first regulated monopoly was established, there has been regulatory concern over the "level playing field" issue. This concern can be valid, but its character is relatively nebulous and situational. Most recently, regulators have expressed their concern over the level playing field issue relative to demand side bidding and demand side management (DSM) incentives regulation, which provides utility shareholder returns for DSM initiatives. The playing field issues relative to DSM can be extensive. Utility ESCO subsidiaries are being formed, utility service contracts for HVAC equipment exist, and utilities have special access to customers and customer energy information. At the same time, all-source bidding processes are required in some states, allowing DSM projects to effectively displace supply side projects; customers must choose between utility DSM programs and ESCO offerings; and the list goes on

  20. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and estimating the overall measurement uncertainty is therefore challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and was also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that, in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
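For the out-of-plane component, the propagation of the two cameras' planar uncertainties through the stereo reconstruction can be sketched with a simplified symmetric-geometry model, w = (u1 − u2) / (tan α1 + tan α2). This first-order variance propagation is only a toy version of the full published framework and ignores the calibration-coefficient terms:

```python
import math

def stereo_w_uncertainty(sigma_u1, sigma_u2, alpha1, alpha2):
    """First-order propagation of the two cameras' planar uncertainties into
    the out-of-plane velocity component, assuming the simplified stereo
    reconstruction w = (u1 - u2) / (tan(alpha1) + tan(alpha2))."""
    denom = math.tan(alpha1) + math.tan(alpha2)
    # d w / d u1 = 1/denom, d w / d u2 = -1/denom, so variances add
    return math.sqrt((sigma_u1 ** 2 + sigma_u2 ** 2) / denom ** 2)

# Two cameras at +/-30 degrees, each with 0.1 px-equivalent planar uncertainty
sigma_w = stereo_w_uncertainty(0.1, 0.1, math.radians(30), math.radians(30))
print(f"sigma_w = {sigma_w:.4f}")
```

Note that shallower viewing angles shrink the denominator and inflate the out-of-plane uncertainty, which is why stereo angle selection matters.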

  1. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  2. The use of self-quantification systems for personal health information: big data management activities and prospects.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals).

  3. Unconventional barometry and rheometry: new quantification approaches for mechanically-controlled microstructures

    Science.gov (United States)

    Tajcmanova, L.; Moulas, E.; Vrijmoed, J.; Podladchikov, Y.

    2016-12-01

    Estimation of pressure-temperature (P-T) conditions from petrographic observations in metamorphic rocks has become common practice in petrology studies during the last 50 years. These data often serve as a key input in geodynamic reconstructions and thus directly influence our understanding of lithospheric processes. Such an approach might have led the metamorphic geology field to a certain level of quiescence. In the classical view of metamorphic quantification approaches, fast viscous relaxation (and therefore constant pressure across the rock microstructure) is assumed, with chemical diffusion being the limiting factor in equilibration. Recently, we have focused on the other possible scenario, fast chemical diffusion and slow viscous relaxation, which brings an alternative interpretation of the chemical zoning found in high-grade rocks. The aim has been to provide insight into the role of mechanically maintained pressure variations in multi-component chemical zoning in minerals. Furthermore, we used the pressure information from mechanically-controlled microstructures for rheological constraints. We show an unconventional way of relating direct microstructural observations in rocks to the nonlinearity of rheology at time scales unattainable by laboratory measurements. Our analysis documents that mechanically controlled microstructures that have been preserved over geological times can be used to deduce flow-law parameters and in turn estimate the stress levels of minerals in their natural environment. The development of these new quantification approaches has opened new horizons in understanding phase transformations in the Earth's lithosphere. Furthermore, the new data generated can serve as food for thought for the next generation of fully coupled numerical codes that involve reacting materials while respecting conservation of mass, momentum and energy.

  4. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
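The arithmetic behind such a counting-based quantification, scaling a grid count to the whole slide and converting the particle count to a mass fraction via a correction factor, can be sketched as follows. All counts and the particles-per-mg factor are hypothetical, not the CRL-AP's calibrated values:

```python
def mbm_percent(counted_particles, fields_counted, fields_total,
                sample_mass_g, particles_per_mg=50.0):
    """Estimate the meat-and-bone meal (MBM) mass fraction (%) of a feed
    sample from bone particles counted on a fraction of the sediment slide.
    particles_per_mg is an assumed correction factor converting a particle
    count into an MBM mass (mg)."""
    # Scale the count from the grid fields examined to the whole slide
    total_particles = counted_particles * fields_total / fields_counted
    mbm_mg = total_particles / particles_per_mg
    return 100.0 * mbm_mg * 1e-3 / sample_mass_g

# Hypothetical run: 25 bone particles in 20 of 400 grid fields, 10 g sample
pct = mbm_percent(25, 20, 400, 10.0)
print(f"estimated MBM: {pct:.2f}%")
```

The precision of the result hinges on the correction factor, which is exactly the element the revised method redefines.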

  5. Optimization of tagged MRI for quantification of liver stiffness using computer simulated data.

    Directory of Open Access Journals (Sweden)

    Serena Monti

    The heartbeat has been proposed as an intrinsic source of motion that can be used in combination with tagged Magnetic Resonance Imaging (MRI) to measure displacements induced in the liver as an index of liver stiffness. Optimizing a tagged MRI acquisition protocol in terms of sensitivity to these displacements, which are on the order of the pixel size, is necessary to develop the method as a quantification tool for staging fibrosis. We reproduced a study of cardiac-induced strain in the liver at 3T and simulated tagged MR images with different grid tag patterns to evaluate the performance of the Harmonic Phase (HARP) image analysis method and its dependence on the parameters of tag spacing and grid angle. The partial volume effect (PVE), T1 relaxation, and different levels of noise were taken into account. Four displacement fields of increasing intensity were created and applied to the tagged MR images of the liver. These fields simulated the deformation at different liver stiffnesses. An Error Index (EI) was calculated to evaluate the estimation accuracy for various parameter values. In the absence of noise, the estimation accuracy of the displacement fields increased as tag spacings decreased. EIs for each of the four displacement fields were lower at 0°, and the local minima of the EI were found to correspond to multiples of the pixel size. The accuracy of the estimation decreased with increasing levels of added noise; as the level increased, the improvement in estimation from decreasing the tag spacing tended to zero. The optimal tag spacing turned out to be a compromise between the smallest tag period that is a multiple of the pixel size and is achievable in a real acquisition, and the tag spacing that guarantees an accurate liver displacement measure in the presence of realistic levels of noise.

  6. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease that remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantifying chest abnormalities are usually based on computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also made for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. These results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)

  7. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available Verb aspect, alternations and quantification In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  9. Absolute quantitative proton NMR spectroscopy based on the amplitude of the local water suppression pulse. Quantification of brain water and metabolites

    DEFF Research Database (Denmark)

    Danielsen, E R; Henriksen, O

    1994-01-01

    Quantification in localized proton NMR spectroscopy has been achieved by various methods in recent years. A new method for absolute quantification is described in this paper. The method simultaneously rules out problems with B1 field inhomogeneity and coil loading, utilizing a relation between th......M and [NAA] = 9.15 +/- 0.74 nM. It is concluded that the quantification method is easily applied in vivo, and that the absolute concentrations obtained are similar to results in other studies except those relying on assumptions of the concentration of an internal reference. The advantage...

  10. Tracking electric field exposure levels through radio frequency dosimetry

    International Nuclear Information System (INIS)

    Ewing, P.D.; Moore, M.R.; Rochelle, R.W.; Thomas, R.S.; Hess, R.A.; Hoffheins, B.S.

    1991-01-01

    The radio-frequency (rf) dosimeter developed by the Oak Ridge National Laboratory is a portable, pocket-sized cumulative-dose recording device designed to detect and record the strengths and durations of electric fields present in the work areas of naval vessels. The device measures an integrated dose and records the electric fields that exceed the permissible levels set by the American National Standards Institute. Features of the rf dosimeter include a frequency range of 30 MHz to 10 GHz and a three-dimensional sensor. Data obtained with the rf dosimeter will be used to determine the ambient field-strength profile for shipboard personnel over an extended time. Readings are acquired and averaged over a 6-min period corresponding to the rise time of the core body temperature. These values are stored for up to 6 months, after which the data are transferred to a computer via the dosimeter's serial port. The rf dosimeter should increase knowledge of the levels of electric fields to which individuals are exposed. 5 refs., 4 figs
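    The 6-minute averaging scheme described above can be sketched as follows. This is a minimal illustration, assuming a fixed 10-second sampling interval and an illustrative permissible level; the actual sampling rate and the applicable ANSI limits are not given in the record.

    ```python
    import statistics

    # Sketch of the dosimeter's averaging scheme: readings are averaged over
    # non-overlapping 6-minute windows (corresponding to the assumed rise time
    # of core body temperature), and windows exceeding a permissible level are
    # flagged. Sample interval and limit below are illustrative assumptions.

    def six_minute_averages(samples_v_per_m, sample_interval_s=10, limit_v_per_m=61.4):
        """Return (average, exceeds_limit) for each non-overlapping 6-min window."""
        per_window = (6 * 60) // sample_interval_s  # samples per 6-min window
        results = []
        for i in range(0, len(samples_v_per_m) - per_window + 1, per_window):
            window = samples_v_per_m[i:i + per_window]
            avg = statistics.fmean(window)
            results.append((avg, avg > limit_v_per_m))
        return results

    readings = [50.0] * 36 + [80.0] * 36   # two 6-min windows at 10 s sampling
    print(six_minute_averages(readings))   # first window passes, second is flagged
    ```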

  11. Field study of sound exposure by personal stereo

    DEFF Research Database (Denmark)

    Ordoñez, Rodrigo Pizarro; Reuter, Karen; Hammershøi, Dorte

    2006-01-01

    A number of large scale studies suggest that the exposure level used with personal stereo systems should raise concern. High levels can be produced by most commercially available mp3 players, and they are generally used in high background noise levels (i.e., while in a bus or train). A field study...... on young people's habitual sound exposure to personal stereos has been carried out using a measurement method according to principles of ISO 11904-2:2004. Additionally the state of their hearing has also been assessed. This presentation deals with the methodological aspects relating to the quantification...... of habitual use, estimation of listening levels and exposure levels, and assessment of their state of hearing, by either threshold determination or OAE measurement, with a special view to the general validity of the results (uncertainty factors and their magnitude)....

  12. Single sample extraction and HPLC processing for quantification of NAD and NADH levels in Saccharomyces cerevisiae

    Energy Technology Data Exchange (ETDEWEB)

    Sporty, J; Kabir, M M; Turteltaub, K; Ognibene, T; Lin, S; Bench, G

    2008-01-10

    A robust redox extraction protocol for quantitative and reproducible metabolite isolation and recovery has been developed for simultaneous measurement of nicotinamide adenine dinucleotide (NAD) and its reduced form, NADH, from Saccharomyces cerevisiae. Following culture in liquid media, approximately 10^8 yeast cells were harvested by centrifugation and then lysed under non-oxidizing conditions by bead blasting in ice-cold, nitrogen-saturated 50 mM ammonium acetate. To enable protein denaturation, ice-cold nitrogen-saturated CH3CN + 50 mM ammonium acetate (3:1; v:v) was added to the cell lysates. After sample centrifugation to pellet precipitated proteins, organic solvent removal was performed on supernatants by chloroform extraction. The remaining aqueous phase was dried and resuspended in 50 mM ammonium acetate. NAD and NADH were separated by HPLC and quantified using UV-VIS absorbance detection. Applicability of this procedure for quantifying NAD and NADH levels was evaluated by culturing yeast under normal (2% glucose) and calorie restricted (0.5% glucose) conditions. NAD and NADH contents are similar to previously reported levels in yeast obtained using enzymatic assays performed separately on acid (for NAD) and alkali (for NADH) extracts. Results demonstrate that it is possible to perform a single preparation to reliably and robustly quantitate both NAD and NADH contents in the same sample. Robustness of the protocol suggests it will be (1) applicable to quantification of these metabolites in mammalian and bacterial cell cultures; and (2) amenable to isotope labeling strategies to determine the relative contribution of specific metabolic pathways to total NAD and NADH levels in cell cultures.
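    The final HPLC-UV readout can be turned into concentrations with a standard calibration step. The sketch below uses a single-point external standard, with made-up peak areas and an assumed standard concentration; the record does not specify the calibration scheme actually used.

    ```python
    # Illustrative external-standard calculation for an HPLC-UV readout: sample
    # NAD and NADH peak areas are converted to concentrations via the response
    # of a standard of known concentration, then a redox ratio is formed.
    # All numbers here are invented for the sketch.

    def conc_from_area(peak_area, std_area, std_conc):
        """Single-point external standard; response assumed linear through zero."""
        return peak_area * std_conc / std_area

    nad = conc_from_area(peak_area=8200, std_area=10000, std_conc=50.0)   # uM
    nadh = conc_from_area(peak_area=1150, std_area=10000, std_conc=50.0)  # uM
    print(round(nad, 2), round(nadh, 2), round(nad / nadh, 2))  # -> 41.0 5.75 7.13
    ```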

  13. Quantification of BCR-ABL transcripts in peripheral blood cells and ...

    African Journals Online (AJOL)

    Purpose: To investigate the feasibility of using peripheral blood plasma samples as surrogates for blood cell sampling for quantification of breakpoint cluster region-Abelson oncogene (BCR-ABL) transcript levels to monitor treatment responses in chronic myeloid leukemia (CML) patients. Methods: Peripheral blood samples ...

  14. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  15. Two-level systems driven by large-amplitude fields

    Science.gov (United States)

    Nori, F.; Ashhab, S.; Johansson, J. R.; Zagoskin, A. M.

    2009-03-01

    We analyze the dynamics of a two-level system subject to driving by large-amplitude external fields, focusing on the resonance properties in the case of driving around the region of avoided level crossing. In particular, we consider three main questions that characterize resonance dynamics: (1) the resonance condition, (2) the frequency of the resulting oscillations on resonance, and (3) the width of the resonance. We identify the regions of validity of different approximations. In a large region of the parameter space, we use a geometric picture in order to obtain both a simple understanding of the dynamics and quantitative results. The geometric approach is obtained by dividing the evolution into discrete time steps, with each time step described by either a phase shift on the basis states or a coherent mixing process corresponding to a Landau-Zener crossing. We compare the results of the geometric picture with those of a rotating wave approximation. We also comment briefly on the prospects of employing strong driving as a useful tool to manipulate two-level systems. S. Ashhab, J.R. Johansson, A.M. Zagoskin, F. Nori, Two-level systems driven by large-amplitude fields, Phys. Rev. A 75, 063414 (2007). S. Ashhab et al, unpublished.
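    The "discrete time step" geometric picture described above can be sketched numerically: the evolution alternates between free phase accumulation on the basis states and a coherent beam-splitter-like mixing at each Landau-Zener crossing. The transition probability and phases below are arbitrary inputs, and the Stokes phase of the crossing is omitted for simplicity, so this is an illustration of the structure, not the paper's quantitative model.

    ```python
    import cmath, math

    # Alternate (a) phase accumulation on the two basis states and (b) a
    # coherent Landau-Zener-like mixing with transition probability p_lz.
    # Both steps are unitary, so the state norm is preserved throughout.

    def phase_step(state, phi):
        a, b = state
        return (a * cmath.exp(-1j * phi / 2), b * cmath.exp(1j * phi / 2))

    def lz_step(state, p_lz):
        """Beam-splitter-like mixing at an avoided crossing (Stokes phase omitted)."""
        t, r = math.sqrt(1 - p_lz), math.sqrt(p_lz)
        a, b = state
        return (t * a - r * b, r * a + t * b)

    state = (1.0 + 0j, 0j)          # start in one basis state
    for _ in range(4):              # four driving periods, two crossings each
        for phi in (1.2, 2.9):      # accumulated phases between crossings (arbitrary)
            state = phase_step(state, phi)
            state = lz_step(state, p_lz=0.1)

    pop_excited = abs(state[1]) ** 2
    print(round(pop_excited, 4))    # occupation of the second level
    ```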

  16. Improved Method for PD-Quantification in Power Cables

    DEFF Research Database (Denmark)

    Holbøll, Joachim T.; Villefrance, Rasmus; Henriksen, Mogens

    1999-01-01

    In this paper, a method is described for improved quantification of partial discharges (PD) in power cables. The method is suitable for PD-detection and location systems in the MHz-range, where pulse attenuation and distortion along the cable cannot be neglected. The system transfer function...... was calculated and measured in order to form the basis for magnitude calculation after each measurement. --- Limitations and capabilities of the method will be discussed and related to relevant field applications of high-frequency PD-measurements. --- Methods for increased signal/noise ratio are easily implemented...

  17. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    Science.gov (United States)

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Thin-Film Magnetic-Field-Response Fluid-Level Sensor for Non-Viscous Fluids

    Science.gov (United States)

    Woodard, Stanley E.; Shams, Qamar A.; Fox, Robert L.; Taylor, Bryant D.

    2008-01-01

    An innovative method has been developed for acquiring fluid-level measurements. This method eliminates the need for the fluid-level sensor to have a physical connection to a power source or to data acquisition equipment. The complete system consists of a lightweight, thin-film magnetic-field-response fluid-level sensor (see Figure 1) and a magnetic field response recorder that was described in Magnetic-Field-Response Measurement-Acquisition System (LAR-16908-1), NASA Tech Briefs, Vol. 30, No. 6 (June 2006), page 28. The sensor circuit is a capacitor connected to an inductor. The response recorder powers the sensor using a series of oscillating magnetic fields. Once electrically active, the sensor responds with its own harmonic magnetic field. The sensor will oscillate at its resonant electrical frequency, which is dependent upon the capacitance and inductance values of the circuit.
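    The sensor principle above follows from the resonance of an LC circuit, f = 1/(2π·sqrt(L·C)): fluid rising past the capacitor changes the effective capacitance and hence the frequency read by the response recorder. The linear capacitance model and all component values below are assumptions for illustration, not the NASA design values.

    ```python
    import math

    # Resonant frequency of the sensor's LC circuit, and an assumed linear
    # dependence of effective capacitance on fluid level: higher level ->
    # larger capacitance -> lower resonant frequency.

    def resonant_frequency_hz(inductance_h, capacitance_f):
        return 1.0 / (2 * math.pi * math.sqrt(inductance_h * capacitance_f))

    def effective_capacitance_f(level_fraction, c_empty=20e-12, c_full=80e-12):
        """Assumed linear mix of air- and fluid-filled capacitance, level in 0..1."""
        return c_empty + (c_full - c_empty) * level_fraction

    L_coil = 10e-6  # 10 uH inductor (illustrative)
    for level in (0.0, 0.5, 1.0):
        f = resonant_frequency_hz(L_coil, effective_capacitance_f(level))
        print(f"level {level:.0%}: {f / 1e6:.2f} MHz")
    ```

    A lookup table or calibration curve built from such frequency-vs-level pairs would then invert the measured frequency back to a fluid level.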

  19. Impact of muscular uptake and statistical noise on tumor quantification based on simulated FDG-PET studies

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro

    2017-01-01

    Purpose: The aim of this work is to study the effect of physiological muscular uptake variations and statistical noise on tumor quantification in FDG-PET studies. Methods: We designed a realistic framework based on simulated FDG-PET acquisitions from an anthropomorphic phantom that included different muscular uptake levels and three spherical lung lesions with diameters of 31, 21 and 9 mm. A distribution of muscular uptake levels was obtained from 136 patients remitted to our center for whole-body FDG-PET. Simulated FDG-PET acquisitions were obtained by using the Simulation System for Emission Tomography (SimSET) Monte Carlo package. Simulated data was reconstructed by using an iterative Ordered Subset Expectation Maximization (OSEM) algorithm implemented in the Software for Tomographic Image Reconstruction (STIR) library. Tumor quantification was carried out by using estimations of SUVmax, SUV50 and SUVmean from different noise realizations, lung lesions and multiple muscular uptakes. Results: Our analysis provided quantification variability values of 17–22% (SUVmax), 11–19% (SUV50) and 8–10% (SUVmean) when muscular uptake variations and statistical noise were included. Meanwhile, quantification variability due only to statistical noise was 7–8% (SUVmax), 3–7% (SUV50) and 1–2% (SUVmean) for large tumors (>20 mm) and 13% (SUVmax), 16% (SUV50) and 8% (SUVmean) for small tumors (<10 mm), thus showing that the variability in tumor quantification is mainly affected by muscular uptake variations when large enough tumors are considered. In addition, our results showed that quantification variability is strongly dominated by statistical noise when the injected dose decreases below 222 MBq. Conclusions: Our study revealed that muscular uptake variations between patients who are totally relaxed should be considered as an uncertainty source of tumor quantification values. - Highlights: • Distribution of muscular uptake from 136 PET
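    The three uptake metrics compared in the study can be illustrated on a toy lesion. SUVmax is the hottest voxel, SUVmean the average over the lesion region, and SUV50 is taken here as the mean over voxels above 50% of SUVmax (one common definition); the voxel values are made up, and real studies compute these over segmented PET volumes.

    ```python
    # Toy computation of SUVmax, SUV50 and SUVmean from a flat list of lesion
    # voxel SUV values. SUV50 here: mean over voxels >= 50% of the maximum.

    def suv_metrics(voxels):
        suv_max = max(voxels)
        suv_mean = sum(voxels) / len(voxels)
        hot = [v for v in voxels if v >= 0.5 * suv_max]
        suv_50 = sum(hot) / len(hot)
        return suv_max, suv_50, suv_mean

    lesion = [1.2, 2.5, 4.8, 6.0, 5.5, 3.1, 0.9, 5.9]   # invented voxel SUVs
    smax, s50, smean = suv_metrics(lesion)
    print(smax, round(s50, 2), round(smean, 2))  # -> 6.0 5.06 3.74
    ```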

  20. Quantification of Lignin and Its Structural Features in Plant Biomass Using

    NARCIS (Netherlands)

    Erven, Van Gijs; Visser, de Ries; Merkx, Donny W.H.; Strolenberg, Willem; Gijsel, de Peter; Gruppen, Harry; Kabel, Mirjam A.

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric

  1. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is an essential step in obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time-based DNA quantification kits are most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experiment results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using a Quantifiler® Human DNA Quantification kit. For the experiments, we prepared approximately 0.25 ng/μl DNA samples containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The C(T) values of an internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained for DNA samples with high IPC C(T) values. Thus, researchers should carefully interpret DNA quantification results. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, it is thought that a better understanding of the various effects of HA would help researchers recognize and manipulate samples containing HA.
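    The interpretation rule suggested by these results can be sketched as a simple decision function: an IPC C(T) within the normal range supports the reported quantity, while an elevated or undetermined IPC C(T) flags a likely underestimate due to inhibition. The threshold and messages below are illustrative, not part of the kit's specification.

    ```python
    # Sketch of IPC-based inhibition flagging: the internal PCR control's C(T)
    # value is compared against an assumed normal upper bound (here 31.0,
    # following the uninhibited range reported in the study).

    def interpret_quant(dna_ng_per_ul, ipc_ct, normal_ct_max=31.0):
        if ipc_ct is None:
            return "inhibited: IPC undetermined, quantification unreliable"
        if ipc_ct > normal_ct_max:
            return f"possible underestimate ({dna_ng_per_ul} ng/uL, IPC Ct {ipc_ct})"
        return f"ok ({dna_ng_per_ul} ng/uL)"

    print(interpret_quant(0.25, 29.5))   # IPC within normal range
    print(interpret_quant(0.05, 36.8))   # shifted IPC -> flag underestimate
    print(interpret_quant(0.0, None))    # IPC undetermined
    ```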

  2. Uncertainty quantification for mean field games in social interactions

    KAUST Repository

    Dia, Ben Mansour

    2016-01-09

    We present an overview of the mean field games formulation. A comparative analysis of the optimality for a stochastic McKean-Vlasov process with time-dependent probability is presented. Then we examine mean-field games for social interactions and we show that optimizing the long-term well-being through effort and the social feeling state distribution (mean-field) will help to stabilize a couple (marriage). However, if the cost of effort is very high, the couple fluctuates in a bad feeling state or the marriage breaks down. We then examine the influence of society on a couple using mean field sentimental games. We show that, in mean-field equilibrium, the optimal effort is always higher than the one-shot optimal effort. Finally we introduce the Wiener chaos expansion for the construction of solutions of stochastic differential equations of McKean-Vlasov type. The method is based on the Cameron-Martin version of the Wiener chaos expansion and allows us to quantify the uncertainty in the optimality system.

  4. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    Energy Technology Data Exchange (ETDEWEB)

    Nie, Song [Biological Sciences Division; Shi, Tujin [Biological Sciences Division; Fillmore, Thomas L. [Biological Sciences Division; Schepmoes, Athena A. [Biological Sciences Division; Brewer, Heather [Biological Sciences Division; Gao, Yuqian [Biological Sciences Division; Song, Ehwang [Biological Sciences Division; Wang, Hui [Biological Sciences Division; Rodland, Karin D. [Biological Sciences Division; Qian, Wei-Jun [Biological Sciences Division; Smith, Richard D. [Biological Sciences Division; Liu, Tao [Biological Sciences Division

    2017-08-11

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.

  5. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Muhammad Anwar

    2013-01-01

    Full Text Available MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC) and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using a multiecho fast gradient-echo (MFGRE) sequence on 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that the quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients.
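    A multiecho gradient-echo acquisition like MFGRE yields a T2* estimate by fitting the signal decay across echo times, commonly modeled as S(TE) = S0·exp(-TE/T2*). The sketch below fits this by linear least squares on ln(S); echo times and signals are synthetic, and a real liver fit would also need noise-floor handling at high iron levels.

    ```python
    import math

    # Monoexponential T2* fit: regress ln(S) on TE, so slope = -1/T2*.
    # R2* (in 1/s) would be 1000 times the slope magnitude for TE in ms.

    def fit_t2star(echo_times_ms, signals):
        xs, ys = echo_times_ms, [math.log(s) for s in signals]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return -1.0 / slope  # T2* in ms

    tes = [1.0, 2.0, 3.5, 5.0, 7.0]                   # echo times (ms), synthetic
    sig = [1000 * math.exp(-te / 4.0) for te in tes]  # noiseless decay, T2* = 4 ms
    print(round(fit_t2star(tes, sig), 2))             # -> 4.0
    ```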

  6. Absolute Quantification of the Host-To-Parasite DNA Ratio in Theileria parva-Infected Lymphocyte Cell Lines.

    Science.gov (United States)

    Gotia, Hanzel T; Munro, James B; Knowles, Donald P; Daubenberger, Claudia A; Bishop, Richard P; Silva, Joana C

    2016-01-01

    Theileria parva is a tick-transmitted intracellular apicomplexan pathogen of cattle in sub-Saharan Africa that causes East Coast fever (ECF). ECF is an acute fatal disease that kills over one million cattle annually, imposing a tremendous burden on African small-holder cattle farmers. The pathology and level of T. parva infections in its wildlife host, African buffalo (Syncerus caffer), and in cattle are distinct. We have developed an absolute quantification method based on quantitative PCR (qPCR) in which recombinant plasmids containing single copy genes specific to the parasite (apical membrane antigen 1 gene, ama1) or the host (hypoxanthine phosphoribosyltransferase 1, hprt1) are used as the quantification reference standards. Our study shows that T. parva and bovine cells are present in similar numbers in T. parva-infected lymphocyte cell lines and that consequently, due to its much smaller genome size, T. parva DNA comprises between 0.9% and 3% of the total DNA samples extracted from these lines. This absolute quantification assay of parasite and host genome copy number in a sample provides a simple and reliable method of assessing T. parva load in infected bovine lymphocytes, and is accurate over a wide range of host-to-parasite DNA ratios. Knowledge of the proportion of target DNA in a sample, as enabled by this method, is essential for efficient high-throughput genome sequencing applications for a variety of intracellular pathogens. This assay will also be very useful in future studies of interactions of distinct host-T. parva stocks and to fully characterize the dynamics of ECF infection in the field.
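    The arithmetic behind the host-to-parasite DNA ratio can be sketched directly: single-copy-gene qPCR against the plasmid standards yields genome copy numbers for parasite (ama1) and host (hprt1), and the parasite's share of total DNA follows from copies times genome size. The genome sizes below are approximate public figures (T. parva roughly 8.3 Mb; bovine roughly 2.7 Gb haploid) used only for illustration, and ploidy effects are ignored.

    ```python
    # Convert qPCR-derived genome copy numbers into a parasite DNA fraction
    # using approximate haploid genome sizes (assumptions, in base pairs).

    T_PARVA_BP = 8.3e6   # ~8.3 Mb, approximate
    BOVINE_BP = 2.7e9    # ~2.7 Gb, approximate

    def parasite_dna_fraction(parasite_copies, host_copies):
        p = parasite_copies * T_PARVA_BP
        h = host_copies * BOVINE_BP
        return p / (p + h)

    # With similar parasite and host genome copy numbers, the parasite's much
    # smaller genome makes its DNA share a small fraction of the total:
    frac = parasite_dna_fraction(1.0e5, 1.0e5)
    print(f"{frac:.2%}")
    ```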

  7. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  8. Bayesian uncertainty quantification for flows in heterogeneous porous media using reversible jump Markov chain Monte Carlo methods

    KAUST Repository

    Mondal, A.; Efendiev, Y.; Mallick, B.; Datta-Gupta, A.

    2010-01-01

    . Within each channel, the permeability is assumed to have a lognormal distribution. Uncertainty quantification in history matching is carried out hierarchically by constructing geologic facies boundaries as well as permeability fields within each facies

  9. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted by luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the cell numbers, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method presenting a linear response of the measured light signal to measurement time. We detected the luminescence signal by using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources. One is three bacterial light-emitting sources containing different numbers of bacteria. The other is three different non-bacterial light sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presents a constant value even when different light sources are applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment exhibiting a linear response of constant light-emitting sources to measurement time.
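    The linearity property the method relies on can be checked in a few lines: for a constant source, cumulative photon counts should grow linearly with measurement time, so the counts/time ratio is approximately constant. The data below are synthetic counts with small jitter, not the paper's measurements.

    ```python
    # Linearity check for a constant light source: counts / time should be a
    # near-constant rate across different measurement durations.

    def counts_per_second(measurements):
        """measurements: list of (measurement_time_s, photon_counts) pairs."""
        return [counts / t for t, counts in measurements]

    data = [(10, 1520), (20, 3010), (40, 6080), (80, 12040)]  # synthetic
    rates = counts_per_second(data)
    print([round(r, 1) for r in rates])   # near-constant ratio indicates linearity
    ```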

  10. Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.

    Science.gov (United States)

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as the RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with an original concentration as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
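    The spike-in arithmetic can be sketched as follows: the known spike-in brings the reading into the assay's measurable range, the sample's contribution is recovered as the increase over the spike-in-only baseline, and a dilution factor back-calculates the stock concentration. The volumes and concentrations below are illustrative, not the published protocol.

    ```python
    # Recover a trace sample's stock concentration from the reading increase
    # over an RNA spike-in baseline. All concentrations are in pg/uL as read
    # in the assay tube; volumes are illustrative assumptions.

    def sample_conc_pg_per_ul(reading_with_sample, spikein_baseline,
                              sample_vol_ul, assay_vol_ul):
        increase = reading_with_sample - spikein_baseline   # sample's share in-tube
        dilution = assay_vol_ul / sample_vol_ul             # back-calculate to stock
        return increase * dilution

    # e.g. a 12 pg/uL spike-in baseline, 17 pg/uL with 1 uL of sample in 10 uL:
    print(sample_conc_pg_per_ul(17.0, 12.0, sample_vol_ul=1.0, assay_vol_ul=10.0))
    # -> 50.0 (pg/uL in the original sample)
    ```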

  11. Effect of Verticillium dahliae soil inoculum levels on spinach seed infection

    DEFF Research Database (Denmark)

    Sapkota, Rumakanta; Olesen, Merete Halkjær; Deleuran, Lise Christina

    2016-01-01

    Verticillium dahliae is a soilborne pathogen and a threat to spinach seed production. The aim of this study was to understand the relation between V. dahliae soil inoculum and infection in harvested seed. Quantitative polymerase chain reaction was used for quantification of the pathogen. Semifield...... experiments in which spinach was grown in soils with different inoculum levels enabled us to determine a threshold level for V. dahliae DNA of 0.003 ng/g of soil for seed infection to occur. Soils from production fields were sampled in 2013 and 2014 during and before planting, as well as the harvested seed....... Seed from plants grown in infested soils were infected with V. dahliae in samples from both the semifield and open-field experiments. Lower levels of pathogen were found in seed from spinach grown in soils with a scattered distribution of V. dahliae (one or two positive of three soil subsamples) than...

  12. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  13. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images. It is also to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: Quantification in emission tomography - definition and challenges; quantification biasing phenomena; 2 - quantification in SPECT, problems and correction methods: Attenuation, scattering, un-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  14. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  15. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are now available and of the experimental situations and analytical problems they address. The last point is extended by a description of our own development of the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed.

  16. Applications of acoustic radiation force impulse quantification in chronic kidney disease: A review

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Liang [Dept. of Ultrasound, Chinese Academy of Medical Sciences and Peking Union Medical College Hospital, Beijing (China)

    2016-08-15

    Acoustic radiation force impulse (ARFI) imaging is an emerging technique with great promise in the field of elastography. Previous studies have validated ARFI quantification as a method of estimating fibrosis in chronic liver disease. Similarly, fibrosis is the principal process underlying the progression of chronic kidney disease, which is the major cause of renal failure. However, the quantification of tissue stiffness using ARFI imaging is more complex in the kidney than in the liver. Moreover, not all previous studies are comparable because they employed different procedures. Therefore, subsequent studies are warranted, both in animal models and in clinical patients, in order to better understand the histopathological mechanisms associated with renal elasticity and to further improve this imaging method by developing standardized guidelines for its implementation.

  17. Applications of acoustic radiation force impulse quantification in chronic kidney disease: A review

    International Nuclear Information System (INIS)

    Wang, Liang

    2016-01-01

    Acoustic radiation force impulse (ARFI) imaging is an emerging technique with great promise in the field of elastography. Previous studies have validated ARFI quantification as a method of estimating fibrosis in chronic liver disease. Similarly, fibrosis is the principal process underlying the progression of chronic kidney disease, which is the major cause of renal failure. However, the quantification of tissue stiffness using ARFI imaging is more complex in the kidney than in the liver. Moreover, not all previous studies are comparable because they employed different procedures. Therefore, subsequent studies are warranted, both in animal models and in clinical patients, in order to better understand the histopathological mechanisms associated with renal elasticity and to further improve this imaging method by developing standardized guidelines for its implementation.

  18. Study of multi-level atomic systems with the application of magnetic field

    Science.gov (United States)

    Hu, Jianping; Roy, Subhankar; Ummal Momeen, M.

    2018-04-01

    The complexity of the multiple energy levels associated with each atomic system determines the various processes related to light-matter interactions. It is therefore necessary to understand the influence of the different levels in a given atomic system. In this work we focus on multi-level atomic schemes under an applied magnetic field, and analyze the different electromagnetically induced transparency (EIT) windows that appear in the presence of a moderately strong magnetic field (∼10 G).

  19. Pesticide residue quantification analysis by hyperspectral imaging sensors

    Science.gov (United States)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues in agricultural produce and fruits. This paper conducts a series of baseline experiments designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier transform infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, a battery-operated, field-portable instrument with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures of spectral discrimination are developed. More specifically, new measures for calculating the relative power between the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the pesticide residues. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.

  20. Image-based computational quantification and visualization of genetic alterations and tumour heterogeneity.

    Science.gov (United States)

    Zhong, Qing; Rüschoff, Jan H; Guo, Tiannan; Gabrani, Maria; Schüffler, Peter J; Rechsteiner, Markus; Liu, Yansheng; Fuchs, Thomas J; Rupp, Niels J; Fankhauser, Christian; Buhmann, Joachim M; Perner, Sven; Poyet, Cédric; Blattner, Miriam; Soldini, Davide; Moch, Holger; Rubin, Mark A; Noske, Aurelia; Rüschoff, Josef; Haffner, Michael C; Jochum, Wolfram; Wild, Peter J

    2016-04-07

    Recent large-scale genome analyses of human tissue samples have uncovered a high degree of genetic alterations and tumour heterogeneity in most tumour entities, independent of morphological phenotypes and histopathological characteristics. Assessment of genetic copy-number variation (CNV) and tumour heterogeneity by fluorescence in situ hybridization (ISH) provides additional tissue morphology at single-cell resolution, but it is labour intensive with limited throughput and high inter-observer variability. We present an integrative method combining bright-field dual-colour chromogenic and silver ISH assays with an image-based computational workflow (ISHProfiler), for accurate detection of molecular signals, high-throughput evaluation of CNV, expressive visualization of multi-level heterogeneity (cellular, inter- and intra-tumour heterogeneity), and objective quantification of heterogeneous genetic deletions (PTEN) and amplifications (19q12, HER2) in diverse human tumours (prostate, endometrial, ovarian and gastric), using various tissue sizes and different scanners, with unprecedented throughput and reproducibility.

  1. Evaluation of nitrate quantification techniques for in-line analysis in drinking water

    International Nuclear Information System (INIS)

    Hernandez Alpizar, Laura; Coy Herrera, Ricardo

    2015-01-01

    The results of a study are presented to determine the potential of four techniques for the quantification of nitrates with continuous sampling: ion chromatography and ultraviolet absorption spectrophotometry, using one benchtop instrument and two mini-spectrophotometers with continuous-flow sample injection, one for measurements in the visible range and the other optimized for measurements of ultraviolet absorption. The variables considered are: reagent and accessory consumption, waste toxicity, analyte response, limit of detection (LOD), limit of quantification (LOQ), linearity in the range of interest, and sensitivity. Ultraviolet absorption spectrophotometry with continuous-flow sample injection proved the best of the techniques for in-line analysis. Its response was linear between 0 and 10 mg/L, the range recommended by the WHO for the concentration of nitrates in drinking water, with low consumption of reagents and accessories. The method generates no hazardous waste, achieved an LOD of 0.002 mg/L and an LOQ of 0.006 mg/L, and has adequate sensitivity to respond rapidly to the analyte concentration without signal saturation. It thus fulfils the characteristics desirable for an in-line analysis system. (author) [es
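    Detection and quantification limits like those quoted above are conventionally derived from the calibration slope and the blank noise (LOD = 3.3σ/S, LOQ = 10σ/S, the ICH convention; the paper does not state which formula it used, so this is an assumption). A sketch with an illustrative calibration over the 0-10 mg/L range discussed above:

```python
import numpy as np

def lod_loq(conc_mg_l, signal, blank_sd):
    """Limits of detection/quantification from a linear calibration:
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with S the calibration slope
    and sigma the standard deviation of the blank signal (ICH convention)."""
    slope, _intercept = np.polyfit(conc_mg_l, signal, 1)
    return 3.3 * blank_sd / slope, 10.0 * blank_sd / slope

# Illustrative absorbance calibration (all numbers made up)
lod, loq = lod_loq([0.0, 2.0, 4.0, 6.0, 8.0, 10.0],
                   [0.00, 0.10, 0.20, 0.30, 0.40, 0.50],
                   blank_sd=0.0001)
```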

  2. Discretisation Schemes for Level Sets of Planar Gaussian Fields

    Science.gov (United States)

    Beliaev, D.; Muirhead, S.

    2018-01-01

    Smooth random Gaussian functions play an important role in mathematical physics, a main example being the random plane wave model conjectured by Berry to give a universal description of high-energy eigenfunctions of the Laplacian on generic compact manifolds. Our work is motivated by questions about the geometry of such random functions, in particular relating to the structure of their nodal and level sets. We study four discretisation schemes that extract information about level sets of planar Gaussian fields. Each scheme recovers information up to a different level of precision, and each requires a maximum mesh-size in order to be valid with high probability. The first two schemes are generalisations and enhancements of similar schemes that have appeared in the literature (Beffara and Gayet in Publ Math IHES, 2017. https://doi.org/10.1007/s10240-017-0093-0; Mischaikow and Wanner in Ann Appl Probab 17:980-1018, 2007); these give complete topological information about the level sets on either a local or global scale. As an application, we improve the results in Beffara and Gayet (2017) on Russo-Seymour-Welsh estimates for the nodal set of positively-correlated planar Gaussian fields. The third and fourth schemes are, to the best of our knowledge, completely new. The third scheme is specific to the nodal set of the random plane wave, and provides global topological information about the nodal set up to `visible ambiguities'. The fourth scheme gives a way to approximate the mean number of excursion domains of planar Gaussian fields.

  3. On the Application of Science Systems Engineering and Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael

    2017-04-01

    Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  4. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have lately been presented with good results, but show several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.
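    The framework treats quantification as fitting a lineshape model to the spectrum and searching for its parameters with a GA rather than a gradient method. A toy sketch in that spirit, estimating the amplitudes of two overlapping Lorentzian peaks at known positions (all values are illustrative; real MRS quantification also fits frequencies, damping and phase):

```python
import numpy as np

rng = np.random.default_rng(0)

def lorentz(x, amp, x0, w):
    """Lorentzian lineshape with amplitude amp, centre x0 and width w."""
    return amp * w**2 / ((x - x0)**2 + w**2)

# Synthetic "spectrum": two overlapping peaks at known positions/widths,
# with unknown amplitudes (the quantities to estimate)
x = np.linspace(0.0, 10.0, 200)
data = lorentz(x, 3.0, 4.0, 0.8) + lorentz(x, 1.5, 5.0, 0.8)

def fitness(amps):
    """Negative squared error between model and data (higher is better)."""
    model = lorentz(x, amps[0], 4.0, 0.8) + lorentz(x, amps[1], 5.0, 0.8)
    return -np.sum((model - data) ** 2)

# Minimal GA: keep the 10 best amplitude pairs, refill the population with
# mutated copies of them (elitism + Gaussian mutation, no crossover)
pop = rng.uniform(0.0, 5.0, size=(40, 2))
for _ in range(200):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[np.argsort(scores)[-10:]]
    children = elite[rng.integers(0, 10, size=30)] + rng.normal(0.0, 0.05, size=(30, 2))
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(p) for p in pop])]  # approaches the true amplitudes
```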

  5. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations, revealed in magnetic resonance spectroscopy (MRS) spectra, constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools, such as neural networks (NNs), have lately been presented with good results, but show several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with overlapping metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments on artificial MRS data have proved the efficiency of the introduced methodology, establishing it as a generic metabolite quantification procedure.

  6. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  7. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Science.gov (United States)

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  8. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, and current benchmarks suffer from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.
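    The distinction drawn above between absolute and relative quantification matters because across-sample comparison only requires counts scaled to a common library size. The simplest count-based normalization in that spirit is counts-per-million (the paper benchmarks far more sophisticated pipelines; this is only a minimal illustration):

```python
import numpy as np

def cpm(counts):
    """Counts-per-million: scale each sample's gene counts by its library size,
    making gene-level values comparable across samples (relative quantification)."""
    counts = np.asarray(counts, dtype=float)
    return counts / counts.sum(axis=0) * 1e6

# Two samples (columns) with 10x different sequencing depth; rows are genes.
# After CPM scaling, both columns give the same relative profile.
mat = cpm([[10, 100],
           [90, 900]])
```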

  9. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, and current benchmarks suffer from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across-samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  10. Field Dissipation and Storage Stability of Glufosinate Ammonium and Its Metabolites in Soil

    OpenAIRE

    Zhang, Yun; Wang, Kai; Wu, Junxue; Zhang, Hongyan

    2014-01-01

    A simple analytical method was developed to measure concentrations of glufosinate ammonium and its metabolites, 3-methylphosphinico-propionic acid (MPP) and 2-methylphosphinico-acetic acid (MPA), in field soil samples. To determine the minimum quantification limit, samples were spiked at different levels (0.1, 0.5, and 1.0 mg/kg). Soil samples were extracted with ammonium hydroxide solution 5% (v/v), concentrated, and reacted with trimethyl orthoacetate (TMOA) in the presence of acetic acid f...

  11. Detection and quantification of Leveillula taurica growth in pepper leaves.

    Science.gov (United States)

    Zheng, Zheng; Nonomura, Teruo; Bóka, Károly; Matsuda, Yoshinori; Visser, Richard G F; Toyoda, Hideyoshi; Kiss, Levente; Bai, Yuling

    2013-06-01

    Leveillula taurica is an obligate fungal pathogen that causes powdery mildew disease on a broad range of plants, including important crops such as pepper, tomato, eggplant, onion and cotton. The early stage of this disease is difficult to diagnose and the disease can easily spread unobserved, for example in pepper and tomato production fields and greenhouses. The objective of this study was to develop a method for detecting and quantifying L. taurica biomass in pepper leaves, with special regard to the early stages of infection. We monitored the development of the disease to time the infection process on the leaf surface as well as inside the pepper leaves. The initial and final steps of the infection taking place on the leaf surface were consecutively observed using a dissecting microscope and a scanning electron microscope. The development of the intercellular mycelium in the mesophyll was followed by light and transmission electron microscopy. A pair of L. taurica-specific primers was designed based on the internal transcribed spacer sequence of L. taurica and used in a real-time polymerase chain reaction (PCR) assay to quantify the fungal DNA during infection. The specificity of this assay was confirmed by testing the primer pair with DNA from host plants and also from another powdery mildew species, Oidium neolycopersici, infecting tomato. A standard curve was obtained for absolute quantification of L. taurica biomass. In addition, we tested a relative quantification method using a plant gene as reference, and the results obtained were compared with visual disease index scoring. The real-time PCR assay for L. taurica provides a valuable tool for detection and quantification of this pathogen in breeding activities as well as in plant-microbe interaction studies.
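    Absolute quantification from a standard curve, as mentioned above, is conventionally a linear fit of Ct against log10 of input DNA over a dilution series, which is then inverted for unknowns. A sketch with made-up Ct values (≈3.3 cycles per 10-fold dilution corresponds to ~100% amplification efficiency; none of the numbers are from the study):

```python
import numpy as np

# Hypothetical 10-fold dilution series of target DNA (pg) and measured Ct values
log10_qty = np.log10([100.0, 10.0, 1.0, 0.1])
ct = np.array([18.1, 21.4, 24.7, 28.0])

slope, intercept = np.polyfit(log10_qty, ct, 1)   # slope near -3.32 is ideal
efficiency = 10.0 ** (-1.0 / slope) - 1.0         # 1.0 means perfect doubling/cycle

def quantity_from_ct(ct_obs):
    """Absolute quantification: invert the standard curve Ct = slope*log10(Q) + intercept."""
    return 10.0 ** ((ct_obs - intercept) / slope)
```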

  12. Global Knowledge Futures: Articulating the Emergence of a New Meta-level Field

    Directory of Open Access Journals (Sweden)

    Jennifer M. Gidley

    2013-06-01

    In this paper I articulate a new meta-level field of studies that I call global knowledge futures—a field through which other emerging transdisciplinary fields can be integrated to cohere knowledge at a higher level. I contrast this with the current dominant knowledge paradigm of the global knowledge economy with its fragmentation, commodification and instrumentalism based on neoliberal knowledge capitalism. I take a big-picture, macrohistorical lens to the new thinking and new knowledge patterns that are emerging within the evolution of consciousness discourse. I explore three discourses: postformal studies, integral studies and planetary studies—using a fourth discourse, futures studies, to provide a macro-temporal framing. By extending the meta-fields of postformal, integral and planetary studies into a prospective future dimension, I locate areas of development where these leading-edge discourses can be brought into closer dialogue with each other. In this meeting point of four boundary-spanning discourses I identify the new meta-level field of global knowledge futures, grounded in human thinking capacities, such as creativity, imagination, dialogue and collaboration.

  13. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species that displays significant spectral variation with changing nitric acid concentration. Single-variate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
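    The advantage of multivariate over single-variate analysis shows up already in the simplest chemometric model, classical least squares: with known pure-component spectra, fitting the whole spectrum at once resolves overlapping bands that defeat a single-wavelength Beer's-law calibration. A toy sketch (Gaussian bands stand in for the analyte and an acid-dependent interferent; nothing here is from the paper):

```python
import numpy as np

# Illustrative pure-component spectra on a 100-point wavelength grid
wl = np.linspace(400.0, 900.0, 100)
analyte = np.exp(-((wl - 650.0) ** 2) / 800.0)       # stands in for Pu(IV)
interferent = np.exp(-((wl - 700.0) ** 2) / 5000.0)  # acid-dependent background
K = np.column_stack([analyte, interferent])          # pure-spectrum matrix

# A mixture measured at (unknown) concentrations 0.80 and 0.30
c_true = np.array([0.80, 0.30])
mixture = K @ c_true

# Multivariate estimate: least squares over the full spectrum, so the
# overlapping bands are resolved jointly rather than at a single wavelength
c_est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
```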

  14. Mapping of radio frequency electromagnetic field exposure levels in outdoor environment and comparing with reference levels for general public health.

    Science.gov (United States)

    Cansiz, Mustafa; Abbasov, Teymuraz; Kurt, M Bahattin; Celik, A Recai

    2018-03-01

    In this study, radio frequency electromagnetic field exposure levels were measured on the main streets of the city center of Diyarbakır, Turkey. Measured electric field levels were plotted on satellite imagery of Diyarbakır and compared with the exposure guidelines published by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). Exposure measurements were performed in dense urban, urban and suburban areas each day for 7 consecutive days. The measurement system consisted of a high-precision portable spectrum analyzer, a three-axis electric field antenna, a connection cable and a laptop used as a data logger to record the measurement samples. The highest exposure levels were detected in two places, Diclekent and Batıkent: the highest instantaneous electric field strength was 7.18 V/m at Batıkent and 5.81 V/m at Diclekent. It was statistically determined that the main contributor to the total exposure levels was the Universal Mobile Telecommunications System band. Finally, all measured exposure levels were lower than the reference levels recommended by ICNIRP for general public health.
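    For comparisons like the one above, the ICNIRP (1998) general-public reference levels for electric field strength are frequency-dependent. A sketch of the piecewise values commonly cited for the bands relevant to mobile networks (quoted from the 1998 guidelines as I recall them; verify against the original tables before reuse):

```python
import math

def icnirp_e_ref(f_mhz):
    """ICNIRP (1998) general-public E-field reference level in V/m."""
    if 10.0 <= f_mhz < 400.0:
        return 28.0
    if 400.0 <= f_mhz < 2000.0:
        return 1.375 * math.sqrt(f_mhz)   # e.g. GSM900, GSM1800 bands
    if 2000.0 <= f_mhz <= 300000.0:
        return 61.0                       # e.g. UMTS2100, Wi-Fi
    raise ValueError("frequency outside the ranges handled here")
```

    On this scale, the study's highest instantaneous reading of 7.18 V/m sits well below even the most restrictive of these reference levels.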

  15. Sensitive quantification of the HIV-1 reservoir in gut-associated lymphoid tissue.

    Directory of Open Access Journals (Sweden)

    Sara Morón-López

    The implementation of successful strategies to achieve an HIV cure has become a priority in HIV research. However, the current location and size of HIV reservoirs are still unknown, since there are limited tools to evaluate HIV latency in viral sanctuaries such as gut-associated lymphoid tissue (GALT). As reported for the so-called "Boston patients", viral rebound can happen just a few months after ART interruption despite undetectable levels of proviral HIV-1 DNA in blood and GALT. This might imply that current methods are not sensitive enough to detect residual reservoirs, making it imperative to improve the detection and quantification of the HIV-1 reservoir in tissue samples. Herein, we propose a novel non-enzymatic protocol for purification of lamina propria leukocytes (LPL) from gut biopsies, combined with viral HIV DNA (vDNA) quantification by droplet digital PCR (ddPCR), to improve the sensitivity and accuracy of viral reservoir measurements (LPL-vDNA assay). Endoscopic ileum biopsies were sampled from 12 HIV-1-infected cART-suppressed subjects. We performed a DTT/EDTA-based treatment for epithelial layer removal, followed by non-enzymatic disruption of the tissue, to obtain a lamina propria (LP) cell suspension. CD45+ cells were subsequently purified by flow sorting and vDNA was determined by ddPCR. vDNA quantification levels were significantly higher in purified LPLs (CD45+) than in bulk LPs (p<0.01). The levels of vDNA were higher in ileum samples than in concurrent PBMC from the same individuals (p = 0.002). As a result of the increased sensitivity of this purification method, the Poisson 95% confidence intervals of the vDNA quantification data from LPLs were narrower than those from bulk LPs. Of note, vDNA was unambiguously quantified above the detection limit in 100% of LPL samples, but in only 58% of bulk LPs. We propose this innovative combined protocol for more sensitive detection of the HIV reservoir in gut-associated viral sanctuaries.
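    The Poisson confidence intervals mentioned above reflect how ddPCR quantifies: each droplet is an independent partition, so the mean copies per droplet is λ = −ln(fraction of negative droplets). A minimal sketch of that conversion (the 0.85 nL droplet volume is the commonly cited value for Bio-Rad QX-series instruments, an assumption here):

```python
import math

def ddpcr_copies_per_ul(negative_droplets, total_droplets, droplet_volume_nl=0.85):
    """Absolute quantification from droplet digital PCR counts.

    Droplet occupancy is Poisson-distributed, so the mean copies per droplet
    is lambda = -ln(fraction negative); dividing by the droplet volume gives
    the concentration in the reaction (copies per microlitre)."""
    lam = -math.log(negative_droplets / total_droplets)
    return lam / (droplet_volume_nl * 1e-3)   # nL -> uL
```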

  16. Markers of anthropogenic contamination: A validated method for quantification of pharmaceuticals, illicit drug metabolites, perfluorinated compounds, and plasticisers in sewage treatment effluent and rain runoff.

    Science.gov (United States)

    Wilkinson, John L; Swinden, Julian; Hooda, Peter S; Barker, James; Barton, Stephen

    2016-09-01

    An effective, specific and accurate method is presented for the quantification of 13 markers of anthropogenic contaminants in water using solid phase extraction (SPE) followed by high performance liquid chromatography (HPLC) tandem mass spectrometry (MS/MS). Validation was conducted according to the International Conference on Harmonisation (ICH) guidelines. Method recoveries ranged from 77 to 114% and limits of quantification between 0.75 and 4.91 ng/L. A study was undertaken to quantify the concentrations and loadings of the selected contaminants in 6 sewage treatment works (STW) effluent discharges, as well as concentrations in 5 rain-driven street runoffs and field drainages. Detection frequencies in STW effluent ranged from 25% (ethinylestradiol) to 100% (benzoylecgonine, bisphenol-A (BPA), bisphenol-S (BPS) and diclofenac). Average concentrations of detected compounds in STW effluents ranged from 3.62 ng/L (ethinylestradiol) to 210 ng/L (BPA). Levels of the perfluorinated compounds (PFCs) perfluorooctanoic acid (PFOA) and perfluorononanoic acid (PFNA), as well as the plasticiser BPA, were found in street runoff at maximum levels of 1160 ng/L, 647 ng/L and 2405 ng/L respectively (8.52, 3.09 and 2.7 times more concentrated than maximum levels in STW effluents respectively). Rain-driven street runoff may affect levels of PFCs and plasticisers in receiving rivers and should be further investigated. Together, this method and the 13 selected contaminants enable the quantification of various markers of anthropogenic pollution: inter alia, pharmaceuticals, illicit drugs and their metabolites indicate input from humans and improper disposal of drugs, while the plasticisers and perfluorinated compounds may also indicate contamination from industrial and transport activity (street runoff). Copyright © 2016 Elsevier Ltd. All rights reserved.
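
    The "loadings" mentioned above combine a measured concentration with the effluent discharge volume. A minimal sketch of that arithmetic, assuming a hypothetical discharge figure (only the 210 ng/L BPA average comes from the abstract):

    ```python
    def daily_load_g(conc_ng_per_l, flow_m3_per_day):
        """Mass loading (g/day) from a concentration and an effluent flow.

        Unit conversions: 1 m^3 = 1000 L and 1 g = 1e9 ng.
        """
        return conc_ng_per_l * flow_m3_per_day * 1000 / 1e9

    # 210 ng/L BPA (the reported STW average) at an assumed
    # 20,000 m3/day discharge:
    bpa_load = daily_load_g(210, 20000)  # g/day
    ```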

  17. Quantification of trace arsenic in soils by field-portable X-ray fluorescence spectrometry: considerations for sample preparation and measurement conditions.

    Science.gov (United States)

    Parsons, Chris; Margui Grabulosa, Eva; Pili, Eric; Floor, Geerke H; Roman-Ross, Gabriela; Charlet, Laurent

    2013-11-15

    Recent technological improvements have led to the widespread adoption of field portable energy dispersive X-ray fluorescence (FP-XRF) by governmental agencies, environmental consultancies and research institutions. FP-XRF units often include analysis modes specifically designed for the quantification of trace elements in soils. Using these modes, X-ray tube based FP-XRF units can offer almost "point and shoot" ease of use and results comparable to those of laboratory based instruments. Nevertheless, FP-XRF analysis is sensitive to spectral interferences as well as physical and chemical matrix effects, which can result in decreased precision and accuracy. In this study, an X-ray tube-based FP-XRF analyser was used to determine trace (low ppm) concentrations of As in a floodplain soil. The effects of different sample preparation and analysis conditions on precision and accuracy were systematically evaluated. We propose strategies to minimise sources of error and maximise data precision and accuracy, achieving in situ limits of detection and precision of 6.8 ppm and 14.4% RSD, respectively, for arsenic. We demonstrate that soil moisture, even in relatively dry soils, dramatically affects analytical performance, with a signal loss of 37% recorded for arsenic at 20 wt% soil moisture relative to dry soil. We also highlight the importance of using certified reference materials and independent measurement methods to ensure accurate correction of field values. Copyright © 2012 Elsevier B.V. All rights reserved.
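
    A moisture correction of field readings could look roughly like the sketch below. This is purely illustrative: it assumes signal loss scales linearly with moisture, anchored to the single calibration point reported in the abstract (37% loss at 20 wt% moisture). A real correction would be built from certified reference materials measured wet and dry, as the authors recommend.

    ```python
    def moisture_corrected_ppm(measured_ppm, moisture_frac,
                               loss_at_20pct=0.37):
        """Correct an FP-XRF reading for soil-moisture signal attenuation.

        Assumption (not from the paper): fractional signal loss grows
        linearly with moisture, passing through the reported calibration
        point of 37% loss at 0.20 moisture fraction.
        """
        loss = loss_at_20pct * (moisture_frac / 0.20)
        if loss >= 1.0:
            raise ValueError("moisture outside the calibrated range")
        return measured_ppm / (1.0 - loss)
    ```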

  18. Instructions for 104-SX liquid level measurement field tests

    International Nuclear Information System (INIS)

    Webb, R.H.

    1994-01-01

    This document provides detailed instructions for field testing a proposed solution: inserting a liner inside the failed 104-SX Liquid Observation Well to regain access for temporary liquid level measurements until a permanent solution is provided.

  19. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which leads to formation of gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are low-throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
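
    A much-simplified stand-in for the 3D foci-counting step can illustrate the idea: instead of the paper's extended maxima transform, the sketch below just segments voxels above an intensity threshold into 26-connected components and counts each component as one focus. All names and data are illustrative.

    ```python
    from collections import deque
    from itertools import product

    def count_foci(volume, threshold):
        """Count bright 3-D blobs ("foci") in a z,y,x intensity array.

        Simplified substitute for an extended-maxima transform: voxels
        above `threshold` are grouped into 26-connected components by
        flood fill, and each component counts as one focus.
        """
        nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
        offsets = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]
        seen = set()
        foci = 0
        for z, y, x in product(range(nz), range(ny), range(nx)):
            if volume[z][y][x] <= threshold or (z, y, x) in seen:
                continue
            foci += 1                      # new component found
            queue = deque([(z, y, x)])
            seen.add((z, y, x))
            while queue:                   # flood-fill the whole blob
                cz, cy, cx = queue.popleft()
                for dz, dy, dx in offsets:
                    n = (cz + dz, cy + dy, cx + dx)
                    if (0 <= n[0] < nz and 0 <= n[1] < ny and 0 <= n[2] < nx
                            and n not in seen
                            and volume[n[0]][n[1]][n[2]] > threshold):
                        seen.add(n)
                        queue.append(n)
        return foci
    ```

    In practice a library implementation (e.g. a labelled connected-components routine) would replace the hand-rolled flood fill; the point is only that 3D connectivity lets touching-but-distinct blobs in different z-planes be separated, which 2D projections cannot do.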

  20. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of a CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR, which is affected by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and successfully predicted quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.

  1. Quantification of chemical elements in blood of patients affected by multiple sclerosis.

    Science.gov (United States)

    Forte, Giovanni; Visconti, Andrea; Santucci, Simone; Ghazaryan, Anna; Figà-Talamanca, Lorenzo; Cannoni, Stefania; Bocca, Beatrice; Pino, Anna; Violante, Nicola; Alimonti, Alessandro; Salvetti, Marco; Ristori, Giovanni

    2005-01-01

    Although some studies suggested a link between exposure to trace elements and development of multiple sclerosis (MS), clear information on their role in the aetiology of MS is still lacking. In this study the concentrations of Al, Ba, Be, Bi, Ca, Cd, Co, Cr, Cu, Fe, Hg, Li, Mg, Mn, Mo, Ni, Pb, Sb, Si, Sn, Sr, Tl, V, W, Zn and Zr were determined in the blood of 60 patients with MS and 60 controls. Quantifications were performed by inductively coupled plasma (ICP) atomic emission spectrometry and sector field ICP mass spectrometry. When the two groups were compared, increased levels of Co, Cu and Ni and decreased levels of Be, Fe, Hg, Mg, Mo, Pb and Zn were observed in the blood of patients. In addition, discriminant analysis showed that Cu, Be, Hg, Co and Mo were able to discriminate between MS patients and controls (92.5% of cases correctly classified).

  2. Accurate quantification of mouse mitochondrial DNA without co-amplification of nuclear mitochondrial insertion sequences.

    Science.gov (United States)

    Malik, Afshan N; Czajka, Anna; Cunningham, Phil

    2016-07-01

    Mitochondria contain an extra-nuclear genome in the form of mitochondrial DNA (MtDNA), damage to which can lead to inflammation and bioenergetic deficit. Changes in MtDNA levels are increasingly used as a biomarker of mitochondrial dysfunction. We previously reported that in humans, fragments in the nuclear genome known as nuclear mitochondrial insertion sequences (NumtS) affect accurate quantification of MtDNA. In the current paper our aim was to determine whether mouse NumtS affect the quantification of MtDNA and to establish a method designed to avoid this. The existence of NumtS in the mouse genome was confirmed using BLASTN, unique MtDNA regions were identified using FASTA, and MtDNA primers which do not co-amplify NumtS were designed and tested. MtDNA copy numbers were determined in a range of mouse tissues as the ratio of the mitochondrial to the nuclear genome using real-time qPCR and absolute quantification. Approximately 95% of mouse MtDNA is duplicated in the nuclear genome as NumtS, located in 15 out of 21 chromosomes. A unique region was identified and primers flanking this region were used. MtDNA levels differed significantly among mouse tissues, being highest in the heart and decreasing in the order kidney, liver, blood, brain, islets and lung. The presence of NumtS in the nuclear genome of the mouse could lead to erroneous data when studying MtDNA content or mutations. The unique primers described here will allow accurate quantification of MtDNA content in mouse models without co-amplification of NumtS. Copyright © 2016 Elsevier B.V. and Mitochondria Research Society. All rights reserved.
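
    For context, the mitochondrial-to-nuclear ratio is often computed with the standard relative ΔCt arithmetic sketched below. Note this is the common shortcut, not the absolute standard-curve quantification the authors used: it assumes a single-copy nuclear reference gene (two copies per diploid cell) and a given amplification efficiency.

    ```python
    def mtdna_copies_per_cell(ct_mito, ct_nuclear, efficiency=2.0):
        """MtDNA copies per diploid cell from qPCR Ct values.

        A diploid cell carries two copies of a single-copy nuclear
        reference, so copies = 2 * E^(Ct_nuclear - Ct_mito), where E is
        the amplification efficiency (2.0 = perfect doubling per cycle).
        """
        return 2.0 * efficiency ** (ct_nuclear - ct_mito)

    # A mitochondrial target crossing threshold 10 cycles before the
    # nuclear reference implies ~2^10 more template:
    copies = mtdna_copies_per_cell(ct_mito=15.0, ct_nuclear=25.0)
    ```

    The correction the paper argues for enters upstream of this arithmetic: if the mitochondrial primers also amplify NumtS, Ct_mito is artificially low and the computed copy number is inflated.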

  3. Impact of local electrostatic field rearrangement on field ionization

    Science.gov (United States)

    Katnagallu, Shyam; Dagan, Michal; Parviainen, Stefan; Nematollahi, Ali; Grabowski, Blazej; Bagot, Paul A. J.; Rolland, Nicolas; Neugebauer, Jörg; Raabe, Dierk; Vurpillot, François; Moody, Michael P.; Gault, Baptiste

    2018-03-01

    Field ion microscopy allows for direct imaging of surfaces with true atomic resolution. The high charge density distribution on the surface generates an intense electric field that can induce ionization of gas atoms. We investigate the dynamic nature of the charge, and the consequent electrostatic field redistribution, following the departure as ions of atoms initially constituting the surface, a process known as field evaporation. We report on a new algorithm for image processing and tracking of individual atoms on the specimen surface, enabling quantitative assessment of shifts in the imaged atomic positions. By combining experimental investigations with molecular dynamics simulations that include the full electric charge, we confirm that this change is directly associated with the rearrangement of the electrostatic field that modifies the imaging-gas ionization zone. We derive important considerations for future developments of data reconstruction in 3D field ion microscopy, in particular for precise quantification of lattice strains and characterization of crystalline defects at the atomic scale.

  4. Use of capillary Western immunoassay (Wes) for quantification of dystrophin levels in skeletal muscle of healthy controls and individuals with Becker and Duchenne muscular dystrophy.

    Science.gov (United States)

    Beekman, Chantal; Janson, Anneke A; Baghat, Aabed; van Deutekom, Judith C; Datson, Nicole A

    2018-01-01

    Duchenne muscular dystrophy (DMD) is a neuromuscular disease characterized by progressive weakness of the skeletal and cardiac muscles. This X-linked disorder is caused by mutations in the DMD gene that disrupt the open reading frame, resulting in strong reduction or complete absence of the dystrophin protein. In order to use dystrophin as a supportive or even surrogate biomarker in clinical studies on investigational drugs aiming at correcting the primary cause of the disease, the ability to reliably quantify dystrophin expression in muscle biopsies of DMD patients pre- and post-treatment is essential. Here we demonstrate the application of the ProteinSimple capillary immunoassay (Wes) method, a gel- and blot-free method requiring less sample, antibody and time than a conventional Western blot assay. We optimized dystrophin quantification by Wes using 2 different antibodies and found it to be highly sensitive, reproducible and quantitative over a large dynamic range. Using a healthy control muscle sample as a reference and α-actinin as a protein loading/muscle content control, a panel of skeletal muscle samples consisting of 31 healthy controls, 25 Becker muscular dystrophy (BMD) and 17 DMD samples was subjected to Wes analysis. In healthy controls dystrophin levels varied 3- to 5-fold between the highest and lowest muscle samples, with the reference sample representing the average of all 31 samples. In BMD muscle samples dystrophin levels ranged from 10% to 90% of the healthy muscle average, with an average of 33%, while for the DMD samples the average dystrophin level was 1.3%, ranging from 0.7% to 7% of the healthy muscle average. In conclusion, Wes is a suitable, efficient and reliable method for quantification of dystrophin expression as a biomarker in DMD clinical drug development.
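
    The normalisation described above (dystrophin signal over α-actinin loading control, expressed against a healthy reference) reduces to a simple double ratio. A sketch, with arbitrary illustrative signal values rather than real Wes peak areas:

    ```python
    def dystrophin_percent(dys_signal, actinin_signal,
                           ref_dys_signal, ref_actinin_signal):
        """Express a sample's dystrophin level as % of a healthy reference.

        Each dystrophin signal is first divided by the alpha-actinin
        signal (loading / muscle-content control), then by the same ratio
        for the reference sample.
        """
        sample_ratio = dys_signal / actinin_signal
        ref_ratio = ref_dys_signal / ref_actinin_signal
        return 100.0 * sample_ratio / ref_ratio

    # A sample with half the reference's dystrophin-to-actinin ratio:
    level = dystrophin_percent(5.0, 10.0, 10.0, 10.0)  # 50.0 (%)
    ```

    Dividing by the loading control first is what makes the comparison robust to differences in total muscle protein loaded per capillary.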

  5. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    International Nuclear Information System (INIS)

    Ahmadkhaniha, Reza; Shafiee, Abbas; Rastkari, Noushin; Kobarfard, Farzad

    2009-01-01

    Determination of endogenous steroids in complex matrices such as cattle's meat is a challenging task. Since endogenous steroids are always present in animal tissues, no analyte-free matrix is available for constructing the standard calibration line, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy was developed in this study, named the 'surrogate analyte approach', which is based on using isotope-labeled standards instead of the natural forms of endogenous steroids to prepare the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. The accuracy of this method is better than that of other methods at low concentrations, and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis
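
    The surrogate-analyte idea reduces to ordinary calibration arithmetic once the labeled standard is assumed to respond identically to the natural steroid (the method's core assumption). A sketch with made-up calibration numbers:

    ```python
    def fit_line(xs, ys):
        """Least-squares slope and intercept for a calibration line."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    def quantify(response, slope, intercept):
        """Read a sample concentration off the calibration line."""
        return (response - intercept) / slope

    # Calibration built by spiking the isotope-labelled surrogate into
    # the real (steroid-containing) matrix -- illustrative numbers only:
    levels = [0.5, 1.0, 2.0, 5.0]      # ng/g spiked surrogate
    areas = [0.9, 2.1, 3.9, 10.1]      # instrument response
    m, b = fit_line(levels, areas)
    endogenous = quantify(6.0, m, b)   # natural-analyte response -> ng/g
    ```

    Because the calibration points are built from the labeled analog, the ever-present endogenous analyte no longer contaminates the calibration line, which is exactly what the analyte-free-matrix problem prevents with conventional external calibration.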

  6. A field method for monitoring thoron-daughter working level

    International Nuclear Information System (INIS)

    Khan, A.H.; Dhandayatham, R.; Raghavayya, M.; Nambiar, P.P.V.J.

    1975-01-01

    The concept of working level, generally used for radon daughters, has been extended to the daughter products of thoron. Accordingly, the thoron-daughter working level (TWL) has been defined as the alpha energy released from the ultimate decay of 100 pCi/l each of the short-lived decay products of thoron. In order to facilitate the evaluation of inhalation hazard in thorium handling areas, a simple field method is suggested to measure the thoron-daughter working level. A comparison of the potential alpha energies from radon daughters and from thoron daughters is included. (K.B.)

  7. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.

  8. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose: Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods: In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures by simulating known 3D surgical changes within CMFapp. Results: Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describes corresponding surface distances that precisely locate changes, and difference vectors indicate the directionality and magnitude of changes. Conclusions: SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control, and enhance the follow-up and documentation of clinical cases. PMID:21161693
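
    The surface distances and difference vectors described above reduce, per vertex, to simple vector arithmetic once the two models share point correspondence (which the SPHARM-PDM parameterization provides). A toy sketch with hypothetical coordinates:

    ```python
    import math

    def displacement_vectors(pre, post):
        """Per-vertex displacement between corresponding surface points.

        Assumes the two meshes share vertex correspondence, so each pair
        of points yields a direction vector and a magnitude (the local
        surface distance between pre- and post-surgery models).
        """
        out = []
        for (x0, y0, z0), (x1, y1, z1) in zip(pre, post):
            v = (x1 - x0, y1 - y0, z1 - z0)
            out.append((v, math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)))
        return out

    # One vertex moved 3 mm laterally and 4 mm forward:
    vectors = displacement_vectors([(0.0, 0.0, 0.0)], [(3.0, 4.0, 0.0)])
    ```

    The magnitudes drive the color maps, while the raw vectors carry the directionality information the abstract mentions.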

  9. QUANTIFICATION AND BIOREMEDIATION OF ENVIRONMENTAL SAMPLES BY DEVELOPING A NOVEL AND EFFICIENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Osama

    2014-06-01

    Full Text Available Pleurotus ostreatus, a white rot fungus, is capable of bioremediating a wide range of organic contaminants, including Polycyclic Aromatic Hydrocarbons (PAHs). Ergosterol is produced by living fungal biomass and used as a measure of fungal biomass. The first part of this work deals with the extraction and quantification of PAHs from contaminated sediments by the Lipid Extraction Method (LEM). The second part consists of the development of a novel extraction method, the Ergosterol Extraction Method (EEM), followed by quantification and bioremediation. The novelty of this method is the simultaneous extraction and quantification of two different types of compounds, a sterol (ergosterol) and PAHs, and it is more efficient than LEM. EEM successfully extracted ergosterol from the fungus grown on barley at concentrations of 17.5-39.94 µg g-1, and quantified considerably more PAHs, in both number and amount, than LEM. In addition, cholesterol, usually found in animals, has also been detected in the fungus P. ostreatus at easily detectable levels.

  10. Non-Gaussianity at tree and one-loop levels from vector field perturbations

    International Nuclear Information System (INIS)

    Valenzuela-Toledo, Cesar A.; Rodriguez, Yeinzon; Lyth, David H.

    2009-01-01

    We study the spectrum P_ζ and bispectrum B_ζ of the primordial curvature perturbation ζ when the latter is generated by scalar and vector field perturbations. The tree-level and one-loop contributions from vector field perturbations are worked out, considering the possibility that the one-loop contributions may be dominant over the tree-level terms (in P_ζ and/or in B_ζ) and vice versa. The level of non-Gaussianity in the bispectrum, f_NL, is calculated and related to the level of statistical anisotropy in the power spectrum, g_ζ. For very small amounts of statistical anisotropy in the power spectrum, the level of non-Gaussianity may be very high, in some cases exceeding the current observational limit.
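
    For reference, the anisotropy parameter g_ζ quoted above is conventionally defined through the standard (Ackerman-Carroll-Wise) parametrization of a power spectrum with a preferred direction; this form is standard in the literature rather than taken from this abstract:

    ```latex
    P_\zeta(\mathbf{k}) \;=\; P_\zeta^{\mathrm{iso}}(k)\left[\,1 + g_\zeta\,(\hat{\mathbf{k}}\cdot\hat{\mathbf{n}})^2\,\right]
    ```

    where n̂ is the preferred spatial direction singled out by the vector field and P_ζ^iso is the isotropic part of the spectrum.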

  11. Quantification of source-term profiles from near-field geochemical models

    International Nuclear Information System (INIS)

    McKinley, I.G.

    1985-01-01

    A geochemical model of the near-field is described which quantitatively treats the processes of engineered barrier degradation, buffering of aqueous chemistry by solid phases, nuclide solubilization and transport through the near-field and release to the far-field. The radionuclide source-terms derived from this model are compared with those from a simpler model used for repository safety analysis. 10 refs., 2 figs., 2 tabs

  12. Computerized systems for high level information processing and decision making in the field of PSA

    International Nuclear Information System (INIS)

    Kafka, P.; Kunitz, H.

    1990-01-01

    A comprehensive review of Probabilistic Safety Assessment (PSA) related program packages is made. Three fields in methodological succession are covered: plant modeling, data quantification procedures and decision-making support. Packages for fault tree construction and minimal cut set evaluation are referred to, and the performances of three of them, RALLY, ORCHARD and SALP-PC, are discussed and compared. The raw data sources are outlined, and the Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is given as an example of a data base management system generating PSA data. Aggregated risk models for support in safety assessment, plant operation and accident management (SARA, ESSM, PRISIM) are cited. Examples of systems supporting 'living PSA' (SAIS, SUPER-NET, LESSEPS 1300, NUPRA, SPSA) are given. The concluding remarks outline state-of-the-art developments in computerized systems for reliability analyses. 1 fig., 1 tab., 51 refs. (R.Ts)

  13. Circulating Management Ideas: Towards a Better Understanding of the Reciprocal Relationships between Field-level Dynamics and Micro-level Practices

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Agger; Mathiassen, Lars; Newell, Sue

    Understanding how management ideas spread across countries and fields, and how they are adopted and implemented in individual organizations, has been of growing interest to scholars working in the institutional perspective. Although core institutional arguments depend on analyzing the interaction between......-level dynamics and micro-level practices are understood as mutually constitutive, rather than as distinct levels, as they interact recursively to enable, or hinder, institutionalization. To illustrate and further develop this multi-level model we contribute two longitudinal case studies of how management ideas...... as management ideas emerge, consolidate or wither away. By bringing translation research into the more well-established approach of management fashion and leveraging insights from the institutional work literature, this paper offers a multi-level model for investigating circulating management ideas in which field...

  14. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    Science.gov (United States)

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, accuracy and precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L-1 as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L-1 as cation). The LOQ of the methodology was 0.06 ng L-1 and the average measurement uncertainty at the LOQ was 36%, which agrees with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainty, as well as the interlaboratory comparison results, are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Effects of RF low levels electromagnetic fields on Paramecium primaurelia

    International Nuclear Information System (INIS)

    Tofani, S.; Testa, B.; Agnesod, G.; Tartagbino, L.; Bonazzola, G.C.

    1988-01-01

    In recent years many studies have been performed to examine the biological effects of prolonged exposure to low-level electric fields. This interest is linked to the possibility of specific interactions, also related to exposure duration, between electromagnetic fields and biological systems without remarkable enhancement of the organism's temperature. Hence the need to investigate in vitro, while varying the physical exposure parameters, the possible cellular regulation mechanisms involved in these interactions.

  16. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field in which to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods, like reproducibility, independence, and straightforward observation, are complicated by the representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed to correlate diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties stemming from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets and serve as intermediary steps for statistical experimentation.

  17. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
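
    The hierarchical node-and-edge description of a thallus can be sketched with a plain tree structure. This is an illustrative data layout, not MorphoSnake's actual representation; all segment values are invented.

    ```python
    # Each segment: (parent_id, length, width, angle_to_parent_deg).
    # The root segment has parent None.  Numbers are illustrative only.
    thallus = {
        0: (None, 4.0, 1.2, 0.0),
        1: (0, 2.5, 0.8, 35.0),
        2: (0, 2.2, 0.7, -40.0),
        3: (1, 1.1, 0.4, 30.0),
    }

    def branching_order(node, graph):
        """Hierarchical level of a segment: root = 0, its branches = 1, ..."""
        order, parent = 0, graph[node][0]
        while parent is not None:
            order += 1
            parent = graph[parent][0]
        return order

    def length_per_order(graph):
        """Total segment length at each level of organisation."""
        totals = {}
        for node in graph:
            o = branching_order(node, graph)
            totals[o] = totals.get(o, 0.0) + graph[node][1]
        return totals
    ```

    Traits such as widths, lengths and branching angles then become per-node attributes that can be aggregated at any hierarchical level, which is what makes homologous, repeatable measurements across specimens possible.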

  18. Extension of TFTR operations to higher toroidal field levels

    International Nuclear Information System (INIS)

    Woolley, R.D.

    1995-01-01

    For the past year, TFTR has sometimes operated at extended toroidal field (TF) levels. The extension to 5.6 Tesla (79 kA) was crucial for TFTR's November 1994 10.7 MW DT fusion power record. The extension to 6.0 Tesla (85 kA) was commissioned on 9 September 1995. There are several reasons to expect the TF coils to survive the higher stresses that develop at higher fields. They were designed to operate at 5.2 Tesla with a vertical field of 0.5 Tesla, whereas the actual vertical field needed for the plasma does not exceed 0.35 Tesla. Their design specification explicitly required that they survive some pulses at 6.0 Tesla. The TF coil mechanical analysis computer models available during coil design were crude, leading to a conservative design. Finally, the design analyses also had to consider worst-case misoperations that TFTR's real-time Coil Protection Calculators (CPCs) now positively prevent from occurring.
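The two operating points quoted in the abstract are consistent with the toroidal field scaling linearly with TF coil current, which a quick check confirms (assuming simple proportionality, which holds for a fixed coil geometry):

```python
# The abstract pairs 5.6 T with 79 kA and 6.0 T with 85 kA. For a fixed
# TF coil geometry the field scales linearly with coil current, so both
# operating points should share one tesla-per-kiloamp constant.
t_per_ka = 5.6 / 79.0          # ~0.0709 T/kA from the 1994 record point
b_at_85ka = t_per_ka * 85.0    # predicted field at the 85 kA extension
print(f"{b_at_85ka:.2f} T")    # → 6.03 T, matching the quoted 6.0 Tesla
```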

  19. Quantification of arbuscular mycorrhizal fungal DNA in roots: how important is material preservation?

    Science.gov (United States)

    Janoušková, Martina; Püschel, David; Hujslová, Martina; Slavíková, Renata; Jansa, Jan

    2015-04-01

    Monitoring populations of arbuscular mycorrhizal fungi (AMF) in roots is a pre-requisite for improving our understanding of AMF ecology and functioning of the symbiosis in natural conditions. Among other approaches, quantification of fungal DNA in plant tissues by quantitative real-time PCR is one of the advanced techniques with a great potential to process large numbers of samples and to deliver truly quantitative information. Its application potential would greatly increase if the samples could be preserved by drying, but little is currently known about the feasibility and reliability of fungal DNA quantification from dry plant material. We addressed this question by comparing quantification results based on dry root material to those obtained from deep-frozen roots of Medicago truncatula colonized with Rhizophagus sp. The fungal DNA was well conserved in the dry root samples with overall fungal DNA levels in the extracts comparable with those determined in extracts of frozen roots. There was, however, no correlation between the quantitative data sets obtained from the two types of material, and data from dry roots were more variable. Based on these results, we recommend dry material for qualitative screenings but advocate using frozen root materials if precise quantification of fungal DNA is required.

  20. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    Science.gov (United States)

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest in reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system, because a second, independent real-time PCR-based measurement is required.
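The plasmid-standard idea rests on a standard curve: known plasmid copy numbers give Ct values, a line is fitted to Ct versus log10(copies), and unknowns are read off the inverted line. The sketch below uses an idealized dilution series (slope -3.32, i.e. ~100% PCR efficiency); the numbers are illustrative, not from the paper.

```python
import math

# Hypothetical plasmid dilution series: known copy numbers and Ct values
# following an ideal curve Ct = 38.0 - 3.32 * log10(copies). A real
# calibration would fit measured Ct values instead.
standards = [(10**k, 38.0 - 3.32 * k) for k in range(2, 7)]  # 1e2..1e6 copies

# Least-squares fit of Ct against log10(copies).
xs = [math.log10(c) for c, _ in standards]
ys = [ct for _, ct in standards]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 means perfect per-cycle doubling

def copies_from_ct(ct):
    """Invert the standard curve to quantify an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.0%}")
print(f"Ct 28.04 -> {copies_from_ct(28.04):.0f} copies")  # → 1000 copies
```

Absolute quantification against such a curve is what lets the system skip the reference-gene normalizer the abstract discusses: the copy number comes directly from one measurement and one calibration.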

  1. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    Science.gov (United States)

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels: from individual dynamics, to dyadic dynamics, up to global group-level dynamics of groups of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
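The core of recurrence quantification on a multidimensional series is simple: treat each multivariate sample as a point, mark pairs of points closer than a radius as recurrent, and summarize the recurrence matrix. The paper's implementation is in MATLAB and computes many measures; the sketch below is a minimal Python analogue computing only the recurrence rate, with a made-up three-dimensional "group" signal.

```python
import math

def mdrqa_recurrence_rate(series, radius):
    """series: list of d-dimensional samples (tuples). Returns the
    recurrence rate: the fraction of point pairs closer than `radius`."""
    n = len(series)
    rec = 0
    for i in range(n):
        for j in range(n):
            if math.dist(series[i], series[j]) <= radius:
                rec += 1
    return rec / (n * n)

# Toy 3-dimensional signal: three phase-shifted oscillators standing in
# for three group members' time series.
ts = [t / 10.0 for t in range(100)]
signal = [(math.sin(t), math.sin(t + 1.0), math.sin(t + 2.0)) for t in ts]

rr = mdrqa_recurrence_rate(signal, radius=0.3)
print(f"recurrence rate: {rr:.3f}")
```

Further MdRQA measures (determinism, laminarity, etc.) are statistics over diagonal and vertical line structures in the same recurrence matrix, so this matrix is the common starting point.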

  2. High field MRI in the diagnosis of multiple sclerosis: high field-high yield?

    International Nuclear Information System (INIS)

    Wattjes, Mike P.; Barkhof, Frederik

    2009-01-01

    Following approval by the U.S. Food and Drug Administration (FDA), high field magnetic resonance imaging (MRI) has been increasingly incorporated into the clinical setting. In the field of neuroimaging especially, the number of high field MRI applications has increased dramatically. Taking advantage of increased signal-to-noise ratio (SNR) and chemical shift, higher magnetic field strengths offer new perspectives, particularly in brain imaging, as well as challenges arising from several technical and physical consequences. Over the past few years, many applications of high field MRI in patients with suspected and definite multiple sclerosis (MS) have been reported, including conventional and quantitative MRI methods. Conventional pulse sequences at 3 T offer higher lesion detection rates compared to 1.5 T, particularly in anatomic regions that are important for the diagnosis of patients with MS. MR spectroscopy at 3 T is characterized by improved spectral resolution due to increased chemical shift, allowing better quantification of metabolites. It detects significant axonal damage even in patients presenting with clinically isolated syndromes and can quantify metabolites of special interest, such as glutamate, which are technically difficult to quantify at lower field strengths. Furthermore, the higher susceptibility and SNR offer advantages in the fields of functional MRI and diffusion tensor imaging. The recently introduced generation of ultra-high field systems beyond 3 T allows scanning at submillimeter resolution and gives new insights into in vivo MS pathology on MRI. The objectives of this article are to review the current knowledge and level of evidence concerning the application of high field MRI in MS and to outline research perspectives for the future. (orig.)

  3. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Full Text Available Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP), but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolative forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry, with reference to South Africa. The theory and methodology of quantification are briefly reviewed, and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience, often personal, of the researcher, and this scholarly endeavour is therefore not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.
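Scaling indicators into an index, as the abstract describes, typically means rescaling each indicator to a common range and taking a weighted mean. The sketch below is purely illustrative: the indicator names, scores, and weights are invented and are not the article's own index.

```python
# Hypothetical indicator scores (0-10 scales) and weights; both the
# indicators and the weights are invented for illustration.
indicators = {
    "political_instability": 6.0,
    "violent_crime": 7.5,
    "currency_volatility": 5.0,
    "infrastructure_failure": 4.0,
}
weights = {
    "political_instability": 0.35,
    "violent_crime": 0.30,
    "currency_volatility": 0.20,
    "infrastructure_failure": 0.15,
}

# Composite risk index on a 0-1 scale: weighted mean of rescaled scores.
risk_index = sum(weights[k] * (indicators[k] / 10.0) for k in indicators)
print(f"composite tourism risk index: {risk_index:.3f}")  # → 0.595
```

The methodological burden the article stresses sits entirely in choosing and justifying the indicators and weights; the arithmetic itself is trivial once those judgments are made.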

  4. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. Standardless algorithms are considerably faster than methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to solve the problem of beam-current variation, for example in instruments with a cold field-emission gun. Owing to the significant advances in accuracy achieved during recent years, the product of successive efforts to improve the description of the generation, absorption and detection of X-rays, standardless methods have increasingly become an interesting option for the user. Nevertheless, algorithms that use standards are still more precise than standardless methods. It is important to note that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.
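The abstract's warning about normalization can be made concrete: if the raw standardless estimates carry a systematic bias, forcing them to sum to 100% hides that bias from the user. The numbers below are made up to illustrate the effect.

```python
# Illustrative (made-up) raw concentration estimates from a standardless
# analysis of a three-element sample; they sum to only 92% because of a
# systematic error in the modeled X-ray intensities.
raw = {"Fe": 0.60, "Ni": 0.22, "Cr": 0.10}

total = sum(raw.values())
normalized = {el: c / total for el, c in raw.items()}

# Normalization forces the total to exactly 100% and hides the 8%
# shortfall, which is why the abstract insists on reporting error
# estimates alongside normalized standardless results.
print(f"raw total: {total:.2f}")
print({el: round(c, 3) for el, c in normalized.items()})
```

The analytical total before normalization is itself a useful diagnostic: a total far from 100% signals that the physical model (or the sample geometry) is off, information that the normalized result no longer carries.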

  5. Bayesian uncertainty quantification for flows in heterogeneous porous media using reversible jump Markov chain Monte Carlo methods

    KAUST Repository

    Mondal, A.

    2010-03-01

    In this paper, we study the uncertainty quantification in inverse problems for flows in heterogeneous porous media. Reversible jump Markov chain Monte Carlo (MCMC) algorithms are used for hierarchical modeling of channelized permeability fields. Within each channel, the permeability is assumed to have a lognormal distribution. Uncertainty quantification in history matching is carried out hierarchically by constructing geologic facies boundaries as well as permeability fields within each facies using dynamic data such as production data. The search with the Metropolis-Hastings algorithm results in a very low acceptance rate and, consequently, the computations are CPU demanding. To speed up the computations, we use a two-stage MCMC that utilizes upscaled models to screen the proposals. In our numerical results, we assume that the channels intersect the wells and the intersection locations are known. Our results show that the proposed algorithms are capable of capturing the channel boundaries and describe the permeability variations within the channels using dynamic production history at the wells. © 2009 Elsevier Ltd. All rights reserved.
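The two-stage screening idea generalizes beyond porous-media problems: a cheap "coarse" density filters proposals, and only survivors pay for an expensive "fine" evaluation, with a second acceptance step that keeps the chain targeting the fine posterior. Below is a generic 1D sketch of that scheme (in the spirit of Christen-Fox / Efendiev-style two-stage MCMC); the two densities are toy stand-ins, not the paper's upscaled flow models.

```python
import math
import random

random.seed(1)

def fine_logpost(x):
    # "Fine-scale" target: standard normal log-density (expensive in practice).
    return -0.5 * x * x

def coarse_logpost(x):
    # Cheap "upscaled" approximation: a slightly wider normal.
    return -0.5 * x * x / 1.5

def two_stage_mcmc(n_steps, step=1.0):
    x = 0.0
    samples, fine_evals = [], 0
    for _ in range(n_steps):
        y = x + random.gauss(0.0, step)
        # Stage 1: screen the proposal using only the coarse model.
        if math.log(random.random()) < coarse_logpost(y) - coarse_logpost(x):
            # Stage 2: correct with the fine model so the chain still
            # targets the fine posterior. (A real implementation would
            # cache fine_logpost(x) rather than recompute it.)
            fine_evals += 1
            log_a2 = (fine_logpost(y) - fine_logpost(x)) \
                   + (coarse_logpost(x) - coarse_logpost(y))
            if math.log(random.random()) < log_a2:
                x = y
        samples.append(x)
    return samples, fine_evals

samples, fine_evals = two_stage_mcmc(20000)
mean = sum(samples) / len(samples)
print(f"posterior mean ~ {mean:.2f}, fine evaluations: {fine_evals}")
```

Proposals rejected at stage 1 never touch the fine model, which is exactly where the savings come from when, as in the paper, fine evaluations require a forward flow simulation.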

  6. Exposure levels to electromagnetic fields in usual operative situations

    International Nuclear Information System (INIS)

    Bernardi, C.; Bernardi, T.; Testoni, G.; Zannoli, R.; Tubertini, O.

    1997-01-01

    In the last few years, the general population has been repeatedly alerted by the media to the possible negative effects of the E.M. fields involved in everyday social activities. This has created a need for risk evaluations under different conditions, supported by accurate measurement protocols. This paper describes the procedures and results of measurements in four different situations, involving the general population and/or workers in a specific field. The results have been used both to improve knowledge of E.M. exposure levels and to evaluate the risks with respect to the national rules and guidelines. (authors)

  7. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly influence the course of the disease. The quantification of HIV-1 DNA in blood samples is currently the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and there is a need for standardization to obtain comparable results. In addition, qPCR is limited by background noise when precisely quantifying low DNA levels. Among new assays in development, digital PCR has been shown to allow accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
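Digital PCR achieves the absolute quantification mentioned above by partitioning the sample and Poisson-correcting the count of positive partitions. The correction itself is standard; the ~0.85 nL droplet volume below is a commonly quoted value for droplet-based systems and is an assumption here, not a figure from the review.

```python
import math

def ddpcr_copies(positive, total, partition_volume_ul=0.00085):
    """Poisson-corrected target copies per microliter from droplet counts.

    The fraction of negative droplets estimates exp(-lambda), where
    lambda is the mean number of copies per droplet; inverting gives
    an absolute concentration with no standard curve needed.
    """
    neg_fraction = (total - positive) / total
    lam = -math.log(neg_fraction)      # mean copies per droplet
    return lam / partition_volume_ul   # copies per microliter of reaction

# Example: 4000 of 15000 droplets positive.
print(f"{ddpcr_copies(4000, 15000):.0f} copies/uL")  # → 365 copies/uL
```

The Poisson correction is why digital PCR stays quantitative even when some droplets contain more than one copy, and why it avoids the calibration-curve variability that affects qPCR at low copy numbers.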

  8. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  9. Sensitive quantification of the HIV-1 reservoir in gut-associated lymphoid tissue.

    Science.gov (United States)

    Morón-López, Sara; Puertas, Maria C; Gálvez, Cristina; Navarro, Jordi; Carrasco, Anna; Esteve, Maria; Manyé, Josep; Crespo, Manel; Salgado, Maria; Martinez-Picado, Javier

    2017-01-01

    The implementation of successful strategies to achieve an HIV cure has become a priority in HIV research. However, the current location and size of HIV reservoirs are still unknown, since there are limited tools to evaluate HIV latency in viral sanctuaries such as gut-associated lymphoid tissue (GALT). As reported for the so-called "Boston patients", despite undetectable levels of proviral HIV-1 DNA in blood and GALT, viral rebound happened just a few months after ART interruption. This may imply that current methods are not sensitive enough to detect residual reservoirs. It is therefore imperative to improve the detection and quantification of the HIV-1 reservoir in tissue samples. Herein, we propose a novel non-enzymatic protocol for the purification of lamina propria leukocytes (LPLs) from gut biopsies, combined with quantification of viral HIV DNA (vDNA) by droplet digital PCR (ddPCR), to improve the sensitivity and accuracy of viral reservoir measurements (LPL-vDNA assay). Endoscopic ileum biopsies were sampled from 12 HIV-1-infected cART-suppressed subjects. We performed a DTT/EDTA-based treatment for epithelial layer removal, followed by non-enzymatic disruption of the tissue, to obtain a lamina propria (LP) cell suspension. CD45+ cells were subsequently purified by flow sorting, and vDNA was determined by ddPCR. vDNA quantification levels were significantly higher in purified LPLs (CD45+) than in bulk LPs. The LPL-vDNA assay thus provides a more sensitive measurement of the viral reservoir in gut-associated viral sanctuaries and might be used to evaluate any proposed eradication strategy.

  10. Quantification of the level of crowdedness for pedestrian movements

    Science.gov (United States)

    Duives, Dorine C.; Daamen, Winnie; Hoogendoorn, Serge P.

    2015-06-01

    Within the realm of pedestrian research, numerous measures have been proposed to estimate the level of crowdedness experienced by pedestrians. However, within the field of pedestrian traffic flow modelling there does not seem to be consensus on which of these measures performs best. This paper shows that the shape of, and scatter within, the resulting fundamental diagrams differ considerably depending on the measure of crowdedness used. The main aim of the paper is to establish the advantages and disadvantages of the currently existing measures of crowdedness, in order to evaluate which measures provide both accurate and consistent results. The assessment is based not only on the theoretical differences, but also on the qualitative and quantitative differences between the fundamental diagrams computed by applying the crowdedness measures to one and the same data set. The qualitative and quantitative functioning of the classical grid-based measure is compared with the X-T measure, an exponentially weighted distance measure, and a Voronoi-diagram measure. The consistency of relating these measures of crowdedness to the two macroscopic flow variables, velocity and flow, the computational efficiency, and the amount of scatter present within the fundamental diagrams produced by the different measures are reviewed. It is found that the Voronoi-diagram and X-T measures are the most efficient and consistent measures of crowdedness.
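The Voronoi-based idea can be shown in one dimension: each pedestrian's Voronoi cell on a corridor line spans the midpoints to its neighbours, and the personal density is the inverse of the cell size. The paper works with 2D Voronoi diagrams; this 1D simplification (with invented positions) only illustrates the principle.

```python
def voronoi_density_1d(positions):
    """Per-pedestrian density (1/m) from 1D Voronoi cells.

    Each interior pedestrian's cell runs from the midpoint to its left
    neighbour to the midpoint to its right neighbour; boundary cells are
    unbounded, so no density is assigned there.
    """
    xs = sorted(positions)
    densities = []
    for i in range(len(xs)):
        if i == 0 or i == len(xs) - 1:
            densities.append(None)        # unbounded boundary cell
            continue
        cell = (xs[i + 1] - xs[i - 1]) / 2.0   # cell length in metres
        densities.append(1.0 / cell)           # pedestrians per metre
    return densities

positions = [0.0, 1.0, 1.5, 2.0, 4.0]
print(voronoi_density_1d(positions))
```

Unlike a grid-based count, this assigns a smooth, individual density that changes continuously as pedestrians move, which is one reason Voronoi-type measures produce less scatter in the fundamental diagram.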

  11. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  12. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  13. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
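The conversion factor reported in the abstract (1 EBV genomic copy = 0.62 IU for this assay) makes it easy to move between units and to reason about log-scale differences between assays; a minimal sketch using only figures stated in the abstract:

```python
import math

# Conversion reported in the abstract for the Real-Q assay:
# IU/mL = copies/mL * 0.62.
def copies_to_iu(copies_per_ml, factor=0.62):
    return copies_per_ml * factor

viral_load_copies = 50000.0          # example copy-based viral load
iu = copies_to_iu(viral_load_copies)
log10_diff = math.log10(viral_load_copies) - math.log10(iu)
print(f"{iu:.0f} IU/mL ({log10_diff:.2f} log10 lower than the copy value)")
```

A fixed multiplicative conversion is a constant offset on the log10 scale (here ~0.21 log10), which is why inter-assay comparisons such as the 0.54 log10 difference against the artus assay are reported in log units.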

  14. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    Science.gov (United States)

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.
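Stakeholder quantification of the kind described above ultimately reduces to scoring candidates against weighted attributes and ranking them. The sketch below is entirely hypothetical: the attribute names, weights, and scores are invented for illustration, and the StakeMeter framework defines its own metrics and quantification criteria.

```python
# Hypothetical attribute weights (must sum to 1) and per-stakeholder
# scores on 0-10 scales; all names and numbers are invented.
weights = {"domain_knowledge": 0.4, "influence": 0.35, "availability": 0.25}

stakeholders = {
    "product_owner": {"domain_knowledge": 9, "influence": 8, "availability": 6},
    "end_user_rep":  {"domain_knowledge": 7, "influence": 5, "availability": 9},
    "regulator":     {"domain_knowledge": 6, "influence": 9, "availability": 3},
}

def score(attrs):
    """Weighted sum of attribute scores for one stakeholder."""
    return sum(weights[a] * v for a, v in attrs.items())

# Rank stakeholders by score; the top of the list would be prioritized
# for the requirements elicitation phase.
ranked = sorted(stakeholders, key=lambda s: score(stakeholders[s]), reverse=True)
for name in ranked:
    print(f"{name}: {score(stakeholders[name]):.2f}")
```

Making the weights and scales explicit is what turns stakeholder selection from a judgment call into a repeatable procedure, which is the gap in earlier SIQ approaches that the abstract highlights.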

  15. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran Babar

    Full Text Available Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  16. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    Science.gov (United States)

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  17. Methods for the quantification of GHG emissions at the landscape level for developing countries in smallholder contexts

    International Nuclear Information System (INIS)

    Milne, Eleanor; Easter, Mark; Ogle, Stephen; Denef, Karolien; Paustian, Keith; Neufeldt, Henry; Rosenstock, Todd; Smalligan, Mike; Cerri, Carlos Eduardo; Malin, Daniella; Bernoux, Martial; Casarim, Felipe; Pearson, Timothy; Bird, David Neil; Steglich, Evelyn; Ostwald, Madelene

    2013-01-01

    Landscape scale quantification enables farmers to pool resources and expertise. However, the problem remains of how to quantify these gains. This article considers current greenhouse gas (GHG) quantification methods that can be used in a landscape scale analysis in terms of relevance to areas dominated by smallholders in developing countries. In landscape scale carbon accounting frameworks, measurements are an essential element. Sampling strategies need careful design to account for all pools/fluxes and to ensure judicious use of resources. Models can be used to scale-up measurements and fill data gaps. In recent years a number of accessible models and calculators have been developed which can be used at the landscape scale in developing country areas. Some are based on the Intergovernmental Panel on Climate Change (IPCC) method and others on dynamic ecosystem models. They have been developed for a range of different purposes and therefore vary in terms of accuracy and usability. Landscape scale assessments of GHGs require a combination of ground sampling, use of data from census, remote sensing (RS) or other sources and modelling. Fitting of all of these aspects together needs to be performed carefully to minimize uncertainties and maximize the use of scarce resources. This is especially true in heterogeneous landscapes dominated by smallholders in developing countries. (letter)
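Among the IPCC-based calculators the abstract mentions, the Tier 1 stock-change approach for soil organic carbon is representative: a reference stock is scaled by land-use, management, and input factors, and the change between land uses is annualized over a default 20-year transition. The factor values below are illustrative assumptions, not values from the IPCC tables.

```python
# Sketch of the IPCC Tier 1 stock-change approach for soil organic carbon:
#   SOC = SOC_ref * F_LU * F_MG * F_I
# with the change between two land uses spread over a default 20-year
# transition period. Factor values here are illustrative only.
SOC_REF = 47.0           # t C/ha, reference stock for a climate/soil stratum
TRANSITION_YEARS = 20

def soc_stock(f_lu, f_mg, f_i):
    return SOC_REF * f_lu * f_mg * f_i

# Land-use change on a smallholder plot: degraded cropland -> agroforestry.
soc_before = soc_stock(f_lu=0.8, f_mg=0.9, f_i=0.95)
soc_after = soc_stock(f_lu=1.0, f_mg=1.1, f_i=1.0)
annual_change = (soc_after - soc_before) / TRANSITION_YEARS
print(f"annual SOC change: {annual_change:.2f} t C/ha/yr")  # → 0.98 t C/ha/yr
```

In a landscape analysis, such per-stratum calculations are run for each land-use/soil/climate combination and area-weighted, which is where ground sampling, census data, and remote sensing feed in as the abstract describes.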

  18. Quantification of the level of fat-soluble vitamins in feed based on the novel microemulsion electrokinetic chromatography (MEEKC) method.

    Science.gov (United States)

    Olędzka, Ilona; Kowalski, Piotr; Bałuch, Alicja; Bączek, Tomasz; Paradziej-Łukowicz, Jolanta; Taciak, Marcin; Pastuszewska, Barbara

    2014-02-01

    Simultaneous quantification of liposoluble vitamins is not a new area of interest, since these compounds co-determine the nutritional quality of food and feed, a field widely explored in human and animal nutrition. However, the development of appropriate methods is still a matter of concern, especially when the vitamin composition is highly complex, as is the case with feed designated for laboratory animals of higher health and microbiological status. A method combining microemulsion electrokinetic chromatography (MEEKC) with liquid-liquid extraction was developed for the determination of four fat-soluble vitamins in animal feed. A separation medium consisting of 25 mmol L⁻¹ phosphate buffer (pH 2.5), 2-propanol, 1-butanol, sodium dodecyl sulfate and octane allowed the simultaneous determination of vitamins A, D, E and K within a reasonable time of 25 min. The polarity of the separation voltage was reversed in view of the strongly suppressed electro-osmotic flow, and the applied voltage was set at 12 kV. The fat-soluble vitamins were separated in order of decreasing hydrophobicity. The proposed MEEKC method proved sufficiently specific and sensitive for screening fat-soluble vitamins in animal feed samples after their sterilization. © 2013 Society of Chemical Industry.

  19. Resonance properties of a three-level atom with quantized field modes

    International Nuclear Information System (INIS)

    Yoo, H.I.

    1984-01-01

    A system of one three-level atom and one or two quantized electromagnetic field modes, coupled to each other by the dipole interaction within the rotating-wave approximation, is studied. All three atomic configurations, i.e., cascade, Lambda- and V-types, are treated simultaneously. The system is treated as closed, i.e., without interaction with external radiation field modes, to reveal its internal structures and symmetries. The general dynamics of the system are investigated under several distinct initial conditions, and their similarities to and differences from the dynamics of the Jaynes-Cummings model are revealed. Also investigated is the possibility of so-called coherent trapping of the atom in the quantized field modes in a resonator. An atomic coherent-trapping state exists only in limited cases, and it generally requires the field to be in special states, depending on the system. The discussion of coherent trapping is extended to a system of M identical three-level atoms. The stability of a coherent-trapping state when fluorescence can take place is discussed. The distinction between a system with resonator field modes and one with ideal laser modes is made clear, and the atomic relaxation to the coherent-trapping state when a Lambda-type atom is irradiated by two ideal laser beams is studied. The experimental prospects of observing the collapse-revival phenomena in the atomic occupation probabilities, which are characteristic of a system with quantized resonator field modes, are discussed.

  20. Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil

    Science.gov (United States)

    Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B. H.; Pinzari, F.

    2016-03-01

    A culture-independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) markers to the discriminating tracing of two different species of bio-inoculants in soil after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose soil-dwelling larvae threaten horticultural crops. Specificity of the SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii at known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungal species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Subsequently, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered a relatively low-cost solution for the detection, identification and traceability of fungal bio-inoculants in soil.

  1. Evaluation of 793/B-like and Mass-like vaccine strain kinetics in experimental and field conditions by real-time RT-PCR quantification.

    Science.gov (United States)

    Tucciarone, C M; Franzo, G; Berto, G; Drigo, M; Ramon, G; Koutoulis, K C; Catelli, E; Cecchinato, M

    2018-01-01

    Infectious bronchitis virus (IBV) imposes a great economic burden, both through production losses and through the cost of control strategies. Many different vaccination protocols are applied in the same region, and even in consecutive cycles on the same farm, in order to find the perfect balance between costs and benefits. In Northern Italy, the usual second vaccination is more and more often moved up to the chick's first day of life. Administering the second strain together with the common Mass priming by spray at the hatchery saves money and time and reduces animal stress. The present work compared the kinetics of different vaccine strains (Mass-like or B48, and 1/96) both in field conditions and in a 21-day experimental trial in broilers, monitoring viral replication by upper respiratory tract swabbing and vaccine-specific real-time reverse transcription PCR (RT-PCR) quantification. In both field and experimental conditions, titers for all the vaccines showed an increasing trend in the first 2 wk and then a decrease, though still remaining detectable during the whole monitored period. The presence of IBV field strains and of avian Metapneumovirus (aMPV) was also investigated, by RT-PCR and sequencing and by multiplex real-time RT-PCR, respectively, revealing a consistent pathogen introduction timing at around 30 d, corresponding to the main decrease in vaccine titers. These findings suggest the need for accurate knowledge of live vaccine kinetics, since vaccine replication can compete with that of other pathogens, providing additional protection beyond that conferred by the adaptive immune response. © 2017 Poultry Science Association Inc.

  2. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on a critical review of the existing methodologies employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the analyst's degree of confidence, i.e. subjective probability, that a given phenomenological event or accident process will or will not occur. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared according to the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. To this end, the report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, and a methodology for quantifying APET uncertainty inputs together with its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantifying APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
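
Statistical propagation of uncertain branch probabilities through an event tree can be illustrated with a minimal Monte Carlo sketch. The tree structure, distributions, and frequencies below are invented for illustration and are not taken from the report:

```python
# Minimal sketch of Monte Carlo propagation of subjective branch-probability
# distributions through a toy two-branch accident progression event tree.
# All distributions and frequencies are illustrative assumptions.
import random

random.seed(1)

CORE_DAMAGE_FREQ = 1e-5  # /yr, assumed point value for illustration

def sample_containment_failure_freq():
    # Uncertain phenomenological branch probabilities, modelled here as
    # uniform subjective distributions (illustrative only).
    p_early_fail = random.uniform(0.01, 0.10)   # e.g. early energetic failure
    p_late_fail  = random.uniform(0.05, 0.30)   # e.g. late basemat melt-through
    p_fail = p_early_fail + (1 - p_early_fail) * p_late_fail
    return CORE_DAMAGE_FREQ * p_fail

samples = sorted(sample_containment_failure_freq() for _ in range(10000))
median = samples[len(samples) // 2]
p95 = samples[int(0.95 * len(samples))]
print(f"median {median:.2e}/yr, 95th percentile {p95:.2e}/yr")
```

The output distribution, rather than a single point value, is what feeds the plant-risk uncertainty statement.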

  3. Accurate quantification of endogenous androgenic steroids in cattle's meat by gas chromatography mass spectrometry using a surrogate analyte approach

    Energy Technology Data Exchange (ETDEWEB)

    Ahmadkhaniha, Reza; Shafiee, Abbas [Department of Medicinal Chemistry, Faculty of Pharmacy and Pharmaceutical Sciences Research Center, Tehran University of Medical Sciences, Tehran 14174 (Iran, Islamic Republic of); Rastkari, Noushin [Center for Environmental Research, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Kobarfard, Farzad [Department of Medicinal Chemistry, School of Pharmacy, Shaheed Beheshti University of Medical Sciences, Tavaneer Ave., Valieasr St., Tehran (Iran, Islamic Republic of)], E-mail: farzadkf@yahoo.com

    2009-01-05

    Determination of endogenous steroids in complex matrices such as cattle meat is a challenging task. Since endogenous steroids always exist in animal tissues, no analyte-free matrix for constructing the standard calibration line is available, which is crucial for accurate quantification, especially at trace levels. Although some methods have been proposed to solve the problem, none has offered a complete solution. To this aim, a new quantification strategy, named the 'surrogate analyte approach', was developed in this study; it is based on using isotope-labeled standards instead of the natural form of endogenous steroids for preparing the calibration line. In comparison with the other methods currently in use for the quantitation of endogenous steroids, this approach provides improved simplicity and speed for analysis on a routine basis. Its accuracy is better than that of other methods at low concentrations and comparable to standard addition at medium and high concentrations. The method was also found to be valid according to the ICH criteria for bioanalytical methods. The developed method could be a promising approach in the field of compound residue analysis.
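
The surrogate-analyte idea — building the calibration line from an isotope-labelled standard that is absent from real tissue, then reading the natural steroid off that line — can be sketched as follows. All numbers are invented, and equal response factors for labelled and natural forms are assumed:

```python
# Illustrative sketch (invented values) of surrogate-analyte calibration:
# spike an analyte-free-equivalent matrix with the isotope-labelled
# standard, fit the calibration line, and quantify the natural analyte.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
        (xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# spiked labelled-standard concentrations (ng/g) vs detector response
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
resp = [0.052, 0.101, 0.198, 0.510, 1.005]
slope, intercept = fit_line(conc, resp)

# quantify the endogenous analyte from its own measured response,
# assuming the labelled and natural forms respond identically
sample_response = 0.30
estimate = (sample_response - intercept) / slope
print(round(estimate, 2))  # estimated concentration in ng/g
```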

  4. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
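
A distance-based readout of this kind is calibrated by relating colour-change distance to known standards and inverting the fit for an unknown. The linear model and all numbers below are hypothetical, not the device's actual response:

```python
# Hedged sketch of calibrating a distance-based paper device: fit the
# colour-change distance against known K+ standards and invert for an
# unknown. A linear model is assumed here purely for illustration; a
# real device may need a logarithmic or saturating response.

def calibrate(conc, dist):
    """Least-squares line: distance = m * concentration + b."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(dist) / n
    m = sum((x - mx) * (y - my) for x, y in zip(conc, dist)) / sum(
        (x - mx) ** 2 for x in conc)
    return m, my - m * mx

standards_mM = [1.0, 2.5, 5.0]   # known K+ standards
distance_mm  = [4.0, 10.0, 20.0]  # measured colour-change distances
m, b = calibrate(standards_mM, distance_mm)

unknown_distance = 14.0  # mm, read by eye from the device
estimate = (unknown_distance - b) / m
print(estimate)  # estimated K+ concentration in mM
```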

  5. Uncertainty quantification an accelerated course with advanced applications in computational engineering

    CERN Document Server

    Soize, Christian

    2017-01-01

    This book presents the fundamental notions and advanced mathematical tools for the stochastic modeling of uncertainties and their quantification in large-scale computational models in science and engineering. In particular, it focuses on parametric and non-parametric uncertainties, with applications in the structural dynamics and vibroacoustics of complex mechanical systems, and in the micromechanics and multiscale mechanics of heterogeneous materials. Resulting from a course developed by the author, the book begins with a description of the fundamental mathematical tools of probability and statistics that are directly useful for uncertainty quantification. It proceeds with a well-executed description of some basic and advanced methods for constructing stochastic models of uncertainties, paying particular attention to the problem of calibrating and identifying a stochastic model of uncertainty when experimental data are available. This book is intended to be a graduate-level textbook for stu...

  6. Tree level constraints on conformal field theories and string models

    International Nuclear Information System (INIS)

    Lewellen, D.C.

    1989-05-01

    Simple tree level constraints for conformal field theories which follow from the requirement of crossing symmetry of four-point amplitudes are presented, and their utility for probing general properties of string models is briefly illustrated and discussed. 9 refs

  7. FRET-based modified graphene quantum dots for direct trypsin quantification in urine

    Energy Technology Data Exchange (ETDEWEB)

    Poon, Chung-Yan; Li, Qinghua [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Zhang, Jiali; Li, Zhongping [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Dong, Chuan [Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Lee, Albert Wai-Ming; Chan, Wing-Hong [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Li, Hung-Wing, E-mail: hwli@hkbu.edu.hk [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong)

    2016-04-21

    A versatile nanoprobe based on fluorescence resonance energy transfer (FRET) was developed for trypsin quantification. Here, a fluorescent graphene quantum dot is utilized as the donor, while a well-designed coumarin derivative, CMR2, serves as the acceptor. Moreover, bovine serum albumin (BSA), as a protein model, serves not only as a linker for the FRET pair but also as a fluorescence enhancer of the quantum dots and CMR2. In the presence of trypsin, the FRET system is destroyed when the BSA is digested by trypsin. Thus, the emission peak of the donor is regenerated and the donor/acceptor emission-peak ratio increases. By ratiometric measurement of these two emission peaks, the trypsin content could be determined. The detection limit of trypsin was found to be 0.7 μg/mL, which is 0.008 times the average trypsin level in the urine of acute pancreatitis patients, suggesting a high potential for fast and low-cost clinical screening. - Highlights: • A FRET-based biosensor was developed for direct quantification of trypsin. • Fast and sensitive screening of pancreatic disease was facilitated. • The direct quantification of trypsin in urine samples was demonstrated.
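
The ratiometric readout can be sketched in a few lines: digestion of the BSA linker restores donor emission, so the donor/acceptor peak ratio rises with trypsin. The wavelengths and intensities below are invented for illustration:

```python
# Illustrative sketch of a ratiometric FRET readout. When the BSA linker
# is digested, energy transfer stops: donor emission recovers, acceptor
# emission drops, and the donor/acceptor ratio rises. Values are invented.

def peak_ratio(spectrum, donor_nm, acceptor_nm):
    """Ratio of emission intensity at the donor peak to the acceptor peak."""
    return spectrum[donor_nm] / spectrum[acceptor_nm]

# emission intensities at two hypothetical peak wavelengths (a.u.)
no_trypsin   = {460: 120.0, 560: 400.0}  # intact FRET pair
with_trypsin = {460: 380.0, 560: 150.0}  # BSA digested

r0 = peak_ratio(no_trypsin, 460, 560)
r1 = peak_ratio(with_trypsin, 460, 560)
print(r0, r1)  # the ratio rises once the FRET pair is disrupted
```

A calibration of ratio against known trypsin standards would then turn the ratio into a concentration.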

  8. Human exposure to Bisphenol A and liver health status: Quantification of urinary and circulating levels by LC-MS/MS.

    Science.gov (United States)

    Nicolucci, Carla; Errico, Sonia; Federico, Alessandro; Dallio, Marcello; Loguercio, Carmelina; Diano, Nadia

    2017-06-05

    A selective and highly sensitive analytical methodology for the determination of Bisphenol A in human plasma was developed and validated. The method was based on selective liquid/solid extraction combined with liquid chromatography-electrospray ionization tandem mass spectrometry in multiple reaction monitoring mode with negative ionization. The linearity of the detector response was verified in human plasma over the concentration range 0.100-200 ng mL⁻¹. The detection limit was 0.03 ng mL⁻¹ and the quantification limit was 0.100 ng mL⁻¹. The analytical features of the proposed in-house validated method were satisfactory. Bisphenol A was detected above the detection limit in all samples. The data show a persistence of unconjugated Bisphenol A levels in plasma and indicate chronic Bisphenol A exposure of the target organ, suggesting an association between liver health status and Bisphenol A exposure. The results from our study are valuable for further investigation with larger sample sizes and longitudinal study designs, necessary to confirm the observed association. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  10. The effect of a coupling field on the entanglement dynamics of a three-level atom

    International Nuclear Information System (INIS)

    Mortezapour, Ali; Mahmoudi, Mohammad; Abedi, Majid; Khajehpour, M R H

    2011-01-01

    The effect of a coupling laser field on the entanglement of a three-level quantum system and its spontaneous emission is investigated via the reduced quantum entropy. We consider two schemes: the upper- and lower-level couplings. By calculating the degree of entanglement (DEM) for both systems, it is shown that the entanglement between the atom and its spontaneous emission can be controlled by the coupling laser field. This field, however, affects the entanglement differently in the two schemes; it is only the lower-level coupling scheme that shows a non-zero steady state DEM which can be controlled by the intensity and detuning of the coupling laser field.

  11. The effect of a coupling field on the entanglement dynamics of a three-level atom

    Energy Technology Data Exchange (ETDEWEB)

    Mortezapour, Ali; Mahmoudi, Mohammad [Physics Department, Zanjan University, PO Box 45195-313, Zanjan (Iran, Islamic Republic of); Abedi, Majid; Khajehpour, M R H, E-mail: mahmoudi@iasbs.ac.ir, E-mail: pour@iasbs.ac.ir [Institute for Advanced Studies in Basic Sciences, PO Box 45195-159, Zanjan (Iran, Islamic Republic of)

    2011-04-28

    The effect of a coupling laser field on the entanglement of a three-level quantum system and its spontaneous emission is investigated via the reduced quantum entropy. We consider two schemes: the upper- and lower-level couplings. By calculating the degree of entanglement (DEM) for both systems, it is shown that the entanglement between the atom and its spontaneous emission can be controlled by the coupling laser field. This field, however, affects the entanglement differently in the two schemes; it is only the lower-level coupling scheme that shows a non-zero steady state DEM which can be controlled by the intensity and detuning of the coupling laser field.

  12. La quantification en Kabiye: une approche linguistique | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  13. Permanent magnetic field, direct electric field, and infrared to reduce blood glucose level and hepatic function in mus musculus with diabetic mellitus

    Science.gov (United States)

    Suhariningsih; Basuki Notobroto, Hari; Winarni, Dwi; Achmad Hussein, Saikhu; Anggono Prijo, Tri

    2017-05-01

    Blood contains several electrolytes carrying positive (cation) and negative (anion) charges. Both types of electrolyte deliver impulses synergistically, adjusting to the body's needs, and they respond specifically to external disturbances such as electric, magnetic, and infrared fields. A study was conducted to reduce blood glucose levels and liver-function markers in type 2 diabetes mellitus using a biophysical concept: combination therapy with a permanent magnetic field, an electric field, and infrared radiation. The study used 48 healthy male mice (Mus musculus), aged 3-4 weeks and weighing approximately 25-30 g. The mice were fed lard orally as a high-fat diet before streptozotocin (STZ) induction rendered them diabetic. Therapy was conducted by placing the mice in a chamber emitting the combination of permanent magnetic field, electric field, and infrared, for 1 hour every day for 28 days. Four combinations of therapy/treatment were used: (1) permanent magnetic field, direct electric field, and infrared; (2) permanent magnetic field, direct electric field, without infrared; (3) permanent magnetic field, alternating electric field, and infrared; and (4) permanent magnetic field, alternating electric field, without infrared. The results show that every combination is able to reduce blood glucose, AST, and ALT levels; however, the best result is obtained with the combination of permanent magnetic field, direct electric field, and infrared.

  14. Impact of sequential proton density fat fraction for quantification of hepatic steatosis in nonalcoholic fatty liver disease.

    Science.gov (United States)

    Idilman, Ilkay S; Keskin, Onur; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay

    2014-05-01

    To determine the utility of sequential MRI-estimated proton density fat fraction (MRI-PDFF) for quantification of longitudinal changes in liver fat content in individuals with nonalcoholic fatty liver disease (NAFLD). A total of 18 consecutive individuals (M/F: 10/8, mean age: 47.7±9.8 years) diagnosed with NAFLD, who underwent sequential PDFF calculations for the quantification of hepatic steatosis at two different time points, were included in the study. All patients underwent T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling. A close correlation between the initial MRI-PDFF and liver biopsy was observed for the quantification of hepatic steatosis (rs=0.758). The changes in serum ALT levels significantly reflected changes in MRI-PDFF in patients with NAFLD.
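
The underlying PDFF quantity is the fat signal as a fraction of the total proton signal. A minimal sketch, with invented signal values, of how a longitudinal change would be computed:

```python
# Sketch of the proton density fat fraction definition used in such
# studies: PDFF = F / (F + W), from confound-corrected fat (F) and
# water (W) proton signals. The signal values are illustrative only.

def pdff(fat_signal, water_signal):
    """Proton density fat fraction as a percentage."""
    return 100.0 * fat_signal / (fat_signal + water_signal)

baseline  = pdff(18.0, 82.0)  # first time point
follow_up = pdff(9.0, 91.0)   # second time point
change = baseline - follow_up
print(change)  # longitudinal change in percentage points
```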

  15. Evaluation of the electromagnetic field level emitted by medium frequency AM broadcast stations

    International Nuclear Information System (INIS)

    Licitra, G.; Bambini, S.; Barellini, A.; Monorchio, A.; Rogovich, A.

    2004-01-01

    In order to estimate the level of the electromagnetic field produced by telecommunication systems, different computational techniques can be employed, whose complexity depends on the accuracy required of the final results. In this paper, we present the validation of a code based on the method of moments that allows us to analyse the electromagnetic field emitted by radio-communication systems operating at medium frequencies. The method provides an accurate estimate of the levels of electromagnetic field produced by this type of device and, consequently, can be used to verify the compliance of a system with safe-exposure-level regulations and population protection laws. Numerical and experimental results are shown for an amplitude-modulated (AM) radio transmitter, together with predictions for a forthcoming system that will be operative in the near future. (authors)

  16. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography for 44 aphasic patients with various types of aphasia, were superimposed onto standardized matrices composed of 10 slices with 3000 points (50 by 60) each. The relationships between the foci of the lesions and the types of aphasia were investigated on slices 3, 4, 5, and 6 using quantification theory Type 3 (pattern analysis). Certain regularities were observed on these slices. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated along the 1st and 2nd components of quantification theory Type 3, while the group with global aphasia lay between them. The group of patients with amnestic aphasia showed no specific findings, and the group with conduction aphasia lay near those with Wernicke's aphasia. These results provide a basis for applying quantification theory Type 2 (discriminant analysis) and quantification theory Type 1 (regression analysis).

  17. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
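
Externally standardized kinetic quantification of this kind rests on the linearity of the threshold cycle (Ct) in log10 of the starting copy number. The sketch below, with invented standards, fits such a standard curve and inverts it for an unknown:

```python
# Hedged sketch of real-time PCR quantification against an external
# standard curve: Ct is linear in log10(starting copies), with a slope
# near -3.32 at 100% amplification efficiency. Standards are invented.
import math

def fit_standard_curve(copies, ct):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    logs = [math.log10(c) for c in copies]
    n = len(logs)
    mx, my = sum(logs) / n, sum(ct) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(logs, ct)) / sum(
        (x - mx) ** 2 for x in logs)
    return slope, my - slope * mx

standards = [10, 100, 1000, 10000]    # known starting DNA copies
cts       = [33.2, 29.9, 26.6, 23.3]  # measured threshold cycles
slope, intercept = fit_standard_curve(standards, cts)

unknown_ct = 31.5
estimate = 10 ** ((unknown_ct - intercept) / slope)
print(round(estimate, 1))  # estimated starting copy number
```

Single-copy sensitivity, as reported in the abstract, corresponds to reliably detecting the very top of such a curve (Ct at log10 copies = 0).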

  18. Field-measured drag area is a key correlate of level cycling time trial performance

    Directory of Open Access Journals (Sweden)

    James E. Peterman

    2015-08-01

    Full Text Available Drag area (Ad) is a primary factor determining aerodynamic resistance during level cycling and is therefore a key determinant of level time trial performance. However, Ad has traditionally been difficult to measure. Our purpose was to determine the value of adding field-measured Ad as a correlate of level cycling time trial performance. In the field, 19 male cyclists performed a level (22.1 km) time trial. Separately, field-determined Ad and rolling resistance were calculated for subjects, along with projected frontal area assessed directly (AP) and indirectly (Est AP). Also, a graded exercise test was performed to determine VO2 peak, lactate threshold (LT), and economy. VO2 peak (l min−1) and power at LT were significantly correlated to power measured during the time trial (r = 0.83 and 0.69, respectively) but were not significantly correlated to performance time (r = −0.42 and −0.45). The correlation with performance time improved significantly (p < 0.05) when these variables were normalized to Ad. Of note, Ad alone was better correlated to performance time (r = 0.85, p < 0.001) than any combination of non-normalized physiological measures. The best correlate with performance time was field-measured power output during the time trial normalized to Ad (r = −0.92). AP only accounted for 54% of the variability in Ad. Accordingly, the correlation to performance time was significantly lower using power normalized to AP (r = −0.75) or Est AP (r = −0.71). In conclusion, unless normalized to Ad, level time trial performance in the field was not highly correlated to common laboratory measures. Furthermore, our field-measured Ad is easy to determine and was the single best predictor of level time trial performance.
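
On level ground, drag area can be recovered from field power and speed data by subtracting rolling resistance from the power balance P = ½·ρ·Ad·v³ + Crr·m·g·v and solving for Ad. The sketch below uses illustrative values, not the study's protocol or data:

```python
# Sketch of extracting drag area (Ad = Cd * A) from level-ground field
# data via the steady-state power balance. All numbers are illustrative.
RHO = 1.2  # air density, kg/m^3 (assumed)
G = 9.81   # gravitational acceleration, m/s^2

def drag_area(power_w, speed_ms, mass_kg, crr):
    """Solve P = 0.5*rho*Ad*v^3 + Crr*m*g*v for Ad."""
    rolling = crr * mass_kg * G * speed_ms
    return (power_w - rolling) / (0.5 * RHO * speed_ms ** 3)

# e.g. 300 W at 11.1 m/s (~40 km/h) for an 80 kg bike-plus-rider system
ad = drag_area(power_w=300.0, speed_ms=11.1, mass_kg=80.0, crr=0.004)
print(round(ad, 3))  # m^2; road cyclists are typically ~0.2-0.4 m^2
```

In practice Crr and Ad are estimated together, e.g. by regressing P/v against v² over several speeds, rather than assuming Crr as done here.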

  19. NF ISO 14064-2. Greenhouse gases. Part 2: specifications and guidance at the project level for quantification, monitoring and reporting of greenhouse gas emission reductions or removal enhancements

    International Nuclear Information System (INIS)

    2005-01-01

    This document describes a methodology for the quantification, monitoring and reporting of activities intended to cause greenhouse gas emission reductions or removal enhancements at the project level (a project being an activity that modifies the conditions identified in a baseline scenario in order to reduce emissions or increase removals of greenhouse gases). It thus suggests a method for declaring project greenhouse gas inventories and provides support for the monitoring and management of emissions. It provides terms and definitions, principles, an introduction to greenhouse gas projects, and the requirements for greenhouse gas projects. (A.L.B.)
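
The core project-level accounting logic that such a standard formalises — reductions as the difference between baseline-scenario emissions and monitored project emissions — can be sketched as follows (all figures invented):

```python
# Minimal sketch of project-level GHG accounting: reductions are the
# difference between baseline-scenario and monitored project emissions,
# summed over the monitoring period. Figures are illustrative only.

def emission_reduction(baseline_tco2e, project_tco2e):
    """Reduction for one monitoring interval, in tonnes CO2-equivalent."""
    return baseline_tco2e - project_tco2e

# (baseline, project) emissions per year of the monitoring period
years = [(12000.0, 9500.0), (12500.0, 9100.0), (12300.0, 8800.0)]
total = sum(emission_reduction(b, p) for b, p in years)
print(total)  # total tCO2e reduced over the monitoring period
```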

  20. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    Science.gov (United States)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts in which elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections present an overview of general aspects of biomolecule quantification and of labelling, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field is summarized, and a perspective on future developments, including sophisticated and innovative applications, is given. PMID:23062431

  1. Person-generated Data in Self-quantification. A Health Informatics Research Program.

    Science.gov (United States)

    Gray, Kathleen; Martin-Sanchez, Fernando J; Lopez-Campos, Guillermo H; Almalki, Manal; Merolli, Mark

    2017-01-09

    The availability of internet-connected mobile, wearable and ambient consumer technologies, direct-to-consumer e-services and peer-to-peer social media sites far outstrips evidence about the efficiency, effectiveness and efficacy of using them in healthcare applications. The aim of this paper is to describe one approach to building a program of health informatics research, so as to generate rich and robust evidence about health data and information processing in self-quantification and associated healthcare and health outcomes. The paper summarises relevant health informatics research approaches in the literature and presents an example of developing a program of research in the Health and Biomedical Informatics Centre (HaBIC) at the University of Melbourne. The paper describes this program in terms of research infrastructure, conceptual models, research design, research reporting and knowledge sharing. The paper identifies key outcomes from integrative and multiple-angle approaches to investigating the management of information and data generated by use of this Centre's collection of wearable, mobile and other devices in health self-monitoring experiments. These research results offer lessons for consumers, developers, clinical practitioners and biomedical and health informatics researchers. Health informatics is increasingly called upon to make sense of emerging self-quantification and other digital health phenomena that are well beyond the conventions of healthcare in which the field of informatics originated and consolidated. Making a substantial contribution to optimising the aims, processes and outcomes of health self-quantification will require further work at scale, in multi-centre collaborations, for this Centre and for health informatics researchers generally.

  2. WE-G-17A-03: MRIgRT: Quantification of Organ Motion

    International Nuclear Information System (INIS)

    Stanescu, T; Tadic, T; Jaffray, D

    2014-01-01

    Purpose: To develop an MRI-based methodology and the tools required for the quantification of organ motion on a dedicated MRI-guided radiotherapy system. A three-room facility, consisting of a TrueBeam 6X linac vault, a 1.5T MR suite and a brachytherapy interventional room, is currently under commissioning at our institution. The MR scanner can move and image in either room for diagnostic and treatment guidance purposes. Methods: A multi-imaging modality (MR, kV) phantom, featuring programmable 3D simple and complex motion trajectories, was used for the validation of several image sorting algorithms. The testing was performed on MRI (e.g. TrueFISP, TurboFLASH), 4D CT and 4D CBCT. The image sorting techniques were based on a) direct image pixel manipulation into columns or rows, b) single and aggregated pixel data tracking and c) computer vision techniques for global pixel analysis. Subsequently, the motion phantom and sorting algorithms were utilized for commissioning of MR fast imaging techniques for 2D-cine and 4D data rendering. MR imaging protocols were optimized (e.g. readout gradient strength vs. SNR) to minimize the presence of susceptibility-induced distortions, which were reported through phantom experiments and numerical simulations. The system-related distortions were also quantified (dedicated field phantom) and treated as systematic shifts where relevant. Results: Image sorting algorithms were validated for specific MR-based applications such as quantification of organ motion, local data sampling, and 4D MRI for pre-RT delivery, with accuracy better than the raw image pixel size (e.g. 1 mm). MR fast imaging sequences were commissioned and imaging strategies were developed to mitigate spatial artifacts with minimal penalty on the image spatial and temporal sampling. Workflows (e.g. liver) were optimized to include the new motion quantification tools for RT planning and daily patient setup verification. Conclusion: Comprehensive methods were developed

  3. Multiparameter fluorescence imaging for quantification of TH-1 and TH-2 cytokines at the single-cell level

    Science.gov (United States)

    Fekkar, Hakim; Benbernou, N.; Esnault, S.; Shin, H. C.; Guenounou, Moncef

    1998-04-01

    Immune responses are strongly influenced by the cytokines released following antigenic stimulation. Distinct cytokine-producing T cell subsets are well known to play a major role in immune responses and to be differentially regulated during immunological disorders, although the characterization and quantification of the TH-1/TH-2 cytokine pattern in T cells remains incompletely defined. Expression of cytokines by T lymphocytes is a highly balanced process, involving stimulatory and inhibitory intracellular signaling pathways. The aim of this study was (1) to quantify cytokine expression in T cells at the single-cell level using optical imaging, and (2) to analyze the influence of the cyclic AMP-dependent signal transduction pathway on the balance between the TH-1 and TH-2 cytokine profile. We studied several cytokines (IL-2, IFN-γ, IL-4, IL-10 and IL-13) in peripheral blood mononuclear cells. Cells were prestimulated in vitro using phytohemagglutinin and phorbol ester for 36 h, and then further cultured for 8 h in the presence of monensin. Cells were permeabilized and then single-, double- or triple-labeled with the corresponding specific fluorescent monoclonal antibodies. The cell phenotype was also determined by analyzing the expression of CD4, CD8, CD45RO and CD45RA together with cytokine expression. Conventional images of cells were recorded with a Peltier-cooled CCD camera (B/W C5985, Hamamatsu Photonics) through an inverted microscope equipped with epi-fluorescence (Diaphot 300, Nikon). Images were digitized using a video acquisition interface (Oculus TCX, Coreco) at 762 by 570 pixels coded in 8 bits (256 gray levels), and then analyzed on an Intel Pentium-based IBM PC with dedicated software (Visilog 4, Noesis). The first image processing step is the extraction of cell areas using an edge detection and a binary thresholding method. In order to reduce the background noise of fluorescence, we performed an opening

  4. Polynomial pseudosupersymmetry underlying a two-level atom in an external electromagnetic field

    International Nuclear Information System (INIS)

    Samsonov, B.F.; Shamshutdinova, V.V.; Gitman, D.M.

    2005-01-01

    Chains of transformations introduced previously were studied in order to obtain electric fields with a time-dependent frequency for which the equation of motion of a two-level atom in the presence of these fields can be solved exactly. It is shown that a polynomial pseudosupersymmetry may be associated with such chains

  5. Permanent magnetic field, direct electric field, and infrared to reduce blood glucose level and hepatic function in mus musculus with diabetic mellitus

    International Nuclear Information System (INIS)

    Suhariningsih; Prijo, Tri Anggono; Notobroto, Hari Basuki; Winarni, Dwi; Hussein, Saikhu Achmad

    2017-01-01

    Blood contains several electrolytes carrying positive (cation) and negative (anion) charge. Both types of electrolyte deliver impulses synergistically, adjusting to the body's needs, and respond specifically to external disturbances such as electric, magnetic and infrared fields. A study was conducted to reduce blood glucose level and improve liver function in type 2 diabetes mellitus using a Biophysics-based combination therapy of a permanent magnetic field, an electric field, and infrared radiation. The study used 48 healthy male mice (Mus musculus), aged 3-4 weeks, weighing approximately 25-30 g. The mice were fed lard orally as a high-fat diet before streptozotocin (STZ) induction rendered them diabetic. Therapy was conducted by placing the mice in a chamber emitting the combination of permanent magnetic field, electric field, and infrared, for 1 hour every day for 28 days. There were 4 treatment combinations, namely: (1) permanent magnetic field, direct electric field, and infrared; (2) permanent magnetic field, direct electric field, without infrared; (3) permanent magnetic field, alternating electric field, and infrared; and (4) permanent magnetic field, alternating electric field, without infrared. The results show that every combination reduced blood glucose level, AST, and ALT; the best result, however, was obtained with the combination of permanent magnetic field, direct electric field, and infrared. (paper)

  6. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae at levels as low as 2.5 fg of fungal DNA per ng of lettuce leaf DNA at 21 days following plant inoculation.
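As a rough illustration of this kind of normalized qPCR quantification (a sketch only, not the chapter's protocol; the standard-curve slopes and intercepts below are assumed values, and a slope near -3.32 corresponds to ~100% amplification efficiency):

```python
def dna_from_cq(cq, slope, intercept):
    """Convert a Cq value to a DNA quantity via a linear standard curve:
    Cq = slope * log10(quantity) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def fungal_load(cq_target, cq_reference, target_curve, reference_curve):
    """Fungal DNA (fg) per ng of plant DNA, each assay read off its own curve."""
    fungal_fg = dna_from_cq(cq_target, *target_curve)       # fg of fungal DNA
    plant_ng = dna_from_cq(cq_reference, *reference_curve)  # ng of plant DNA
    return fungal_fg / plant_ng

# Illustrative curves (slope, intercept) -- assumptions, not measured values
target_curve = (-3.32, 38.0)     # beta-tubulin assay
reference_curve = (-3.35, 30.0)  # actin reference assay

load = fungal_load(cq_target=30.0, cq_reference=24.0,
                   target_curve=target_curve, reference_curve=reference_curve)
print(f"{load:.2f} fg fungal DNA per ng plant DNA")
```

In practice both curves would be fitted from serial dilutions of purified fungal and plant DNA run on the same plate.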

  7. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elementally labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS), and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections an overview of general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted and analytical, as well as biomedical, applications will be presented. A special focus will lie on established applications underlining benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given.
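The isotope dilution (IDMS) principle the review mentions can be sketched with the basic single-spike balance equation, here expressed in moles of a reference isotope B (a generic textbook form, not the review's own formulation):

```python
def blend_ratio(n_x_b, n_y_b, r_x, r_y):
    """Isotope-amount ratio n(A)/n(B) of a sample + spike blend.
    n_x_b, n_y_b: moles of reference isotope B from sample (x) and spike (y);
    r_x, r_y: isotope-amount ratios n(A)/n(B) of sample and spike."""
    return (n_x_b * r_x + n_y_b * r_y) / (n_x_b + n_y_b)

def idms_sample_amount(n_y_b, r_x, r_y, r_blend):
    """Invert the blend ratio to recover the moles of reference isotope B
    contributed by the sample: n_x = n_y * (r_y - r_b) / (r_b - r_x)."""
    return n_y_b * (r_y - r_blend) / (r_blend - r_x)

# Illustrative round trip: a known sample amount is recovered from the blend
r_b = blend_ratio(n_x_b=2.0, n_y_b=1.0, r_x=10.0, r_y=0.1)
recovered = idms_sample_amount(n_y_b=1.0, r_x=10.0, r_y=0.1, r_blend=r_b)
```

Converting moles of the reference isotope to total element then only requires the isotopic abundances, which is why IDMS is prized for its robustness against signal drift and matrix effects.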

  8. 3D full-field quantification of cell-induced large deformations in fibrillar biomaterials by combining non-rigid image registration with label-free second harmonic generation.

    Science.gov (United States)

    Jorge-Peñas, Alvaro; Bové, Hannelore; Sanen, Kathleen; Vaeyens, Marie-Mo; Steuwe, Christian; Roeffaers, Maarten; Ameloot, Marcel; Van Oosterwyck, Hans

    2017-08-01

    To advance our current understanding of cell-matrix mechanics and its importance for biomaterials development, advanced three-dimensional (3D) measurement techniques are necessary. Cell-induced deformations of the surrounding matrix are commonly derived from the displacement of embedded fiducial markers, as part of traction force microscopy (TFM) procedures. However, these fluorescent markers may alter the mechanical properties of the matrix or can be taken up by the embedded cells, and therefore influence cellular behavior and fate. In addition, the currently developed methods for calculating cell-induced deformations are generally limited to relatively small deformations, with displacement magnitudes and strains typically of the order of a few microns and less than 10% respectively. Yet, large, complex deformation fields can be expected from cells exerting tractions in fibrillar biomaterials, like collagen. To circumvent these hurdles, we present a technique for the 3D full-field quantification of large cell-generated deformations in collagen, without the need of fiducial markers. We applied non-rigid, Free Form Deformation (FFD)-based image registration to compute full-field displacements induced by MRC-5 human lung fibroblasts in a collagen type I hydrogel by solely relying on second harmonic generation (SHG) from the collagen fibrils. By executing comparative experiments, we show that comparable displacement fields can be derived from both fibrils and fluorescent beads. SHG-based fibril imaging can circumvent all described disadvantages of using fiducial markers. This approach allows measuring 3D full-field deformations under large displacement (of the order of 10 μm) and strain regimes (up to 40%). As such, it holds great promise for the study of large cell-induced deformations as an inherent component of cell-biomaterial interactions and cell-mediated biomaterial remodeling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A statistical approach to quantification of genetically modified organisms (GMO) using frequency distributions.

    Science.gov (United States)

    Gerdes, Lars; Busch, Ulrich; Pecoraro, Sven

    2014-12-14

    According to Regulation (EU) No 619/2011, trace amounts of non-authorised genetically modified organisms (GMO) in feed are tolerated within the EU if certain prerequisites are met. Tolerable traces must not exceed the so-called 'minimum required performance limit' (MRPL), defined in the regulation as 0.1% mass fraction per ingredient. Therefore, not yet authorised GMO (and some GMO whose approvals have expired) have to be quantified at very low levels following qualitative detection in genomic DNA extracted from feed samples. As the results of quantitative analysis can imply severe legal and financial consequences for producers or distributors of feed, the quantification results need to be utterly reliable. We developed a statistical approach to investigate the experimental measurement variability within one 96-well PCR plate. This approach visualises the frequency distribution of the zygosity-corrected relative content of genetically modified material resulting from different combinations of transgene and reference-gene Cq values. One application is the simulation of the consequences of varying parameters on measurement results. Such parameters could be, for example, replicate numbers or baseline and threshold settings; measurement results could be, for example, the median (class) and the relative standard deviation (RSD). All calculations can be done using the built-in functions of Excel without any need for programming. The developed Excel spreadsheets are available (see section 'Availability of supporting data' for details). In most cases, the combination of four PCR replicates for each of the two DNA isolations already resulted in a relative standard deviation of 15% or less. The aim of the study is to provide scientifically based suggestions for minimising measurement uncertainty especially in, but not limited to, the field of GMO quantification at low concentration levels.
Four PCR replicates for each of the two DNA isolations
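The simulation idea above can be sketched in a few lines of Python (a rough illustration, not the authors' Excel implementation; the Cq means, replicate scatter, and the assumption of 100% PCR efficiency are ours):

```python
import itertools
import random
import statistics

def relative_gmo_content(cq_transgene, cq_reference, zygosity_factor=1.0,
                         efficiency=2.0):
    """Zygosity-corrected relative GM content (as a mass-fraction ratio),
    assuming equal amplification efficiency for both assays."""
    return zygosity_factor * efficiency ** (cq_reference - cq_transgene)

random.seed(1)
# Simulated replicate Cq values; a delta-Cq near 10 corresponds to ~0.1%
cq_t = [random.gauss(34.0, 0.3) for _ in range(4)]  # transgene replicates
cq_r = [random.gauss(24.0, 0.2) for _ in range(4)]  # reference replicates

# Frequency distribution over all transgene/reference pairings on the plate
contents = [relative_gmo_content(t, r)
            for t, r in itertools.product(cq_t, cq_r)]
median = statistics.median(contents)
rsd = 100 * statistics.stdev(contents) / statistics.mean(contents)
print(f"median content: {100 * median:.3f} %  RSD: {rsd:.1f} %")
```

Re-running with different replicate numbers or Cq scatter shows directly how those choices propagate into the median class and RSD of the reported GM content.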

  10. Quantification of physiological levels of vitamin D3 and 25-hydroxyvitamin D3 in porcine fat and liver in subgram sample sizes

    DEFF Research Database (Denmark)

    Burild, Anders; Frandsen, Henrik Lauritz; Poulsen, Morten

    2014-01-01

    Most methods for the quantification of physiological levels of vitamin D3 and 25‐hydroxyvitamin D3 are developed for food analysis where the sample size is not usually a critical parameter. In contrast, in life science studies sample sizes are often limited. A very sensitive liquid chromatography...... with tandem mass spectrometry method was developed to quantify vitamin D3 and 25‐hydroxyvitamin D3 simultaneously in porcine tissues. A sample of 0.2–1 g was saponified followed by liquid–liquid extraction and normal‐phase solid‐phase extraction. The analytes were derivatized with 4‐phenyl‐1,2,4‐triazoline‐3...

  11. Entropy squeezing of the field interacting with a nearly degenerate V-type three-level atom

    Institute of Scientific and Technical Information of China (English)

    Zhou Qing-Chun; Zhu Shi-Ning

    2005-01-01

    The position- and momentum-entropic squeezing properties of the optical field in the system of a nearly degenerate three-level atom interacting with a single-mode field are investigated. Calculation results indicate that when the field is initially in the vacuum state, squeezing of the position entropy or the momentum entropy of the field may arise if the atom is prepared properly. The effects of the initial atomic state and of the splitting of the excited levels of the atom on the field entropies are discussed in this case. When the initial field is in a coherent state, we find that position-entropy squeezing of the field is present even if the atom is prepared in the ground state. By comparing the variance squeezing and entropy squeezing of the field we confirm that entropy is more sensitive than variance in measuring quantum fluctuations.
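For reference, position/momentum entropy squeezing of this kind is commonly defined (one common convention, with ħ = 1; not necessarily the exact normalisation used by the authors) through the Białynicki-Birula–Mycielski entropic uncertainty relation for the quadrature distributions:

```latex
S_x = -\int P(x)\,\ln P(x)\,\mathrm{d}x, \qquad
S_p = -\int P(p)\,\ln P(p)\,\mathrm{d}p, \qquad
S_x + S_p \;\ge\; 1 + \ln\pi .
```

A quadrature is then said to be entropy-squeezed when $e^{S_x} < \sqrt{e\pi}$ (and analogously for $p$), a criterion that can register reduced fluctuations that the variance-based definition misses.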

  12. Relationships between field performance tests in high-level soccer players

    DEFF Research Database (Denmark)

    Ingebrigtsen, Jørgen; Brochmann, Marit; Castagna, Carlo

    2014-01-01

    after two and four minutes of the Yo-Yo IR tests by testing 57 high-level soccer players. All players played regularly in one of the three highest levels of Norwegian soccer and were tested during three sessions on three consecutive days. Large correlations were observed between Yo-Yo IR1 and IR2 test...... using only one of the Yo-Yo tests and a RSA test, in a general soccer-specific field test protocol. The sub-maximal heart rate measures during Yo-Yo tests are reproducible and may be utilized for frequent, time-efficient and non-exhaustive testing of intermittent exercise capacity of high-level soccer...

  13. Leveling the field: The role of training, safety programs, and knowledge management systems in fostering inclusive field settings

    Science.gov (United States)

    Starkweather, S.; Crain, R.; Derry, K. R.

    2017-12-01

    Knowledge is empowering in all settings, but plays an elevated role in empowering under-represented groups in field research. Field research, particularly polar field research, has deep roots in masculinized and colonial traditions, which can lead to high barriers for women and minorities (e.g. Carey et al., 2016). While recruitment of underrepresented groups into polar field research has improved through the efforts of organizations like the Association of Polar Early Career Scientists (APECS), the experiences and successes of these participants is often contingent on the availability of specialized training opportunities or the quality of explicitly documented information about how to survive Arctic conditions or how to establish successful measurement protocols in harsh environments. In Arctic field research, knowledge is often not explicitly documented or conveyed, but learned through "experience" or informally through ad hoc advice. The advancement of field training programs and knowledge management systems suggest two means for unleashing more explicit forms of knowledge about field work. Examples will be presented along with a case for how they level the playing field and improve the experience of field work for all participants.

  14. Cost-Effective, In-Situ Field Measurements for Quantification of the Water Retention Behavior of Individual Right-of-Way Bioswales

    Science.gov (United States)

    Wang, S.; McGillis, W. R.; Hu, R.; Culligan, P. J.

    2017-12-01

    Green infrastructure (GI) interventions, such as right-of-way bioswales, are being implemented in many urban areas, including New York City, to help mitigate the negative impacts of stormwater runoff. To understand the stormwater retention capacity of bioswales, hydrological models, at scales ranging from the tributary area of a single right-of-way bioswale to an entire watershed, are often invoked. The validation and calibration of these models is, however, currently hampered by a lack of extensive field measurements that quantify bioswale stormwater retention behaviors for different storm sizes and bioswale configurations. To overcome this problem, three field methods to quantify the water retention capacity of individual bioswales were developed. The methods are potentially applicable to other applications concerned with quantifying flow regimes in urban areas. Precise measurements with high time resolution and low environmental impact are desired for gauging the hydraulic performance of bioswales and similar GI configurations. To satisfy these requirements, an in-field measurement method was developed which involved the deployment of acoustic water-level sensors to measure the upstream and downstream water levels of flow into and out of a bioswale located in the Bronx area of New York City. The measurements were made during several individual storm events. To provide reference flow rates to enable accurate calibration of the acoustic water-level measurements, two other conductometry-based methods, which made use of YSI sensors and injected calcium chloride solutions, were also developed and deployed simultaneously with the water-level measurements. The suite of data gathered by these methods enabled the development of a relationship between stage-discharge and rainfall intensity, which was then used to obtain the upstream and downstream hydrographs for the individual bioswale for the different storm events. This presentation will describe in detail the
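A stage-discharge relationship of the kind described is often approximated by a power-law rating curve fitted in log space; the sketch below is a generic illustration (the functional form Q = a·h^b, the calibration values, and the units are our assumptions, not the study's data):

```python
import math

def fit_rating_curve(stages, discharges):
    """Fit Q = a * h**b by ordinary least squares on (ln h, ln Q).
    stages (m) and discharges (L/s) are paired reference measurements,
    e.g. from a tracer-dilution (conductometric) method."""
    xs = [math.log(h) for h in stages]
    ys = [math.log(q) for q in discharges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

def discharge(h, a, b):
    """Discharge estimated from an acoustic water-level reading h."""
    return a * h ** b

# Illustrative calibration data (invented for the example)
stages = [0.05, 0.10, 0.15, 0.20]
discharges = [0.9, 5.0, 14.0, 28.0]
a, b = fit_rating_curve(stages, discharges)
```

Once calibrated, the curve converts the continuous acoustic stage record into inflow and outflow hydrographs for each storm.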

  15. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    Science.gov (United States)

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

    To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and a numerical analog pain scale. There was no statistically significant correlation between the Medication Quantification Scale and the visual analog scale at any site except for a moderate positive correlation at the German site. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.

  16. Two-level systems driven by large-amplitude fields

    International Nuclear Information System (INIS)

    Ashhab, S.; Johansson, J. R.; Zagoskin, A. M.; Nori, Franco

    2007-01-01

    We analyze the dynamics of a two-level system subject to driving by large-amplitude external fields, focusing on the resonance properties in the case of driving around the region of avoided level crossing. In particular, we consider three main questions that characterize resonance dynamics: (1) the resonance condition, (2) the frequency of the resulting oscillations on resonance, and (3) the width of the resonance. We identify the regions of validity of different approximations. In a large region of the parameter space, we use a geometric picture in order to obtain both a simple understanding of the dynamics and quantitative results. The geometric approach is obtained by dividing the evolution into discrete time steps, with each time step described by either a phase shift on the basis states or a coherent mixing process corresponding to a Landau-Zener crossing. We compare the results of the geometric picture with those of a rotating wave approximation. We also comment briefly on the prospects of employing strong driving as a useful tool to manipulate two-level systems
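As a loose numerical illustration of such strongly driven dynamics (not the paper's geometric method), one can integrate the Schrödinger equation directly for a standard driven two-level Hamiltonian; the form H(t) = -(Δ/2)σx - ½(ε0 + A cos ωt)σz, ħ = 1, and all parameter values are our assumptions:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def evolve(delta, eps0, amp, omega, t_final, dt=1e-2):
    """Excited-state population vs time for a driven two-level system,
    using small piecewise-constant steps with an exact 2x2 propagator."""
    psi = np.array([1.0, 0.0], dtype=complex)  # start in a diabatic basis state
    pops = []
    for n in range(int(t_final / dt)):
        t = n * dt
        h = -0.5 * delta * sx - 0.5 * (eps0 + amp * np.cos(omega * t)) * sz
        # one step of exp(-i h dt) via eigendecomposition of the Hermitian h
        w, v = np.linalg.eigh(h)
        u = v @ np.diag(np.exp(-1j * w * dt)) @ v.conj().T
        psi = u @ psi
        pops.append(abs(psi[1]) ** 2)
    return np.array(pops)

# Large-amplitude driving through the avoided crossing (illustrative values):
# each sweep through eps = 0 acts like a Landau-Zener beam splitter.
pops = evolve(delta=0.1, eps0=0.0, amp=2.0, omega=1.0, t_final=50.0)
```

Comparing such brute-force populations against the discrete phase-shift/Landau-Zener picture is exactly the kind of cross-check the geometric approach invites.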

  17. Random model of two-level atoms interacting with electromagnetic field

    International Nuclear Information System (INIS)

    Kireev, A.N.; Meleshko, A.N.

    1983-12-01

    A phase transition has been studied in a random system of two-level atoms interacting with an electromagnetic field. It is shown that superradiance can arise when there is short-range order in the spin subsystem. The existence of long-range order is irrelevant for this phase transition

  18. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    In clinical practice, PET/CT images are often analyzed qualitatively by visual comparison of tumor lesions and normal tissue uptake, and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed with different scanner models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and of the clinical image acquisition protocols of different PET/CT services in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant and available reconstruction parameters. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the reconstruction parameters most appropriate for harmonizing SUV quantification on each scanner, whether regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization, through modification of the reconstruction parameters, and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
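The two quantities at the heart of this harmonization effort have simple standard definitions, sketched below (the numerical values are illustrative, not the study's data; the 1 g/mL tissue-density convention is the usual assumption behind body-weight SUV):

```python
def suv_bw(activity_conc_bq_ml, injected_dose_bq, body_weight_g):
    """Body-weight-normalized SUV: tissue activity concentration divided by
    injected dose per gram of body weight (assumes ~1 g/mL tissue density)."""
    return activity_conc_bq_ml / (injected_dose_bq / body_weight_g)

def rc_corrected(suv_measured, recovery_coefficient):
    """Partial-volume correction of a measured SUV with a recovery
    coefficient RC = measured/true for the relevant object size."""
    return suv_measured / recovery_coefficient

# Illustrative case: 370 MBq injected, 70 kg patient, 18.5 kBq/mL lesion uptake
suv = suv_bw(18_500, 370e6, 70_000)  # -> 3.5
print(f"SUV = {suv:.2f}, RC-corrected = {rc_corrected(suv, 0.7):.2f}")
```

An RC curve overestimated by 38%, as reported above, would propagate directly into the corrected SUV, which is why harmonized reconstruction settings matter for multi-center comparability.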

  19. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Trust can play an important role for the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and the quantified value of an entity's trust is consistent with the behavior of the entity.
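The classic fuzzy comprehensive evaluation scheme the paper builds on can be sketched as follows (a generic textbook form with invented factor names, weights, and membership values, not the authors' data):

```python
import numpy as np

# Rows of R: trust factors (e.g. honesty, competence, reliability);
# columns: trust grades (high, medium, low). All values are illustrative.
W = np.array([0.5, 0.3, 0.2])       # factor weight vector, sums to 1
R = np.array([[0.7, 0.2, 0.1],      # membership of factor 1 in each grade
              [0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3]])

# Weighted-average composition B = W . R (the M(*, +) operator)
B = W @ R
B = B / B.sum()                     # normalize to a fuzzy grade vector

grades = np.array([0.9, 0.5, 0.1])  # crisp value assigned to each grade
trust = float(B @ grades)           # defuzzified trust value in [0, 1]
```

Other composition operators (e.g. max-min) can be substituted for the weighted average, trading smoothness for emphasis on dominant factors.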

  20. Objective quantification of perturbations produced with a piecewise PV inversion technique

    Directory of Open Access Journals (Sweden)

    L. Fita

    2007-11-01

    PV inversion techniques have been widely used in numerical studies of severe weather cases. These techniques can be applied as a way to study the sensitivity of the responsible meteorological system to changes in the initial conditions of the simulations. Dynamical effects of a collection of atmospheric features involved in the evolution of the system can be isolated. However, aspects such as the definition of the atmospheric features or the amount of change in the initial conditions are largely case-dependent and/or subjectively defined. An objective way to calculate the modification of the initial fields is proposed to alleviate this problem. The perturbations are quantified as the mean absolute variation of the total energy between the original and modified fields, and a unique energy variation value is fixed for all the perturbations derived from different PV anomalies. Thus, PV features of different dimensions and characteristics introduce the same net modification of the initial conditions from an energetic point of view. The devised quantification method is applied to study the high-impact weather case of 9–11 November 2001 in the Western Mediterranean basin, when a deep and intense cyclone formed. On the Balearic Islands 4 people died, and sustained winds of 30 m s−1 and precipitation higher than 200 mm/24 h were recorded. Moreover, 700 people died in Algiers during the first phase of the event. The sensitivities to perturbations in the initial conditions of a deep upper-level trough, the anticyclonic system related to the North Atlantic high, and the surface thermal anomaly related to the baroclinicity of the environment are determined. Results reveal a strong influence of the upper-level trough and the surface thermal anomaly and a minor role of the North Atlantic high during the genesis of the cyclone.

  1. Direct quantification of cell-free, circulating DNA from unpurified plasma.

    Science.gov (United States)

    Breitbach, Sarah; Tug, Suzan; Helmig, Susanne; Zahn, Daniela; Kubiak, Thomas; Michal, Matthias; Gori, Tommaso; Ehlert, Tobias; Beiter, Thomas; Simon, Perikles

    2014-01-01

    Cell-free DNA (cfDNA) in body tissues or fluids is extensively investigated in clinical medicine and other research fields. In this article we present a direct quantitative real-time PCR (qPCR) assay as a sensitive tool for the measurement of cfDNA from plasma without prior DNA extraction, a step known to reduce DNA yield. The primer sets were designed to amplify 90 and 222 bp multi-locus L1PA2 sequences. In the first module, cfDNA concentrations in unpurified plasma were compared to cfDNA concentrations in the eluate and the flow-through of the QIAamp DNA Blood Mini Kit and in the eluate of a phenol–chloroform–isoamyl alcohol (PCI)-based DNA extraction, to elucidate the DNA losses during extraction. The analyses revealed 2.79-fold higher cfDNA concentrations in unpurified plasma compared to the eluate of the QIAamp DNA Blood Mini Kit, while 36.7% of the total cfDNA was found in the flow-through. The PCI procedure only performed well on samples with high cfDNA concentrations, yielding 87.4% of the concentrations measured in plasma. The DNA integrity strongly depended on the sample treatment. Further qualitative analyses indicated differing fractions of cfDNA fragment lengths in the eluates of both extraction methods. In the second module, cfDNA concentrations in the plasma of 74 coronary heart disease patients were compared to cfDNA concentrations of 74 healthy controls, using the direct L1PA2 qPCR for cfDNA quantification. The patient collective showed significantly higher cfDNA levels (mean (SD) 20.1 (23.8) ng/ml; range 5.1-183.0 ng/ml) compared to the healthy controls (9.7 (4.2) ng/ml; range 1.6-23.7 ng/ml). With our direct qPCR, we recommend a simple, economic and sensitive procedure for the quantification of cfDNA concentrations from plasma that might find broad applicability, should cfDNA become an established marker in the assessment of pathophysiological conditions.
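The extraction-loss comparison described above boils down to converting each Cq into a concentration via an external standard curve and taking ratios; the sketch below is illustrative only (the curve parameters and Cq values are assumptions, not the study's calibration):

```python
def cfdna_conc(cq, slope, intercept, dilution_factor=1.0):
    """cfDNA concentration (ng/mL plasma) from a Cq value and a
    standard curve Cq = slope * log10(conc) + intercept."""
    return dilution_factor * 10 ** ((cq - intercept) / slope)

# Assumed standard curve for a short (90 bp) L1PA2-type assay
slope, intercept = -3.4, 22.0

direct = cfdna_conc(18.6, slope, intercept)  # unpurified plasma
eluate = cfdna_conc(20.1, slope, intercept)  # after column extraction

fold = direct / eluate       # apparent extraction loss as a fold difference
recovery = 100 * eluate / direct
print(f"fold difference direct/eluate: {fold:.2f} (recovery {recovery:.0f} %)")
```

Because a multi-locus target like L1PA2 is present in many genomic copies, even sub-nanogram plasma inputs yield measurable Cq values, which is what makes the extraction-free protocol feasible.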

  2. Modelization of three-dimensional bone micro-architecture using Markov random fields with a multi-level clique system

    International Nuclear Information System (INIS)

    Lamotte, T.; Dinten, J.M.; Peyrin, F.

    2004-01-01

    Imaging trabecular bone micro-architecture in vivo and non-invasively remains a challenging issue due to the complexity and small size of the structure. A realistic 3D model of bone micro-architecture could therefore be useful in image segmentation or image reconstruction. The goal of this work was to develop a 3D model of trabecular bone micro-architecture, which can be seen as a problem of texture synthesis. We investigated a statistical model based on 3D Markov Random Fields (MRFs). By the Hammersley-Clifford theorem, MRFs may equivalently be defined by an energy function over a set of cliques. To model 3D binary bone texture images (bone/background), we first used a particular well-known subclass of MRFs, the Ising model, in which the local energy at a voxel depends on its closest neighbors and on parameters that control the shape and the proportion of bone. However, simulations yielded textures organized as connected clusters which, even when the parameters were varied, did not approach the complexity of bone micro-architecture. We therefore introduced a second level of cliques, taking into account neighbors located at some distance d from the site s, together with a new set of cliques allowing control of the plate thickness and spacing. The 3D bone texture images generated with the proposed model were analyzed using the usual bone-architecture quantification tools in order to relate the parameters of the MRF model to the characteristic parameters of bone micro-architecture (trabecular spacing, trabecular thickness, number of trabeculae...). (authors)
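The first-level (Ising) model described above can be sampled with a simple Gibbs sweep. A minimal sketch with nearest-neighbour cliques only; the lattice size, beta, and sweep count are illustrative, not the authors' settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_ising_3d(shape=(12, 12, 12), beta=0.8, h=0.0, sweeps=8):
    """Gibbs-sample a binary volume (bone = +1 / background = -1) from a 3D
    Ising MRF with nearest-neighbour cliques; beta controls clustering and
    h biases the bone proportion."""
    s = rng.choice(np.array([-1, 1]), size=shape)
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for _ in range(sweeps):
        for idx in np.ndindex(shape):
            # Sum over the 6 nearest neighbours (periodic boundaries).
            nb = sum(s[tuple((np.array(idx) + d) % shape)] for d in offsets)
            # Conditional P(s = +1 | neighbours) derived from the local energy.
            p_up = 1.0 / (1.0 + np.exp(-2.0 * (beta * nb + h)))
            s[idx] = 1 if rng.random() < p_up else -1
    return s

volume = gibbs_ising_3d()
```

As the abstract notes, this single-clique-level model produces compact connected clusters; reproducing plate-like trabecular structure requires the longer-range cliques the authors add at the second level.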

  3. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, traditionally performed by hepatic biopsy. Many studies have demonstrated the feasibility of hepatic iron quantification with magnetic resonance, but no consensus has yet been reached on the technique or on reproducing the same calculation method on different machines. This article reviews the state of the art and outlines possible future directions for standardising this non-invasive method of hepatic iron quantification.

  4. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary hemochromatosis is the most frequent form of iron overload. Since 1996, genetic tests have significantly facilitated the non-invasive diagnosis of the disease. There are, however, many cases with negative genetic tests that require confirmation by hepatic iron quantification, traditionally performed by hepatic biopsy. Many studies have demonstrated the feasibility of hepatic iron quantification with magnetic resonance, but no consensus has yet been reached on the technique or on reproducing the same calculation method on different machines. This article reviews the state of the art and outlines possible future directions for standardising this non-invasive method of hepatic iron quantification

  5. The degree of acceptability of swine blood values at increasing levels of hemolysis evaluated through visual inspection versus automated quantification.

    Science.gov (United States)

    Di Martino, Guido; Stefani, Anna Lisa; Lippi, Giuseppe; Gagliazzo, Laura; McCormick, Wanda; Gabai, Gianfranco; Bonfanti, Lebana

    2015-05-01

    The pronounced fragility that characterizes swine erythrocytes is likely to produce a variable degree of hemolysis during blood sampling, and the free hemoglobin may then unpredictably bias the quantification of several analytes. The aim of this study was to evaluate the degree of acceptability of values obtained for several biochemical parameters at different levels of hemolysis. Progressively increased degrees of physical hemolysis were induced in 3 aliquots of 30 nonhemolytic sera, and the relative effects on the test results were assessed. To define the level of hemolysis, we used both visual estimation (on a scale of 0 to 3+) and analytical assessment (hemolytic index) and identified the best analytical cutoff values for discriminating the visual levels of hemolysis. Hemolysis led to a variable and dose-dependent effect on the test results that was specific for each analyte tested. In mildly hemolyzed specimens, C-reactive protein, haptoglobin, β1-globulin, β2-globulin, α1-globulin, γ-globulin, sodium, calcium, and alkaline phosphatase were not significantly biased, whereas α2-globulin, albumin, urea, creatinine, glucose, total cholesterol, aspartate aminotransferase, alanine aminotransferase, gamma-glutamyl transferase, nonesterified fatty acids, bilirubin, phosphorus, magnesium, iron, zinc, copper, lipase, triglycerides, lactate dehydrogenase, unbound iron-binding capacity, and uric acid were significantly biased. Chloride and total protein were unbiased even in markedly hemolyzed samples. Analytical interference was hypothesized to be the main source of this bias, leading to a nonlinear trend that confirmed the difficulty in establishing reliable coefficients of correction for adjusting the test results. © 2015 The Author(s).

  6. RSEM: accurate transcript quantification from RNA-Seq data with or without a reference genome

    Directory of Open Access Journals (Sweden)

    Dewey Colin N

    2011-08-01

    Background: RNA-Seq is revolutionizing the way transcript abundances are measured. A key challenge in transcript quantification from RNA-Seq data is the handling of reads that map to multiple genes or isoforms. This issue is particularly important for quantification with de novo transcriptome assemblies in the absence of sequenced genomes, as it is difficult to determine which transcripts are isoforms of the same gene. A second significant issue is the design of RNA-Seq experiments, in terms of the number of reads, read length, and whether reads come from one or both ends of cDNA fragments. Results: We present RSEM, a user-friendly software package for quantifying gene and isoform abundances from single-end or paired-end RNA-Seq data. RSEM outputs abundance estimates, 95% credibility intervals, and visualization files and can also simulate RNA-Seq data. In contrast to other existing tools, the software does not require a reference genome. Thus, in combination with a de novo transcriptome assembler, RSEM enables accurate transcript quantification for species without sequenced genomes. On simulated and real data sets, RSEM has superior or comparable performance to quantification methods that rely on a reference genome. Taking advantage of RSEM's ability to effectively use ambiguously-mapping reads, we show that accurate gene-level abundance estimates are best obtained with large numbers of short single-end reads. On the other hand, estimates of the relative frequencies of isoforms within single genes may be improved through the use of paired-end reads, depending on the number of possible splice forms for each gene. Conclusions: RSEM is an accurate and user-friendly software tool for quantifying transcript abundances from RNA-Seq data. As it does not rely on the existence of a reference genome, it is particularly useful for quantification with de novo transcriptome assemblies. In addition, RSEM has enabled valuable guidance for cost
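The core expectation-maximization idea behind handling ambiguously mapping reads can be sketched in a few lines. This is a toy illustration of the general approach only, not RSEM's actual generative model (which also accounts for read positions, lengths, and quality scores):

```python
import numpy as np

def em_abundance(compat, n_iter=100):
    """Toy EM for transcript abundances: `compat` is a reads-by-transcripts
    0/1 matrix marking which transcripts each read is compatible with."""
    n_reads, n_tx = compat.shape
    theta = np.full(n_tx, 1.0 / n_tx)              # uniform initial abundances
    for _ in range(n_iter):
        # E-step: fractionally assign each read in proportion to theta.
        w = compat * theta
        w /= w.sum(axis=1, keepdims=True)
        # M-step: new abundances are the normalised expected read counts.
        theta = w.sum(axis=0) / n_reads
    return theta

# 3 reads unique to transcript 0, 1 unique to transcript 1, 2 ambiguous reads.
compat = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [1, 1], [1, 1]], dtype=float)
theta = em_abundance(compat)   # converges to (0.75, 0.25)
```

The unique reads anchor the abundances, and the two ambiguous reads are split 3:1 between the transcripts at the fixed point, illustrating how multi-mapping reads contribute fractionally rather than being discarded.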

  7. Outdoor characterization of radio frequency electromagnetic fields in a Spanish birth cohort

    International Nuclear Information System (INIS)

    Calvente, I.; Fernández, M.F.; Pérez-Lobato, R.; Dávila-Arias, C.; Ocón, O.; Ramos, R.; Ríos-Arrabal, S.; Villalba-Moreno, J.

    2015-01-01

    There is considerable public concern in many countries about the possible adverse effects of exposure to non-ionizing electromagnetic fields, especially in vulnerable populations such as children. The aim of this study was to characterize environmental exposure profiles within the frequency range 100 kHz–6 GHz in the immediate surroundings of the dwellings of 123 families from the INMA-Granada birth cohort in Southern Spain, using spot measurements. The arithmetic mean root-mean-square electric field (E_RMS) and power density (S_RMS) values were, respectively, 195.79 mV/m (42.3% of data were above this mean) and 799.01 µW/m² (30% of values were above this mean); median values were 148.80 mV/m and 285.94 µW/m², respectively. Exposure levels below the quantification limit were assigned a value of 0.01 V/m. Incident field strength levels varied widely among different areas or towns/villages, demonstrating spatial variability in the distribution of exposure values related to the surface area population size and also among seasons. Although recorded values were well below International Commission on Non-Ionizing Radiation Protection reference levels, there is a particular need to characterize incident field strength levels in vulnerable populations (e.g., children) because of their chronic and ever-increasing exposure. The effects of incident field strength have not been fully elucidated; however, it may be appropriate to apply the precautionary principle in order to reduce exposure in susceptible groups. - Highlights: • Spot measurements were performed in the immediate surroundings of children's dwellings. • Mean root-mean-square electric field and power density values were calculated. • Most recorded values were far below international standard guideline limits. • Data demonstrate spatial variability in the distribution of exposure levels. • While adverse effects remain unproven, application of the precautionary principle may be appropriate
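For a single plane wave in the far field, field strength and power density are linked by S = E_rms²/Z0, with Z0 the impedance of free space. A minimal sketch of that conversion (note that the study's mean S exceeds the converted mean E because averaging E and averaging E² are not interchangeable across a distribution of measurements):

```python
# Far-field plane-wave relation: S = E_rms^2 / Z0,
# with Z0 the impedance of free space (~376.73 ohm).
Z0 = 376.73

def power_density_uW_m2(e_rms_mV_m):
    """Power density (µW/m²) of a single plane wave of RMS strength e_rms_mV_m (mV/m)."""
    e = e_rms_mV_m * 1e-3          # mV/m -> V/m
    return (e * e / Z0) * 1e6      # W/m² -> µW/m²

# The 0.01 V/m (10 mV/m) quantification-limit value corresponds to ~0.27 µW/m².
```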

  8. Mental skill levels of South African male student field hockey players ...

    African Journals Online (AJOL)

    Mental skill levels of South African male student field hockey players in different playing positions. ... African Journal for Physical Activity and Health Sciences ... The positional results were compared by means of effect sizes (expressed as ...

  9. The levels of the first excited configuration of one-electron ions in intensive alternating field

    International Nuclear Information System (INIS)

    Klimchitskaya, G.L.

    1984-01-01

    The relativistic generalization of the quasi-energy method is applied to calculate the influence of a spatially homogeneous electric field with periodic time dependence on the energy levels of the first excited configuration of one-electron multiply charged ions. The dependence of the corresponding quasi-energy levels on the amplitude and frequency of an intense external field that completely mixes the fine-structure levels is found

  10. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  11. Carrier relaxation in (In,Ga)As quantum dots with magnetic field-induced anharmonic level structure

    Energy Technology Data Exchange (ETDEWEB)

    Kurtze, H.; Bayer, M. [Experimentelle Physik 2, TU Dortmund, D-44221 Dortmund (Germany)

    2016-07-04

    Sophisticated models have been worked out to explain the fast relaxation of carriers into quantum dot ground states after non-resonant excitation, overcoming the originally proposed phonon bottleneck. We apply a magnetic field along the quantum dot heterostructure growth direction to transform the confined level structure, which can be approximated by a Fock–Darwin spectrum, from a nearly equidistant level spacing at zero field to strong anharmonicity in finite fields. This changeover leaves the ground state carrier population rise time unchanged suggesting that fast relaxation is maintained upon considerable changes of the level spacing. This corroborates recent models explaining the relaxation by polaron formation in combination with quantum kinetic effects.

  12. Adaptation of the Maracas algorithm for carotid artery segmentation and stenosis quantification on CT images

    International Nuclear Information System (INIS)

    Maria A Zuluaga; Maciej Orkisz; Edgar J F Delgado; Vincent Dore; Alfredo Morales Pinzon; Marcela Hernandez Hoyos

    2010-01-01

    This paper describes the adaptation of the Maracas algorithm to the segmentation and quantification of vascular structures in CTA images of the carotid artery. The Maracas algorithm, which is based on an elastic model and on a multi-scale eigen-analysis of the inertia matrix, was originally designed to segment a single artery in MRA images. The modifications are primarily aimed at addressing the specificities of CT images and the bifurcations. The algorithms implemented in this new version are classified into two levels. 1. Low-level processing (filtering of noise and directional artifacts, enhancement and pre-segmentation) to improve the quality of the image and to pre-segment it; these techniques are based on a priori information about noise, artifacts and the typical gray-level ranges of lumen, background and calcifications. 2. High-level processing to extract the centerline of the artery, to segment the lumen and to quantify the stenosis; at this level, a priori knowledge of the shape and anatomy of vascular structures is applied. The method was evaluated on 31 datasets from the carotid lumen segmentation and stenosis grading grand challenge 2009. The segmentation results achieved an average Dice similarity score of 80.4% compared to the reference segmentation, and the mean stenosis quantification error was 14.4%.
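The Dice similarity score used for the evaluation above measures the overlap between two binary segmentation masks; a minimal implementation:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a = np.asarray(a, dtype=bool)
    b = np.asarray(b, dtype=bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```

For example, two masks that each label two voxels but agree on only one score 0.5, while identical masks score 1.0.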

  13. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

    Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. Quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragmentary analysis. Differences between groups were tested using a paired t-test. Methods that measure total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragmentary analysis, which also allows DNA quantification of the desired length and is not affected by PCR inhibitors. Methods based on the quantification of the total amount of DNA in samples are unsuitable for ancient samples as they overestimate the amount of DNA presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.

  14. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  15. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.

  16. Effects of sea-level rise on salt water intrusion near a coastal well field in southeastern Florida

    Science.gov (United States)

    Langevin, Christian D.; Zygnerski, Michael

    2013-01-01

    A variable-density groundwater flow and dispersive solute transport model was developed for the shallow coastal aquifer system near a municipal supply well field in southeastern Florida. The model was calibrated for a 105-year period (1900 to 2005). An analysis with the model suggests that well-field withdrawals were the dominant cause of salt water intrusion near the well field, and that historical sea-level rise, which is similar to lower-bound projections of future sea-level rise, exacerbated the extent of salt water intrusion. Average 2005 hydrologic conditions were used for 100-year sensitivity simulations aimed at quantifying the effect of projected rises in sea level on fresh coastal groundwater resources near the well field. Use of average 2005 hydrologic conditions and a constant sea level result in total dissolved solids (TDS) concentration of the well field exceeding drinking water standards after 70 years. When sea-level rise is included in the simulations, drinking water standards are exceeded 10 to 21 years earlier, depending on the specified rate of sea-level rise.

  17. Quantification of analytes affected by relevant interfering signals under quality controlled conditions

    International Nuclear Information System (INIS)

    Bettencourt da Silva, Ricardo J.N.; Santos, Julia R.; Camoes, M. Filomena G.F.C.

    2006-01-01

    pesticide residues in spiked oranges considering the quantification of the oranges ethyl acetate extract by gas-chromatography with electron capture detector. The application of the proposed methodology to the analysis of this fruit using the studied chromatographic system, allowed the quantification of an increased number of analytes in the samples. The magnitude of the measurement uncertainty estimated by the proposed methodology is fit for the purpose of monitoring pesticide residues in oranges since, frequently, the difference between the maximum residue level and the best estimation of the sample content is larger than the respective uncertainty. The proposed methodology can be useful in other analytical fields and/or instrumental methods of analysis

  18. Quantification of susceptibility change at high-concentrated SPIO-labeled target by characteristic phase gradient recognition.

    Science.gov (United States)

    Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci

    2016-05-01

    Phase map cross-correlation detection and quantification may produce highlighted signal at superparamagnetic iron oxide nanoparticles and distinguish them from other hypointensities. The method may quantify susceptibility change by performing least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional steps of phase unwrapping and filtering may increase the chance of computing error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method is developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have mentioned the detectable limit of currently used phase gradient calculation algorithms. The limit may lead to an underestimation of large magnetic susceptibility change caused by high-concentrated iron accumulation. In this study, mathematical derivation points out the value of the maximum detectable phase gradient calculated by the differential chain algorithm in both the spatial and Fourier domains. To break through the limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. For in vivo application, MRI scans were performed on nude mice implanted with iron-labeled human cancer cells. Results validate the limit of detectable phase gradient and the consequent susceptibility underestimation. Results also demonstrate the advantage of unwrapped forward differentiation compared with differential chain algorithms for susceptibility
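The detectable limit of differential-chain gradient algorithms can be reproduced in a few lines: wrapping each voxel-to-voxel difference back into (-π, π] caps the reported gradient at π per voxel, so steeper true gradients alias. A schematic 1D illustration, not the paper's implementation:

```python
import numpy as np

def chain_gradient(wrapped):
    """Differential-chain phase gradient: each voxel-to-voxel difference is
    wrapped back into (-pi, pi], so the reported gradient can never exceed pi."""
    return np.angle(np.exp(1j * np.diff(wrapped)))

# Moderate ramp, 2.5 rad/voxel (< pi): recovered correctly.
moderate = np.angle(np.exp(1j * 2.5 * np.arange(8)))
# Steep ramp, 4.0 rad/voxel (> pi): aliased to 4 - 2*pi ~ -2.28 rad/voxel.
steep = np.angle(np.exp(1j * 4.0 * np.arange(8)))
```

The paper's remedy, forward differentiation of an unwrapped phase image, avoids this per-voxel ceiling and thus the underestimation of large susceptibility changes.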

  19. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  20. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    Science.gov (United States)

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products due to the numerous detrimental effects of their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses up-to-date knowledge of the measurement techniques available for PAHs in food and related products, with the aim of helping to reduce their deleterious impacts on human health through accurate quantification. The main part of the review is dedicated to the opportunities and practical options for the treatment of various food samples and for the accurate quantification of PAHs contained in those samples. Basic information on all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to their performance in terms of quality assurance.

  1. Hepatic fat quantification magnetic resonance for monitoring treatment response in pediatric nonalcoholic steatohepatitis.

    Science.gov (United States)

    Koh, Hong; Kim, Seung; Kim, Myung-Joon; Kim, Hyun Gi; Shin, Hyun Joo; Lee, Mi-Jung

    2015-09-07

    The aim was to evaluate the possibility of treatment effect monitoring using hepatic fat quantification magnetic resonance (MR) in pediatric nonalcoholic steatohepatitis (NASH). We retrospectively reviewed the medical records of patients who received educational recommendations and vitamin E for NASH and underwent hepatic fat quantification MR from 2011 to 2013. Hepatic fat fraction (%) was measured using dual- and triple-echo gradient-recalled-echo sequences at 3T. The compliant and non-compliant groups were compared clinically, biochemically, and radiologically. Twenty-seven patients (M:F = 24:3; mean age: 12 ± 2.3 years) were included (compliant group = 22, non-compliant = 5). None of the baseline findings differed between the 2 groups, except for triglyceride level (compliant vs non-compliant, 167.7 mg/dL vs 74.2 mg/dL, P = 0.001). In the compliant group, high-density lipoprotein increased and all other parameters decreased after 1-year follow-up. However, there were various changes in the non-compliant group. Changes in the dual-echo fat fraction (-19.2% vs 4.6%) and triple-echo fat fraction (-13.4% vs 3.5%) differed significantly between the groups, and the change in fat fraction showed a positive correlation (ρ = 0.418, P = 0.030). Hepatic fat quantification MR can be a non-invasive, quantitative and useful tool for monitoring treatment effects in pediatric NASH.
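Dual-echo (in-phase/opposed-phase) fat quantification is conventionally based on the two-point Dixon relation FF = (S_IP - S_OP) / (2·S_IP). A sketch under that standard assumption, with made-up signal values; the study's exact sequence parameters and correction steps are not reproduced here:

```python
def dixon_fat_fraction(in_phase, opposed_phase):
    """Two-point Dixon estimate of hepatic fat fraction (%): in-phase signal
    is water + fat, opposed-phase signal is water - fat, so their difference
    is twice the fat signal."""
    return 100.0 * (in_phase - opposed_phase) / (2.0 * in_phase)

# Hypothetical magnitudes: IP = 100, OP = 60  ->  fat fraction = 20%.
ff = dixon_fat_fraction(100.0, 60.0)
```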

  2. Resonance fluorescence spectra of a three-level atom driven by two strong laser fields

    International Nuclear Information System (INIS)

    Peng Jinsheng.

    1986-12-01

    The resonance fluorescence of a three-level atom interacting with two high-power laser fields is investigated in the strong-field approximation. The fluorescence distribution is obtained by means of the theory of the dressing transformation. (author). 15 refs, 2 figs

  3. Role of spontaneous emission through operating transition in probe-field spectroscopy of two-level systems

    Energy Technology Data Exchange (ETDEWEB)

    Saprykin, E. G. [Russian Academy of Sciences, Institute of Automation and Electrometry, Siberian Branch (Russian Federation); Chernenko, A. A., E-mail: chernen@isp.nsc.ru [Russian Academy of Sciences, Rzhanov Institute of Semiconductor Physics, Siberian Branch (Russian Federation); Shalagin, A. M. [Russian Academy of Sciences, Institute of Automation and Electrometry, Siberian Branch (Russian Federation)

    2016-08-15

    Analytical and numerical investigations are carried out of the effect of spontaneous decay through the operating transition on the shape of the resonance observed by a probe field when a strong field is applied to the transition. A narrow nonlinear resonance arising on transitions with a long-lived lower level can manifest itself in the probe-field response either as a traditional minimum or as a peak, depending on the first Einstein coefficient of the operating transition. The transformation of the resonance from a minimum to a peak is attributed to the specific character of the relaxation of lower-level population beatings on a closed or almost closed transition (one where the upper level decays completely or almost completely through the operating transition).

  4. Effect of sea-level rise on salt water intrusion near a coastal well field in southeastern Florida.

    Science.gov (United States)

    Langevin, Christian D; Zygnerski, Michael

    2013-01-01

    A variable-density groundwater flow and dispersive solute transport model was developed for the shallow coastal aquifer system near a municipal supply well field in southeastern Florida. The model was calibrated for a 105-year period (1900 to 2005). An analysis with the model suggests that well-field withdrawals were the dominant cause of salt water intrusion near the well field, and that historical sea-level rise, which is similar to lower-bound projections of future sea-level rise, exacerbated the extent of salt water intrusion. Average 2005 hydrologic conditions were used for 100-year sensitivity simulations aimed at quantifying the effect of projected rises in sea level on fresh coastal groundwater resources near the well field. Use of average 2005 hydrologic conditions and a constant sea level result in total dissolved solids (TDS) concentration of the well field exceeding drinking water standards after 70 years. When sea-level rise is included in the simulations, drinking water standards are exceeded 10 to 21 years earlier, depending on the specified rate of sea-level rise. Published 2012. This article is a U.S. Government work and is in the public domain in the USA.

  5. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, risk and importance measures are computed from a cutset equation, mainly by using approximations. The conservatism of these approximations is a source of quantification uncertainty. In this paper, exact MCS quantification methods based on the 'sum of disjoint products' (SDP) logic and the inclusion-exclusion formula are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated.
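
The conservatism the record refers to can be seen by comparing the usual rare-event approximation (the plain sum of cut-set probabilities) with an exact inclusion-exclusion evaluation. A minimal sketch, assuming independent basic events; the probabilities and cut sets are hypothetical:

```python
from itertools import combinations
from math import prod

def rare_event_approx(cutsets, p):
    """Sum of minimal-cut-set probabilities (the usual conservative estimate)."""
    return sum(prod(p[e] for e in cs) for cs in cutsets)

def inclusion_exclusion(cutsets, p):
    """Exact top-event probability as the union of minimal cut sets,
    assuming statistically independent basic events."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        sign = (-1) ** (k + 1)
        for combo in combinations(cutsets, k):
            union = set().union(*combo)
            total += sign * prod(p[e] for e in union)
    return total

# Hypothetical basic-event probabilities and minimal cut sets
p = {"A": 0.01, "B": 0.02, "C": 0.05}
cutsets = [{"A", "B"}, {"B", "C"}, {"A", "C"}]
approx = rare_event_approx(cutsets, p)   # 0.0017, conservative
exact = inclusion_exclusion(cutsets, p)  # 0.00168, always <= approx
```

On real PSA models with thousands of cut sets the full inclusion-exclusion expansion is intractable, which is why SDP-style algorithms are used instead of the naive expansion above.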

  6. Imaging and Quantification of Extracellular Vesicles by Transmission Electron Microscopy.

    Science.gov (United States)

    Linares, Romain; Tan, Sisareuth; Gounou, Céline; Brisson, Alain R

    2017-01-01

    Extracellular vesicles (EVs) are cell-derived vesicles that are present in blood and other body fluids. EVs raise major interest for their diverse physiopathological roles and their potential biomedical applications. However, the characterization and quantification of EVs constitute major challenges, mainly due to their small size and the lack of methods adapted for their study. Electron microscopy has made significant contributions to the EV field since their initial discovery. Here, we describe the use of two transmission electron microscopy (TEM) techniques for imaging and quantifying EVs. Cryo-TEM combined with receptor-specific gold labeling is applied to reveal the morphology, size, and phenotype of EVs, while their enumeration is achieved after high-speed sedimentation on EM grids.

  7. Detection and quantification capabilities and the evaluation of low-level data. Some international perspectives and continuing challenges

    International Nuclear Information System (INIS)

    Currie, L.A.

    2000-01-01

    The minimum amounts or concentrations of an analyte that may be detected or quantified by a specific measurement process (MP) represent fundamental performance characteristics that are vital for planning experiments and designing MPs to meet external specifications. Following many years of conceptual and terminological disarray regarding detection and quantification limits, the International Union of Pure and Applied Chemistry (IUPAC, 1995) and the International Organization for Standardization (ISO, 1997) developed a harmonized position and documents that provide a basis for international consensus on this topic. During the past year, the International Atomic Energy Agency (IAEA) has developed a TECDOC on Quantifying Uncertainty in Nuclear Analytical Measurements that treats 'Uncertainty in Measurements Close to Detection Limits' from the perspective of the IUPAC and ISO recommendations. The first part of this article serves as a review of these international developments during the last quinquennium of the twentieth century. Despite the achievement of international consensus on these contentious matters, many challenges remain. One quickly discovers this in the practical world of high stakes, ultra-trace analysis, where complications are introduced by the nature and distribution of the blank, the variance function (σ vs. concentration), non-linear models, and hidden algorithms and data evaluation/reporting schemes. Some of these issues are illustrated through a multidisciplinary case study of fossil and biomass burning aerosol at extremely low levels in the polar atmosphere and cryosphere, and by biased reporting practices for 'non-detects.' (author)
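
Under the harmonized IUPAC/ISO defaults mentioned above (α = β = 0.05, 10% relative standard deviation at the quantification limit) and a well-known, constant blank standard deviation σ₀, the limits reduce to simple multiples of σ₀. A minimal sketch; the numeric blank value is hypothetical:

```python
def currie_limits(sigma0, z=1.645, k_q=10.0):
    """Critical level L_C, detection limit L_D and quantification limit L_Q
    for a well-characterised blank with known, constant standard deviation
    sigma0 (IUPAC 1995 / ISO defaults: alpha = beta = 0.05, 10% RSD at L_Q)."""
    L_C = z * sigma0        # decision threshold: "detected" above this value
    L_D = 2.0 * z * sigma0  # minimum detectable value (about 3.29 * sigma0)
    L_Q = k_q * sigma0      # minimum quantifiable value
    return L_C, L_D, L_Q

# Example: blank standard deviation of 0.5 concentration units
L_C, L_D, L_Q = currie_limits(0.5)
```

When the variance depends on concentration or the blank distribution is non-normal (the complications the article lists), these closed forms no longer apply and the limits must be derived from the full variance function.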

  8. Energy levels and far-infrared optical absorption of impurity doped semiconductor nanorings: Intense laser and electric fields effects

    Energy Technology Data Exchange (ETDEWEB)

    Barseghyan, M.G., E-mail: mbarsegh@ysu.am

    2016-11-10

    Highlights: • The effect of the electron-impurity interaction on energy levels in a nanoring has been investigated. • The effect of the electron-impurity interaction on far-infrared absorption has been investigated. • The energy levels are more stable for higher values of the electric field. - Abstract: The effects of electron-impurity interaction on energy levels and far-infrared absorption in a semiconductor nanoring under the action of intense laser and lateral electric fields have been investigated. Numerical calculations are performed using an exact diagonalization technique. It is found that the electron-impurity interaction and the external fields change the energy spectrum dramatically, and also have a significant influence on the absorption spectrum. A strong dependence of the lowest energy levels on the laser field intensity and the electric field, also supported by the Coulomb interaction with the impurity, is clearly revealed.

  9. Mean-field level analysis of epidemics in directed networks

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jiazeng [School of Mathematical Sciences, Peking University, Beijing 100871 (China); Liu, Zengrong [Mathematics Department, Shanghai University, Shanghai 200444 (China)], E-mail: wangjiazen@yahoo.com.cn, E-mail: zrongliu@online.sh.cn

    2009-09-04

    The susceptible-infected-removed spreading model in a directed graph is studied. The mean-field level rate equations are built with the degree-degree connectivity correlation element and the (in, out)-degree distribution. The outbreak threshold is obtained analytically; it is determined by the combination of the connectivity probability and the degree distribution. Furthermore, methods of calculating the degree-degree correlations in directed networks are presented. The numerical results of the discrete epidemic processes in networks verify our analyses.

  10. Mean-field level analysis of epidemics in directed networks

    International Nuclear Information System (INIS)

    Wang, Jiazeng; Liu, Zengrong

    2009-01-01

    The susceptible-infected-removed spreading model in a directed graph is studied. The mean-field level rate equations are built with the degree-degree connectivity correlation element and the (in, out)-degree distribution. The outbreak threshold is obtained analytically; it is determined by the combination of the connectivity probability and the degree distribution. Furthermore, methods of calculating the degree-degree correlations in directed networks are presented. The numerical results of the discrete epidemic processes in networks verify our analyses.

  11. Level playing field for biomass. Government interventions with regard to fuels and energy carriers for mobility

    International Nuclear Information System (INIS)

    Hoogma, R.; Te Buck, S.

    2010-09-01

    A healthy development of the BioBased Economy (BBE) requires that government support for one application does not complicate the development of another. A level playing field is therefore important. This report takes a closer look at the level playing field for biomass in transport.

  12. Alteration of the ground state by external magnetic fields. [External field, coupling constant ratio, static tree level approximation]

    Energy Technology Data Exchange (ETDEWEB)

    Harrington, B J; Shepard, H K [New Hampshire Univ., Durham (USA). Dept. of Physics

    1976-03-22

    By fully exploiting the mathematical and physical analogy to the Ginzburg-Landau theory of superconductivity, a complete discussion of the ground state behavior of the four-dimensional Abelian Higgs model in the static tree level approximation is presented. It is shown that a sufficiently strong external magnetic field can alter the ground state of the theory by restoring a spontaneously broken symmetry, or by creating a qualitatively different 'vortex' state. The energetically favored ground state is explicitly determined as a function of the external field and the ratio between coupling constants of the theory.

  13. A critical view on microplastic quantification in aquatic organisms

    Energy Technology Data Exchange (ETDEWEB)

    Vandermeersch, Griet, E-mail: griet.vandermeersch@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Van Cauwenberghe, Lisbeth; Janssen, Colin R. [Ghent University, Laboratory of Environmental Toxicology and Aquatic Ecology, Environmental Toxicology Unit (GhEnToxLab), Jozef Plateaustraat 22, 9000 Ghent (Belgium); Marques, Antonio [Division of Aquaculture and Upgrading (DivAV), Portuguese Institute for the Sea and Atmosphere (IPMA), Avenida de Brasília s/n, 1449-006 Lisboa (Portugal); Granby, Kit [Technical University of Denmark, National Food Institute, Mørkhøj Bygade 19, 2860 Søborg (Denmark); Fait, Gabriella [Aeiforia Srl, 29027 Gariga di Podenzano (PC) (Italy); Kotterman, Michiel J.J. [Institute for Marine Resources and Ecosystem Studies (IMARES), Wageningen University and Research Center, Ijmuiden (Netherlands); Diogène, Jorge [Institut de la Recerca i Tecnologia Agroalimentàries (IRTA), Ctra. Poble Nou km 5,5, Sant Carles de la Ràpita E-43540 (Spain); Bekaert, Karen; Robbens, Johan [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Devriese, Lisa, E-mail: lisa.devriese@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium)

    2015-11-15

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  14. A critical view on microplastic quantification in aquatic organisms

    International Nuclear Information System (INIS)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J.J.; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  15. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider an uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
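
The simplest of the propagation ideas described above can be sketched with plain Monte Carlo on a toy model; the thin-airfoil lift formula below is a stand-in for the expensive CFD solve, and all numbers are hypothetical (this is not the TAU code or the methods compared in the record):

```python
import math
import random

def lift_coefficient(alpha_rad):
    """Toy thin-airfoil model, C_L = 2*pi*alpha; a stand-in for the CFD solve."""
    return 2.0 * math.pi * alpha_rad

def propagate_uncertainty(n=20000, alpha_mean_deg=2.0, alpha_sd_deg=0.2, seed=1):
    """Plain Monte Carlo propagation of an uncertain angle of attack
    into the lift coefficient: sample, evaluate, collect statistics."""
    rng = random.Random(seed)
    samples = [lift_coefficient(math.radians(rng.gauss(alpha_mean_deg,
                                                       alpha_sd_deg)))
               for _ in range(n)]
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean, sd

mean_cl, sd_cl = propagate_uncertainty()
```

Quasi-Monte Carlo, sparse-quadrature polynomial chaos and the surrogate-based methods named in the record all aim at the same statistics with far fewer model evaluations, which matters when each "sample" is a full CFD run.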

  16. Chern-Simons field theory of two-dimensional electrons in the lowest Landau level

    International Nuclear Information System (INIS)

    Zhang, L.

    1996-01-01

    We propose a fermion Chern-Simons field theory describing two-dimensional electrons in the lowest Landau level. This theory is constructed with a complete set of states, and the lowest-Landau-level constraint is enforced through a δ functional described by an auxiliary field λ. Unlike the field theory constructed directly with the states in the lowest Landau level, this theory allows one, utilizing the physical picture of the "composite fermion", to study the fractional quantum Hall states by mapping them onto certain integer quantum Hall states; but, unlike its application in the unconstrained theory, such a mapping is sensible only when interactions between electrons are present. An "effective mass", which characterizes the scale of low energy excitations in the fractional quantum Hall systems, emerges naturally from our theory. We study a Gaussian effective theory and interpret physically the dressed stationary point equation for λ as an equation for the "mass renormalization" of composite fermions. Copyright 1996 The American Physical Society

  17. Reference Materials for Calibration of Analytical Biases in Quantification of DNA Methylation.

    Science.gov (United States)

    Yu, Hannah; Hahn, Yoonsoo; Yang, Inchul

    2015-01-01

    Most contemporary methods for the quantification of DNA methylation employ bisulfite conversion and PCR amplification. However, many reports have indicated that bisulfite-mediated PCR methodologies can result in inaccurate measurements of DNA methylation owing to amplification biases. To calibrate analytical biases in quantification of gene methylation, especially those that arise during PCR, we utilized reference materials that represent exact bisulfite-converted sequences with 0% and 100% methylation status of specific genes. After determining relative quantities using qPCR, pairs of plasmids were gravimetrically mixed to generate working standards with predefined DNA methylation levels at 10% intervals in terms of mole fractions. The working standards were used as controls to optimize the experimental conditions and also as calibration standards in melting-based and sequencing-based analyses of DNA methylation. Use of the reference materials enabled precise characterization and proper calibration of various biases during PCR and subsequent methylation measurement processes, resulting in accurate measurements.
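
The gravimetric mixing of 0% and 100% standards described above can be sketched as follows; the helper names and the linear bias-correction step are illustrative assumptions, not the authors' exact protocol:

```python
def methylation_fraction(mass_meth, mass_unmeth, conc_ratio=1.0):
    """Mole fraction of the 100%-methylated plasmid in a gravimetric mix.
    conc_ratio is the (copies per gram) ratio of the methylated stock to
    the unmethylated stock, as pre-determined by qPCR."""
    moles_meth = mass_meth * conc_ratio
    return moles_meth / (moles_meth + mass_unmeth)

def linear_calibration(nominal, measured):
    """Ordinary least-squares fit measured = a + b * nominal; the fitted
    line is then inverted to correct amplification bias in later runs."""
    n = len(nominal)
    mx, my = sum(nominal) / n, sum(measured) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(nominal, measured))
         / sum((x - mx) ** 2 for x in nominal))
    a = my - b * mx
    return a, b

# Working standards at 10% intervals, with hypothetical biased readings
nominal = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
measured = [1, 13, 24, 33, 45, 54, 63, 74, 82, 92, 99]
a, b = linear_calibration(nominal, measured)
corrected = [(m - a) / b for m in measured]  # bias-corrected levels
```

The point of the reference materials is exactly this: known mole fractions on the x-axis make the PCR bias measurable and therefore correctable.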

  18. Histogram-Based Thresholding for Detection and Quantification of Hemorrhages in Retinal Images

    Directory of Open Access Journals (Sweden)

    Hussain Fadhel Hamdan Jaafar

    2016-12-01

    Retinal image analysis is commonly used for the detection and quantification of diabetic retinopathy. In retinal images, dark lesions, including hemorrhages and microaneurysms, are the earliest warnings of vision loss. In this paper, a new algorithm for the extraction and quantification of hemorrhages in fundus images is presented. Hemorrhage candidates are extracted in a preliminary step as a coarse segmentation, followed by a fine segmentation step. Local variation processes are applied in the coarse segmentation step to determine the boundaries of all candidates with distinct edges. Fine segmentation processes are based on histogram thresholding to extract real hemorrhages from the segmented candidates locally. The proposed method was trained and tested using an image dataset of 153 manually labeled retinal images. At the pixel level, the proposed method could identify abnormal retinal images with 90.7% sensitivity and an 85.1% predictive value. These performance measures suggest that the technique could be used for computer-aided mass screening of retinal diseases.
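
The two reported performance figures are standard confusion-matrix quantities; a minimal sketch with hypothetical pixel-level counts:

```python
def sensitivity(tp, fn):
    """Fraction of true lesion pixels that the algorithm flags."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of flagged pixels that are truly lesion."""
    return tp / (tp + fp)

# Hypothetical pixel-level counts from one labeled test image
tp, fp, fn = 907, 159, 93
se = sensitivity(tp, fn)
ppv = positive_predictive_value(tp, fp)
```

For screening applications the trade-off between these two numbers is set by the histogram threshold: lowering it raises sensitivity at the cost of predictive value.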

  19. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction and in chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm² synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm² (median). The mast cells constituted 0.8% of all the cell profiles.

  20. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    Science.gov (United States)

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. 1H-MRS processing parameters affect metabolite quantification: The urgent need for uniform and transparent standardization

    NARCIS (Netherlands)

    Bhogal, Alex A.; Schür, Remmelt; Houtepen, Lotte C.; van de Bank, B.L.; Boer, Vincent O.; Marsman, Anouk; Barker, Peter B.; Scheenen, Tom W. J.; Wijnen, Jannie P.; Vinkers, Christiaan H.; Klomp, Dennis W.J.

    2017-01-01

    Proton magnetic resonance spectroscopy (1H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of 1H-MRS metabolite quantification. It is

  2. Ratio methods for cost-effective field sampling of commercial radioactive low-level wastes

    International Nuclear Information System (INIS)

    Eberhardt, L.L.; Simmons, M.A.; Thomas, J.M.

    1985-07-01

    In many field studies to determine the quantities of radioactivity at commercial low-level radioactive waste sites, preliminary appraisals are made with field radiation detectors or other relatively inaccurate devices. More accurate determinations are subsequently made with procedures requiring chemical separations or other expensive analyses. Costs of these laboratory determinations are often large, so that adequate sampling may not be achieved due to budget limitations. In this report, we propose double sampling as a way to combine the expensive and inexpensive approaches to substantially reduce overall costs. The underlying theory was developed for human and agricultural surveys, and is partially based on assumptions that are not appropriate for commercial low-level waste sites. Consequently, extensive computer simulations were conducted to determine whether the results can be applied in circumstances of importance to the Nuclear Regulatory Commission. This report gives the simulation details, and concludes that the principal equations are appropriate for most studies at commercial low-level waste sites. A few points require further research, using actual commercial low-level radioactive waste site data. The final section of the report provides some guidance (via an example) for the field use of double sampling. Details of the simulation programs are available from the authors. Major findings are listed in the Executive Summary. 9 refs., 9 figs., 30 tabs
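
The double-sampling idea (many cheap field readings, a few expensive laboratory analyses on a subsample) is commonly implemented as a two-phase ratio estimator. A minimal sketch with hypothetical survey numbers, not the report's own equations:

```python
def double_sampling_ratio_estimate(x_large, x_sub, y_sub):
    """Two-phase (double) sampling ratio estimator: adjust the mean of the
    cheap field measurements by the lab-to-field ratio on the subsample."""
    xbar_large = sum(x_large) / len(x_large)
    r = (sum(y_sub) / len(y_sub)) / (sum(x_sub) / len(x_sub))
    return r * xbar_large

# Hypothetical survey: 12 cheap detector readings, lab analyses on 4 of them
x_large = [5, 7, 6, 8, 5, 9, 7, 6, 8, 7, 6, 7]
x_sub = [5, 7, 8, 6]      # field readings at the lab-analysed locations
y_sub = [10, 14, 16, 12]  # corresponding laboratory determinations
est = double_sampling_ratio_estimate(x_large, x_sub, y_sub)  # 13.5
```

The estimator pays off when field and laboratory values are strongly correlated, so a small lab subsample calibrates many cheap readings.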

  3. Critical assessment of digital PCR for the detection and quantification of genetically modified organisms.

    Science.gov (United States)

    Demeke, Tigst; Dobnik, David

    2018-07-01

    The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because cultivation and trade of GMOs are regulated in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing for GMO analysis as well. This critical review focuses on the use of dPCR for GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and the dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract: There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet-based, chamber-based, and droplets in chambers. All have in common the distribution of the reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results the GMO content can be calculated.
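
All three dPCR options share the same end-point arithmetic: the fraction of positive partitions is converted to a mean copy number per partition via a Poisson correction, then scaled by partition volume and dilution factor. A minimal sketch, assuming Poisson-distributed template and hypothetical counts:

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_ul,
                       dilution_factor=1.0):
    """Target concentration (copies per microliter of the undiluted sample)
    from digital PCR end-point counts, assuming Poisson-distributed template."""
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive: sample too concentrated")
    lam = -math.log(1.0 - p)  # mean copies per partition
    return lam / partition_volume_ul * dilution_factor

# Hypothetical droplet run: 5,000 of 20,000 droplets positive, 0.85 nL each,
# sample diluted 10-fold before partitioning
c = dpcr_concentration(5000, 20000, partition_volume_ul=0.00085,
                       dilution_factor=10)
```

The review's three critical factors map directly onto this formula: misclassified partitions change p, an incorrect partition volume rescales the result, and the dilution factor multiplies it.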

  4. Aspect-Oriented Programming is Quantification and Obliviousness

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.

  5. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    Science.gov (United States)

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 day of blood withdrawal.
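
Real-time PCR quantification of this kind typically reads copy numbers off a standard curve and normalizes proviral copies to cell number via a cellular gene; the sketch below illustrates that arithmetic with hypothetical curve parameters and Cq values (it is not the published assay's calibration):

```python
def copies_from_cq(cq, slope, intercept):
    """Copy number from a standard curve Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def proviral_load_per_million(hiv_cq, cell_cq, hiv_curve, cell_curve,
                              gene_copies_per_cell=2):
    """HIV-1 proviral copies per 1e6 cells, with cell input measured via a
    cellular gene present at two copies per diploid genome."""
    hiv_copies = copies_from_cq(hiv_cq, *hiv_curve)
    cells = copies_from_cq(cell_cq, *cell_curve) / gene_copies_per_cell
    return hiv_copies / cells * 1e6

# Hypothetical standard curves (slope, intercept) and measured Cq values
hiv_curve = (-3.32, 40.0)   # slope of -3.32 corresponds to 100% efficiency
cell_curve = (-3.32, 40.0)
load = proviral_load_per_million(33.36, 22.4006, hiv_curve, cell_curve)
```

Quantifying cellular DNA alongside the target, as in the protocol's second stage, is what makes the copies-per-cell normalization possible on crude extracts.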

  6. 2D atom localization in a four-level tripod system in laser fields

    OpenAIRE

    Ivanov, Vladimir; Rozhdestvensky, Yuri

    2012-01-01

    We propose a scheme for two-dimensional (2D) atom localization in a four-level tripod system under the influence of two orthogonal standing-wave fields. Position information about the atom is retained in the atomic internal states by an additional probe field, either a standing or a running wave. It is shown that the localization factors depend crucially on the atom-field coupling, which results in such spatial structures of the populations as spikes, craters and waves. We demonstrate a high-preci...

  7. Application of Enlisted Force Retention Levels and Career Field Stability

    Science.gov (United States)

    2017-03-23

    Thesis presented to the Faculty of the Department of Operational Sciences ... in partial fulfillment of the requirements for the degree of Master of Science in Operations Research. Jamie T. Zimmermann, MS, BS, Captain, USAF, March 2017. ... From Appendix B: the function proc lifetest computes a nonparametric estimate of the survivor function using either the Kaplan-Meier method or the actuarial method.

  8. Cadmium voltametric quantification in table chocolate produced in Chiquinquira-Boyaca, Colombia

    Directory of Open Access Journals (Sweden)

    Paola Andrea Vargas Moreno

    2017-04-01

    Bioaccumulation of heavy metals such as cadmium is a major concern for scientific communities and international food organizations, given the toxicological risk to consumers, and in many places there is no detailed record of the actual cadmium content of foods. This creates a need to measure and record the concentration of this metal in products such as table chocolate, which is widely consumed at the regional and national levels, using an effective, reliable and affordable quantification method. In this research, the cadmium content of powdered and granulated table chocolate produced and sold in the municipality of Chiquinquira, Boyacá, Colombia was determined using the differential pulse anodic stripping voltammetric method (DPVMAR). The parameters of this method (selectivity, linearity, sensitivity, precision and accuracy) were first evaluated, with satisfactory results: selectivity over a potential range of 0.54 to 0.64 V, sensitivity at the ppb level, R2 > 0.95, %CV 80%. Analysis of variance showed no statistically significant differences (P < 0.05) between the results. Cadmium quantification in samples of granulated and powdered chocolate showed concentrations between 214 and 260 ppb, with the highest concentrations in powdered chocolate. Cadmium levels did not exceed the tolerable weekly intake limit for this type of food.

  9. A real-time PCR assay for detection and quantification of Verticillium dahliae in spinach seed.

    Science.gov (United States)

    Duressa, Dechassa; Rauscher, Gilda; Koike, Steven T; Mou, Beiquan; Hayes, Ryan J; Maruthachalam, Karunakaran; Subbarao, Krishna V; Klosterman, Steven J

    2012-04-01

    Verticillium dahliae is a soilborne fungus that causes Verticillium wilt on multiple crops in central coastal California. Although spinach crops grown in this region for fresh and processing commercial production do not display Verticillium wilt symptoms, spinach seeds produced in the United States or Europe are commonly infected with V. dahliae. Planting of the infected seed increases the soil inoculum density and may introduce exotic strains that contribute to Verticillium wilt epidemics on lettuce and other crops grown in rotation with spinach. A sensitive, rapid, and reliable method for quantification of V. dahliae in spinach seed may help identify highly infected lots, curtail their planting, and minimize the spread of exotic strains via spinach seed. In this study, a quantitative real-time polymerase chain reaction (qPCR) assay was optimized and employed for detection and quantification of V. dahliae in spinach germplasm and 15 commercial spinach seed lots. The assay used a previously reported V. dahliae-specific primer pair (VertBt-F and VertBt-R) and an analytical mill for grinding tough spinach seed for DNA extraction. The assay enabled reliable quantification of V. dahliae in spinach seed, with a sensitivity limit of ≈1 infected seed per 100 (1.3% infection in a seed lot). The quantification was highly reproducible between replicate samples of a seed lot and in different real-time PCR instruments. When tested on commercial seed lots, a pathogen DNA content corresponding to a quantification cycle value of ≥31 corresponded with a percent seed infection of ≤1.3%. The assay is useful in qualitatively assessing seed lots for V. dahliae infection levels, and the results of the assay can be helpful to guide decisions on whether to apply seed treatments.

  10. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantifications are possible in species quantification when single copy genomic DNA is used. However, amplification of single copy genomic DNA does not allow a limit of detection as low as one obtained from amplification of repetitive sequences. Amplification...... of repetitive sequences is therefore frequently used in absolute quantification but problems occur in relative quantification as the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species...... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control....

  11. Energy levels of mesic molecules ddμ and dtμ in a homogeneous magnetic field

    International Nuclear Information System (INIS)

    Choi Nam Chol.

    1990-01-01

    The energy levels of mesic molecules ddμ and dtμ in a homogeneous magnetic field of 0-10^8 Gs have been calculated. Calculations are carried out in the adiabatic representation of the three-body problem. It is shown that in really existing fields (≤10^5 Gs) the shifts of energy levels produce no considerable effect on the process of resonant production of mesic molecules. 13 refs.; 3 figs.; 2 tabs

  12. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the …

  13. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  14. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    Science.gov (United States)

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
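    The abstract notes that AR gene length and total read count strongly influence the detected AR level, which is why resistome comparisons typically normalize for both. A minimal sketch of such a length- and depth-normalized abundance (an RPKM-style metric; the function name and exact normalization are assumptions, not the authors' method):

```python
def arg_abundance_rpkm(mapped_reads: int, gene_length_bp: int, total_reads: int) -> float:
    """Length- and depth-normalized AR gene abundance (reads per kb per million reads)."""
    return mapped_reads / ((gene_length_bp / 1e3) * (total_reads / 1e6))

# A 1,200 bp resistance gene with 300 mapped reads in a 10-million-read metagenome:
print(round(arg_abundance_rpkm(300, 1200, 10_000_000), 2))  # 25.0
```

Normalizing this way makes abundances comparable across genes of different lengths and across samples sequenced to different depths, which is the failure mode the simulation experiments above point at.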

  15. Strong-field spatiotemporal ultrafast coherent control in three-level atoms

    International Nuclear Information System (INIS)

    Bruner, Barry D.; Suchowski, Haim; Silberberg, Yaron; Vitanov, Nikolay V.

    2010-01-01

    Simple analytical approaches for implementing strong field coherent control schemes are often elusive due to the complexity of the interaction between the intense excitation field and the system of interest. Here, we demonstrate control over multiphoton excitation in a three-level resonant system using simple, analytically derived ultrafast pulse shapes. We utilize a two-dimensional spatiotemporal control technique, in which temporal focusing produces a spatially dependent quadratic spectral phase, while a second, arbitrary phase parameter is scanned using a pulse shaper. In the current work, we demonstrate weak-to-strong field excitation of 85 Rb, with a π phase step and the quadratic phase as the chosen control parameters. The intricate dependence of the multilevel dynamics on these parameters is exhibited by mapping the data onto a two-dimensional control landscape. Further insight is gained by simulating the complete landscape using a dressed-state, time-domain model, in which the influence of individual shaping parameters can be extracted using both exact and asymptotic time-domain representations of the dressed-state energies.

  16. A critical view on microplastic quantification in aquatic organisms

    DEFF Research Database (Denmark)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics...... to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature...... review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus...

  17. Quantification of extracellular levels of corticosterone in the basolateral amygdaloid complex of freely-moving rats: a dialysis study of circadian variation and stress-induced modulation.

    Science.gov (United States)

    Bouchez, Gaëlle; Millan, Mark J; Rivet, Jean-Michel; Billiras, Rodolphe; Boulanger, Raphaël; Gobert, Alain

    2012-05-03

    Corticosterone influences emotion and cognition via actions in a diversity of corticolimbic structures, including the amygdala. Since extracellular levels of corticosterone in brain have rarely been studied, we characterized a specific and sensitive enzymatic immunoassay for microdialysis quantification of corticosterone in the basolateral amygdaloid complex of freely-moving rats. Corticosterone levels showed marked diurnal variation with an evening (dark phase) peak and stable, low levels during the day (light phase). The "anxiogenic agents", FG7142 (20 mg/kg) and yohimbine (10 mg/kg), and an environmental stressor, 15-min forced-swim, induced marked and sustained (1-3 h) increases in dialysis levels of corticosterone in basolateral amygdaloid complex. They likewise increased dialysis levels of dopamine and noradrenaline, but not serotonin and GABA. As compared to basal corticosterone levels of ~200-300 pg/ml, the elevation provoked by forced-swim was ca. 20-fold and this increase was abolished by adrenalectomy. Interestingly, stress-induced rises of corticosterone levels in basolateral amygdaloid complex were abrogated by combined but not separate administration of the corticotrophin-releasing factor 1 (CRF1) receptor antagonist, CP154,526, and the vasopressin 1b (V1b) receptor antagonist, SSR149,415. Underpinning their specificity, they did not block forced-swim-induced elevations in dopamine and noradrenaline. In conclusion, extracellular levels of corticosterone in the basolateral amygdaloid complex display marked diurnal variation. Further, they are markedly elevated by acute stressors, the effects of which are mediated (in contrast to concomitant elevations in levels of monoamines) by conjoint recruitment of CRF1 and V1b receptors. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Basic Restriction and Reference Level in Anatomically-based Japanese Models for Low-Frequency Electric and Magnetic Field Exposures

    Science.gov (United States)

    Takano, Yukinori; Hirata, Akimasa; Fujiwara, Osamu

    Human exposure to electric and/or magnetic fields at low frequencies may cause direct effects such as nerve stimulation and excitation. Therefore, basic restrictions are specified in terms of induced current density in the ICNIRP guidelines and of in-situ electric field in the IEEE standard. An external electric or magnetic field that does not produce induced quantities exceeding the basic restriction is used as a reference level. The relationship between the basic restriction and the reference level for low-frequency electric and magnetic fields has been investigated using European anatomical models, but only to a limited extent for Japanese models, especially for electric field exposures. In addition, that relationship has not been well characterized. In the present study, we calculated the induced quantities in anatomical Japanese male and female models exposed to electric and magnetic fields at the reference level. A quasi-static finite-difference time-domain (FDTD) method was applied to analyze this problem. As a result, the spatially averaged induced current density was found to be more sensitive to the averaging algorithm than the in-situ electric field. For electric and magnetic field exposure at the ICNIRP reference level, the maximum values of the induced current density for the different averaging algorithms were smaller than the basic restriction in most cases. For exposures at the reference level in the IEEE standard, the maximum electric fields in the brain were larger than the basic restriction for the brain, while those in the spinal cord and heart were smaller.

  19. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  20. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    -chromatography-electrospray-mass-spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, a LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application...... of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min, after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device...... with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high capacity trap mass spectrometer using positive ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  1. Multiscale Characterization and Quantification of Arsenic Mobilization and Attenuation During Injection of Treated Coal Seam Gas Coproduced Water into Deep Aquifers

    Science.gov (United States)

    Rathi, Bhasker; Siade, Adam J.; Donn, Michael J.; Helm, Lauren; Morris, Ryan; Davis, James A.; Berg, Michael; Prommer, Henning

    2017-12-01

    Coal seam gas production involves generation and management of large amounts of co-produced water. One of the most suitable methods of management is injection into deep aquifers. Field injection trials may be used to support the predictions of anticipated hydrological and geochemical impacts of injection. The present work employs reactive transport modeling (RTM) for a comprehensive analysis of data collected from a trial where arsenic mobilization was observed. Arsenic sorption behavior was studied through laboratory experiments, accompanied by the development of a surface complexation model (SCM). A field-scale RTM that incorporated the laboratory-derived SCM was used to simulate the data collected during the field injection trial and then to predict the long-term fate of arsenic. We propose a new practical procedure which integrates laboratory and field-scale models using a Monte Carlo type uncertainty analysis and alleviates a significant proportion of the computational effort required for predictive uncertainty quantification. The results illustrate that both arsenic desorption under alkaline conditions and pyrite oxidation have likely contributed to the arsenic mobilization that was observed during the field trial. The predictive simulations show that arsenic concentrations would likely remain very low if the potential for pyrite oxidation is minimized through complete deoxygenation of the injectant. The proposed modeling and predictive uncertainty quantification method can be implemented for a wide range of groundwater studies that investigate the risks of metal(loid) or radionuclide contamination.

  2. Extensive Evaluation of a Diffusion Denuder Technique for the Quantification of Atmospheric Stable and Radioactive Molecular Iodine

    DEFF Research Database (Denmark)

    Huang, Ru-Jin; Hou, Xiaolin; Hoffmann, Thorsten

    2010-01-01

    In this paper we present the evaluation and optimization of a new approach for the quantification of gaseous molecular iodine (I2) for laboratory- and field-based studies and its novel application for the measurement of radioactive molecular iodine. α-Cyclodextrin (α-CD) in combination with 129I......, and condition of release and derivatization of iodine, is extensively evaluated and optimized. The collection efficiency is larger than 98% and the limit of detection (LOD) obtained is 0.17 parts-per-trillion-by-volume (pptv) for a sampling duration of 30 min at 500 mL min−1. Furthermore, the potential use...... of this protocol for the determination of radioactive I2 at ultra trace level is also demonstrated when 129I− used in the coating is replaced by 127I− and a multiple denuder system is used. Using the present method we observed 25.7−108.6 pptv 127I2 at Mweenish Bay, Ireland and 108 molecule m−3 129I2 at Mainz...

  3. Trace level liquid chromatography tandem mass spectrometry quantification of the mutagenic impurity 2-hydroxypyridine N-oxide as its dansyl derivative.

    Science.gov (United States)

    Ding, Wei; Huang, Yande; Miller, Scott A; Bolgar, Mark S

    2015-03-20

    A derivatization LC-MS/MS method was developed and qualified for the trace level quantification of 2-hydroxypyridine N-oxide (HOPO). HOPO is a coupling reagent used in the syntheses of active pharmaceutical ingredients (APIs) to form amide bonds. HOPO was recently confirmed to generate a positive response in a GLP Ames bacterial-reverse-mutation test, classifying it as a mutagenic impurity and as such requiring its control in APIs to the threshold of toxicological concern (TTC). The derivatization reagent 5-dimethylamino-1-naphthalenesulfonyl chloride (dansyl chloride) was used in a basic solution to convert HOPO into the corresponding dansyl-derivative. The derivative was separated from different APIs and reagents by liquid chromatography. The detection of the HOPO dansyl-derivative was achieved by mass spectrometry in selected reaction monitoring (SRM) mode. The LC-MS/MS method had a reporting limit of 0.1 ng/mL HOPO, which corresponds to 0.1 ppm HOPO relative to an API at 1 mg/mL, and a linearity range of 0.1-25 ng/mL HOPO analyte. Recoveries of HOPO standards spiked into three different API matrices at 0.2, 1.2, and 20 ppm levels were all within 90-100%. An SRM-based confirmatory methodology using the ratios of two fragment ions at three CID energies was developed to verify the identity of HOPO when present at ≥0.6 ppm. This identity confirmation can be employed to prevent potential false positive detection of mutagenic impurities at trace level. It can be broadly applicable for the confirmation of analytes when the analytes generate at least two major fragments in tandem mass spectrometry experiments. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    Science.gov (United States)

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
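    The key observation above, that fields of dipoles outside the ROI are approximately orthogonal to those of dipoles inside it, reduces background removal to a linear least-squares projection. A toy one-dimensional sketch of that idea (assumed simplified kernel and synthetic data, not the authors' 3-D implementation):

```python
import numpy as np

# Toy 1-D analogue of PDF background removal: the measured field inside an ROI
# is modeled as a superposition of fields from unit dipoles located outside the
# ROI; the least-squares projection onto those dipole fields estimates the
# background, and the residual estimates the local field.
n = 200
roi = np.arange(n, dtype=float)
ext_sources = np.array([-50.0, -20.0, 230.0, 260.0])  # positions outside [0, n)

def dipole_field(src, x):
    """Simplistic decaying kernel standing in for a dipole field."""
    r = x - src
    return r / (np.abs(r) ** 3 + 1e-9)

# Basis matrix: one column per external dipole, evaluated inside the ROI.
D = np.stack([dipole_field(s, roi) for s in ext_sources], axis=1)

# Synthetic data: background from external dipoles plus a compact local field.
true_moments = np.array([3.0, -1.5, 2.0, 0.5])
background = D @ true_moments
local = np.exp(-0.5 * ((roi - 100.0) / 5.0) ** 2)  # narrow "local source"
measured = background + local

# Projection onto dipole fields = ordinary linear least squares.
moments, *_ = np.linalg.lstsq(D, measured, rcond=None)
local_est = measured - D @ moments

err = np.linalg.norm(local_est - local) / np.linalg.norm(local)
print(f"relative error of recovered local field: {err:.3f}")
```

The residual recovers the narrow local source well because the smooth external kernels have little overlap with it, which is the 1-D analogue of the approximate orthogonality exploited by PDF.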

  5. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL- penicillamine were demonstrated by Terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low frequency resonant modes, the density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz frequency motions were decided by the intramolecular and intermolecular hydrogen bond interactions. Moreover, the quantification of penicillamine enantiomers mixture was demonstrated by a THz spectra fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in pharmaceutical industry. (authors)
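    The spectra-fitting quantification can be illustrated with a least-squares mixture model: express the mixture THz spectrum as a linear combination of the pure-enantiomer reference spectra and solve for the mixing fraction. The spectra below are mock Lorentzian line shapes, and the closed-form estimator is an assumed simplification of the authors' fitting procedure:

```python
import numpy as np

freq = np.linspace(0.2, 2.5, 300)  # THz axis (arbitrary units)

def lorentzian(f0, w):
    """Simple resonance line shape centered at f0 with width w."""
    return 1.0 / ((freq - f0) ** 2 + w ** 2)

S_L = lorentzian(1.2, 0.05) + 0.5 * lorentzian(2.0, 0.08)  # mock L spectrum
S_D = lorentzian(0.9, 0.05) + 0.7 * lorentzian(1.7, 0.08)  # mock D spectrum

true_fraction = 0.35
mixture = true_fraction * S_L + (1 - true_fraction) * S_D

# Closed-form least-squares estimate of the L fraction:
# minimize ||mixture - f*S_L - (1-f)*S_D||^2 over f.
d = S_L - S_D
f_hat = float(np.dot(mixture - S_D, d) / np.dot(d, d))
print(f"estimated L fraction: {f_hat:.3f}")  # recovers 0.350 on noise-free data
```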

  6. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    Science.gov (United States)

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously together with experimental parameters, including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data sets to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification at unprecedented detail, quantifies active demethylation pathways and reveals 5hmC localization in putative regulatory regions.

  7. Image-guided regularization level set evolution for MR image segmentation and bias field correction.

    Science.gov (United States)

    Wang, Lingfeng; Pan, Chunhong

    2014-01-01

    Magnetic resonance (MR) image segmentation is a crucial step in surgical and treatment planning. In this paper, we propose a level-set-based segmentation method for MR images with intensity inhomogeneous problem. To tackle the initialization sensitivity problem, we propose a new image-guided regularization to restrict the level set function. The maximum a posteriori inference is adopted to unify segmentation and bias field correction within a single framework. Under this framework, both the contour prior and the bias field prior are fully used. As a result, the image intensity inhomogeneity can be well solved. Extensive experiments are provided to evaluate the proposed method, showing significant improvements in both segmentation and bias field correction accuracies as compared with other state-of-the-art approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Effects of an applied low frequency field on the dynamics of a two-level atom interacting with a single-mode field

    International Nuclear Information System (INIS)

    Xun-Wei, Xu; Nian-Hua, Liu

    2010-01-01

    The effects of an applied low frequency field on the dynamics of a two-level atom interacting with a single-mode field are investigated. It is shown that the time evolution of the atomic population is mainly controlled by the coupling constants and the frequency of the low frequency field, which leads to a low frequency modulation function for the time evolution of the upper state population. The amplitude of the modulation function becomes larger as the coupling constants increase. The frequency of the modulation function is proportional to the frequency of the low frequency field, and decreases with increasing coupling constant. (classical areas of phenomenology)

  9. Two-dimensional atom localization based on coherent field controlling in a five-level M-type atomic system.

    Science.gov (United States)

    Jiang, Xiangqian; Li, Jinjiang; Sun, Xiudong

    2017-12-11

    We study two-dimensional sub-wavelength atom localization based on the microwave coupling field controlling and spontaneously generated coherence (SGC) effect. For a five-level M-type atom, introducing a microwave coupling field between two upper levels and considering the quantum interference between two transitions from two upper levels to lower levels, the analytical expression of conditional position probability (CPP) distribution is obtained using the iterative method. The influence of the detuning of a spontaneously emitted photon, Rabi frequency of the microwave field, and the SGC effect on the CPP are discussed. The two-dimensional sub-half-wavelength atom localization with high-precision and high spatial resolution is achieved by adjusting the detuning and the Rabi frequency, where the atom can be localized in a region smaller than λ/10 × λ/10. The spatial resolution is improved significantly compared with the case without the microwave field.

  10. Asymmetric flow field-flow fractionation coupled to inductively coupled plasma mass spectrometry for the quantification of quantum dots bioconjugation efficiency.

    Science.gov (United States)

    Menéndez-Miranda, Mario; Encinar, Jorge Ruiz; Costa-Fernández, José M; Sanz-Medel, Alfredo

    2015-11-27

    Hyphenation of asymmetric flow field-flow fractionation (AF4) to an on-line elemental detection (inductively coupled plasma-mass spectrometry, ICP-MS) is proposed as a powerful diagnostic tool for quantum dots bioconjugation studies. In particular, conjugation effectiveness between a "model" monoclonal IgG antibody (Ab) and CdSe/ZnS core-shell Quantum Dots (QDs), surface-coated with an amphiphilic polymer, has been monitored here by such hybrid AF4-ICP-MS technique. Experimental conditions have been optimized searching for a proper separation between the sought bioconjugates from the eventual free reagents excesses employed during the bioconjugation (QDs and antibodies). Composition and pH of the carrier have been found to be critical parameters to ensure an efficient separation while ensuring high species recovery from the AF4 channel. An ICP-MS equipped with a triple quadrupole was selected as elemental detector to enable sensitive and reliable simultaneous quantification of the elemental constituents, including sulfur, of the nanoparticulated species and the antibody. The hyphenated technique used provided nanoparticle size-based separation, elemental detection, and composition analysis capabilities that turned out to be instrumental in order to investigate in depth the Ab-QDs bioconjugation process. Moreover, the analytical strategy here proposed allowed us not only to clearly identify the bioconjugation reaction products but also to quantify nanoparticle:antibody bioconjugation efficiency. This is a key issue in future development of analytical and bioanalytical photoluminescent QDs applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  12. Reconstructing Northern Hemisphere upper-level fields during World War II

    Energy Technology Data Exchange (ETDEWEB)

    Broennimann, S. [Lunar and Planetary Laboratory, University of Arizona, PO Box 210092, Tucson, AZ 85721-0092 (United States); Luterbacher, J. [Institute of Geography, University of Bern, Bern (Switzerland); NCCR Climate, University of Bern, Bern (Switzerland)

    2004-05-01

    Monthly mean fields of temperature and geopotential height (GPH) from 700 to 100 hPa were statistically reconstructed for the extratropical Northern Hemisphere for the World War II period. The reconstruction was based on several hundred predictor variables, comprising temperature series from meteorological stations and gridded sea level pressure data (1939-1947) as well as a large amount of historical upper-air data (1939-1944). Statistical models were fitted in a calibration period (1948-1994) using the NCEP/NCAR Reanalysis data set as predictand. The procedure consists of a weighting scheme, principal component analyses on both the predictor variables and the predictand fields and multiple regression models relating the two sets of principal component time series to each other. According to validation experiments, the reconstruction skill in the 1939-1944 period is excellent for GPH at all levels and good for temperature up to 500 hPa, but somewhat worse for 300 hPa temperature and clearly worse for 100 hPa temperature. Regionally, high predictive skill is found over the midlatitudes of Europe and North America, but a lower quality over Asia, the subtropics, and the Arctic. Moreover, the quality is considerably better in winter than in summer. In the 1945-1947 period, reconstructions are useful up to 300 hPa for GPH and, in winter, up to 500 hPa for temperature. The reconstructed fields are presented for selected months and analysed from a dynamical perspective. It is demonstrated that the reconstructions provide a useful tool for the analysis of large-scale circulation features as well as stratosphere-troposphere coupling in the late 1930s and early 1940s. (orig.)
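    The reconstruction pipeline described, principal component analyses of the predictors and the predictand linked by multiple regression over a calibration period, can be sketched on synthetic data. Dimensions and data below are invented, and the weighting scheme and validation steps are omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic calibration set: station series X and gridded fields Y that share
# a few large-scale modes, mimicking the predictor/predictand setup above.
t_cal, n_sta, n_grid = 120, 30, 500  # calibration months, stations, grid points
latent = rng.normal(size=(t_cal, 3))
X = latent @ rng.normal(size=(3, n_sta)) + 0.1 * rng.normal(size=(t_cal, n_sta))
Y = latent @ rng.normal(size=(3, n_grid)) + 0.1 * rng.normal(size=(t_cal, n_grid))

def pca(A, k):
    """Return mean, leading k EOFs, and PC scores of the anomaly matrix."""
    mean = A.mean(axis=0)
    U, s, Vt = np.linalg.svd(A - mean, full_matrices=False)
    return mean, Vt[:k], (A - mean) @ Vt[:k].T

mx, ex, px = pca(X, 3)
my, ey, py = pca(Y, 3)

# Multiple regression linking predictor PCs to predictand PCs.
B, *_ = np.linalg.lstsq(px, py, rcond=None)

# "Reconstruction period": from station data alone, predict the full field.
x_new = X[:10]
y_hat = ((x_new - mx) @ ex.T) @ B @ ey + my
rmse = np.sqrt(np.mean((y_hat - Y[:10]) ** 2))
print(f"in-sample reconstruction RMSE: {rmse:.3f}")
```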

  13. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  14. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    Science.gov (United States)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher-order multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents the first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogenous gene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
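    Absolute quantification in droplet digital PCR rests on Poisson statistics: the target concentration follows from the fraction of droplets that stay negative. A minimal sketch of that step (not the authors' Shiny tool; the ~0.85 nL droplet volume is the nominal Bio-Rad value and the counts are invented):

    ```python
    import math

    def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
        """Estimate copies per microlitre from droplet counts.

        A droplet is negative with probability exp(-lambda), where lambda
        is the mean number of target copies per droplet, so
        lambda = -ln(fraction of negative droplets).
        """
        n_negative = n_total - n_positive
        lam = -math.log(n_negative / n_total)   # copies per droplet
        return lam / droplet_volume_ul          # copies per microlitre

    # Example: 4,000 positive droplets out of 20,000 accepted droplets
    conc = ddpcr_concentration(4000, 20000)     # ~262.5 copies/uL
    ```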

  15. Quantification of cardiac magnetic field orientation during ventricular de- and repolarization

    International Nuclear Information System (INIS)

    Leeuwen, Peter van; Lange, Silke; Klein, Anita; Geue, Daniel; Groenemeyer, Dietrich; Hailer, Birgit; Seybold, Katrin; Poplutz, Christian

    2008-01-01

    We compared the stability and discriminatory power of different methods of determining cardiac magnetic field map (MFM) orientation within the context of coronary artery disease (CAD). In 27 healthy subjects and 24 CAD patients, multichannel magnetocardiograms were registered at rest. MFM orientation was determined during the QT interval using: (a) locations of the positive and negative centres-of-gravity, (b) locations of the field extrema and (c) the direction of the maximum field gradient. Deviation from normal orientation quantified the ability of each approach to discriminate between healthy and CAD subjects. Although the course of orientation was similar for all methods, receiver operating characteristic analysis showed the best discrimination of CAD patients for the centre-of-gravity approach (area-under-the-curve = 86%), followed by the gradient (84%) and extrema (76%) methods. Consideration of methodological and discriminatory advantages with respect to noninvasive diagnosis of CAD suggests that the centres-of-gravity method is the best suited.
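    The area-under-the-curve comparison used above can be reproduced with the rank-sum identity AUC = P(score of a CAD case > score of a healthy case), counting ties as half. A sketch with invented orientation-deviation scores; the function is generic, not the authors' implementation:

    ```python
    def roc_auc(scores_pos, scores_neg):
        """Area under the ROC curve via the Mann-Whitney U identity:
        the fraction of (positive, negative) pairs ranked correctly,
        with ties counted as 0.5."""
        wins = 0.0
        for p in scores_pos:
            for n in scores_neg:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(scores_pos) * len(scores_neg))

    # Hypothetical deviations from normal orientation (degrees)
    cad     = [38, 45, 52, 60, 71]
    healthy = [10, 15, 22, 30, 41]
    auc = roc_auc(cad, healthy)   # 24 of 25 pairs ordered correctly -> 0.96
    ```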

  16. Outdoor characterization of radio frequency electromagnetic fields in a Spanish birth cohort

    Energy Technology Data Exchange (ETDEWEB)

    Calvente, I. [Unit Research Support of the San Cecilio University Hospital, Biosanitary Institute of Granada (ibs.GRANADA), University Hospitals of Granada/University of Granada, Granada (Spain); Department of Radiology and Physical Medicine, School of Medicine, University of Granada, Av. Madrid s/n, Granada 18071 (Spain); Fernández, M.F. [Unit Research Support of the San Cecilio University Hospital, Biosanitary Institute of Granada (ibs.GRANADA), University Hospitals of Granada/University of Granada, Granada (Spain); Department of Radiology and Physical Medicine, School of Medicine, University of Granada, Av. Madrid s/n, Granada 18071 (Spain); CIBER en Epidemiología y Salud Pública (CIBERESP) (Spain); Pérez-Lobato, R.; Dávila-Arias, C.; Ocón, O.; Ramos, R. [Unit Research Support of the San Cecilio University Hospital, Biosanitary Institute of Granada (ibs.GRANADA), University Hospitals of Granada/University of Granada, Granada (Spain); Ríos-Arrabal, S. [Unit Research Support of the San Cecilio University Hospital, Biosanitary Institute of Granada (ibs.GRANADA), University Hospitals of Granada/University of Granada, Granada (Spain); Department of Radiology and Physical Medicine, School of Medicine, University of Granada, Av. Madrid s/n, Granada 18071 (Spain); Villalba-Moreno, J. [CIBER en Epidemiología y Salud Pública (CIBERESP) (Spain); and others

    2015-04-15

    There is considerable public concern in many countries about the possible adverse effects of exposure to non-ionizing electromagnetic fields, especially in vulnerable populations such as children. The aim of this study was to characterize environmental exposure profiles within the frequency range 100 kHz–6 GHz in the immediate surroundings of the dwellings of 123 families from the INMA-Granada birth cohort in Southern Spain, using spot measurements. The arithmetic mean root mean-square electric field (E{sub RMS}) and power density (S{sub RMS}) values were, respectively, 195.79 mV/m (42.3% of data were above this mean) and 799.01 µW/m{sup 2} (30% of values were above this mean); median values were 148.80 mV/m and 285.94 µW/m{sup 2}, respectively. Exposure levels below the quantification limit were assigned a value of 0.01 V/m. Incident field strength levels varied widely among different areas or towns/villages, demonstrating spatial variability in the distribution of exposure values related to the population size of the area, as well as seasonal variability. Although recorded values were well below International Commission on Non-Ionizing Radiation Protection reference levels, there is a particular need to characterize incident field strength levels in vulnerable populations (e.g., children) because of their chronic and ever-increasing exposure. The effects of incident field strength have not been fully elucidated; however, it may be appropriate to apply the precautionary principle in order to reduce exposure in susceptible groups. - Highlights: • Spot measurements were performed in the immediate surroundings of children's dwellings. • Mean root mean-square electric field and power density values were calculated. • Most recorded values were far below international standard guideline limits. • Data demonstrate spatial variability in the distribution of exposure levels. • While adverse effects are not proven, application of the precautionary principle may

  17. Stable isotope dilution quantification of mutagens in cooked foods by combined liquid chromatography-thermospray mass spectrometry

    International Nuclear Information System (INIS)

    Yamaizumi, Ziro; Kasai, Hiroshi; Nishimura, Susumu; Edmonds, C.G.; McCloskey, J.A.

    1986-01-01

    A method of general applicability for the detection and quantification of mutagens in cooked foods at the ppb level is presented. A minimal sample prefractionation is employed, and [Me-²H₃]-labeled analogs of the compounds of interest are added for identification and quantification of mutagens by accurate measurement of chromatographic retention (K') in reverse-phase high-performance liquid chromatography (HPLC), and by measurement of the ratio of response of the protonated molecular ions of analyte and internal standard by directly coupled liquid chromatography-mass spectrometry (LC/MS). Initial application is demonstrated in the analysis of 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and 2-amino-3,4-dimethylimidazo[4,5-f]quinoline (MeIQ) in broiled salmon. (Auth.)
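    Stable isotope dilution reduces the quantification to a response ratio between the analyte and its co-eluting labelled internal standard. A hedged sketch of that arithmetic with invented peak areas; the `response_factor` and spike amount are illustrative assumptions, not values from the paper:

    ```python
    def isotope_dilution_amount(area_analyte, area_labeled, amount_labeled_ng,
                                response_factor=1.0):
        """Analyte amount from the analyte/internal-standard response ratio.

        response_factor corrects for any difference in ionisation
        efficiency between the analyte and its trideuteromethyl-labelled
        analog (close to 1.0 for deuterated analogs).
        """
        return (area_analyte / area_labeled) * amount_labeled_ng / response_factor

    # Example: 25 ng of labelled IQ spiked into the extract
    amount_iq = isotope_dilution_amount(area_analyte=5.2e4,
                                        area_labeled=1.3e5,
                                        amount_labeled_ng=25.0)   # 10 ng
    ```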

  18. Analyzing Social Interactions: Promises and Challenges of Cross Recurrence Quantification Analysis

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Konvalinka, Ivana; Wallot, Sebastian

    2014-01-01

    The scientific investigation of social interactions presents substantial challenges: interacting agents engage each other at many different levels and timescales (motor and physiological coordination, joint attention, linguistic exchanges, etc.), often making their behaviors interdependent in non-linear ways. In this paper we review the current use of Cross Recurrence Quantification Analysis (CRQA) in the analysis of social interactions, and assess its potential and challenges. We argue that the method can sensitively grasp the dynamics of human interactions, and that it has started producing valuable...

  19. QUANTIFICATION OF TRANSGENIC PLANT MARKER GENE PERSISTENCE IN THE FIELD

    Science.gov (United States)

    Methods were developed to monitor persistence of genomic DNA in decaying plants in the field. As a model, we used recombinant neomycin phosphotransferase II (rNPT-II) marker genes present in genetically engineered plants. Polymerase chain reaction (PCR) primers were designed, com...

  20. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  1. Quantification of the recovered oil and water fractions during water flooding laboratory experiments

    DEFF Research Database (Denmark)

    Katika, Konstantina; Halim, Amalia Yunita; Shapiro, Alexander

    2015-01-01

    the volume might be less than a few microliters. In this study, we approach the determination of the oil volumes in flooding effluents using predetermined amounts of the North Sea oil with synthetic seawater. The UV/visible spectroscopy method and low-field NMR spectrometry are compared for this determination, and an account of advantages and disadvantages of each method is given. Both methods are reproducible with high accuracy. The NMR method was capable of direct quantification of both oil and water fractions, while the UV/visible spectroscopy quantifies only the oil fraction using a standard curve.

  2. Modelling soil water dynamics and crop water uptake at the field level

    NARCIS (Netherlands)

    Kabat, P.; Feddes, R.A.

    1995-01-01

    Parametrization approaches to model soil water dynamics and crop water uptake at field level were analysed. Averaging and numerical difficulties in applying numerical soil water flow models to heterogeneous soils are highlighted. Simplified parametrization approaches to the soil water flow, such as

  3. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light scattering quantification of agglutination assays in a two

  4. Facebook levels the playing field: Dyslexic students learning through digital literacies

    Directory of Open Access Journals (Sweden)

    Owen Barden

    2014-02-01

    Full Text Available Dyslexia has an ambivalent relationship with learning technology. Any potential gains may be nullified if the technology is perceived to exacerbate stigma. This paper examines the use of an ‘everyday’ technology, Facebook, by a small group of sixth form students labelled as dyslexic. ‘Levelling the playing field’ is a phrase the participants used often when discussing what they wanted from learning technology. Because dyslexia usually is defined in terms of significant difficulties with literacy, we might reasonably anticipate that the participants would see Facebook as stigmatising rather than levelling the playing field, because of the very public literacy events that it demands. However, the data indicate that far from shying away from Facebook because of fear of their difficulties with literacy being exposed, the participants enthusiastically embraced it. The students saw Facebook as a desirable presence in their education, one that supported inclusion. For them, levelling the playing field with Facebook had five dimensions: keeping up to date and meeting deadlines; increased control over learning; developing metacognitive awareness; greater control over literacy process and demands; and being experts and helpers. The findings perhaps challenge some assumptions about dyslexia, literacy and learning, and may be of interest to teachers working with dyslexic students, or researchers studying learning in digitally mediated social networks.

  5. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    Science.gov (United States)

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  7. Results after ten years of field testing low-level radioactive waste forms using lysimeters

    International Nuclear Information System (INIS)

    McConnell, J.W. Jr.; Rogers, R.D.; Jastrow, J.D.; Sanford, W.E.; Larsen, I.L.; Sullivan, T.M.

    1995-01-01

    The Field Lysimeter Investigations: Low-Level Waste Data Base Development Program is obtaining information on the performance of radioactive waste forms. Ion-exchange resins from a commercial nuclear power station were solidified into waste forms using portland cement and vinyl ester-styrene. These waste forms are being tested to: (a) obtain information on performance of waste forms in typical disposal environments, (b) compare field results with bench leach studies, (c) develop a low-level waste data base for use in performance assessment source term calculations, and (d) apply the DUST computer code to compare predicted cumulative release to actual field data. The program, funded by the Nuclear Regulatory Commission (NRC), includes observed radionuclide releases from waste forms in field lysimeters. The purpose of this paper is to present the experimental results of two lysimeter arrays over 10 years of operation, and to compare those results to bench test results and to DUST code predicted releases. Further analysis of soil cores taken to define the observed upward migration of radionuclides in one lysimeter is also presented

  8. Uncertainty quantification applied to the radiological characterization of radioactive waste.

    Science.gov (United States)

    Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P

    2017-09-01

    This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
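    The propagation step described above can be sketched as a plain Monte Carlo: draw random input parameters (for instance an unmeasured trace-element concentration), push them through the activity model, and summarise the output distribution. The model, distributions and constants below are invented placeholders, not CERN's actual calculation:

    ```python
    import random
    import statistics

    random.seed(1)

    def activity(cobalt_ppm, flux_factor):
        # Hypothetical activation model: activity proportional to the
        # trace cobalt concentration and an irradiation scaling factor.
        return 0.42 * cobalt_ppm * flux_factor

    # Random inputs: a lognormal trace-element concentration (often not
    # measurable for legacy waste) and a normally distributed flux factor.
    samples = []
    for _ in range(20000):
        cobalt = random.lognormvariate(mu=1.0, sigma=0.5)   # ppm
        flux = random.gauss(1.0, 0.1)
        samples.append(activity(cobalt, flux))

    samples.sort()
    mean_act = statistics.fmean(samples)
    p95 = samples[int(0.95 * len(samples))]   # one-sided 95th percentile
    ```

    The mean activity would be compared against the repository acceptance limit, while the upper percentile quantifies the margin claimed by the characterization.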

  9. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    Science.gov (United States)

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid challenges with obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
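    The standard-addition slope factors mentioned above come down to an ordinary least-squares fit of instrument response against added concentration; the slope is the calibration factor, and the intercept reflects the BTEX already present in the inherently polluted lab air. A sketch with invented numbers (the spike levels and peak areas are illustrative, not the paper's data):

    ```python
    def slope_intercept(x, y):
        """Ordinary least-squares fit of y = a*x + b."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
            / sum((xi - mx) ** 2 for xi in x)
        return a, my - a * mx

    # Hypothetical standard addition in lab air: added benzene (ug/m3)
    # vs GC-MS peak area; the unspiked vial (x = 0) is not blank.
    added = [0.0, 20.0, 40.0, 80.0]
    area  = [1500.0, 4500.0, 7500.0, 13500.0]

    slope, intercept = slope_intercept(added, area)
    background = intercept / slope   # benzene already in the lab air
    ```

    Field samples are then quantified with `slope` alone, which is why the method only needs the lab-air and outdoor-air slopes to agree.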

  10. Robust high-resolution quantification of time signals encoded by in vivo magnetic resonance spectroscopy

    Science.gov (United States)

    Belkić, Dževad; Belkić, Karen

    2018-01-01

    This paper on molecular imaging emphasizes improving specificity of magnetic resonance spectroscopy (MRS) for early cancer diagnostics by high-resolution data analysis. Sensitivity of magnetic resonance imaging (MRI) is excellent, but specificity is insufficient. Specificity is improved with MRS by going beyond morphology to assess the biochemical content of tissue. This is contingent upon accurate data quantification of diagnostically relevant biomolecules. Quantification is spectral analysis which reconstructs chemical shifts, amplitudes and relaxation times of metabolites. Chemical shifts inform on electronic shielding of resonating nuclei bound to different molecular compounds. Oscillation amplitudes in time signals retrieve the abundance of MR sensitive nuclei whose number is proportional to metabolite concentrations. Transverse relaxation times, the reciprocal of decay probabilities of resonances, arise from spin-spin coupling and reflect local field inhomogeneities. In MRS single voxels are used. For volumetric coverage, multi-voxels are employed within a hybrid of MRS and MRI called magnetic resonance spectroscopic imaging (MRSI). Common to MRS and MRSI is encoding of time signals and subsequent spectral analysis. Encoded data do not provide direct clinical information. Spectral analysis of time signals can yield the quantitative information, of which metabolite concentrations are the most clinically important. This information is equivocal with standard data analysis through the non-parametric, low-resolution fast Fourier transform and post-processing via fitting. By applying the fast Padé transform (FPT) with high-resolution, noise suppression and exact quantification via quantum mechanical signal processing, advances are made, presented herein, focusing on four areas of critical public health importance: brain, prostate, breast and ovarian cancers.

  11. A Variational Level Set Approach Based on Local Entropy for Image Segmentation and Bias Field Correction.

    Science.gov (United States)

    Tang, Jian; Jiang, Xiaoliang

    2017-01-01

    Image segmentation has always been a considerable challenge in image analysis and understanding due to the intensity inhomogeneity, which is also commonly known as bias field. In this paper, we present a novel region-based approach based on local entropy for segmenting images and estimating the bias field simultaneously. Firstly, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is local entropy derived from a grey level distribution of local image. The means of this objective function have a multiplicative factor that estimates the bias field in the transformed domain. Then, the bias field prior is fully used. Therefore, our model can estimate the bias field more accurately. Finally, minimization of this energy function with a level set regularization term, image segmentation, and bias field estimation can be achieved. Experiments on images of various modalities demonstrated the superior performance of the proposed method when compared with other state-of-the-art approaches.
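    The weighting term above is the Shannon entropy of the grey-level distribution inside a local window: low in homogeneous regions, higher where several grey levels mix (edges, inhomogeneity). A minimal sketch of that computation (1-D windows for brevity; not the authors' code):

    ```python
    import math
    from collections import Counter

    def local_entropy(window):
        """Shannon entropy of the grey-level distribution in a window."""
        counts = Counter(window)
        n = len(window)
        return -sum((c / n) * math.log(c / n) for c in counts.values())

    flat = [10] * 16                    # homogeneous patch -> entropy 0
    edge = [10] * 8 + [200] * 8         # patch straddling an edge -> ln 2
    h_flat, h_edge = local_entropy(flat), local_entropy(edge)
    ```

    In the paper this scalar weights the local Gaussian distribution fitting energy at each pixel; the sketch only shows why the weight distinguishes flat regions from mixed ones.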

  12. Radionuclide release from low-level waste in field lysimeters

    International Nuclear Information System (INIS)

    Oblath, S.B.

    1986-01-01

    A field program has been in operation for 8 years at the Savannah River Plant (SRP) to determine the leaching/migration behavior of low-level radioactive waste using lysimeters. The lysimeters are soil-filled caissons containing well characterized wastes, with each lysimeter serving as a model of a shallow land burial trench. Sampling and analysis of percolate water and vegetation from the lysimeters provide a determination of the release rates of the radionuclides from the waste/soil system. Vegetative uptake appears to be a major pathway for migration. Fractional release rates from the waste/soil system are less than 0.01% per year. Waste-to-soil leach rates up to 10% per year have been determined by coring several of the lysimeters. The leaching of solidified wasteforms under unsaturated field conditions has agreed well with static, immersion leaching of the same type waste in the laboratory. However, releases from the waste/soil system in the lysimeter may be greater than predicted based on leaching alone, due to complexation of the radionuclides by other components leached from the wastes to form mobile, anionic species

  13. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    Science.gov (United States)

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe the maximum allowable concentration for tributyltin (TBT) compounds in surface water has been regulated by the water framework directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as tributyltin cation). Despite the large number of different methodologies for the quantification of organotin species developed in the last two decades, standardised analytical methods do not exist at the required concentration level. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decisional tool for the planning of experiments, because it can be applied to predict the TBT mass fraction recovery when the

  14. Coherent scattering of three-level atoms in the field of a bichromatic standing light wave

    International Nuclear Information System (INIS)

    Pazgalev, A.S.; Rozhdestvenskii, Yu.V.

    1996-01-01

    We discuss the coherent scattering of three-level atoms in the field of two standing light waves for two values of the spatial shift. In the case of a zero spatial shift and equal frequency detunings of the standing waves, the problem of scattering of a three-level atom is reduced to scattering of an effectively two-level atom. For the case of an exact resonance between the waves and transitions, we give expressions for the population probability of the states of the three-level atom obtained in the short-interaction-time approximation. Depending on the initial population distribution over the states, different scattering modes are realized. In particular, we show that there can be initial conditions for which the three-level system does not interact with the field of the standing waves, with the result that there is no coherent scattering of atoms. In the case of standing waves shifted by π/2, there are two types of solution, depending on the values of the frequency detuning. For instance, when the light waves are detuned equally, we give the exact solution for arbitrary relationships between the detuning and the standing wave intensities, valid for any atom-field interaction times. The case of 'mirror' detunings and shifted standing waves is studied only numerically

  15. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  16. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    Directory of Open Access Journals (Sweden)

    Jongbin Ko

    2014-01-01

    Full Text Available A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this heterogeneous connectivity, a smart grid system faces potential security threats in its network. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  17. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    Science.gov (United States)

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this heterogeneity, a smart grid system faces potential security threats through its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  18. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    Science.gov (United States)

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (i) voxel intensities; (ii) principal components of image voxel intensities; (iii) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (i) the minimum of age-matched controls; (ii) the mean minus 1/1.5/2 standard deviations from age-matched controls; (iii) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (iv) selection of the optimum operating point on the receiver operator characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data respectively. Classification

  19. Radiation dose determines the method for quantification of DNA double strand breaks

    International Nuclear Information System (INIS)

    Bulat, Tanja; Keta, Olitija; Korićanac, Lela; Žakula, Jelena; Petrović, Ivan; Ristić-Fira, Aleksandra; Todorović, Danijela

    2016-01-01

    Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes the formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and flow cytometry, provides more accurate analysis by showing the exact position and intensity of the fluorescent signal in each single cell. In practice there are problems in the quantification of γH2AX. This paper addresses two issues: which technique should be applied for a given radiation dose, and how to analyze fluorescent microscopy images obtained with different microscopes. HTB140 melanoma cells were exposed to γ-rays in the dose range from 1 to 16 Gy. Radiation effects on the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: an AxioVision (Zeiss, Germany) microscope equipped with ApoTome software, and an AxioImagerA1 microscope (Zeiss, Germany). The obtained results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. The AxioVision microscope with ApoTome software was more suitable for the detection of γH2AX foci. (author)

  20. Radiation dose determines the method for quantification of DNA double strand breaks

    Energy Technology Data Exchange (ETDEWEB)

    Bulat, Tanja; Keta, Olitija; Korićanac, Lela; Žakula, Jelena; Petrović, Ivan; Ristić-Fira, Aleksandra [University of Belgrade, Vinča Institute of Nuclear Sciences, Belgrade (Serbia); Todorović, Danijela, E-mail: dtodorovic@medf.kg.ac.rs [University of Kragujevac, Faculty of Medical Sciences, Kragujevac (Serbia)

    2016-03-15

    Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes the formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and flow cytometry, provides more accurate analysis by showing the exact position and intensity of the fluorescent signal in each single cell. In practice there are problems in the quantification of γH2AX. This paper addresses two issues: which technique should be applied for a given radiation dose, and how to analyze fluorescent microscopy images obtained with different microscopes. HTB140 melanoma cells were exposed to γ-rays in the dose range from 1 to 16 Gy. Radiation effects on the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: an AxioVision (Zeiss, Germany) microscope equipped with ApoTome software, and an AxioImagerA1 microscope (Zeiss, Germany). The obtained results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. The AxioVision microscope with ApoTome software was more suitable for the detection of γH2AX foci. (author)

  1. Quantification of genetically modified soya using strong anion exchange chromatography and time-of-flight mass spectrometry.

    Science.gov (United States)

    Chang, Po-Chih; Reddy, P Muralidhar; Ho, Yen-Peng

    2014-09-01

    Stable-isotope dimethyl labeling was applied to the quantification of genetically modified (GM) soya. The herbicide-resistant gene-related protein 5-enolpyruvylshikimate-3-phosphate synthase (CP4 EPSPS) was labeled using a dimethyl labeling reagent, formaldehyde-H2 or -D2. The identification and quantification of CP4 EPSPS was performed using matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). The CP4 EPSPS protein was separated from high abundance proteins using strong anion exchange chromatography and sodium dodecyl sulfate-polyacrylamide gel electrophoresis. Then, the tryptic peptides from the samples and reference were labeled with formaldehyde-H2 and formaldehyde-D2, respectively. The two labeled pools were mixed and analyzed using MALDI-MS. The data showed a good correlation between the peak ratio of the H- and D-labeled peptides and the GM soya percentages at 0.5, 1, 3, and 5 %, with R² of 0.99. The labeling reagents are readily available. The labeling experiments and the detection procedures are simple. The approach is useful for the quantification of GM soya at a level as low as 0.5 %.

  2. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limit of detection (LOD) and limit of quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
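
The LOD and LOQ figures quoted in this abstract are conventionally derived from a calibration curve via the ICH Q2(R1) relations LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S its slope. A minimal sketch of that arithmetic (the data and function names are hypothetical; this shows the general convention, not necessarily the authors' exact procedure):

```python
def fit_line(x, y):
    """Ordinary least-squares line y = slope*x + intercept, with residual SD."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    intercept = my - slope * mx
    sse = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    resid_sd = (sse / (n - 2)) ** 0.5   # residual standard deviation (sigma)
    return slope, intercept, resid_sd

def lod_loq(slope, resid_sd):
    # ICH Q2(R1) convention: LOD = 3.3*sigma/S, LOQ = 10*sigma/S
    return 3.3 * resid_sd / slope, 10.0 * resid_sd / slope
```

By construction, LOQ/LOD is always 10/3.3 ≈ 3, which matches the rough ratio of the LOQ to LOD values reported for each lipid.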

  3. Enhanced UV exposure on a ski-field compared with exposures at sea level.

    Science.gov (United States)

    Allen, Martin; McKenzie, Richard

    2005-05-01

    Personal erythemal UV monitoring badges, which were developed to monitor the UV exposure of school children, were used to measure UV exposures received by one of the authors (MA) at the Mt Hutt ski-field, in New Zealand. These were then compared with measurements taken at the same times from a nearby sea level site in Christchurch city. The badges were designed to give instantaneous readings of erythemally-weighted (i.e., "sun burning") UV radiation and were cross-calibrated against meteorological grade UV instruments maintained by the National Institute of Water & Atmospheric Research (NIWA). All skiing and calibration days were clear and almost exclusively cloud free. It was found that the UV maxima for horizontal surfaces at the ski-field (altitude approximately 2 km) were 20-30% greater than at the low altitude site. Larger differences between the sites were observed when the sensor was oriented perpendicular to the sun. The personal doses of UV received by a sensor on the skier's lapel during two days of skiing activity were less than those received by a stationary detector on a horizontal surface near sea level. The exposures depended strongly on the time of year, and in mid-October the maximum UV intensity on the ski-field was 60% greater than in mid-September. The UV exposure levels experienced during skiing were smaller than the summer maxima at low altitudes.

  4. Influence of Co-57 and CT Transmission Measurements on the Quantification Accuracy and Partial Volume Effect of a Small Animal PET Scanner.

    Science.gov (United States)

    Mannheim, Julia G; Schmid, Andreas M; Pichler, Bernd J

    2017-12-01

    Non-invasive in vivo positron emission tomography (PET) provides high detection sensitivity in the nano- to picomolar range and, in addition to other advantages, the possibility to absolutely quantify the acquired data. The present study focuses on the comparison of transmission data acquired with an X-ray computed tomography (CT) scanner or a Co-57 source for the Inveon small animal PET scanner (Siemens Healthcare, Knoxville, TN, USA), and determines their influence on the quantification accuracy and the partial volume effect (PVE). A special focus was the impact of the performed calibration on the quantification accuracy. Phantom measurements were carried out to determine the quantification accuracy, the influence of the object size on the quantification, and the PVE for different sphere sizes, along the field of view and for different contrast ratios. An influence of the emission activity on the Co-57 transmission measurements was discovered (deviations of up to 24.06 % between measured and true activity), whereas no influence of the emission activity on the CT attenuation correction was identified. The quantification accuracy was influenced by the applied calibration factor and by the object size. The PVE demonstrated a dependency on the sphere size, the position within the field of view, the reconstruction and correction algorithms, and the count statistics. Depending on the reconstruction algorithm, only ∼30-40 % of the true activity within a small sphere could be resolved. The iterative 3D reconstruction algorithms yielded substantially increased recovery values compared to the analytical and 2D iterative reconstruction algorithms (up to 70.46 % and 80.82 % recovery for the smallest and largest sphere using iterative 3D reconstruction algorithms). The transmission measurement (CT or Co-57 source) used to correct for attenuation did not severely influence the PVE. The analysis of the quantification accuracy and the PVE revealed an influence of the object size, the reconstruction

  5. Job Motivation Level for Elementary School Teachers Who Made Field Changes

    Science.gov (United States)

    Erdener, Mehmet Akif; Dalkiran, Merve

    2017-01-01

    The aim of this research is to determine the job motivation levels of primary school teachers who have made or have had to make field changes due to the new education system (4+4+4). The sample of the research consists of 512 teachers working in primary and secondary schools in Balikesir province in 2016-2017. The data needed for the research were…

  6. Label-free quantification of Tacrolimus in biological samples by atomic force microscopy

    International Nuclear Information System (INIS)

    Menotta, Michele; Biagiotti, Sara; Streppa, Laura; Rossi, Luigia; Magnani, Mauro

    2015-01-01

    Highlights: • Tacrolimus is a potent immunosuppressant drug that has to be continually monitored. • We present an atomic force microscope approach for the quantification of Tacrolimus in blood samples. • Detection and quantification have been successfully achieved. - Abstract: In the present paper we describe an atomic force microscopy (AFM)-based method for the quantitative analysis of FK506 (Tacrolimus) in whole blood (WB) samples. Current reference methods used to quantify this immunosuppressive drug are based on mass spectrometry. In addition, an immunoenzymatic assay (ELISA) has been developed and is widely used in the clinic, even though it shows a small but consistent overestimation of the actual drug concentration when compared with the mass spectrometry method. The AFM biosensor presented herein utilises the endogenous drug receptor, FKBP12, to quantify Tacrolimus levels. The biosensor was first assayed to detect the free drug in solution, and subsequently used for the detection of Tacrolimus in blood samples. The sensor was suitable for generating a dose–response curve over the full range of clinical drug monitoring. A comparison with the clinically tested ELISA assay is also reported.

  7. Quantification of severe accident source terms of a Westinghouse 3-loop plant

    International Nuclear Information System (INIS)

    Lee Min; Ko, Y.-C.

    2008-01-01

    Integrated severe accident analysis codes are used to quantify the source terms of the representative sequences identified in PSA studies. The characteristics of these source terms depend on the detailed design of the plant and the accident scenario. A historical perspective on radioactive source terms is provided. The grouping of radionuclides in different source terms or source term quantification tools based on TID-14844, NUREG-1465, and WASH-1400 is compared. The radionuclide release phenomena and models adopted in the integrated severe accident analysis codes STCP and MAAP4 are described. In the present study, the severe accident source terms for risk quantification of the Maanshan Nuclear Power Plant of Taiwan Power Company are quantified using the MAAP 4.0.4 code. A methodology is developed to quantify the source terms of each source term category (STC) identified in the Level II PSA analysis of the plant. The characteristics of the source terms obtained are compared with other source terms. The plant analyzed employs a Westinghouse-designed 3-loop pressurized water reactor (PWR) with a large dry containment.

  8. Label-free quantification of Tacrolimus in biological samples by atomic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Menotta, Michele, E-mail: michele.menotta@uniurb.it [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy); Biagiotti, Sara [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy); Streppa, Laura [Physics Laboratory, CNRS-ENS, UMR 5672, Lyon (France); Cell and Molecular Biology Laboratory, CNRS-ENS Lyon, UMR 5239, IFR128, Lyon (France); Rossi, Luigia; Magnani, Mauro [Department of Biomolecular Sciences, University of Urbino “Carlo Bo” via Saffi 2, Urbino (Italy)

    2015-07-16

    Highlights: • Tacrolimus is a potent immunosuppressant drug that has to be continually monitored. • We present an atomic force microscope approach for the quantification of Tacrolimus in blood samples. • Detection and quantification have been successfully achieved. - Abstract: In the present paper we describe an atomic force microscopy (AFM)-based method for the quantitative analysis of FK506 (Tacrolimus) in whole blood (WB) samples. Current reference methods used to quantify this immunosuppressive drug are based on mass spectrometry. In addition, an immunoenzymatic assay (ELISA) has been developed and is widely used in the clinic, even though it shows a small but consistent overestimation of the actual drug concentration when compared with the mass spectrometry method. The AFM biosensor presented herein utilises the endogenous drug receptor, FKBP12, to quantify Tacrolimus levels. The biosensor was first assayed to detect the free drug in solution, and subsequently used for the detection of Tacrolimus in blood samples. The sensor was suitable for generating a dose–response curve over the full range of clinical drug monitoring. A comparison with the clinically tested ELISA assay is also reported.

  9. Application of radioanalytical methods in the quantification of solute transport in plants

    International Nuclear Information System (INIS)

    Hornik, M.

    2016-01-01

    The present habilitation thesis is a compilation of published scientific papers supplemented with a commentary. The primary objective of the work was to present results and knowledge applicable to the further development of nuclear analytical chemistry, especially in the field of radioindication methods and the application of positron emitters in connection with positron emission tomography (PET). In the work, these methods and techniques are developed mainly in the context of environmental issues related to the analysis and remediation of a contaminated or degraded environment (water and soil), but also partially in the field of plant production and plant research. In terms of the achieved results and knowledge, the work is divided into three separate sections. The first part is dedicated to the application of radioindication methods, as well as other, non-radioanalytical methods and approaches, in the characterization of plant biomass (biomass of terrestrial and aquatic mosses, and waste plant biomass) as alternative sorbents for the separation and removal of (radio)toxic metals from contaminated or waste waters, as well as in the quantification and description of sorption processes under conditions of batch or continuous-flow systems. The second part describes the results concerning the quantification and visual description of the processes of (radio)toxic metal and microelement uptake and translocation in plant tissues, using radioisotopes (β- and γ-emitters) of these metals together with the methods of direct gamma spectrometry and autoradiography. The main aim of these experiments was to evaluate the possibilities of utilizing selected plant species in the phytoremediation of contaminated soils and waters, as well as the possibilities of affecting the effectiveness of uptake and translocation of these metals in plant tissues, mainly in dependence on their

  10. Investigation of the radiation level and electromagnetic field strength in sample of Damascus schools

    International Nuclear Information System (INIS)

    Shweikani, R.; Abukassem, I.; Raja, G.; Algamdi, H.

    2009-12-01

    The aim of this work was to determine the radon concentration and natural gamma dose rate, and to measure the electromagnetic field (EMF) levels produced by electric power lines and mobile phone base stations, inside some elementary and preparatory schools in the old town during two periods (the school term and the summer break). Results showed that most of the obtained values were below the action level of 200 Bq/m³, but some classroom concentrations exceeded 200 Bq/m³. These high values may be due to building materials, radon concentration in the soil, and poor ventilation. Radon concentrations during the second period (summer) were higher than during the first; this may be due to poor ventilation, as schools are closed during the summer break. The results also showed a decrease in radon concentration with increasing floor height, and that radon concentrations in old schools are higher than in modern ones. EMF levels on the ground and first floors were higher than on the second floor; the maximum detected values exceeded 50 V/m and 270 mA/m for electric and magnetic field strength respectively, and 0.5 μT for magnetic flux density. Mobile microwave radiation levels were relatively low in all positions, and the signal increased with floor height. Finally, no observable correlation between the measured electromagnetic fields and the radon concentration was established.

  11. Concentration of High Level Radioactive Liquid Waste. Basic data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Juvenelle, A.; Masson, M.; Garrido, M.H. [DEN/VRH/DRCP/SCPS/LPCP, BP 17171 - 30207 Bagnols sur Ceze Cedex (France)

    2008-07-01

    Full text of publication follows: In order to enhance its knowledge about the concentration of high level liquid waste (HLLW) from the nuclear fuel reprocessing process, a program of studies was defined by Cea. Over a large range of acidity, it proposes to characterize the concentrated solution and the obtained precipitates versus the concentration factor. Four steps are considered: quantification of the salting-out effect on the concentrate acidity, acquisition of solubility data, precipitate characterisation versus the concentration factor through aging tests, and concentration experiments starting from simulated fission product solutions. The first results, reported here, connect the acidity of the concentrated solution to the concentration factor and allow us to specify the acidity range (4 to 12 N) for the next experiments. In this range, solubility data for various elements (Ba, Sr, Zr...) are separately measured at room temperature, first in nitric acid and then in the presence of various species present in the medium (TBP, PO{sub 4}{sup 3-}). The reactions between these various elements are then investigated (formation of insoluble mixed compounds) by following the concentration of cations in solution and characterising the precipitates. (authors)

  12. Pure hydroxyapatite phantoms for the calibration of in vivo X-ray fluorescence systems of bone lead and strontium quantification.

    Science.gov (United States)

    Da Silva, Eric; Kirkham, Brian; Heyd, Darrick V; Pejović-Milić, Ana

    2013-10-01

    Plaster of Paris [poP, CaSO₄·½H₂O] is the standard phantom material used for the calibration of in vivo X-ray fluorescence (IVXRF)-based systems of bone metal quantification (i.e., bone strontium and lead). Calibration of IVXRF systems of bone metal quantification employs a coherent normalization procedure, which requires the application of a coherent correction factor (CCF) to the data, calculated as the ratio of the relativistic form factors of the phantom material and bone mineral. Various issues have been raised as to the suitability of poP for the calibration of IVXRF systems of bone metal quantification, including its chemical purity and its chemical difference from bone mineral (a calcium phosphate). This work describes the preparation of a chemically pure hydroxyapatite phantom material, of known composition and stoichiometry, proposed for calibrating IVXRF systems of bone strontium and lead quantification as a replacement for poP. The issue of contamination by the analyte was resolved by preparing pure Ca(OH)2 by hydroxide precipitation, which was found to bring strontium and lead levels below those of the bone mineral component of NIST SRM 1486 (bone meal), as determined by powder X-ray diffraction spectrometry.

  13. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    Science.gov (United States)

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high priority fragment ions these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard), indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
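
The core NOFI idea described above, representing each fragment ion as a feature vector and down-ranking multivariate outliers, can be sketched with a simple robust z-score detector. This is an illustrative stand-in under our own assumptions (feature values and function names are hypothetical), not the published four-dimensional NOFI algorithm:

```python
from statistics import median

def outlier_scores(fragments):
    """Robust z-score per feature dimension; a fragment keeps its worst score."""
    dims = len(fragments[0])
    scores = [0.0] * len(fragments)
    for d in range(dims):
        col = [f[d] for f in fragments]
        med = median(col)
        mad = median(abs(v - med) for v in col) or 1e-12  # guard against zero MAD
        for i, v in enumerate(col):
            # 0.6745 rescales MAD to be comparable with a standard deviation
            scores[i] = max(scores[i], 0.6745 * abs(v - med) / mad)
    return scores

def rank_fragments(fragments):
    # Low score = consistent with the peptide's other fragments = high priority;
    # interfered fragments drift to the end of the ranking and can be excluded.
    s = outlier_scores(fragments)
    return sorted(range(len(fragments)), key=lambda i: s[i])
```

Summing the areas of only the top-ranked (non-outlier) fragments then mirrors the Top3-5 NOFI quantification strategy described in the abstract.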

  14. Quantitative Assessment of Fat Levels in Caenorhabditis elegans Using Dark Field Microscopy

    Directory of Open Access Journals (Sweden)

    Anthony D. Fouad

    2017-06-01

    Full Text Available The roundworm Caenorhabditis elegans is widely used as a model for studying conserved pathways for fat storage, aging, and metabolism. The most broadly used methods for imaging fat in C. elegans require fixing and staining the animal. Here, we show that dark field images acquired through an ordinary light microscope can be used to estimate fat levels in worms. We define a metric based on the amount of light scattered per area, and show that this light scattering metric is strongly correlated with worm fat levels as measured by Oil Red O (ORO) staining across a wide variety of genetic backgrounds and feeding conditions. Dark field imaging requires no exogenous agents or chemical fixation, making it compatible with live worm imaging. Using our method, we track fat storage with high temporal resolution in developing larvae, and show that fat storage in the intestine increases in at least one burst during development.
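
The light-scattering metric described, light scattered per unit worm area, reduces to a mean bright-pixel intensity once the worm is segmented from the dark background. A minimal sketch (the threshold-based segmentation is our simplification of the authors' image processing, and the function name is ours):

```python
def scattering_per_area(image, threshold):
    """Total scattered light divided by worm pixel area.

    image: 2D list of grayscale values; pixels above `threshold` are treated
    as the worm, since dark-field backgrounds are near zero.
    """
    worm = [v for row in image for v in row if v > threshold]
    return sum(worm) / len(worm) if worm else 0.0
```

A per-worm value of this metric could then be compared across genotypes or feeding conditions, as the ORO correlation in the abstract suggests.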

  15. Natural geochemical analogues of the near field of high-level nuclear waste repositories

    International Nuclear Information System (INIS)

    Apps, J.A.

    1995-01-01

    United States practice has been to design high-level nuclear waste (HLW) geological repositories with waste densities sufficiently high that repository temperatures surrounding the waste will exceed 100 degrees C and could reach 250 degrees C. Basalt and devitrified vitroclastic tuff are among the host rocks considered for waste emplacement. Near-field repository thermal behavior and chemical alteration in such rocks are expected to be similar to those observed in many geothermal systems. Therefore, the predictive modeling required for performance assessment studies of the near field could be validated and calibrated using geothermal systems as natural analogues. Examples are given which demonstrate the need for refinement of the thermodynamic databases used in geochemical modeling of near-field natural analogues, and the extent to which present models can predict conditions in geothermal fields.

  16. Quantification bias caused by plasmid DNA conformation in quantitative real-time PCR assay.

    Science.gov (United States)

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification.
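
Absolute quantification with plasmid standards rests on the standard-curve relation Cq = slope·log₁₀(copies) + intercept, with amplification efficiency E = 10^(−1/slope) − 1; the conformation-induced curve shifts reported above therefore propagate directly into the inferred copy numbers. A sketch of that arithmetic (ideal data assumed; function names are ours, not this study's protocol):

```python
import math

def standard_curve(log10_copies, cq):
    """Fit Cq = slope*log10(copies) + intercept; derive amplification efficiency."""
    n = len(cq)
    mx, my = sum(log10_copies) / n, sum(cq) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    slope = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cq)) / sxx
    intercept = my - slope * mx
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 = perfect doubling per cycle
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    """Absolute quantification: interpolate an unknown Cq on the standard curve."""
    return 10.0 ** ((cq - intercept) / slope)
```

A supercoiled standard that amplifies earlier (lower Cq) shifts the intercept downward, so the same unknown Cq maps to a higher apparent copy number, consistent with the over-estimation the abstract warns about.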

  17. SPE/TLC/Densitometric Quantification of Selected Synthetic Food Dyes in Liquid Foodstuffs and Pharmaceutical Preparations

    Directory of Open Access Journals (Sweden)

    Anna W. Sobańska

    2017-01-01

    Full Text Available Selected synthetic food dyes (tartrazine, Ponceau 4R, Brilliant Blue, orange yellow, and azorubine) were isolated from liquid preparations (mouthwashes and beverages) by Solid Phase Extraction on aminopropyl-bonded silica with diluted aqueous sodium hydroxide as an eluent. The extraction step was followed by thin layer chromatography on silica gel 60 with chloroform-isopropanol-25% aq. ammonia 1 : 3 : 1 (v/v/v) as mobile phase, and the densitometric quantification of the dyes was achieved using quadratic calibration plots (R² > 0.997; LOQ = 0.04–0.09 μg spot⁻¹). The overall recoveries for all studied dyes were at an average level of over 90%, and the repeatability of the proposed procedure (CV ≤ 4.1%) was sufficient to recommend it for the routine quantification of the aforementioned dyes in liquid matrices.

  18. Liquid chromatography-tandem mass spectrometry quantification of 6-thioguanine in DNA using endogenous guanine as internal standard

    DEFF Research Database (Denmark)

    Jacobsen, Jack Hummeland; Schmiegelow, Kjeld; Nersting, Jakob

    2012-01-01

    was estimated at 63% (RSD 26%), which is corrected for by the internal standard resulting in stable quantification. The TG levels found were above the LOQ in 18 out of 18 childhood leukemia patients on 6-mercaptopurine/methotrexate maintenance therapy (median 377, range 45-1190 fmol/μg DNA) with intra...

  19. Elemental quantification of airborne particulate matter in Bandung and Lembang area

    International Nuclear Information System (INIS)

    Sutisna; Achmad Hidayat; Dadang Supriatna

    2004-01-01

    Contamination of airborne particulates by toxic gases and elements is potentially harmful to human health, and some toxic elements related to air pollution have carcinogenic effects. Quantification of those elements is therefore important for monitoring the level of pollutants in airborne particulate matter. The aim of this work was to analyze air particulate samples using instrumental neutron activation analysis (NAA) and related techniques. Two sampling points, in Bandung and Lembang, representing an urban and a rural area respectively, were chosen for collecting the air particulate samples. Sampling was carried out with a Gent Stacked Filter Unit Sampler for 24 hours, using two cellulose filters of 8 μm and 0.45 μm pore size. Trace elements in the collected samples were determined by NAA based on a comparative method. The elemental distribution in the PM 2.5 and PM 10 fractions of the airborne particulate was analyzed, the enrichment factor was calculated using Al as the reference element, and the black carbon content was determined with a Smoke Stain Reflectometer. The results are presented and discussed. (author)
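The enrichment factor mentioned above compares an element's ratio to Al in the aerosol with the same ratio in crustal material; values near 1 point to a soil origin, while large values suggest an anthropogenic source. A sketch with illustrative concentrations (not data from the study):

```python
# Crustal reference concentrations (illustrative values, not from the study)
CRUST = {"Al": 80400.0, "Pb": 12.5, "Fe": 35000.0}

def enrichment_factor(aerosol, element, reference="Al", crust=CRUST):
    """EF = (C_x / C_ref)_aerosol / (C_x / C_ref)_crust.

    EF near 1 suggests a crustal (soil) origin for the element;
    EF >> 10 suggests an anthropogenic source."""
    return (aerosol[element] / aerosol[reference]) / (crust[element] / crust[reference])

# Hypothetical aerosol concentrations (ng/m^3) measured by NAA
sample = {"Al": 1200.0, "Pb": 15.0, "Fe": 600.0}
ef_pb = enrichment_factor(sample, "Pb")
ef_fe = enrichment_factor(sample, "Fe")
```

Here Pb comes out strongly enriched (traffic/industry-like signature) while Fe stays near unity (soil-like), which is the kind of source attribution the study performs.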

  20. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    Science.gov (United States)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25-times higher global warming potential than carbon dioxide on a 100-year basis. Accurate monitoring and mitigation of methane emissions require cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies and one adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow rapid and automatic leak quantification. In particular, we use two-stream deep convolutional networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10–60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, with eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that the two-stream ConvNet provides a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, achieves an accuracy of 58.3%. Integrating the two streams gives a combined accuracy of 77.6%. In future work, we will split the training and testing datasets in distinct ways to test the generalization of the algorithm to different leak sources. Several analytic tools, including confusion matrices and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help find and fix super-emitters and improve the cost-effectiveness of leak detection and repair.
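The integration of the two streams is commonly done by late fusion, averaging the per-class probabilities of the spatial and temporal networks; a minimal sketch with hypothetical class scores for the eight leak-size bins (the fusion scheme, not the authors' exact architecture):

```python
import math

def softmax(scores):
    """Convert raw class scores (logits) to probabilities."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fuse_two_stream(spatial_scores, temporal_scores, w_spatial=0.5):
    """Late fusion: weighted average of per-class softmax probabilities
    from the spatial (still-frame) and temporal (optical-flow) streams."""
    ps = softmax(spatial_scores)
    pt = softmax(temporal_scores)
    return [w_spatial * a + (1 - w_spatial) * b for a, b in zip(ps, pt)]

# Hypothetical logits over the eight leak-size classes (0-140 MCFH bins)
spatial = [0.1, 0.3, 2.5, 1.9, 0.2, 0.0, -0.5, -1.0]
temporal = [0.0, 0.2, 1.8, 2.6, 0.1, -0.2, -0.3, -0.9]
fused = fuse_two_stream(spatial, temporal)
pred = fused.index(max(fused))
```

Note how the fused prediction can differ from the spatial stream's own argmax when the temporal stream is more confident, which is the mechanism behind the reported jump from 65.2% to 77.6%.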

  1. Quantification of anti-Leishmania antibodies in saliva of dogs.

    Science.gov (United States)

    Cantos-Barreda, Ana; Escribano, Damián; Bernal, Luis J; Cerón, José J; Martínez-Subiela, Silvia

    2017-08-15

    Detection of serum anti-Leishmania antibodies by quantitative or qualitative techniques has been the most widely used method to diagnose canine leishmaniosis (CanL). Nevertheless, saliva may represent an alternative to blood because it is easy to collect, painless, and non-invasive in comparison with serum. In this study, two time-resolved immunofluorometric assays (TR-IFMAs) for quantification of anti-Leishmania IgG2 and IgA antibodies in saliva were developed and validated, and their ability to distinguish Leishmania-seronegative from seropositive dogs was evaluated. The analytical study assessed assay precision, sensitivity, and accuracy. In addition, sera from 48 dogs (21 Leishmania-seropositive and 27 Leishmania-seronegative) were analyzed by TR-IFMAs. The assays were precise, with intra- and inter-assay coefficients of variation lower than 11%, and showed a high level of accuracy, as determined by linearity under dilution (R2 = 0.99) and recovery tests (>88.60%). Anti-Leishmania IgG2 antibodies in saliva were significantly higher in the seropositive group than in the seronegative group, whereas no significant differences in anti-Leishmania IgA antibodies between the two groups were observed. Furthermore, the TR-IFMA for quantification of anti-Leishmania IgG2 antibodies in saliva showed larger differences between seropositive and seronegative dogs than the commercial assay used in serum. In conclusion, the TR-IFMAs developed may be used to quantify anti-Leishmania IgG2 and IgA antibodies in canine saliva with adequate precision, analytical sensitivity, and accuracy. Quantification of anti-Leishmania IgG2 antibodies in saliva could potentially be used to evaluate the humoral response in CanL. However, IgA in saliva seems not to have diagnostic value for this disease. For future studies, it would be desirable to evaluate the ability of the IgG2 assay to detect dogs with subclinical disease or with low antibody titers in serum, and also to study the antibodies' behaviour in saliva during the
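The precision criterion quoted above (intra- and inter-assay coefficients of variation below 11%) is computed from replicate measurements of the same sample; a sketch with illustrative replicate values:

```python
import math

def coefficient_of_variation(replicates):
    """Assay CV (%) = 100 * sample standard deviation / mean."""
    n = len(replicates)
    mean = sum(replicates) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
    return 100.0 * sd / mean

# Hypothetical replicate readings of one saliva sample (arbitrary units)
replicates = [10.2, 9.8, 10.5, 10.1, 9.9]
cv = coefficient_of_variation(replicates)
```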

  2. Absolute Quantification of Toxicological Biomarkers via Mass Spectrometry.

    Science.gov (United States)

    Lau, Thomas Y K; Collins, Ben C; Stone, Peter; Tang, Ning; Gallagher, William M; Pennington, Stephen R

    2017-01-01

    With the advent of "-omics" technologies there has been an explosion of data generation in the field of toxicology, as well as many others. As new candidate biomarkers of toxicity are regularly discovered, the next challenge is to validate these observations in a targeted manner. Traditionally, such validation experiments have been conducted using antibody-based technologies such as Western blotting, ELISA, and immunohistochemistry. However, this often produces a significant bottleneck, as the time and cost of developing successful antibodies mean that validation is often far outpaced by the generation of targets of interest. In response, there have recently been several developments in the use of triple quadrupole (QQQ) mass spectrometry (MS) as a platform for protein quantification. This technology does not require antibodies; it is typically less expensive, assays are quicker to develop, and it offers more accessible multiplexing. The speed of these experiments, combined with their flexibility and ability to multiplex assays, makes the technique a valuable strategy for validating biomarker discoveries.

  3. Perturbing an electromagnetically induced transparency in a Λ system using a low-frequency driving field. II. Four-level system

    International Nuclear Information System (INIS)

    Wilson, E. A.; Manson, N. B.; Wei, C.

    2005-01-01

    The effect a perturbing field has on an electromagnetically induced transparency within a three-level Λ system is presented. The perturbing field is applied resonantly between one of the lower levels of the Λ system and a fourth level. The electromagnetically induced transparency feature is split, and this is measured experimentally for both single and bichromatic driving fields. In the single-driving-field case a density matrix treatment is shown to be in reasonable agreement with experiment, and in both the single and bichromatic cases the structure in the spectrum can be explained using a dressed-state analysis.

  4. Ultra-Sensitive NT-proBNP Quantification for Early Detection of Risk Factors Leading to Heart Failure

    Directory of Open Access Journals (Sweden)

    Keum-Soo Song

    2017-09-01

    Full Text Available Cardiovascular diseases such as acute myocardial infarction and heart failure accounted for the deaths of 17.5 million people (31% of all global deaths) in 2015. Monitoring the level of circulating N-terminal proBNP (NT-proBNP) is crucial for the detection of people at risk of heart failure. In this article, we describe a novel ultra-sensitive NT-proBNP test (us-NT-proBNP) that allows the quantification of circulating NT-proBNP in 30 min at 25 °C in the linear detection range of 7.0–600 pg/mL. This is the first report on the application of a fluorescence-bead-labeled detection antibody, a DNA-guided detection method, and a glass fiber membrane platform for the quantification of NT-proBNP in clinical samples. The limit of blank, limit of detection, and limit of quantification were 2.0 pg/mL, 3.7 pg/mL, and 7 pg/mL, respectively. The coefficient of variation was less than 10% over the entire detection range of 7–600 pg/mL. The test demonstrated specificity for NT-proBNP without interference from bilirubin, intralipid, biotin, or hemoglobin. Serial dilution tests on plasma samples containing various NT-proBNP levels showed a linear decrease in concentration, with regression coefficients of 0.980–0.998. These results indicate that the us-NT-proBNP test does not suffer from interference by plasma components in the measurement of NT-proBNP in clinical samples.
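Limits of blank and detection such as those reported are conventionally estimated from replicate blank and low-concentration measurements (in the style of the CLSI EP17 guideline); a sketch with invented replicate values:

```python
import math

def mean_sd(xs):
    """Mean and sample standard deviation."""
    m = sum(xs) / len(xs)
    sd = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return m, sd

def limit_of_blank(blanks):
    """LoB = mean_blank + 1.645 * sd_blank (95th percentile, normal approx.)."""
    m, sd = mean_sd(blanks)
    return m + 1.645 * sd

def limit_of_detection(blanks, low_samples):
    """LoD = LoB + 1.645 * sd of a low-concentration sample."""
    _, sd_low = mean_sd(low_samples)
    return limit_of_blank(blanks) + 1.645 * sd_low

# Hypothetical replicate signals expressed in pg/mL equivalents
blanks = [0.8, 1.2, 0.9, 1.1, 1.0]
low = [3.1, 3.6, 3.3, 3.9, 3.4]
lob = limit_of_blank(blanks)
lod = limit_of_detection(blanks, low)
```

The limit of quantification is then set at the lowest concentration meeting a precision target (here, CV below 10%), which is why the reported LoQ (7 pg/mL) sits above the LoD (3.7 pg/mL).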

  5. Paying Medicare Advantage plans: To level or tilt the playing field.

    Science.gov (United States)

    Glazer, Jacob; McGuire, Thomas G

    2017-12-01

    Medicare beneficiaries are eligible for health insurance through the public option of traditional Medicare (TM) or may join a private Medicare Advantage (MA) plan. Both are highly subsidized, but in different ways. Medicare pays most costs directly in TM, and subsidizes MA plans based on a "benchmark" for each beneficiary choosing a private plan. The level of this benchmark is arguably the most important policy decision Medicare makes about the MA program. Many analysts recommend equalizing Medicare's subsidy across the options - referred to in policy circles as a "level playing field." This paper studies the normative question of how to set the level of the benchmark, applying the versatile model developed by Einav and Finkelstein (EF) to Medicare. The EF framework implies unequal subsidies to counteract risk selection across plan types. We also study other reasons to tilt the field: the relative efficiency of MA vs. TM, the market power of MA plans, and institutional features of the way Medicare determines subsidies and premiums. After a review of the empirical and policy literature, we conclude that in areas where the MA market is competitive, the benchmark should be set below average costs in TM, but in areas characterized by imperfect competition in MA, it should be raised in order to offset output (enrollment) restrictions by plans with market power. We also recommend specific modifications of Medicare rules to make demand for MA more price elastic. Copyright © 2016. Published by Elsevier B.V.

  6. Development of Uncertainty Quantification Method for MIR-PIV Measurement using BOS Technique

    International Nuclear Information System (INIS)

    Seong, Jee Hyun; Song, Min Seop; Kim, Eung Soo

    2014-01-01

    Matching index of refraction (MIR) is frequently used for obtaining high-quality PIV measurement data, but even small distortions from an unmatched refractive index in the test section can introduce measurement uncertainty. In this context, it is desirable to have a way of checking MIR errors and the resulting uncertainty of PIV measurements. This paper proposes an experimental concept and corresponding results. We developed an MIR uncertainty quantification method for PIV measurement using the SBOS technique. From BOS reference data, a reliable SBOS experimental procedure was constructed. Then, by combining the SBOS technique with MIR-PIV, the velocity vector field and the refraction displacement vector field were measured simultaneously. MIR errors are calculated from a mathematical relation into which the PIV and SBOS data are inserted. These errors are also verified by a separate BOS experiment. Finally, by applying the calculated MIR-PIV uncertainty, a corrected velocity vector field can be obtained regardless of MIR errors.

  7. Quantification of acrylamide in foods selected by using gas chromatography tandem mass spectrometry

    Directory of Open Access Journals (Sweden)

    Delević Veselin M.

    2016-01-01

    Full Text Available Acrylamide is a toxic and probably carcinogenic compound formed during high-temperature thermal treatment of carbohydrate-rich foodstuffs. In this article, a method is improved for the extraction and quantification of acrylamide in corn-flour-based foods that are represented in our traditional diet. Acrylamide extraction was carried out using a reduced volume of saturated bromine water, and a GC-MS method for quantification is presented. Quantification of acrylamide was preceded by: sample homogenization, acrylamide extraction with water, extract purification by solid-phase extraction, bromination using a reduced volume of bromine water solution, removal of excess bromine with sodium thiosulfate, and dehydrobromination of 2,3-dibromopropionamide to 2-bromopropenamide using triethylamine. Regression and correlation analyses were applied at a probability level of 0.05. Calibration was performed in the concentration range 5–80 μg/kg, with a detection limit of 6.86 μg/kg, a limit of quantification of 10.78 μg/kg, and a coefficient of determination R2 > 0.999. The calibration curve obtained was y = 0.069x + 0.038. Recovery values averaged 97 to 110%. The proposed GC-MS method is simple, precise, and reliable for the determination of acrylamide in samples of thermally treated foods. Our results show that the tested foods contained acrylamide at concentrations of 18 to 77 μg/kg, depending on whether the food was prepared by cooking or baking.
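With the reported calibration curve y = 0.069x + 0.038, quantification is a matter of inverting the line and checking that the result lies inside the validated 5–80 μg/kg range; a minimal sketch:

```python
def concentration_from_response(y, slope=0.069, intercept=0.038):
    """Invert the linear calibration y = slope * x + intercept
    (coefficients from the reported curve y = 0.069x + 0.038)."""
    return (y - intercept) / slope

def within_calibration(x, lo=5.0, hi=80.0):
    """Only report concentrations inside the validated 5-80 ug/kg range."""
    return lo <= x <= hi

# Synthetic instrument response for a 40 ug/kg sample
resp = 0.069 * 40.0 + 0.038
x = concentration_from_response(resp)
```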

  8. Quantification of regional fat volume in rat MRI

    Science.gov (United States)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with those challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch-up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. 
The quality of automatic segmentation has been
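The core volume measurement, once segmentation has produced a fat mask, is a voxel count scaled by voxel volume. The sketch below stands in for the adaptive object extraction described above with a plain intensity threshold on a synthetic image; all names and values are illustrative, not the Fat Volume Tool's actual algorithm:

```python
import numpy as np

def fat_volume_ml(image, threshold, voxel_volume_ml):
    """Segment fat as voxels above an intensity threshold and convert
    the voxel count to a volume. A simple stand-in for the adaptive
    object extraction described above."""
    mask = image > threshold
    return mask, mask.sum() * voxel_volume_ml

# Synthetic 2-slice "MR image": fat is bright, lean tissue/background dark
rng = np.random.default_rng(0)
img = rng.uniform(0.0, 0.4, size=(2, 64, 64))                   # background
img[:, 20:30, 20:30] = rng.uniform(0.7, 1.0, size=(2, 10, 10))  # fat depot
mask, vol = fat_volume_ml(img, threshold=0.5, voxel_volume_ml=0.01)
```

In real data the threshold cannot be fixed, which is exactly why the paper uses adaptive extraction to compensate for field non-uniformity and brightness variation across the image.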

  9. Facile preparation of poly(methylene blue) modified carbon paste electrode for the detection and quantification of catechin

    Energy Technology Data Exchange (ETDEWEB)

    Manasa, G [Electrochemistry Research Group, Department of Chemistry, St. Joseph's College, Lalbagh Road, Bangalore, 560027, Karnataka (India); Mascarenhas, Ronald J, E-mail: ronaldmasc2311@yahoo.co.in [Electrochemistry Research Group, Department of Chemistry, St. Joseph's College, Lalbagh Road, Bangalore, 560027, Karnataka (India); Satpati, Ashis K [Analytical Chemistry Division, Bhabha Atomic Research Centre, Anushakthi Nagar, Trombay, Mumbai 400094, Maharashtra (India); D'Souza, Ozma J [Electrochemistry Research Group, Department of Chemistry, St. Joseph's College, Lalbagh Road, Bangalore, 560027, Karnataka (India); Dhason, A [Soft Condensed Matter, Raman Research Institute, Sadashivnagar, Bangalore 560080, Karnataka (India)

    2017-04-01

    Free radicals are formed as byproducts of metabolism and are highly unstable due to the presence of unpaired electrons. They readily react with other important cellular components such as DNA, causing damage. Antioxidants such as (+)-catechin (CAT) neutralize free radicals in the bloodstream. Hence there is a need for the detection and quantification of catechin in various food sources and beverages. The electro-oxidative properties of catechin were investigated using cyclic voltammetry (CV) and differential pulse voltammetry (DPV). A carbon paste working electrode modified by electropolymerized methylene blue (MB) was fabricated. Field emission scanning electron microscopy (FESEM) and atomic force microscopy (AFM) were used to study the surface morphology of the electrode. A quasi-reversible electron transfer reaction occurred at +0.260 V through a diffusion-controlled process. In comparison to the bare carbon paste electrode (CPE), there was a significant 5.3-fold increase in anodic current sensitivity at the modified electrode at physiological pH. Our findings indicate that, for the electro-oxidation of CAT, CPE is a better base material for the electropolymerization of MB than the glassy carbon electrode (GCE). The Nyquist plot followed the theoretical shape, indicating a low interfacial charge transfer resistance of 0.095 kΩ at the modified electrode. Calibration plots obtained by DPV were linear in two ranges, 1.0 × 10^−3 to 1.0 × 10^−6 M and 1.0 × 10^−7 to 0.1 × 10^−8 M. The limit of detection (LOD) and limit of quantification (LOQ) were 4.9 nM and 14 nM, respectively. Application of the developed electrode was demonstrated by detecting catechin in green tea and spiked fruit juice with satisfactory recoveries. The sensor was stable, sensitive, selective, and reproducible. - Highlights: • Remarkable electrocatalytic oxidation of catechin at poly(methylene blue)-modified CPE • Complete elimination of signal

  10. Behavioural changes in mice exposed to low level microwave fields

    International Nuclear Information System (INIS)

    Goiceanu, C.; Gradinaru, F.; Danulescu, R.; Balaceanu, G.; Sandu, D. D.; Avadanei, O. G.

    2001-01-01

    The aim of our study was to point out changes in mouse behaviour possibly due to exposure to low-level microwave fields. The animals' spontaneous behaviour was monitored, and exploratory behaviour and motor activity were assessed. Ten selected Swiss male mice were exposed to low-level microwave fields of about 1 mW/cm2 power density for 13 weeks, a relatively long period compared to their lifespan. The exposure system consisted of a transverse electromagnetic (TEM) cell. A control group of ten Swiss male mice was used. All twenty mice were selected to be of the same age and initial body weight. Each animal was placed in its own holder. The behaviour of the animals from both the exposed and control groups was assessed using a battery of three behavioural tests, with test sessions performed every two weeks. During the exposure period a progressive but moderate loss of motor activity was recorded for both exposed and control animals, probably due to weight gain and aging. Concerning exploratory activity, there was a significant difference between control and exposed animals: control mice showed approximately constant performance over time, whereas exposed mice showed a progressive decrease in exploratory ability. Motor activity of the exposed animals does not seem to be affected by microwave exposure, since the moderate loss of motor activity over time evolved quite similarly in both groups. The difference between exposed and control animals in exploratory activity seems to indicate an effect of long-term low-level microwave exposure. The progressive loss of exploratory activity in exposed mice, in contrast with controls, could be due to interference of microwaves with central nervous system activity. (authors)

  11. Genetic detection and quantification of Nosema apis and N. ceranae in the honey bee.

    Science.gov (United States)

    Bourgeois, A Lelania; Rinderer, Thomas E; Beaman, Lorraine D; Danka, Robert G

    2010-01-01

    The incidence of nosemosis has increased in recent years due to an emerging infestation of Nosema ceranae in managed honey bee populations in much of the world. A real-time PCR assay was developed to facilitate detection and quantification of both Nosema apis and N. ceranae in both single-bee and pooled samples. The assay is a multiplexed reaction in which both species are detected and quantified in a single reaction. The assay is highly sensitive and can detect single copies of the target sequence. Real-time PCR results were calibrated to spore counts generated by standard microscopy procedures. The assay was used to assess bees from commercial apiaries sampled in November 2008 and March 2009. Bees from each colony were pooled. A large amount of variation among colonies was evident, signifying the need to examine large numbers of colonies. Due to sampling constraints, a subset of colonies (from five apiaries) was sampled in both seasons. In November, N. apis levels were 1,212 ± 148 spores/bee and N. ceranae levels were 51,073 ± 31,155 spores/bee. In March, no N. apis was detected, and N. ceranae levels were 11,824 ± 6,304 spores/bee. Changes in N. ceranae levels were evident among apiaries, some increasing and others decreasing. This demonstrates the need for thorough sampling of apiaries and for a rapid test for both detection and quantification of both Nosema spp. This assay provides the opportunity for detailed study of disease resistance, infection kinetics, and improvement of disease management practices for honey bees.

  12. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with better dynamic range.
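A common spectral-count abundance measure that, like freeQuant, corrects for protein sequence length is the normalized spectral abundance factor (NSAF); the sketch shows only this basic calculation, not freeQuant's own shared-peptide or ion-intensity handling:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor:
    SAF_i  = SpC_i / L_i        (length-corrected spectral count)
    NSAF_i = SAF_i / sum_j SAF_j (normalized so abundances sum to 1)."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Hypothetical proteins: spectral counts and sequence lengths (residues)
counts = {"P1": 50, "P2": 10, "P3": 40}
lengths = {"P1": 500, "P2": 100, "P3": 400}
abundance = nsaf(counts, lengths)
```

The length correction is the point: P2 has one fifth the spectral count of P1 but, being one fifth the length, comes out equally abundant.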

  13. Physical Characterisation and Quantification of Total Above Ground Biomass Derived from First Thinnings for Wood Fuel Consumption in Ireland

    OpenAIRE

    Mockler, Nicholas

    2013-01-01

    Comprehensive knowledge of wood fuel properties assists in the optimisation of operations concerned with the harvesting, seasoning, processing and conversion of wood to energy. This study investigated the physical properties of wood fuel. These properties included moisture content and basic density. The field work also allowed for the quantification of above ground biomass partitions. The species investigated were alder (Alnus glutinosa), ash (Fraxinus excelsior L.), birch (Betula spp.), lodg...

  14. Standard Error Computations for Uncertainty Quantification in Inverse Problems: Asymptotic Theory vs. Bootstrapping.

    Science.gov (United States)

    Banks, H T; Holm, Kathleen; Robbins, Danielle

    2010-11-01

    We computationally investigate two approaches for uncertainty quantification in inverse problems for nonlinear, parameter-dependent dynamical systems. We compare the bootstrapping and asymptotic theory approaches for problems involving data with several noise forms and levels. We consider both absolute error data with constant variance and relative error data, which produces non-constant variance, in our parameter estimation formulations. We compare and contrast parameter estimates, standard errors, confidence intervals, and computational times for both bootstrapping and asymptotic theory methods.
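The bootstrapping approach compared here can be sketched as a pairs bootstrap: resample the data with replacement, refit the parameter, and take the spread of the refit estimates as the standard error. A minimal illustration on an invented linear model (the paper's systems are nonlinear ODEs, but the resampling logic is the same):

```python
import math
import random

def fit_slope(xs, ys):
    """Least-squares slope of y = a*x (through the origin, for brevity)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def bootstrap_se(xs, ys, n_boot=2000, seed=42):
    """Pairs bootstrap: resample (x, y) pairs with replacement, refit,
    and return the standard deviation of the refit estimates."""
    rng = random.Random(seed)
    n = len(xs)
    estimates = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        estimates.append(fit_slope([xs[i] for i in idx], [ys[i] for i in idx]))
    m = sum(estimates) / n_boot
    return math.sqrt(sum((e - m) ** 2 for e in estimates) / (n_boot - 1))

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.3, 11.9]   # roughly y = 2x with noise
se = bootstrap_se(xs, ys)
```

The asymptotic-theory alternative would instead compute the standard error from the estimated residual variance and the model's sensitivity matrix; the paper's comparison is precisely between these two routes to the same quantity.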

  15. Quantification of thymidine kinase (TK1) mRNA in normal and leukemic cells and investigation of structure-function relationship of recombinant TK1 enzyme

    DEFF Research Database (Denmark)

    Kristensen, Tina

    Thymidine kinase (TK) catalyses the ATP-dependent phosphorylation of thymidine to thymidine monophosphate, which is subsequently phosphorylated to thymidine triphosphate and utilized for DNA synthesis. Human cytosolic TK (TK1) is cell cycle regulated, e.g. the TK1 activity increases sharply at the G...... patients with chronic lymphatic leukemia (CLL). 2: Structure-function relationship of recombinant TK1. In the first part, a sensitive method (competitive PCR) for quantification of TK1 mRNA was established. The TK1 mRNA level was quantified in quiescent lymphocytes from control donors (n = 6...... are characterized as being quiescent, the TK activity was in the same range as in quiescent lymphocytes from control donors. However, quantification of the TK1 mRNA level showed that all five CLL patients had a very high level (6 to 22 × 10^6 copies mg^−1 protein) of TK1 mRNA, corresponding to the level in dividing...

  16. Protecting quantum coherence of two-level atoms from vacuum fluctuations of electromagnetic field

    International Nuclear Information System (INIS)

    Liu, Xiaobao; Tian, Zehua; Wang, Jieci; Jing, Jiliang

    2016-01-01

    In the framework of open quantum systems, we study the dynamics of a static polarizable two-level atom interacting with a bath of fluctuating vacuum electromagnetic field modes and explore under which conditions the coherence of the open quantum system is unaffected by the environment. For both single-qubit and two-qubit systems, we find that the quantum coherence cannot be protected from noise when the atom interacts with the electromagnetic field in the absence of a boundary. However, in the presence of a boundary, the dynamical conditions for quantum coherence to remain unaffected are fulfilled only when the atom is close to the boundary and is transversely polarizable. Otherwise, the quantum coherence can only be partially protected for other polarization directions. -- Highlights: •We study the dynamics of a two-level atom interacting with a bath of fluctuating vacuum electromagnetic field modes. •For both single- and two-qubit systems, the quantum coherence cannot be protected from noise without a boundary. •Quantum coherence can remain unaffected only when the atom is close to the boundary and is transversely polarizable. •Otherwise, the quantum coherence can only be partially protected for other polarization directions.

  17. FRANX: an application for the analysis and quantification of fire PSA

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating fire PSA (it also covers floods and earthquakes). With the application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire PSA. This paper describes the main features of the program that allow the quantification of a fire PSA. (Author)

  18. Laboratory and field tests of the Sutron RLR-0003-1 water level sensor

    Science.gov (United States)

    Fulford, Janice M.; Bryars, R. Scott

    2015-01-01

    Three Sutron RLR-0003-1 water level sensors were tested in laboratory conditions to evaluate the accuracy of the sensor over the manufacturer’s specified operating temperature and distance-to-water ranges. The sensor was also tested for compliance to SDI-12 communication protocol and in field conditions at a U.S. Geological Survey (USGS) streamgaging site. Laboratory results were compared to the manufacturer’s accuracy specification for water level and to the USGS Office of Surface Water (OSW) policy requirement that water level sensors have a measurement uncertainty of no more than 0.01 foot or 0.20 percent of the indicated reading. Except for one sensor, the differences for the temperature testing were within 0.05 foot and the average measurements for the sensors were within the manufacturer’s accuracy specification. Two of the three sensors were within the manufacturer’s specified accuracy and met the USGS accuracy requirements for the laboratory distance to water testing. Three units passed a basic SDI-12 communication compliance test. Water level measurements made by the Sutron RLR-0003-1 during field testing agreed well with those made by the bubbler system and a Design Analysis Associates (DAA) H3613 radar, and they met the USGS accuracy requirements when compared to the wire-weight gage readings.
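The USGS OSW accuracy requirement quoted above (0.01 foot or 0.20 percent of the indicated reading, read here as whichever is greater) can be expressed as a simple compliance check; the readings below are illustrative, not test data from the report:

```python
def meets_usgs_accuracy(measured_ft, reference_ft):
    """USGS OSW criterion as stated above: sensor error must not exceed
    0.01 ft or 0.20% of the indicated reading (taken as whichever is
    greater)."""
    tolerance = max(0.01, 0.002 * abs(reference_ft))
    return abs(measured_ft - reference_ft) <= tolerance

# At a 3 ft stage the 0.01 ft floor governs; at 20 ft the 0.20% term does
checks = [
    meets_usgs_accuracy(3.005, 3.0),   # error 0.005 ft, tolerance 0.01 ft
    meets_usgs_accuracy(3.02, 3.0),    # error 0.02 ft exceeds 0.01 ft
    meets_usgs_accuracy(20.03, 20.0),  # error 0.03 ft, tolerance 0.04 ft
]
```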

  19. A level playing field-obtaining consistent cost estimates for advanced reactor designs

    International Nuclear Information System (INIS)

    Hudson, C.R.; Rohm, H.H.; Humphreys, J.R.

    1987-01-01

    A level playing field in sports is necessary to avoid a situation in which one team has an unfair advantage over its competition. Similarly, rules and guidelines for developing cost estimates can be established which, in effect, provide a level playing field whereby cost estimates for advanced power plant concepts can be presented on a consistent and equitable basis. As an example, consider the capital costs shown in Table 1. Both sets of costs are for the exact same power plant; Estimate 1 is expressed in constant dollars while Estimate 2 is presented in nominal, or as-spent, dollars. As shown, the costs in Table 1 are not directly comparable. Similar problems can be introduced by differing assumptions in any number of parameters, including the scope of the cost estimate, inflation/escalation and interest rates, contingency costs, and site location. Of course, the motivation for having consistent cost estimates is to permit comparison among various concepts. As the U.S. Department of Energy sponsors research and development work on several advanced reactor concepts in which expected cost is a key evaluation parameter, the emphasis in this particular endeavor has been on promoting the comparability of advanced reactor cost estimates among themselves and with existing power plant types. To continue the analogy, the idea is to lay out the playing field and the rules of the contest such that each team participates in the match on an equal basis, with the final score being determined solely by the inherent strengths and abilities of the teams. A description of the playing field and some of the more important rules is provided.
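The difference between the two estimates in Table 1 is purely one of dollar convention: a constant-dollar spending schedule escalates into as-spent (nominal) dollars year by year. A sketch under an assumed flat 5% escalation rate (the rate and schedule are invented, not figures from the paper):

```python
def to_nominal(constant_cost_by_year, base_year, escalation=0.05):
    """Convert a constant-dollar spending schedule (stated in base-year
    dollars) to nominal, as-spent dollars under a flat escalation rate."""
    return {yr: cost * (1 + escalation) ** (yr - base_year)
            for yr, cost in constant_cost_by_year.items()}

# Hypothetical three-year construction schedule, in constant 1987-$M
schedule = {1987: 100.0, 1988: 100.0, 1989: 100.0}
nominal = to_nominal(schedule, base_year=1987)
total_constant = sum(schedule.values())
total_nominal = sum(nominal.values())
```

The two totals describe the same plant, which is exactly why estimates must state their dollar convention before they can be compared.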

  20. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients.

  1. Quantification of the vocal folds’ dynamic displacements

    International Nuclear Information System (INIS)

    Hernández-Montes, María del Socorro; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-01-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ∼100–1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues. (paper)
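
The 1.91 % variation coefficient quoted above is the sample standard deviation expressed as a percentage of the mean, computable with the standard library. The displacement values below are illustrative stand-ins, not the paper's data:

```python
import statistics

def variation_coefficient(values):
    """Coefficient of variation in percent: sample std dev / mean * 100."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Illustrative repeated displacement measurements (nm).
displacements = [1000.0, 1020.0, 980.0, 1000.0]
print(round(variation_coefficient(displacements), 2))  # 1.63
```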

  2. Quantification and speciation of technetium-99 in samples at low levels: contributions of capillary electrophoresis / ICP-MS system

    International Nuclear Information System (INIS)

    Kasprzak, L.M.

    2007-01-01

    Given the low levels of 99 Tc (a long-lived artificial radionuclide) in the environment (10 -8 M to 10 -12 M), its determination currently requires enrichment and separation from the sample matrix prior to instrumental measurement. Nuclear safety monitoring also requires knowledge of the redox and chemical properties of this element in order to predict its behaviour and transfer in the environment, so a separative and very sensitive measurement technique must be employed. We have developed a new environmental measurement method for the quantification and speciation of 99 Tc in samples at environmental concentrations by coupling capillary electrophoresis (CE) with an inductively coupled plasma mass spectrometer (ICP-MS). The limit of detection of 99 Tc is about 2 x 10 -8 M with the CE/ICP-MS system equipped with a PFA-50 nebuliser. In addition to detecting and measuring technetium, CE/ICP-MS can separate 99 Tc(VII) online from interfering nuclides such as molybdenum and ruthenium: because each anion has a different migration time, the signal at m/z = 99 can be attributed to 99 Tc alone. Results obtained by this method were compared with a standard radiochemical technique, extraction of Tc(VII) on a TEVA resin followed by ICP-MS measurement. Within the framework of spent fuel storage, CE/ICP-MS speciation studies of Tc(VII) in iron-sulphide soils under anoxic conditions have shown that Tc(VII) is reduced by sulphide suspensions. (author)

  3. Simultaneous MR quantification of hepatic fat content, fatty acid composition, transverse relaxation time and magnetic susceptibility for the diagnosis of non-alcoholic steatohepatitis.

    Science.gov (United States)

    Leporq, B; Lambert, S A; Ronot, M; Vilgrain, V; Van Beers, B E

    2017-10-01

    Non-alcoholic steatohepatitis (NASH) is characterized at histology by steatosis, hepatocyte ballooning and inflammatory infiltrates, with or without fibrosis. Although diamagnetic material in fibrosis and inflammation can be detected with quantitative susceptibility imaging, fatty acid composition changes in NASH relative to simple steatosis have also been reported. Therefore, our aim was to develop a single magnetic resonance (MR) acquisition and post-processing scheme for the diagnosis of steatohepatitis by the simultaneous quantification of hepatic fat content, fatty acid composition, T 2 * transverse relaxation time and magnetic susceptibility in patients with non-alcoholic fatty liver disease. MR acquisition was performed at 3.0 T using a three-dimensional, multi-echo, spoiled gradient echo sequence. Phase images were unwrapped to compute the B 0 field inhomogeneity (ΔB 0 ) map. The ΔB 0 -demodulated real part images were used for fat-water separation and for T 2 * and fatty acid composition quantification. The external and internal fields were separated with the projection onto dipole field method. Susceptibility maps were obtained after dipole inversion from the internal field map with single-orientation Bayesian regularization including spatial priors. The method was validated in 32 patients with biopsy-proven non-alcoholic fatty liver disease, of whom 12 had simple steatosis and 20 NASH. Liver fat fraction and T 2 * did not change significantly between patients with simple steatosis and NASH. In contrast, the saturated fatty acid fraction was significantly higher in patients with NASH than in patients with simple steatosis (48 ± 2% versus 44 ± 4%), and magnetic susceptibility was significantly lower (-0.30 ± 0.27 ppm versus 0.10 ± 0.14 ppm). The area under the receiver operating characteristic curve of magnetic susceptibility as a NASH marker was 0.91 (95% CI: 0.79-1.0). Simultaneous MR quantification of fat content, fatty acid composition, T 2 * and magnetic susceptibility is feasible in the liver. Our preliminary results

  4. Quantification of modulated blood oxygenation levels in single cerebral veins by investigating their MR signal decay

    Energy Technology Data Exchange (ETDEWEB)

    Sedlacik, Jan [St. Jude Children's Research Hospital, Memphis, TN (United States). Div. of Translational Imaging Research; University Clinics Jena (Germany). Medical Physics Group; Rauscher, Alexander [University Clinics Jena (Germany). Medical Physics Group; British Columbia Univ., Vancouver (Canada). MRI Research Centre; Reichenbach, Juergen R. [University Clinics Jena (Germany). Medical Physics Group

    2009-07-01

    The transverse magnetization of a single vein and its surrounding tissue is subject to spin dephasing caused by the local magnetic field inhomogeneity which is induced by the very same vessel. This phenomenon can be approximated and simulated by applying the model of an infinitely long and homogeneously magnetized cylinder embedded in a homogeneous tissue background. It is then possible to estimate the oxygenation level of the venous blood by fitting the simulated magnetization time course to the measured signal decay. In this work we demonstrate the ability of this approach to quantify the blood oxygenation level (Y) of small cerebral veins in vivo, not only under normal physiologic conditions (Y{sub native}=0.5-0.55) but also during induced changes of physiologic conditions which affect the cerebral venous blood oxygenation level. Changes in the blood oxygenation level induced by carbogen (5% CO{sub 2}, 95% O{sub 2}) and caffeine were observed and quantified, yielding values of Y{sub carbogen}=0.7 and Y{sub caffeine}=0.42, respectively. The proposed technique may ultimately help to better understand local changes in cerebral physiology during neuronal activation by quantifying blood oxygenation in veins draining active brain areas. It may also be beneficial in clinical applications, where it may improve diagnosis of cerebral pathologies as well as monitoring of responses to therapy. (orig.)
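
The fitting step, adjusting Y until a simulated decay matches the measured one, can be illustrated with a deliberately simplified model. The cylinder simulation in the paper is far more involved; the mono-exponential form and the lumped rate constant `k` below are assumptions of this sketch:

```python
import math

def fit_oxygenation(times, signal, k, s0):
    """Grid-search the blood oxygenation level Y in [0, 1] that minimizes
    the squared misfit of a toy decay model S(t) = s0 * exp(-k*(1-Y)*t),
    where k lumps susceptibility, hematocrit and field-strength factors."""
    best_y, best_err = 0.0, float("inf")
    for i in range(1001):
        y = i / 1000.0
        err = sum((s0 * math.exp(-k * (1.0 - y) * t) - s) ** 2
                  for t, s in zip(times, signal))
        if err < best_err:
            best_y, best_err = y, err
    return best_y

# Synthetic decay generated with Y = 0.55; the fit should recover it.
times = [i * 0.005 for i in range(10)]
signal = [math.exp(-50.0 * 0.45 * t) for t in times]
print(fit_oxygenation(times, signal, k=50.0, s0=1.0))  # 0.55
```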

  5. Experimental investigation and crystal-field modeling of Er{sup 3+} energy levels in GSGG crystal

    Energy Technology Data Exchange (ETDEWEB)

    Gao, J.Y., E-mail: jygao1985@sina.com [Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, Hefei 230031 (China); Sun, D.L.; Zhang, Q.L. [Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, Hefei 230031 (China); Wang, X.F. [Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, Hefei 230031 (China); Graduate School of the Chinese Academy of Sciences, Beijing 100049 (China); Liu, W.P.; Luo, J.Q.; Sun, G.H.; Yin, S.T. [Anhui Institute of Optics and Fine Mechanics, Chinese Academy of Sciences, Hefei 230031 (China)

    2016-06-25

    The Er{sup 3+}-doped Gd{sub 3}Sc{sub 2}Ga{sub 3}O{sub 12} (Er{sup 3+}:GSGG) single crystal, an excellent medium for mid-infrared, radiation-hard, laser-diode-pumped solid state lasers, was successfully grown by the Czochralski method. The absorption spectra were measured and analyzed over a wide spectral range of 350–1700 nm at temperatures of 7.6, 77, 200 and 300 K. The free-ion and crystal-field parameters were fitted to the experimental energy levels with a root mean square deviation of 9.86 cm{sup −1}. From the crystal-field calculations, 124 degenerate energy levels of Er{sup 3+} in the GSGG host crystal were assigned. The fitted free-ion and crystal-field parameters were compared with those already reported for Er{sup 3+}:YSGG. The results indicate that the free-ion parameters for Er{sup 3+} in GSGG are similar to those in YSGG, while the crystal-field interaction in GSGG is weaker than in YSGG, which may explain the better laser characteristics of the Er{sup 3+}:GSGG crystal. - Highlights: • The efficient diode-end-pumped laser crystal Er:GSGG has been grown successfully. • The absorption spectra of Er:GSGG have been measured in the range 350–1700 nm. • The fit is good, with a root mean square deviation of 9.86 cm{sup −1}. • The 124 levels of Er:GSGG have been assigned from the crystal-field calculations.
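
The quoted root-mean-square deviation of 9.86 cm⁻¹ is the standard figure of merit for a crystal-field fit. The computation itself is straightforward; the level energies below are invented illustration values, not the Er:GSGG data:

```python
import math

def rms_deviation(observed, calculated):
    """Root-mean-square deviation between experimental and fitted energy levels."""
    n = len(observed)
    return math.sqrt(sum((o - c) ** 2 for o, c in zip(observed, calculated)) / n)

# Illustrative level energies in cm^-1.
obs = [0.0, 6550.0, 10250.0]
calc = [3.0, 6546.0, 10255.0]
print(round(rms_deviation(obs, calc), 2))  # 4.08
```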

  6. Inter- and Intra- Field variations in soil compaction levels and subsequent impacts on hydrological extremes

    Science.gov (United States)

    Pattison, Ian; Coates, Victoria

    2015-04-01

    The rural landscape in the UK is dominated by pastoral agriculture, with about 40% of land cover classified as either improved or semi-natural grassland according to the Land Cover Map 2007. Intensification has resulted in greater levels of compaction, associated with higher stocking densities. However, compaction levels are likely to vary greatly within and between fields due to multiple controlling factors. This research focuses on two of these factors: firstly, animal species, namely sheep, cattle and horses; and secondly, field zonation, e.g. feeding areas, field gates and open field. Field experiments have been conducted in multiple fields in the River Skell catchment in Yorkshire, UK, which has an area of 140 km2. The effects on physical and hydrological soil characteristics such as bulk density and moisture content have been quantified using a wide range of field- and laboratory-based experiments. Results have highlighted statistically different properties between heavily compacted areas where animals congregate and less-trampled open areas. Furthermore, soil compaction has been hypothesised to contribute to increased flood risk at larger spatial scales. Previous research (Pattison, 2011) on a ~40 km2 catchment (Dacre Beck, Lake District, UK) has shown that when soil characteristics are homogeneously parameterised in a hydrological model, downstream peak discharges can be 65% higher for a heavily compacted soil than for a lightly compacted soil. Here we report results from spatially distributed hydrological modelling using soil parameters gained from the field experimentation. Results highlight the importance of both the percentage of the catchment that is heavily compacted and the spatial distribution of these fields.

  7. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are now commonly used in several areas of routine diagnostics, including therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use remains rare, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Quantification of epithelial cells in coculture with fibroblasts by fluorescence image analysis.

    Science.gov (United States)

    Krtolica, Ana; Ortiz de Solorzano, Carlos; Lockett, Stephen; Campisi, Judith

    2002-10-01

    To demonstrate that senescent fibroblasts stimulate the proliferation and neoplastic transformation of premalignant epithelial cells (Krtolica et al.: Proc Natl Acad Sci USA 98:12072-12077, 2001), we developed methods to quantify the proliferation of epithelial cells cocultured with fibroblasts. We stained epithelial-fibroblast cocultures with the fluorescent DNA-intercalating dye 4′,6-diamidino-2-phenylindole (DAPI), or expressed green fluorescent protein (GFP) in the epithelial cells before culturing them with fibroblasts. The cocultures were photographed under an inverted microscope with appropriate filters, and the fluorescent images were captured with a digital camera. We modified an image analysis program to selectively recognize the smaller, more intensely fluorescent epithelial cell nuclei in DAPI-stained cultures and used the program to quantify areas with DAPI fluorescence generated by epithelial nuclei or GFP fluorescence generated by epithelial cells in each field. Analysis of the image areas with DAPI and GFP fluorescence produced nearly identical quantification of epithelial cells in coculture with fibroblasts. We confirmed these results by manual counting. In addition, GFP labeling permitted kinetic studies of the same coculture over multiple time points. The image analysis-based quantification method we describe here is an easy and reliable way to monitor cells in coculture and should be useful for a variety of cell biological studies. Copyright 2002 Wiley-Liss, Inc.
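
The area-based quantification above boils down to segmenting bright pixels and summing their area per field. A minimal thresholding sketch on a toy intensity grid; the threshold and the grid are invented, and real nucleus segmentation involves size and shape criteria beyond a plain threshold:

```python
def fluorescent_area(image, threshold):
    """Count pixels brighter than the threshold: a stand-in for the
    nucleus-segmentation step of the image analysis program."""
    return sum(1 for row in image for pixel in row if pixel > threshold)

# Toy 3x3 intensity field with two bright "nuclear" pixels.
field = [
    [10, 200, 30],
    [5, 20, 180],
    [15, 25, 40],
]
print(fluorescent_area(field, threshold=100))  # 2
```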

  9. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  10. Energy Levels and Spectral Lines of Li Atoms in White Dwarf Strength Magnetic Fields

    Science.gov (United States)

    Zhao, L. B.

    2018-04-01

    A theoretical approach based on B-splines has been developed to calculate atomic structures and discrete spectra of Li atoms in strong magnetic fields typical of magnetic white dwarf stars. Energy levels are presented for 20 electronic states with the symmetries 2(0)+, 2(0)−, 2(−1)+, 2(−1)−, and 2(−2)+. The magnetic field strengths involved range from 0 to 2350 MG. The wavelengths and oscillator strengths for the electric dipole transitions among these magnetized atomic states are reported. The current results are compared with the limited theoretical data in the literature. Good agreement is found for the lower energy levels, but a significant discrepancy is clearly visible for the higher energy levels. The remaining discrepancies in the wavelengths and oscillator strengths are also discussed. Our investigation shows that the previously published spectral data for magnetized Li atoms fall far short of meeting the requirements for analyzing discrete atomic spectra of magnetic white dwarfs with lithium atmospheres.

  11. Quantification of interfacial segregation by analytical electron microscopy

    CERN Document Server

    Muellejans, H

    2003-01-01

    The quantification of interfacial segregation by the spatial difference and one-dimensional profiling methods is presented in general terms, with special attention to random and systematic uncertainties. The approach is demonstrated for Al-Al2O3 interfaces in a metal-ceramic composite material investigated by energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy in a dedicated scanning transmission electron microscope. The variation in segregation measured at different interfaces by both methods is within the uncertainties, indicating a constant segregation level and interfacial phase formation. The most important random uncertainty is the counting statistics of the impurity signal, whereas the specimen thickness introduces systematic uncertainties (via the k factor and the effective scan width). The latter could be significantly reduced if the specimen thickness were determined explicitly. (orig.)

  12. Validation of a commercially available enzyme-linked immunoabsorbent assay for the quantification of human α-Synuclein in cerebrospinal fluid.

    Science.gov (United States)

    Kruse, Niels; Mollenhauer, Brit

    2015-11-01

    The quantification of α-Synuclein in cerebrospinal fluid (CSF) as a biomarker has gained tremendous interest in recent years, and several commercial immunoassays are emerging. Here we describe the full validation of one commercially available ELISA assay for the quantification of α-Synuclein in human CSF (Covance alpha-Synuclein ELISA kit). The study was conducted within the BIOMARKAPD project of the European Joint Programme for Neurodegenerative Disease Research (JPND). We investigated the effect of several pre-analytical and analytical confounders: (1) need for centrifugation of freshly drawn CSF, (2) sample stability, (3) delay of freezing, (4) volume of storage aliquots, (5) freeze/thaw cycles, (6) thawing conditions, (7) dilution linearity, (8) parallelism, (9) spike recovery, and (10) precision. None of these confounders significantly influenced the levels of α-Synuclein in CSF. We found a very high intra-assay precision. The inter-assay precision was lower than expected owing to the different performance of the kit lots used. Overall, the validated immunoassay is useful for the quantification of α-Synuclein in human CSF. Copyright © 2015 Elsevier B.V. All rights reserved.
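
Spike recovery, confounder (9) above, reduces to a simple ratio: how much of a known added amount the assay reports back. A sketch with invented concentrations (not values from the validation study):

```python
def spike_recovery_percent(measured_spiked, measured_unspiked, spike_added):
    """Percent of a known added analyte amount recovered by the assay."""
    return (measured_spiked - measured_unspiked) / spike_added * 100.0

# Invented example: baseline 50 pg/mL, 100 pg/mL spiked in, 150 pg/mL measured.
print(spike_recovery_percent(150.0, 50.0, 100.0))  # 100.0
```

Recoveries well below or above 100 % would indicate matrix interference or calibration bias, which is why the parameter is part of assay validation.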

  13. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  14. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    Science.gov (United States)

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  15. Directions of improvement for public administration institutional structure in field of ecology at regional level

    Directory of Open Access Journals (Sweden)

    O. I. Matyushenko

    2017-06-01

    Full Text Available Analysis of the organizational structure and legal basis of the public authorities involved in environmental governance at the national and regional levels shows that, in the regions, a variety of units (departments, directorates, divisions and sectors) coordinate public administration in the environmental field. In order to propose a structure for the state administration unit that should deal with environmental issues, the article analyzes the organizational structure of the central executive authority in the field of ecology, the Ministry of Ecology and Natural Resources of Ukraine, at the national level, and the organizational structures of the departments/offices of ecology and natural resources of the regional state administrations at the regional level. It is determined that no standard structure for such a regional state administration unit exists in Ukraine, and that the grouping of departments and sectors at the higher level is chaotic and unsystematic, apparently dictated by various reasons (financial, personal and psychological, corruption, etc.) rather than by the logic of the work and by structural accountability to the senior management level. The author proposes an organizational structure for the Ecology and Natural Resources Department of a Regional State Administration.
It is suggested that this Department consist of three units: a Department of Ecology (divisions of environmental monitoring and audit, environmental security, and planning and coordination of international projects in the environmental field); a Department of Natural Resources (divisions of conservation of natural resources, protected areas and ecological network development, and environmental economics); and management support of the Department (legal, financial and economic division, administration division, a department of scientific and

  16. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    The aim was to investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs, providing a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for the temperature dependence are provided. (orig.)
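
The reported correction equations map a value measured at the postmortem core temperature back to the 37 °C reference. In the simplest (linear) form this is a one-liner; the slope is a tissue-specific coefficient, and the numbers below are invented for illustration, not the study's fitted coefficients:

```python
def correct_to_37c(measured_value, body_temp_c, slope_per_deg):
    """Linearly correct a quantitative MR value (e.g. T1 in ms) measured at
    postmortem core temperature body_temp_c to the 37 deg C reference."""
    return measured_value + slope_per_deg * (37.0 - body_temp_c)

# Invented example: T1 = 800 ms measured at 20 deg C, slope 5 ms per deg C.
print(correct_to_37c(800.0, 20.0, 5.0))  # 885.0
```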

  17. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    The aim was to investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature, and equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by temperature; T1 in particular exhibited strong temperature dependence. Correcting the quantitative values to a temperature of 37 °C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs, providing a basis for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification; equations to correct for the temperature dependence are provided. (orig.)

  18. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for, or synonymous with, measurement. To assess the validity of these positions, the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. From this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  19. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization and usually used for analysis of elements, has unique advantages for the absolute quantification of proteins: it determines an element present with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based approaches for universal, selective, or targeted quantification of proteins and cells in a biological sample are critically assessed. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
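
Absolute quantification by ICPMS rests on the definite stoichiometry mentioned above: measure the element's molar concentration, then divide by the number of atoms of that element per protein molecule. A sketch with invented numbers (e.g. sulfur from cysteine and methionine residues):

```python
def protein_concentration(element_molar, atoms_per_protein):
    """Molar protein concentration from the measured molar concentration of
    a hetero element with known stoichiometry (atoms per protein molecule)."""
    return element_molar / atoms_per_protein

# Invented example: 8 uM sulfur measured, 4 S atoms per protein -> 2 uM protein.
print(protein_concentration(8.0e-6, 4))  # 2e-06
```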

  20. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the proposed method are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation for the concentration uncertainties.
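
The core idea, minimizing the quadratic difference between an experimental spectrum and an analytical prediction, has a closed-form solution when only a single multiplicative parameter is optimized. A sketch; the spectra are made-up channel counts, and POEMA itself optimizes many parameters simultaneously:

```python
def best_scale(experimental, model):
    """Least-squares scale a minimizing sum((exp - a*model)^2),
    which gives a = sum(exp*model) / sum(model^2)."""
    num = sum(e * m for e, m in zip(experimental, model))
    den = sum(m * m for m in model)
    return num / den

# Made-up spectra: the experimental channel counts are twice the model's.
model = [1.0, 4.0, 9.0, 4.0, 1.0]
experimental = [2.0, 8.0, 18.0, 8.0, 2.0]
print(best_scale(experimental, model))  # 2.0
```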

  1. Energy levels of a quantum particle on a cylindrical surface with non-circular cross-section in electric and magnetic fields

    International Nuclear Information System (INIS)

    Cruz, Philip Christopher S.; Bernardo, Reginald Christian S.; Esguerra, Jose Perico H.

    2017-01-01

    We calculate the energy levels of a quantum particle on a cylindrical surface with non-circular cross-section in uniform electric and magnetic fields. Using separation of variables method and a change of independent variable, we show that the problem can be reduced to a one-dimensional Schrödinger equation for a periodic potential. The effects of varying the shape of the cross-section while keeping the same perimeter and the strengths of the electric and magnetic fields are investigated for elliptical, corrugated, and nearly-rectangular tubes with radial dimensions of the order of a nanometer. The geometric potential has minima at the angular positions where there is a significant amount of curvature. For the elliptical and corrugated tubes, it is shown that as the tube departs from the circular shape of cross-section the double-degeneracy between the energy levels is lifted. For the nearly-rectangular tube, it is shown that energy level crossings occur as the horizontal dimension of the tube is varied while keeping the same perimeter and radius of circular corners. The interplay between the curvature and the strength of the electric and magnetic fields determines the overall behavior of the energy levels. As the strength of the electric field increases, the overall potential gets skewed creating a potential well on the side corresponding to the more negative electric potential. The energy levels of the first few excited states approach more positive values while the ground state energy level approaches a more negative value. For large electric fields, all bound state energy levels tend to more negative values. The contribution of weak magnetic fields to the overall potential behaves in the same way as the electric field contribution but with its sign depending on the direction of the component of the momentum parallel to the cylindrical axis. Large magnetic fields lead to pairing of energy levels reminiscent of 2D Landau levels for the elliptical and nearly
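The reduction described above, to a one-dimensional Schrödinger equation with a periodic potential in the angular coordinate, can be sketched numerically. The dimensionless finite-difference model and the cos(2θ) "elliptical" perturbation below are illustrative assumptions, not the paper's actual geometric potential; they reproduce the qualitative point that a circular cross-section gives doubly degenerate levels which a non-circular shape lifts:

```python
import numpy as np

def ring_levels(potential, n=400):
    """Lowest eigenvalues of -d^2/dtheta^2 + V(theta) on a ring
    (periodic boundary conditions), via a finite-difference
    Laplacian; dimensionless sketch."""
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    h = theta[1] - theta[0]
    H = np.zeros((n, n))
    for i in range(n):
        H[i, i] = 2.0 / h**2 + potential(theta[i])
        H[i, (i - 1) % n] = -1.0 / h**2   # periodic wrap-around
        H[i, (i + 1) % n] = -1.0 / h**2
    return np.linalg.eigvalsh(H)

# Circular cross-section: constant potential -> doubly degenerate levels.
E_circ = ring_levels(lambda t: 0.0)
# Hypothetical elliptical-like perturbation: degeneracy is lifted.
E_ell = ring_levels(lambda t: 0.5 * np.cos(2 * t))
```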

  2. Energy levels of a quantum particle on a cylindrical surface with non-circular cross-section in electric and magnetic fields

    Energy Technology Data Exchange (ETDEWEB)

    Cruz, Philip Christopher S., E-mail: pscruz1@up.edu.ph; Bernardo, Reginald Christian S., E-mail: rcbernardo@nip.upd.edu.ph; Esguerra, Jose Perico H., E-mail: jesguerra@nip.upd.edu.ph

    2017-04-15

    We calculate the energy levels of a quantum particle on a cylindrical surface with non-circular cross-section in uniform electric and magnetic fields. Using separation of variables method and a change of independent variable, we show that the problem can be reduced to a one-dimensional Schrödinger equation for a periodic potential. The effects of varying the shape of the cross-section while keeping the same perimeter and the strengths of the electric and magnetic fields are investigated for elliptical, corrugated, and nearly-rectangular tubes with radial dimensions of the order of a nanometer. The geometric potential has minima at the angular positions where there is a significant amount of curvature. For the elliptical and corrugated tubes, it is shown that as the tube departs from the circular shape of cross-section the double-degeneracy between the energy levels is lifted. For the nearly-rectangular tube, it is shown that energy level crossings occur as the horizontal dimension of the tube is varied while keeping the same perimeter and radius of circular corners. The interplay between the curvature and the strength of the electric and magnetic fields determines the overall behavior of the energy levels. As the strength of the electric field increases, the overall potential gets skewed creating a potential well on the side corresponding to the more negative electric potential. The energy levels of the first few excited states approach more positive values while the ground state energy level approaches a more negative value. For large electric fields, all bound state energy levels tend to more negative values. The contribution of weak magnetic fields to the overall potential behaves in the same way as the electric field contribution but with its sign depending on the direction of the component of the momentum parallel to the cylindrical axis. Large magnetic fields lead to pairing of energy levels reminiscent of 2D Landau levels for the elliptical and nearly

  3. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    Science.gov (United States)

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has emerged as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, but so far it has been useful mainly for bulk analysis and not feasible for spatial measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas of the bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
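The isotope dilution calculation underlying the approach above can be sketched as follows. Natural Pt abundances are approximate literature values; the spike composition and the amounts in the round-trip check are hypothetical:

```python
# Isotopic abundances (mole fractions). Natural Pt values are approximate
# literature values; the enriched-spike composition is hypothetical.
NAT = {"194": 0.3286, "195": 0.3378}
SPIKE = {"194": 0.95, "195": 0.03}

def id_quantify(n_spike, ratio_194_195):
    """Moles of natural-composition Pt in the sampled spot, from the
    measured 194Pt/195Pt ratio R and the known amount of spike:
    n_x = n_s * (A_s - R*B_s) / (R*B_n - A_n),
    with A = 194Pt abundance and B = 195Pt abundance."""
    num = SPIKE["194"] - ratio_194_195 * SPIKE["195"]
    den = ratio_194_195 * NAT["195"] - NAT["194"]
    return n_spike * num / den

# Round-trip check: simulate a mixture with a known analyte amount.
n_x_true, n_s = 2.0e-12, 1.0e-12   # mol of analyte and spike
R = (n_s * SPIKE["194"] + n_x_true * NAT["194"]) / (
    n_s * SPIKE["195"] + n_x_true * NAT["195"])
n_x = id_quantify(n_s, R)
```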

  4. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Science.gov (United States)

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  5. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Directory of Open Access Journals (Sweden)

    Pramod K Avti

    Full Text Available In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) was employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  6. Methods for the physical characterization and quantification of extracellular vesicles in biological samples.

    Science.gov (United States)

    Rupert, Déborah L M; Claudio, Virginia; Lässer, Cecilia; Bally, Marta

    2017-01-01

    Our body fluids contain a multitude of cell-derived vesicles, secreted by most cell types, commonly referred to as extracellular vesicles. They have attracted considerable attention for their function as intercellular communication vehicles in a broad range of physiological processes and pathological conditions. Extracellular vesicles and especially the smallest type, exosomes, have also generated a lot of excitement in view of their potential as disease biomarkers or as carriers for drug delivery. In this context, state-of-the-art techniques capable of comprehensively characterizing vesicles in biological fluids are urgently needed. This review presents the arsenal of techniques available for quantification and characterization of physical properties of extracellular vesicles, summarizes their working principles, discusses their advantages and limitations and further illustrates their implementation in extracellular vesicle research. The small size and physicochemical heterogeneity of extracellular vesicles make their physical characterization and quantification an extremely challenging task. Currently, structure, size, buoyant density, optical properties and zeta potential have most commonly been studied. The concentration of vesicles in suspension can be expressed in terms of biomolecular or particle content depending on the method at hand. In addition, common quantification methods may either provide a direct quantitative measurement of vesicle concentration or solely allow for relative comparison between samples. The combination of complementary methods capable of detecting, characterizing and quantifying extracellular vesicles at a single particle level promises to provide new exciting insights into their modes of action and to reveal the existence of vesicle subpopulations fulfilling key biological tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  8. Information Entropy Squeezing of a Two-Level Atom Interacting with Two-Mode Coherent Fields

    Institute of Scientific and Technical Information of China (English)

    LIU Xiao-Juan; FANG Mao-Fa

    2004-01-01

    From a quantum information point of view we investigate the entropy squeezing properties for a two-level atom interacting with the two-mode coherent fields via the two-photon transition. We discuss the influences of the initial state of the system on the atomic information entropy squeezing. Our results show that the squeezed component number, squeezed direction, and time of the information entropy squeezing can be controlled by choosing the atomic distribution angle, the relative phase between the atom and the two-mode field, and the difference of the average photon number of the two field modes, respectively. Quantum information entropy is a remarkable precision measure for the atomic squeezing.

  9. Does zero really mean nothing?-first experiences with the new PowerQuant(TM) system in comparison to established real-time quantification kits.

    Science.gov (United States)

    Poetsch, Micaela; Konrad, Helen; Helmus, Janine; Bajanowski, Thomas; von Wurmb-Schwark, Nicole

    2016-07-01

    DNA quantification is an important step in the molecular genetic analysis of a forensic sample, ideally providing reliable data on DNA content for the subsequent generation of reproducible STR profiles for identification. For several years this quantification has usually been done by real-time PCR protocols, and a variety of assays are now commercially available from different companies. The newest is the PowerQuant(TM) assay by Promega Inc., which is marketed with the claim that a measured DNA concentration of 0 ng/μl in a forensic sample guarantees that no true STR results can be obtained, thus allowing such samples to be excluded from STR analysis to save time and money. The goal of this study was therefore to thoroughly verify the quantification step with regard to its suitability as a screening method. We evaluated the precision and reliability of four different real-time PCR quantification assays by systematically testing DNA dilutions and forensic samples with various DNA contents. Subsequently, each sample was subjected to the Powerplex® ESX 17 fast kit to determine a reliable cutoff level for exclusion of definitely negative samples from STR analysis. An accurate quantification of different cell line DNA dilutions was not possible with any kit. However, the PowerQuant(TM) assay at least provided suitable data when analyzing forensic samples, whereas with the other systems up to 46 % of samples quantified as negative still yielded reliable STR analysis results. All in all, the PowerQuant(TM) assay represents a big step forward, but the evaluation of real-time PCR quantification results must still be done with great care.

  10. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
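Quantification by 1H NMR against an internal standard rests on the standard qNMR relation, sketched below. The integrals, proton counts, and the internal standard's molar mass are illustrative assumptions, not values from the paper:

```python
def qnmr_mass(m_std_mg, I_analyte, I_std, n_analyte, n_std,
              M_analyte, M_std, purity_std=1.0):
    """Analyte mass by 1H NMR against an internal standard:
    m_a = m_s * (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * P_s,
    where I are signal integrals, N proton counts per signal,
    M molar masses, and P_s the standard's purity."""
    return (m_std_mg * (I_analyte / I_std) * (n_std / n_analyte)
            * (M_analyte / M_std) * purity_std)

# Illustrative numbers: anatoxin-a (M ~ 165.2 g/mol) against a
# hypothetical standard of molar mass 204.2 g/mol.
m = qnmr_mass(1.0, I_analyte=0.50, I_std=1.00,
              n_analyte=1, n_std=2, M_analyte=165.2, M_std=204.2)
```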

  11. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).

    Science.gov (United States)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-11-01

    Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule developments. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted for different dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with software A and for all slice thicknesses with software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exceptions of ASiR vs FBP at 1.25 mm and MBIR vs FBP at 0.625 mm, both with software A. The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.
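Accuracy and precision of repeated volume measurements can be sketched as percent bias against the nominal volume and coefficient of variation across repeats. These are simplified metrics (the study's exact statistical treatment may differ), and the measured values below are invented for illustration:

```python
import numpy as np

def accuracy_precision(measured_mm3, nominal_mm3):
    """Percent bias (accuracy) and coefficient of variation
    (precision) for repeated nodule volume measurements."""
    measured = np.asarray(measured_mm3, dtype=float)
    bias_pct = 100.0 * (measured.mean() - nominal_mm3) / nominal_mm3
    cv_pct = 100.0 * measured.std(ddof=1) / measured.mean()
    return bias_pct, cv_pct

# Nominal volume of a 9.5 mm spherical nodule.
nominal = 4.0 / 3.0 * np.pi * (9.5 / 2.0) ** 3
# Three hypothetical repeated measurements (mm^3).
bias, cv = accuracy_precision([440.0, 452.0, 447.0], nominal)
```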

  12. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  13. Proceedings from the technical workshop on near-field performance assessment for high-level waste

    International Nuclear Information System (INIS)

    Sellin, P.; Apted, M.; Gago, J.

    1991-12-01

    This report contains the proceedings of the 'Technical workshop on near-field performance assessment for high-level waste' held in Madrid, October 15-17, 1990. It includes the invited presentations and summaries of the scientific discussions. The workshop covered several topics: * post-emplacement environment, * benchmarking of computer codes, * glass release, * spent-fuel release, * radionuclide solubility, * near-field transport processes, * coupled processes in the near-field, * integrated assessments, * sensitivity analyses and validation. There was an invited presentation on each topic followed by an extensive discussion. One of the points highlighted in the closing discussion of the workshop was the need for international cooperation in the field of near-field performance assessment. The general opinion was that this is best achieved in smaller groups discussing specific questions. (au) Separate abstracts were prepared for 9 papers in this volume

  14. Supply chain partnership in construction a field study on project team level factors

    NARCIS (Netherlands)

    Koolwijk, J.S.J.; Van Oel, C.J.; Wamelink, J.W.F.

    2015-01-01

    People and their relationships are at the heart of supply chain partnerships; however, there is a lack of qualitative studies focusing on how integrated relationships may be developed. Therefore, the purpose of this study was to conduct field research to deepen our understanding of team level

  15. Condensate-polisher resin-leakage quantification and resin-transport studies

    International Nuclear Information System (INIS)

    Stauffer, C.C.; Doss, P.L.

    1983-04-01

    The objectives of this program were to: (1) determine the extent of resin leakage from current generation condensate polisher systems, both deep-bed and powdered-resin designs, during cut-in, steady-state and flow-transient operation, (2) analyze moisture separator drains and other secondary system samples for resin fragments and (3) document the level of organics in the secondary system. Resin leakage samples were obtained from nine power stations that have either recirculating steam generators or once-through steam generators. Secondary system samples were obtained from steam generator feedwater, recirculating steam generator blowdown and moisture separator drains. Analyses included ultraviolet light examination, SEM/EDX, resin quantification and infrared analysis. Data obtained from the various plants were compared and factors affecting resin leakage were summarized

  16. Site characterization field manual for near surface geologic disposal of low-level radioactive waste

    International Nuclear Information System (INIS)

    McCray, J.G.; Nowatzki, E.A.

    1985-01-01

    This field manual has been developed to aid states and regions to do a detailed characterization of a proposed near-surface low-level waste disposal site. The field manual is directed at planners, staff personnel and experts in one discipline to acquaint them with the requirements of other disciplines involved in site characterization. While it can provide a good review, it is not designed to tell experts how to do their job within their own discipline

  17. Control of spontaneous emission from a microwave-field-driven four-level atom in an anisotropic photonic crystal

    Science.gov (United States)

    Zhang, Duo; Li, Jiahua; Ding, Chunling; Yang, Xiaoxue

    2012-05-01

    The spontaneous emission properties of a microwave-field-driven four-level atom embedded in anisotropic double-band photonic crystals (PCs) are investigated. We discuss the influences of the band-edge positions, Rabi frequency and detuning of the microwave field on the emission spectrum. It is found that several interesting features such as spectral-line enhancement, spectral-line suppression, spectral-line overlap, and multi-peak structures can be observed in the spectra. The proposed scheme can be realized by coupling a microwave field to hyperfine levels of rubidium atoms confined in a photonic crystal. These theoretical investigations may provide more degrees of freedom to manipulate atomic spontaneous emission.

  18. Hyperfine Level Interactions of Diamond Nitrogen Vacancy Ensembles Under Transverse Magnetic Fields

    Science.gov (United States)

    2015-10-06

    Only fragments of the source document were captured for this record. The recoverable content describes the calculated NV ground-state energy levels, with eigenvalues 0 and ±h̄ corresponding to the spin sublevels ms = 0, ±1, shown as a function of axial magnetic field at a fixed transverse field.

  19. Tannins quantification in barks of Mimosa tenuiflora and Acacia mearnsii

    Directory of Open Access Journals (Sweden)

    Leandro Calegari

    2016-03-01

    Full Text Available Owing to their chemical complexity, there are several methodologies for the quantification of vegetable tannins. This work aims at quantifying both tannin and non-tannin substances present in the barks of Mimosa tenuiflora and Acacia mearnsii by two different methods. From bark particles of both species, analytical solutions were produced using a steam-jacketed extractor. The solutions were analyzed by the Stiasny and hide-powder (unchromed) methods. For both species, tannin levels were higher when analyzed by the hide-powder method, reaching 47.8% and 24.1% for A. mearnsii and M. tenuiflora, respectively. By the Stiasny method, the tannin levels were 39.0% for A. mearnsii and 15.5% for M. tenuiflora. Despite the better results presented by A. mearnsii, the bark of M. tenuiflora also showed great potential due to its considerable amount of tannin and the availability of the species in the Caatinga biome.
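The Stiasny quantification reduces to simple mass ratios; a sketch follows, using a simplified textbook form of the relation (not necessarily the authors' exact protocol) and invented masses:

```python
def stiasny_number(precipitate_g, solids_g):
    """Stiasny number: mass of the formaldehyde-condensable (tannin)
    precipitate as a percentage of the total dissolved solids."""
    return 100.0 * precipitate_g / solids_g

def tannin_content_pct(bark_solids_pct, stiasny_pct):
    """Tannin content of the bark: total extractable solids
    scaled by the Stiasny number (simplified relation)."""
    return bark_solids_pct * stiasny_pct / 100.0

# Hypothetical run: 0.78 g precipitate from 1.00 g of dissolved solids,
# bark with 20% total extractable solids.
sn = stiasny_number(0.78, 1.00)
tc = tannin_content_pct(20.0, sn)
```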

  20. Paying Medicare Advantage Plans: To Level or Tilt the Playing Field

    Science.gov (United States)

    Glazer, Jacob; McGuire, Thomas G.

    2017-01-01

    Medicare beneficiaries are eligible for health insurance through the public option of traditional Medicare (TM) or may join a private Medicare Advantage (MA) plan. Both are highly subsidized, but in different ways. Medicare pays for most costs directly in TM, and makes a subsidy payment to an MA plan based on a “benchmark” for each beneficiary choosing a private plan. The level of this benchmark is arguably the most important policy decision Medicare makes about the MA program. Presently, about 30% of beneficiaries are in MA, and Medicare subsidizes MA plans more on average than TM. Many analysts recommend equalizing Medicare’s subsidy across the options – referred to in policy circles as a “level playing field.” This paper studies the normative question of how to set the level of the benchmark, applying the versatile model of plan choice developed by Einav and Finkelstein (EF) to Medicare. The EF framework implies unequal subsidies to counteract risk selection across plan types. We also study other reasons to tilt the field: the relative efficiency of MA vs. TM, market power of MA plans, and institutional features of the way Medicare determines subsidies and premiums. After review of the empirical and policy literature, we conclude that in areas where the MA market is competitive, the benchmark should be set below average costs in TM, but in areas characterized by imperfect competition in MA, it should be raised in order to offset output (enrollment) restrictions by plans with market power. We also recommend specific modifications of Medicare rules to make demand for MA more price elastic. PMID:28318667

  1. Quantification of baseline pupillary response and task-evoked pupillary response during constant and incremental task load.

    Science.gov (United States)

    Mosaly, Prithima R; Mazur, Lukasz M; Marks, Lawrence B

    2017-10-01

    The methods employed to quantify baseline pupil size and task-evoked pupillary response (TEPR) may affect overall study results. To test this hypothesis, the objective of this study was to assess variability in baseline pupil size and TEPR during two basic working memory tasks: a constant load of 3-letters memorisation-recall (10 trials), and incremental load memorisation-recall (two trials of each load level), using two commonly used methods: (1) change from trial/load-specific baseline, (2) change from a constant baseline. Results indicated that there was a significant shift in baseline between the trials for constant load, and between the load levels for incremental load. The TEPR was independent of shifts in baseline using method 1 only for constant load, and method 2 only for higher levels of the incremental load condition. These important findings suggest that the assessment of both the baseline and the methods used to quantify TEPR are critical in ergonomics applications, especially in studies with a small number of trials per subject per condition. Practitioner Summary: Quantification of TEPR can be affected by shifts in baseline pupil size that are most likely driven by non-cognitive factors when other external factors are kept constant. Therefore, the quantification methods employed to compute both baseline and TEPR are critical to understanding human information processing in practical ergonomics settings.
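The two quantification methods compared above can be sketched directly. The choice of the first trial's value as the constant baseline, and all numbers, are illustrative assumptions:

```python
import numpy as np

def tepr(trials_baseline, trials_task, method="trial"):
    """Task-evoked pupillary response under the two methods discussed:
    change from each trial's own baseline, or change from one
    constant baseline (here assumed to be the first trial's)."""
    base = np.asarray(trials_baseline, dtype=float)
    task = np.asarray(trials_task, dtype=float)
    if method == "trial":
        return task - base        # method 1: trial-specific baseline
    return task - base[0]         # method 2: constant baseline

# A drifting baseline inflates method-2 TEPR on later trials.
baseline = [3.0, 3.2, 3.4]   # mm; baseline shifts across trials
task     = [3.5, 3.7, 3.9]   # mm; same 0.5 mm true response each trial
t1 = tepr(baseline, task, "trial")
t2 = tepr(baseline, task, "constant")
```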

  2. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess TB evolution. Methods for quantification of chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB from chest radiographs. An algorithm for computational processing of the exams was developed in Matlab, which creates a 3D representation of the lungs, with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients using CT scans. The measurements from the two methods were compared, resulting in strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit for the patient and cost-benefit ratio for the institution. (author)
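The Bland-Altman agreement analysis used above can be sketched in a few lines; the paired lesion-volume values are invented for illustration:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement between two quantification methods:
    mean difference (bias) and 95% limits of agreement
    (bias +/- 1.96 * SD of the differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical paired measurements (radiograph-based vs. CT-based).
bias, lo, hi = bland_altman([10.2, 15.1, 20.3, 25.2],
                            [10.0, 15.0, 20.0, 25.0])
```

A pair of methods "agrees" in the Bland-Altman sense when nearly all differences fall between `lo` and `hi`, as the abstract reports for its samples.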

  3. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease' syndromes in regards to this crop. The result of these ..... parison of treatments such as cultivars or control measures and ..... Vascular discoloration and stem necrosis. 2.

  4. A New Sensor for Surface Process Quantification in the Geosciences - Image-Assisted Tacheometers

    Science.gov (United States)

    Vicovac, Tanja; Reiterer, Alexander; Rieke-Zapp, Dirk

    2010-05-01

    The quantification of earth surface processes in the geosciences requires precise measurement tools. Typical applications for precise measurement systems involve deformation monitoring for geo-risk management, detection of erosion rates, etc. Often employed for such applications are laser scanners, photogrammetric sensors and image-assisted tacheometers. Image-assisted tacheometers offer the user (metrology expert) an image capturing system (CCD/CMOS camera) in addition to 3D point measurements. The images of the telescope's visual field are projected onto the camera's chip. The camera is capable of capturing panoramic image mosaics through camera rotation if the axes of the measurement system are driven by computer controlled motors. With appropriate calibration, these images are accurately geo-referenced and oriented since the horizontal and vertical angles of rotation are continuously measured and fed into the computer. The oriented images can then directly be used for direction measurements with no need for control points in object space or further photogrammetric orientation processes. In such a system, viewing angles must be addressed to chip pixels inside the optical field of view. Hence dedicated calibration methods have to be applied, an autofocus unit has to be added to the optical path, and special digital image processing procedures have to be used to detect the points of interest on the objects to be measured. We present such a new optical measurement system for measuring and describing 3D surfaces for geosciences. Besides the technique and methods some practical examples will be shown. The system was developed at the Vienna University of Technology (Institute of Geodesy and Geophysics) - two interdisciplinary research project, i-MeaS and SedyMONT, have been launched with the purpose of measuring and interpreting 3D surfaces and surface processes. 
For the in situ measurement of bed rock erosion the level of surveying accuracy required for recurring sub
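
The pixel-to-viewing-angle mapping described above can be sketched with a simple pinhole (central projection) model; the function name, the small-angle geometry, and all numeric values below are illustrative assumptions, not the calibrated model of the actual instrument.

```python
import math

def pixel_to_direction(px, py, cx, cy, pixel_size_mm, focal_mm, hz_rad, v_rad):
    """Convert an image point (px, py) to geo-referenced directions.

    Assumes a calibrated central projection: the principal point (cx, cy)
    maps to the telescope axis, and angular offsets derived from the chip
    coordinates are added to the measured horizontal (hz) and vertical (v)
    circle readings. The pinhole model here is a hypothetical stand-in for
    the instrument's real calibration.
    """
    dx = (px - cx) * pixel_size_mm   # offset on the chip in mm
    dy = (py - cy) * pixel_size_mm
    d_hz = math.atan2(dx, focal_mm)  # horizontal angular offset
    d_v = math.atan2(dy, focal_mm)   # vertical angular offset
    return hz_rad + d_hz, v_rad + d_v

# A point 100 px right of the principal point, 5 um pixels, 250 mm focal length:
hz, v = pixel_to_direction(1100, 500, 1000, 500, 0.005, 250.0, 0.0, math.pi / 2)
```

Because the circle readings are fed in continuously, every detected image point yields an oriented direction without external control points.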

  5. Electric-field induced spin accumulation in the Landau level states of topological insulator thin films

    Science.gov (United States)

    Siu, Zhuo Bin; Chowdhury, Debashree; Basu, Banasri; Jalil, Mansoor B. A.

    2017-08-01

    A topological insulator (TI) thin film differs from the more typically studied thick TI system in that the former has both a top and a bottom surface, where the states localized at the two surfaces can couple to one another across the finite thickness. An out-of-plane magnetic field leads to the formation of discrete Landau level states in the system, whereas an in-plane magnetization breaks the angular momentum symmetry of the system. In this work, we study the spin accumulation induced by the application of an in-plane electric field to the TI thin film system in which the Landau level states and inter-surface coupling are simultaneously present. We show, via Kubo formula calculations, that the in-plane spin accumulation perpendicular to the magnetization due to the electric field vanishes for a TI thin film with symmetric top and bottom surfaces. A finite in-plane spin accumulation perpendicular to both the electric field and the magnetization emerges upon applying either a differential magnetization coupling or a potential difference between the two film surfaces. This spin accumulation results from the breaking of the antisymmetry of the spin accumulation around the k-space equal-energy contours.

  6. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
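
As a rough illustration of the regression step, the sketch below trains a least-squares SVM (solving the standard LS-SVM linear system) to map the three damage-sensitive features to a crack size; the RBF kernel, the fixed hyperparameters (which the paper tunes with a GA), and the toy data are assumptions for illustration only.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    # Gaussian RBF kernel matrix between row-vector sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Solve the LS-SVM system: [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy data: crack size as an (unknown) function of the three features
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (40, 3))      # [normalized amplitude, phase change, corr. coeff.]
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] - X[:, 2]
b, alpha = lssvm_fit(X, y, gamma=100.0, sigma=1.0)
pred = lssvm_predict(X, b, alpha, X, sigma=1.0)
```

In the paper's scheme, the GA would search over the hyperparameters (gamma, sigma) to minimize a validation error before the final fit.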

  7. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    Science.gov (United States)

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.

  8. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated with EDTA, and blood stains were made on filter paper. The experimental samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA quantities obtained by the different extraction methods differed significantly. Complete sixteen-locus STR profiles were detected in 90.56% of samples. Pyramidon pre-treatment causes DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.

  9. Real-Time PCR Quantification of Chloroplast DNA Supports DNA Barcoding of Plant Species.

    Science.gov (United States)

    Kikkawa, Hitomi S; Tsuge, Kouichiro; Sugita, Ritsuko

    2016-03-01

    Species identification from extracted DNA is sometimes needed for botanical samples. DNA quantification is required for an accurate and effective examination: if a quantitative assay provides unreliable estimates, a higher quantity of DNA than the estimated amount may be used in subsequent analyses to avoid failure with samples from which extracting DNA is difficult. Compared with conventional methods, real-time quantitative PCR (qPCR) requires a small amount of DNA and enables accurate quantification of dilute DNA solutions. The aim of this study was to develop a qPCR assay for quantification of chloroplast DNA from taxonomically diverse plant species. An absolute quantification method was developed using primers targeting the ribulose-1,5-bisphosphate carboxylase/oxygenase large subunit (rbcL) gene and SYBR Green I-based qPCR. The calibration curve was generated using the PCR amplicon as the template, and DNA extracts from representatives of 13 plant families common in Japan were examined. The results demonstrate that qPCR analysis is an effective method for quantification of DNA from plant samples. The results of qPCR assist decision-making that will determine the success or failure of DNA analysis, and indicate the possibility of optimizing the procedure for downstream reactions.
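
The standard-curve arithmetic that absolute qPCR quantification rests on can be sketched as follows; the dilution-series values and primer efficiency are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical standard curve from a 10-fold dilution series of the rbcL amplicon
log10_copies = np.array([7, 6, 5, 4, 3], dtype=float)
cq = np.array([14.1, 17.5, 20.9, 24.3, 27.7])       # quantification cycles

# Fit the calibration line Cq = slope * log10(N0) + intercept
slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0             # 1.0 would mean perfect doubling

def copies_from_cq(c):
    # Invert the calibration line to estimate the starting copy number
    return 10 ** ((c - intercept) / slope)
```

An unknown extract's Cq is then converted to an absolute copy number with `copies_from_cq`, provided it falls within the calibrated range.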

  10. Dynamics of a quantum two-level system under the action of phase-diffusion field

    Energy Technology Data Exchange (ETDEWEB)

    Sobakinskaya, E.A. [Institute for Physics of Microstructures of RAS, Nizhny Novgorod, 603950 (Russian Federation); Pankratov, A.L., E-mail: alp@ipm.sci-nnov.ru [Institute for Physics of Microstructures of RAS, Nizhny Novgorod, 603950 (Russian Federation); Vaks, V.L. [Institute for Physics of Microstructures of RAS, Nizhny Novgorod, 603950 (Russian Federation)

    2012-01-09

    We study the behavior of a quantum two-level system interacting with a noisy phase-diffusion field. The dynamics is shown to split into two regimes, determined by the coherence time of the phase-diffusion field. For both regimes we present a model of the quantum system's behavior and discuss possible applications of the obtained effect for spectroscopy. In particular, the obtained analytical formula for the macroscopic polarization demonstrates that the phase-diffusion field does not affect the absorption line shape, which opens up an intriguing possibility of noisy spectroscopy based on broadband sources with Lorentzian line shape. -- Highlights: ► We study the dynamics of a quantum system interacting with a noisy phase-diffusion field. ► At short times the phase-diffusion field induces polarization in the quantum system. ► At long times the noise leads to polarization decay and heating of the quantum system. ► A simple model of the interaction is derived. ► Application of the described effects for spectroscopy is discussed.

  11. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    Science.gov (United States)

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrates for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in clinical practice.

  12. Activity quantification of phantom using dual-head SPECT with two-view planar image

    International Nuclear Information System (INIS)

    Guo Leiming; Chen Tao; Sun Xiaoguang; Huang Gang

    2005-01-01

    The absorbed radiation dose from internally deposited radionuclides is a major factor in assessing risk and therapeutic utility in nuclear medicine diagnosis and treatment. The quantification of absolute activity in vivo is a necessary step in estimating the absorbed dose of an organ or tissue. To assess accuracy in the determination of organ activity, experiments on 99mTc activity quantification were made for a body phantom using dual-head SPECT with the two-view counting technique. Accuracy in the activity quantification is reliable and is not affected by the depth of the source organ in vivo. When the diameter of the radiation source is ≤2 cm, the most accurate activity quantification can be obtained once the system calibration factor and transmission factor are established. The use of Buijs's method is preferable, especially at very low source-to-background activity concentration ratios. (authors)
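
The two-view counting technique is conventionally based on the geometric mean of anterior and posterior counts; the sketch below uses that standard conjugate-view formula, with illustrative attenuation and calibration values (the abstract does not state the exact correction used, and Buijs's method refines the background handling beyond this).

```python
import math

def conjugate_view_activity(counts_ant, counts_post, body_thickness_cm,
                            mu_per_cm, cal_counts_per_mbq):
    """Estimate in-vivo activity from anterior/posterior planar counts.

    Standard conjugate-view (geometric-mean) formula:
        A = sqrt(I_a * I_p / T) / C,   with T = exp(-mu * L),
    where T is the whole-body transmission factor across thickness L and
    C the system calibration factor. All numeric values are illustrative.
    """
    transmission = math.exp(-mu_per_cm * body_thickness_cm)
    geometric_mean = math.sqrt(counts_ant * counts_post)
    return geometric_mean / math.sqrt(transmission) / cal_counts_per_mbq

# Illustrative 99mTc case: ~0.15 /cm linear attenuation in soft tissue at 140 keV
activity_mbq = conjugate_view_activity(12000, 8000, 20.0, 0.15, 500.0)
```

Because the geometric mean of the opposed views cancels the source-depth dependence to first order, the estimate is largely insensitive to where the organ sits between the two detector heads, consistent with the finding above.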

  13. Investigation of the exposure level of electromagnetic fields produced by mobile telephone base stations

    International Nuclear Information System (INIS)

    Abukassem, I.; Kharita, M. H.

    2010-12-01

    The aim of this work is to investigate the real values of microwave level distribution and propagation in the vicinity of a sample of mobile phone base stations, and to compare the results with the exposure restriction limits recommended by the International Commission on Non-Ionizing Radiation Protection (ICNIRP). Measurements were performed using dedicated microwave meters: the first (Narda SRM-3000) is used for scanning the electromagnetic wave frequency spectrum, and the second (Narda EMR-300) determines the level of the electric and magnetic fields and the power density of these waves near any type of transmitter. Different kinds of mobile phone base stations were chosen to cover important zones of Damascus, and the region around each base station was scanned in the emission direction, according to the accessibility of the studied positions. Results showed that the signal level at all measured points is lower than the ICNIRP restriction level, although at a few points the detected microwave level reached relatively significant values. The signal level inside buildings situated partially in the emission direction of the base station transmitters decreases stepwise, and walls reduce the signal intensity considerably. To carry out this kind of field study in the best way and to obtain the maximum benefit for the public, the properties and operating parameters of the transmitters used in mobile phone base stations must be known; it is therefore very important to achieve transparent collaboration between research laboratories and mobile phone companies. (author)
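
For a far-field survey point, the comparison against the ICNIRP restriction can be sketched as below; the 1998 general-public reference level S = f/200 W/m² (f in MHz) applies in the 400-2000 MHz band, and the measured field value and frequency are invented for illustration.

```python
# Far-field power density from a measured E-field, compared with the ICNIRP
# (1998) general-public reference level for 400-2000 MHz. Example values are
# illustrative, not survey data from the study.

def power_density(e_field_v_per_m):
    # S = E^2 / Z0, with Z0 = 377 ohm the free-space impedance (far field)
    return e_field_v_per_m ** 2 / 377.0

def icnirp_public_limit(freq_mhz):
    # Valid only in the 400-2000 MHz band of the 1998 guidelines
    assert 400 <= freq_mhz <= 2000
    return freq_mhz / 200.0

s = power_density(1.2)               # 1.2 V/m measured near a GSM-900 site
limit = icnirp_public_limit(947.5)   # hypothetical downlink centre frequency, MHz
fraction = s / limit                 # exposure quotient; < 1 means compliant
```

With a multi-band meter such as the SRM-3000, each frequency component would be compared against its own band limit and the quotients summed.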

  14. Species-specific detection and quantification of common barnacle larvae from the Japanese coast using quantitative real-time PCR.

    Science.gov (United States)

    Endo, Noriyuki; Sato, Kana; Matsumura, Kiyotaka; Yoshimura, Erina; Odaka, Yukiko; Nogata, Yasuyuki

    2010-11-01

    Species-specific detection and quantification methods for barnacle larvae using quantitative real-time polymerase chain reaction (qPCR) were developed. Species-specific primers for qPCR were designed for 13 barnacle species in the mitochondrial 12S ribosomal RNA gene region. Primer specificity was examined by PCR using template DNA extracted from each of the 13 barnacle species, from other unidentified barnacle species, and from field-collected zooplankton samples. The resulting PCR products comprised single bands following agarose gel electrophoresis when the templates corresponded to the primers. The amplifications were highly species-specific, even for the field plankton samples, which were then subjected to the qPCR assay. The calculated DNA contents for each barnacle species were closely correlated with the numbers of larvae counted by microscopic examination. The method can be applied to quantify barnacle larvae in natural plankton samples.

  15. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  16. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption onto certain solids, a reliable and reasonably fast technique for the detection and quantification of uranyl is needed. In this work we therefore propose to quantify uranyl in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that brings out the fluorescence of the uranyl ion while avoiding the quenching produced by organic acids. (Author)

  17. Quantification of emergency action levels for research reactors

    International Nuclear Information System (INIS)

    Wu Zhongwang; Qu Jingyuan; Liu Yuanzhong; Xi Shuren

    2000-01-01

    Emergency action levels (EALs) are the technical criteria or parameters that define the classes of emergency conditions. Drawing on foreign methodologies for EAL development, and on experience gained in developing and reviewing the emergency plans of several domestic research reactors, the authors argue that the initiating conditions leading to emergency classes should be quantified as instrument readings or alarm thresholds, so that emergency conditions can be distinguished and confirmed and a technical basis provided for emergency response actions. Based on this principle, the emergency plans of INET, Tsinghua University, were revised and developed, promoting emergency planning work for research reactors.

  18. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    Science.gov (United States)

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions about the tissue structures. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground-truth image dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. Experiments involving experienced pathologists demonstrated a good correlation between the quantification results of the automated system and the pathologists' visual evaluation. Experiments investigating the variability among pathologists also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.
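
The elimination-based quantification step can be sketched as an area-fraction computation over segmentation masks; the boolean-mask representation and the toy structures below are assumptions, since the paper's actual colour-based segmentation pipeline is far more involved.

```python
import numpy as np

def fibrosis_percentage(biopsy_mask, non_fibrosis_masks):
    """Quantify interstitial fibrosis as a percentage of the biopsy area.

    Follows the elimination scheme in spirit: start from the whole biopsy
    region and subtract segmented non-fibrosis structures (e.g. glomeruli,
    tubules, vessels); the remainder is counted as fibrosis. The masks are
    boolean arrays; the segmentation itself is assumed to exist already.
    """
    fibrosis = biopsy_mask.copy()
    for m in non_fibrosis_masks:
        fibrosis &= ~m                       # remove non-fibrosis pixels
    return 100.0 * fibrosis.sum() / biopsy_mask.sum()

# Toy 10x10 biopsy: 100 px total, 30 px tubules, 20 px glomeruli -> 50% fibrosis
biopsy = np.ones((10, 10), dtype=bool)
tubules = np.zeros_like(biopsy); tubules[:3, :] = True        # 30 px
glomeruli = np.zeros_like(biopsy); glomeruli[3:5, :] = True   # 20 px
pct = fibrosis_percentage(biopsy, [tubules, glomeruli])
```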

  19. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    Science.gov (United States)

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose: To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods: The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results: In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion: Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantification in phantom and ex vivo acquisitions. PMID:24123362
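
The temperature dependence described above can be put into numbers with the commonly quoted water PRF coefficient of about -0.01 ppm/°C; that coefficient and the single-peak -3.4 ppm fat-water shift at 37°C are textbook approximations used here for illustration, not the multi-peak model of the paper.

```python
# Temperature dependence of the water-fat frequency shift at 1.5 T.
# Water's proton resonance frequency shifts by roughly -0.01 ppm/degC, while
# triglyceride resonances are essentially temperature independent, so the
# fat-water shift assumed by CSE signal models drifts with temperature.

GAMMA_MHZ_PER_T = 42.577          # proton gyromagnetic ratio / 2*pi
PPM_PER_DEGC = -0.01              # approximate water PRF temperature coefficient

def fat_water_shift_hz(b0_tesla, temp_c, shift_ppm_at_37c=-3.4):
    f0_mhz = GAMMA_MHZ_PER_T * b0_tesla
    # Water moves by PPM_PER_DEGC*(T - 37) ppm while fat stays put, so the
    # fat-water difference changes by the opposite sign.
    ppm = shift_ppm_at_37c - PPM_PER_DEGC * (temp_c - 37.0)
    return ppm * f0_mhz               # ppm * MHz = Hz

shift_body = fat_water_shift_hz(1.5, 37.0)   # roughly -217 Hz at body temperature
shift_cold = fat_water_shift_hz(1.5, 0.0)    # a phantom at 0 degC shifts further
```

A cold phantom thus presents a fat-water shift some 20 Hz larger in magnitude than the body-temperature value assumed by a standard signal model, which is why uncorrected phantom fits show large errors.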

  20. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation of the concentration uncertainties.
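
The spirit of quantification by spectrum fitting can be sketched with a toy linear model: the observed spectrum is decomposed into pre-computed line shapes plus a background, and the mixing coefficients (stand-ins for concentrations) are obtained by least squares. The Gaussian lines, the 1/E background, and the element choice are invented; POEMA optimizes a full nonlinear analytical model of the spectrum.

```python
import numpy as np

energy = np.linspace(1.0, 10.0, 400)  # keV

def gauss(center, height):
    # A crude Gaussian stand-in for a characteristic X-ray line shape
    return height * np.exp(-0.5 * ((energy - center) / 0.08) ** 2)

# Basis: hypothetical Fe K-alpha line, Si K-alpha line, and a 1/E background
basis = np.column_stack([gauss(6.40, 1000.0), gauss(1.74, 1500.0), 1.0 / energy])

true_params = np.array([0.7, 0.3, 50.0])      # "concentrations" + background weight
observed = basis @ true_params                # synthetic noise-free spectrum

# Minimize the quadratic difference between observed and modeled spectra
params, *_ = np.linalg.lstsq(basis, observed, rcond=None)
c_fe, c_si, bg = params
```

In the real method the model is nonlinear in its parameters (line widths, detector response, matrix corrections), so the minimization is iterative rather than a single linear solve.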

  1. Ideas underlying the Quantification of Margins and Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Pilch, Martin, E-mail: mpilch@sandia.gov [Department 1514, Sandia National Laboratories, Albuquerque, NM 87185-0828 (United States); Trucano, Timothy G. [Department 1411, Sandia National Laboratories, Albuquerque, NM 87185-0370 (United States); Helton, Jon C. [Department of Mathematics and Statistics, Arizona State University, Tempe, AZ 85287-1804 (United States)

    2011-09-15

    Key ideas underlying the application of Quantification of Margins and Uncertainties (QMU) to nuclear weapons stockpile lifecycle decisions are described. While QMU is a broad process and methodology for generating critical technical information to be used in U.S. nuclear weapon stockpile management, this paper emphasizes one component, which is information produced by computational modeling and simulation. In particular, the following topics are discussed: (i) the key principles of developing QMU information in the form of Best Estimate Plus Uncertainty, (ii) the need to separate aleatory and epistemic uncertainty in QMU, and (iii) the properties of risk-informed decision making (RIDM) that are best suited for effective application of QMU. The paper is written at a high level, but provides an extensive bibliography of useful papers for interested readers to deepen their understanding of the presented ideas.
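
The separation of aleatory and epistemic uncertainty mentioned in (ii) is often realized as a double-loop (nested) Monte Carlo; the sketch below illustrates that structure together with a margin-over-uncertainty ratio in the spirit of QMU. The response model, the distributions, and the threshold are all invented for illustration.

```python
import random

# Double-loop Monte Carlo: the outer loop samples epistemic (lack-of-knowledge)
# parameters, the inner loop samples aleatory (irreducible) variability.

random.seed(1)
THRESHOLD = 10.0                      # requirement: response must stay below this
BEST_ESTIMATE = 6.5                   # nominal (best-estimate) response

def response(mean_load):
    # Aleatory variability: scatter of the response for a fixed epistemic mean
    return random.gauss(mean_load, 0.5)

worst_p99 = 0.0
for _ in range(200):                  # outer loop: epistemic draw of the mean
    mean_load = random.uniform(6.0, 7.0)
    samples = sorted(response(mean_load) for _ in range(1000))  # inner, aleatory
    p99 = samples[989]                # ~99th percentile of the response
    worst_p99 = max(worst_p99, p99)   # envelope over the epistemic family

margin = THRESHOLD - BEST_ESTIMATE    # distance from best estimate to requirement
uncertainty = worst_p99 - BEST_ESTIMATE  # spread out to the worst 99th percentile
ratio = margin / uncertainty          # QMU-style confidence indicator (> 1 desired)
```

Keeping the two loops separate preserves the distinction between what could be reduced by better knowledge (outer) and what is inherent scatter (inner), which a single mixed loop would blur.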

  2. Secreted cerberus1 as a marker for quantification of definitive endoderm differentiation of the pluripotent stem cells.

    Directory of Open Access Journals (Sweden)

    Hidefumi Iwashita

    To date, CXCR4 and E-cadherin double-positive cells detected by flow cytometry have been used to identify the differentiation of embryonic stem (ES) cells or induced pluripotent stem (iPS) cells into definitive endoderm (DE) lineages. Quantification of DE differentiation from ES/iPS cells by flow cytometry is a multi-step procedure including dissociation of the cells, antibody reaction, and flow cytometry analysis. To establish a quick assay for quantification of ES/iPS cell differentiation into the DE without dissociating the cells, we examined whether the secreted Cerberus1 (Cer1) protein could be used as a marker. Cer1 is a secreted protein expressed first in the anterior visceral endoderm and then in the DE. The amount of Cer1 secreted correlated with the proportion of CXCR4+/E-Cadherin+ cells that differentiated from mouse ES cells. In addition, we found that human iPS cell-derived DE also expressed the secreted CER1 and that the expression level correlated with the proportion of SOX17+/FOXA2+ cells present. Taken together, these results show that Cer1 (or CER1) serves as a good marker for quantification of DE differentiation of mouse and human ES/iPS cells.

  3. Entanglement for a Bimodal Cavity Field Interacting with a Two-Level Atom

    International Nuclear Information System (INIS)

    Liu Jia; Chen Ziyu; Bu Shenping; Zhang Guofeng

    2009-01-01

    Negativity has been adopted to investigate the entanglement in a system composed of a two-level atom and a two-mode cavity field. The effects of a Kerr-like medium and of the number of photons inside the cavity on the entanglement are studied. Our results show that the initial atomic state must be a superposition for the two cavity field modes to become entangled. Moreover, we also conclude that the numbers of photons in the two cavity modes should be equal. The interaction between the modes, namely the Kerr effect, has a significant negative contribution. Note that the atom frequency and the cavity frequency have an indistinguishable effect, so a corresponding approximation has been made in this article. These results may be useful for quantum information in optical systems.

  4. Sensitive quantification of aflatoxin B1 in animal feeds, corn feed grain, and yellow corn meal using immunomagnetic bead-based recovery and real-time immunoquantitative-PCR.

    Science.gov (United States)

    Babu, Dinesh; Muriana, Peter M

    2014-12-02

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Methanol/water (60:40) extracts of the above samples were tested using competitive direct enzyme-linked immunosorbent assay; samples whose aflatoxin content exceeded the quantification limits (0.1-10 μg/kg) were addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of the spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.

  5. Spectroscopic analysis of electron trapping levels in pentacene field-effect transistors

    International Nuclear Information System (INIS)

    Bum Park, Chang

    2014-01-01

    Electron trapping phenomena have been investigated with respect to the energy levels of localized trap states and bias-induced device instability effects in pentacene field-effect transistors. The mechanism of the photoinduced threshold voltage shift (ΔV_T) is presented by providing a ΔV_T model governed by the electron trapping. The trap-and-release behaviour, functionalized by photo-irradiation, also shows that the trap states for electrons are associated with energy levels at different positions in the forbidden gap of pentacene. Spectroscopic analysis identifies two kinds of electron trap states, distributed above and below the energy of 2.5 eV in the band gap of the pentacene crystal. The study of photocurrent spectra shows the specific trap levels of electrons in energy space that play a substantial role in causing device instability. The shallow and deep trapping states are distributed at two centroidal energy levels of ∼1.8 and ∼2.67 eV in the pentacene band gap. Moreover, we present a systematic energy profile of electron trap states in the pentacene crystal for the first time. (paper)

  6. Quantification of structural uncertainties in multi-scale models; case study of the Lublin Basin, Poland

    Science.gov (United States)

    Małolepszy, Zbigniew; Szynkaruk, Ewa

    2015-04-01

    The multiscale static modeling of the regional structure of the Lublin Basin is carried out at the Polish Geological Institute, in accordance with the principles of integrated 3D geological modelling. The model is based on all available geospatial data from Polish digital databases and analogue archives. The mapped regional structure covers an area of 260x80 km located between Warsaw and the Polish-Ukrainian border, along the NW-SE-trending margin of the East European Craton. Within the basin, the Paleozoic beds, with coal-bearing Carboniferous and older formations containing hydrocarbons and unconventional prospects, are covered unconformably by Permo-Mesozoic and younger rocks. The vertical extent of the regional model is set from the topographic surface to 6000 m ssl and at the bottom includes some Proterozoic crystalline formations of the craton. The project focuses on the internal consistency of the models built at different scales, from basin (small) scale to field (large) scale. The models, nested in a common structural framework, are being constructed with regional geological knowledge, ensuring a smooth transition in 3D model resolution and in the amount of geological detail. A major challenge of the multiscale approach to subsurface modelling is the assessment and consistent quantification of the various types of geological uncertainties tied to the various sub-model scales. The decreasing amount of information with depth and, particularly, the very limited data collected below exploration targets, as well as the accuracy and quality of data, have the most critical impact on the modelled structure. In the deeper levels of the Lublin Basin model, seismic interpretation of 2D surveys is sparsely tied to well data. Therefore time-to-depth conversion carries one of the major uncertainties in the modelling of structures, especially below 3000 m ssl. Furthermore, as all models at different scales are based on the same dataset, we must deal with different levels of generalization of geological structures. 
The

  7. Field-level intelligence simplifies motor protection and control; Mit Intelligenz in der Feldebene Motorschutz und Steuerungen vereinfachen

    Energy Technology Data Exchange (ETDEWEB)

    Westerholt, J. [Siemens AG, Erlangen (Germany)

    1998-10-19

    Field-level intelligence signifies relieving the workload at the automation level, reducing the amount of wiring required (and thereby a potential source of errors), and enhancing plant availability. These keywords stand for concepts in the form of applications for fieldbus systems and intelligent field devices, such as Profibus DP and the Simocode DP motor protection and control device with communications capability. (orig.)

  8. Techniques for quantification of liver fat in risk stratification of diabetics

    International Nuclear Information System (INIS)

    Kuehn, J.P.; Spoerl, M.C.; Mahlke, C.; Hegenscheid, K.

    2015-01-01

    Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for the detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content, but it is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder-corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used, and is an accurate biomarker for the assessment of liver fat. Accurate and reproducible quantification of liver fat using chemical shift-encoded MRI therefore requires calculation of the proton density fat fraction. (orig.)
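
    The fat fraction described above reduces to a simple signal ratio once the confounders are corrected. A minimal Python sketch, with illustrative signal values not taken from the article:

```python
def proton_density_fat_fraction(fat_signal, water_signal):
    """PDFF = F / (F + W), expressed as a percentage, from the
    confounder-corrected fat and water signal contributions."""
    return 100.0 * fat_signal / (fat_signal + water_signal)

# Hypothetical corrected signal amplitudes for a single voxel
pdff = proton_density_fat_fraction(fat_signal=12.0, water_signal=108.0)
print(pdff)  # 10.0 (percent)
```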

  9. An external standard method for quantification of human cytomegalovirus by PCR

    International Nuclear Information System (INIS)

    Rongsen, Shen; Liren, Ma; Fengqi, Zhou; Qingliang, Luo

    1997-01-01

    An external standard method for PCR quantification of HCMV was reported. [α-32P]dATP was used as a tracer. The 32P-labelled specific amplification product was separated by agarose gel electrophoresis. A gel piece containing the specific product band was excised and counted in a plastic scintillation counter. The distribution of [α-32P]dATP in the electrophoretic gel plate and the effect of separation between the 32P-labelled specific product and free [α-32P]dATP were observed. A standard curve for quantification of HCMV by PCR was established and detection results for quality-control templates were presented. The external standard method and the electrophoresis separation effect were appraised. The results showed that the method can be used for relative quantification of HCMV. (author)

  10. Generation of structural MR images from amyloid PET: Application to MR-less quantification.

    Science.gov (United States)

    Choi, Hongyoon; Lee, Dong Soo

    2017-12-07

    Structural magnetic resonance (MR) images concomitantly acquired with PET images can provide crucial anatomical information for precise quantitative analysis. However, in the clinical setting, not all subjects have a corresponding MR. Here, we developed a model to generate structural MR images from amyloid PET using deep generative networks, and applied it to the quantification of cortical amyloid load without structural MR. Methods: We used florbetapir PET and structural MR data from the Alzheimer's Disease Neuroimaging Initiative database. The generative network was trained to generate realistic structural MR images from florbetapir PET images. After training, the model was applied to the quantification of cortical amyloid load. PET images were spatially normalized to the template space using the generated MR, and the standardized uptake value ratio (SUVR) of the target regions was then measured in predefined regions-of-interest. Real MR-based quantification was used as the gold standard to measure the accuracy of our approach. Other MR-less methods, namely normal PET template-based, multi-atlas PET template-based and PET segmentation-based normalization/quantification methods, were also tested. We compared the performance of quantification using the generated MR with that of MR-based and MR-less quantification methods. Results: MR images generated from florbetapir PET showed signal patterns visually similar to the real MR. The structural similarity index between real and generated MR was 0.91 ± 0.04. The mean absolute error of the SUVR of cortical composite regions estimated by the generated MR-based method was 0.04 ± 0.03, significantly smaller than for the other MR-less methods (0.29 ± 0.12 for the normal PET template, 0.12 ± 0.07 for the multi-atlas PET template and 0.08 ± 0.06 for the PET segmentation-based method). Bland-Altman plots revealed that the generated MR-based SUVR quantification was the closest to the SUVR values estimated by the real MR-based method. Conclusion
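
    The SUVR computation underlying the comparison above is a per-region ratio against a reference region. A hedged sketch of that step (region names and uptake values are hypothetical, and the real pipeline operates on spatially normalized images):

```python
def suvr(target_means, reference_mean):
    """Standardized uptake value ratio: mean uptake in each target
    region-of-interest divided by mean uptake in a reference region."""
    if reference_mean <= 0:
        raise ValueError("reference uptake must be positive")
    return {roi: m / reference_mean for roi, m in target_means.items()}

# Hypothetical ROI mean uptake values
cortical = {"frontal": 1.42, "parietal": 1.35, "precuneus": 1.58}
ratios = suvr(cortical, reference_mean=1.10)
composite = sum(ratios.values()) / len(ratios)  # cortical composite SUVR
```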

  11. Droplet digital PCR improves absolute quantification of viable lactic acid bacteria in faecal samples.

    Science.gov (United States)

    Gobert, Guillaume; Cotillard, Aurélie; Fourmestraux, Candice; Pruvost, Laurence; Miguet, Jean; Boyer, Mickaël

    2018-03-14

    Analysing correlations between the observed health effects of ingested probiotics and their survival in digestive tract allows adapting their preparations for food. Tracking ingested probiotic in faecal samples requires accurate and specific tools to quantify live vs dead cells at strain level. Traditional culture-based methods are simpler to use but they do not allow quantifying viable but non-cultivable (VBNC) cells and they are poorly discriminant below the species level. We have set up a viable PCR (vPCR) assay combining propidium monoazide (PMA) treatment and either real time quantitative PCR (qPCR) or droplet digital PCR (ddPCR) to quantify a Lactobacillus rhamnosus and two Lactobacillus paracasei subsp. paracasei strains in piglet faeces. Adjustments of the PMA treatment conditions and reduction of the faecal sample size were necessary to obtain accurate discrimination between dead and live cells. The study also revealed differences of PMA efficiency among the two L. paracasei strains. Both PCR methods were able to specifically quantify each strain and provided comparable total bacterial counts. However, quantification of lower numbers of viable cells was best achieved with ddPCR, which was characterized by a reduced lower limit of quantification (improvement of up to 1.76 log 10 compared to qPCR). All three strains were able to survive in the piglets' gut with viability losses between 0.78 and 1.59 log 10 /g faeces. This study shows the applicability of PMA-ddPCR to specific quantification of small numbers of viable bacterial cells in the presence of an important background of unwanted microorganisms, and without the need to set up standard curves. It also illustrates the need to adapt PMA protocols according to the final matrix and target strain, even for closely related strains. The PMA-ddPCR approach provides a new tool to quantify bacterial survival in faecal samples from a preclinical and clinical trial. Copyright © 2018 The Authors. Published by
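
    ddPCR owes its standard-curve-free absolute quantification to Poisson statistics: the fraction of negative droplets fixes the mean number of copies per droplet. A sketch of that calculation (droplet counts are invented, and ~0.85 nL is a typical droplet volume, not a value from this study):

```python
import math

def ddpcr_copies_per_ul(n_total, n_negative, droplet_volume_ul=0.00085):
    """Poisson estimate of template concentration from droplet counts:
    mean copies per droplet is lambda = -ln(negative fraction),
    converted to copies per microlitre via the droplet volume."""
    lam = -math.log(n_negative / n_total)
    return lam / droplet_volume_ul

conc = ddpcr_copies_per_ul(n_total=15000, n_negative=9000)
```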

  12. Evolution of the field quantum entropy and entanglement in a system of a multimode light field interacting resonantly with a two-level atom through an N_j-degenerate N_Σ-photon process

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The time evolution of the field quantum entropy and entanglement in a system of a multimode coherent light field resonantly interacting with a two-level atom through a degenerate multi-photon process is studied using the von Neumann reduced entropy theory, and analytical expressions for the quantum entropy of the multimode field, together with numerical results for a three-mode field interacting with the atom, are obtained. Our attention focuses on the influences of the initial average photon number, the atomic distribution angle and the phase angle of the atomic dipole on the evolution of the field entropy and entanglement. The numerical results indicate that the stronger the quantum field is, the weaker the entanglement between the field and the atom will be, and when the field is strong enough the two subsystems may remain in a disentangled state at all times; the field entropy is strongly dependent on the atomic distribution angle, that is, the quantum field and the two-level atom are always in an entangled state and settle near maximum entanglement after a short period of oscillation; the larger the atomic distribution angle is, the shorter the time for the field entropy to evolve to its maximum value; and the phase angle of the atomic dipole has almost no influence on the entanglement between the quantum field and the two-level atom. Entangled states or pure states can be prepared on the basis of these properties of the field quantum entropy.
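
    The field entropy studied above is the von Neumann entropy of a reduced density matrix. A minimal sketch of that quantity for a real-valued 2x2 reduced state (the actual system requires the full multimode dynamics, which is not reproduced here):

```python
import math

def von_neumann_entropy_2x2(rho):
    """S = -Tr(rho ln rho) for a real 2x2 density matrix, via its
    eigenvalues; for a bipartite pure state this entropy of either
    reduced state quantifies atom-field entanglement."""
    tr = rho[0][0] + rho[1][1]
    det = rho[0][0] * rho[1][1] - rho[0][1] * rho[1][0]
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]
    return -sum(x * math.log(x) for x in eigs if x > 1e-12)

S_max = von_neumann_entropy_2x2([[0.5, 0.0], [0.0, 0.5]])   # ln 2: maximal entanglement
S_pure = von_neumann_entropy_2x2([[1.0, 0.0], [0.0, 0.0]])  # 0: disentangled
```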

  13. Quantification of heterogeneity as a biomarker in tumor imaging: a systematic review.

    Directory of Open Access Journals (Sweden)

    Lejla Alic

    Full Text Available BACKGROUND: Many techniques have been proposed for the quantification of tumor heterogeneity as an imaging biomarker for differentiation between tumor types, tumor grading, response monitoring and outcome prediction. However, in clinical practice these methods are barely used. This study evaluates the reported performance of the described methods and identifies barriers to their implementation in clinical practice. METHODOLOGY: The Ovid, Embase, and Cochrane Central databases were searched up to 20 September 2013. Heterogeneity analysis methods were classified into four categories, i.e., non-spatial methods (NSM), spatial grey-level methods (SGLM), fractal analysis (FA) methods, and filters and transforms (F&T). The performance of the different methods was compared. PRINCIPAL FINDINGS: Of the 7351 potentially relevant publications, 209 were included. Of these studies, 58% reported the use of NSM, 49% SGLM, 10% FA, and 28% F&T. Differentiation between tumor types, tumor grading and/or outcome prediction was the goal in 87% of the studies. Overall, the reported area under the curve (AUC) ranged from 0.5 to 1 (median 0.87). No relation was found between the performance and the quantification methods used, or between the performance and the imaging modality. A negative correlation was found between the tumor-feature ratio and the AUC, which is presumably caused by overfitting in small datasets. Cross-validation was reported in 63% of the classification studies. Retrospective analyses were conducted in 57% of the studies without a clear description. CONCLUSIONS: In a research setting, heterogeneity quantification methods can differentiate between tumor types, grade tumors, and predict outcome and monitor treatment effects. To translate these methods to clinical practice, more prospective studies are required that use external datasets for validation: these datasets should be made available to the community to facilitate the development of new and improved
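
    The AUC figures pooled in this review have a simple probabilistic reading: AUC is the probability that a randomly chosen positive case scores higher than a randomly chosen negative one. A sketch of that rank-based (Mann-Whitney) computation on hypothetical classifier scores:

```python
def auc_from_scores(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney statistic:
    fraction of positive/negative pairs ranked correctly, ties 0.5."""
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

auc = auc_from_scores([0.9, 0.8, 0.6], [0.4, 0.3])  # perfect separation -> 1.0
```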

  14. Quantum averaging and resonances: two-level atom in a one-mode classical laser field

    Directory of Open Access Journals (Sweden)

    M. Amniat-Talab

    2007-06-01

    Full Text Available We use a nonperturbative method based on quantum averaging and an adapted form of resonant transformations to treat the resonances of the Hamiltonian of a two-level atom interacting with a one-mode classical field in the Floquet formalism. We illustrate this method by extracting effective Hamiltonians of the system in the weak- and strong-coupling regimes. The results obtained in the strong-coupling regime are valid over the whole range of the coupling constant for the one-photon zero-field resonance.

  15. Quantification of isocyanates and amines in polyurethane foams and coated products by liquid chromatography–tandem mass spectrometry

    Science.gov (United States)

    Mutsuga, Motoh; Yamaguchi, Miku; Kawamura, Yoko

    2014-01-01

    An analytical method for the identification and quantification of 10 different isocyanates and 11 different amines in polyurethane (PUR) foam and PUR-coated products was developed and optimized. Isocyanates were extracted and derivatized with di-n-butylamine, while amines were extracted with methanol. Quantification was subsequently performed by liquid chromatography-tandem mass spectrometry. Using this methodology, residual levels of isocyanates and amines in commercial PUR products were quantified. Although the recoveries of certain isocyanates and amines were low, the main compounds used as monomers in the production of PUR products, and their decomposition species, were clearly identified at quantifiable levels. 2,4- and 2,6-toluenediisocyanate were detected in most PUR foam samples and a pastry bag in the range of 0.02-0.92 mg/kg, with their decomposition compounds, 2,4- and 2,6-toluenediamine, detected in all PUR foam samples in the range of 9.5-59 mg/kg. PUR-coated gloves are manufactured using 4,4′-methylenebisphenyl diisocyanate as the main raw material, and a large amount of this compound, in addition to 4,4′-methylenedianiline and dicyclohexylmethane-4,4′-diamine, was found in these samples. PMID:24804074

  16. Investigation of the exposure level of electromagnetic fields produced by mobile telephone base stations

    International Nuclear Information System (INIS)

    Abukassem, I.; Kharita, M.H.

    2011-01-01

    The electromagnetic field levels around a sample of mobile phone base stations were investigated, covering residential zones of Damascus and its environs. Measurements were made according to the emission direction and the environment of the studied positions. Results showed that the signal level at all measured points is lower than the International Commission on Non-Ionizing Radiation Protection (ICNIRP) restriction level, although at a few measurement points the detected microwave level was relatively high. The signal level inside buildings situated partially in the emission direction of the base station transmitters decreases stepwise, and walls reduce the signal intensity considerably. This study showed the importance of transparent collaboration between research laboratories and mobile phone companies in order to improve the level of protection. (author)

  17. MR Spectroscopy: Real-Time Quantification of in-vivo MR Spectroscopic data

    OpenAIRE

    Massé, Kunal

    2009-01-01

    In the last two decades, magnetic resonance spectroscopy (MRS) has enjoyed increasing success in biomedical research. This technique can discern several metabolites in human tissue non-invasively and thus offers a multitude of medical applications. In clinical routine, quantification plays a key role in the evaluation of the different chemical elements. The quantification of metabolites characterizing specific pathologies helps physicians establish the patient's diagnosis. E...

  18. Structures, values, and interaction in field-level partnerships: the case of UNHCR and NGOs

    NARCIS (Netherlands)

    Mommers, C.; Wessel, van M.G.J.

    2009-01-01

    This article discusses the process of transforming partnership from a conceptual framework into a practical, operational framework for field-level interaction among humanitarian organisations. The authors approach this transformation from the perspective of the core values of the partnership concept

  19. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics

    Science.gov (United States)

    Jackson, Eric S.; Tiede, Mark; Riley, Michael A.; Whalen, D. H.

    2016-01-01

    Purpose: Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach--recurrence quantification analysis (RQA)--via a procedural example…
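
    The simplest RQA measure, the recurrence rate, can be sketched without embedding or normalization (the study itself works on multidimensional kinematic trajectories; this scalar version is only illustrative):

```python
def recurrence_rate(series, radius):
    """Fraction of sample pairs (i, j) closer than `radius`: the
    density of points in the recurrence plot of the series."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(n)
               if abs(series[i] - series[j]) <= radius)
    return hits / (n * n)

rr = recurrence_rate([0.0, 10.0, 20.0], radius=0.5)  # only self-matches: 3/9
```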

  20. 50 Hz electric field effects on protein carbonyl (PCO), heme oxygenase-1 (HO-1) and hydroxyproline levels

    International Nuclear Information System (INIS)

    Ozgur, Elcin; Goknur, Guler; Seyhan, Nesrin

    2008-01-01

    Full text: Non-ionizing electromagnetic field (EMF) radiation sources, such as power lines and other extremely low frequency (ELF) sources, have become one of the most ubiquitous components of the human environment, and the possibility that they may have hazardous effects on human health is a major public concern. Although it is well documented that EMFs have biological effects, the degree to which these exposures constitute a human health hazard is not yet clear. The relation between electrical stimuli and the oxidative stress produced by reactive oxygen species, as well as the protective effects of antioxidant treatment, has been addressed in many studies. In this study, we aimed to determine both protein oxidation and collagen levels under 50 Hz, 12 kV/m vertical electric (E) field exposure, with and without administration of N-acetylcysteine (NAC), a well-known antioxidant. To this end, protein carbonyl (PCO) levels, a biomarker of oxidative stress, and heme oxygenase-1 (HO-1), an enzyme that catalyzes the degradation of heme, were analyzed to assess protein oxidation. The hydroxyproline level, a major component of the protein collagen, was measured to express the level of collagen in lung tissue. Guinea pigs weighing 250-300 g were used in the study. A total of forty male guinea pigs were randomly divided into four groups of 10 animals each: 1) Group I (sham); 2) Group II (NAC-administered); 3) Group III (E field exposure); 4) Group IV (NAC-administered + E field exposed). A one-week exposure period of 8 hours daily, from 9 a.m. to 5 p.m., was conducted for each exposure group (Groups III and IV). After the last exposure day, the guinea pigs were anesthetized by injection of ketamine and xylazine and killed by decapitation. Statistical analyses were carried out using SPSS software (SPSS 11.5 for windows

  1. The influence of organic materials on the near field of an intermediate level radioactive waste repository

    International Nuclear Information System (INIS)

    Wilkins, J.D.

    1988-01-01

    The influence of organic materials which are present in some intermediate level wastes on the chemistry of the near field of a radioactive waste repository is discussed. Particular attention is given to the possible formation of water soluble complexing agents as a result of the radiation field and chemical conditions. The present state of the research is reviewed. (author)

  2. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  3. Swift Quantification of Fenofibrate and Tiemonium methylsulfate Active Ingredients in Solid Drugs Using Particle Induced X-Ray Emission

    International Nuclear Information System (INIS)

    Bejjani, A.; Nsouli, B.; Zahraman, K.; Assi, S.; Younes, Gh.; Yazbi, F.

    2011-01-01

    The quantification of active ingredients (AI) in drugs is a crucial and important step in the drug quality control process. This is usually performed using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. However, if the active ingredient contains specific heteroatoms (F, S, Cl), elemental IBA techniques like PIXE and PIGE, using a small tandem accelerator of 1-2 MV, can be explored for molecular quantification. IBA techniques permit analysis of the sample in solid form, without laborious sample preparation. In this work, we demonstrate the ability of the thick-target PIXE technique to quantify rapidly and accurately both low and high concentrations of active ingredients in different commercial drugs. Fenofibrate, a chlorinated active ingredient present in high amounts in two different commercial drugs, was quantified using the relative approach with an external standard. Tiemonium methylsulfate, which exists in relatively low amounts in commercial drugs, was quantified using the GUPIX simulation code (absolute quantification). The experimental aspects related to the validity of the quantification (use of external standards, absolute quantification, matrix effect,...) are presented and discussed. (author)

  4. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of its quantification have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.

  5. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  6. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  7. A method to quantify infectious airborne pathogens at concentrations below the threshold of quantification by culture

    Science.gov (United States)

    Cutler, Timothy D.; Wang, Chong; Hoff, Steven J.; Zimmerman, Jeffrey J.

    2013-01-01

    In aerobiology, dose-response studies are used to estimate the risk of infection to a susceptible host presented by exposure to a specific dose of an airborne pathogen. In the research setting, host- and pathogen-specific factors that affect the dose-response continuum can be accounted for by experimental design, but the requirement to precisely determine the dose of infectious pathogen to which the host was exposed is often challenging. By definition, quantification of viable airborne pathogens is based on the culture of micro-organisms, but some airborne pathogens are transmissible at concentrations below the threshold of quantification by culture. In this paper we present an approach to the calculation of exposure dose at microbiologically unquantifiable levels using an application of the “continuous-stirred tank reactor (CSTR) model” and the validation of this approach using rhodamine B dye as a surrogate for aerosolized microbial pathogens in a dynamic aerosol toroid (DAT). PMID:24082399
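
    The CSTR model invoked above treats the aerosol chamber as a well-mixed volume with constant inflow and outflow, so concentration follows a first-order mass balance. A sketch under those assumptions (parameter values are illustrative, not from the paper):

```python
import math

def cstr_concentration(t, source_rate, flow_rate, volume, c0=0.0):
    """Well-mixed chamber mass balance dC/dt = S/V - (Q/V)*C, whose
    solution is C(t) = S/Q + (C0 - S/Q) * exp(-Q*t/V); S is the
    pathogen emission rate, Q the ventilation rate, V the volume."""
    c_ss = source_rate / flow_rate
    return c_ss + (c0 - c_ss) * math.exp(-flow_rate * t / volume)

# Concentration approaches the steady state S/Q as t grows
c_late = cstr_concentration(t=1e6, source_rate=2.0, flow_rate=4.0, volume=10.0)
```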

  8. A novel immunological assay for hepcidin quantification in human serum.

    Directory of Open Access Journals (Sweden)

    Vasiliki Koliaraki

    Full Text Available BACKGROUND: Hepcidin is a 25-aminoacid cysteine-rich iron-regulating peptide. Increased hepcidin concentrations lead to iron sequestration in macrophages, contributing to the pathogenesis of anaemia of chronic disease, whereas decreased hepcidin is observed in iron deficiency and primary iron overload diseases such as hereditary hemochromatosis. Hepcidin quantification in human blood or urine may provide further insights into the pathogenesis of disorders of iron homeostasis and might prove a valuable tool for clinicians in the differential diagnosis of anaemia. This study describes a specific and non-operator-demanding immunoassay for hepcidin quantification in human sera. METHODS AND FINDINGS: An ELISA assay was developed for measuring hepcidin serum concentration using a recombinant hepcidin25-His peptide and a polyclonal antibody against this peptide, which was able to identify native hepcidin. The ELISA assay had a detection range of 10-1500 microg/L and a detection limit of 5.4 microg/L. The intra- and interassay coefficients of variance ranged from 8-15% and 5-16%, respectively. Mean linearity and recovery were 101% and 107%, respectively. Mean hepcidin levels were significantly lower in 7 patients with juvenile hemochromatosis (12.8 microg/L) and 10 patients with iron deficiency anemia (15.7 microg/L) and higher in 7 patients with Hodgkin lymphoma (116.7 microg/L) compared to 32 age-matched healthy controls (42.7 microg/L). CONCLUSIONS: We describe a new simple ELISA assay for measuring hepcidin in human serum with sufficient accuracy and reproducibility.

  9. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    Science.gov (United States)

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from non-labeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for high-throughput quantification of lignin in milled biomass samples and directly and simultaneously provides insight into the structural features of lignin.
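
    The 13C-IS approach boils down to the classic internal-standard equation: the analyte amount follows from the analyte/IS response ratio, the spiked IS amount and an RRF. A hedged sketch (peak areas and the RRF value are invented for illustration):

```python
def analyte_amount(area_analyte, area_is, amount_is, rrf=1.0):
    """Internal-standard quantification: the amount of analyte equals
    the analyte/IS peak-area ratio times the spiked IS amount,
    corrected by the relative response factor (RRF) of analyte vs IS."""
    return (area_analyte / area_is) * amount_is / rrf

# Hypothetical py-GC-SIM-MS peak areas, 0.10 mg of 13C-IS spiked in
lignin_mg = analyte_amount(area_analyte=5.0e6, area_is=2.0e6,
                           amount_is=0.10, rrf=1.2)
```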

  10. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

    Full Text Available The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources, which are essential for life. Therefore, it is necessary to evaluate the residual effect of these substances in water sources. A simple, affordable and accessible electrochemical method for Paraquat and Glyphosate quantification in water was developed. The study was conducted using Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, Ag/AgCl as the reference electrode, and platinum as the auxiliary electrode. Differential pulse voltammetry (DPV) methods for both compounds were validated. The linearity of the methods presented correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for Paraquat and 40 and 50 mg/L for glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in the quantification of these analytes. In the samples tested, Paraquat values were between 0.011 and 1.572 mg/L and glyphosate values between 0.201 and 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems for human health.
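
    Detection and quantification limits like those reported above are commonly derived from the calibration line (ICH-style: LOD = 3.3 s/b and LOQ = 10 s/b, with b the slope and s the residual standard deviation). A sketch with made-up calibration data, not the authors' values:

```python
def calibration_lod_loq(conc, signal):
    """Ordinary least-squares calibration line plus ICH-style limits:
    returns (slope, intercept, LOD, LOQ) with LOD = 3.3*s/b and
    LOQ = 10*s/b, s being the residual standard deviation."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, signal)) / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, signal)]
    s = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return slope, intercept, 3.3 * s / slope, 10.0 * s / slope

# Noise-free synthetic standards: signal = 2*conc + 1
slope, intercept, lod, loq = calibration_lod_loq([0, 1, 2, 3, 4],
                                                 [1, 3, 5, 7, 9])
```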

  11. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results of an identification analysis. In this study, the quantities of mitochondrial and nuclear DNA were determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content, particularly between the root and the shaft of plucked hairs. Large intra- and inter-individual variations were also found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA while others showed no detectable nuclear DNA and only limited amounts of mitochondrial DNA. Using this sensitive real-time PCR quantification assay, a better understanding was obtained of the DNA content and variation in commonly analysed forensic evidence materials, which may guide the forensic scientist to the best molecular biology approach for analysing various forensic evidence materials.
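
    Real-time PCR quantification of this kind rests on a standard curve relating the quantification cycle to starting copy number. A sketch of that relationship (slope and intercept values are hypothetical; a slope near -3.32 corresponds to 100% amplification efficiency):

```python
def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve Cq = slope*log10(N0) + intercept to
    recover the starting copy number N0 from an observed Cq."""
    return 10.0 ** ((cq - intercept) / slope)

# Hypothetical curve: intercept 38.0 (Cq of a single copy), slope -3.32
n0 = copies_from_cq(cq=28.04, slope=-3.32, intercept=38.0)  # ~1000 copies
```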

  12. Two-dimensional atom localization via two standing-wave fields in a four-level atomic system

    International Nuclear Information System (INIS)

    Zhang Hongtao; Wang Hui; Wang Zhiping

    2011-01-01

    We propose a scheme for the two-dimensional (2D) localization of an atom in a four-level Y-type atomic system. By applying two orthogonal standing-wave fields, the atoms can be localized at some special positions, leading to the formation of sub-wavelength 2D periodic spatial distributions. The localization peak position and number as well as the conditional position probability can be controlled by the intensities and detunings of optical fields.

  13. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  14. Digital ELISA for the quantification of attomolar concentrations of Alzheimer's disease biomarker protein Tau in biological samples.

    Science.gov (United States)

    Pérez-Ruiz, Elena; Decrop, Deborah; Ven, Karen; Tripodi, Lisa; Leirs, Karen; Rosseels, Joelle; van de Wouwer, Marlies; Geukens, Nick; De Vos, Ann; Vanmechelen, Eugeen; Winderickx, Joris; Lammertyn, Jeroen; Spasic, Dragana

    2018-07-26

    The close correlation between Tau pathology and Alzheimer's disease (AD) progression makes this protein a suitable biomarker for diagnosis and monitoring of the disorder's evolution. However, the use of Tau in diagnostics has been hampered, as it currently requires collection of cerebrospinal fluid (CSF), which is an invasive clinical procedure. Although measuring Tau levels in blood plasma would be favorable, the concentrations are below the detection limit of a conventional ELISA. In this work, we developed a digital ELISA for the quantification of attomolar protein Tau concentrations in both buffer and biological samples. Individual Tau molecules were first captured on the surface of magnetic particles using in-house developed antibodies and subsequently isolated into the femtoliter-sized wells of a 2 × 2 mm² microwell array. Combination of high-affinity antibodies, optimal assay conditions and a digital quantification approach resulted in a 24 ± 7 aM limit of detection (LOD) in buffer samples. Additionally, a dynamic range of 6 orders of magnitude was achieved by combining the digital readout with an analogue approach, allowing quantification from attomolar to picomolar levels of Tau using the same platform. This proves the compatibility of the presented assay with the wide range of Tau concentrations encountered in different biological samples. Next, the developed digital assay was applied to detect total Tau levels in spiked blood plasma. A similar LOD (55 ± 29 aM) was obtained compared to the buffer samples, which was 5000-fold more sensitive than commercially available ELISAs and even outperformed previously reported digital assays with a 10-fold increase in sensitivity. Finally, the performance of the developed digital ELISA was assessed by quantifying protein Tau in three clinical CSF samples. Here, a high correlation (i.e. Pearson coefficient of 0.99) was found between the measured percentage of active particles and the reference protein Tau
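    Digital ELISA readouts of this kind count the fraction of "active" femtoliter wells and convert it to a concentration through Poisson statistics. A minimal sketch with illustrative numbers, not the paper's data:

```python
import math

# Digital-counting sketch: when molecules are partitioned randomly across many
# wells, the fraction f of "active" wells gives the mean molecules per well
# via Poisson statistics, lambda = -ln(1 - f). Numbers are illustrative.

def molecules_per_well(active_fraction):
    """Mean molecules per well implied by the active-well fraction."""
    if not 0 <= active_fraction < 1:
        raise ValueError("fraction must be in [0, 1)")
    return -math.log(1.0 - active_fraction)

def total_molecules(active_wells, total_wells):
    """Estimated total molecules across the whole array."""
    return molecules_per_well(active_wells / total_wells) * total_wells

# e.g. 500 active wells out of 50,000 imaged wells
print(f"mean molecules/well = {molecules_per_well(500 / 50000):.5f}")
print(f"estimated molecules = {total_molecules(500, 50000):.1f}")
```

At low active fractions the estimate is close to a direct count (one molecule per active well); the logarithm corrects for wells that happen to hold more than one molecule as the fraction grows, which is what extends the dynamic range toward the analogue regime.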

  15. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    Science.gov (United States)

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic use. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure the viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in the production cell culture and demonstration of sufficient elimination of such particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10^6 particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell
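    The pRNA-to-particle conversion described above is simple arithmetic; a sketch with hypothetical sample values:

```python
# Converting a TaqMan pRNA copy-number result into a retrovirus-like particle
# concentration, using the two-copies-per-particle stoichiometry stated in
# the abstract. The sample values are hypothetical.

COPIES_PER_PARTICLE = 2  # each particle carries two genomic pRNA copies

def particles_per_ml(prna_copies, sample_volume_ml):
    """Particle concentration from total measured pRNA copies."""
    return prna_copies / COPIES_PER_PARTICLE / sample_volume_ml

# e.g. 1.2e6 pRNA copies measured in a 0.1 ml culture sample
print(f"{particles_per_ml(1.2e6, 0.1):.2e} particles/ml")  # -> 6.00e+06
```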

  16. Preferred sound levels of portable music players and listening habits among adults: a field study.

    Science.gov (United States)

    Kähäri, Kim R; Aslund, T; Olsson, J

    2011-01-01

    The main purpose of this descriptive field study was to explore music listening habits and preferred listening levels with portable music players (PMPs). We were also interested in seeing whether any exposure differences could be observed between the sexes. Data were collected during 12 hours at Stockholm Central Station, where people passing by were invited to measure their preferred PMP listening level by using a KEMAR manikin. People were also asked to answer a questionnaire about their listening habits. In all, 60 persons (41 men and 19 women) took part in the questionnaire study and 61 preferred PMP levels to be measured. Forty-one of these sound level measurements were valid to be reported after consideration was taken to acceptable measuring conditions. The women (31 years) and the men (33 years) started to use PMPs on a regular basis in their early 20s. Ear canal headphones/ear buds were the preferred headphone types. Fifty-seven percent of the whole study population used their PMP on a daily basis. The measured LAeq60 sec levels corrected for free field ranged between 73 and 102 dB, with a mean value of 83 dB. Sound levels for different types of headphones are also presented. The results of this study indicate that there are two groups of listeners: people who listen less frequently and at lower, safer sound levels, and people with excessive listening habits that may indeed damage their hearing sensory organ in time.
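    The LAeq values reported are energy-equivalent levels, so combining or averaging them must be done on an intensity basis rather than on the raw dB values. A small sketch with illustrative levels:

```python
import math

# Energy-averaging of A-weighted levels: the equivalent level over equal-length
# segments is 10*log10(mean of 10^(L/10)), not the arithmetic mean of the dB
# values. The example levels are illustrative, not the study's raw data.

def laeq(levels_db):
    """Equivalent continuous level of equal-duration segments, in dB."""
    mean_energy = sum(10 ** (l / 10) for l in levels_db) / len(levels_db)
    return 10 * math.log10(mean_energy)

segments = [73, 83, 102]  # hypothetical per-listener LAeq values in dB(A)
print(f"arithmetic mean = {sum(segments) / len(segments):.1f} dB")
print(f"energy mean     = {laeq(segments):.1f} dB")
```

The energy mean is dominated by the loudest segment, which is why a single listener at 102 dB pulls the equivalent level far above the 86 dB arithmetic mean.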

  17. Preferred sound levels of portable music players and listening habits among adults: A field study

    Directory of Open Access Journals (Sweden)

    Kim R Kahari

    2011-01-01

    Full Text Available The main purpose of this descriptive field study was to explore music listening habits and preferred listening levels with portable music players (PMPs). We were also interested in seeing whether any exposure differences could be observed between the sexes. Data were collected during 12 hours at Stockholm Central Station, where people passing by were invited to measure their preferred PMP listening level by using a KEMAR manikin. People were also asked to answer a questionnaire about their listening habits. In all, 60 persons (41 men and 19 women) took part in the questionnaire study and 61 preferred PMP levels to be measured. Forty-one of these sound level measurements were valid to be reported after consideration was taken to acceptable measuring conditions. The women (31 years) and the men (33 years) started to use PMPs on a regular basis in their early 20s. Ear canal headphones/ear buds were the preferred headphone types. Fifty-seven percent of the whole study population used their PMP on a daily basis. The measured LAeq60 sec levels corrected for free field ranged between 73 and 102 dB, with a mean value of 83 dB. Sound levels for different types of headphones are also presented. The results of this study indicate that there are two groups of listeners: people who listen less frequently and at lower, safer sound levels, and people with excessive listening habits that may indeed damage their hearing sensory organ in time.

  18. 2D histomorphometric quantification from 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, Inaya; Oliveira, Luis Fernando de; Lopes, Ricardo T.; Jesus, Edgar Francisco O. de; Alves, Jose Marcos

    2002-01-01

    In the present article, preliminary results are presented showing the application of the three-dimensional computerized microtomography technique (3D-μCT) to bone tissue characterization, through histomorphometric quantification based on stereologic concepts. Two samples of human bone were prepared and submitted to the tomographic system, a radiographic setup with a microfocus X-ray tube. Through the three processing steps of acquisition, reconstruction and quantification, it was possible to obtain good results, coherent with the literature data. The next step is to compare these results with those obtained by the conventional method, i.e. conventional histomorphometry. (author)

  19. Education and childlessness: The relationship between educational field, educational level, and childlessness among Swedish women born in 1955-59

    Directory of Open Access Journals (Sweden)

    Gerda Neyer

    2006-05-01

    Full Text Available In this paper we extend the concept of educational attainment to cover the field of education taken in addition to the conventional level of education attained. Our empirical investigation uses register records containing the childbearing and educational histories of an entire cohort of women born in Sweden (about a quarter-million individuals). This allows us to operate with a high number of educational field-and-level combinations (some sixty in all). It turns out that the field of education serves as an indicator of a woman's potential reproductive behavior better than the mere level attained. We discover that in each field permanent childlessness increases somewhat with the educational level, but that the field itself is the more important. In general, we find that women educated for jobs in teaching and health care are in a class of their own, with much lower permanent childlessness at each educational level than in any other major grouping. Women educated in arts and humanities or for religious occupations have unusually high fractions permanently childless. Our results cast doubt on the assumption that higher education per se must result in higher childlessness. In our opinion, several factors intrinsic and extrinsic to an educational system (such as its flexibility, its gender structure, and the manner in which education is hooked up to the labor market) may influence the relationship between education and childlessness, and we would not expect a simple, unidirectional relationship.

  20. Quantification of residual solvents in antibody drug conjugates using gas chromatography

    Energy Technology Data Exchange (ETDEWEB)

    Medley, Colin D., E-mail: medley.colin@gene.com [Genentech Inc., Small Molecule Pharmaceutical Sciences, 1 DNA Way, South San Francisco, CA 94080 (United States); Kay, Jacob [Research Pharmaceutical Services, 520 Virginia Dr. Fort, Washington, PA (United States); Li, Yi; Gruenhagen, Jason; Yehl, Peter; Chetwyn, Nik P. [Genentech Inc., Small Molecule Pharmaceutical Sciences, 1 DNA Way, South San Francisco, CA 94080 (United States)

    2014-11-19

    Highlights: • Sensitive residual solvents detection in ADCs. • 125 ppm QL for common conjugation solvents. • Generic and validatable method. - Abstract: The detection and quantification of residual solvents present in clinical and commercial pharmaceutical products is necessary from both patient safety and regulatory perspectives. Head-space gas chromatography is routinely used for quantitation of residual solvents for small molecule APIs produced through synthetic processes; however, residual solvent analysis is generally not needed for protein-based pharmaceuticals produced through cultured cell lines, where solvents are not introduced. In contrast, antibody drug conjugates and other protein conjugates, where a drug or other molecule is covalently bound to a protein, typically use solvents such as N,N-dimethylacetamide (DMA), N,N-dimethylformamide (DMF), dimethyl sulfoxide (DMSO), or propylene glycol (PG) to dissolve the hydrophobic small molecule drug for conjugation to the protein. The levels of solvent remaining after the conjugation step are therefore important to patient safety, as these parenteral drug products are introduced directly into the patient's bloodstream. We have developed a rapid sample preparation followed by a gas chromatography separation for the detection and quantification of several solvents typically used in these conjugation reactions. This generic method has been validated and can be easily implemented for use in quality control testing of clinical or commercial bioconjugated products.

  1. GMO detection using a bioluminescent real time reporter (BART) of loop mediated isothermal amplification (LAMP) suitable for field use

    Directory of Open Access Journals (Sweden)

    Kiddle Guy

    2012-04-01

    Full Text Available Abstract Background There is an increasing need for quantitative technologies suitable for molecular detection in a variety of settings, for applications including food traceability and monitoring of genetically modified (GM) crops and their products through the food processing chain. Conventional molecular diagnostics utilising real-time polymerase chain reaction (RT-PCR) and fluorescence-based determination of amplification require temperature cycling and relatively complex optics. In contrast, isothermal amplification coupled to a bioluminescent output produced in real-time (BART) occurs at a constant temperature and only requires a simple light detection and integration device. Results Loop mediated isothermal amplification (LAMP) shows robustness to sample-derived inhibitors. Here we show the applicability of coupled LAMP and BART reactions (LAMP-BART) for determination of genetically modified (GM) maize target DNA at low levels of contamination (0.1-5.0% GM) using certified reference material, and compare this to RT-PCR. Results show that conventional DNA extraction methods developed for PCR may not be optimal for LAMP-BART quantification. Additionally, we demonstrate that LAMP is more tolerant to plant sample-derived inhibitors, and show this can be exploited to develop rapid extraction techniques suitable for simple field-based qualitative tests for GM status determination. We also assess the effect of total DNA assay load on LAMP-BART quantitation. Conclusions LAMP-BART is an effective and sensitive technique for GM detection with significant potential for quantification even at low levels of contamination and in samples derived from crops such as maize with a large genome size. The resilience of LAMP-BART to acidic polysaccharides makes it well suited to rapid sample preparation techniques and hence to both high throughput laboratory settings and to portable GM detection applications. The impact of the plant sample matrix and genome loading

  2. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which point to potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Intermediate bands versus levels in non-radiative recombination

    International Nuclear Information System (INIS)

    Luque, Antonio; Marti, Antonio; Antolin, Elisa; Tablero, Cesar

    2006-01-01

    There is a practical interest in developing semiconductors with levels situated within their band gap while preventing the non-radiative recombination that these levels promote. In this paper, the physical causes of this non-radiative recombination are analyzed and the increase in the density of the impurities responsible for the mid-gap levels to the point of forming bands is suggested as the means of suppressing the recombination. Simple models supporting this recommendation and helping in its quantification are presented

  4. Metal quantification in water and sediment samples of billings reservoir by SR-TXRF

    International Nuclear Information System (INIS)

    Sampaio, Sergio Arnaud; Moreira, Silvana; Vives, Ana Elisa Sirito de

    2007-01-01

    Billings is the largest water reservoir of the metropolitan Sao Paulo area, with approximately 100 km² of water surface. Its hydrographic basin occupies more than 500 km² across six cities. The region concentrates the largest industrial park of South America, and the reservoir margins alone are occupied by almost a million inhabitants. The quality of its waters is therefore a constant concern of the whole society. In this work, Synchrotron Radiation Total Reflection X-Ray Fluorescence (SR-TXRF) is applied to the identification and quantification of metals in waters and sediments of the Billings dam. The metal levels found were compared with the maximum permissible limits established by Brazilian legislation. In its social context, the purpose of the work is to contribute to the preservation of the local springs and the rational use of their waters. For the field work, 19 collection points were chosen, including the margins and the central portion of the dam, following criteria similar to those adopted by the Company of Technology of Environmental Sanitation of Sao Paulo State (CETESB). The water and sediment samples, as well as the certified and standard samples, were analyzed at the Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP, Brazil. Results indicate that the water and the sediments of the reservoir have concentrations above the legal limits. (author)
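    The concluding comparison against legislated limits amounts to a simple screening step. In the sketch below, the element names, limit values and concentrations are placeholders, not the paper's SR-TXRF results or the actual Brazilian limits:

```python
# Sketch of the limit-comparison step: flag measured concentrations that
# exceed regulatory limits. Elements, limits and sample values are
# hypothetical placeholders, not the paper's data.

LIMITS_MG_L = {"Cr": 0.05, "Ni": 0.025, "Pb": 0.01}  # illustrative limits

def exceedances(measured):
    """Return {element: measured/limit} for elements above their limit."""
    return {el: round(c / LIMITS_MG_L[el], 2)
            for el, c in measured.items()
            if el in LIMITS_MG_L and c > LIMITS_MG_L[el]}

sample = {"Cr": 0.02, "Ni": 0.075, "Pb": 0.013}  # mg/L, hypothetical
print(exceedances(sample))  # elements above limit, as multiples of the limit
```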

  5. Metal quantification in water and sediment samples of billings reservoir by SR-TXRF

    Energy Technology Data Exchange (ETDEWEB)

    Sampaio, Sergio Arnaud; Moreira, Silvana [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo]. E-mails: silvana@fec.unicamp.br; sergioarnaud@hotmail.com; Vives, Ana Elisa Sirito de [Universidade Metodista de Piracicaba (UNIMEP), Santa Barbara D' Oeste, SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo]. E-mail: aesvives@unimep.br

    2007-07-01

    Billings is the largest water reservoir of the metropolitan Sao Paulo area, with approximately 100 km{sup 2} of water surface. Its hydrographic basin occupies more than 500 km{sup 2} across six cities. The region concentrates the largest industrial park of South America, and the reservoir margins alone are occupied by almost a million inhabitants. The quality of its waters is therefore a constant concern of the whole society. In this work, Synchrotron Radiation Total Reflection X-Ray Fluorescence (SR-TXRF) is applied to the identification and quantification of metals in waters and sediments of the Billings dam. The metal levels found were compared with the maximum permissible limits established by Brazilian legislation. In its social context, the purpose of the work is to contribute to the preservation of the local springs and the rational use of their waters. For the field work, 19 collection points were chosen, including the margins and the central portion of the dam, following criteria similar to those adopted by the Company of Technology of Environmental Sanitation of Sao Paulo State (CETESB). The water and sediment samples, as well as the certified and standard samples, were analyzed at the Brazilian Synchrotron Light Laboratory (LNLS), Campinas, SP, Brazil. Results indicate that the water and the sediments of the reservoir have concentrations above the legal limits. (author)

  6. Information entropy of a time-dependent three-level trapped ion interacting with a laser field

    International Nuclear Information System (INIS)

    Abdel-Aty, Mahmoud

    2005-01-01

    Trapped and laser-cooled ions are increasingly used for a variety of modern high-precision experiments, frequency standard applications and quantum information processing. Therefore, in this communication we present a comprehensive analysis of the pattern of information entropy arising in the time evolution of an ion interacting with a laser field. A general analytic approach is proposed for a three-level trapped-ion system in the presence of time-dependent couplings. By working out an exact analytic solution, we conclusively analyse the general properties of the von Neumann entropy and the quantum information entropy. It is shown that the information entropy is affected strongly by the time-dependent coupling and exhibits long-time periodic oscillations. This feature is attributed to the fact that, in the time-dependent regime, the Rabi oscillation is itself time dependent. Using parameters corresponding to a specific three-level ionic system, a single beryllium ion in an RF (Paul) trap, we obtain illustrative examples of some novel aspects of this system in the dynamical evolution. Our results establish an explicit relation between the exact information entropy and the entanglement between the multi-level ion and the laser field. We show that different nonclassical effects arise in the dynamics of the ionic population inversion, depending on the initial states of the vibrational motion/field and on the values of the Lamb-Dicke parameter η

  7. Review of advances in human reliability analysis of errors of commission-Part 2: EOC quantification

    International Nuclear Information System (INIS)

    Reer, Bernhard

    2008-01-01

    In close connection with examples relevant to contemporary probabilistic safety assessment (PSA), a review of advances in human reliability analysis (HRA) of post-initiator errors of commission (EOCs), i.e. inappropriate actions under abnormal operating conditions, has been carried out. The review comprises both EOC identification (part 1) and quantification (part 2); part 2 is presented in this article. Emerging HRA methods in this field are: ATHEANA, MERMOS, the EOC HRA method developed by Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), the MDTA method and CREAM. The essential advanced features are on the conceptual side, especially to envisage the modeling of multiple contexts for an EOC to be quantified (ATHEANA, MERMOS and MDTA), in order to explicitly address adverse conditions. There is promising progress in providing systematic guidance to better account for cognitive demands and tendencies (GRS, CREAM), and EOC recovery (MDTA). Problematic issues are associated with the implementation of multiple context modeling and the assessment of context-specific error probabilities. Approaches for task or error opportunity scaling (CREAM, GRS) and the concept of reference cases (ATHEANA outlook) provide promising orientations for achieving progress towards data-based quantification. Further development work is needed and should be carried out in close connection with large-scale applications of existing approaches

  8. HPLC for simultaneous quantification of total ceramide, glucosylceramide, and ceramide trihexoside concentrations in plasma

    NARCIS (Netherlands)

    Groener, Johanna E. M.; Poorthuis, Ben J. H. M.; Kuiper, Sijmen; Helmond, Mariette T. J.; Hollak, Carla E. M.; Aerts, Johannes M. F. G.

    2007-01-01

    BACKGROUND: Simple, reproducible assays are needed for the quantification of sphingolipids, ceramide (Cer), and sphingoid bases. We developed an HPLC method for simultaneous quantification of total plasma concentrations of Cer, glucosylceramide (GlcCer), and ceramide trihexoside (CTH). METHODS:

  9. Development of a salt-tolerant interface for a high performance liquid chromatography/inductively coupled plasma mass spectrometry system and its application to accurate quantification of DNA samples.

    Science.gov (United States)

    Takasaki, Yuka; Sakagawa, Shinnosuke; Inagaki, Kazumi; Fujii, Shin-Ichiro; Sabarudin, Akhmad; Umemura, Tomonari; Haraguchi, Hiroki

    2012-02-03

    Accurate quantification of DNA is highly important in various fields. Determination of phosphorus by ICP-MS is one of the most effective methods for accurate quantification of DNA, due to the fixed stoichiometry of phosphate in this molecule. In this paper, a smart and reliable method for accurate quantification of DNA fragments and oligodeoxythymidylic acids by hyphenated HPLC/ICP-MS equipped with a highly efficient interface device is presented. The interface was constructed from a home-made capillary-attached micronebulizer and a temperature-controllable cyclonic spray chamber (IsoMist). As the separation column for DNA samples, a home-made methacrylate-based weak anion-exchange monolith was employed. Several parameters, including the composition of the mobile phase, the gradient program, the inner and outer diameters of the capillary, and the temperature of the spray chamber, were optimized to find the best performance for separation and accurate quantification of DNA samples. The proposed system offers many advantages, such as total sample consumption for small-volume analysis, salt tolerance for hyphenated analysis, and high accuracy and precision for quantitative analysis. Using this system, samples of a 20 bp DNA ladder (20, 40, 60, 80, 100, 120, 140, 160, 180, 200, 300, 400, 500 base pairs) and oligodeoxythymidylic acids (dT(12-18)) were rapidly separated and accurately quantified. Copyright © 2011 Elsevier B.V. All rights reserved.
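    The phosphorus-based approach works because every nucleotide carries exactly one phosphorus atom, so moles of P equal moles of nucleotides. A back-of-the-envelope conversion using common rule-of-thumb average masses, not values from the paper:

```python
# Phosphorus-to-DNA conversion sketch: one P atom per nucleotide, so moles of
# measured P equal moles of nucleotides. The average nucleotide mass used
# (~330 g/mol, i.e. ~660 g/mol per base pair of dsDNA) is a common rule of
# thumb, not a value taken from the cited paper.

P_MOLAR_MASS = 30.974   # g/mol
AVG_NT_MASS = 330.0     # g/mol per nucleotide (rule of thumb)

def dna_ng_from_p(p_ng):
    """ng of DNA corresponding to a measured ng of phosphorus."""
    return p_ng / P_MOLAR_MASS * AVG_NT_MASS

print(f"{dna_ng_from_p(1.0):.2f} ng DNA per ng P")
```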

  10. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  11. Impacts of rising tropospheric ozone on photosynthesis and metabolite levels on field grown soybean.

    Science.gov (United States)

    Sun, Jindong; Feng, Zhaozhong; Ort, Donald R

    2014-09-01

    The response of leaf photosynthesis and metabolite profiles to ozone (O3) exposure ranging from 37 to 116 ppb was investigated in two soybean cultivars, Dwight and IA3010, in the field under fully open-air conditions. Leaf photosynthesis, total non-structural carbohydrates (TNC) and total free amino acids (TAA) decreased linearly with increasing O3 levels in both cultivars, with an average decrease of 7% per 10 ppb increase in O3. Ozone interacted with developmental stage and leaf age, causing greater damage at later reproductive stages and in older leaves. Ozone affected yield mainly via reduction of the maximum rate of Rubisco carboxylation (Vcmax) and the maximum rate of electron transport (Jmax), as well as a shorter growing season due to earlier onset of canopy senescence. For all parameters investigated, the critical O3 level (∼50 ppb) for detectable damage fell within O3 levels that occur routinely in soybean fields across the US and elsewhere in the world. Strong correlations were observed in O3-induced changes among yield, photosynthesis, TNC, TAA and many metabolites. The broad range of metabolites that showed an O3 dose-dependent effect is consistent with multiple interaction loci and thus multiple targets for improving the tolerance of soybean to O3. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
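    The reported dose-response (roughly 7% loss per 10 ppb above the ~50 ppb critical level) can be sketched as a piecewise-linear model. Treating the response as exactly linear above a sharp threshold is an illustrative simplification of the abstract's regression results:

```python
# Piecewise-linear dose-response sketch for the reported ~7% decline per
# 10 ppb O3 above a critical level. The ~50 ppb threshold and 7%/10 ppb slope
# come from the abstract; the sharp-threshold linear form is a simplification.

def relative_response(o3_ppb, threshold=50.0, decline_per_10ppb=0.07):
    """Fraction of the unstressed value remaining at a given O3 level."""
    excess = max(0.0, o3_ppb - threshold)
    return max(0.0, 1.0 - decline_per_10ppb * excess / 10.0)

for o3 in (37, 50, 80, 116):
    print(f"{o3:4d} ppb -> {relative_response(o3):.2f} of control")
```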

  12. Effect of a uniform magnetic field on dielectric two-phase bubbly flows using the level set method

    International Nuclear Information System (INIS)

    Ansari, M.R.; Hadidi, A.; Nimvari, M.E.

    2012-01-01

    In this study, the behavior of a single bubble in a dielectric viscous fluid under a uniform magnetic field has been simulated numerically using the Level Set method in two-phase bubbly flow. The two-phase bubbly flow was considered to be laminar and homogeneous. Deformation of the bubble was considered to be due to buoyancy and magnetic forces induced from the external applied magnetic field. A computer code was developed to solve the problem using the flow field, the interface of two phases, and the magnetic field. The Finite Volume method was applied using the SIMPLE algorithm to discretize the governing equations. Using this algorithm enables us to calculate the pressure parameter, which has been eliminated by previous researchers because of the complexity of the two-phase flow. The finite difference method was used to solve the magnetic field equation. The results outlined in the present study agree well with the existing experimental data and numerical results. These results show that the magnetic field affects and controls the shape, size, velocity, and location of the bubble. - Highlights: ►A bubble behavior was simulated numerically. ► A single bubble behavior was considered in a dielectric viscous fluid. ► A uniform magnetic field is used to study a bubble behavior. ► Deformation of the bubble was considered using the Level Set method. ► The magnetic field affects the shape, size, velocity, and location of the bubble.

  13. Dynamic evolution of double Λ five-level atom interacting with one-mode electromagnetic cavity field

    Science.gov (United States)

    Abdel-Wahab, N. H.; Salah, Ahmed

    2017-12-01

    In this paper, the model describing a double Λ five-level atom interacting with a single-mode electromagnetic cavity field in the off-resonance (non-resonant) case is studied. We obtained the constants of motion for the considered model. Also, the state vector of the system is obtained using the Schrödinger equation when the atom is initially prepared in its excited state. The dynamical evolution of the collapse-revival phenomenon, the antibunching of photons and field squeezing are investigated when the field is initially in a coherent state. The influence of the detuning parameters on these phenomena is investigated, and we find that the atom-field properties are sensitive to changes in the detuning parameters. The numerical simulations of these aspects are carried out using the Quantum Toolbox in Python (QuTiP).

  14. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic time-varying loading conditions. Robust structural dynamics identification procedures impose tight constraints on the quality of modal models...... represent different complexity levels ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade........ This paper aims at a systematic approach for uncertainty quantification of the parameters of modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures...

  15. LEVELS OF EXTREMELY LOW-FREQUENCY ELECTRIC AND MAGNETIC FIELDS FROM OVERHEAD POWER LINES IN THE OUTDOOR ENVIRONMENT OF RAMALLAH CITY-PALESTINE.

    Science.gov (United States)

    Abuasbi, Falastine; Lahham, Adnan; Abdel-Raziq, Issam Rashid

    2018-05-01

    In this study, levels of extremely low-frequency electric and magnetic fields originating from overhead power lines were investigated in the outdoor environment of Ramallah city, Palestine. Spot measurements were used to record field intensities over a 6-min period. The NF-5035 spectrum analyzer was used to perform measurements at 1 m above ground level, directly underneath 40 randomly selected power lines distributed fairly within the city. Levels of electric fields varied with the line's category (power line, transformer or distributor): a minimum mean electric field of 3.9 V/m was found under a distributor line, and a maximum of 769.4 V/m under a high-voltage power line (66 kV). The electric fields followed a log-normal distribution with a geometric mean of 35.9 V/m and a geometric standard deviation of 2.8 (dimensionless). Magnetic fields measured at power lines, in contrast, were not log-normally distributed; the minimum and maximum mean magnetic fields under power lines were 0.89 and 3.5 μT, respectively. None of the measured fields exceeded the ICNIRP guidelines recommended for general public exposure to extremely low-frequency fields.
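    Summary statistics of the kind reported above (geometric mean in V/m, dimensionless geometric standard deviation) come from averaging in log space. A minimal sketch, with invented spot values for illustration (not the paper's data):

    ```python
    import math

    def geometric_stats(values):
        """Geometric mean and geometric standard deviation of positive data."""
        logs = [math.log(v) for v in values]
        n = len(logs)
        mean_log = sum(logs) / n
        sd_log = math.sqrt(sum((x - mean_log) ** 2 for x in logs) / (n - 1))
        # GM carries the units (V/m); GSD is a dimensionless multiplicative factor
        return math.exp(mean_log), math.exp(sd_log)

    # Illustrative spot measurements in V/m (hypothetical, not the study's data)
    fields = [3.9, 12.0, 25.0, 40.0, 110.0, 330.0, 769.4]
    gm, gsd = geometric_stats(fields)
    ```

    For log-normal data, roughly two-thirds of measurements fall between GM/GSD and GM×GSD, which is why the pair summarizes such distributions better than an arithmetic mean and SD.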

  16. Parametric resonances in the amplitude-modulated probe-field absorption spectrum of a two-level atom driven by a resonance amplitude- and phase-modulated pumping field

    International Nuclear Information System (INIS)

    Sushilov, N.V.; Kholodkevich, E.D.

    1995-01-01

    An analytical expression is derived for the polarization induced by a weak probe field with periodically modulated amplitude in a two-level medium saturated by a strong amplitude- and phase-modulated resonance field. It is shown that the absorption spectrum of the probe field includes parametric resonances, with maxima corresponding to the condition δ = 2nΓ − Ω_w and minima to δ = (2n + 1)Γ − Ω_w, where δ is the probe-field detuning from the resonance frequency, Ω_w is the modulation frequency of the probe-field amplitude, Γ is the transition line width, and n = 1, 2, 3, …. At specific modulation parameters, a substantial region of negative values (i.e., a region of amplification without population inversion) exists in the absorption spectrum of the probe field

  17. A field studies and modeling approach to develop organochlorine pesticide and PCB total maximum daily load calculations: Case study for Echo Park Lake, Los Angeles, CA

    Energy Technology Data Exchange (ETDEWEB)

    Vasquez, V.R., E-mail: vrvasquez@ucla.edu [Environmental Science and Engineering Program, University of California, Los Angeles, Los Angeles, CA 90095-1496 (United States); Curren, J., E-mail: janecurren@yahoo.com [Environmental Science and Engineering Program, University of California, Los Angeles, Los Angeles, CA 90095-1496 (United States); Lau, S.-L., E-mail: simlin@ucla.edu [Department of Civil and Environmental Engineering, University of California, Los Angeles, Los Angeles, CA 90095-1496 (United States); Stenstrom, M.K., E-mail: stenstro@seas.ucla.edu [Department of Civil and Environmental Engineering, University of California, Los Angeles, Los Angeles, CA 90095-1496 (United States); Suffet, I.H., E-mail: msuffet@ucla.edu [Environmental Science and Engineering Program, University of California, Los Angeles, Los Angeles, CA 90095-1496 (United States)

    2011-09-01

    Echo Park Lake is a small lake in Los Angeles, CA listed on the USA Clean Water Act Section 303(d) list of impaired water bodies for elevated levels of organochlorine pesticides (OCPs) and polychlorinated biphenyls (PCBs) in fish tissue. A lake water and sediment sampling program was completed to support the development of total maximum daily loads (TMDL) to address the lake impairment. The field data indicated quantifiable levels of OCPs and PCBs in the sediments, but lake water data were all below detection levels. The field sediment data may explain the contaminant levels in fish tissue when combined with appropriate sediment-water partitioning coefficients and bioaccumulation factors. A partition-equilibrium fugacity model of the whole lake system was used to interpret the field data; it indicated that half of the total mass of the pollutants in the system is in the sediments and the other half is in soil, so soil erosion could be a significant pollutant transport mode into the lake. Modeling also indicated that developing and quantifying the TMDL depends significantly on the analytical detection level for the pollutants in field samples and on the choice of octanol-water partitioning coefficient and bioaccumulation factors for the model. - Research highlights: → Fugacity model using new OCP and PCB field data supports lake TMDL calculations. → OCP and PCB levels in lake sediment were found above levels for impairment. → Relationship between sediment data and available fish tissue data evaluated. → Model provides approximation of contaminant sources and sinks for a lake system. → Model results were sensitive to analytical detection and quantification levels.
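    With lake water below detection limits, the partitioning step of such a model can be illustrated with a rough equilibrium sketch. The simple linear relationship Kd = Koc · f_oc and all numbers below are illustrative assumptions, not the paper's calibrated fugacity model:

    ```python
    def dissolved_from_sediment(c_sed_ug_per_kg, log_koc, f_oc):
        """Estimate the equilibrium porewater concentration implied by a
        sediment measurement, via a linear partition coefficient
        Kd = Koc * f_oc (L/kg)."""
        kd_l_per_kg = (10.0 ** log_koc) * f_oc
        return c_sed_ug_per_kg / kd_l_per_kg   # ug/L

    # Hypothetical PCB-like inputs: 100 ug/kg in sediment, log Koc = 5.5,
    # 2% organic carbon fraction in the sediment
    c_water = dissolved_from_sediment(100.0, 5.5, 0.02)
    ```

    An estimate of this kind, landing in the low ng/L range, shows why the achievable analytical detection level in water governs whether the model can be checked against field samples.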

  18. Quantification of trace metals in water using complexation and filter concentration.

    Science.gov (United States)

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow quantification in the ppm range; however, proper pre-concentration of the colored complex on a paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques were examined and compared: filtering the already-complexed mixture, complexation on the filter, and dipping a dye-covered filter into the solution. The best quantification was based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The complex formations studied (Ni ions with TAN and Cd ions with PAN) involve the production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN was investigated and an optimum timing was found. Kinetic optimization with regard to some interferences has also been suggested.

  19. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    Science.gov (United States)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which monitors the Rayleigh scattering from the beads through real-time image processing. A significant difference in bead velocity was observed in the presence of as little as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, using polystyrene beads functionalized with antibodies against the target biomarkers.

  20. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 and Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Barnhart, Huiman [Department of Biostatistics and Bioinformatics, Duke University, Durham, North Carolina 27705 (United States); Richard, Samuel [Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 and Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Robins, Marthony [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Colsher, James [Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Samei, Ehsan [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Department of Biomedical Engineering, and Department of Electronic and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States)

    2013-11-15

    Purpose: Volume quantification of lung nodules with multidetector computed tomography (CT) images provides useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR; B: iNtuition) and analyzed for accuracy and precision. Results: Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted across dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of

  1. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    International Nuclear Information System (INIS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-01-01

    Purpose: Volume quantification of lung nodules with multidetector computed tomography (CT) images provides useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR; B: iNtuition) and analyzed for accuracy and precision. Results: Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted across dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of

  2. Anticipating and addressing workplace static magnetic field effects at levels <0.5 mT.

    Science.gov (United States)

    Emery, R J; Hopkins, J R; Charlton, M A

    2000-11-01

    Magnetic resonance, once a research tool limited to the basic sciences, has experienced an increase in popularity due to its unique ability to analyze certain living systems in vivo. Expanding applications in the biomedical sciences have resulted in magnetic sources being located in research institutions nationally. Space and resource limitations sometimes necessitate siting magnetic resonance units in proximity to other institutional operations. For magnetic field shielding and personnel protection considerations, the generally accepted 0.5 mT (millitesla) limit for implanted cardiac devices is commonly used as the conservative basis for decisions. But the effects of magnetic fields on equipment can be easily observed at levels far below 0.5 mT, often resulting in concern and apprehension on the part of personnel in the surrounding areas. By responding to recurrent worker concerns spawned by noticeable effects on equipment at exposure levels below 0.5 mT, practicing radiation safety professionals can better anticipate facility incompatibility issues and improve their responses to worker concerns initiated by observed effects on equipment.

  3. Sensitive Quantification of Aflatoxin B1 in Animal Feeds, Corn Feed Grain, and Yellow Corn Meal Using Immunomagnetic Bead-Based Recovery and Real-Time Immunoquantitative-PCR

    Directory of Open Access Journals (Sweden)

    Dinesh Babu

    2014-12-01

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole-kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme-linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited a high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1–10 μg/kg), which was addressed by comparing the quantification results of undiluted and diluted extracts. To test the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.

  4. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    Science.gov (United States)

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  5. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for the quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations using a validated reverse-phase HPTLC method. Materials and Methods: The RP-HPTLC method was carried out using glass plates coated with RP-18 ...

  6. Data-driven Demand Response Characterization and Quantification

    DEFF Research Database (Denmark)

    Le Ray, Guillaume; Pinson, Pierre; Larsen, Emil Mahler

    2017-01-01

    Analysis of load behavior in demand response (DR) schemes is important to evaluate the performance of participants. Very few real-world experiments have been carried out, and quantification and characterization of the response is a difficult task. Nevertheless, it will be a necessary tool for portf...

  7. Field damage of sorghum (Sorghum bicolor) with reduced lignin levels by naturally occurring insect pests and pathogens

    Science.gov (United States)

    Mutant lines of sorghum with low levels of lignin are potentially useful for bioenergy production, but may have problems with insects or disease. Field-grown normal and low-lignin bmr6 and bmr12 sorghum (Sorghum bicolor) were examined for insect and disease damage in the field, and insect damage in ...

  8. Simultaneous quantification of four native estrogen hormones at trace levels in human cerebrospinal fluid using liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Nguyen, Hien P; Li, Li; Gatson, Joshua W; Maass, David; Wigginton, Jane G; Simpkins, James W; Schug, Kevin A

    2011-03-25

    Estrogens are known to exhibit neuroprotective effects on the brain. Their importance in this regard and in others has been emphasized in many recent studies, which increases the need for reliable analytical methods for the measurement of estrogen hormones. A heart-cutting two-dimensional liquid chromatography separation method coupled with electrospray ionization-tandem mass spectrometry (ESI-MS/MS) has been developed for simultaneous measurement of four estrogens, estriol (E3), estrone (E1), 17β-estradiol (17β-E2), and 17α-estradiol (17α-E2), in human cerebrospinal fluid (CSF). The method is based on liquid-liquid extraction and derivatization of the estrogens with dansyl chloride to enhance the sensitivity of ESI-based detection in conjunction with tandem mass spectrometry. Dansylated estriol and estrone were separated in the first dimension on an amide-C18 column, while dansylated 17β- and 17α-estradiol were resolved in the second dimension on two C18 columns (175 mm total length) connected in series. This is the first report of a method for simultaneous quantification of all four endogenous estrogens in their dansylated form. The detection limits for E1, 17α-E2, 17β-E2, and E3 were 19, 35, 26, and 61 pg/mL, respectively. Due to matrix effects, validation and calibration were carried out in charcoal-stripped CSF. The precision and accuracy were more than 86% for the two E2 compounds and 79% for E1 and E3, while the extraction recovery ranged from 91% to 104%. The method was applied to measure estrogens obtained in a clinical setting from the CSF of ischemic trauma patients. While 17β-estradiol was present at a significant level in the CSF of some samples, other estrogens were present at lower levels or were undetectable. Copyright © 2010 Elsevier B.V. All rights reserved.

  9. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods, implementing different accelerated sampling techniques and free energy estimators, are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.

  10. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is performed by comparing seed images with geometric figures. Quantification of seed shape is a useful tool in plant description, for phenotypic characterization and taxonomic analysis. The J index gives the percent similarity of a seed image with a geometric figure and is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios, and the outline of the Fibonacci spiral. Seed images were compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures such as the ovoid, ellipse or Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.
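    A percent-similarity index between a seed silhouette and a model figure can be sketched as a pixel-overlap measure on binary masks. The intersection-over-union form and the toy masks below are illustrative assumptions, not the authors' exact image-processing pipeline:

    ```python
    def j_index(seed_mask, model_mask):
        """Percent of pixels shared by the seed silhouette and the model
        figure, relative to the pixels covered by either (intersection
        over union, scaled to a percentage)."""
        shared = covered = 0
        for seed_row, model_row in zip(seed_mask, model_mask):
            for s, m in zip(seed_row, model_row):
                if s or m:
                    covered += 1
                    if s and m:
                        shared += 1
        return 100.0 * shared / covered

    # Toy 4x3 silhouettes: a seed outline vs. an ellipse-like model figure
    seed = [[0, 1, 1, 0],
            [1, 1, 1, 1],
            [0, 1, 1, 0]]
    ellipse = [[0, 1, 1, 0],
               [1, 1, 1, 1],
               [1, 1, 1, 0]]
    similarity = j_index(seed, ellipse)   # 8 shared of 9 covered pixels
    ```

    A value near 100 means the seed image closely matches the model figure; comparing one seed against several figures (ovoid, ellipses, spiral outline) gives a shape profile usable for taxonomy.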

  11. Advances on generic exemption levels and generic clearance levels in the argentinean regulatory field

    International Nuclear Information System (INIS)

    Muñiz, C.C.; Bossio, M.C.

    2011-01-01

    With the aim of optimizing the regulatory effort in Argentina, the Nuclear Regulatory Authority (ARN) evaluated two concepts used worldwide in the radioactive waste management field: "Generic Exemption Levels" and "Generic Clearance Levels". The objective of this paper is to present the progress made in the past two years on these topics and the results of specific requests received from users of radioactive material. Since the approval of both generic levels, the ARN has received two exemption requests. The first concerned the practice of dismantling lightning rods containing 241Am. The other concerned the international trade, distribution, usage and final disposal of lighting products containing radioactive material (85Kr and 232Th). Concerning clearance, there has not been any request yet; however, the ARN expects to receive such requests in the future from nuclear power plants and other facilities related to the nuclear fuel cycle. (authors)

  12. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    Science.gov (United States)

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and for evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low-abundance transcripts. The critical step for quantification is accurate estimation of the amplification efficiency needed for computing relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling the dynamics of polymerase chain reaction amplification; until now, only models of fluorescence intensity as a function of polymerase chain reaction cycle have been used for quantification. The dynamics of qRT-PCR efficiency is modeled with an ordinary differential equation, and the fitted model is used to obtain the effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for the association between time to recurrence and longitudinal trends in GUCY2C expression.
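    The general idea of modeling amplification dynamics, rather than assuming a fixed efficiency, can be sketched with a toy discrete model in which per-cycle efficiency decays as the reaction saturates. The model form and all parameter values here are illustrative assumptions, not the authors' ODE:

    ```python
    def amplify(f0, e0, fmax, cycles):
        """Fluorescence trajectory where per-cycle efficiency falls off
        as the signal approaches saturation: E_n = e0 * (1 - F_n / fmax)."""
        trace = [f0]
        for _ in range(cycles):
            e = e0 * (1.0 - trace[-1] / fmax)
            trace.append(trace[-1] * (1.0 + e))
        return trace

    def cq(trace, threshold):
        """First cycle at which the trace crosses the quantification threshold."""
        for cycle, f in enumerate(trace):
            if f >= threshold:
                return cycle
        raise ValueError("threshold never crossed")

    # Target with 10-fold fewer starting copies than the reference,
    # both amplifying with early-cycle efficiency e0 = 0.95 (assumed)
    target = amplify(f0=1e-6, e0=0.95, fmax=1.0, cycles=45)
    reference = amplify(f0=1e-5, e0=0.95, fmax=1.0, cycles=45)
    delta_cq = cq(target, 0.1) - cq(reference, 0.1)
    ratio = (1.0 + 0.95) ** (-delta_cq)   # efficiency-adjusted relative expression
    ```

    The point of efficiency-adjusted quantification is the final line: an efficiency estimated from the fitted dynamics replaces the common assumption of perfect doubling (E = 1) when converting a Cq difference into an expression ratio.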

  13. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero
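    The simplest quantity in recurrence quantification analysis is the recurrence rate, the density of points in the recurrence matrix. A minimal sketch on scalar series (illustrative signals, not the Liu system itself):

    ```python
    import math

    def recurrence_rate(series, eps):
        """Fraction of point pairs (i, j) closer than eps: the density of
        the recurrence matrix R[i][j] = 1 if |x_i - x_j| <= eps."""
        n = len(series)
        close = sum(
            1
            for i in range(n)
            for j in range(n)
            if abs(series[i] - series[j]) <= eps
        )
        return close / (n * n)

    # A periodic orbit revisits its states, so its recurrence rate is much
    # higher than that of a monotonically drifting trajectory.
    periodic = [math.sin(0.5 * k) for k in range(200)]
    drifting = [0.05 * k for k in range(200)]
    rr_periodic = recurrence_rate(periodic, 0.1)
    rr_drifting = recurrence_rate(drifting, 0.1)
    ```

    Transitions such as chaos to periodic show up as abrupt changes in measures like this recurrence rate (and in determinism, computed from diagonal line structures of the same matrix) as the control parameter varies.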

  14. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-04-01

    Lab-on-Chip, the miniaturization of the chemical and analytical lab, is an endeavor that seems to come out of science fiction yet is slowly becoming a reality. It is a multidisciplinary field that combines different areas of science and engineering. Within these areas, microfluidics is a specialized field that deals with the behavior, control and manipulation of small volumes of fluids. Agglutination assays are rapid, single-step, low-cost immunoassays that use microspheres to detect a wide variety of molecules and pathogens through a specific antigen-antibody interaction. Agglutination assays are particularly suitable for the miniaturization and automation that two-phase microfluidics can offer, a combination that can help tackle the ever-pressing need for high-throughput screening in blood banks, epidemiology, food banks, and the diagnosis of infectious diseases. In this thesis, we present a two-phase microfluidic system capable of incubating and quantifying agglutination assays. The microfluidic channel is a simple fabrication solution using laboratory tubing. The assays are incubated by highly efficient passive mixing, with a sample-to-answer time of 2.5 min, a 5-10 fold improvement over traditional agglutination assays. The system has a user-friendly interface that does not require droplet generators: a pipette is used to continuously insert assays on demand, with no down-time between experiments, at 360 assays/h. System parameters are explored using the streptavidin-biotin interaction as a model assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light-scattering quantification of agglutination assays in a two-phase flow format. The approach can potentially be applied to other biomarkers, which we demonstrate using C-reactive protein (CRP) assays. Using our system, we can take a commercially available CRP qualitative slide

  15. Genomic DNA-based absolute quantification of gene expression in Vitis.

    Science.gov (United States)

    Gambetta, Gregory A; McElrone, Andrew J; Matthews, Mark A

    2013-07-01

    Many studies in which gene expression is quantified by polymerase chain reaction represent the expression of a gene of interest (GOI) relative to that of a reference gene (RG). Relative expression is founded on the assumptions that RG expression is stable across samples, treatments, organs, etc., and that the reaction efficiencies of the GOI and RG are equal; assumptions which are often faulty. The true variability in RG expression and the actual reaction efficiencies are seldom determined experimentally. Here we present a rapid and robust method for absolute quantification of expression in Vitis, in which varying concentrations of genomic DNA were used to construct GOI standard curves. This methodology was utilized to absolutely quantify and determine the variability of the previously validated RG ubiquitin (VvUbi) across three test studies in three different tissues (roots, leaves and berries). In addition, in each study a GOI was absolutely quantified. Data sets resulting from relative and absolute methods of quantification were compared and the differences were striking. VvUbi expression was significantly different in magnitude between test studies and variable among individual samples. Absolute quantification consistently reduced the coefficients of variation of the GOIs by more than half, often resulting in differences in statistical significance and in some cases even changing the fundamental nature of the result. Utilizing genomic DNA-based absolute quantification is fast and efficient. By eliminating the error introduced by assuming RG stability and equal reaction efficiencies between the RG and GOI, this methodology produces less variation, increased accuracy and greater statistical power. © 2012 Scandinavian Plant Physiology Society.
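    The standard-curve arithmetic behind this approach can be sketched briefly: a dilution series of genomic DNA of known copy number gives a linear fit of quantification cycle (Cq) against log10(copies), and absolute template copies of any sample are then interpolated from that fit at the reaction's actual efficiency. A minimal illustration in Python; the dilution copy numbers and Cq readings below are hypothetical, not data from the paper:

```python
import numpy as np

# Hypothetical standard curve: genomic DNA dilutions of known copy number
# (copies per reaction) and their measured quantification cycles (Cq).
std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_cq = np.array([15.1, 18.5, 21.9, 25.2, 28.6])

# Linear fit of Cq against log10(copies): Cq = slope * log10(N) + intercept.
slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)

# Reaction efficiency implied by the slope: E = 10**(-1/slope) - 1.
efficiency = 10 ** (-1.0 / slope) - 1.0

def absolute_copies(cq):
    """Interpolate absolute template copies from a measured Cq."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"copies at Cq 23.4: {absolute_copies(23.4):.3g}")
```

    A perfectly efficient reaction gives a slope of about -3.32 per decade; deviations from that are captured directly by the fit rather than assumed away, which is the point of the absolute method.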

  16. Quantification of rat brain SPECT with 123I-ioflupane: evaluation of different reconstruction methods and image degradation compensations using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Roé-Vellvé, N; Pino, F; Cot, A; Ros, D; Falcon, C; Gispert, J D; Pavía, J; Marin, C

    2014-01-01

    SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson’s disease (PD). The effect on quantification of image degradations has been extensively evaluated in human studies but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise-reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values that are as close as 0.5 using groups of six rats to represent each stage. (paper)
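    The specific uptake ratio used here is conventionally the target-region count concentration expressed relative to a non-specific reference region. A minimal sketch of that computation; the ROI names and count values below are hypothetical illustrations, not the paper's data:

```python
def specific_uptake_ratio(target_counts, reference_counts):
    """SUR = (C_target - C_reference) / C_reference, from mean counts per
    voxel in the target ROI (striatum) and a non-specific reference ROI
    (e.g. cerebellum)."""
    return (target_counts - reference_counts) / reference_counts

# Hypothetical mean counts per voxel for two disease stages.
sur_healthy = specific_uptake_ratio(8.4, 2.1)    # higher striatal uptake
sur_lesioned = specific_uptake_ratio(5.25, 2.1)  # reduced striatal uptake
print(sur_healthy, sur_lesioned)
```

    With this definition, distinguishing stages whose SUR values differ by 0.5 requires the combined reconstruction, compensation and biological noise on each ROI estimate to stay well below that interval.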

  17. Measurement of radiation-induced damage to DNA at the molecular level

    International Nuclear Information System (INIS)

    Dizdaroglu, M.

    1992-01-01

    The present article describes the potential usefulness of the technique of gas chromatography-mass spectrometry (GC/MS) for chemical characterization and quantification of modifications in DNA and in mammalian chromatin. The aim of the article is to give a concise description of the field rather than an exhaustive review, with the emphasis on the practicalities, limitations, applications and comparison with other techniques. (author)

  18. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain over scintillation counting in sensitivity for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose scintillation experiments. Disposable equipment, materials and surfaces are used to control this contamination. Quantification of attomole amounts of labeled substances is possible through these techniques.
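    Deriving a quantification from the measured isotope ratio amounts to converting the sample's 14C/C ratio, above background, into moles of label via the total carbon inventory, which is why inventorying all carbon in the sample matters. A hedged arithmetic sketch; the ratio and carbon-mass values are illustrative, not taken from the abstract:

```python
# Convert an AMS 14C/C isotope ratio into attomoles of radiocarbon label.
AVOGADRO = 6.022e23
MOLAR_MASS_C = 12.011  # g/mol

def label_amount_attomol(ratio_sample, ratio_background, carbon_mass_mg,
                         labels_per_molecule=1):
    """Excess 14C atoms = (R_sample - R_background) * total carbon atoms;
    dividing by Avogadro's number (and labels per molecule) gives moles."""
    carbon_atoms = (carbon_mass_mg * 1e-3 / MOLAR_MASS_C) * AVOGADRO
    excess_14c_atoms = (ratio_sample - ratio_background) * carbon_atoms
    mol = excess_14c_atoms / AVOGADRO / labels_per_molecule
    return mol * 1e18  # attomoles

# Illustrative numbers: 1 mg total carbon, background ratio ~1.2e-12 (modern).
amount = label_amount_attomol(2.4e-12, 1.2e-12, carbon_mass_mg=1.0)
print(f"{amount:.1f} amol of label")
```

    Doubling the background ratio in a 1 mg carbon sample, as in this sketch, corresponds to roughly a hundred attomoles of label, consistent with the attomole-scale sensitivity the abstract claims.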

  19. Effect of strategies regarding concentrate supplementation and day-time grazing on N utilization at both field and dairy cow level

    DEFF Research Database (Denmark)

    Lund, Peter; Søegaard, Karen; Weisbjerg, Martin Riis

    2008-01-01

    N utilization at cow and field level was examined over two grazing periods of 30 days with 64 Holstein dairy cows. At cow and field level the effect of sward type (diploid vs. tetraploid perennial ryegrass, both mixed with white clover) and compressed sward height (6 vs. 10 cm) was examined....

  20. Graphene–platinum nanocomposite as a sensitive and selective voltammetric sensor for trace level arsenic quantification

    Directory of Open Access Journals (Sweden)

    R. Kempegowda

    2014-01-01

    A simple protocol for the chemical modification of graphene with platinum nanoparticles, and its subsequent electroanalytical application toward the sensitive and selective determination of arsenic, is described. Chemical modification was carried out by the simultaneous and sequential chemical reduction of graphene oxide and hexachloroplatinic acid in the presence of ethylene glycol as a mild reducing agent. The synthesized graphene–platinum nanocomposite (Gr–nPt) was characterized by infrared spectroscopy, X-ray diffraction, field-emission scanning electron microscopy and cyclic voltammetry (CV). CV and square-wave anodic stripping voltammetry were used to quantify arsenic. The proposed nanostructure showed linearity in the concentration range 10–100 nM with a detection limit of 1.1 nM. The proposed sensor has been successfully applied to measure trace levels of arsenic present in natural sample matrices such as borewell water, polluted lake water, agricultural soil, tomato and spinach leaves.
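    Detection limits of this kind are commonly estimated from the calibration slope and the blank noise, e.g. LOD = 3·σ_blank / sensitivity. A minimal sketch of that calculation; the stripping-peak currents and blank noise below are hypothetical numbers chosen for illustration, not the paper's measurements:

```python
import numpy as np

# Hypothetical SWASV calibration: As(III) concentration (nM) vs peak current (uA).
conc_nM = np.array([10.0, 25.0, 50.0, 75.0, 100.0])
peak_uA = np.array([0.52, 1.28, 2.55, 3.81, 5.10])

# Linear calibration: current = sensitivity * concentration + intercept.
sensitivity, intercept = np.polyfit(conc_nM, peak_uA, 1)

# Standard deviation of repeated blank measurements (uA), hypothetical.
sigma_blank_uA = 0.018

# 3-sigma detection limit in concentration units.
lod_nM = 3 * sigma_blank_uA / sensitivity
print(f"sensitivity = {sensitivity:.4f} uA/nM, LOD = {lod_nM:.2f} nM")
```

    The same 3σ/slope convention, applied to the sensor's real blank noise and calibration slope, is one standard route to a figure like the 1.1 nM limit reported here.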