WorldWideScience

Sample records for quantification impacts environnemental

  1. OMEGA Thau: an environmental management and warning tool for microbiological pollution of the Thau basin [original title: OMEGA Thau : outil de management environnemental et de gestion de l'avertissement des pollutions microbiologiques du bassin de Thau]

    OpenAIRE

    Brocard, Gilles; Derolez, Valerie; Serais, Ophelie; Fiandrino, Annie; Lequette, Camille; Lescoulier, Christophe; Benedetti, Murielle; Couton, Prunelle; Marty, Delphine

    2010-01-01

    The OMEGA Thau project (Outil de Management Environnemental et de Gestion de l'Avertissement de la lagune de Thau) is a research and development programme, with the Syndicat Mixte du Bassin de Thau as contracting authority, bringing together scientists, local authorities and communities, and shellfish-farming professionals. It aims to develop a decision-support tool to help managers steer public investment across the watershed, in order to obtain a quality...

  2. Environmental impact of fats and oils and their derivatives, formulated or not: biodegradability and ecotoxicity [original title: Impact environnemental des corps gras et de leurs dérivés formulés ou non : biodégradabilité et écotoxicité]

    Directory of Open Access Journals (Sweden)

    Bouillon Vincent

    2003-09-01

    Full Text Available The risks to the environment from the release of a given chemical depend essentially on the potential and duration of the product's exposure to the environment and on its toxicity. Fats and their derivatives are used in very varied fields such as lubricants, solvents, surfactants and detergents. The environmental impact of these compounds can be demonstrated by standardized biodegradability and toxicity tests. Biodegradability is assessed through various measurements: disappearance of the substance, evolution of the biochemical oxygen demand (BOD), oxygen consumption, carbon dioxide production, and evolution of the gaseous composition around the biodegradation process. Toxicity is assessed on various organisms such as bacteria, algae, crustaceans, fish and mammals. All of these measurements, explained in this article, are all the more important because they feed into classifications, regulations and ecolabels.

  3. Plutonium in the environment - bibliographic study and quantification; Impacts environnemental et sanitaire des isotopes du plutonium, etude bibliographique et quantification

    Energy Technology Data Exchange (ETDEWEB)

    Guetat, Ph; Monfort, M; Ansoborlo, E [CEA Marcoule, Dir. de l' Energie Nucleaire, 30 (France); Bion, L; Moulin, V; Reiller, P; Vercouter, Th [CEA Saclay, Dir. de l' Energie Nucleaire, 91 - Gif sur Yvette (France); Boucher, L; Jourdain, F; Van Dorpe, F [CEA Cadarache, Dir. de l' Energie Nucleaire, 13 - Saint Paul lez Durance (France); Comte, A; Flury Heard, A; Fritsch, P; Menetrier, F [CEA Fontenay-aux-Roses, Dir. des Sciences du Vivant, 92 (France)

    2008-07-01

    This document deals with the different isotopes of plutonium. It summarizes the main features of plutonium behaviour from sources inside installations to the environment and man, and reports the current knowledge about the different parameters used in the models for environmental and radiological impact assessment. The objective is to gather scientific information useful for decision-makers in case of accident or for regulation purposes. It gives the main information on the radiological and chemical characteristics which are necessary to understand transfers between compartments. It then reports information on normal and accidental historical sources and present releases. The next part deals with transfer parameters in installations and in the environment. Parameters that influence plutonium behaviour are examined, inside installations (physico-chemical forms and events that lead to releases) and outside in the environment, for deposition on soils and transfer to plants and animal products. A full chapter is dedicated to the presentation of typical assessments, for each isotope and for mixtures, and the correspondence between activity, mass and dose reference levels is presented and discussed. Transfer and behaviour in man and effects on health are finally presented. (author)

  4. ENVIRONMENTAL IMPACT ASSESSMENT: How can the impact of transgenic oilseed rape on bees be studied? [original title: EVALUATION DE L'IMPACT ENVIRONNEMENTAL : Comment étudier l'impact de colzas transgéniques sur les abeilles ?]

    Directory of Open Access Journals (Sweden)

    Pierre Jacqueline

    2000-07-01

    Full Text Available Assessing the environmental impact of genetically modified plants involves, in the case of melliferous (bee-forage) plants, taking their interactions with bees into account. Two types of interaction must be considered. On the one hand, the harmlessness of these plants to the bee must be verified. The bee plays an essential economic and ecological role as a honey producer and as a pollinator of many wild and cultivated plants. These activities rest on the bee's ability to identify and regularly visit plants likely to provide it with food, in the form of nectar, which it turns into honey and which serves as its carbohydrate food, and of pollen, its protein source. Any modification occurring in these plants can disturb the behaviour or physiology of bees and affect their honey productivity or their pollination efficiency. Such disturbances may stem either from direct effects linked to the presence of the transgene product (the protein encoded by the gene of interest introduced into the plant), or from indirect effects due to secondary modifications of the plant's physiology associated with the introduction of the gene (pleiotropic effects). On the other hand, the potential role of the bee as a vector of transgene dissemination must be taken into account. By moving from flower to flower, the bee can contribute, alongside wind-borne pollen transport, to transferring the transgene via pollen and ensuring intraspecific, or even interspecific, fertilization. Yet, notably in the case of herbicide-resistance genes, one wishes to confine the trait of interest strictly to the transgenic plants, avoiding interspecific crosses with

  5. Radioactive iodine and environmental and sanitary effects - bibliographic study and quantification; Iodes radioactifs et impacts environnemental et sanitaire - etude bibliographique et quantification

    Energy Technology Data Exchange (ETDEWEB)

    Guetat, Ph.; Armand, P.; Monfort, M.; Fritsch, P. [CEA Bruyeres-le-Chatel, 91 (France); Flury Herard, A. [CEA, Dir. des Sciences du Vivant, 75 - Paris (France); Menetrier, F. [CEA Fontenay-aux-Roses, Dir. des Sciences du Vivant, 92 (France); Bion, L. [CEA Saclay, Dir. de l' Energie Nucleaire (DEN), 91 - Gif sur Yvette (France); Schoech, C.; Masset, S. [Societe EX-IN - Expertise et Ingenierie, 92 - Le Plessis-Robinson (France)

    2004-07-01

    This document is intended for a general audience. It reviews the different parameters needed to evaluate the potential impact of radioactive iodine releases, from the emission to the public. Its objectives are to evaluate the importance of the different exposure pathways and to assess the efficiency of the possible interventions to protect the public. The main conclusions are summarised hereafter. The radioactive decay chains have to be taken into account to evaluate the iodine source term in nuclear plants in the case of fission accidents. The physico-chemical forms of iodine are important in determining the released activity and the activity deposited on the soil. The isotopes to be taken into account are mainly iodine-131 for radiological assessments, iodine-133 for nuclear reactor accidents, and the tellurium-132/iodine-132 chain when no particulate filtration exists. Iodine-129 in the French reprocessing plant cannot lead to significant accidents. The dominant exposure pathways are related to the consumption of contaminated food products (vegetables, milk) for inorganic iodine. The iodine transfer to goat and sheep milk is greater than that to cow milk. Meat production from herbivores at pasture is the most sensitive pathway, so the interest of removing herbivores rapidly from pasture appears relatively clearly. Banning the consumption of local contaminated food products (vegetables and meat) may reduce the impact due to iodine-131 by about a factor of thirteen. The younger the population, the greater the thyroid radiosensitivity and its variability within the population. Oral administration of stable iodine limits transfers to maternal milk and the foetal thyroid; it is complementary to a consumption ban on local contaminated food products, and the earlier the ingestion, the greater the efficiency. A release of 0.1 TBq of iodine-131 at low height involves only limited and local actions, whereas a release of 10 TBq involves direct and immediate protection.

  6. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; phenomena biasing quantification; 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods, results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardisation in PET); 3 - Synthesis: achievable accuracy, know-how, precautions, beyond the activity measurement
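
    Attenuation, the first bias listed above, can be made concrete with a first-order sketch (an illustration assuming a uniform medium, not material from the lecture itself): for a source at depth d the surviving photon fraction is exp(-mu*d), so the multiplicative correction is its inverse. The value mu ≈ 0.15 cm⁻¹ is the standard narrow-beam coefficient for 140 keV SPECT photons in water.

    ```python
    import math

    # First-order attenuation correction for a point source at depth d (cm)
    # in a uniform medium with linear attenuation coefficient mu (cm^-1).
    def attenuation_correction_factor(mu_per_cm: float, depth_cm: float) -> float:
        """Multiplicative factor restoring counts lost to attenuation."""
        return math.exp(mu_per_cm * depth_cm)

    # A 140 keV source 10 cm deep in water loses ~78% of its photons,
    # so the correction factor is roughly 4.5x.
    print(round(attenuation_correction_factor(0.15, 10.0), 2))  # 4.48
    ```

    Real scanners estimate mu per voxel from a transmission or CT scan rather than assuming a uniform medium, which is why attenuation correction heads the list of quantification problems.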

  7. The impact of reconstruction method on the quantification of DaTSCAN images

    Energy Technology Data Exchange (ETDEWEB)

    Dickson, John C.; Erlandsson, Kjell; Hutton, Brian F. [UCLH NHS Foundation Trust and University College London, Institute of Nuclear Medicine, London (United Kingdom); Tossici-Bolt, Livia [Southampton University Hospitals NHS Trust, Department of Medical Physics, Southampton (United Kingdom); Sera, Terez [University of Szeged, Department of Nuclear Medicine and Euromedic Szeged, Szeged (Hungary); Varrone, Andrea [Psychiatry Section and Stockholm Brain Institute, Karolinska Institute, Department of Clinical Neuroscience, Stockholm (Sweden); Tatsch, Klaus [EANM/European Network of Excellence for Brain Imaging, Vienna (Austria)

    2010-01-15

    Reconstruction of DaTSCAN brain studies using OS-EM iterative reconstruction offers better image quality and more accurate quantification than filtered back-projection. However, reconstruction must proceed for a sufficient number of iterations to achieve stable and accurate data. This study assessed the impact of the number of iterations on the image quantification, comparing the results of the iterative reconstruction with filtered back-projection data. A striatal phantom filled with {sup 123}I using striatal to background ratios between 2:1 and 10:1 was imaged on five different gamma camera systems. Data from each system were reconstructed using OS-EM (which included depth-independent resolution recovery) with various combinations of iterations and subsets to achieve up to 200 EM-equivalent iterations and with filtered back-projection. Using volume of interest analysis, the relationships between image reconstruction strategy and quantification of striatal uptake were assessed. For phantom filling ratios of 5:1 or less, significant convergence of measured ratios occurred close to 100 EM-equivalent iterations, whereas for higher filling ratios, measured uptake ratios did not display a convergence pattern. Assessment of the count concentrations used to derive the measured uptake ratio showed that nonconvergence of low background count concentrations caused peaking in higher measured uptake ratios. Compared to filtered back-projection, OS-EM displayed larger uptake ratios because of the resolution recovery applied in the iterative algorithm. The number of EM-equivalent iterations used in OS-EM reconstruction influences the quantification of DaTSCAN studies because of incomplete convergence and possible bias in areas of low activity due to the nonnegativity constraint in OS-EM reconstruction. Nevertheless, OS-EM using 100 EM-equivalent iterations provides the best linear discriminatory measure to quantify the uptake in DaTSCAN studies. (orig.)
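
    Two bookkeeping quantities from the abstract can be sketched directly, under the usual convention that OS-EM progress is expressed as EM-equivalent iterations (iterations × subsets) and that striatal uptake is reported as a specific ratio against background; the counts below are invented, not the study's data.

    ```python
    # "EM-equivalent iterations": the standard way to compare OS-EM runs with
    # different subset counts against plain EM.
    def em_equivalent_iterations(iterations: int, subsets: int) -> int:
        return iterations * subsets

    # A generic specific uptake ratio from VOI count concentrations:
    # (striatum - background) / background.
    def uptake_ratio(striatal_counts: float, background_counts: float) -> float:
        return (striatal_counts - background_counts) / background_counts

    # e.g. 10 iterations with 10 subsets ~ 100 EM-equivalent iterations,
    # the convergence point reported for filling ratios of 5:1 or less.
    print(em_equivalent_iterations(10, 10))  # 100
    # A phantom filled at a true 5:1 striatal-to-background concentration
    # gives a specific ratio of 4.0 once reconstruction has converged.
    print(uptake_ratio(5.0, 1.0))  # 4.0
    ```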

  8. Automobile air-conditioning its energy and environmental impact; La climatisation automobile impact energetique et environnemental

    Energy Technology Data Exchange (ETDEWEB)

    Barbusse, St.; Gagnepain, L.

    2003-05-01

    Over the last three decades, automobile manufacturers have made a lot of progress in specific fuel consumption and engine emissions of pollutants. Yet the impact of these improvements on vehicle consumption has been limited by increased dynamic performance (maximum speed, torque), increased safety (power steering and power brakes) and increased comfort (noise and vibration reduction, electric windows and thermal comfort). Because of this, the real CO{sub 2}-emission level of vehicles is still high in a context where road transport is a major factor in the balance sheet of greenhouse gas emissions, and thus in complying with the international climate convention. Although European, Japanese and Korean manufacturers signed an important agreement with the European Commission to voluntarily reduce CO{sub 2} emissions from their vehicles, with a sales-weighted average emission goal of 140 grams per km on the MVEG approval cycle by 2008, it has to be noted that the European procedures for measuring fuel consumption and CO{sub 2} emissions do not take accessories into account, especially air-conditioning (A/C). The wide dissemination of this equipment, recognized as a major energy consumer and as using a refrigerant with a high global warming potential, led ADEME to implement a set of assessments of A/C's energy and environmental impact. In particular these assessments include studies of vehicle equipment rates, analyses of the impact on fuel consumption as well as on regulated pollutant emissions in the exhaust, a characterization of refrigerant leakage levels and an estimate of greenhouse gas emissions for all air-conditioned vehicles. This leaflet summarizes the results of these actions. All of these studies and additional data are presented in greater detail in the document 'Automobile Air-conditioning' (ADEME reference no. 4985). (author)

  9. Quantification and sensory studies of character impact odorants of different soybean lecithins.

    Science.gov (United States)

    Stephan, A; Steinhart, H

    1999-10-01

    Fifty-four potent odorants in standardized, hydrolyzed, and deoiled and hydrolyzed soybean lecithins were quantified by high-resolution gas chromatography/mass spectrometry (HRGC/MS). The characterization of their aroma impact was performed by calculation of nasal (n) and retronasal (r) odor activity values (OAVs). For this, the nasal and retronasal recognition thresholds of 18 odor-active compounds were determined in vegetable oil. The following compounds showed the highest nOAVs: 2,3-diethyl-5-methylpyrazine, methylpropanal, acetic acid, pentanoic acid, 2-ethyl-3,5-dimethylpyrazine, pentylpyridine, (Z)-1,5-octadien-3-one, 2-methylbutanal, and beta-damascenone. In addition to the compounds above, 1-octen-3-one, 1-nonen-3-one, and 3-methyl-2,4-nonandione showed potent rOAVs. The results of quantification and OAV calculation were confirmed by a model mixture of 25 impact odorants, which yielded a highly similar sensory profile to that of the original soybean lecithin. The sensory importance of pyrazines and free acids increased through enzymatic hydrolysis and decreased by the process of deoiling. The impact of unsaturated ketones on the lecithin aroma was not changed by either process.
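
    The odor activity value (OAV) used above is simply the ratio of an odorant's concentration to its odor threshold in the same matrix; compounds with OAV > 1 are expected to contribute to the aroma. The sketch below illustrates the calculation with invented concentrations and thresholds, not the paper's measured values.

    ```python
    # OAV = concentration / odor recognition threshold (same units, same matrix).
    def odor_activity_value(concentration_ug_kg: float, threshold_ug_kg: float) -> float:
        """Dimensionless odor activity value."""
        return concentration_ug_kg / threshold_ug_kg

    # Hypothetical (concentration, nasal threshold) pairs in vegetable oil, ug/kg:
    odorants = {
        "2,3-diethyl-5-methylpyrazine": (12.0, 0.5),
        "beta-damascenone": (0.9, 0.05),
        "acetic acid": (4200.0, 600.0),
    }

    for name, (conc, threshold) in odorants.items():
        oav = odor_activity_value(conc, threshold)
        # Only compounds with OAV > 1 are expected to be aroma-active.
        print(f"{name}: nOAV = {oav:.1f}")
    ```

    Computing the same ratio against retronasal thresholds gives the rOAVs mentioned in the abstract; the two threshold sets can rank the same compounds quite differently.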

  10. The impact of targeting repetitive BamHI-W sequences on the sensitivity and precision of EBV DNA quantification.

    Directory of Open Access Journals (Sweden)

    Armen Sanosyan

    Full Text Available Viral load monitoring and early Epstein-Barr virus (EBV) DNA detection are essential in routine laboratory testing, especially in preemptive management of Post-transplant Lymphoproliferative Disorder. Targeting the repetitive BamHI-W sequence was shown to increase the sensitivity of EBV DNA quantification, but the variability of BamHI-W reiterations was suggested to be a source of quantification bias. We aimed to assess the extent of variability associated with BamHI-W PCR and its impact on the sensitivity of EBV DNA quantification using the 1st WHO international standard, EBV strains and clinical samples. Repetitive BamHI-W and single LMP2 sequences were amplified by in-house qPCRs and the BXLF-1 sequence by a commercial assay (EBV R-gene™, BioMerieux). Linearity and limits of detection of the in-house methods were assessed. The impact of repeated versus single target sequences on EBV DNA quantification precision was tested on B95.8 and Raji cell lines, possessing 11 and 7 copies of the BamHI-W sequence, respectively, and on clinical samples. BamHI-W qPCR demonstrated a lower limit of detection compared to LMP2 qPCR (2.33 log10 versus 3.08 log10 IU/mL; P = 0.0002). BamHI-W qPCR underestimated the EBV DNA load on the Raji strain, which contained fewer BamHI-W copies than the WHO standard derived from the B95.8 EBV strain (mean bias: -0.21 log10; 95% CI, -0.54 to 0.12). Comparison of BamHI-W qPCR versus LMP2 and BXLF-1 qPCR showed an acceptable variability between EBV DNA levels in clinical samples, with the mean bias being within 0.5 log10 IU/mL EBV DNA, whereas a better quantitative concordance was observed between the LMP2 and BXLF-1 assays. Targeting BamHI-W resulted in higher sensitivity compared to LMP2, but the variable reiterations of the BamHI-W segment are associated with higher quantification variability. BamHI-W can be considered for clinical and therapeutic monitoring to detect early EBV DNA and dynamic changes in viral load.

  11. The impact of targeting repetitive BamHI-W sequences on the sensitivity and precision of EBV DNA quantification.

    Science.gov (United States)

    Sanosyan, Armen; Fayd'herbe de Maudave, Alexis; Bollore, Karine; Zimmermann, Valérie; Foulongne, Vincent; Van de Perre, Philippe; Tuaillon, Edouard

    2017-01-01

    Viral load monitoring and early Epstein-Barr virus (EBV) DNA detection are essential in routine laboratory testing, especially in preemptive management of Post-transplant Lymphoproliferative Disorder. Targeting the repetitive BamHI-W sequence was shown to increase the sensitivity of EBV DNA quantification, but the variability of BamHI-W reiterations was suggested to be a source of quantification bias. We aimed to assess the extent of variability associated with BamHI-W PCR and its impact on the sensitivity of EBV DNA quantification using the 1st WHO international standard, EBV strains and clinical samples. Repetitive BamHI-W and single LMP2 sequences were amplified by in-house qPCRs and the BXLF-1 sequence by a commercial assay (EBV R-gene™, BioMerieux). Linearity and limits of detection of the in-house methods were assessed. The impact of repeated versus single target sequences on EBV DNA quantification precision was tested on B95.8 and Raji cell lines, possessing 11 and 7 copies of the BamHI-W sequence, respectively, and on clinical samples. BamHI-W qPCR demonstrated a lower limit of detection compared to LMP2 qPCR (2.33 log10 versus 3.08 log10 IU/mL; P = 0.0002). BamHI-W qPCR underestimated the EBV DNA load on the Raji strain, which contained fewer BamHI-W copies than the WHO standard derived from the B95.8 EBV strain (mean bias: -0.21 log10; 95% CI, -0.54 to 0.12). Comparison of BamHI-W qPCR versus LMP2 and BXLF-1 qPCR showed an acceptable variability between EBV DNA levels in clinical samples, with the mean bias being within 0.5 log10 IU/mL EBV DNA, whereas a better quantitative concordance was observed between the LMP2 and BXLF-1 assays. Targeting BamHI-W resulted in higher sensitivity compared to LMP2, but the variable reiterations of the BamHI-W segment are associated with higher quantification variability. BamHI-W can be considered for clinical and therapeutic monitoring to detect early EBV DNA and dynamic changes in viral load.
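
    The "mean bias within 0.5 log10 IU/mL" criterion used to compare assays is a Bland-Altman-style mean of paired log10 differences. A minimal sketch, with invented paired viral loads rather than the study's data:

    ```python
    import math

    # Mean difference in log10 IU/mL between paired measurements of the
    # same samples by two assays (a Bland-Altman-style mean bias).
    def log10_bias(loads_a, loads_b):
        diffs = [math.log10(a) - math.log10(b) for a, b in zip(loads_a, loads_b)]
        return sum(diffs) / len(diffs)

    # Hypothetical paired EBV DNA loads (IU/mL) from two assays:
    bamhi_w = [1.2e3, 5.0e4, 8.1e2, 3.3e5]
    lmp2    = [1.9e3, 6.4e4, 1.3e3, 4.1e5]

    bias = log10_bias(bamhi_w, lmp2)
    # The study treated assays as acceptably concordant when the mean bias
    # stayed within 0.5 log10 IU/mL.
    print(f"mean bias = {bias:+.2f} log10 IU/mL, concordant = {abs(bias) <= 0.5}")
    ```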

  12. Quantification of Accelerometer Derived Impacts Associated With Competitive Games in National Collegiate Athletic Association Division I College Football Players.

    Science.gov (United States)

    Wellman, Aaron D; Coad, Sam C; Goulet, Grant C; McLellan, Christopher P

    2017-02-01

    Wellman, AD, Coad, SC, Goulet, GC, and McLellan, CP. Quantification of accelerometer derived impacts associated with competitive games in National Collegiate Athletic Association division I college football players. J Strength Cond Res 31(2): 330-338, 2017-The aims of the present study were to (a) examine positional impact profiles of National Collegiate Athletic Association (NCAA) division I college football players using global positioning system (GPS) and integrated accelerometry (IA) technology and (b) determine if positional differences in impact profiles during competition exist within offensive and defensive teams. Thirty-three NCAA division I Football Bowl Subdivision players were monitored using GPS and IA (GPSports) during 12 regular season games throughout the 2014 season. Individual player data sets (n = 294) were divided into offensive and defensive teams, and positional subgroups. The intensity, number, and distribution of impact forces experienced by players during competition were recorded. Positional differences were found for the distribution of impacts within offensive and defensive teams. Wide receivers sustained more very light and light to moderate (5-6.5 G force) impacts than other position groups, whereas the running backs were involved in more severe (>10 G force) impacts than all offensive position groups, with the exception of the quarterbacks (p ≤ 0.05). The defensive back and linebacker groups were subject to more very light (5.0-6.0 G force) impacts, and the defensive tackle group sustained more heavy and very heavy (7.1-10 G force) impacts than other defensive positions (p ≤ 0.05). Data from the present study provide novel quantification of positional impact profiles related to the physical demands of college football games and highlight the need for position-specific monitoring and training in the preparation for the impact loads experienced during NCAA division I football competition.
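
    The impact zones quoted above are G-force bands from the accelerometer vendor's classification. The sketch below bins peak accelerations into such zones; the 5.0-6.0 G "very light" and >10 G "severe" edges follow the abstract, while the intermediate edges are assumptions for illustration.

    ```python
    # Upper edges of assumed impact zones (G); values above the last edge
    # are "severe" (>10 G, per the abstract).
    ZONE_EDGES = [
        (6.0, "very light"),
        (6.5, "light to moderate"),
        (7.0, "moderate to heavy"),   # assumed edge
        (8.0, "heavy"),               # assumed edge
        (10.0, "very heavy"),
    ]

    def classify_impact(g_force: float) -> str:
        if g_force < 5.0:
            return "below threshold"
        for upper, label in ZONE_EDGES:
            if g_force <= upper:
                return label
        return "severe"  # > 10 G

    print(classify_impact(5.4))   # very light
    print(classify_impact(11.2))  # severe
    ```

    Counting classified impacts per player and position over a game yields the positional impact profiles the study compares.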

  13. Identification and quantification of the hydrological impacts of imperviousness in urban catchments: a review.

    Science.gov (United States)

    Jacobson, Carol R

    2011-06-01

    Urbanisation produces numerous changes in the natural environments it replaces. The impacts include habitat fragmentation and changes to both the quality and quantity of the stormwater runoff, and result in changes to hydrological systems. This review integrates research in relatively diverse areas to examine how the impacts of urban imperviousness on hydrological systems can be quantified and modelled. It examines the nature of reported impacts of urbanisation on hydrological systems over four decades, including the effects of changes in imperviousness within catchments, and some inconsistencies in studies of the impacts of urbanisation. The distribution of imperviousness within urban areas is important in understanding the impacts of urbanisation and quantification requires detailed characterisation of urban areas. As a result most mapping of urban areas uses remote sensing techniques and this review examines a range of techniques using medium and high resolution imagery, including spectral unmixing. The third section examines the ways in which scientists and hydrological and environmental engineers model and quantify water flows in urban areas, the nature of hydrological models and methods for their calibration. The final section examines additional factors which influence the impact of impervious surfaces and some uncertainties that exist in current knowledge. Copyright © 2011 Elsevier Ltd. All rights reserved.
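
    A common first-order illustration of how imperviousness changes runoff (a textbook device, not a method from this review) is the Rational Method, Q = C·i·A, where the runoff coefficient C rises with the impervious fraction of the catchment:

    ```python
    # Rational Method peak discharge, Q = C * i * A, in SI units.
    def peak_runoff_m3_per_s(c_runoff: float, intensity_mm_per_h: float,
                             area_km2: float) -> float:
        intensity_m_per_s = intensity_mm_per_h / 1000.0 / 3600.0
        area_m2 = area_km2 * 1e6
        return c_runoff * intensity_m_per_s * area_m2

    # Same storm over the same area: a largely impervious urban catchment
    # (C ~ 0.8, assumed) versus a vegetated one (C ~ 0.2, assumed).
    storm, area = 30.0, 2.5  # 30 mm/h over 2.5 km^2
    ratio = (peak_runoff_m3_per_s(0.8, storm, area) /
             peak_runoff_m3_per_s(0.2, storm, area))
    print(round(ratio, 2))  # 4.0: urbanisation quadruples the peak here
    ```

    The review's point is that real impacts depend not just on total imperviousness but on its spatial distribution, which this lumped formula cannot capture.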

  14. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR.

    Science.gov (United States)

    Oehmigen, Mark; Lindemann, Maike E; Gratz, Marcel; Kirchner, Julian; Ruhlmann, Verena; Umutlu, Lale; Blumhagen, Jan Ole; Fenchel, Matthias; Quick, Harald H

    2018-04-01

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUVmax for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUVmax was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUVmax of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward

  15. Impact of improved attenuation correction featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR

    Energy Technology Data Exchange (ETDEWEB)

    Oehmigen, Mark; Lindemann, Maike E. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); Gratz, Marcel; Quick, Harald H. [University Hospital Essen, High Field and Hybrid MR Imaging, Essen (Germany); University Duisburg-Essen, Erwin L. Hahn Institute for MR Imaging, Essen (Germany); Kirchner, Julian [University Dusseldorf, Department of Diagnostic and Interventional Radiology, Medical Faculty, Dusseldorf (Germany); Ruhlmann, Verena [University Hospital Essen, Department of Nuclear Medicine, Essen (Germany); Umutlu, Lale [University Hospital Essen, Department of Diagnostic and Interventional Radiology and Neuroradiology, Essen (Germany); Blumhagen, Jan Ole; Fenchel, Matthias [Siemens Healthcare GmbH, Erlangen (Germany)

    2018-04-15

    Recent studies have shown an excellent correlation between PET/MR and PET/CT hybrid imaging in detecting lesions. However, a systematic underestimation of PET quantification in PET/MR has been observed. This is attributable to two methodological challenges of MR-based attenuation correction (AC): (1) lack of bone information, and (2) truncation of the MR-based AC maps (μmaps) along the patient arms. The aim of this study was to evaluate the impact of improved AC featuring a bone atlas and truncation correction on PET quantification in whole-body PET/MR. The MR-based Dixon method provides four-compartment μmaps (background air, lungs, fat, soft tissue) which served as a reference for PET/MR AC in this study. A model-based bone atlas provided bone tissue as a fifth compartment, while the HUGE method provided truncation correction. The study population comprised 51 patients with oncological diseases, all of whom underwent a whole-body PET/MR examination. Each whole-body PET dataset was reconstructed four times using standard four-compartment μmaps, five-compartment μmaps, four-compartment μmaps + HUGE, and five-compartment μmaps + HUGE. The SUV{sub max} for each lesion was measured to assess the impact of each μmap on PET quantification. All four μmaps in each patient provided robust results for reconstruction of the AC PET data. Overall, SUV{sub max} was quantified in 99 tumours and lesions. Compared to the reference four-compartment μmap, the mean SUV{sub max} of all 99 lesions increased by 1.4 ± 2.5% when bone was added, by 2.1 ± 3.5% when HUGE was added, and by 4.4 ± 5.7% when bone + HUGE was added. Larger quantification bias of up to 35% was found for single lesions when bone and truncation correction were added to the μmaps, depending on their individual location in the body. The novel AC method, featuring a bone model and truncation correction, improved PET quantification in whole-body PET/MR imaging. Short reconstruction times, straightforward
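
    The reported percentages are relative SUVmax changes against the four-compartment reference μmap. A minimal sketch of that computation, with invented lesion values rather than the study's measurements:

    ```python
    # Relative change of a lesion's SUVmax against the reference reconstruction.
    def percent_change(suv_new: float, suv_ref: float) -> float:
        return 100.0 * (suv_new - suv_ref) / suv_ref

    suv_reference = 6.8   # four-compartment μmap (hypothetical lesion)
    suv_bone_huge = 7.1   # five-compartment μmap + HUGE (hypothetical)

    print(f"{percent_change(suv_bone_huge, suv_reference):+.1f}%")
    ```

    Averaging this quantity over all lesions gives the cohort-level figures (e.g. +4.4 ± 5.7% for bone + HUGE), while individual lesions can deviate far more, as the up-to-35% single-lesion bias shows.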

  16. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  17. Plasticity Detection and Quantification in Monopile Support Structures Due to Axial Impact Loading

    Directory of Open Access Journals (Sweden)

    Meijers P.C.

    2018-01-01

    Full Text Available Recent developments in the construction of offshore wind turbines have created the need for a method to detect whether a monopile foundation is plastically deformed during the installation procedure. Since measurements at the pile head are difficult to perform, a method based on measurements at a certain distance below the pile head is proposed in this work for quantification of the amount of plasticity. By considering a one-dimensional rod model with an elastic-perfectly plastic constitutive relation, it is shown that the occurrence of plastic deformation caused by an impact load can be detected from these measurements. Furthermore, this plastic deformation can be quantified from the same measurements with the help of an energy balance. The effectiveness of the proposed method is demonstrated via a numerical example.
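
    The energy-balance idea can be sketched in its simplest form (a conceptual illustration with invented numbers, not the authors' formulation): for an elastic-perfectly plastic rod, whatever part of the impact energy is neither stored elastically nor carried away by the propagating wave must have been dissipated plastically.

    ```python
    # Residual of a simple impact energy balance; a positive residual is
    # attributed to plastic dissipation in the pile.
    def plastic_dissipation(input_energy: float, elastic_energy: float,
                            radiated_energy: float) -> float:
        """Energy (J) unaccounted for by elastic storage and wave radiation."""
        residual = input_energy - elastic_energy - radiated_energy
        return max(residual, 0.0)  # non-positive residual: no plasticity detected

    # Hypothetical hammer blow on a monopile (all energies in joules):
    print(plastic_dissipation(input_energy=250e3,
                              elastic_energy=180e3,
                              radiated_energy=55e3))  # 15000.0 J dissipated
    ```

    In the paper the energy terms are reconstructed from strain measurements below the pile head; the sketch only shows the bookkeeping step.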

  18. Impact of polymeric membrane filtration of oil sands process water on organic compounds quantification.

    Science.gov (United States)

    Moustafa, Ahmed M A; Kim, Eun-Sik; Alpatova, Alla; Sun, Nian; Smith, Scott; Kang, Seoktae; Gamal El-Din, Mohamed

    2014-01-01

    The interaction between organic fractions in oil sands process-affected water (OSPW) and three polymeric membranes with varying hydrophilicity (nylon, polyvinylidene fluoride and polytetrafluoroethylene) at different pHs was studied to evaluate the impact of filtration on the quantification of acid-extractable fraction (AEF) and naphthenic acids (NAs). Four functional groups predominated in OSPW (amine, phosphoryl, carboxyl and hydroxyl) as indicated by the linear programming method. The nylon membranes were the most hydrophilic and exhibited the lowest AEF removal at pH 8.7. However, the adsorption of AEF on the membranes increased as the pH of OSPW decreased due to hydrophobic interactions between the membrane surfaces and the protonated molecules. The use of ultra-performance liquid chromatography-high resolution mass spectrometry (UPLC/HRMS) showed insignificant adsorption of NAs on the tested membranes at pH 8.7. However, 26±2.4% adsorption of NAs was observed at pH 5.3 following the protonation of NA species. For the nylon membrane, excessive carboxylic acids in the commercial NAs caused the formation of negatively charged assisted hydrogen bonds, resulting in increased adsorption at pH 8.2 (25%) as compared to OSPW (0%). The use of membranes for filtration of soluble compounds from complex oily wastewaters before quantification analysis of AEF and NAs should be examined prior to application.

  19. The impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging

    International Nuclear Information System (INIS)

    Liu Chi; Pierce II, Larry A; Alessio, Adam M; Kinahan, Paul E

    2009-01-01

    Our aim is to investigate the impact of respiratory motion on tumor quantification and delineation in static PET/CT imaging using a population of patient respiratory traces. A total of 1295 respiratory traces acquired during whole body PET/CT imaging were classified into three types according to the qualitative shape of their signal histograms. Each trace was scaled to three diaphragm motion amplitudes (6 mm, 11 mm and 16 mm) to drive a whole body PET/CT computer simulation that was validated with a physical phantom experiment. Three lung lesions and one liver lesion were simulated with diameters of 1 cm and 2 cm. PET data were reconstructed using the OS-EM algorithm with attenuation correction using CT images at the end-expiration phase and respiratory-averaged CT. The errors of the lesion maximum standardized uptake values (SUVmax) and lesion volumes between motion-free and motion-blurred PET/CT images were measured and analyzed. For respiration with 11 mm diaphragm motion and a larger quiescent period fraction, respiratory motion can cause a mean lesion SUVmax underestimation of 28% and a mean lesion volume overestimation of 130% in PET/CT images with 1 cm lesions. The errors of lesion SUVmax and volume are larger for patient traces with larger motion amplitudes. Smaller lesions are more sensitive to respiratory motion than larger lesions for the same motion amplitude. Patient respiratory traces with a relatively larger quiescent period fraction yield results less subject to respiratory motion than traces with long-term amplitude variability. Mismatched attenuation correction due to respiratory motion can cause SUVmax overestimation for lesions in the lower lung region close to the liver dome. Using respiratory-averaged CT for attenuation correction yields smaller mismatch errors than using end-expiration CT. Respiratory motion can have a significant impact on static oncological PET/CT imaging where SUV and/or volume measurements are important.
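The blurring mechanism behind these SUVmax and volume errors can be illustrated with a toy 1-D profile (this is not the paper's simulation; lesion size, uptake, and motion kernel are assumed values for illustration):

```python
import numpy as np

# Toy 1-D illustration of how respiratory blurring lowers SUVmax and
# inflates the delineated volume. Lesion size, uptake level and motion
# amplitude are assumptions, not the study's simulation parameters.
x = np.linspace(-30.0, 30.0, 601)          # mm grid, 0.1 mm spacing
lesion = 8.0 * (np.abs(x) <= 5.0)          # 1 cm lesion, "SUV" 8, zero background

sigma = 11.0 / 2.355                       # ~11 mm motion, FWHM -> sigma
kernel = np.exp(-0.5 * (x / sigma) ** 2)
kernel /= kernel.sum()                     # normalized blur kernel
blurred = np.convolve(lesion, kernel, mode="same")

suvmax_error = (blurred.max() - lesion.max()) / lesion.max()   # < 0: underestimation
true_volume = (lesion > 0.4 * lesion.max()).sum()              # 40%-of-max delineation
blurred_volume = (blurred > 0.4 * blurred.max()).sum()
volume_error = (blurred_volume - true_volume) / true_volume    # > 0: overestimation
```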

  20. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    Energy Technology Data Exchange (ETDEWEB)

    Guest, Geoffrey, E-mail: geoffrey.guest@ntnu.no; Bright, Ryan M., E-mail: ryan.m.bright@ntnu.no; Cherubini, Francesco, E-mail: francesco.cherubini@ntnu.no; Strømman, Anders H., E-mail: anders.hammer.stromman@ntnu.no

    2013-11-15

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO₂eq per kg CO₂ stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO₂eq per kg CO₂ stored. As an example, when biogenic CO₂ from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO₂eq per kg CO₂ stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of

  1. Consistent quantification of climate impacts due to biogenic carbon storage across a range of bio-product systems

    International Nuclear Information System (INIS)

    Guest, Geoffrey; Bright, Ryan M.; Cherubini, Francesco; Strømman, Anders H.

    2013-01-01

    Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO₂eq per kg CO₂ stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO₂eq per kg CO₂ stored. As an example, when biogenic CO₂ from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO₂eq per kg CO₂ stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of resource and carbon storage
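The sign and magnitude of such storage GWP factors can be illustrated with a simplified pulse model: capture of 1 kg CO₂ at time zero and re-emission after the storage period, weighted by the Bern-model atmospheric CO₂ impulse response (coefficients as published in IPCC assessment reports). This sketch ignores biomass regrowth dynamics and albedo, so it does not reproduce the paper's numbers, but it shows why the factor is generally neither 0 nor −1:

```python
import numpy as np

# Simplified pulse view of temporary biogenic carbon storage: -1 kg CO2
# at t = 0 (sequestration), +1 kg CO2 at t = delay (re-emission), weighted
# by the Bern-model CO2 impulse response function (IRF). Illustrative only.
A = [0.217, 0.259, 0.338, 0.186]      # airborne fractions (Bern model)
TAU = [172.9, 18.51, 1.186]           # decay time scales, years

def irf(t):
    return A[0] + sum(a * np.exp(-t / tau) for a, tau in zip(A[1:], TAU))

def agwp(horizon, dt=0.01):
    t = np.arange(0.0, horizon, dt)
    return irf(t).sum() * dt          # time-integrated airborne fraction

def storage_gwp(delay, horizon=100.0):
    """kg CO2-eq per kg CO2 stored over the given time horizon."""
    if delay >= horizon:
        return -1.0                   # re-emission falls outside the horizon
    return (agwp(horizon - delay) - agwp(horizon)) / agwp(horizon)

factor_43y = storage_gwp(43.0)        # e.g. a furniture-like storage period
```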

  2. Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.

    Science.gov (United States)

    Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina

    2014-01-01

    Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels, and these pressures reduce food security by limiting humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon dubbed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87% of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally, with implications for food security, poverty levels and urbanization. While our study incorporates a number of assumptions and limitations, it provides a much-needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.

  3. Impact of muscular uptake and statistical noise on tumor quantification based on simulated FDG-PET studies

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro

    2017-01-01

    Purpose: The aim of this work is to study the effect of physiological muscular uptake variations and statistical noise on tumor quantification in FDG-PET studies. Methods: We designed a realistic framework based on simulated FDG-PET acquisitions from an anthropomorphic phantom that included different muscular uptake levels and three spherical lung lesions with diameters of 31, 21 and 9 mm. A distribution of muscular uptake levels was obtained from 136 patients referred to our center for whole-body FDG-PET. Simulated FDG-PET acquisitions were obtained using the Simulation System for Emission Tomography (SimSET) Monte Carlo package. Simulated data were reconstructed using an iterative Ordered Subset Expectation Maximization (OSEM) algorithm implemented in the Software for Tomographic Image Reconstruction (STIR) library. Tumor quantification was carried out using estimations of SUVmax, SUV50 and SUVmean from different noise realizations, lung lesions and multiple muscular uptakes. Results: Our analysis provided quantification variability values of 17–22% (SUVmax), 11–19% (SUV50) and 8–10% (SUVmean) when muscular uptake variations and statistical noise were included. Meanwhile, quantification variability due only to statistical noise was 7–8% (SUVmax), 3–7% (SUV50) and 1–2% (SUVmean) for large tumors (>20 mm) and 13% (SUVmax), 16% (SUV50) and 8% (SUVmean) for small tumors (<10 mm), showing that the variability in tumor quantification is mainly driven by muscular uptake variations when large enough tumors are considered. In addition, our results showed that quantification variability is strongly dominated by statistical noise when the injected dose decreases below 222 MBq. Conclusions: Our study revealed that muscular uptake variations between patients who are totally relaxed should be considered as an uncertainty source for tumor quantification values.
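The variability figures above are coefficients of variation across repeated noise realizations; as a reminder of the metric, a minimal sketch with hypothetical SUVmax values:

```python
import statistics

# Reminder of the variability metric used above: the coefficient of
# variation (CV) of a quantification value across repeated realizations.
def coefficient_of_variation(values):
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical SUVmax estimates from six simulated noise realizations
suvmax_runs = [7.9, 8.4, 7.2, 8.8, 7.6, 8.1]
cv = coefficient_of_variation(suvmax_runs)   # ~0.07, i.e. ~7% variability
```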

  4. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
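Using the quantification result to set the STR input, as described above, amounts to a simple mass-to-volume calculation; a hedged sketch (the target mass and volume cap are assumptions for illustration, not values from the kit documentation):

```python
# Hedged illustration of using a direct-quant result to set STR input.
# Target mass and maximum volume are assumptions for the sketch.
def str_input_volume(conc_ng_per_ul, target_ng=0.5, max_volume_ul=15.0):
    """Volume of extract giving ~target_ng DNA, capped by the reaction volume."""
    if conc_ng_per_ul <= 0.0:
        return max_volume_ul           # no quantifiable DNA: add the maximum
    return min(target_ng / conc_ng_per_ul, max_volume_ul)

vol_ul = str_input_volume(0.042)       # hypothetical 0.042 ng/uL quant result
```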

  5. Impact of the Definition of Peak Standardized Uptake Value on Quantification of Treatment Response

    Science.gov (United States)

    Vanderhoek, Matt; Perlman, Scott B.; Jeraj, Robert

    2012-01-01

    PET-based treatment response assessment typically measures the change in maximum standardized uptake value (SUVmax), which is adversely affected by noise. Peak SUV (SUVpeak) has been recommended as a more robust alternative, but its associated region of interest (ROIpeak) is not uniquely defined. We investigated the impact of different ROIpeak definitions on quantification of SUVpeak and tumor response. Methods Seventeen patients with solid malignancies were treated with a multitargeted receptor tyrosine kinase inhibitor resulting in a variety of responses. Using the cellular proliferation marker 3′-deoxy-3′-18F-fluorothymidine (18F-FLT), whole-body PET/CT scans were acquired at baseline and during treatment. 18F-FLT–avid lesions (~2/patient) were segmented on PET images, and tumor response was assessed via the relative change in SUVpeak. For each tumor, 24 different SUVpeaks were determined by changing ROIpeak shape (circles vs. spheres), size (7.5–20 mm), and location (centered on SUVmax vs. placed in highest-uptake region), encompassing different definitions from the literature. Within each tumor, variations in the 24 SUVpeaks and tumor responses were measured using coefficient of variation (CV), standard deviation (SD), and range. For each ROIpeak definition, a population average SUVpeak and tumor response were determined over all tumors. Results A substantial variation in both SUVpeak and tumor response resulted from changing the ROIpeak definition. The variable ROIpeak definition led to an intratumor SUVpeak variation ranging from 49% above to 46% below the mean (CV, 17%) and an intratumor SUVpeak response variation ranging from 49% above to 35% below the mean (SD, 9%). The variable ROIpeak definition led to a population average SUVpeak variation ranging from 24% above to 28% below the mean (CV, 14%) and a population average SUVpeak response variation ranging from only 3% above to 3% below the mean (SD, 2%). The size of ROIpeak caused more
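The two placement conventions compared above can be made concrete on a toy uptake map: a fixed-size ROI centered on the SUVmax voxel versus the same ROI placed wherever it maximizes the mean uptake (2-D circles here for brevity; the study also used spheres, and the uptake map below is synthetic):

```python
import numpy as np

# Toy 2-D illustration of two ROIpeak conventions: a circular ROI centered
# on the SUVmax voxel vs. the same ROI placed to maximize the mean uptake.
def suv_peak(img, radius, center_on_max=True):
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]

    def roi_mean(cy, cx):
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        return img[mask].mean()

    if center_on_max:
        cy, cx = np.unravel_index(np.argmax(img), img.shape)
        return roi_mean(cy, cx)
    # exhaustive placement: report the highest achievable ROI mean
    return max(roi_mean(cy, cx)
               for cy in range(img.shape[0]) for cx in range(img.shape[1]))

rng = np.random.default_rng(0)
uptake = rng.normal(5.0, 1.0, size=(21, 21))   # synthetic tumor SUV map
peak_centered = suv_peak(uptake, radius=3, center_on_max=True)
peak_searched = suv_peak(uptake, radius=3, center_on_max=False)
# The freely placed ROI can only report an equal or higher SUVpeak.
```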

  6. Impact factors and the optimal parameter of acoustic structure quantification in the assessment of liver fibrosis.

    Science.gov (United States)

    Huang, Yang; Liu, Guang-Jian; Liao, Bing; Huang, Guang-Liang; Liang, Jin-Yu; Zhou, Lu-Yao; Wang, Fen; Li, Wei; Xie, Xiao-Yan; Wang, Wei; Lu, Ming-De

    2015-09-01

    The aims of the present study were to assess the factors affecting acoustic structure quantification (ASQ) ultrasound and to find the optimal parameter for the assessment of liver fibrosis. Twenty healthy volunteers underwent ASQ examinations to evaluate impact factors in ASQ image acquisition and analysis. An additional 113 patients with liver diseases underwent standardized ASQ examinations, and the results were compared with histologic staging of liver fibrosis. We found that the right liver displayed lower values of ASQ parameters than the left (p = 0.000-0.021). Receive gain had no significant impact except gain 70 (p = 0.193-1.000). With regard to the diameter of vessels involved in regions of interest, the group ≤2.0 mm differed significantly from the group 2.1-5.0 mm (p = 0.000-0.033) and the group >5.0 mm (p = 0.000-0.062). However, the region of interest size (p = 0.438-1.000) and depth (p = 0.072-0.764) had no statistical impact. Good intra- and inter-operator reproducibility was found in both image acquisitions and offline image analyses. In the liver fibrosis study, the focal disturbance ratio had the highest correlation with histologic fibrosis stage (r = 0.67), making it the most promising ASQ parameter for the assessment of liver fibrosis. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  7. Quantification of the impact of endometriosis symptoms on health-related quality of life and work productivity.

    Science.gov (United States)

    Fourquet, Jessica; Báez, Lorna; Figueroa, Michelle; Iriarte, R Iván; Flores, Idhaliz

    2011-07-01

    To quantify the impact of endometriosis-related symptoms on physical and mental health status, health-related quality of life, and work-related aspects (absenteeism, presenteeism, work productivity, and activity impairment). Cross-sectional quantitative study. Academic and research institution. Women (n = 193) with self-reported surgically diagnosed endometriosis from the Endometriosis Patient Registry at Ponce School of Medicine and Health Sciences (PSMHS). Anonymous questionnaire divided into three sections consisting of questions from the Patient Health Survey (SF-12), the Endometriosis Health Profile (EHP-5), and the Work Productivity and Activity Impairment Survey (WPAI). Quantification of impact of endometriosis symptoms on physical and mental health status, health-related quality of life, absenteeism, presenteeism, work productivity, and activity impairment. Patients had SF-12 scores denoting statistically significant disability in the physical and mental health components. They also reported an average of 7.41 hours (approximately one working day) of work time lost during the week when the symptoms are worse. In addition, the WPAI scores showed a high impact on work-related domains: 13% of average loss in work time (absenteeism), 65% of work impaired (presenteeism), 64% of loss in efficiency levels (work productivity loss), and 60% of daily activities perturbed (activity impairment). Endometriosis symptoms such as chronic, incapacitating pelvic pain and infertility negatively and substantially impact the physical and mental health status, health-related quality of life, and productivity at work of women. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
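The WPAI domains reported above (absenteeism, presenteeism, overall work impairment, activity impairment) follow standard published scoring formulas; a sketch with hypothetical questionnaire answers, loosely echoing the ~7.4 hours of weekly work time lost reported in this study:

```python
# Hedged sketch of WPAI-style scoring. The formulas are the commonly
# published WPAI ones; all input values below are hypothetical.
def wpai_scores(hours_missed, hours_worked, impair_working_0_10, impair_activity_0_10):
    absenteeism = hours_missed / (hours_missed + hours_worked)
    presenteeism = impair_working_0_10 / 10.0
    # Overall work impairment combines time missed and impairment while working.
    overall = absenteeism + (1.0 - absenteeism) * presenteeism
    return {
        "absenteeism_pct": round(100 * absenteeism, 1),
        "presenteeism_pct": round(100 * presenteeism, 1),
        "overall_work_impairment_pct": round(100 * overall, 1),
        "activity_impairment_pct": round(100 * impair_activity_0_10 / 10.0, 1),
    }

scores = wpai_scores(hours_missed=7.4, hours_worked=32.6,
                     impair_working_0_10=6.5, impair_activity_0_10=6.0)
```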

  8. Quantification bias caused by plasmid DNA conformation in quantitative real-time PCR assay.

    Science.gov (United States)

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification.
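The standard curves discussed above map quantification cycle (Cq) to log10 copy number, so a conformation-induced shift of this line propagates directly into the absolute quantification. A minimal least-squares fit with illustrative dilution-series data (the values are chosen to give a textbook slope, not taken from the study):

```python
import math

# Standard-curve sketch: fit Cq against log10(copy number) and derive the
# amplification efficiency E = 10**(-1/slope) - 1. A vertical shift of
# this line (e.g. from plasmid conformation) becomes a fold-error in the
# absolute copy-number estimate.
def fit_standard_curve(copies, cq):
    logs = [math.log10(c) for c in copies]
    n = len(logs)
    mx, my = sum(logs) / n, sum(cq) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(logs, cq))
             / sum((x - mx) ** 2 for x in logs))
    intercept = my - slope * mx
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Illustrative 10-fold dilution series (perfectly log-linear, slope -3.3,
# i.e. ~100% efficiency; these are not measurements from the study)
copies = [1e7, 1e6, 1e5, 1e4, 1e3]
cq = [15.1, 18.4, 21.7, 25.0, 28.3]
slope, intercept, efficiency = fit_standard_curve(copies, cq)
```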

  9. Leishmania parasite detection and quantification using PCR-ELISA

    Czech Academy of Sciences Publication Activity Database

    Kobets, Tetyana; Badalová, Jana; Grekov, Igor; Havelková, Helena; Lipoldová, Marie

    2010-01-01

    Roč. 5, č. 6 (2010), s. 1074-1080 ISSN 1754-2189 R&D Projects: GA ČR GA310/08/1697; GA MŠk(CZ) LC06009 Institutional research plan: CEZ:AV0Z50520514 Keywords : polymerase chain reaction * Leishmania major infection * parasite quantification Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 8.362, year: 2010

  10. LC-MS3 quantification of O-glycopeptides in human serum

    Czech Academy of Sciences Publication Activity Database

    Sanda, M.; Pompach, Petr; Benicky, J.; Goldman, R.

    2013-01-01

    Roč. 34, č. 16 (2013), s. 2342-2349 ISSN 0173-0835 R&D Projects: GA MŠk LH13051 Institutional support: RVO:61388971 Keywords : LC-MS3 * O-glycosylation * Quantification of glycopeptides Subject RIV: CE - Biochemistry Impact factor: 3.161, year: 2013

  11. Impact of Personal Characteristics and Technical Factors on Quantification of Sodium 18F-Fluoride Uptake in Human Arteries

    DEFF Research Database (Denmark)

    Blomberg, Björn Alexander; Thomassen, Anders; de Jong, Pim A

    2015-01-01

    Sodium (18)F-fluoride ((18)F-NaF) PET/CT imaging is a promising imaging technique for assessment of atherosclerosis, but is hampered by a lack of validated quantification protocols. Both personal characteristics and technical factors can affect quantification of arterial (18)F-NaF uptake....... This study investigated if blood activity, renal function, injected dose, circulating time, and PET/CT system affect quantification of arterial (18)F-NaF uptake. METHODS: Eighty-nine healthy subjects were prospectively examined by (18)F-NaF PET/CT imaging. Arterial (18)F-NaF uptake was quantified...... assessed the effect of personal characteristics and technical factors on quantification of arterial (18)F-NaF uptake. RESULTS: NaFmax and TBRmax/mean were dependent on blood activity (β = .34 to .44, P

  12. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Full Text Available Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics. The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC, cholesterol, dimethyldioctadecylammonium (DDA bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB. The developed method offers rapidity, high sensitivity, good linearity, and good consistency of the responses (R2 > 0.993 for the four lipids tested. The corresponding limit of detection (LOD and limit of quantification (LOQ were 0.11 and 0.36 mg/mL (DMPC, 0.02 and 0.80 mg/mL (cholesterol, 0.06 and 0.20 mg/mL (DDA, and 0.05 and 0.16 mg/mL (TDB, respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
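LOD and LOQ values like those quoted above are conventionally derived from the calibration line; a sketch using the common ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S (the σ and slope values here are hypothetical, not the paper's calibration data):

```python
# Sketch of the usual ICH-style detection/quantification limits derived
# from a calibration line: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where
# sigma is the standard deviation of the blank/low-level response and S
# is the calibration slope. Numbers below are hypothetical.
def lod_loq(sigma_response, slope):
    return 3.3 * sigma_response / slope, 10.0 * sigma_response / slope

lod, loq = lod_loq(sigma_response=0.8, slope=24.0)   # mg/mL if S is per mg/mL
```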

  13. Management Environnemental [Environmental Management]

    CERN Document Server

    Guthapfel, C

    1999-01-01

    Like any industry, CERN is responsible for the waste it produces. For this reason, the TFM/MS section of the ST division has developed a system for managing this waste. The Laboratory's activities generate nearly 3000 tonnes of waste per year, corresponding to that of both a local community and an industrial site. Two classes of waste are distinguished: ordinary industrial waste (Déchets Industriels Banals) and special industrial waste (Déchets Industriels Spéciaux). Within each class, every type of waste is characterized by how it is produced, how it is collected, and its disposal channels. The site's geographical situation, straddling the Franco-Swiss border, entails special procedures and higher costs. Since the environment is an integral part of the Organization's concerns, the service in charge of this activity has set itself the mission, while taking into account the wishes of the various managers and stakeholders involved, of optimizing its system...

  14. Impact environnemental d'une désulfuration poussée des gazoles [Environmental Impact of Deep Gas Oil Desulfurization]

    Directory of Open Access Journals (Sweden)

    Armengol C.

    2006-11-01

    Full Text Available In the space of about ten years, diesel has seen spectacular growth in the French and European automobile markets and could account, in 1995, for half of private-vehicle registrations in France and a quarter in Western Europe. This situation inevitably raises problems: environmental problems, since the diesel engine is a larger source of nitrogen oxide and particulate emissions than the catalyst-equipped gasoline engine, but also problems for the refining industry, which in France is no longer able to meet gas oil demand. Moreover, as of 1 October 1996, the sulfur content of road diesel fuel must not exceed 0.05%, in accordance with the new European specifications. The prospect of producing deeply desulfurized fuels will directly affect the refinery's hydrogen balance, and hence its own energy consumption and CO₂ emissions. The objective of this study is to measure the environmental impact of reducing the sulfur content of gas oil from 0.3 to 0.05%. The assessment covers the entire energy chain, from crude oil extraction to fuel combustion in the engine. Gains and losses in terms of local or global pollution are evaluated according to the source of the hydrogen used (partial oxidation of vacuum residues or coal, steam reforming of natural gas or naphtha, electrolysis) and the nature of the feed treated during hydrodesulfurization (straight-run gas oil or light cycle oil).

  15. Detection and quantification of microparticles from different cellular lineages using flow cytometry. Evaluation of the impact of secreted phospholipase A2 on microparticle assessment.

    Science.gov (United States)

    Rousseau, Matthieu; Belleannee, Clemence; Duchez, Anne-Claire; Cloutier, Nathalie; Levesque, Tania; Jacques, Frederic; Perron, Jean; Nigrovic, Peter A; Dieude, Melanie; Hebert, Marie-Josee; Gelb, Michael H; Boilard, Eric

    2015-01-01

    Microparticles, also called microvesicles, are submicron extracellular vesicles produced by plasma membrane budding and shedding recognized as key actors in numerous physio(patho)logical processes. Since they can be released by virtually any cell lineages and are retrieved in biological fluids, microparticles appear as potent biomarkers. However, the small dimensions of microparticles and soluble factors present in body fluids can considerably impede their quantification. Here, flow cytometry with improved methodology for microparticle resolution was used to detect microparticles of human and mouse species generated from platelets, red blood cells, endothelial cells, apoptotic thymocytes and cells from the male reproductive tract. A family of soluble proteins, the secreted phospholipases A2 (sPLA2), comprises enzymes concomitantly expressed with microparticles in biological fluids and that catalyze the hydrolysis of membrane phospholipids. As sPLA2 can hydrolyze phosphatidylserine, a phospholipid frequently used to assess microparticles, and might even clear microparticles, we further considered the impact of relevant sPLA2 enzymes, sPLA2 group IIA, V and X, on microparticle quantification. We observed that if enriched in fluids, certain sPLA2 enzymes impair the quantification of microparticles depending on the species studied, the source of microparticles and the means of detection employed (surface phosphatidylserine or protein antigen detection). This study provides analytical considerations for appropriate interpretation of microparticle cytofluorometric measurements in biological samples containing sPLA2 enzymes.

  16. Impact of sequential proton density fat fraction for quantification of hepatic steatosis in nonalcoholic fatty liver disease.

    Science.gov (United States)

    Idilman, Ilkay S; Keskin, Onur; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay

    2014-05-01

    To determine the utility of sequential MRI-estimated proton density fat fraction (MRI-PDFF) for quantification of the longitudinal changes in liver fat content in individuals with nonalcoholic fatty liver disease (NAFLD). A total of 18 consecutive individuals (M/F: 10/8, mean age: 47.7±9.8 years) diagnosed with NAFLD, who underwent sequential PDFF calculations for the quantification of hepatic steatosis at two different time points, were included in the study. All patients underwent T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling. A close correlation for quantification of hepatic steatosis between the initial MRI-PDFF and liver biopsy was observed (rs = 0.758). Sequential MRI-PDFF captured longitudinal changes in hepatic steatosis, and the changes in serum ALT levels significantly reflected changes in MRI-PDFF in patients with NAFLD.
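MRI-PDFF itself is simply the fat signal as a fraction of the total fat-plus-water proton signal, once the multi-echo acquisition has removed T1 and T2* confounders; a minimal sketch with hypothetical signal values:

```python
# Minimal sketch of the proton density fat fraction: fat signal as a share
# of the total fat + water proton signal. Signal values are hypothetical.
def pdff_percent(fat_signal, water_signal):
    return 100.0 * fat_signal / (fat_signal + water_signal)

baseline = pdff_percent(fat_signal=22.0, water_signal=78.0)    # 22.0 %
follow_up = pdff_percent(fat_signal=15.0, water_signal=85.0)   # 15.0 %
change = follow_up - baseline          # -7 percentage points of liver fat
```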

  17. Impact of acid atmospheric deposition on soils : quantification of chemical and hydrologic processes

    NARCIS (Netherlands)

    Grinsven, van J.J.M.

    1988-01-01

    Atmospheric deposition of SOx, NOx and NHx will cause major changes in the chemical composition of solutions in acid soils, which may affect the biological functions of the soil. This thesis deals with quantification of soil acidification by means of chemical

  18. Uncertainty Quantification of the Reverse Taylor Impact Test and Localized Asynchronous Space-Time Algorithm

    Science.gov (United States)

    Subber, Waad; Salvadori, Alberto; Lee, Sangmin; Matous, Karel

    2017-06-01

    The reverse Taylor impact is a common experiment to investigate the dynamical response of materials at high strain rates. To better understand the physical phenomena and to provide a platform for code validation and Uncertainty Quantification (UQ), a co-designed simulation and experimental paradigm is investigated. For validation under uncertainty, quantities of interest (QOIs) within subregions of the computational domain are introduced. For such simulations where regions of interest can be identified, the computational cost for UQ can be reduced by confining the random variability within these regions of interest. This observation inspired us to develop an asynchronous space and time computational algorithm with localized UQ. In the region of interest, high resolution space and time discretization schemes are used for a stochastic model. Outside the region of interest, low spatial and temporal resolutions are allowed for a stochastic model with a low dimensional representation of uncertainty. The model is exercised on linear elastodynamics and shows potential for reducing the UQ computational cost. Although we consider wave propagation in solids, the proposed framework is general and can be used for fluid flow problems as well. Department of Energy, National Nuclear Security Administration (PSAAP-II).

  19. Impact of benzodiazepines on brain FDG-PET quantification after single-dose and chronic administration in rats

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesús; García-Varela, Lara; López-Arias, Esteban; Domínguez-Prado, Inés; Cortés, Julia; Pardo-Montero, Juan; Fernández-Ferreiro, Anxo

    2016-01-01

    Introduction: Current guidelines for brain PET imaging advise against the injection of diazepam prior to a brain FDG-PET examination, in order to avoid possible interactions of benzodiazepines with radiotracer uptake. Nevertheless, many patients undergoing PET studies are likely to be under chronic treatment with benzodiazepines, for example through medications such as sleeping pills. Animal studies may provide an extensive and accurate estimation of the effect of benzodiazepines on brain metabolism in a well-defined and controlled framework. Aim: This study aims to evaluate the impact of benzodiazepines on brain FDG uptake after single-dose administration and chronic treatment in rats. Methods: Twelve healthy Sprague–Dawley rats were randomly divided into two groups, one treated with diazepam and the other used as a control group. Both groups underwent PET/CT examinations after single-dose and chronic administration of diazepam (treated) or saline (controls) over twenty-eight days. Different atlas-based quantification methods were used to explore differences in the total uptake and uptake patterns of FDG between the two groups. Results: Our analysis revealed a significant reduction of global FDG uptake after acute (−16.2%) and chronic (−23.2%) administration of diazepam. Moreover, a strong trend pointing to differences between acute and chronic administration (p < 0.08) was also observed. Uptake levels returned to normal after interrupting the administration of diazepam. On the other hand, patterns of FDG uptake were not affected by the administration of diazepam. Conclusions: The administration of diazepam causes a progressive decrease of global FDG uptake in the rat brain, but it does not change local patterns within the brain. Under these conditions, visual assessment and quantification methods based on regional differences such as asymmetry indexes or SPM statistical analysis would still be valid when administering this

  20. Voltammetric Quantification of Paraquat and Glyphosate in Surface Waters

    Directory of Open Access Journals (Sweden)

    William Roberto Alza-Camacho

    2016-09-01

    The indiscriminate use of pesticides on crops has a negative environmental impact that affects organisms, soil and water resources essential for life. It is therefore necessary to evaluate the residual effect of these substances in water sources. A simple, affordable and accessible electrochemical method for Paraquat and Glyphosate quantification in water was developed. The study was conducted using a Britton-Robinson buffer solution as the supporting electrolyte, a glassy carbon working electrode, an Ag/AgCl reference electrode, and a platinum auxiliary electrode. Differential pulse voltammetry (DPV) methods for both compounds were validated. The methods were linear, with correlation coefficients of 0.9949 and 0.9919, and the limits of detection and quantification were 130 and 190 mg/L for Paraquat and 40 and 50 mg/L for Glyphosate. Comparison with the reference method showed that the electrochemical method provides superior results in the quantification of these analytes. In the samples tested, Paraquat concentrations ranged from 0.011 to 1.572 mg/L and Glyphosate from 0.201 to 2.777 mg/L, indicating that these compounds are present in water sources and may be causing serious problems for human health.
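
    Detection and quantification limits like those reported above are commonly derived from the calibration line's residual standard deviation and slope (LOD = 3.3·s/slope, LOQ = 10·s/slope). A minimal sketch with hypothetical calibration data, not the paper's values:

```python
import numpy as np

# Hypothetical calibration points: concentration (mg/L) vs. peak current (uA)
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
current = np.array([0.42, 0.81, 1.65, 3.28, 6.61])

# Least-squares fit of the calibration line
slope, intercept = np.polyfit(conc, current, 1)
residuals = current - (slope * conc + intercept)
s_res = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))  # residual std. dev.

# IUPAC-style estimates from the calibration line
lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope

r = np.corrcoef(conc, current)[0, 1]  # linearity check
print(f"r = {r:.4f}, LOD = {lod:.3f} mg/L, LOQ = {loq:.3f} mg/L")
```

    The same fit yields the correlation coefficient used to report linearity, so one regression serves both validation figures.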

  1. Nuclear and mitochondrial DNA quantification of various forensic materials.

    Science.gov (United States)

    Andréasson, H; Nilsson, M; Budowle, B; Lundberg, H; Allen, M

    2006-12-01

    Due to the different types and quality of forensic evidence materials, their DNA content can vary substantially, and particularly low quantities can impact the results of an identification analysis. In this study, the quantity of mitochondrial and nuclear DNA was determined in a variety of materials using a previously described real-time PCR method. DNA quantification in the roots and distal sections of plucked and shed head hairs revealed large variations in DNA content, particularly between the root and the shaft of plucked hairs. Large intra- and inter-individual variations were also found among hairs. In addition, DNA content was estimated in samples collected from fingerprints and accessories. The quantification of DNA on various items also displayed large variations, with some materials containing large amounts of nuclear DNA, while no detectable nuclear DNA and only limited amounts of mitochondrial DNA were seen in others. Using this sensitive real-time PCR quantification assay, a better understanding was obtained of DNA content and variation in commonly analysed forensic evidence materials, and this may guide the forensic scientist to the best molecular biology approach for analysing various forensic evidence materials.

  2. Localized quantification of anhydrous calcium carbonate polymorphs using micro-Raman spectroscopy

    Czech Academy of Sciences Publication Activity Database

    Ševčík, Radek; Mácová, Petra

    2018-01-01

    Roč. 95, March (2018), s. 1-6 ISSN 0924-2031 R&D Projects: GA MŠk(CZ) LO1219 Keywords : micro-Raman * quantification * calcite * vaterite * aragonite * nanolime Subject RIV: CA - Inorganic Chemistry Impact factor: 1.740, year: 2016 http://www.sciencedirect.com/science/article/pii/S0924203117302886

  3. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance and uncertainty analyses. Accident sequence quantification requires an understanding of the entire PSA model, because it must combine all event tree and fault tree models, and it requires an efficient computer code because the computation is time-consuming. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs.
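
    At its core, the quantification stage described above combines minimal cut sets (produced by a cut set generator such as KIRAP's) into a core damage frequency. A minimal sketch with hypothetical cut sets and failure probabilities, not taken from KIRAP:

```python
# Hypothetical minimal cut sets: each maps basic-event names to probabilities.
cut_sets = [
    {"PUMP_A_FAILS": 1e-3, "PUMP_B_FAILS": 1e-3},    # redundant-pump path
    {"DG_FAILS": 5e-3, "OFFSITE_POWER_LOST": 2e-2},  # station-blackout path
    {"VALVE_STUCK": 1e-4},                           # single-failure path
]

def cut_set_prob(cs):
    """Probability of one minimal cut set: product of its basic events."""
    p = 1.0
    for prob in cs.values():
        p *= prob
    return p

# Rare-event approximation: core damage frequency ~ sum of cut-set probabilities.
rare_event = sum(cut_set_prob(cs) for cs in cut_sets)

# Min-cut upper bound, 1 - prod(1 - P(cs)): slightly tighter for larger probabilities.
product = 1.0
for cs in cut_sets:
    product *= 1.0 - cut_set_prob(cs)
upper_bound = 1.0 - product

print(f"rare-event approximation: {rare_event:.3e}")
print(f"min-cut upper bound:      {upper_bound:.3e}")
```

    Real PSA codes add importance and uncertainty analyses on top of this combination step, which is what makes efficient cut-set handling necessary.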

  4. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logic, the analysis of reliability data and, finally, accident sequence quantification. In a PSA, accident sequence quantification calculates the core damage frequency and supports importance and uncertainty analyses. Accident sequence quantification requires an understanding of the entire PSA model, because it must combine all event tree and fault tree models, and it requires an efficient computer code because the computation is time-consuming. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures for performing accident sequence quantification, the method for using KIRAP's cut set generator, and the method for performing accident sequence quantification with KIRAP. (author). 6 refs.

  5. The profitability of "green" buildings

    OpenAIRE

    Thalmann, Philippe

    2016-01-01

    Green buildings have a low environmental impact while also offering good social conditions, or even environmental gains (air quality, biodiversity). They are therefore exemplary in various areas: energy, land use, biodiversity, materials. Does this necessarily imply an extra cost? Can this extra cost be recouped, or must the developer settle for the favourable outcomes for the environment and society?

  6. Controlling banana cercosporiosis diseases in the French West Indies: Banatrace, a multi-stakeholder geographic information system for managing and tracing aerial spraying

    OpenAIRE

    DUMAS, Marine; DUMAS, Marine

    2011-01-01

    How can a geographic information system help improve the reliability of aerial treatments against fungal diseases of banana while reducing their environmental impact? Answers from the French West Indies with the multi-stakeholder software tool Banatrace, designed to help managers meet the various traceability requirements imposed by recent regulations on aerial spraying.

  7. Evaluation of the impact of matrix effect on quantification of pesticides in foods by gas chromatography-mass spectrometry using isotope-labeled internal standards.

    Science.gov (United States)

    Yarita, Takashi; Aoyagi, Yoshie; Otake, Takamitsu

    2015-05-29

    The impact of the matrix effect in GC-MS quantification of pesticides in food using the corresponding isotope-labeled internal standards was evaluated. A spike-and-recovery study of nine target pesticides was first conducted using paste samples of corn, green soybean, carrot, and pumpkin. The analytical values observed using isotope-labeled internal standards were more accurate for most target pesticides than those obtained using the external calibration method, but were still biased relative to the spiked concentrations when a matrix-free calibration solution was used. Calibration curves for each target pesticide were also prepared using matrix-free calibration solutions and matrix-matched calibration solutions with blank soybean extract. The ratio of the peak intensity of most target pesticides to that of the corresponding isotope-labeled internal standard was influenced by the presence of the matrix in the calibration solution, and the observed slope therefore varied. The ratio was also influenced by the type of injection method (splitless or on-column). These results indicate that matrix-matching of the calibration solution is required for very accurate quantification, even when isotope-labeled internal standards are used for calibration. Copyright © 2015 Elsevier B.V. All rights reserved.
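
    Internal-standard quantification of the kind evaluated above works by calibrating the analyte-to-internal-standard peak ratio against concentration and then inverting that line for a sample. A minimal sketch with hypothetical numbers, not the paper's data:

```python
import numpy as np

# Hypothetical matrix-matched calibration: analyte concentration (ng/g) vs.
# peak-area ratio of analyte to its isotope-labeled internal standard.
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
area_ratio = np.array([0.21, 0.52, 1.04, 2.07, 4.15])

# Fit the ratio-based calibration line
slope, intercept = np.polyfit(conc, area_ratio, 1)

def quantify(ratio):
    """Invert the calibration line to estimate concentration from a ratio."""
    return (ratio - intercept) / slope

sample_ratio = 1.50  # hypothetical ratio measured in a sample extract
print(f"estimated concentration: {quantify(sample_ratio):.1f} ng/g")
```

    The paper's point is that the slope of this line shifts when the calibration solution lacks the sample matrix, which is why matrix-matching matters even with labeled internal standards.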

  8. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc

    2016-06-20

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  9. RNAontheBENCH: computational and empirical resources for benchmarking RNAseq quantification and differential expression methods

    KAUST Repository

    Germain, Pierre-Luc; Vitriolo, Alessandro; Adamo, Antonio; Laise, Pasquale; Das, Vivek; Testa, Giuseppe

    2016-01-01

    RNA sequencing (RNAseq) has become the method of choice for transcriptome analysis, yet no consensus exists as to the most appropriate pipeline for its analysis, with current benchmarks suffering from important limitations. Here, we address these challenges through a rich benchmarking resource harnessing (i) two RNAseq datasets including ERCC ExFold spike-ins; (ii) Nanostring measurements of a panel of 150 genes on the same samples; (iii) a set of internal, genetically-determined controls; (iv) a reanalysis of the SEQC dataset; and (v) a focus on relative quantification (i.e. across samples). We use this resource to compare different approaches to each step of RNAseq analysis, from alignment to differential expression testing. We show that methods providing the best absolute quantification do not necessarily provide good relative quantification across samples, that count-based methods are superior for gene-level relative quantification, and that the new generation of pseudo-alignment-based software performs as well as established methods, at a fraction of the computing time. We also assess the impact of library type and size on quantification and differential expression analysis. Finally, we have created an R package and a web platform to enable the simple and streamlined application of this resource to the benchmarking of future methods.

  10. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and estimating the overall measurement uncertainty is therefore challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and was also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric
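
    An uncertainty propagation equation of the kind used above combines independent error sources through first-order sensitivity coefficients. A generic sketch with a simplified stereo reconstruction formula and hypothetical camera angles and uncertainty magnitudes, not the paper's model:

```python
import numpy as np

# First-order propagation: var(w) ~ sum over inputs of (dw/dx_i)^2 * var(x_i),
# assuming independent errors. Simplified out-of-plane reconstruction:
#   w = (u1 - u2) / (tan(a1) + tan(a2))

a1, a2 = np.deg2rad(30.0), np.deg2rad(30.0)  # camera half-angles (calibration)
u1, u2 = 1.20, 0.80                           # projected velocities (m/s)
sig_u = 0.02                                  # planar velocity uncertainty (m/s)
sig_a = np.deg2rad(0.1)                       # calibration angle uncertainty

denom = np.tan(a1) + np.tan(a2)
w = (u1 - u2) / denom

# Partial derivatives (sensitivity coefficients)
dw_du1 = 1.0 / denom
dw_du2 = -1.0 / denom
dw_da1 = -(u1 - u2) / denom**2 / np.cos(a1)**2
dw_da2 = -(u1 - u2) / denom**2 / np.cos(a2)**2

# Combine planar and calibration contributions in quadrature
sig_w = np.sqrt((dw_du1 * sig_u)**2 + (dw_du2 * sig_u)**2
                + (dw_da1 * sig_a)**2 + (dw_da2 * sig_a)**2)
print(f"w = {w:.4f} m/s, combined uncertainty = {sig_w:.4f} m/s")
```

    With these numbers the planar terms dominate and the angle terms are small, mirroring the paper's finding that the method is more sensitive to the planar uncertainty estimates when disparity is low.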

  11. Biomass to energy : GHG reduction quantification protocols and case study

    Energy Technology Data Exchange (ETDEWEB)

    Reusing, G.; Taylor, C. [Conestoga - Rovers and Associates, Waterloo, ON (Canada); Nolan, W. [Liberty Energy, Hamilton, ON (Canada); Kerr, G. [Index Energy, Ajax, ON (Canada)

    2009-07-01

    With growing concern over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil-fuel sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass which would otherwise be landfilled or stockpiled. The greenhouse gas reductions achieved by biomass-to-energy systems can be quantified using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of them. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing and landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  12. Biomass to energy : GHG reduction quantification protocols and case study

    International Nuclear Information System (INIS)

    Reusing, G.; Taylor, C.; Nolan, W.; Kerr, G.

    2009-01-01

    With growing concern over greenhouse gases and their contribution to climate change, it is necessary to find ways of reducing environmental impacts by diversifying energy sources to include non-fossil-fuel sources. Among the fastest growing green energy sources are energy-from-waste facilities that use biomass which would otherwise be landfilled or stockpiled. The greenhouse gas reductions achieved by biomass-to-energy systems can be quantified using various protocols and methodologies. This paper described each of these methodologies and presented a case study comparing some of them. A summary and comparison of biomass-to-energy greenhouse gas reduction protocols in use or under development by the United Nations, the European Union, the Province of Alberta and Environment Canada was presented. It was concluded that regulatory and environmental pressures and public policy will continue to impact the practices associated with biomass processing and landfill operations, such as composting or, in the case of landfills, gas collection systems, thus reducing the amount of potential credit available for biomass-to-energy facility offset projects. 10 refs., 2 tabs., 6 figs.

  13. Quantification methods of Black Carbon: Comparison of Rock-Eval analysis with traditional methods

    NARCIS (Netherlands)

    Poot, A.; Quik, J.T.K.; Veld, H.; Koelmans, A.A.

    2009-01-01

    Black Carbon (BC) quantification methods are reviewed, including new Rock-Eval 6 data on BC reference materials. BC has been reported to have major impacts on climate, human health and environmental quality. Especially for risk assessment of persistent organic pollutants (POPs) it is important to

  14. CT quantification of pleuropulmonary lesions in severe thoracic trauma

    International Nuclear Information System (INIS)

    Kunisch-Hoppe, M.; Bachmann, G.; Weimar, B.; Bauer, T.; Rau, W.S.; Hoppe, M.; Zickmann, B.

    1997-01-01

    Purpose: Computed quantification of the extent of pleuropulmonary trauma by CT and comparison with conventional chest X-ray, with assessment of the impact on therapy and the correlation with mechanical ventilation support and clinical outcome. Method: In a prospective trial, 50 patients with clinically suspected blunt chest trauma were evaluated using CT and conventional chest X-ray. The computed quantification of ventilated lung provided by CT volumetry was correlated with the subsequent artificial respiration parameters and the clinical outcome. Results: We found a high correlation between CT volumetry and artificial ventilation with respect to maximal pressures and inspiratory oxygen concentration (FiO2, Goris score) (r=0.89, Pearson). The grading of thoracic trauma correlated highly with the duration of mechanical ventilation (r=0.98, Pearson). CT is superior to conventional chest X-ray especially with regard to atelectases and lung contusions; only 32% and 43%, respectively, were identified by conventional chest X-ray. (orig./AJ)

  15. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    Science.gov (United States)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  16. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genomes. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Handling of degradation-prone bioptic material is therefore inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze–thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would make it possible to clarify the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolating the measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA), we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time different degradation impact of the two
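
    The serial qPCR approach above infers a decay constant λ from the fall-off of amplification yield with amplicon size. A sketch with hypothetical data, assuming simple exponential decay of intact template with fragment length:

```python
import numpy as np

# Hypothetical serial qPCR: yield relative to the shortest amplicon drops
# with amplicon length L in a degraded sample, roughly as exp(-lambda * L).
lengths = np.array([100, 200, 400, 600, 800])       # amplicon size (bp)
rel_amp = np.array([1.00, 0.72, 0.38, 0.20, 0.11])  # normalized yield

# Linearize: ln(yield) = -lambda * L + const, then fit in the log domain
neg_lam, ln_a0 = np.polyfit(lengths, np.log(rel_amp), 1)
lam = -neg_lam  # decay constant per bp

half_length = np.log(2) / lam  # amplicon size at which yield halves
print(f"lambda = {lam:.2e} per bp, half-yield length ~ {half_length:.0f} bp")
```

    Fitting λ separately for a nuclear and a mitochondrial locus, as the record describes, would expose the disparate degradation of the two genomes.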

  17. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    degraded samples in the future. To our knowledge this is the first time that the different degradation behaviour of the two genomes has been demonstrated, and the impact of DNA degradation on quantification of mtDNA copy number is systematically evaluated.

  18. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than...... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  19. Quantification of character-impacting compounds in Ocimum basilicum and 'Pesto alla Genovese' with selected ion flow tube mass spectrometry.

    Science.gov (United States)

    Amadei, Gianluca; Ross, Brian M

    2012-02-15

    Basil (Ocimum basilicum) is an important flavourant plant which constitutes the major ingredient of the pasta sauce 'Pesto alla Genovese'. The characteristic smell of basil stems mainly from a handful of terpenoids (methyl cinnamate, eucalyptol, linalool and estragole), the concentrations of which vary among basil cultivars. The simple and rapid analysis of the terpenoid constituents of basil would be useful as a means to optimise harvesting times and to act as a quality control process for basil-containing foodstuffs. Classical analytical techniques such as gas chromatography/mass spectrometry (GC/MS) are, however, slow and technically demanding, and therefore less suitable for routine analysis. A new chemical ionisation technique which allows real-time quantification of trace gases, Selected Ion Flow Tube Mass Spectrometry (SIFT-MS), was therefore utilised to determine its usefulness for the assay of terpenoid concentrations in basil and pesto sauce headspace. Trace gas analysis was performed using the NO(+) precursor ion, which minimised interference from other compounds. Character-impacting compound concentrations were measured in basil headspace with good reproducibility, and statistically significant differences were observed between cultivars. Quantification of linalool in pesto sauce headspace proved more difficult due to the presence of interfering compounds. This was resolved by careful selection of reaction product ions, which allowed us to detect differences between various commercial brands of pesto. We conclude that SIFT-MS may be a valid tool for the fast and reproducible analysis of flavourant terpenoids in basil and basil-derived foodstuffs. Copyright © 2011 John Wiley & Sons, Ltd.

  20. Quantifying "apparent" impact and distinguishing impact from invasiveness in multispecies plant invasions

    Science.gov (United States)

    Dean E. Pearson; Yvette K. Ortega; Ozkan Eren; Jose L. Hierro

    2015-01-01

    The quantification of invader impacts remains a major hurdle to understanding and managing invasions. Here, we demonstrate a method for quantifying the community-level impact of multiple plant invaders by applying Parker et al.'s (1999) equation (impact = range × local abundance × per capita effect, or per unit effect) using data from 620 survey plots from 31...
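
    Parker et al.'s equation in the abstract above is a simple product of three factors, which makes its trade-offs easy to see: a widespread invader with a weak per-capita effect can score lower than a patchy but potent one. A sketch with hypothetical values:

```python
# Parker et al. (1999): impact = range x local abundance x per-capita (or
# per-unit) effect. All numbers below are illustrative, not survey data.
invaders = {
    # name:      (range: fraction of plots occupied,
    #             mean local abundance (% cover),
    #             per-capita effect (native richness lost per % cover))
    "Centaurea": (0.60, 25.0, 0.08),
    "Bromus":    (0.90, 40.0, 0.01),
}

impacts = {}
for name, (occupancy, abundance, effect) in invaders.items():
    impacts[name] = occupancy * abundance * effect
    print(f"{name}: community-level impact = {impacts[name]:.2f}")
```

    Here the less widespread invader has the larger community-level impact, illustrating why impact and invasiveness must be distinguished.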

  1. A refined methodology for modeling volume quantification performance in CT

    Science.gov (United States)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation quality for FBP and iterative reconstruction (IR) due to the non-linearity of IR as impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
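
    The quantum-noise measurement from repeated scans described above relies on subtraction cancelling the fixed anatomical structure while doubling the noise variance, so the difference image's standard deviation is divided by the square root of 2. A synthetic sketch (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated repeated scans of the same phantom: fixed anatomy + noise
anatomy = rng.normal(0.0, 30.0, size=(64, 64))          # fixed structure (HU)
scan1 = anatomy + rng.normal(0.0, 12.0, size=(64, 64))  # noise realization 1
scan2 = anatomy + rng.normal(0.0, 12.0, size=(64, 64))  # noise realization 2

# Subtracting repeats cancels the anatomy; the difference image has variance
# 2*sigma^2, so divide its std. dev. by sqrt(2) to recover the noise level.
diff = scan1 - scan2
sigma_est = diff.std() / np.sqrt(2)
print(f"estimated quantum noise: {sigma_est:.1f} HU (true value: 12.0)")
```

    This is what allows the noise component of e' to be measured even in a textured background, where a single image's standard deviation would mix anatomy and noise.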

  2. Socio-environmental impact of marine sand mining on the ...

    African Journals Online (AJOL)

    Marine sand mining in this area began and continues to grow as a result of the Commune's rapid urbanization, linked to the construction of ... In addition, the uncontrolled exploitation has increased coastal erosion and does little to improve the economic and social conditions of the people who engage in it ...

  3. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for the quantification of chest abnormalities are usually applied to CT scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is not feasible given the number of CT scans required. The purpose of this work is to develop a methodology for the quantification of lung damage caused by TB through chest radiographs. An algorithm was developed in Matlab for computational processing of the exams, which creates a 3D representation of the lungs with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared, resulting in a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the developed method, providing a better risk-benefit ratio for the patient and cost-benefit ratio for the institution. (author)
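
    The Bland-Altman analysis mentioned above computes the mean difference (bias) between paired measurements and limits of agreement at bias ± 1.96 SD. A sketch with hypothetical paired lesion-volume measurements, not the study's data:

```python
import numpy as np

# Hypothetical paired lesion-volume measurements (cm^3): radiograph vs. CT
xray = np.array([120.0, 85.0, 200.0, 150.0, 60.0, 175.0])
ct   = np.array([132.0, 90.0, 185.0, 160.0, 55.0, 190.0])

diff = xray - ct
bias = diff.mean()
sd = diff.std(ddof=1)                 # sample standard deviation
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd  # limits of agreement

within = np.logical_and(diff >= loa_low, diff <= loa_high).mean()
print(f"bias = {bias:.1f} cm^3, LoA = [{loa_low:.1f}, {loa_high:.1f}] cm^3, "
      f"{within:.0%} of samples within limits")
```

    A result like the paper's, with all samples inside the limits of agreement, supports interchangeability of the two quantification methods within the stated variation.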

  4. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is commonly performed by absorption spectroscopy, typically at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Fluorescence spectroscopy is therefore the best method for melanin quantification, as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices such as zebrafish embryos and human hair.

  5. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Verb aspect, alternations and quantification. In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while the detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  6. Pitfalls in the analysis of volatile breath biomarkers: suggested solutions and SIFT-MS quantification of single metabolites

    Czech Academy of Sciences Publication Activity Database

    Smith, D.; Španěl, Patrik

    2015-01-01

    Roč. 9, č. 2 (2015), 022001 ISSN 1752-7155 Institutional support: RVO:61388955 Keywords : SIFT-MS * volatile biomarkers * quantifications Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 4.177, year: 2015

  7. Nuclear Magnetic Resonance: new applications in the quantification and assessment of polysaccharide-based vaccine intermediates

    International Nuclear Information System (INIS)

    Garrido, Raine; Velez, Herman; Verez, Vicente

    2013-01-01

    Nuclear Magnetic Resonance has become the method of choice for structural studies, identity assays and simultaneous quantification of the active pharmaceutical ingredient of different polysaccharide-based vaccines. In the last two decades, the application of quantitative Nuclear Magnetic Resonance has had an increasing impact in supporting several quantification needs. The technique involves experiments with several modified parameters in order to obtain spectra with quantifiable signals. The present review draws on recent relevant reports and discusses several applications of NMR to carbohydrate-based vaccines. Moreover, it emphasizes and describes several parameters and applications of quantitative Nuclear Magnetic Resonance.

  8. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly impact the course of the disease. The quantification of HIV-1 DNA in blood samples currently constitutes the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and standardization is needed to obtain comparable results. In addition, qPCR is limited by background noise in the precise quantification of low levels. Among new assays in development, digital PCR was shown to allow an accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes.

  9. A "Toy" Model for Operational Risk Quantification using Credibility Theory

    OpenAIRE

    Hans B\\"uhlmann; Pavel V. Shevchenko; Mario V. W\\"uthrich

    2009-01-01

    To meet the Basel II regulatory requirements for the Advanced Measurement Approaches in operational risk, the bank's internal model should make use of internal data, relevant external data, scenario analysis and factors reflecting the business environment and internal control systems. One of the unresolved challenges in operational risk is how to combine these data sources appropriately. In this paper we focus on the quantification of the low-frequency high-impact losses exceeding some high thr...

  10. Synopsis of research conducted under the 1991/92 northern contaminants program: from a workshop in support of the technical and science managers committees on northern ecosystems and native diets, Ottawa, Ontario, March 24-25, 1992

    National Research Council Canada - National Science Library

    Murray, J. L; Shearer, R. G

    1992-01-01

    .... This report summarizes the results of monitored research and studies undertaken under the auspices of the Arctic Environmental Strategy of Canada's Green Plan in 1991/92, concerning...

  11. Quantification of the impact of a confounding variable on functional connectivity confirms anti-correlated networks in the resting-state.

    Science.gov (United States)

    Carbonell, F; Bellec, P; Shmuel, A

    2014-02-01

    The effect of regressing out the global average signal (GAS) in resting state fMRI data has become a concern for interpreting functional connectivity analyses. It is not clear whether the reported anti-correlations between the Default Mode and the Dorsal Attention Networks are intrinsic to the brain, or are artificially created by regressing out the GAS. Here we introduce a concept, Impact of the Global Average on Functional Connectivity (IGAFC), for quantifying the sensitivity of seed-based correlation analyses to the regression of the GAS. This voxel-wise IGAFC index is defined as the product of two correlation coefficients: the correlation between the GAS and the fMRI time course of a voxel, times the correlation between the GAS and the seed time course. This definition enables the calculation of a threshold at which the impact of regressing-out the GAS would be large enough to introduce spurious negative correlations. It also yields a post-hoc impact correction procedure via thresholding, which eliminates spurious correlations introduced by regressing out the GAS. In addition, we introduce an Artificial Negative Correlation Index (ANCI), defined as the absolute difference between the IGAFC index and the impact threshold. The ANCI allows a graded confidence scale for ranking voxels according to their likelihood of showing artificial correlations. By applying this method, we observed regions in the Default Mode and Dorsal Attention Networks that were anti-correlated. These findings confirm that the previously reported negative correlations between the Dorsal Attention and Default Mode Networks are intrinsic to the brain and not the result of statistical manipulations. Our proposed quantification of the impact that a confound may have on functional connectivity can be generalized to global effect estimators other than the GAS. It can be readily applied to other confounds, such as systemic physiological or head movement interferences, in order to quantify their
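
    The voxel-wise IGAFC index defined above is simply the product of two Pearson correlations. A minimal sketch with synthetic time courses (the signal loadings and the impact threshold value here are illustrative assumptions, not values from the paper):

    ```python
    import numpy as np

    def corr(x, y):
        """Pearson correlation between two time courses."""
        return float(np.corrcoef(x, y)[0, 1])

    def igafc(gas, seed, voxel):
        """Impact of the Global Average on Functional Connectivity:
        corr(GAS, voxel time course) * corr(GAS, seed time course)."""
        return corr(gas, voxel) * corr(gas, seed)

    rng = np.random.default_rng(1)
    t = 200
    gas = rng.normal(size=t)                  # global average signal
    seed = 0.7 * gas + rng.normal(size=t)     # seed region partly driven by GAS
    voxel = 0.5 * gas + rng.normal(size=t)    # voxel partly driven by GAS

    index = igafc(gas, seed, voxel)
    threshold = 0.1                           # hypothetical impact threshold
    anci = abs(index - threshold)             # Artificial Negative Correlation Index
    print(index, anci)
    ```

    Voxels whose index exceeds the threshold are the ones where regressing out the GAS could introduce spurious negative correlations; the ANCI then ranks voxels by how far they sit from that boundary.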

  12. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  13. Effects of humic acid on DNA quantification with Quantifiler® Human DNA Quantification kit and short tandem repeat amplification efficiency.

    Science.gov (United States)

    Seo, Seung Bum; Lee, Hye Young; Zhang, Ai Hua; Kim, Hye Yeon; Shin, Dong Hoon; Lee, Soong Deok

    2012-11-01

    Correct DNA quantification is essential to obtaining reliable STR typing results. Forensic DNA analysts often use commercial kits for DNA quantification; among them, real-time PCR-based DNA quantification kits are the most frequently used. Incorrect DNA quantification due to the presence of PCR inhibitors may affect experimental results. In this study, we examined the degree to which DNA quantification results are altered in DNA samples containing a PCR inhibitor, using a Quantifiler® Human DNA Quantification kit. For the experiments, we prepared DNA samples of approximately 0.25 ng/μl containing various concentrations of humic acid (HA). The quantification results were 0.194-0.303 ng/μl at 0-1.6 ng/μl HA (final concentration in the Quantifiler reaction) and 0.003-0.168 ng/μl at 2.4-4.0 ng/μl HA. Most DNA quantities were undetermined when the HA concentration was higher than 4.8 ng/μl. The C(T) values of the internal PCR control (IPC) were 28.0-31.0, 36.5-37.1, and undetermined at 0-1.6, 2.4, and 3.2 ng/μl HA, respectively. These results indicate that underestimated DNA quantification results may be obtained in DNA samples with high IPC C(T) values. Thus, researchers should carefully interpret DNA quantification results. We additionally examined the effects of HA on STR amplification by using an Identifiler® kit and a MiniFiler™ kit. Based on the results of this study, a better understanding of the various effects of HA should help researchers recognize and handle samples containing HA.
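
    Real-time PCR quantification of this kind rests on a standard curve relating C(T) to log10 of the input DNA quantity, and inhibitors such as HA shift C(T) upward so the same DNA reads as less. A hedged sketch (the slope and intercept below are illustrative, not the Quantifiler kit's calibration):

    ```python
    # Hypothetical standard curve: CT = INTERCEPT + SLOPE * log10(quantity in ng/ul).
    # A slope near -3.32 corresponds to roughly 100% amplification efficiency.
    SLOPE, INTERCEPT = -3.32, 27.0

    def quantity_from_ct(ct):
        """Invert the standard curve to estimate DNA quantity (ng/ul)."""
        return 10 ** ((ct - INTERCEPT) / SLOPE)

    def efficiency(slope):
        """PCR amplification efficiency implied by the curve slope."""
        return 10 ** (-1.0 / slope) - 1.0

    # An inhibited sample amplifies later (higher CT), so it appears to contain less DNA.
    ct_clean, ct_inhibited = 29.0, 32.0
    print(quantity_from_ct(ct_clean), quantity_from_ct(ct_inhibited))
    print(efficiency(SLOPE))
    ```

    This is why a delayed IPC C(T), as reported in the study, flags that the accompanying sample quantification is likely an underestimate.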

  14. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
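
    The non-intrusive polynomial chaos idea can be illustrated on a scalar toy model with one standard-normal input, using probabilists' Hermite polynomials and Gauss-Hermite quadrature (a generic sketch, not the SDTRIM setup or its parameters):

    ```python
    import math
    import numpy as np
    from numpy.polynomial import hermite_e as He

    def model(x):
        # Toy response standing in for an expensive simulation
        return x ** 2

    order, nquad = 4, 20
    # Gauss-Hermite(E) quadrature for the standard normal weight exp(-x^2/2)
    nodes, weights = He.hermegauss(nquad)
    weights = weights / weights.sum()          # normalize to a probability measure

    # Spectral coefficients c_k = E[f(X) He_k(X)] / k!
    coeffs = []
    for k in range(order + 1):
        hek = He.hermeval(nodes, [0] * k + [1])
        coeffs.append(np.sum(weights * model(nodes) * hek) / math.factorial(k))

    mean = coeffs[0]
    var = sum(c ** 2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
    print(mean, var)   # for f(x) = x^2 with X ~ N(0,1): mean 1, variance 2
    ```

    The magnitudes of the higher-order coefficients play the role of the sensitivity measure mentioned in the abstract: they show which input directions contribute most to the output variance, at far fewer function evaluations than full-grid or sampling approaches.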

  15. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...

  16. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

    Optical molecular luminescence imaging is widely used for detection and imaging of bio-photons emitted upon activation of luminescent luciferase. The photons measured by this method indicate the degree of molecular alteration or the cell number, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that yields a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources. One comprised three bacterial light-emitting sources containing different numbers of bacteria; the other comprised three different non-bacterial sources emitting very weak light. Using the concepts of the candela and the flux, we derived a simplified linear quantification formula. After experimentally measuring light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time was constant regardless of which light source was applied. The quantification function for linear response could be applied to the standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment, providing linear response behavior of constant light-emitting sources with measurement time.

  17. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography - definition and challenges; quantification-biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement.

  18. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  19. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures that are currently available and of the kinds of experimental situations and analytical problems they address. The last point is extended by a description of the author's own development of the fundamental parameter method, which makes it possible to include non-parallel beam geometries. Finally, open problems for the quantification procedures are discussed.

  20. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra constitutes the main subject under investigation in this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they have shown several drawbacks regarding their quantification accuracy under difficult conditions. A general framework that treats the quantification procedure as an optimization problem, solved using a genetic algorithm (GA), is proposed in this paper. Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique deals with the overlapping of metabolite peaks, a considerably difficult situation occurring under real conditions. Appropriate experiments have demonstrated the efficiency of the introduced methodology on artificial MRS data, establishing it as a generic metabolite quantification procedure.
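
    The "quantification as optimization" framing can be sketched by fitting the amplitudes of two overlapping Lorentzian lineshapes with an evolutionary optimizer. Here scipy's `differential_evolution` stands in for the paper's GA, and peak positions and widths are assumed known; all numeric values are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    def lorentzian(x, amp, pos, width):
        """Single Lorentzian lineshape."""
        return amp * width ** 2 / ((x - pos) ** 2 + width ** 2)

    x = np.linspace(0, 10, 500)
    TRUE_AMPS, POS, WID = (3.0, 1.5), (4.0, 5.0), (0.6, 0.6)   # overlapping peaks
    spectrum = sum(lorentzian(x, a, p, w) for a, p, w in zip(TRUE_AMPS, POS, WID))

    def residual(amps):
        # Objective: squared error between candidate fit and observed spectrum
        fit = sum(lorentzian(x, a, p, w) for a, p, w in zip(amps, POS, WID))
        return np.sum((spectrum - fit) ** 2)

    result = differential_evolution(residual, bounds=[(0, 10), (0, 10)],
                                    seed=0, tol=1e-10)
    print(result.x)   # amplitudes recovered close to (3.0, 1.5) despite the overlap
    ```

    The same recipe extends to more realistic settings by widening the parameter vector to include positions, widths and phases, which is where population-based optimizers earn their keep over local least-squares fits.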

  2. Monitoring of the radiological environmental impact of the AREVA site of Tricastin; Suivi de l'impact radiologique environnemental des activites du site AREVA du Tricastin

    Energy Technology Data Exchange (ETDEWEB)

    Mercat, C.; Brun, F.; Florens, P.; Petit, J. [AREVA NC Pierrelatte, Direction surete environnement du site du Tricastin, 26 (France); Garnier, F. [EURODIF Production, Direction qualite securite surete environnement, 26 (France); Devin, P. [AREVA NC Pierrelatte, Direction surete, sante, securite, environnement, 26 (France)

    2010-06-15

    Set up at the beginning of the site's operations in 1962, the monitoring of the radiological environmental impact of the AREVA Tricastin site has evolved over time to meet more specifically the multiple objectives of environmental monitoring: to demonstrate compliance with the commitments required by the authorities, to be able to detect a malfunction from the observed levels, to enable the assessment of the impacts of industrial activities, to ensure the balance between environmental quality and the use made of it by the local population, and to inform the public of the radiological state of the environment. Thousands of data points have been acquired on the radioactivity of all environmental compartments as well as on the functioning of local ecosystems. Today, the Environmental Monitoring Network of AREVA Tricastin goes beyond the requirements of routine monitoring to provide innovative solutions for monitoring radioactivity (especially uranium) in the environment. (author)

  3. Quantification of traffic generation and its environmental impacts through decisions, frameworks and measures indirectly influencing transportation; Quantifizierung der Verkehrsentstehung und deren Umweltauswirkungen durch Entscheidungen, Regelwerke und Massnahmen mit indirektem Verkehrsbezug

    Energy Technology Data Exchange (ETDEWEB)

    Holz-Rau, C.; Hesse, M.; Geier, S.; Holzhey, A.; Rau, P.; Schreiner, J.; Schenk, E. [Buero fuer Integrierte Planung, Herdecke (Germany); Arndt, W.H.; Flaemig, H.; Rogge, L.; Steinfeld, M. [Institut fuer Oekologische Wirtschaftsforschung (IOEW) gGmbH, Berlin (Germany)

    2000-09-01

    Legislation outside the transportation policy sphere (e.g. at the Federal or State level) may have indirect impacts on traffic structure and generation. Four case studies were carried out in order to analyse transport-related impact chains, to quantify possible transportation impacts, and to identify potential strategies for minimising impacts generated by non-transport-specific legislation. One result is a questionnaire designed as a further element of transportation impact assessment. The goal is to quickly evaluate potential impacts and to modify such proposed legislation in order to minimise negative impacts. Interestingly, the quantification of certain transportation impacts, a difficult task, proved not to be essential. (orig.)

  4. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was exercised using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, with mitigation benefits ranging between 4 and 6.5 tCO₂ ha⁻¹ yr⁻¹; the benefits differ significantly depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  5. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    Science.gov (United States)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and to represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.

  6. Le développement durable est-il bienvenu dans les organisations ? Cas de l’implantation d’un Système de Management Environnemental en Tunisie

    Directory of Open Access Journals (Sweden)

    Moez Ben Yedder

    2009-02-01

    Socially responsible firms are those that operate within a logic of sustainable development. Their commitment is materialised, in particular, by the deployment of environmentally friendly management systems such as environmental management systems (EMS). Beyond their ecological scope, these systems also have a social purpose, since they improve the working conditions of operators. However, their implementation can be hindered by those same operators if they do not see their own interests served when such a system is introduced. The purpose of this paper is to support this idea through the case study of a Tunisian firm in which the implementation of an EMS failed, in part because of the resistance of the personnel. In this sense, we defend the idea that sustainable development and corporate social responsibility practices cannot be grafted abruptly onto a firm, but require certain organisational prerequisites in order to be put in place.

  7. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized Perturbation Theory (GPT) has been widely used by many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT theory, collectively referred to as Exact-to-Precision Generalized Perturbation Theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest; it tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply an EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and where nonlinear behavior must be considered. To allow this demonstration, exaggerated variations are employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  8. Overview of Environmental Impact Assessment of Oil and Gas ...

    African Journals Online (AJOL)

    The environmental impact assessment (EIA) of oil and gas projects in Nigeria ... natural, social and health components of the environment; Determination of issues ... of impact quantification through which the Environmental Management Plan ...

  9. LC-MS/MS strategies for therapeutic antibodies and investigation into the quantitative impact of antidrug-antibodies.

    Science.gov (United States)

    Ewles, Matthew; Mannu, Ranbir; Fox, Chris; Stanta, Johannes; Evans, Graeme; Goodwin, Lee; Duffy, James; Bell, Len; Estdale, Sian; Firth, David

    2016-12-01

    We aimed to establish novel, high-throughput LC-MS/MS strategies for quantification of monoclonal antibodies in human serum and to examine the potential impact of antidrug antibodies. We present two strategies using a thermally stable immobilized trypsin. The first uses whole-serum digestion; the second introduces Protein G enrichment to improve selectivity. The impact of anti-trastuzumab antibodies on the methods was tested. Whole-serum digestion has been validated for trastuzumab (LLOQ 0.25 µg/ml). Protein G enrichment has been validated for trastuzumab (LLOQ 0.1 µg/ml), bevacizumab (LLOQ 0.1 µg/ml) and adalimumab (LLOQ 0.25 µg/ml). We have shown the potential for antidrug antibodies to affect quantification and have subsequently established a strategy to overcome this effect where total quantification is desired.

  10. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    Science.gov (United States)

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products due to the numerous detrimental effects caused by their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses the up-to-date knowledge on measurement techniques available for PAHs in food and related products, with the aim of reducing their deleterious impact on human health through accurate quantification. The main part of this review is dedicated to the opportunities and practical options for the treatment of various food samples and for accurate quantification of the PAHs they contain. Basic information regarding all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to performance in terms of quality assurance.

  11. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.

  12. Accurate Digital Polymerase Chain Reaction Quantification of Challenging Samples Applying Inhibitor-Tolerant DNA Polymerases.

    Science.gov (United States)

    Sidstedt, Maja; Romsos, Erica L; Hedell, Ronny; Ansell, Ricky; Steffen, Carolyn R; Vallone, Peter M; Rådström, Peter; Hedman, Johannes

    2017-02-07

    Digital PCR (dPCR) enables absolute quantification of nucleic acids by partitioning of the sample into hundreds or thousands of minute reactions. By assuming a Poisson distribution for the number of DNA fragments present in each chamber, the DNA concentration is determined without the need for a standard curve. However, when analyzing nucleic acids from complex matrices such as soil and blood, the dPCR quantification can be biased due to the presence of inhibitory compounds. In this study, we evaluated the impact of varying the DNA polymerase in chamber-based dPCR for both pure and impure samples using the common PCR inhibitor humic acid (HA) as a model. We compared the TaqMan Universal PCR Master Mix with two alternative DNA polymerases: ExTaq HS and Immolase. By using Bayesian modeling, we show that there is no difference among the tested DNA polymerases in terms of accuracy of absolute quantification for pure template samples, i.e., without HA present. For samples containing HA, there were great differences in performance: the TaqMan Universal PCR Master Mix failed to correctly quantify DNA with more than 13 pg/nL HA, whereas Immolase (1 U) could handle up to 375 pg/nL HA. Furthermore, we found that BSA had a moderate positive effect for the TaqMan Universal PCR Master Mix, enabling accurate quantification at 25 pg/nL HA. Increasing the amount of DNA polymerase from 1 to 5 U had a strong effect for ExTaq HS, increasing HA tolerance fourfold. We also show that the average Cq values of positive reactions may be used as a measure of inhibition effects, e.g., to determine whether or not a dPCR quantification result is reliable. The statistical models developed to objectively analyze the data may also be applied in quality control. We conclude that the choice of DNA polymerase in dPCR is crucial for the accuracy of quantification when analyzing challenging samples.
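The Poisson assumption described in this record can be made concrete: with the fraction of negative partitions known, the mean copies per partition follows from λ = −ln(f_neg). A minimal sketch, with partition counts and volume invented for illustration (not taken from the study):

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_nl):
    """Estimate target concentration (copies/µL) from digital PCR counts.

    Assumes copies are Poisson-distributed across partitions, so the
    mean copies per partition is lambda = -ln(fraction of negatives).
    No standard curve is needed.
    """
    if n_positive >= n_total:
        raise ValueError("all partitions positive: concentration above dynamic range")
    frac_negative = 1 - n_positive / n_total
    lam = -math.log(frac_negative)          # mean copies per partition
    copies_per_nl = lam / partition_volume_nl
    return copies_per_nl * 1000             # copies per µL

# Example: 6000 of 20000 partitions positive, 0.85 nL partitions
print(round(dpcr_concentration(6000, 20000, 0.85), 1))  # 419.6
```

Note that simply counting positives (6000/20000 → 0.3 copies per partition) would underestimate the load; the Poisson correction accounts for partitions holding more than one copy.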

  13. Quantification of Impact of Orbital Drift on Inter-Annual Trends in AVHRR NDVI Data

    Directory of Open Access Journals (Sweden)

    Jyoteshwar R. Nagol

    2014-07-01

    Full Text Available The Normalized Difference Vegetation Index (NDVI) time-series data derived from the Advanced Very High Resolution Radiometer (AVHRR) have been extensively used for studying inter-annual dynamics of global and regional vegetation. However, there can be significant uncertainties in the data due to incomplete atmospheric correction and orbital drift of the satellites through their active life. Access to location-specific quantification of uncertainty is crucial for appropriate evaluation of trends and anomalies. This paper provides per-pixel quantification of orbital-drift-related spurious trends in the Long Term Data Record (LTDR) AVHRR NDVI data product. The magnitude and direction of the spurious trends were estimated by direct comparison with data from the MODerate resolution Imaging Spectroradiometer (MODIS) Aqua instrument, which has stable inter-annual sun-sensor geometry. The maps show the presence of both positive and negative spurious trends in the data. After application of the BRDF correction, an overall decrease in positive trends and an increase in the number of pixels with negative spurious trends were observed. The mean global spurious inter-annual NDVI trend before and after BRDF correction was 0.0016 and −0.0017, respectively. The research presented in this paper gives valuable insight into the magnitude of orbital-drift-related trends in the AVHRR NDVI data, as well as the degree to which they are rectified by the MODIS BRDF correction algorithm used in the LTDR processing stream.
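The spurious-trend estimation described above (AVHRR trend minus the trend from the stable-geometry MODIS record) can be sketched for a single pixel; the NDVI values below are invented, and `np.polyfit` stands in for whatever trend estimator the study actually used:

```python
import numpy as np

def annual_trend(ndvi, years):
    """Least-squares linear trend (NDVI units per year) for one pixel."""
    slope, _intercept = np.polyfit(years, ndvi, 1)
    return slope

years = np.arange(2003, 2011)
avhrr = np.array([0.52, 0.53, 0.55, 0.56, 0.58, 0.59, 0.61, 0.62])  # drifting sensor
modis = np.array([0.52, 0.52, 0.53, 0.52, 0.53, 0.53, 0.52, 0.53])  # stable reference

# Spurious trend = AVHRR trend minus the MODIS (stable-geometry) trend
spurious = annual_trend(avhrr, years) - annual_trend(modis, years)
print(round(spurious, 4))
```

Applied per pixel over the full grid, this difference yields maps of positive and negative spurious trends of the kind the paper reports.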

  14. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Full Text Available Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP) but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely in the context of a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed, and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience (often personal) of the researcher, and this scholarly endeavour is therefore not inclusive of all possible identified indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourism industry.

  15. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR.

    Science.gov (United States)

    Daems, Devin; Peeters, Bernd; Delport, Filip; Remans, Tony; Lammertyn, Jeroen; Spasic, Dragana

    2017-07-31

    Abstract: Accurate identification and quantification of allergens is key in healthcare, biotechnology and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the gold standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM-0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  16. Impact of the recorded variable on recurrence quantification analysis of flows

    International Nuclear Information System (INIS)

    Portes, Leonardo L.; Benda, Rodolfo N.; Ugrinowitsch, Herbert; Aguirre, Luis A.

    2014-01-01

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA

  17. Impact of the recorded variable on recurrence quantification analysis of flows

    Energy Technology Data Exchange (ETDEWEB)

    Portes, Leonardo L., E-mail: ll.portes@gmail.com [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Benda, Rodolfo N.; Ugrinowitsch, Herbert [Escola de Educação Física, Fisioterapia e Terapia Ocupacional, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil); Aguirre, Luis A. [Departamento de Engenharia Eletrônica, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-901 Belo Horizonte MG (Brazil)

    2014-06-27

    Recurrence quantification analysis (RQA) is useful in analyzing dynamical systems from a time series s(t). This paper investigates the robustness of RQA in detecting different dynamical regimes with respect to the recorded variable s(t). RQA was applied to time series x(t), y(t) and z(t) of a drifting Rössler system, which are known to have different observability properties. It was found that some characteristics estimated via RQA are heavily influenced by the choice of s(t) in the case of flows but not in the case of maps. - Highlights: • We investigate the influence of the recorded time series on the RQA coefficients. • The time series {x}, {y} and {z} of a drifting Rössler system were recorded. • RQA coefficients were affected in different degrees by the chosen time series. • RQA coefficients were not affected when computed with the Poincaré section. • In real world experiments, observability analysis should be performed prior to RQA.

  18. Risques et impacts environnementaux des retenues d’altitude pour la production de neige de culture dans un contexte de changement climatique

    Directory of Open Access Journals (Sweden)

    Stéphanie Gaucherand

    2011-10-01

    Full Text Available Mountain reservoirs are hydraulic structures built in recreational mountain resorts to provide a water reserve, dedicated mainly to the production of artificial snow. Their implantation at high altitude makes them highly specific reservoirs, subjected to and inducing risks and impacts on their human and ecological environment. Cemagref has launched a research project on the safety of mountain reservoirs. The present article stems from that work and aims to establish the current state of the risks related to mountain reservoirs and their impacts on the environment. It places the development of mountain reservoirs in their societal, social and environmental contexts. It then develops the risks and impacts of mountain reservoirs, focusing on the specific risks and hazards to which these structures are exposed and on the environmental impacts related to their construction and management.

  19. Impact assessment revisited

    DEFF Research Database (Denmark)

    Thiele, Jan; Kollmann, Johannes Christian; Markussen, Bo

    2010-01-01

    The theoretical underpinnings of the assessment of invasive alien species impacts need to be improved. At present most approaches are unreliable for quantifying impact at regional scales and do not allow for comparison of different invasive species. There are four basic problems that need ...; and (4) the total invaded range is an inappropriate measure for quantifying regional impact because the habitat area available for invasion can vary markedly among invasive species. Mathematical models and empirical data using an invasive alien plant species (Heracleum mantegazzianum) indicate ... and we discuss the quantification of the invaded range. These improvements are crucial for impact assessment with the overall aim of prioritizing management of invasive species.

  20. Environmental impact quantification and correlation between site ...

    African Journals Online (AJOL)

    The aim of this work was to quantify the most significant impacts of the polluted environment and to examine the correlation between pollution indicators and the content and structures of Tanacetum vulgare L. (tansy). Heavy metals such as mercury, lead, cadmium, chromium and nickel are considered pollution indicators.

  1. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1) H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1) H-MRS metabolite quantification... investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1) H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results...

  2. Quantification of environmental impacts of various energy technologies. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Selfors, A [ed.

    1994-10-01

    This report discusses problems related to the economic assessment of environmental impacts and abatement measures in connection with energy projects. Attention is called to the necessity of assessing environmental impacts both as reduced economic welfare and as the cost of abatement measures to reduce the impact. In recent years, several methods for valuing environmental impacts have been developed, but the project shows that few empirical studies have been carried out. The final report indicates that some important factors are very difficult to evaluate. In addition, the environmental impacts of energy development in Norway vary considerably from project to project. This makes it difficult to obtain a good basis for comparing environmental impacts caused by different technologies, for instance hydroelectric power versus gas power or wind versus hydroelectric power. It might be feasible, however, to carry out more detailed economic assessments of environmental impacts of specific projects. 33 refs., 1 fig., 4 tabs.

  3. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  4. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
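The band-selection step described (correlation analysis between reflectance and THC content, followed by stepwise regression) can be illustrated with synthetic data; the reflectance values and the injected 695 nm response below are invented, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: reflectance of 40 leaf samples at a few candidate bands (nm)
bands = np.array([550, 650, 695, 750, 800])
thc = rng.uniform(0.2, 12.0, size=40)            # % THC content, invented
reflect = rng.normal(0.3, 0.05, size=(40, 5))    # baseline reflectance + noise
reflect[:, 2] -= 0.02 * thc                      # make the 695 nm band respond to THC

# Correlate each band with THC; the strongest |r| flags the candidate predictor band
r = np.array([np.corrcoef(reflect[:, i], thc)[0, 1] for i in range(len(bands))])
best = bands[np.argmax(np.abs(r))]
print(best)  # 695
```

A stepwise multivariate regression, as used in the study, would then add further bands only if they significantly improve the fit beyond this first selected band.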

  5. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.

  6. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on Slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis). (author)

  7. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  8. Performance of the Real-Q EBV Quantification Kit for Epstein-Barr Virus DNA Quantification in Whole Blood.

    Science.gov (United States)

    Huh, Hee Jae; Park, Jong Eun; Kim, Ji Youn; Yun, Sun Ae; Lee, Myoung Keun; Lee, Nam Yong; Kim, Jong Won; Ki, Chang Seok

    2017-03-01

    There has been increasing interest in standardized and quantitative Epstein-Barr virus (EBV) DNA testing for the management of EBV disease. We evaluated the performance of the Real-Q EBV Quantification Kit (BioSewoom, Korea) in whole blood (WB). Nucleic acid extraction and real-time PCR were performed by using the MagNA Pure 96 (Roche Diagnostics, Germany) and 7500 Fast real-time PCR system (Applied Biosystems, USA), respectively. Assay sensitivity, linearity, and conversion factor were determined by using the World Health Organization international standard diluted in EBV-negative WB. We used 81 WB clinical specimens to compare performance of the Real-Q EBV Quantification Kit and artus EBV RG PCR Kit (Qiagen, Germany). The limit of detection (LOD) and limit of quantification (LOQ) for the Real-Q kit were 453 and 750 IU/mL, respectively. The conversion factor from EBV genomic copies to IU was 0.62. The linear range of the assay was from 750 to 10⁶ IU/mL. Viral load values measured with the Real-Q assay were on average 0.54 log₁₀ copies/mL higher than those measured with the artus assay. The Real-Q assay offered good analytical performance for EBV DNA quantification in WB.
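The reported conversion factor (0.62 IU per genomic copy) and the usual log₁₀ comparison of viral loads can be applied as follows; the sample load here is invented for illustration:

```python
import math

COPIES_TO_IU = 0.62  # conversion factor reported for the Real-Q assay

def copies_to_iu(copies_per_ml):
    """Convert an EBV load from genomic copies/mL to IU/mL."""
    return copies_per_ml * COPIES_TO_IU

def log10_load(value_per_ml):
    """Viral loads are conventionally compared on a log10 scale."""
    return math.log10(value_per_ml)

copies = 50_000  # hypothetical EBV load in copies/mL
print(round(copies_to_iu(copies)))       # 31000
print(round(log10_load(copies), 2))      # 4.7
```

Working in IU/mL against the WHO international standard is what makes results comparable across assays and laboratories.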

  9. Étude du vieillissement des batteries lithium-ion dans les applications "véhicule électrique" : Combinaison des effets de vieillissement calendaire et de cyclage.

    OpenAIRE

    REDONDO-IGLESIAS, Eduardo

    2017-01-01

    The study of battery ageing is necessary because the degradation of battery characteristics largely determines the cost, performance and environmental impact of electrified vehicles, in particular fully electric vehicles. The methodology chosen for this thesis consists of two distinct stages: characterization and modelling. The first stage relies on accelerated ageing tests of battery cells. Despite their...

  10. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out with the naked eye.
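Distance-based readout ultimately reduces to a calibration curve relating developed color length to concentration; a minimal sketch with invented calibration points (not the paper's data):

```python
import numpy as np

# Hypothetical calibration: color-developed distance (mm) at known K+ levels (mM)
conc_mM = np.array([1.0, 2.5, 5.0])
dist_mm = np.array([4.0, 10.0, 20.0])

# Fit distance = a*conc + b, then invert it to read unknowns from a ruler measurement
a, b = np.polyfit(conc_mM, dist_mm, 1)

def read_k(distance_mm):
    """Map an eye-measured distance back to a K+ concentration (mM)."""
    return (distance_mm - b) / a

print(round(read_k(10.0), 2))  # 2.5
```

This is why no electronics are needed: once the device is calibrated, a printed ruler alongside the channel converts distance to concentration directly.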

  11. Analyse technico-économique et évaluation de l’impact environnemental de la cuisson solaire directe au Maroc

    Directory of Open Access Journals (Sweden)

    Ndiaga MBODJI

    2017-09-01

    Full Text Available The objective of this study is to present a design methodology, carry out an economic analysis and evaluate the environmental impact of direct solar cooking systems in Morocco. To satisfy the energy needs of a 5-person household consuming a 3 kg meal at noon with a cooking time of 2.5 hours, a parabolic concentrator with a diameter of 1.4 m (useful area of 1.6 m²) is required. At the household level, the economic analysis revealed that the payback period of a direct solar cooker compared to butane varies from 4 to 10 years, depending on the rate of public subsidy. Where firewood is used, the payback period varies from 0.6 to 10 years, depending on the stove performance and the firewood price. At the national level, a 50% subsidy of direct solar cookers with a penetration rate of 50% in rural areas requires a budget of 1.61 billion dirhams (1 US$ = 10 dirhams). This investment would allow the government to save 185 million dirhams a year in reduced butane subsidies, corresponding to a payback period of about 8.7 years and a total profit of 1.45 billion dirhams over the cookers' 15-year lifetime. On the ecological side, the area of forest saved would be about 10,000 ha/year, and the amount of CO2 emissions avoided would be 1.08 Mt/year.
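The national-level payback figure quoted above is a simple ratio of the subsidy budget to the annual butane savings; a one-line check using the abstract's numbers (the helper function is illustrative):

```python
def payback_years(investment, annual_saving):
    """Simple (undiscounted) payback period in years."""
    return investment / annual_saving

subsidy_budget = 1.61e9   # dirhams: 50% cooker subsidy at 50% rural penetration
butane_saving = 185e6     # dirhams saved per year in reduced butane subsidies

print(round(payback_years(subsidy_budget, butane_saving), 1))  # 8.7
```

This matches the roughly 8.7-year payback reported; a full appraisal would also discount the cash flows over the 15-year cooker lifetime.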

  12. Uncertainties and quantification of common cause failure rates and probabilities for system analyses

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2005-01-01

    Simultaneous failures of multiple components due to common causes at random times are modelled by constant multiple-failure rates. A procedure is described for quantification of common cause failure (CCF) basic event probabilities for system models using plant-specific and multiple-plant failure-event data. Methodology is presented for estimating CCF-rates from event data contaminated with assessment uncertainties. Generalised impact vectors determine the moments for the rates of individual systems or plants. These moments determine the effective numbers of events and observation times to be input to a Bayesian formalism to obtain plant-specific posterior CCF-rates. The rates are used to determine plant-specific common cause event probabilities for the basic events of explicit fault tree models depending on test intervals, test schedules and repair policies. Three methods are presented to determine these probabilities such that the correct time-average system unavailability can be obtained with single fault tree quantification. Recommended numerical values are given and examples illustrate different aspects of the methodology
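The Bayesian step described (effective event counts and observation times yielding posterior CCF rates) is commonly realized with a conjugate gamma prior for a Poisson event process; this generic sketch is an assumption-laden illustration, not the paper's specific formalism:

```python
def gamma_posterior_rate(prior_a, prior_b, n_events, exposure_time):
    """Posterior mean and variance of a constant failure rate.

    Under a Gamma(a, b) prior and a Poisson likelihood with n events
    observed over time T, the posterior is Gamma(a + n, b + T).
    """
    a = prior_a + n_events
    b = prior_b + exposure_time
    return a / b, a / b**2

# Weakly informative Jeffreys-type prior, 2 CCF events in 1e5 component-hours
mean, var = gamma_posterior_rate(0.5, 0.0, 2, 1e5)
print(mean)  # 2.5e-05
```

In the methodology above, `n_events` and `exposure_time` would be the *effective* counts derived from the generalized impact vectors rather than raw plant data.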

  13. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR

    Directory of Open Access Journals (Sweden)

    Devin Daems

    2017-07-01

    Full Text Available Accurate identification and quantification of allergens is key in healthcare, biotechnology and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the gold standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM–0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R2 = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  14. Large differences in land use emission quantifications implied by definition discrepancies

    Science.gov (United States)

    Stocker, B. D.; Joos, F.

    2015-03-01

    The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling) but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review the conceptual differences of eLUC quantification methods and apply an Earth System Model to demonstrate that what is claimed to represent total eLUC differs by up to ~20% when quantified from ESM vs. offline vegetation models. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either, nor their sum. Therefore, we argue that synthesis studies and global carbon budget accountings should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.
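
The size of the definitional discrepancy can be illustrated with toy bookkeeping; all flux values below are invented placeholders, and only the structure (primary emissions plus secondary terms) follows the text:

```python
# Toy decomposition of the eLUC accounting discrepancy discussed above.
# A coupled (ESM-style) estimate picks up secondary terms on top of the
# primary bookkeeping flux quantified under constant environment.

bookkeeping_primary = 1.10   # Pg C/yr: conversion under fixed environment
replaced_sink = 0.15         # Pg C/yr: sink the cleared vegetation forgoes
land_use_feedback = 0.05     # Pg C/yr: flux change from altered CO2/climate

esm_total = bookkeeping_primary + replaced_sink + land_use_feedback
discrepancy = (esm_total - bookkeeping_primary) / bookkeeping_primary
# with these invented numbers the two "total eLUC" figures differ by ~18%
```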

  15. Critical points of DNA quantification by real-time PCR--effects of DNA extraction method and sample matrix on quantification of genetically modified organisms.

    Science.gov (United States)

    Cankar, Katarina; Stebih, Dejan; Dreo, Tanja; Zel, Jana; Gruden, Kristina

    2006-08-14

    Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to
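
A minimal sketch of the PCR-efficiency calculation the abstract treats as the crucial reliability parameter, using the standard dilution-series estimate E = 10^(−1/slope) − 1; the Ct values below are invented to show a near-ideal 10-fold series:

```python
# PCR efficiency from a dilution-series standard curve. A slope of about
# -3.32 cycles per decade corresponds to ~100% efficiency.

def lsq_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

log10_copies = [5.0, 4.0, 3.0, 2.0, 1.0]           # serial 10-fold dilution
ct_values = [15.00, 18.32, 21.64, 24.96, 28.28]    # invented Ct readings

slope = lsq_slope(log10_copies, ct_values)
efficiency = 10.0 ** (-1.0 / slope) - 1.0          # 1.0 means 100% efficient
```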

  16. La quantification en Kabiye: une approche linguistique | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  17. MRI measurements of water diffusion: impact of region of interest selection on ischemic quantification

    International Nuclear Information System (INIS)

    Ozsunar, Yelda; Koseoglu, Kutsi; Huisman, Thierry A.G.M.; Koroshetz, Walter; Sorensen, A. Gregory

    2004-01-01

    Objective: To investigate the effect of ADC heterogeneity on region of interest (ROI) measurement of isotropic and anisotropic water diffusion in acute (<12 h) cerebral infarctions. Methods and materials: Full diffusion tensor images were retrospectively analyzed in 32 patients with acute cerebral infarction. Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) values were measured in ischemic lesions and in the corresponding contralateral, normal-appearing brain by using four ROIs for each patient. The 2×2-pixel square ROIs were placed in the center, the lateral rim and the medial rim of the infarction. In addition, the whole volume of the infarction was measured using a free-hand method. Each ROI value obtained from the ischemic lesion was normalized using the contralateral normal ROI value. Results: The localization of the ROIs in relation to the ischemic lesion significantly affected ADC measurement (P<0.01, Friedman test), but not FA measurement (P=0.25). Significant differences were found between ADC values of the center of the infarction versus the whole volume (P<0.01), and of the medial rim versus the whole volume (P<0.001), with variation of relative ADC values of up to 11%. The differences in absolute ADC for these groups were 22 and 23%, respectively. The lowest ADC was found in the center, followed by the medial rim, the lateral rim and the whole volume of the infarction. Conclusion: ADC quantification may give variable results depending on the ROI method. ADC and FA values obtained from the center of an infarction tend to be lower than at the periphery. Researchers comparing studies or working on ischemic quantification should be aware of these differences and their effects.
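
The ROI normalisation described in the methods can be sketched as follows; the ADC values are illustrative (in units of 10⁻³ mm²/s), with the 2×2-pixel ROI reduced to its mean before taking the lesion-to-contralateral ratio:

```python
# Relative ADC: mean of a small lesion ROI divided by the mean of the
# corresponding contralateral normal-appearing ROI.

def roi_mean(pixels):
    """Mean value over the pixels of one ROI."""
    return sum(pixels) / len(pixels)

lesion_roi = [0.42, 0.45, 0.47, 0.46]          # centre of the infarct
contralateral_roi = [0.78, 0.82, 0.80, 0.80]   # normal-appearing brain

relative_adc = roi_mean(lesion_roi) / roi_mean(contralateral_roi)
```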

  18. Energy systems. Tome 3: advanced cycles, low environmental impact innovative systems; Systeme energetiques, TOME 3: cycles avances, systemes innovants a faible impact environnemental

    Energy Technology Data Exchange (ETDEWEB)

    Gicquel, R

    2009-07-01

    This third tome about energy systems completes the two previous ones by presenting advanced thermodynamic cycles, in particular those with a low environmental impact, and by addressing two further questions linked to the study of systems operating under changing regimes: the time management of energy, with the use of thermal and pneumatic storage systems and the time (e.g. hourly) simulation of systems (solar-energy systems in particular); and technological dimensioning and off-design (non-nominal regime) operation studies. Because this last topic is particularly complex, new functionalities have been implemented, mainly through the external-classes mechanism, which allows users to freely personalize their models. This tome is illustrated with about 50 examples of cycles modelled with the Thermoptim software. Content: foreword; 1 - generic external classes; 2 - advanced gas turbine cycles; 3 - evaporation-concentration, mechanical steam compression, desalination, hot gas drying; 4 - cryogenic cycles; 5 - electrochemical converters; 6 - global warming, CO{sub 2} capture and sequestration; 7 - future nuclear reactors (coupled to Hirn and Brayton cycles); 8 - thermodynamic solar cycles; 10 - pneumatic and thermal storage; 11 - calculation of thermodynamic solar facilities; 12 - problem of technological dimensioning and non-nominal regime; 13 - exchanger modeling and parameterizing for dimensioning and the non-nominal regime; 14 - modeling and parameterizing of volumetric compressors; 15 - modeling and parameterizing of turbo-compressors and turbines; 16 - identification methodology of component parameters; 17 - case studies. (J.S.)

  19. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions, as demonstrated by computed tomography, for 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices, composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and types of aphasia were investigated on the slices numbered 3, 4, 5, and 6 using a quantification theory, Type 3 (pattern analysis). Some types of regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st component and the 2nd component of the quantification theory, Type 3. On the other hand, the group with global aphasia existed between the group with Broca's aphasia and that with Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia existed near those with Wernicke's aphasia. The above results serve to establish the quantification theory, Type 2 (discrimination analysis) and the quantification theory, Type 1 (regression analysis).

  20. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
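
A hedged sketch of the assay's two readouts described above, copy number from an externally standardised kinetic (Ct) analysis and sex from amelogenin melting-curve peaks; the slope, intercept and melting-temperature threshold are invented placeholders, not values from the paper:

```python
# Two illustrative readouts: quantification via an external standard
# curve, and sex calling from melting-peak temperatures.

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Copy number implied by a threshold cycle on an external standard
    curve (slope/intercept are hypothetical calibration values)."""
    return 10.0 ** ((ct - intercept) / slope)

def sex_from_melting_peaks(tm_peaks, y_tm=78.5, tol=0.3):
    """Call 'male' if any peak matches the (hypothetical) Y-specific
    amelogenin melting temperature within tolerance."""
    male = any(abs(t - y_tm) <= tol for t in tm_peaks)
    return "male" if male else "female"
```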

  1. Real-time PCR quantification of arbuscular mycorrhizal fungi: does the use of nuclear or mitochondrial markers make a difference?

    Czech Academy of Sciences Publication Activity Database

    Voříšková, Alena; Jansa, J.; Püschel, David; Krüger, Manuela; Cajthaml, T.; Vosátka, Miroslav; Janoušková, Martina

    2017-01-01

    Roč. 27, č. 6 (2017), s. 577-585 ISSN 0940-6360 R&D Projects: GA ČR GA15-05466S Institutional support: RVO:67985939 Keywords : real-time PCR * quantification * arbuscular mycorrhizal fungi Subject RIV: EF - Botanics OBOR OECD: Plant sciences, botany Impact factor: 3.047, year: 2016

  2. EVALUATION DE L’IMPACT ENVIRONNEMENTAL : Evaluation des impacts du flux de transgènes de tolérance à différents herbicides à large spectre

    Directory of Open Access Journals (Sweden)

    Astoin Marie-Florence

    2000-07-01

    Full Text Available This text is taken from the report « Introduction de variétés génétiquement modifiées de colza tolérantes à différents herbicides : évaluation des impacts agro-environnementaux et propositions de scénarios de gestion » (Introduction of genetically modified oilseed rape varieties tolerant to different herbicides: assessment of the agro-environmental impacts and proposed management scenarios), drawn up by Cetiom in the framework of the moratorium on genetically modified oilseed rape varieties. The authors reserve the right to make changes to this text before the final publication of the report.

  3. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    Science.gov (United States)

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as in practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which have had a long and significant impact on studies of biomolecules over the past decades. For instance, immunoassays were developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging-based assays such as the enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassays. Despite great success, these strategies suffer from drawbacks such as the limited spectral window capacity for multiplex detection and the inability to provide absolute quantification of biomolecules. Recalling this sequence of tagging strategies, an obvious question arises: why not use stable isotopes from the start? A reasonable explanation is the lack, at that time, of reliable means for accurate and precise quantification of stable isotopes. The situation has changed greatly at present, since several atomic mass spectrometric techniques for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and, more importantly, its multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification.
An exciting research highlight in this area

  4. Real-time PCR for the quantification of fungi in planta.

    Science.gov (United States)

    Klosterman, Steven J

    2012-01-01

    Methods enabling quantification of fungi in planta can be useful for a variety of applications. In combination with information on plant disease severity, indirect quantification of fungi in planta offers an additional tool in the screening of plants that are resistant to fungal diseases. In this chapter, a method is described for the quantification of DNA from a fungus in plant leaves using real-time PCR (qPCR). Although the method described entails quantification of the fungus Verticillium dahliae in lettuce leaves, the methodology described would be useful for other pathosystems as well. The method utilizes primers that are specific for amplification of a β-tubulin sequence from V. dahliae and a lettuce actin gene sequence as a reference for normalization. This approach enabled quantification of V. dahliae in the amount of 2.5 fg/ng of lettuce leaf DNA at 21 days following plant inoculation.
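
Normalisation against the lettuce actin reference might look like the following, a 2^−ΔCt sketch assuming roughly 100% amplification efficiency for both targets; the study's own calibration against known DNA amounts may differ:

```python
# Reference-gene normalisation: the fungal beta-tubulin signal is
# expressed relative to the plant actin signal via the 2^-(dCt) rule.

def normalized_fungal_signal(ct_target, ct_reference):
    """Relative fungal DNA level by the 2^-(dCt) rule."""
    return 2.0 ** -(ct_target - ct_reference)

# target crossing threshold 3 cycles later than the reference implies
# an 8-fold lower relative amount:
ratio = normalized_fungal_signal(28.0, 25.0)  # 0.125
```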

  5. Real-time PCR quantification of arbuscular mycorrhizal fungi: does the use of nuclear or mitochondrial markers make a difference?

    Czech Academy of Sciences Publication Activity Database

    Voříšková, A.; Jansa, J.; Püschel, D.; Krüger, Manuela; Cajthaml, T.; Vosátka, M.; Janoušková, M.

    2017-01-01

    Roč. 27, č. 6 (2017), s. 577-585 ISSN 0940-6360 Institutional support: RVO:61389030 Keywords : Arbuscular mycorrhizal fungi * Isolate discrimination * Microsymbiont screening * Mitochondrial DNA * Molecular genetic quantification * Nuclear ribosomal DNA * plfa * Real-time PCR Subject RIV: EA - Cell Biology OBOR OECD: Cell biology Impact factor: 3.047, year: 2016

  6. QUANTIFICATION OF GENETICALLY MODIFIED MAIZE MON 810 IN PROCESSED FOODS

    Directory of Open Access Journals (Sweden)

    Peter Siekel

    2012-12-01

    Full Text Available Maize MON 810 (Zea mays L.) represents the majority of genetically modified food crops. It is the only transgenic cultivar grown in the EU (European Union) countries, and food products with a content higher than 0.9 % must be labelled. This study was aimed at the impact of food processing (temperature, pH and pressure) on DNA degradation and at the quantification of the genetically modified maize MON 810. The transgenic DNA was quantified by the real-time polymerase chain reaction method. Processing at high temperature (121 °C), elevated pressure (0.1 MPa) and low pH (2.25) fragmented the DNA. The two-order-of-magnitude difference between the species-specific gene content and the transgenic DNA content of the plant materials used led to false negative results in the quantification of transgenic DNA. Maize containing 4.2 % of the transgene appeared after processing to contain as little as 3.0 % (100 °C) and 1.9 % (121 °C, 0.1 MPa). A 2.1 % transgene content dropped to 1.0 % at 100 °C and to 0.6 % at 121 °C, 0.1 MPa. Under these conditions the apparent transgenic content thus showed a two- to three-fold greater decrease, a consequence of the unequal gene abundance: the disparity appears as a considerable decrease of the transgenic content while the decrease of the species-specific gene content goes unnoticed. Based on our findings we conclude that a high degree of processing may lead to false negative results in the quantification of the transgenic constituent. Determination of GMO content in processed foods may therefore lead to incorrect statements, and labelling in these cases could mislead consumers. doi:10.5219/212
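
The false-negative effect reported above is purely arithmetic: the GMO percentage is a ratio of transgene to species-gene copies, so unequal degradation of the two targets shifts the apparent content. The degradation fractions below are invented to reproduce numbers of the same order as the abstract's:

```python
# GMO content as a copy-number ratio, and how unequal target degradation
# distorts it. Copy numbers and degradation fractions are illustrative.

def gmo_percent(transgene_copies, species_gene_copies):
    """Apparent GMO content as a percentage of species-gene copies."""
    return 100.0 * transgene_copies / species_gene_copies

before = gmo_percent(4200, 100000)               # 4.2 % before processing
# suppose heat degrades the transgene target more than the reference:
after = gmo_percent(4200 * 0.30, 100000 * 0.65)  # apparent content drops
```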

  7. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

    In clinical practice, PET/CT images are often analyzed qualitatively, by visual comparison of tumor lesion and normal tissue uptake, and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed on different scanner models and from different manufacturers. For this purpose, a survey of the technical characteristics of the equipment and of the acquisition protocols for clinical images was conducted across different PET/CT services in the state of Rio Grande do Sul. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant reconstruction parameters available. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the most appropriate reconstruction parameters to harmonize SUV quantification in each scanner, either regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization, through modification of the reconstruction parameters and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies.
Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
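
For reference, the body-weight-normalised SUV underlying the study can be computed as tissue activity concentration divided by injected dose per unit body mass (1 mL of tissue taken as 1 g; the values below are illustrative):

```python
# Body-weight-normalised SUV: kBq/mL of tissue relative to the injected
# dose spread over the patient's mass.

def suv_bw(activity_kbq_per_ml, injected_dose_mbq, weight_kg):
    """SUV with body-weight normalisation (1 mL of tissue ~ 1 g)."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = weight_kg * 1000.0
    return activity_kbq_per_ml / (dose_kbq / weight_g)

# e.g. a lesion at 10 kBq/mL, 370 MBq injected, 70 kg patient:
suv = suv_bw(activity_kbq_per_ml=10.0, injected_dose_mbq=370.0, weight_kg=70.0)
```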

  8. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role in the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. By considering the fuzziness and uncertainty of trust, in this paper, we propose a fuzzy comprehensive evaluation method to quantify trust, along with a trust quantification algorithm. Simulation results show that the trust quantification algorithm that we propose can effectively quantify trust and that the quantified value of an entity's trust is consistent with the behavior of the entity.
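
A minimal fuzzy comprehensive evaluation in the spirit of the method above: factor weights are combined with a membership matrix, and the result is defuzzified against grade scores. All weights, memberships and scores below are invented:

```python
# Fuzzy comprehensive evaluation sketch: weighted membership per grade,
# then a weighted-average defuzzification to a single trust value.

weights = [0.5, 0.3, 0.2]          # importance of 3 trust factors
membership = [                     # rows: factors; cols: grades
    [0.6, 0.3, 0.1],               # grades: high / medium / low
    [0.4, 0.4, 0.2],
    [0.7, 0.2, 0.1],
]
grade_scores = [1.0, 0.6, 0.2]     # numeric value of each grade

# weighted membership in each grade (weighted-sum operator):
b = [sum(w * row[j] for w, row in zip(weights, membership))
     for j in range(len(grade_scores))]
trust = sum(bj * s for bj, s in zip(b, grade_scores))
```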

  9. GHG emission quantification for pavement construction projects using a process-based approach

    Directory of Open Access Journals (Sweden)

    Charinee Limsawasd

    2017-03-01

    Full Text Available Climate change and greenhouse gas (GHG) emissions have attracted much attention for their impacts on the global environment. New legislation and regulations to control GHG emissions from the industrial sectors have been introduced to address this problem. The transportation industries, which include the operation of road pavements and of pavement construction equipment, are among the highest GHG-emitting sectors. This study presents a novel quantification model of the GHG emissions of pavement construction using process-based analysis. The model is composed of five modules that evaluate GHG emissions from: (1) material production and acquisition, (2) material transport to a project site, (3) heavy equipment use, (4) on-site machinery use, and (5) on-site electricity use. The model was applied to a hypothetical pavement project to compare the environmental impacts of flexible and rigid pavement types during construction. The resulting model can be used for the evaluation of environmental impacts, as well as for the design and planning of highway pavement construction.
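
The five modules lend themselves to a simple activity-data × emission-factor tally; the sketch below only mirrors that structure, and all quantities and factors are placeholders, not values from the study:

```python
# Process-based GHG tally: each module contributes an activity quantity
# multiplied by an emission factor. All numbers are illustrative.

modules = {
    "material_production": (1200.0, 0.90),     # (tonnes, t CO2e / tonne)
    "material_transport":  (50000.0, 0.0001),  # (tonne-km, t CO2e / t-km)
    "heavy_equipment":     (800.0, 0.0026),    # (litres diesel, t CO2e / L)
    "onsite_machinery":    (300.0, 0.0026),    # (litres diesel, t CO2e / L)
    "onsite_electricity":  (20000.0, 0.0005),  # (kWh, t CO2e / kWh)
}

total_t_co2e = sum(qty * factor for qty, factor in modules.values())
```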

  10. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification which is traditionally performed by hepatic biopsy. There are many studies that have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not been reached yet regarding the technique or the possibility to reproduce the same method of calculus in different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  11. Iron overload in the liver diagnostic and quantification

    International Nuclear Information System (INIS)

    Alustiza, Jose M.; Castiella, Agustin; Juan, Maria D. de; Emparanza, Jose I.; Artetxe, Jose; Uranga, Maite

    2007-01-01

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification which is traditionally performed by hepatic biopsy. There are many studies that have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not been reached yet regarding the technique or the possibility to reproduce the same method of calculus in different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification

  12. Selective classification and quantification model of C&D waste from material resources consumed in residential building construction.

    Science.gov (United States)

    Mercader-Moyano, Pilar; Ramírez-de-Arellano-Agudo, Antonio

    2013-05-01

    The unfortunate economic situation involving Spain and the European Union is, among other factors, the result of intensive construction activity over recent years. The excessive consumption of natural resources, together with the impact caused by the uncontrolled dumping of untreated C&D waste in illegal landfills, has caused environmental pollution and a deterioration of the landscape. The objective of this research was to generate a selective classification and quantification model of C&D waste based on the material resources consumed in the construction of residential buildings, either new or renovated, namely the Conventional Constructive Model (CCM). A practical example carried out on ten residential buildings in Seville, Spain, enabled the identification and quantification of the C&D waste generated in their construction and of its origin, in terms of the building material from which it originated and its impact per m² constructed. This model enables other researchers to compare the various improvements proposed for minimizing the environmental impact of building a CCM, new corrective measures to be proposed in future policies that regulate the production and management of C&D waste generated in construction from the design stage to the completion of the construction process, and sustainable management to be established both for C&D waste and for the selection of materials for the construction of projected or renovated buildings.

  13. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  14. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

    Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. The quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragment analysis. Differences between groups were tested using a paired t-test. Methods that measure the total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragment analysis, which also allows quantification of DNA of the desired fragment length and is not affected by PCR inhibitors. Methods based on quantification of the total amount of DNA in samples are unsuitable for ancient samples, as they overestimate the amount of DNA, presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also the fragment length.
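
The paired t-test used above to compare quantification methods on the same extracts can be sketched in pure Python (the data below are invented):

```python
# Paired t-statistic: each sample is measured by two methods, and the
# test is run on the per-sample differences.

import math

def paired_t(a, b):
    """t-statistic for paired observations a[i], b[i]."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# e.g. concentrations (ng/uL) of 4 extracts by two hypothetical methods:
t_stat = paired_t([1.0, 2.0, 3.0, 4.0], [0.0, 1.0, 1.0, 2.0])
```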

  15. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Directory of Open Access Journals (Sweden)

    Žel Jana

    2006-08-01

    Full Text Available Abstract Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was
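    The PCR efficiency flagged above as the crucial parameter is conventionally estimated from the slope of the standard curve (quantification cycle Cq versus log10 of input DNA). A minimal sketch of that calculation, with illustrative values rather than the study's data:

```python
def pcr_efficiency(log10_inputs, cq_values):
    """Estimate PCR amplification efficiency from a standard curve.

    Fits Cq = slope * log10(input) + intercept by least squares;
    efficiency E = 10**(-1/slope) - 1, where E = 1.0 means perfect
    doubling every cycle (slope of about -3.32).
    """
    n = len(log10_inputs)
    mean_x = sum(log10_inputs) / n
    mean_y = sum(cq_values) / n
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(log10_inputs, cq_values))
    sxx = sum((x - mean_x) ** 2 for x in log10_inputs)
    slope = sxy / sxx
    return 10 ** (-1.0 / slope) - 1.0

# Ten-fold dilution series (log10 pg of input DNA) with illustrative Cq values
eff = pcr_efficiency([4.0, 3.0, 2.0, 1.0], [15.0, 18.32, 21.64, 24.97])
```

    Comparing the efficiency obtained on a sample matrix with that of the standard reference material is the comparability check the abstract calls for.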

  16. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    Science.gov (United States)

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary

  17. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.

  18. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
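    Digital PCR, mentioned above as one of the newer statistical strategies, quantifies targets without a standard curve by Poisson-correcting the fraction of positive partitions. A minimal sketch, assuming independent random partitioning; the 0.85 nL droplet volume and the counts are illustrative:

```python
import math

def dpcr_copies_per_ul(positive, total, droplet_volume_ul=0.00085):
    """Concentration estimate from digital-PCR partition counts.

    Assumes template molecules distribute randomly (Poisson) over
    partitions: mean copies per partition lam = -ln(1 - p), where p is
    the positive-partition fraction; concentration = lam / volume.
    """
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / droplet_volume_ul

# 5000 positive droplets out of 20000 (illustrative counts)
conc = dpcr_copies_per_ul(5000, 20000)
```

    The Poisson correction is what removes the dependence on amplification efficiency that complicates qPCR-based GMO quantification.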

  19. The Quantification Process for the PRiME-U34i

    International Nuclear Information System (INIS)

    Hwang, Mee-Jeong; Han, Sang-Hoon; Yang, Joon-Eon

    2006-01-01

    In this paper, we introduce the quantification process for the PRiME-U34i, which is the merged model of ETs (Event Trees) and FTs (Fault Trees) for the level 1 internal PSA of UCN 3 and 4. PRiME-U34i has one top event, so the quantification process is simplified compared with the previous one. In the past, we used a text file called a user file to control the quantification process. However, this user file is so complicated that it is difficult for a non-expert to understand. Moreover, in the past PSA the ET and FT were separated, but in PRiME-U34i they are merged together, so the quantification process is different. This paper is composed of five sections. In Section 2, we introduce the construction of the one-top model. Section 3 shows the quantification process used in the PRiME-U34i. Section 4 describes the post-processing. The last section presents the conclusions.

  20. A qualitative method proposal to improve environmental impact assessment

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Duarte, Oscar; Zamorano, Montserrat

    2013-01-01

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts that should be mitigated with corrective measures are eliminated. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate impact importance in the assessment. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment, and the project has been shown.

  1. A qualitative method proposal to improve environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Toro, Javier, E-mail: jjtoroca@unal.edu.co [Institute of Environmental Studies, National University of Colombia at Bogotá (Colombia); Requena, Ignacio, E-mail: requena@decsai.ugr.es [Department of Computer Science and Artificial Intelligence, University of Granada (Spain); Duarte, Oscar, E-mail: ogduartev@unal.edu.co [National University of Colombia at Bogotá, Department of Electrical Engineering and Electronics (Colombia); Zamorano, Montserrat, E-mail: zamorano@ugr.es [Department of Civil Engineering, University of Granada (Spain)

    2013-11-15

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts that should be mitigated with corrective measures are eliminated. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate impact importance in the assessment. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment, and the project has been shown.

  2. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes distortion of the magnetic field locally, resulting in signal voids and pile ups, i.e. susceptibility artifacts in MRI. Quantitative and unbiased measurement of the artifact is prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. Then the artifact is quantified in terms of its extent by an automated cross entropy thresholding method as image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 in case of titanium and stainless steel implants). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
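    The gradient-based pipeline described above can be sketched as follows; for brevity, a fixed threshold stands in for the paper's automated cross-entropy thresholding, and the image is synthetic:

```python
import numpy as np

def artifact_area_percentage(image, threshold):
    """Susceptibility-artifact extent as a percentage of image area.

    Captures abrupt signal alterations via the gradient magnitude,
    thresholds it, and reports the flagged area fraction.
    """
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    return 100.0 * (magnitude > threshold).mean()

# Synthetic 64x64 slice with a sharp signal void in one quadrant
img = np.ones((64, 64))
img[:32, :32] = 0.0
pct = artifact_area_percentage(img, threshold=0.25)
```

    On this synthetic image only the sharp void boundary is flagged, which is the behaviour the gradient step is meant to isolate.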

  3. Quantification of Sediment Transport During Glacier Surges and its Impact on Landform Architecture

    DEFF Research Database (Denmark)

    Kjær, Kurt H.; Schomacker, Anders; Korsgaard, Niels Jákup

    ) for 1945, prior to the last surge in 1964, and for 2003 in order to assess the effect of the surge on the sediment architecture in the forefield. The pre- and post-surge DEMs allow direct quantification of the sediment volumes that were re-distributed in the forefield by the surging ice mass in 1964...... or glaciofluvial outwash fans. Mapping of the sediment thickness in the glacier forefield shows higher accumulation along ice marginal positions related to wedge formation during extremely rapid ice flow. Fast flow was sustained by overpressurized water causing sediment-bedrock decoupling beneath a thick sediment...... architecture occurs distal to the 1810 ice margin, where the 1890 surge advanced over hitherto undeformed sediments. Proximal to the 1810 ice margin, the landscape has been transgressed by either one or two glaciers (in 1890 and 1964). The most complex landscape architecture is found proximal to the 1964 ice...

  4. The start of a new life for old computers | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    25 Jan. 2011 ... An IDRC-funded project that provides computers to schools while creating jobs is tackling the environmental problem caused by electronic waste in Latin America and the Caribbean.

  5. Quantification of trace-level DNA by real-time whole genome amplification.

    Science.gov (United States)

    Kang, Min-Jung; Yu, Hannah; Kim, Sook-Kyung; Park, Sang-Ryoul; Yang, Inchul

    2011-01-01

    Quantification of trace amounts of DNA is a challenge in analytical applications where the concentration of the target DNA is very low or only limited amounts of sample are available for analysis. PCR-based methods, including real-time PCR, are highly sensitive and widely used for quantification of low-level DNA samples. However, ordinary PCR methods require at least one copy of a specific gene sequence for amplification and may not work for sub-genomic amounts of DNA. We suggest a real-time whole genome amplification method adopting degenerate oligonucleotide primed PCR (DOP-PCR) for quantification of sub-genomic amounts of DNA. This approach enabled quantification of sub-picogram amounts of DNA independently of their sequences. When the method was applied to human placental DNA, the amount of which had been accurately determined by inductively coupled plasma-optical emission spectroscopy (ICP-OES), accurate and stable quantification was obtained for DNA samples ranging from 80 fg to 8 ng. In blind tests of laboratory-prepared DNA samples, measurement accuracies of 7.4%, -2.1%, and -13.9% with analytical precisions around 15% were achieved for 400-pg, 4-pg, and 400-fg DNA samples, respectively. A similar quantification capability was also observed for other DNA species from calf, E. coli, and lambda phage. Therefore, when provided with an appropriate standard DNA, the suggested real-time DOP-PCR method can be used as a universal method for quantification of trace amounts of DNA.
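    The quantification step itself rests on inverting a standard curve fitted to a dilution series. A minimal sketch with illustrative Ct values (a slope of -3.3 per decade, i.e. near-perfect efficiency), not the study's data:

```python
def fit_standard_curve(log10_mass, ct):
    """Least-squares fit of Ct = slope * log10(mass) + intercept."""
    n = len(log10_mass)
    mx = sum(log10_mass) / n
    my = sum(ct) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_mass, ct))
             / sum((x - mx) ** 2 for x in log10_mass))
    return slope, my - slope * mx

def quantify_mass(ct_unknown, slope, intercept):
    """Invert the standard curve to estimate input mass from a Ct value."""
    return 10 ** ((ct_unknown - intercept) / slope)

# Standards spanning ~0.4 pg to ~4 ng (log10 pg), illustrative Ct values
slope, intercept = fit_standard_curve([-0.4, 0.6, 1.6, 2.6, 3.6],
                                      [30.0, 26.7, 23.4, 20.1, 16.8])
mass_pg = quantify_mass(25.05, slope, intercept)
```

    With a sequence-independent readout such as DOP-PCR, the same inversion applies regardless of the DNA species, which is what makes the method universal given an appropriate standard.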

  6. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  7. A critical view on microplastic quantification in aquatic organisms

    Energy Technology Data Exchange (ETDEWEB)

    Vandermeersch, Griet, E-mail: griet.vandermeersch@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Van Cauwenberghe, Lisbeth; Janssen, Colin R. [Ghent University, Laboratory of Environmental Toxicology and Aquatic Ecology, Environmental Toxicology Unit (GhEnToxLab), Jozef Plateaustraat 22, 9000 Ghent (Belgium); Marques, Antonio [Division of Aquaculture and Upgrading (DivAV), Portuguese Institute for the Sea and Atmosphere (IPMA), Avenida de Brasília s/n, 1449-006 Lisboa (Portugal); Granby, Kit [Technical University of Denmark, National Food Institute, Mørkhøj Bygade 19, 2860 Søborg (Denmark); Fait, Gabriella [Aeiforia Srl, 29027 Gariga di Podenzano (PC) (Italy); Kotterman, Michiel J.J. [Institute for Marine Resources and Ecosystem Studies (IMARES), Wageningen University and Research Center, Ijmuiden (Netherlands); Diogène, Jorge [Institut de la Recerca i Tecnologia Agroalimentàries (IRTA), Ctra. Poble Nou km 5,5, Sant Carles de la Ràpita E-43540 (Spain); Bekaert, Karen; Robbens, Johan [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium); Devriese, Lisa, E-mail: lisa.devriese@ilvo.vlaanderen.be [Institute for Agricultural and Fisheries Research (ILVO), Animal Sciences Unit – Marine Environment and Quality, Ankerstraat 1, 8400 Oostende (Belgium)

    2015-11-15

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  8. A critical view on microplastic quantification in aquatic organisms

    International Nuclear Information System (INIS)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J.J.; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g⁻¹ w.w. for the Acid mix Method and 0.12±0.04 total microplastics g⁻¹ w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  9. On the complex quantification of risk: systems-based perspective on terrorism.

    Science.gov (United States)

    Haimes, Yacov Y

    2011-08-01

    This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.

  10. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. The development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe, functional, optimally designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
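    The wavenumber analysis described above rests on transforming wavefield data into the wavenumber domain, where trapped waves show up as new spectral peaks. A minimal one-dimensional sketch with a synthetic field; the wavenumbers and sampling step are illustrative, not values from the paper:

```python
import numpy as np

def dominant_wavenumber(snapshot, dx):
    """Dominant spatial wavenumber (rad/m) of a 1-D wavefield snapshot."""
    spectrum = np.abs(np.fft.rfft(snapshot))
    spectrum[0] = 0.0  # ignore the DC component
    k_axis = 2.0 * np.pi * np.fft.rfftfreq(len(snapshot), d=dx)
    return k_axis[np.argmax(spectrum)]

dx = 0.001                      # 1 mm spatial sampling (assumed)
x = np.arange(512) * dx
pristine = np.sin(200.0 * x)    # baseline mode, k = 200 rad/m
trapped = np.sin(500.0 * x)     # higher wavenumber, as for a trapped wave
k_pristine = dominant_wavenumber(pristine, dx)
k_trapped = dominant_wavenumber(trapped, dx)
```

    Sliding this analysis along the propagation path is what lets the wavenumber shift be localised, and hence the delamination extent estimated.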

  11. Cutset Quantification Error Evaluation for Shin-Kori 1 and 2 PSA model

    International Nuclear Information System (INIS)

    Choi, Jong Soo

    2009-01-01

    Probabilistic safety assessments (PSA) for nuclear power plants (NPPs) are based on the minimal cut set (MCS) quantification method. In PSAs, the risk and importance measures are computed from a cutset equation, mainly by using approximations. The conservatism of these approximations is also a source of quantification uncertainty. In this paper, exact MCS quantification methods, based on the 'sum of disjoint products' (SDP) logic and the inclusion-exclusion formula, are applied, and the conservatism of the MCS quantification results in the Shin-Kori 1 and 2 PSA is evaluated.
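    As a toy illustration of the conservatism being evaluated, the following compares the common rare-event approximation and the min-cut-set upper bound (MCUB) against exact inclusion-exclusion for two hypothetical minimal cut sets with independent basic events (the probabilities are illustrative, not from the Shin-Kori model):

```python
from itertools import combinations

def cutset_prob(cutset, p):
    """Probability of a cut set, assuming independent basic events."""
    prob = 1.0
    for event in cutset:
        prob *= p[event]
    return prob

def rare_event(cutsets, p):
    """Rare-event approximation: sum of cut-set probabilities."""
    return sum(cutset_prob(c, p) for c in cutsets)

def mcub(cutsets, p):
    """Min-cut-set upper bound: 1 - prod(1 - P(cut set))."""
    prod = 1.0
    for c in cutsets:
        prod *= 1.0 - cutset_prob(c, p)
    return 1.0 - prod

def inclusion_exclusion(cutsets, p):
    """Exact top-event probability via the inclusion-exclusion formula."""
    total = 0.0
    for k in range(1, len(cutsets) + 1):
        for combo in combinations(cutsets, k):
            union = set().union(*combo)  # basic events in the intersection term
            total += (-1) ** (k + 1) * cutset_prob(union, p)
    return total

p = {"A": 0.1, "B": 0.2, "C": 0.3}      # illustrative basic-event probabilities
cutsets = [{"A", "B"}, {"B", "C"}]      # two hypothetical MCSs sharing event B
exact = inclusion_exclusion(cutsets, p)  # 0.02 + 0.06 - 0.006 = 0.074
upper = mcub(cutsets, p)
approx = rare_event(cutsets, p)
```

    The ordering exact ≤ MCUB ≤ rare-event holds in general for positive-logic models, which is why the approximations are conservative.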

  12. Accurate episomal HIV 2-LTR circles quantification using optimized DNA isolation and droplet digital PCR.

    Science.gov (United States)

    Malatinkova, Eva; Kiselinova, Maja; Bonczkowski, Pawel; Trypsteen, Wim; Messiaen, Peter; Vermeire, Jolien; Verhasselt, Bruno; Vervisch, Karen; Vandekerckhove, Linos; De Spiegelaere, Ward

    2014-01-01

    In HIV-infected patients on combination antiretroviral therapy (cART), the detection of episomal HIV 2-LTR circles is a potential marker for ongoing viral replication. Quantification of 2-LTR circles is based on quantitative PCR or more recently on digital PCR assessment, but is hampered due to its low abundance. Sample pre-PCR processing is a critical step for 2-LTR circles quantification, which has not yet been sufficiently evaluated in patient derived samples. We compared two sample processing procedures to more accurately quantify 2-LTR circles using droplet digital PCR (ddPCR). Episomal HIV 2-LTR circles were either isolated by genomic DNA isolation or by a modified plasmid DNA isolation, to separate the small episomal circular DNA from chromosomal DNA. This was performed in a dilution series of HIV-infected cells and HIV-1 infected patient derived samples (n=59). Samples for the plasmid DNA isolation method were spiked with an internal control plasmid. Genomic DNA isolation enables robust 2-LTR circles quantification. However, in the lower ranges of detection, PCR inhibition caused by high genomic DNA load substantially limits the amount of sample input and this impacts sensitivity and accuracy. Moreover, total genomic DNA isolation resulted in a lower recovery of 2-LTR templates per isolate, further reducing its sensitivity. The modified plasmid DNA isolation with a spiked reference for normalization was more accurate in these low ranges compared to genomic DNA isolation. A linear correlation of both methods was observed in the dilution series (R²=0.974) and in the patient derived samples with 2-LTR numbers above 10 copies per million peripheral blood mononuclear cells (PBMCs), (R²=0.671). Furthermore, Bland-Altman analysis revealed an average agreement between the methods within the 27 samples in which 2-LTR circles were detectable with both methods (bias: 0.3875±1.2657 log10). 2-LTR circles quantification in HIV-infected patients proved to be more
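    The Bland-Altman analysis used above to assess agreement between the two isolation methods reduces to the bias and limits of agreement of the paired differences. A minimal sketch with illustrative paired log10 copy numbers, not the study's data:

```python
import math

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired methods."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative paired log10 2-LTR copy numbers (not the study's data)
genomic = [1.2, 1.8, 2.4, 3.1, 2.0]
plasmid = [1.0, 1.7, 2.2, 2.8, 1.9]
bias, lo, hi = bland_altman(genomic, plasmid)
```

    Working on log-transformed copy numbers, as the study does, keeps the differences roughly symmetric across the wide dynamic range of ddPCR.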

  13. Ecological impacts of alien species: quantification, scope, caveats and recommendations

    Czech Academy of Sciences Publication Activity Database

    Kumschick, S.; Gaertner, M.; Vila, M.; Essl, F.; Jeschke, J.M.; Pyšek, Petr; Ricciardi, A.; Bacher, S.; Blackburn, T. M.; Dick, J. T. A.; Evans, T.; Hulme, P. E.; Kühn, I.; Mrugala, A.; Pergl, Jan; Rabitsch, W.; Richardson, D. M.; Sendek, A.; Winter, M.

    2015-01-01

    Roč. 65, č. 1 (2015), s. 55-63 ISSN 0006-3568 R&D Projects: GA ČR(CZ) GAP504/11/1028; GA ČR GB14-36079G Grant - others:AV ČR(CZ) AP1002 Program:Akademická prémie - Praemium Academiae Institutional support: RVO:67985939 Keywords : biological invasions * impact * prediction Subject RIV: EH - Ecology, Behaviour Impact factor: 4.294, year: 2015

  14. A critical view on microplastic quantification in aquatic organisms

    DEFF Research Database (Denmark)

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R.

    2015-01-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics...... to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature...... review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus...

  15. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction and chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human...... synovium, using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of......, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm² synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm² (median). The mast cells constituted 0.8% of all the cell profiles...

  16. The ratio of right ventricular volume to left ventricular volume reflects the impact of pulmonary regurgitation independently of the method of pulmonary regurgitation quantification

    International Nuclear Information System (INIS)

    Śpiewak, Mateusz; Małek, Łukasz A.; Petryka, Joanna; Mazurkiewicz, Łukasz; Miłosz, Barbara; Biernacka, Elżbieta K.; Kowalski, Mirosław; Hoffman, Piotr; Demkow, Marcin; Miśko, Jolanta; Rużyłło, Witold

    2012-01-01

    Background: Previous studies have advocated quantifying pulmonary regurgitation (PR) by using PR volume (PRV) instead of commonly used PR fraction (PRF). However, physicians are not familiar with the use of PRV in clinical practice. The ratio of right ventricle (RV) volume to left ventricle volume (RV/LV) may better reflect the impact of PR on the heart than RV end-diastolic volume (RVEDV) alone. We aimed to compare the impact of PRV and PRF on RV size expressed as either the RV/LV ratio or RVEDV (mL/m²). Methods: Consecutive patients with repaired tetralogy of Fallot were included (n = 53). PRV, PRF and ventricular volumes were measured with the use of cardiac magnetic resonance. Results: RVEDV was more closely correlated with PRV when compared with PRF (r = 0.686, p 2.0 [area under the curve (AUC) PRV = 0.770 vs AUC PRF = 0.777, p = 0.86]. Conversely, with the use of the RVEDV-based criterion (>170 mL/m²), PRV proved to be superior to PRF (AUC PRV = 0.770 vs AUC PRF = 0.656, p = 0.0028). Conclusions: PRV and PRF have similar significance as measures of PR when the RV/LV ratio is used instead of RVEDV. The RV/LV ratio is a universal marker of RV dilatation independent of the method of PR quantification applied (PRF vs PRV).
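    The two quantities being compared reduce to simple ratios. A minimal sketch with illustrative volumes, using the standard CMR definitions (PRF as regurgitant over antegrade pulmonary flow volume; RV/LV as the ratio of end-diastolic volumes):

```python
def pr_fraction(prv_ml, antegrade_ml):
    """PR fraction (%) = regurgitant volume / antegrade volume * 100."""
    return 100.0 * prv_ml / antegrade_ml

def rv_lv_ratio(rvedv_ml, lvedv_ml):
    """Ratio of right to left ventricular end-diastolic volume."""
    return rvedv_ml / lvedv_ml

prf = pr_fraction(45.0, 100.0)      # 45 mL regurgitant, 100 mL forward flow
ratio = rv_lv_ratio(260.0, 120.0)   # a dilated RV gives RV/LV > 2.0
```

    Normalizing RV size by LV size, rather than by body surface area alone, is what makes the RV/LV ratio independent of how the regurgitation itself is quantified.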

  17. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as one obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification because the number of repetitive sequences is unknown. A promising approach was developed in which data from amplification of repetitive sequences were used in relative quantification of species ... to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.

  18. Quantification of the impact of water as an impurity on standard physico-chemical properties of ionic liquids

    International Nuclear Information System (INIS)

    Andanson, J.-M.; Meng, X.; Traïkia, M.; Husson, P.

    2016-01-01

    Highlights: • Residual water has a negligible impact on the density of hydrophobic ionic liquids. • The density of a dry sample can be calculated from the density of a wet ionic liquid. • The viscosity of a dry sample can be calculated from that of a wet ionic liquid. • Water can be quantified by NMR spectroscopy even in dried hydrophobic ionic liquids. - Abstract: The objective of this work was to quantify the effect of the presence of water as an impurity in ionic liquids. First, the density and viscosity of five ionic liquids and of their aqueous solutions were measured. For hydrophobic dried ionic liquids, traces of water (50 ppm) have no measurable impact on either the density or the viscosity values. In the concentration range studied (up to 5000 ppm), a linear evolution of the molar volume of the mixture with the mole fraction composition is observed. Practically, this makes it possible to estimate the density of a neat ionic liquid provided that (i) the water content and (ii) the density of the undried sample are known. This is particularly useful for hydrophilic ionic liquids, which are difficult to dry. In the studied concentration range, a linear evolution of the relative viscosity as a function of the mass fraction composition was also observed. It is thus possible to evaluate the viscosity of the pure ionic liquid knowing the water content and the viscosity of the undried sample. The comparison of the results obtained using two viscometers confirms that a Stabinger viscometer is appropriate for precise measurement of ionic liquid viscosities. Second, NMR and IR spectroscopies were used to characterize the pure ionic liquids and their solutions with water. The sensitivity of IR spectroscopy allows neither the quantification nor the detection of water below 1 mol%. With NMR spectroscopy, water can be quantified using either the intensity or the chemical shift of the water proton peak for mole fractions as low as 200 ppm. It is even possible to detect water in
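The linear molar-volume relationship described in this record can be inverted to estimate a dry density from a wet measurement. The sketch below assumes ideal mixing with the molar volume of pure water and uses a hypothetical ionic-liquid molar mass and density; it illustrates the arithmetic only, not the paper's exact procedure.

```python
# Estimate the density of a dry ionic liquid (IL) from a measurement on a
# wet sample, assuming the molar volume of the mixture varies linearly with
# mole fraction (ideal mixing). All numeric values are illustrative.

M_WATER = 18.015        # g/mol
VM_WATER = 18.07        # cm^3/mol, molar volume of pure water near 25 C (assumed)

def dry_density(rho_wet, x_water, m_il):
    """rho_wet in g/cm^3, x_water = water mole fraction, m_il = IL molar mass (g/mol)."""
    x_il = 1.0 - x_water
    m_mix = x_water * M_WATER + x_il * m_il        # mean molar mass of the mixture
    vm_mix = m_mix / rho_wet                       # molar volume of the mixture
    vm_il = (vm_mix - x_water * VM_WATER) / x_il   # linear mixing rule, solved for the IL
    return m_il / vm_il

# Round trip: build a synthetic "wet" density from a known dry density ...
M_IL, RHO_DRY, X_W = 200.0, 1.20, 0.01             # hypothetical ionic liquid
vm_mix = X_W * VM_WATER + (1 - X_W) * (M_IL / RHO_DRY)
rho_wet = (X_W * M_WATER + (1 - X_W) * M_IL) / vm_mix
# ... and recover the dry value
print(round(dry_density(rho_wet, X_W, M_IL), 6))   # -> 1.2
```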

  19. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    1. Introduction Better information on greenhouse gas (GHG) emissions and mitigation potential in the agricultural sector is necessary to manage these emissions and identify responses that are consistent with the food security and economic development priorities of countries. Critical activity data (what crops or livestock are managed in what way) are poor or lacking for many agricultural systems, especially in developing countries. In addition, the currently available methods for quantifying emissions and mitigation are often too expensive or complex or not sufficiently user friendly for widespread use. The purpose of this focus issue is to capture the state of the art in quantifying greenhouse gases from agricultural systems, with the goal of better understanding our current capabilities and near-term potential for improvement, with particular attention to quantification issues relevant to smallholders in developing countries. This work is timely in light of international discussions and negotiations around how agriculture should be included in efforts to reduce and adapt to climate change impacts, and considering that significant climate financing to developing countries in post-2012 agreements may be linked to their increased ability to identify and report GHG emissions (Murphy et al 2010, CCAFS 2011, FAO 2011). 2. Agriculture and climate change mitigation The main agricultural GHGs—methane and nitrous oxide—account for 10%-12% of anthropogenic emissions globally (Smith et al 2008), or around 50% and 60% of total anthropogenic methane and nitrous oxide emissions, respectively, in 2005. Net carbon dioxide fluxes between agricultural land and the atmosphere linked to food production are relatively small, although significant carbon emissions are associated with degradation of organic soils for plantations in tropical regions (Smith et al 2007, FAO 2012). Population growth and shifts in dietary patterns toward more meat and dairy consumption will lead to

  20. Quantification of cellular uptake of DNA nanostructures by qPCR

    DEFF Research Database (Denmark)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias

    2014-01-01

    ... interactions and structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed...

  1. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    Science.gov (United States)

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).

  2. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    Science.gov (United States)

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay.

  3. Resting myocardial blood flow quantification using contrast-enhanced magnetic resonance imaging in the presence of stenosis: A computational fluid dynamics study

    Energy Technology Data Exchange (ETDEWEB)

    Sommer, Karsten, E-mail: sommerk@uni-mainz.de, E-mail: Schreiber-L@ukw.de [Section of Medical Physics, Department of Radiology, Johannes Gutenberg University Medical Center, Mainz 55131, Germany and Max Planck Graduate Center with the Johannes Gutenberg University Mainz, Mainz 55128 (Germany); Bernat, Dominik; Schmidt, Regine; Breit, Hanns-Christian [Section of Medical Physics, Department of Radiology, Johannes Gutenberg University Medical Center, Mainz 55131 (Germany); Schreiber, Laura M., E-mail: sommerk@uni-mainz.de, E-mail: Schreiber-L@ukw.de [Comprehensive Heart Failure Center, Department of Cardiovascular Imaging, Würzburg University Hospital, Würzburg 97078 (Germany)

    2015-07-15

    Purpose: The extent to which atherosclerotic plaques affect contrast agent (CA) transport in the coronary arteries and, hence, quantification of myocardial blood flow (MBF) using magnetic resonance imaging (MRI) is unclear. The purpose of this work was to evaluate the influence of plaque-induced stenosis both on CA transport and on the accuracy of MBF quantification. Methods: Computational fluid dynamics simulations in a highly detailed, realistic vascular model were employed to investigate CA bolus transport in the coronary arteries. The impact of atherosclerosis was analyzed by inserting various medium- to high-grade stenoses in the vascular model. The influence of stenosis morphology was examined by varying the stenosis shapes while keeping the area reduction constant. Errors due to CA bolus transport were analyzed using the tracer-kinetic model MMID4. Results: Dispersion of the CA bolus was found in all models and for all outlets, but with varying magnitude. The impact of stenosis was complex: while high-grade stenoses amplified dispersion, mild stenoses reduced the effect. Morphology was found to have a marked influence on dispersion for a small number of outlets in the post-stenotic region. Despite this marked influence on the concentration–time curves, MBF errors were less affected by stenosis. In total, MBF was underestimated by −7.9% to −44.9%. Conclusions: The presented results reveal that local hemodynamics in the coronary vasculature appears to have a direct impact on CA bolus dispersion. Inclusion of atherosclerotic plaques resulted in a complex alteration of this effect, with both the degree of area reduction and the stenosis morphology affecting the amount of dispersion. This strong influence of vascular transport effects impairs the accuracy of MRI-based MBF quantification techniques and, potentially, other bolus-based perfusion measurement techniques like computed tomography perfusion imaging.

  4. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification ... The PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant.

  5. Quantification of the genetic risk of environmental mutagens

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1988-01-01

    Screening methods are used for hazard identification. Assays for heritable mutations in mammals are used for the confirmation of short-term test results and for the quantification of the genetic risk. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of the expected frequency of genetic changes induced per unit of exposure. The other, referred to as the doubling-dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The indirect method uses experimental data only for the calculation of the doubling dose. The quality of the risk estimate depends on the assumption of persistence of the induced mutations and on the ability to determine the current incidence of genetic diseases. The difficulty of improving the estimates of current incidences of genetic diseases, or of the persistence of the genes in the population, led to the development of an alternative method: the direct estimation of the genetic risk. The direct estimation uses experimental data on the induced frequency of dominant mutations in mice. For verification of these quantifications, the data from Hiroshima and Nagasaki can be used. According to the estimate obtained with the direct method, one would expect fewer than 1 radiation-induced dominant cataract among 19,000 children with one or both parents exposed. The expected overall frequency of dominant mutations in the first generation would be 20-25, based on radiation-induced dominant cataract mutations. It is estimated that 10 times more recessive than dominant mutations are induced. The same approaches can be used to determine the impact of chemical mutagens
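The direct method described in this record reduces to scaling arithmetic: an induced frequency for a well-ascertained sentinel class (dominant cataract) is multiplied up to all dominant mutations. The sketch below reproduces that arithmetic with the figures quoted in the record; the cataract-to-all-dominants multiplier is inferred from those figures and is an illustrative assumption, not a value stated by the author.

```python
# Direct-method scaling sketch (illustrative). The record quotes:
#   * fewer than ~1 induced dominant cataract expected among 19,000 children
#   * an expected 20-25 dominant mutations overall in the first generation
# which implies a cataract-to-all-dominants multiplier of roughly 20-25x.

def expected_dominant_mutations(expected_cataracts, multiplier):
    """Scale a sentinel-phenotype expectation to all dominant mutations."""
    return expected_cataracts * multiplier

def expected_recessive_mutations(dominant_total, ratio=10):
    """The record estimates ~10x more recessive than dominant mutations."""
    return dominant_total * ratio

low = expected_dominant_mutations(1.0, 20)   # low end of the quoted range
high = expected_dominant_mutations(1.0, 25)  # high end of the quoted range
print(low, high, expected_recessive_mutations(high))  # -> 20.0 25.0 250.0
```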

  6. A highly sensitive method for quantification of iohexol

    DEFF Research Database (Denmark)

    Schulz, A.; Boeringer, F.; Swifka, J.

    2014-01-01

    A liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS) approach using the multiple reaction monitoring mode for iohexol quantification. In order to test whether a significantly decreased amount of iohexol is sufficient for reliable quantification, an LC-ESI-MS approach was assessed. We analyzed the kinetics of iohexol in rats after application of different amounts of iohexol (15 mg to 150 µg per rat). Blood sampling was conducted at four time points, at 15, 30, 60, and 90 min after iohexol injection. The analyte (iohexol) and the internal standard (iothalamic acid) were separated from serum proteins using a centrifugal filtration device with a cut-off of 3 kDa. The chromatographic separation was achieved on an analytical Zorbax SB C18 column. The detection and quantification were performed on a high-capacity trap mass spectrometer using positive-ion ESI in the multiple reaction monitoring (MRM) mode. Furthermore, using real-time polymerase...

  7. Quantification of the Impact of Roadway Conditions on Emissions

    Science.gov (United States)

    2017-11-01

    The scope of this project involved developing a methodology to quantify the impact of road conditions on emissions and providing guidance to assist TxDOT in improving maintenance strategies to reduce gas emissions. The research quantified vehicle ...

  8. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D-, and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low-frequency resonant modes, density functional theory (DFT) was adopted for theoretical calculation. It was found that the collective THz-frequency motions were determined by the intramolecular and intermolecular hydrogen-bond interactions. Moreover, the quantification of a penicillamine enantiomer mixture was demonstrated by a THz spectral fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
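A spectral fitting of the kind described can be posed as linear least squares against reference spectra of the two pure forms. The sketch below solves the two-component case with the normal equations; the spectra are synthetic and the approach is a generic illustration, not the authors' exact fitting procedure.

```python
# Fit a mixture spectrum m as a*s1 + b*s2 by linear least squares, where
# s1 and s2 are reference spectra of the two pure components. Solved via
# the 2x2 normal equations (Cramer's rule); spectra here are synthetic.

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def unmix(s1, s2, m):
    """Return (a, b) minimizing ||m - a*s1 - b*s2||^2."""
    a11, a12, a22 = dot(s1, s1), dot(s1, s2), dot(s2, s2)
    b1, b2 = dot(s1, m), dot(s2, m)
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)

# Synthetic reference spectra with distinct "absorption peaks"
s1 = [0.1, 0.9, 0.3, 0.1, 0.0]
s2 = [0.0, 0.1, 0.2, 0.8, 0.4]
mix = [0.3 * x + 0.7 * y for x, y in zip(s1, s2)]  # 30:70 mixture

a, b = unmix(s1, s2, mix)
print(round(a, 6), round(b, 6))  # -> 0.3 0.7
```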

  9. 74 Solving the unit commitment problem for production units of ...

    African Journals Online (AJOL)

    Mr PERABI

    fossil fuels in the various power plants. As a result, the ED must also respect environmental constraints. Economic and environmental dispatch (EED) aims not only to produce at a reduced fuel cost but also to reduce the gas emissions from these fuels.

  10. Search results | Page 11 | IDRC - International Development Research Centre ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Poverty and environmental vulnerability in Angola's rapidly growing slums. This grant will allow the organization Development Workshop (DW) to determine the environmental burden in certain urban areas of Angola and to analyze the differences in poverty and ...

  11. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. The ddPCR-Tail method is comparable to qPCR and fluorometry (Qubit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  12. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  13. Optimization of SPECT calibration for quantification of images applied to dosimetry with iodine-131

    International Nuclear Information System (INIS)

    Carvalho, Samira Marques de

    2018-01-01

    SPECT system calibration plays an essential role in the accuracy of image quantification. In the first stage of this work, an optimized SPECT calibration method was proposed for 131I studies, considering the partial volume effect (PVE) and the position of the calibration source. In the second stage, the study investigated the impact of count density and reconstruction parameters on the determination of the calibration factor and on image quantification in dosimetry studies, considering the reality of clinical practice in Brazil. In the final step, the study evaluated the influence of several factors on the calibration for absorbed-dose calculation using Monte Carlo (MC) simulations with the GATE code. Calibration was performed by determining a calibration curve (sensitivity versus volume) obtained by applying different thresholds. Then, the calibration factors were determined with an exponential function fit. Images were acquired with high and low count densities for several source positions within the phantom. To validate the calibration method, the calibration factors were used for absolute quantification of the total reference activities. The images were reconstructed using two sets of different parameters, as usually used in patient imaging. The methodology developed for the calibration of the tomographic system was easier and faster to implement than other procedures suggested to improve the accuracy of the results. The study also revealed the influence of the location of the calibration source, demonstrating better precision in the absolute quantification when the location of the target region is considered during system calibration. The study, applied to the Brazilian thyroid protocol, suggests revising the calibration of the SPECT system to include different positions for the reference source, as well as acquisitions considering the signal-to-noise ratio (SNR) of the images.
Finally, the doses obtained with the
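In practice a calibration factor of this kind converts a reconstructed count rate into activity, with a volume-dependent recovery coefficient compensating the partial volume effect. The sketch below is a generic illustration of that conversion with hypothetical numbers, not the thesis's exact procedure.

```python
# Generic SPECT absolute-quantification sketch (illustrative numbers).
# A phantom with known activity gives a calibration factor CF (cps per MBq);
# a volume-dependent recovery coefficient RC compensates partial volume losses.

def calibration_factor(count_rate_cps, known_activity_mbq):
    """Phantom-derived system sensitivity, in cps/MBq."""
    return count_rate_cps / known_activity_mbq

def quantify_activity(count_rate_cps, cf, recovery_coefficient=1.0):
    """Convert a measured VOI count rate into activity (MBq)."""
    return count_rate_cps / (cf * recovery_coefficient)

cf = calibration_factor(4600.0, 100.0)      # phantom: 100 MBq measured as 4600 cps
act = quantify_activity(2300.0, cf, recovery_coefficient=0.8)
print(round(act, 3))  # -> 62.5
```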

  14. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
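qPCR quantifies a library against a standard curve: Cq is linear in log10(quantity), and the slope also yields the amplification efficiency. The sketch below fits such a curve by ordinary least squares on synthetic data; it is a generic illustration of the calculation, not the protocol of this chapter.

```python
import math

# Fit Cq = m*log10(Q) + b on a standard dilution series, report the
# amplification efficiency, and quantify an unknown sample from its Cq.
# Data are synthetic: a perfect 10-fold series with slope -3.3219 (100% efficiency).

def fit_standard_curve(quantities, cqs):
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, cqs)) / sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx                       # slope, intercept

def efficiency(slope):
    return 10 ** (-1.0 / slope) - 1.0           # 1.0 means 100% efficient

def quantify(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

slope_true = -math.log2(10)                     # -3.3219...: one Cq per doubling
qs = [1e2, 1e3, 1e4, 1e5]
cqs = [35.0 + slope_true * math.log10(q) for q in qs]

m, b = fit_standard_curve(qs, cqs)
print(round(efficiency(m), 4))                             # -> 1.0
print(round(quantify(35.0 + slope_true * 2.0, m, b), 4))   # -> 100.0
```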

  15. Quantification of analytes affected by relevant interfering signals under quality controlled conditions

    International Nuclear Information System (INIS)

    Bettencourt da Silva, Ricardo J.N.; Santos, Julia R.; Camoes, M. Filomena G.F.C.

    2006-01-01

    The analysis of organic contaminants or residues in biological samples is frequently affected by the presence of compounds producing interfering instrumental signals. This feature is responsible for the higher complexity and cost of these analyses and/or for a significant reduction in the number of analytes studied in a multi-analyte method. This work presents a methodology to estimate the impact of the interfering compounds on the quality of the analysis of complex samples, based on separative instrumental methods of analysis, aiming at supporting the inclusion of analytes affected by interfering compounds in the list of compounds analysed in the studied samples. The proposed methodology involves the study of the magnitude of the signal produced by the interfering compounds in the analysed matrix, and is applicable to analytical systems affected by interfering compounds of varying concentration in the studied matrix. It is based on the comparison of the signals from a representative number of examples of the studied matrix, in order to estimate the impact of the presence of such compounds on measurement quality. The treatment of the chromatographic signals needed to collect these data can easily be performed with the signal-subtraction algorithms available in most analytical instrumentation software. Subtracting the interfering-compound signal from the sample signal compensates for the interfering effect irrespective of the relative magnitudes of the interfering and analyte signals, supporting the applicability of the same model of method performance over a broader concentration range. The quantification of the measurement uncertainty was performed using the differential approach, which allows estimation of the contribution of the presence of interfering compounds to the quality of the measurement. The proposed methodology was successfully applied to the analysis of

  16. Two-Phase Microfluidic Systems for High Throughput Quantification of Agglutination Assays

    KAUST Repository

    Castro, David

    2018-01-01

    ... assay, with a minimum detection limit of 50 ng/mL using optical image analysis. We compare optical image analysis and light scattering as quantification methods, and demonstrate the first light-scattering quantification of agglutination assays in a two-phase ...

  17. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng μL⁻¹ humic acids. In contrast, digital PCR provided accurate quantification data with a concentration of humic acids up to 9.34 ng μL⁻¹. The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, quantified copy numbers from real-time PCR and digital PCR became similar, indicating that dilution was a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, the copy number of archaeal 16S rRNA genes was 1.04×10³ copies/g-sediment by digital PCR, whereas real-time PCR only resulted in 4.64×10² copies/g-sediment, which was most likely due to an inhibitory effect. The data from this study demonstrated that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and showed the great advantages of digital PCR in accurate quantification of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
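Digital PCR's absolute quantification rests on Poisson statistics: from the fraction of positive partitions one recovers the mean number of target copies per partition, and hence a concentration, without a standard curve. A minimal sketch of that correction follows; the partition volume used is a typical, assumed value, not one from this study.

```python
import math

# Poisson correction used in digital PCR: if a fraction p of partitions is
# positive, the mean copies per partition is lambda = -ln(1 - p).

def copies_per_partition(positive, total):
    p = positive / total
    return -math.log(1.0 - p)

def concentration_copies_per_ul(positive, total, partition_volume_nl=0.85):
    """0.85 nL is a typical droplet volume, assumed here for illustration."""
    lam = copies_per_partition(positive, total)
    return lam / (partition_volume_nl * 1e-3)   # convert nL to µL basis

# Half the partitions positive -> lambda = ln 2
print(round(copies_per_partition(10000, 20000), 4))  # -> 0.6931
```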

  18. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  19. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    Directory of Open Access Journals (Sweden)

    Jongbin Ko

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
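Route-based scoring of this kind combines per-hop network vulnerability scores with an end-to-end security score into one figure per attack route. The abstract does not give AVQS's exact formula, so the weighted aggregation below is purely an assumed illustration of the idea; the route names and scores are hypothetical.

```python
# Illustrative attack-route scoring in the spirit of AVQS: combine per-hop
# network vulnerability scores with an end-to-end security score. The
# weighted-mean aggregation below is an assumption for illustration only;
# the paper's exact formula is not reproduced here.

def route_vulnerability(hop_scores, end_to_end_score, network_weight=0.6):
    """Score one attack route from its hops' scores and an e2e security score."""
    network_component = sum(hop_scores) / len(hop_scores)
    return network_weight * network_component + (1 - network_weight) * end_to_end_score

def most_vulnerable_route(routes):
    """routes: {name: (hop_scores, e2e_score)} -> name of the highest-scoring route."""
    return max(routes, key=lambda r: route_vulnerability(*routes[r]))

routes = {  # hypothetical AMI attack routes
    "meter->collector->headend": ([7.0, 5.0, 8.0], 6.0),
    "meter->gateway->headend":   ([4.0, 4.5, 5.0], 5.5),
}
print(most_vulnerable_route(routes))  # -> meter->collector->headend
```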

  20. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    Science.gov (United States)

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  1. Comparison of machine learning and semi-quantification algorithms for [123I]FP-CIT classification: the beginning of the end for semi-quantification?

    Science.gov (United States)

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of [123I]ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data, respectively. Classification
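One of the semi-quantification variants above (mean minus k standard deviations of age-matched controls) reduces to a few lines of arithmetic. The sketch below implements that variant generically with synthetic SBR values; it is an illustration of the rule, not the study's code.

```python
import math

# Semi-quantification by normal limits: a patient's striatal binding ratio
# (SBR) is called abnormal if it falls below mean - k*SD of matched controls.

def normal_limit(control_sbrs, k=2.0):
    n = len(control_sbrs)
    mean = sum(control_sbrs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in control_sbrs) / (n - 1))
    return mean - k * sd

def classify(sbr, limit):
    return "abnormal" if sbr < limit else "normal"

controls = [2.9, 3.1, 3.0, 2.8, 3.2, 3.0]   # synthetic control SBRs
limit = normal_limit(controls, k=2.0)
print(classify(1.6, limit), classify(3.0, limit))  # -> abnormal normal
```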

  2. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).

    Science.gov (United States)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-11-01

    Volume quantification of lung nodules with multidetector computed tomography (CT) images provides useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition) and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted for different dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms.
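In phantom studies like this one, accuracy and precision amount to the bias and spread of the percent volume error across repeated scans. A small illustrative helper; the repeated nodule volumes are hypothetical, not the study's measurements:

```python
def percent_error_stats(measured_volumes, true_volume):
    """Accuracy = mean percent volume error; precision = SD of those errors across repeats."""
    errors = [100.0 * (m - true_volume) / true_volume for m in measured_volumes]
    mean_err = sum(errors) / len(errors)
    sd = (sum((e - mean_err) ** 2 for e in errors) / (len(errors) - 1)) ** 0.5
    return mean_err, sd

# Repeated measurements of a 400 mm^3 synthetic nodule (hypothetical values):
bias, spread = percent_error_stats([450.0, 460.0, 440.0], 400.0)
print(f"accuracy {bias:.1f}%, precision {spread:.1f}%")
```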

  3. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. The method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions, which requires no programming effort or sophisticated machinery to solve. For heterostructures of fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures, building the subbands point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
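An energy quantification condition of this kind is a transcendental equation in the energy, and the allowed minibands are the energies for which it has a solution. The classic Kronig-Penney form below illustrates the idea: energies where |cos(qa)| <= 1 belong to a miniband. All parameters are hypothetical (free-electron mass, equal masses in well and barrier), not the paper's GaAs/GaAlAs values or its exact Bastard-type condition:

```python
import math

HBAR2_2M = 3.81   # hbar^2/(2 m_e) in eV*Angstrom^2, free-electron mass assumed
V0 = 0.3          # barrier height (eV), hypothetical
WELL, BARRIER = 50.0, 5.0  # widths in Angstrom, hypothetical

def eqc(energy):
    """cos(q a) from wavefunction matching across one well/barrier period."""
    k = math.sqrt(energy / HBAR2_2M)             # wavevector in the well
    kappa = math.sqrt((V0 - energy) / HBAR2_2M)  # decay constant in the barrier
    return (math.cos(k * WELL) * math.cosh(kappa * BARRIER)
            + (kappa**2 - k**2) / (2 * k * kappa)
            * math.sin(k * WELL) * math.sinh(kappa * BARRIER))

# Scan energies below the barrier: |eqc(E)| <= 1 marks an allowed miniband.
allowed = [e / 1000 for e in range(1, 300) if abs(eqc(e / 1000)) <= 1.0]
print(f"lowest miniband starts near {allowed[0]:.3f} eV" if allowed else "no allowed states")
```

A plot of eqc(E) makes the band edges visible as the crossings of the +/-1 lines, which is essentially the "point by point" subband construction the abstract describes.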

  4. Quantification of rat brain SPECT with 123I-ioflupane: evaluation of different reconstruction methods and image degradation compensations using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Roé-Vellvé, N; Pino, F; Cot, A; Ros, D; Falcon, C; Gispert, J D; Pavía, J; Marin, C

    2014-01-01

    SPECT studies with 123I-ioflupane facilitate the diagnosis of Parkinson’s disease (PD). The effect on quantification of image degradations has been extensively evaluated in human studies but their impact on studies of experimental PD models is still unclear. The aim of this work was to assess the effect of compensating for the degrading phenomena on the quantification of small animal SPECT studies using 123I-ioflupane. This assessment enabled us to evaluate the feasibility of quantitatively detecting small pathological changes using different reconstruction methods and levels of compensation for the image degrading phenomena. Monte Carlo simulated studies of a rat phantom were reconstructed and quantified. Compensations for point spread function (PSF), scattering, attenuation and partial volume effect were progressively included in the quantification protocol. A linear relationship was found between calculated and simulated specific uptake ratio (SUR) in all cases. In order to significantly distinguish disease stages, noise-reduction during the reconstruction process was the most relevant factor, followed by PSF compensation. The smallest detectable SUR interval was determined by biological variability rather than by image degradations or coregistration errors. The quantification methods that gave the best results allowed us to distinguish PD stages with SUR values that are as close as 0.5 using groups of six rats to represent each stage. (paper)
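The specific uptake ratio is, by its usual definition, the reference-normalized specific binding; partial-volume compensation can be folded in as a recovery coefficient. A sketch with hypothetical mean-count values (the paper's exact compensation chain is more elaborate):

```python
def specific_uptake_ratio(striatal_mean, reference_mean, recovery=1.0):
    """SUR = (C_striatum - C_reference) / C_reference, with an optional recovery
    coefficient (0 < recovery <= 1) to undo partial-volume signal loss."""
    corrected = striatal_mean / recovery
    return (corrected - reference_mean) / reference_mean

print(specific_uptake_ratio(30.0, 10.0))                # no PVE correction -> 2.0
print(specific_uptake_ratio(24.0, 10.0, recovery=0.8))  # corrected to 30 -> 2.0
```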

  5. Environmental Impact of Munition and Propellant Disposal (Impact Environnemental de l’Elimination des Munitions et des Combustibles)

    Science.gov (United States)

    2010-02-01

    Session topics included the future of demilitarization, with a poster session covering studies of explosive contamination in Lithuania and its toxic effects. Annexes list the presentations and documents supporting the capability assessments and the presentations, papers/posters and videos from the Sofia meeting. The meeting included participants from NATO and from partner nations including Russia and Georgia.

  6. Qualite bio-ecologique d'un milieu lacustre hyper-eutrophisee en ...

    African Journals Online (AJOL)

    Dr AJEAGAH

    Phytoplankton cells were observed with a Motic B1-series microscope ... of Lac de Bè; preliminary report on the monitoring of water quality. ... [30] A. COUTE & A. ILTIS, Scanning electron microscope study of ... [40] S. SANE, Environmental control of primary production in the lake of ...

  7. Les premiers succès obtenus en matière de lutte contre la faim en ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    8 Nov. 2013 ... Nutritionists can usually treat zinc and iron deficiencies with dietary supplements, but it became clear that the root cause of the problem was environmental. It was thus learned that the health of the soil and of plants goes hand in hand with the health ...

  8. Impact of deep learning on the normalization of reconstruction kernel effects in imaging biomarker quantification: a pilot study in CT emphysema

    Science.gov (United States)

    Jin, Hyeongmin; Heo, Changyong; Kim, Jong Hyo

    2018-02-01

    Differing reconstruction kernels are known to strongly affect the variability of imaging biomarkers and thus remain a barrier to translating computer-aided quantification techniques into clinical practice. This study presents a deep learning application for CT kernel conversion, which converts a CT image from a sharp kernel to a standard kernel, and evaluates its impact on reducing the variability of a pulmonary imaging biomarker, the emphysema index (EI). Forty cases of low-dose chest CT exams obtained with 120 kVp, 40 mAs, 1 mm thickness, and two reconstruction kernels (B30f, B50f) were selected from the low-dose lung cancer screening database of our institution. A fully convolutional network was implemented with the Keras deep learning library. The model consisted of symmetric layers to capture the context and fine-structure characteristics of CT images from the standard and sharp reconstruction kernels. Pairs of the full-resolution CT data set were fed to input and output nodes to train the convolutional network to learn the appropriate filter kernels for converting the CT images of the sharp kernel to the standard kernel, with a criterion of minimizing the mean squared error between the input and target images. EIs (RA950 and Perc15) were measured with a software package (ImagePrism Pulmo, Seoul, South Korea) and compared for the data sets of B50f, B30f, and the converted B50f. The effect of kernel conversion was evaluated with the mean and standard deviation of pair-wise differences in EI. The population mean of RA950 was 27.65 +/- 7.28% for the B50f data set, 10.82 +/- 6.71% for the B30f data set, and 8.87 +/- 6.20% for the converted B50f data set. The mean of pair-wise absolute differences in RA950 between B30f and B50f was reduced from 16.83% to 1.95% using kernel conversion. Our study demonstrates the feasibility of applying the deep learning technique for CT kernel conversion and reducing the kernel-induced variability of EI quantification. The deep learning model has a
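The two emphysema indices themselves have standard definitions: RA950 is the percentage of segmented lung voxels below -950 HU, and Perc15 is the HU value at the 15th percentile of the lung histogram. A minimal sketch with hypothetical voxel values (a nearest-rank percentile is assumed; the commercial package may interpolate differently):

```python
import math

def emphysema_indices(lung_hu):
    """RA950: % of lung voxels below -950 HU; Perc15: HU at the 15th percentile
    (nearest-rank). Input is a flat list of segmented lung-voxel HU values."""
    vals = sorted(lung_hu)
    ra950 = 100.0 * sum(v < -950 for v in vals) / len(vals)
    perc15 = vals[max(0, math.ceil(0.15 * len(vals)) - 1)]
    return ra950, perc15

# Ten hypothetical lung voxels:
hu = [-980, -960, -940, -920, -900, -880, -860, -840, -820, -800]
print(emphysema_indices(hu))  # (20.0, -960)
```

Because a sharper kernel adds noise that pushes voxels below the -950 HU cutoff, RA950 inflates on B50f images; kernel conversion removes that bias before the index is computed.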

  9. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    Science.gov (United States)

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotations for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
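freeQuant's exact algorithm is not given in the abstract, but the normalized spectral abundance factor (NSAF) is the standard way to combine spectral counts with protein sequence length, and serves as a sketch of the idea:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: (SpC/L) for each protein, divided
    by the sum of SpC/L over all proteins in the sample."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Two hypothetical proteins with equal spectral counts but different lengths:
print(nsaf([10, 10], [100, 400]))  # [0.8, 0.2] -- the shorter protein is more abundant
```

Length normalization matters because a long protein yields more observable peptides, and hence more spectra, at the same molar abundance.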

  10. Quantification of viable spray-dried potential probiotic lactobacilli using real-time PCR

    Directory of Open Access Journals (Sweden)

    Radulović Zorica

    2012-01-01

    Full Text Available The basic requirement for probiotic bacteria to be able to exert the expected positive effects is that they be alive. Therefore, appropriate quantification methods are crucial. Bacterial quantification based on nucleic acid detection is increasingly used. Spray-drying (SD is one of the possibilities to improve the survival of probiotic bacteria against negative environmental effects. The aim of this study was to investigate the survival of spray-dried Lactobacillus plantarum 564 and Lactobacillus paracasei Z-8, and to investigate the impact on some probiotic properties caused by SD of both tested strains. Besides the plate count technique, the aim was to examine the possibility of using propidium monoazide (PMA in combination with real-time polymerase chain reaction (PCR for determining spray-dried tested strains. The number of intact cells, Lb. plantarum 564 and Lb. paracasei Z-8, was determined by real-time PCR with PMA, and it was similar to the number of investigated strains obtained by the plate count method. Spray-dried Lb. plantarum 564 and Lb. paracasei Z-8 demonstrated very good probiotic ability. It may be concluded that the PMA real-time PCR determination of the viability of probiotic bacteria could complement the plate count method and SD may be a cost-effective way to produce large quantities of some probiotic cultures. [Projekat Ministarstva nauke Republike Srbije, br. 046010]
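Viable counts in PMA real-time PCR are read off a standard curve relating the quantification cycle (Ct) to the log10 template amount. A sketch of that inversion; the intercept is hypothetical and must come from a lab-specific calibration, while a slope of -3.32 is the textbook value for 100% amplification efficiency:

```python
def log10_count_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert the qPCR standard curve Ct = slope * log10(N) + intercept.
    slope = -3.32 corresponds to 100% amplification efficiency; the intercept
    here is a hypothetical calibration value."""
    return (ct - intercept) / slope

print(log10_count_from_ct(21.4))  # ~5 log10 genome copies per reaction
```

With PMA pre-treatment, dead cells with compromised membranes are excluded from amplification, so the estimate approximates the viable count rather than the total count.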

  11. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    Science.gov (United States)

    Akram, Muhammad Farooq Bin

    The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject-matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio: when multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, in the case of a lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, where experts are otherwise forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for
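Dempster-Shafer evidence theory, mentioned above, fuses expert opinions by assigning mass to sets of hypotheses rather than to single outcomes. A minimal sketch of Dempster's rule of combination over a two-element frame; the expert mass assignments are hypothetical:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal elements and
    renormalize by 1 - K, where K is the total conflicting mass."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Two hypothetical expert opinions on a technology's impact, frame {low, high}:
HIGH, LOW, EITHER = frozenset({"high"}), frozenset({"low"}), frozenset({"low", "high"})
expert1 = {HIGH: 0.6, EITHER: 0.4}            # partly uncommitted (epistemic)
expert2 = {HIGH: 0.5, LOW: 0.3, EITHER: 0.2}
fused = dempster_combine(expert1, expert2)
print(fused[HIGH])  # belief mass on "high" after fusion
```

Mass left on the whole frame (EITHER) is exactly the "I don't know" that forced probability distributions cannot express, which is the abstract's argument for the epistemic representation.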

  12. Effect of gadolinium on hepatic fat quantification using multi-echo reconstruction technique with T2* correction and estimation

    Energy Technology Data Exchange (ETDEWEB)

    Ge, Mingmei; Wu, Bing; Liu, Zhiqin; Song, Hai; Meng, Xiangfeng; Wu, Xinhuai [The Military General Hospital of Beijing PLA, Department of Radiology, Beijing (China); Zhang, Jing [The 309th Hospital of Chinese People' s Liberation Army, Department of Radiology, Beijing (China)

    2016-06-15

    To determine whether hepatic fat quantification is affected by administration of gadolinium using a multiecho reconstruction technique with T2* correction and estimation. Forty-eight patients underwent the investigational sequence for hepatic fat quantification at 3.0T MRI once before and twice after administration of gadopentetate dimeglumine (0.1 mmol/kg). A one-way repeated-measures analysis of variance with pairwise comparisons was conducted to evaluate the systematic bias of fat fraction (FF) and R2* measurements between the three acquisitions. Bland-Altman plots were used to assess the agreement between pre- and post-contrast FF measurements in the liver. A P value <0.05 indicated a statistically significant difference. FF measurements of liver, spleen and spine revealed no significant systematic bias between the three measurements (P > 0.05 for all). Good agreement (95 % confidence interval) of FF measurements was demonstrated between pre-contrast and post-contrast1 (-0.49 %, 0.52 %) and post-contrast2 (-0.83 %, 0.77 %). R2* increased in liver and spleen (P = 0.039, P = 0.01) after administration of gadolinium. Despite the increased R2* in liver and spleen post-contrast, the investigational sequence can still obtain stable fat quantification. It could therefore be applied post-contrast to substantially increase the efficiency of the MR examination and also provide a backup for the occasional failure of FF measurement pre-contrast. (orig.)
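The Bland-Altman agreement intervals reported above are the mean bias plus or minus 1.96 standard deviations of the pairwise differences. A sketch with hypothetical pre- and post-contrast fat fractions (not the study's data):

```python
import statistics

def bland_altman(pre, post):
    """Bias and 95% limits of agreement (bias +/- 1.96 SD of pairwise differences)."""
    diffs = [a - b for a, b in zip(pre, post)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical pre- vs post-contrast fat fractions (%):
bias, (low, high) = bland_altman([5.0, 6.0, 7.0, 8.0], [5.2, 5.9, 7.1, 8.0])
print(f"bias {bias:.2f}%, limits of agreement ({low:.2f}%, {high:.2f}%)")
```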

  14. Quantification of UHMWPE wear in periprosthetic tissues of hip arthroplasty: Description of a new method based on IR and comparison with radiographic appearance

    Czech Academy of Sciences Publication Activity Database

    Šlouf, Miroslav; Pokorný, D.; Entlicher, G.; Dybal, Jiří; Synková, Hana; Lapčíková, Monika; Fejfarková, Z.; Špundová, M.; Veselý, F.; Sosna, A.

    2008-01-01

    Roč. 265, 5-6 (2008), s. 674-684 ISSN 0043-1648 R&D Projects: GA ČR GA106/04/1118; GA MŠk 2B06096 Institutional research plan: CEZ:AV0Z40500505 Keywords : UHMWPE * isolation of wear debris * quantification of wear particles Subject RIV: CD - Macromolecular Chemistry Impact factor: 1.509, year: 2008

  15. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating the fire APS (it also covers floods and earthquakes). With the application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire APS. This paper describes the main features of the program for the quantification of a fire APS. (Author)

  16. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients.
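Velocity mapping quantifies regurgitation by integrating the flow curve over the cardiac cycle: positive (antegrade) samples give the forward stroke volume and negative (retrograde) samples the regurgitant volume. A sketch with hypothetical flow samples, using a simple rectangle-rule integration:

```python
def stroke_volumes(flow_ml_per_s, dt_s):
    """Integrate a velocity-mapped aortic flow curve sampled every dt_s seconds:
    positive samples -> forward volume, negative samples -> regurgitant volume."""
    forward = sum(q * dt_s for q in flow_ml_per_s if q > 0)
    reverse = -sum(q * dt_s for q in flow_ml_per_s if q < 0)
    return forward, reverse

# Hypothetical flow samples (ml/s) at 0.1 s intervals over one cardiac cycle:
fwd, rev = stroke_volumes([100.0, 200.0, -50.0, -25.0], 0.1)
print(f"regurgitant fraction {100.0 * rev / fwd:.0f}%")  # 7.5/30 -> 25%
```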

  17. Impact of follicular G-CSF quantification on subsequent embryo transfer decisions: a proof of concept study.

    Science.gov (United States)

    Lédée, N; Gridelet, V; Ravet, S; Jouan, C; Gaspard, O; Wenders, F; Thonon, F; Hincourt, N; Dubois, M; Foidart, J M; Munaut, C; Perrier d'Hauterive, S

    2013-02-01

    Previous experiments have shown that granulocyte colony-stimulating factor (G-CSF), quantified in the follicular fluid (FF) of individual oocytes, correlates with the potential for an ongoing pregnancy of the corresponding fertilized oocytes among selected transferred embryos. Here we present a proof-of-concept study aimed at evaluating the impact of including FF G-CSF quantification in embryo transfer decisions. FF G-CSF was quantified with the Luminex xMAP technology in 523 individual FF samples corresponding to 116 fresh transferred embryos, 275 frozen embryos and 131 destroyed embryos from 78 patients undergoing ICSI. Follicular G-CSF was highly predictive of subsequent implantation. The receiver operating characteristic curve methodology showed its higher discriminatory power to predict ongoing pregnancy in multivariate logistic regression analysis for FF G-CSF compared with embryo morphology [0.77 (0.69-0.83)]. Embryos were classified by their FF G-CSF concentration: Class I over 30 pg/ml (the highest positive predictive value for implantation), Class II from 30 to 18.4 pg/ml and Class III below 18.4 pg/ml. Embryos derived from Class I follicles had a significantly higher implantation rate (IR) than those from Class II and III follicles (36 versus 16.6 and 6%). Embryos derived from Class I follicles with an optimal morphology reached an IR of 54%. Frozen-thawed embryo transfers derived from Class I follicles had an IR of 37%, significantly higher than those from Class II and III follicles (8 and 5%, respectively). Not only transferred embryos but also 10% of the destroyed embryos were derived from G-CSF Class I follicles. Non-optimal embryos appear to have been transferred in 28% (22/78) of the women, and their pregnancy rate was significantly lower than that of women who received at least one optimal embryo (18 versus 36%, P = 0.04). 
    Monitoring FF G-CSF for the selection of embryos with a better potential for pregnancy might improve the effectiveness of IVF by reducing the time and cost
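Class cutoffs like the 30 pg/ml boundary above are typically chosen from the ROC curve; a common criterion is the point maximizing Youden's J (sensitivity + specificity - 1). A sketch with hypothetical G-CSF values and outcomes, not the study's data or its actual cutoff-selection method:

```python
def youden_threshold(values, labels):
    """Choose the cutoff maximizing Youden's J = sensitivity + specificity - 1.
    labels: 1 for implanted (positive), 0 otherwise."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(values)):
        tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
        j = tp / pos + tn / neg - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Hypothetical FF G-CSF values (pg/ml) and implantation outcomes:
cutoff, j = youden_threshold([35, 40, 25, 10, 12, 8], [1, 1, 1, 0, 0, 0])
print(cutoff, j)  # 25, J = 1.0 on this perfectly separable toy data
```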

  18. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations

    Energy Technology Data Exchange (ETDEWEB)

    Schryvers, D., E-mail: nick.schryvers@uantwerpen.be [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Salje, E.K.H. [Department of Earth Sciences, University of Cambridge, Cambridge CB2 3EQ (United Kingdom); Nishida, M. [Department of Engineering Sciences for Electronics and Materials, Faculty of Engineering Sciences, Kyushu University, Kasuga, Fukuoka 816-8580 (Japan); De Backer, A. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Idrissi, H. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Institute of Mechanics, Materials and Civil Engineering, Université Catholique de Louvain, Place Sainte Barbe, 2, B-1348, Louvain-la-Neuve (Belgium); Van Aert, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2017-05-15

    The present contribution gives a review of recent quantification work of atom displacements, atom site occupations and level of crystallinity in various systems and based on aberration corrected HR(S)TEM images. Depending on the case studied, picometer range precisions for individual distances can be obtained, boundary widths at the unit cell level determined or statistical evolutions of fractions of the ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. - Highlights: • Quantification of picometer displacements at ferroelastic twin boundary in CaTiO{sub 3}. • Quantification of kinks in meandering ferroelectric domain wall in LiNbO{sub 3}. • Quantification of column occupation in anti-phase boundary in Co-Pt. • Quantification of atom displacements at twin boundary in Ni-Ti B19′ martensite.

  19. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses of H&E stains are counted manually in hot spots. Yet, its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which also may enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm(2) square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm(2) (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77 % of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance with an adjusted hazard ratio of 5.5 (95 % CI, 1.3-24, P = 0.024) as opposed to 1.3 (95 % CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated to their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm are required for prognostic purposes. Thus, automated
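Automated hot-spot selection over a fixed 1-mm² square amounts to sliding a window over the tile-wise positive-cell counts and keeping the maximum. A brute-force sketch; the grid of counts is hypothetical, and a production implementation would use summed-area tables for speed:

```python
def hottest_square(counts, w):
    """Exhaustively locate the w*w tile block (e.g. a 1-mm^2 square at known
    pixel pitch) with the most positive cells; returns (row, col, total)."""
    rows, cols = len(counts), len(counts[0])
    best = (0, 0, -1)
    for i in range(rows - w + 1):
        for j in range(cols - w + 1):
            total = sum(counts[i + di][j + dj] for di in range(w) for dj in range(w))
            if total > best[2]:
                best = (i, j, total)
    return best

# Hypothetical PHH3/MART1-positive cell counts per image tile:
grid = [[0, 1, 0],
        [2, 3, 0],
        [0, 0, 1]]
print(hottest_square(grid, 2))  # (0, 0, 6)
```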

  20. The use of self-quantification systems for personal health information: big data management activities and prospects.

    Science.gov (United States)

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  1. Clinical applications of MS-based protein quantification.

    Science.gov (United States)

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview over current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A Study of Tongue and Pulse Diagnosis in Traditional Korean Medicine for Stroke Patients Based on Quantification Theory Type II

    Directory of Open Access Journals (Sweden)

    Mi Mi Ko

    2013-01-01

    Full Text Available In traditional Korean medicine (TKM, pattern identification (PI diagnosis is important for treating diseases. The aim of this study was to comprehensively investigate the relationship between the PI type and tongue diagnosis or pulse diagnosis variables. The study included 1,879 stroke patients who were admitted to 12 oriental medical university hospitals from June 2006 through March 2009. The status of the pulse and tongue was examined in each patient. Additionally, to investigate relatively important indicators related to specialist PI, the quantification theory type II analysis was performed regarding the PI type. In the first axis quantification of the external criteria, the Qi-deficiency and the Yin-deficiency patterns were located in the negative direction, while the dampness-phlegm (DP and fire-heat patterns were located in the positive direction. The explanatory variable with the greatest impact on the assessment was a fine pulse. In the second axis quantification, the external criteria were divided into either the DP or non-DP patterns. The slippery pulse exhibited the greatest effect on the division. This study attempted to build a model using a statistical method to objectively quantify PI and various indicators that constitute the unique diagnosis system of TKM. These results should assist the development of future diagnostic standards in stroke PI.

  3. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  4. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang , Min; Yang , Le; Zhao , Huizhong; Zhang , Leijie; Zhong , Zhiyou; Liu , Yanling; Chen , Jianhua

    2009-01-01

    International audience; A quantification model of transient heat conduction was provided to simulate the apple fruit temperature distribution during the cooling process. The model was based on the energy variation at different points in the apple fruit. It took into account the heat exchange of a representative elemental volume, metabolic heat, and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  5. Initial water quantification results using neutron computed tomography

    Science.gov (United States)

    Heller, A. K.; Shi, L.; Brenizer, J. S.; Mench, M. M.

    2009-06-01

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Unlike X-rays, neutrons are attenuated by some light materials, such as hydrogen, yet penetrate many heavy materials. Neutron computed tomography is therefore useful for obtaining three-dimensional information about a sample's interior structure and material properties that traditional methods cannot provide. The neutron computed tomography system at the Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigating water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for testing the quantification technique. Transmission images of the test sample at different angles were acquired using a dedicated image-acquisition computer that drives a rotary table controller through an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
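
    The water quantification step can be sketched from the Beer-Lambert attenuation law, I = I0·exp(−μt): each pixel of a transmission image yields a water path length, and summing over pixels gives a volume. The attenuation coefficient and pixel size below are illustrative assumptions, not values from the study:

```python
import math

# Beer-Lambert sketch: recover water thickness per ray from transmitted
# vs open-beam counts, then integrate over the detector to get a volume.
MU_WATER = 3.5      # cm^-1, assumed effective neutron attenuation coefficient
PIXEL_AREA = 0.01   # cm^2 per detector pixel (assumed 1 mm x 1 mm)

def water_thickness(i, i0, mu=MU_WATER):
    """Water path length (cm) along one ray: t = -ln(I/I0) / mu."""
    return -math.log(i / i0) / mu

def water_volume(transmission_image, i0):
    """Total water volume (cm^3): per-pixel thickness times pixel area."""
    return sum(water_thickness(i, i0) * PIXEL_AREA
               for row in transmission_image for i in row)

# Uniform 1 cm of water over a 2x2-pixel patch:
i0 = 1000.0
img = [[i0 * math.exp(-MU_WATER * 1.0)] * 2 for _ in range(2)]
vol = water_volume(img, i0)
print(vol)  # 4 pixels * 1 cm * 0.01 cm^2 = 0.04 cm^3
```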

  6. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...
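
    The deconvolution problem the abstract refers to can be written as a linear system: the measured tissue curve is the discrete convolution of the arterial input function (AIF) with the flow-scaled residue function. The sketch below builds that convolution matrix on synthetic data and inverts it with plain Tikhonov regularization, a common baseline; the paper's contribution is to replace this with a Gaussian-process prior that encodes the smoothness of the IRF:

```python
import numpy as np

# Synthetic DSC-MRI deconvolution: c = A @ r, where A is the
# lower-triangular Toeplitz matrix built from the AIF samples (times dt).
dt = 1.0
n = 20
t = np.arange(n) * dt
aif = (t + 0.5) * np.exp(-t / 2.0)   # synthetic arterial input function
r_true = np.exp(-t / 5.0)            # smooth residue function (IRF)

A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                   for i in range(n)])
c = A @ r_true                        # noise-free "measured" tissue curve

# Tikhonov-regularized inversion (baseline stand-in; a GP prior would
# enforce smoothness in a statistically principled way).
lam = 1e-6
r_est = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ c)
err = float(np.max(np.abs(r_est - r_true)))
print(err)  # small on noise-free data
```

    With noisy data the regularization weight (or the GP hyperparameters) trades fidelity against smoothness, which is exactly the choice the GPD method automates.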

  7. SU-F-R-28: Correction of FCh-PET Bladder Uptake Using Virtual Sinograms and Investigation of Its Impact On the Quantification of Prostate Textural Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Laberge, S; Beauregard, J; Archambault, L [CHUQ Pavillon Hotel-Dieu de Quebec, Quebec, QC (Canada)

    2016-06-15

    Purpose: Textural biomarkers as a tool for quantifying intratumoral heterogeneity hold great promise for diagnosis and early assessment of treatment response in prostate cancer. However, spill-in counts from the bladder uptake are suspected to have an impact on the textural measurements of the prostate volume. This work proposes a correction method for the FCh-PET bladder uptake and investigates its impact on intraprostatic textural properties. Methods: Two patients with PC received pre-treatment dynamic FCh-PET scans reconstructed at four time points (interval: 2 min), for which prostate and bladder contours were obtained. Projection bins affected by bladder uptake were determined by forward-projection. For each time point and axial position, virtual sinograms were obtained and affected bins replaced by a weighted combination of original values and values interpolated using cubic spline from non-affected bins of the current and adjacent projection angles. The process was optimized using a genetic algorithm in terms of minimization of the root-mean-square error (RMSE) within the bladder between the corrected dynamic time point volume and a reference initial uptake volume. Finally, the impact of the bladder uptake correction on the prostate region was investigated using two standard SUV metrics (1) and three texture metrics (2): 1) SUVmax, SUVmean; 2) Contrast, Homogeneity, Coarseness. Results: Without bladder uptake correction, SUVmax and SUVmean were on average overestimated in the prostate by 0%, 0%, 33.2%, 51.2%, and 3.6%, 6.0%, 2.9%, 3.2%, for each time point respectively. Contrast varied by −9.1%, −6.7%, +40.4%, +107.7%, and Homogeneity and Coarseness by +4.5%, +1.8%, −8.8%, −14.8% and +1.0%, +0.5%, −9.5%, +0.9%. Conclusion: We proposed a method for FCh-PET bladder uptake correction and showed an impact on the quantification of the prostate signal. This method achieved a large reduction of intra-prostatic SUVmax while minimizing the impact on SUVmean
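
    The bin-replacement step can be illustrated as follows: projection bins flagged as affected by bladder uptake are overwritten with a weighted blend of their original value and a value interpolated from unaffected neighbours. This toy version uses 1-D linear interpolation within a single projection and a fixed blend weight for interior bins only, whereas the paper uses cubic splines across adjacent projection angles and optimizes the weight with a genetic algorithm:

```python
# Toy sinogram-row correction: blend each affected bin with a value
# interpolated from the nearest unaffected bins on either side.

def correct_bins(row, affected, weight=0.8):
    """Return a copy of one sinogram row with affected interior bins corrected.

    weight -- fraction taken from the interpolated estimate (assumed fixed
              here; the paper optimizes it with a genetic algorithm).
    """
    good = [i for i in range(len(row)) if i not in affected]
    out = list(row)
    for i in affected:
        lo = max(j for j in good if j < i)   # nearest clean bin to the left
        hi = min(j for j in good if j > i)   # nearest clean bin to the right
        interp = row[lo] + (row[hi] - row[lo]) * (i - lo) / (hi - lo)
        out[i] = weight * interp + (1.0 - weight) * row[i]
    return out

row = [10.0, 11.0, 30.0, 30.0, 14.0, 15.0]   # bins 2-3 hit by bladder spill-in
corrected = correct_bins(row, {2, 3}, weight=1.0)
print(corrected)
```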

  8. Treatment system for resin-regeneration effluents to remove copper

    CERN Document Server

    Karademir, Aynur

    2005-01-01

    Ion-exchange resin cartridges are used to maintain the quality of the demineralized water needed to cool CERN's accelerators. The effluents from the regeneration of these cartridges contain copper concentrations too high to permit discharge into the clear-water or even the waste-water networks. The design of a treatment system to bring down the copper concentration, based on a simple method, with high efficiency, and satisfactory from the standpoint of safety, environmental impact and cost, is the subject of this work.

  9. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian [University of Bern, From the Institute of Forensic Medicine, Bern (Switzerland); Persson, Anders; Warntjes, Marcel J. [University of Linkoeping, The Center for Medical Image Science and Visualization (CMIV), Linkoeping (Sweden)

    2015-08-15

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)
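
    The correction the authors derive maps a quantitative value measured at core temperature T to its expected value at the reference temperature of 37 °C. A linear form is the simplest such correction; the slope below is purely illustrative, not a published coefficient:

```python
# Illustrative linear temperature correction of a postmortem T1 value to
# the reference temperature of 37 C. The slope is an assumption for the
# sketch, not the coefficient published in the paper.

T1_SLOPE_PER_DEGC = 10.0  # ms of T1 change per degree C (assumed)

def t1_at_37(t1_measured, body_temp_c, slope=T1_SLOPE_PER_DEGC):
    """Correct a measured T1 (ms) to the reference temperature of 37 C."""
    return t1_measured + slope * (37.0 - body_temp_c)

# A liver T1 of 500 ms measured at a postmortem core temperature of 20 C:
corrected_t1 = t1_at_37(500.0, 20.0)
print(corrected_t1)  # 500 + 10 * 17 = 670.0 ms
```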

  10. Temperature dependence of postmortem MR quantification for soft tissue discrimination

    International Nuclear Information System (INIS)

    Zech, Wolf-Dieter; Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J.

    2015-01-01

    To investigate and correct the temperature dependence of postmortem MR quantification used for soft tissue characterization and differentiation in thoraco-abdominal organs. Thirty-five postmortem short axis cardiac 3-T MR examinations were quantified using a quantification sequence. Liver, spleen, left ventricular myocardium, pectoralis muscle and subcutaneous fat were analysed in cardiac short axis images to obtain mean T1, T2 and PD tissue values. The core body temperature was measured using a rectally inserted thermometer. The tissue-specific quantitative values were related to the body core temperature. Equations to correct for temperature differences were generated. In a 3D plot comprising the combined data of T1, T2 and PD, different organs/tissues could be well differentiated from each other. The quantitative values were influenced by the temperature. T1 in particular exhibited strong temperature dependence. The correction of quantitative values to a temperature of 37 C resulted in better tissue discrimination. Postmortem MR quantification is feasible for soft tissue discrimination and characterization of thoraco-abdominal organs. This provides a base for computer-aided diagnosis and detection of tissue lesions. The temperature dependence of the T1 values challenges postmortem MR quantification. Equations to correct for the temperature dependence are provided. (orig.)

  11. Quantification is Neither Necessary Nor Sufficient for Measurement

    International Nuclear Information System (INIS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-01-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement

  12. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    Science.gov (United States)

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identifying the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization and usually used for elemental analysis, has unique advantages for absolute quantification of proteins: it determines an element present in, or attached to, a protein with a definite stoichiometry. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also described, and the possibilities and challenges of ICPMS-based quantification for universal, selective, or targeted analysis of proteins and cells in a biological sample are discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  13. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in the 74% of the cases studied. In addition, the performance of the method proposed is compared with the first principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for the 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: ► A method for standardless quantification in EPMA is presented. ► It gives better results than the commercial software GENESIS Spectrum. ► It gives better results than the software DTSA. ► It allows the determination of the conductive coating thickness. ► It gives an estimation for the concentration uncertainties.
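
    The core idea, predicting the spectrum analytically and minimizing the quadratic difference to the measurement, can be sketched on a toy spectrum. Here only two peak intensities are optimized, and since the model is linear in them the optimum is the ordinary least-squares solution; the real method optimizes many more parameters (background, detector response, concentrations) and reports their uncertainties:

```python
import numpy as np

# Toy EPMA-like spectrum: two characteristic lines with Gaussian shape.
x = np.arange(100, dtype=float)

def peak(center, width=3.0):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

basis = np.column_stack([peak(30.0), peak(70.0)])   # analytical model
true_intensities = np.array([5.0, 2.0])
measured = basis @ true_intensities                 # synthetic "experiment"

# Minimize the quadratic difference between model and measurement;
# linear in the intensities, so ordinary least squares gives the optimum.
est, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(est)  # close to [5.0, 2.0]
```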

  14. Publications | Page 122 | CRDI - Centre de recherches pour le ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    IDRC works with researchers and institutions in developing countries to strengthen local capacity through ... funded by IDRC, which supplies computers to schools while creating jobs, tackles the environmental problem caused by electronic waste by ...

  15. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  16. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution used to calibrate the GC analysis: 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins priced in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this step in the quantitative analysis is certainly not trivial. The method developed here can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
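
    Quantitative 1H NMR of the kind used here rests on the fact that signal integrals are proportional to the number of contributing protons, so against an internal standard of known concentration, c_analyte = c_std · (I_analyte / I_std) · (N_std / N_analyte). The numbers below are illustrative, not values from the paper:

```python
# qNMR relation: concentration from relative 1H integrals against an
# internal standard of known concentration. Illustrative numbers only.

def qnmr_conc(c_std_mM, integral_analyte, integral_std,
              n_protons_analyte, n_protons_std):
    """Analyte concentration (mM) from relative 1H NMR integrals."""
    return (c_std_mM * (integral_analyte / integral_std)
            * (n_protons_std / n_protons_analyte))

# e.g. a 1-proton analyte signal integrating to 0.5 against a 9-proton
# standard signal integrating to 9.0, with the standard at 1.0 mM:
conc = qnmr_conc(1.0, 0.5, 9.0, 1, 9)
print(conc)  # 1.0 * (0.5/9.0) * 9 = 0.5 mM
```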

  17. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  18. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Baiyu [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 and Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Barnhart, Huiman [Department of Biostatistics and Bioinformatics, Duke University, Durham, North Carolina 27705 (United States); Richard, Samuel [Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 and Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Robins, Marthony [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Colsher, James [Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Samei, Ehsan [Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Duke University, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Department of Biomedical Engineering, and Department of Electronic and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States)

    2013-11-15

    Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables.Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision.Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A.Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of

  19. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR)

    International Nuclear Information System (INIS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-01-01

    Purpose: Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification with dose and slice thickness as additional variables.Methods: Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision.Results: Precision was found to be generally comparable between FBP and iterative reconstruction with no statistically significant difference noted for different dose levels, slice thickness, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A.Conclusions: The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of

  20. Lung involvement quantification in chest radiographs; Quantificacao de comprometimento pulmonar em radiografias de torax

    Energy Technology Data Exchange (ETDEWEB)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Instituto de Biociencias. Departamento de Fisica e Biofisica; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M., E-mail: giacomini@ibb.unesp.br [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Faculdade de Medicina. Departamento de Doencas Tropicais e Diagnostico por Imagem

    2014-12-15

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease which remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantifying chest abnormalities are usually based on computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments. However, precise quantification is often not feasible because of the number of CT scans required. The purpose of this work is to develop a methodology for quantification of lung damage caused by TB through chest radiographs. An algorithm was developed in Matlab for computational processing of the exams, which creates a 3D representation of the lungs with the compromised dilated regions inside. The quantification of lung lesions was also performed for the same patients through CT scans. The measurements from the two methods were compared and showed strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and a better cost-benefit ratio for the institution. (author)
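
    The Bland-Altman agreement check used above can be reproduced in a few lines: compute the per-patient differences between the two quantification methods, then the 95% limits of agreement as mean ± 1.96·SD, and verify that all samples fall within them. The data below are made up for illustration:

```python
import math

# Bland-Altman sketch on made-up paired volume measurements.
def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement for paired data."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

radiograph = [102.0, 95.0, 110.0, 98.0, 105.0]   # hypothetical volumes
ct = [100.0, 97.0, 108.0, 99.0, 104.0]

bias, lo, hi = bland_altman(radiograph, ct)
within = all(lo <= x - y <= hi for x, y in zip(radiograph, ct))
print(bias, lo, hi, within)
```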

  1. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... alternative for the quantification of the disease' syndromes in regards to this crop. The result of these ..... parison of treatments such as cultivars or control measures and ..... Vascular discoloration and stem necrosis. 2.

  2. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    Directory of Open Access Journals (Sweden)

    Fuqiang Sun

    2017-06-01

    Full Text Available Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM and a genetic algorithm (GA. Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
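
    The three damage-sensitive features above can be computed directly from a baseline and a current Lamb-wave signal. The definitions below are common choices and are assumptions for the sketch, not necessarily the exact formulas used by the authors:

```python
import math

# Three damage-sensitive features from a baseline and a current signal:
# normalized amplitude, a phase-change proxy, and Pearson correlation.
def features(baseline, current):
    amp = max(abs(x) for x in current) / max(abs(x) for x in baseline)
    # phase-change proxy: shift (in samples) of the envelope peak
    lag = (max(range(len(current)), key=lambda i: abs(current[i]))
           - max(range(len(baseline)), key=lambda i: abs(baseline[i])))
    mb = sum(baseline) / len(baseline)
    mc = sum(current) / len(current)
    cov = sum((x - mb) * (y - mc) for x, y in zip(baseline, current))
    corr = cov / math.sqrt(sum((x - mb) ** 2 for x in baseline)
                           * sum((y - mc) ** 2 for y in current))
    return amp, lag, corr

t = [i * 0.1 for i in range(100)]
base = [math.sin(2 * math.pi * x) * math.exp(-0.3 * x) for x in t]
# a "damaged" signal: attenuated and slightly delayed
dmg = [0.8 * math.sin(2 * math.pi * (x - 0.05)) * math.exp(-0.3 * (x - 0.05))
       for x in t]
amp, lag, corr = features(base, dmg)
print(amp, lag, corr)
```

    Feature vectors like (amp, lag, corr) are what the GA-tuned LS-SVM regresses against crack size.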

  3. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    Science.gov (United States)

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.

  4. [DNA quantification of blood samples pre-treated with pyramidon].

    Science.gov (United States)

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated in EDTA, and bloodstains were made on filter paper. The experiments were divided into six groups according to storage time: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h after pre-treatment with pyramidon. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the DNA yield decreased gradually with time after pre-treatment with pyramidon. For a given storage time, DNA yields differed significantly between extraction methods. Sixteen-locus DNA profiles were obtained in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for STR profiling and DNA extraction.

  5. Real-Time PCR Quantification of Chloroplast DNA Supports DNA Barcoding of Plant Species.

    Science.gov (United States)

    Kikkawa, Hitomi S; Tsuge, Kouichiro; Sugita, Ritsuko

    2016-03-01

    Species identification from extracted DNA is sometimes needed for botanical samples. DNA quantification is required for an accurate and effective examination. If a quantitative assay provides unreliable estimates, a higher quantity of DNA than the estimated amount may be used in additional analyses to avoid failure to analyze samples from which extracting DNA is difficult. Compared with conventional methods, real-time quantitative PCR (qPCR) requires a low amount of DNA and enables accurate quantification of dilute DNA solutions. The aim of this study was to develop a qPCR assay for quantification of chloroplast DNA from taxonomically diverse plant species. An absolute quantification method was developed using primers targeting the ribulose-1,5-bisphosphate carboxylase/oxygenase large subunit (rbcL) gene with SYBR Green I-based qPCR. The calibration curve was generated using the PCR amplicon as the template, and DNA extracts from representatives of 13 plant families common in Japan were examined. This demonstrates that qPCR analysis is an effective method for quantification of DNA from plant samples. The results of qPCR assist in the decision-making that will determine the success or failure of DNA analysis, indicating the possibility of optimizing the procedure for downstream reactions.
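
    Absolute quantification with a calibration curve works as follows: Cq values measured on a dilution series of the standard (here, an rbcL amplicon, as in the assay above) are regressed against log10 copy number; the slope gives the amplification efficiency, and unknowns are read off the line. The Cq values below are synthetic, generated for an ideal 100%-efficiency reaction (slope −3.32):

```python
import math

# qPCR standard curve: Cq vs log10(copies), simple linear regression.
# (copy number, Cq) pairs are synthetic, ideal-efficiency values.
standards = [(1e6, 15.0), (1e5, 18.32), (1e4, 21.64), (1e3, 24.96)]

xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to 100%

def copies(cq):
    """Absolute copy number for an unknown sample's Cq."""
    return 10.0 ** ((cq - intercept) / slope)

est_copies = copies(20.0)
print(round(slope, 2), round(efficiency, 3), round(est_copies))
```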

  6. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    Science.gov (United States)

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrate for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in the clinical praxis.

  7. Activity quantification of phantom using dual-head SPECT with two-view planar image

    International Nuclear Information System (INIS)

    Guo Leiming; Chen Tao; Sun Xiaoguang; Huang Gang

    2005-01-01

    The absorbed radiation dose from an internally deposited radionuclide is a major factor in assessing risk and therapeutic utility in nuclear medicine diagnosis and treatment. Quantification of absolute activity in vivo is a necessary procedure for estimating the absorbed dose of an organ or tissue. To assess accuracy in the determination of organ activity, experiments on 99mTc activity quantification were performed on a body phantom using dual-head SPECT with the two-view counting technique. Accuracy in the activity quantification is reliable and is not affected by the depth of the source organ in vivo. When the diameter of the radiation source is ≤2 cm, the most accurate activity quantification can be obtained once the system calibration factor and transmission factor are established. The use of Buijs's method is preferable, especially at very low source-to-background activity concentration ratios. (authors)
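The two-view (conjugate-view) counting technique above can be sketched as follows: the geometric mean of opposed anterior and posterior count rates, corrected by a measured transmission factor and a system calibration factor, yields the absolute activity. The numbers below are illustrative assumptions.

```python
import math

def conjugate_view_activity(counts_ant, counts_post, transmission_factor,
                            calibration_cps_per_mbq):
    """Geometric-mean (conjugate-view) activity estimate in MBq.

    counts_ant / counts_post: background-corrected count rates (cps)
    from the opposed anterior and posterior planar views.
    transmission_factor: exp(-mu * T) measured through the body at the
    source location.
    calibration_cps_per_mbq: system sensitivity from a source in air.
    """
    geometric_mean = math.sqrt(counts_ant * counts_post)
    return geometric_mean / (math.sqrt(transmission_factor)
                             * calibration_cps_per_mbq)

# Illustrative numbers: 400 and 100 cps, 25% transmission, 100 cps/MBq
activity = conjugate_view_activity(400.0, 100.0, 0.25, 100.0)
print(f"{activity:.1f} MBq")  # 4.0 MBq
```

The geometric mean is what makes the estimate approximately independent of source depth, consistent with the finding reported above.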

  8. Assessment of DNA degradation induced by thermal and UV radiation processing: implications for quantification of genetically modified organisms.

    Science.gov (United States)

    Ballari, Rajashekhar V; Martin, Asha

    2013-12-01

    DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMOs) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual-target plasmid, as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the least recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during processing, DNA could still be reliably quantified by real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation, since the transgenic and the taxon-specific target sequences, possessing approximately similar lengths, degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content provided the quantitative assays target shorter amplicons and the difference in amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
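The relative transgenic content measured by real-time PCR can be sketched with a delta-delta-Ct calculation against a certified reference material. The function below is a simplified illustration assuming equal amplification efficiency for the transgenic (CaMV 35S) and taxon-specific (zein) targets; all Ct values are hypothetical.

```python
def relative_gmo_content(ct_transgene, ct_taxon,
                         ct_transgene_ref, ct_taxon_ref,
                         efficiency=2.0):
    """Delta-delta-Ct estimate of transgenic content relative to a
    certified reference material. Assumes equal amplification
    efficiency (2.0 = perfect doubling per cycle) for both targets."""
    delta_sample = ct_transgene - ct_taxon
    delta_ref = ct_transgene_ref - ct_taxon_ref
    return efficiency ** -(delta_sample - delta_ref)

# Hypothetical Ct values: the sample's transgene lags one extra cycle
# behind the taxon gene compared with the reference -> half the content.
ratio = relative_gmo_content(26.0, 20.0, 25.0, 20.0)
print(ratio)  # 0.5
```

Because both targets degrade in parallel, the two delta-Ct terms shift together under processing, which is why the ratio is robust to degradation.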

  9. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    Science.gov (United States)

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...

  10. Quantification of uranyl in presence of citric acid

    International Nuclear Information System (INIS)

    Garcia G, N.; Barrera D, C.E.; Ordonez R, E.

    2007-01-01

    To determine the influence of soil organic matter on uranyl sorption onto certain solids, a reliable and sufficiently rapid technique for uranyl detection and quantification is necessary. In this work, we therefore propose to quantify uranyl in the presence of citric acid by modifying the UV-Vis radiation-induced fluorescence technique. Since the uranyl ion is very sensitive to the medium that contains it (speciation, pH, ionic strength, etc.), it was necessary to develop an analysis technique that enhances the fluorescence of the uranyl ion while avoiding that produced by the organic acids. (Author)

  11. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    Science.gov (United States)

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions about the tissue structures. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground-truth image dataset was manually prepared in consultation with an experienced pathologist for validation of the segmentation algorithms. Experiments involving experienced pathologists demonstrated a good correlation in quantification results between the automated system and the pathologists' visual evaluation. Experiments investigating inter-pathologist variability also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.

  12. The environmental impact of the Porto Primavera hydroelectric plant (Brazil)

    Directory of Open Access Journals (Sweden)

    Jailton Dias

    2002-12-01

    Full Text Available The construction of the Porto Primavera hydroelectric plant on the upper Paraná River, in south-central Brazil, has brought about major transformations of the environment and of the organization of space. The scale and speed of these changes lend themselves to monitoring by remote sensing. Landsat TM images show that the construction of the dam has given new impetus to regional economic development.

  13. Economic evaluation of environmental impacts of open cast mining project - an approach

    International Nuclear Information System (INIS)

    Maiti, S.K.; Pathak, K.

    1998-01-01

    Economic valuation of environmental attributes is a pragmatic approach to evaluating impacts and helps decision makers arrive at objective decisions on the basis of cost-benefit ratios. For determining the physical impact and its quantification, four evaluation methods, namely market price, surrogate market price, survey-based and cost-based approaches, are generally used. The present paper reviews the importance of environmental evaluation of the impacts of mining and also reviews a few suitable methodologies that could be effectively used for economic evaluation of environmental impacts in open cast mining projects. (author)

  14. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    Science.gov (United States)

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantification in phantom and ex vivo acquisitions. PMID:24123362
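The temperature dependence described above can be sketched numerically: the water resonance drifts roughly -0.01 ppm per °C while the fat peaks do not, so the effective fat-water frequency offsets used by CSE fitting change with temperature. The six-peak fat model below uses common literature values as assumptions, not the paper's exact model, and ignores T2* decay for brevity.

```python
import numpy as np

GAMMA_BAR = 42.577e6  # proton gyromagnetic ratio, Hz/T
B0 = 1.5              # field strength, Tesla

# Six-peak spectral model of fat: relative amplitudes and chemical
# shifts in ppm relative to water at body temperature (assumed values).
FAT_PPM = np.array([-3.80, -3.40, -2.60, -1.94, -0.39, 0.60])
FAT_AMP = np.array([0.087, 0.693, 0.128, 0.004, 0.039, 0.048])

def water_shift_hz(temp_c, ref_temp_c=37.0):
    """Water resonance drifts about -0.01 ppm/degC; fat does not, so
    the fat-water offsets become temperature dependent."""
    return -0.01 * (temp_c - ref_temp_c) * GAMMA_BAR * B0 * 1e-6

def cse_signal(te_s, water, fat, temp_c=37.0):
    """Complex CSE signal at echo times te_s (seconds) for given
    water/fat proton densities, neglecting T2* and phase offsets."""
    f_fat_hz = FAT_PPM * GAMMA_BAR * B0 * 1e-6
    fat_term = fat * np.sum(
        FAT_AMP[:, None] * np.exp(2j * np.pi * f_fat_hz[:, None] * te_s),
        axis=0)
    water_term = water * np.exp(2j * np.pi * water_shift_hz(temp_c) * te_s)
    return water_term + fat_term
```

Fitting signals generated at, say, 20°C with a model that assumes 37°C reproduces the kind of PDFF bias the phantom experiments report.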

  15. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation of the concentration uncertainties.
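A minimal sketch of the fitting idea (not the POEMA implementation): describe the spectrum analytically as characteristic lines on a background and recover the line intensities, which carry the concentration information, by least squares. For simplicity the line energies and widths are held fixed here, which makes this toy problem linear; the actual method optimizes a richer nonlinear parameter set.

```python
import numpy as np

energies = np.linspace(0.0, 10.0, 500)       # keV
line_centers = [1.74, 6.40]                  # e.g. Si K-alpha, Fe K-alpha
sigma = 0.08                                 # detector broadening, keV (assumed)

def basis(e):
    cols = [np.ones_like(e), e]              # linear background
    cols += [np.exp(-((e - c) ** 2) / (2 * sigma ** 2))
             for c in line_centers]          # Gaussian characteristic lines
    return np.column_stack(cols)

# Synthesize a noisy spectrum from known parameters, then refit.
true_params = np.array([10.0, -0.5, 120.0, 300.0])
rng = np.random.default_rng(0)
spectrum = basis(energies) @ true_params + rng.normal(0.0, 1.0, energies.size)

# Minimize the quadratic differences between spectrum and model.
fitted, *_ = np.linalg.lstsq(basis(energies), spectrum, rcond=None)
print(np.round(fitted, 1))
```

In the real method the recovered line intensities are then converted to concentrations through the physical model of X-ray generation and detection.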

  16. Quantification of taurine in energy drinks using ¹H NMR.

    Science.gov (United States)

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is related to a magnitude of various physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in ¹H NMR spectra, which was applied for integration and quantification. Quantification was performed using external calibration (R²>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed by both using ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Due to the high accordance to LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
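External calibration as used above can be sketched as a linear fit of the isolated taurine signal integral against known concentrations; all numbers below are hypothetical.

```python
import numpy as np

# Hypothetical external calibration: integral of the isolated taurine
# signal vs. known taurine concentrations (mg/L).
conc = np.array([500.0, 1000.0, 2000.0, 3000.0, 4000.0])
integral = np.array([0.251, 0.502, 1.004, 1.506, 2.008])

slope, intercept = np.polyfit(conc, integral, 1)
r_squared = np.corrcoef(conc, integral)[0, 1] ** 2

def taurine_mg_per_l(sample_integral):
    """Concentration of an unknown sample from its signal integral."""
    return (sample_integral - intercept) / slope
```

PULCON, by contrast, relates the integral of a single external reference to the sample via the principle of reciprocity, avoiding a full dilution series.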

  17. Initial water quantification results using neutron computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.K. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)], E-mail: axh174@psu.edu; Shi, L.; Brenizer, J.S.; Mench, M.M. [Department of Mechanical and Nuclear Engineering, Pennsylvania State University (United States)

    2009-06-21

    Neutron computed tomography is an important imaging tool in the field of non-destructive testing and in fundamental research for many engineering applications. Contrary to X-rays, neutrons can be attenuated by some light materials, such as hydrogen, but can penetrate many heavy materials. Thus, neutron computed tomography is useful in obtaining important three-dimensional information about a sample's interior structure and material properties that other traditional methods cannot provide. The neutron computed tomography system at Pennsylvania State University's Radiation Science and Engineering Center is being utilized to develop a water quantification technique for investigation of water distribution in fuel cells under normal conditions. A hollow aluminum cylinder test sample filled with a known volume of water was constructed for purposes of testing the quantification technique. Transmission images of the test sample at different angles were easily acquired through the synthesis of a dedicated image acquisition computer driving a rotary table controller and an in-house developed synchronization software package. After data acquisition, Octopus (version 8.2) and VGStudio Max (version 1.2) were used to perform cross-sectional and three-dimensional reconstructions of the sample, respectively. The initial reconstructions and water quantification results are presented.
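Once the cross-sections are reconstructed, per-ray water content follows from the Beer-Lambert law; a minimal sketch, with an assumed thermal-neutron attenuation coefficient for water.

```python
import numpy as np

def water_thickness_cm(transmitted, open_beam, mu_water=3.6):
    """Water thickness along a ray from neutron transmission via the
    Beer-Lambert law, I = I0 * exp(-mu * t). The attenuation
    coefficient mu_water ~ 3.6 1/cm is a typical thermal-neutron
    value, used here as an assumption."""
    return -np.log(transmitted / open_beam) / mu_water
```

Summing per-pixel thicknesses times the pixel area gives a volume that can be checked against the known fill volume of the aluminum test cylinder.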

  18. Techniques for quantification of liver fat in risk stratification of diabetics

    International Nuclear Information System (INIS)

    Kuehn, J.P.; Spoerl, M.C.; Mahlke, C.; Hegenscheid, K.

    2015-01-01

    Fatty liver disease plays an important role in the development of type 2 diabetes. Accurate techniques for detection and quantification of liver fat are essential for clinical diagnostics. Chemical shift-encoded magnetic resonance imaging (MRI) is a simple approach to quantify liver fat content. Liver fat quantification using chemical shift-encoded MRI is influenced by several bias factors, such as T2* decay, T1 recovery and the multispectral complexity of fat. The confounder corrected proton density fat fraction is a simple approach to quantify liver fat with comparable results independent of the software and hardware used. The proton density fat fraction is an accurate biomarker for assessment of liver fat. An accurate and reproducible quantification of liver fat using chemical shift-encoded MRI requires a calculation of the proton density fat fraction. (orig.)
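Once the confounders are corrected, the proton density fat fraction itself is a simple ratio of the separated fat and water signal estimates; a minimal sketch:

```python
import numpy as np

def proton_density_fat_fraction(water, fat):
    """PDFF in percent from confounder-corrected water (W) and fat (F)
    proton-density signal estimates: PDFF = 100 * F / (W + F)."""
    water = np.asarray(water, dtype=float)
    fat = np.asarray(fat, dtype=float)
    return 100.0 * fat / (water + fat)

print(proton_density_fat_fraction(90.0, 10.0))  # 10.0
```

Because the ratio cancels scanner gain, this is what makes PDFF comparable across software and hardware, as the abstract notes.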

  19. An external standard method for quantification of human cytomegalovirus by PCR

    International Nuclear Information System (INIS)

    Rongsen, Shen; Liren, Ma; Fengqi, Zhou; Qingliang, Luo

    1997-01-01

    An external standard method for PCR quantification of HCMV is reported. [α-32P]dATP was used as a tracer. The 32P-labelled specific amplification product was separated by agarose gel electrophoresis. A gel piece containing the specific product band was excised and counted in a plastic scintillation counter. The distribution of [α-32P]dATP in the electrophoretic gel plate and the effectiveness of separation between the 32P-labelled specific product and free [α-32P]dATP were examined. A standard curve for quantification of HCMV by PCR was established and detection results for quality-control templates are presented. The external standard method and the electrophoresis separation effect were appraised. The results showed that the method could be used for relative quantification of HCMV. (author)

  20. Rapid and sensitive quantification of C3- and C6-phosphoesters in starch by fluorescence-assisted capillary electrophoresis.

    Science.gov (United States)

    Verbeke, Jeremy; Penverne, Christophe; D'Hulst, Christophe; Rolando, Christian; Szydlowski, Nicolas

    2016-11-05

    Phosphate groups are naturally present in starch at C3- or C6-position of the glucose residues and impact the structure of starch granules. Their precise quantification is necessary for understanding starch physicochemical properties and metabolism. Nevertheless, reliable quantification of Glc-3-P remains laborious and time consuming. Here we describe a capillary electrophoresis method for simultaneous measurement of both Glc-6-P and Glc-3-P after acid hydrolysis of starch. The sensitivity threshold was estimated at the fg scale, which is compatible with the analysis of less than a μg of sample. The method was validated by analyzing antisense potato lines deficient in SBEs, GWD or GBSS. We show that Glc-3-P content is altered in the latter and that these variations do not correlate with modifications in Glc-6-P content. We anticipate the method reported here to be an efficient tool for high throughput study of starch phosphorylation at both C3- and C6-position. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Generation of structural MR images from amyloid PET: Application to MR-less quantification.

    Science.gov (United States)

    Choi, Hongyoon; Lee, Dong Soo

    2017-12-07

    Structural magnetic resonance (MR) images concomitantly acquired with PET images can provide crucial anatomical information for precise quantitative analysis. However, in the clinical setting, not all subjects have a corresponding MR. Here, we developed a model to generate structural MR images from amyloid PET using deep generative networks, and applied it to quantification of cortical amyloid load without structural MR. Methods: We used florbetapir PET and structural MR data from the Alzheimer's Disease Neuroimaging Initiative database. The generative network was trained to generate realistic structural MR images from florbetapir PET images. After training, the model was applied to the quantification of cortical amyloid load. PET images were spatially normalized to the template space using the generated MR, and the standardized uptake value ratio (SUVR) of the target regions was then measured using predefined regions of interest. A real MR-based quantification was used as the gold standard to measure the accuracy of our approach. Other MR-less methods, namely normal PET template-based, multi-atlas PET template-based and PET segmentation-based normalization/quantification methods, were also tested. We compared the performance of quantification using the generated MR with that of MR-based and MR-less quantification methods. Results: Generated MR images from florbetapir PET showed visually similar signal patterns to the real MR. The structural similarity index between real and generated MR was 0.91 ± 0.04. The mean absolute error of the SUVR of cortical composite regions estimated by the generated MR-based method was 0.04±0.03, which was significantly smaller than for the other MR-less methods (0.29±0.12 for the normal PET template, 0.12±0.07 for the multi-atlas PET template and 0.08±0.06 for the PET segmentation-based method). Bland-Altman plots revealed that the generated MR-based SUVR quantification was the closest to the SUVR values estimated by the real MR-based method. Conclusion
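The ROI-based SUVR computation described above reduces to a ratio of mean uptakes on the spatially normalized PET image; a minimal sketch with a toy array standing in for the volume and hypothetical masks.

```python
import numpy as np

def suvr(pet, target_mask, reference_mask):
    """Standardized uptake value ratio: mean uptake in the composite
    cortical ROI divided by the reference-region mean, computed on a
    spatially normalized PET image."""
    return pet[target_mask].mean() / pet[reference_mask].mean()

# Toy 1-D stand-in for a normalized PET volume and ROI masks.
pet = np.array([2.0, 2.0, 1.0, 1.0])
target = np.array([True, True, False, False])
reference = ~target
print(suvr(pet, target, reference))  # 2.0
```

The quality of the spatial normalization (here, driven by the generated MR) determines how well the predefined masks line up with the subject's anatomy, which is exactly what the SUVR error comparison measures.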

  2. Climate change and waterborne diarrhoea in Northern India: Impact and adaptation strategies

    NARCIS (Netherlands)

    Moors, E.J.; Singh, T.; Siderius, C.; Balakrishnan, S.; Mishra, A.

    2013-01-01

    Although several studies show the vulnerability of human health to climate change, a clear comprehensive quantification of the increased health risks attributable to climate change is lacking. Even more complicated are assessments of adaptation measures for this sector. We discuss the impact of

  3. Quantification of Liver Proton-Density Fat Fraction on a 7.1 Tesla Preclinical MR System: Impact of the Fitting Technique

    Science.gov (United States)

    Mahlke, C; Hernando, D; Jahn, C; Cigliano, A; Ittermann, T; Mössler, A; Kromrey, ML; Domaska, G; Reeder, SB; Kühn, JP

    2016-01-01

    Purpose To investigate the feasibility of estimating the proton-density fat fraction (PDFF) using a 7.1 Tesla magnetic resonance imaging (MRI) system and to compare the accuracy of liver fat quantification using different fitting approaches. Materials and Methods Fourteen leptin-deficient ob/ob mice and eight intact controls were examined in a 7.1 Tesla animal scanner using a 3-dimensional six-echo chemical shift-encoded pulse sequence. Confounder-corrected PDFF was calculated using magnitude fitting (magnitude data alone) and combined fitting (complex and magnitude data). Differences between fitting techniques were compared using Bland-Altman analysis. In addition, PDFFs derived with both reconstructions were correlated with histopathological fat content and triglyceride mass fraction using linear regression analysis. Results The PDFFs determined with the two reconstructions correlated very strongly (r=0.91). However, a small mean bias between reconstructions (3.9%; CI 2.7%-5.1%) indicated divergent results. For both reconstructions, there was linear correlation with histopathology (combined fitting: r=0.61; magnitude fitting: r=0.64) and triglyceride content (combined fitting: r=0.79; magnitude fitting: r=0.70). Conclusion Liver fat quantification using the PDFF derived from MRI performed at 7.1 Tesla is feasible. PDFF has strong correlations with histopathologically determined fat and with triglyceride content. However, small differences between PDFF reconstruction techniques may impair the robustness and reliability of the biomarker at 7.1 Tesla. PMID:27197806

  4. MR Spectroscopy: Real-Time Quantification of in-vivo MR Spectroscopic data

    OpenAIRE

    Massé, Kunal

    2009-01-01

    In the last two decades, magnetic resonance spectroscopy (MRS) has enjoyed increasing success in biomedical research. This technique can discern several metabolites in human tissue non-invasively and thus offers a multitude of medical applications. In clinical routine, quantification plays a key role in the evaluation of the different chemical elements. The quantification of metabolites characterizing specific pathologies helps physicians establish the patient's diagnosis. E...

  5. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    Science.gov (United States)

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by users to adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus, enabling properly informed decisions during drug development.

  6. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    Science.gov (United States)

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Accurate PAE analysis in environmental matrices is therefore a challenging task. This paper reviews the extensive literature on techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) are reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and quality control and quality assurance procedures is presented to overcome the problems of sample contamination and those arising from matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. 1648-IJBCS-Article-Dedjiho Comlam Achille

    African Journals Online (AJOL)

    hp

    the impoverishment of species diversity (Dupré, 2002), thereby causing an imbalance in the trophic chain of a body of water, which can have significant ecological consequences. This phenomenon has today become a widespread environmental problem. In the artificial lakes of Yamoussoukro (Côte.

  8. On impact damage detection and quantification for CFRP laminates using structural response data only

    Science.gov (United States)

    Sultan, M. T. H.; Worden, K.; Pierce, S. G.; Hickey, D.; Staszewski, W. J.; Dulieu-Barton, J. M.; Hodzic, A.

    2011-11-01

    The overall purpose of the research is to detect and attempt to quantify impact damage in structures made from composite materials. A study that uses simplified coupon specimens made from a Carbon Fibre-Reinforced Polymer (CFRP) prepreg with 11, 12 and 13 plies is presented. PZT sensors were placed at three separate locations in each test specimen to record the responses from impact events. To perform damaging impact tests, an instrumented drop-test machine was used and the impact energy was set to cover a range of 0.37-41.72 J. The response signals captured from each sensor were recorded by a data acquisition system for subsequent evaluation. The impacted specimens were examined with an X-ray technique to determine the extent of the damaged areas and it was found that the apparent damaged area grew monotonically with impact energy. A number of simple univariate and multivariate features were extracted from the sensor signals recorded during impact by computing their spectra and calculating frequency centroids. The concept of discordancy from the statistical discipline of outlier analysis is employed in order to separate the responses from non-damaging and damaging impacts. The results show that the potential damage indices introduced here provide a means of identifying damaging impacts from the response data alone.
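The feature extraction and discordancy idea can be sketched as follows: a spectral (frequency) centroid summarizes each sensor response, and a simple univariate discordancy measure flags responses that deviate from the non-damaging baseline. This is a simplified stand-in for the outlier analysis used in the study; signals and baseline values below are synthetic.

```python
import numpy as np

def frequency_centroid(signal, fs):
    """Spectral centroid (Hz) of one sensor response: the power-weighted
    mean frequency, one of the simple univariate features."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return np.sum(freqs * power) / np.sum(power)

def discordancy(feature, baseline):
    """Univariate discordancy: deviation of a feature from the baseline
    (non-damaging) set, in standard deviations; large values flag
    outliers, i.e. candidate damaging impacts."""
    baseline = np.asarray(baseline, dtype=float)
    return abs(feature - baseline.mean()) / baseline.std()

# A pure 100 Hz tone has its centroid at 100 Hz.
fs = 1000.0
t = np.arange(1000) / fs
print(round(frequency_centroid(np.sin(2 * np.pi * 100.0 * t), fs), 1))  # 100.0
```

A damaging impact that shifts spectral energy toward lower frequencies would move the centroid well outside the baseline spread and register a large discordancy.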

  9. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  10. Swift Quantification of Fenofibrate and Tiemonium methylsulfate Active Ingredients in Solid Drugs Using Particle Induced X-Ray Emission

    International Nuclear Information System (INIS)

    Bejjani, A.; Nsouli, B.; Zahraman, K.; Assi, S.; Younes, Gh.; Yazbi, F.

    2011-01-01

    The quantification of active ingredients (AI) in drugs is a crucial and important step in the drug quality control process. This is usually performed using wet chemical techniques such as LC-MS, UV spectrophotometry and other appropriate organic analytical methods. However, if the active ingredient contains specific heteroatoms (F, S, Cl), elemental ion beam analysis (IBA) techniques such as PIXE and PIGE, using a small tandem accelerator of 1-2 MV, can be explored for molecular quantification. IBA techniques permit analysis of the sample in solid form, without laborious sample preparation. In this work, we demonstrate the ability of the Thick Target PIXE technique to rapidly and accurately quantify both low and high concentrations of active ingredients in different commercial drugs. Fenofibrate, a chlorinated active ingredient present in high amounts in two different commercial drugs, was quantified using the relative approach with an external standard. Tiemonium methylsulfate, which exists in relatively low amounts in commercial drugs, was quantified using the GUPIX simulation code (absolute quantification). The experimental aspects related to the validity of the quantification (use of external standards, absolute quantification, matrix effect,...) are presented and discussed. (author)
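The relative approach to an external standard can be sketched as a simple yield ratio, assuming sample and standard have similar matrices (otherwise matrix-effect corrections apply); the numbers are illustrative.

```python
def active_ingredient_fraction(yield_sample, yield_standard,
                               fraction_standard):
    """Relative TT-PIXE quantification: the characteristic X-ray yield
    of the marker heteroatom (e.g. Cl in fenofibrate) scales with its
    mass fraction, assuming sample and standard have similar matrices
    (otherwise matrix-effect corrections are required)."""
    return fraction_standard * (yield_sample / yield_standard)

# Illustrative: 20% higher Cl yield than a 10% w/w standard -> 12% w/w.
fraction = active_ingredient_fraction(1200.0, 1000.0, 0.10)
print(fraction)
```

The absolute route taken for tiemonium methylsulfate instead computes the expected yield per unit concentration from first principles (as GUPIX does), removing the need for a matched standard.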

  11. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit that quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002; Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182

  12. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  13. 15 CFR 990.52 - Injury assessment-quantification.

    Science.gov (United States)

    2010-01-01

    ... (Continued) NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE OIL POLLUTION ACT..., trustees must quantify the degree, and spatial and temporal extent of such injuries relative to baseline. (b) Quantification approaches. Trustees may quantify injuries in terms of: (1) The degree, and...

  14. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    Science.gov (United States)

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.
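
Among the validation figures of merit listed above are the limits of detection and quantification. A common way to estimate them (an ICH-style convention, not necessarily this paper's exact procedure) uses the calibration slope and the standard deviation of the response; the numbers below are hypothetical:

```python
def lod_loq(slope, sd_response):
    """ICH-style estimates from a calibration line:
    LOD = 3.3 * sigma / S and LOQ = 10 * sigma / S,
    where S is the calibration slope and sigma the standard deviation
    of the response (e.g., of the blank or of the intercept)."""
    lod = 3.3 * sd_response / slope
    loq = 10.0 * sd_response / slope
    return lod, loq

# Hypothetical calibration: slope 2.0 AU per (mg/L), response SD 0.1 AU
lod, loq = lod_loq(2.0, 0.1)
```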

  15. MDCT quantification is the dominant parameter in decision-making regarding chest tube drainage for stable patients with traumatic pneumothorax

    Science.gov (United States)

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2013-01-01

    It is commonly believed that the size of a pneumothorax is an important determinant of the treatment decision, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of pneumothorax volume in chest multi-detector CT (MDCT) images. Moreover, we investigate the impact of accurate pneumothorax volume on the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD by a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899
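
The occurrence frequency map can be sketched as follows: across repeated feature-selection runs (simulated here as plain lists rather than a GA/SVM pipeline), count how often each clinical parameter is selected. The run outcomes and parameter names below are hypothetical:

```python
from collections import Counter

def occurrence_frequency(selected_runs, all_params):
    """Fraction of feature-selection runs in which each parameter was chosen
    (1.00 = selected in every run, as reported for pneumothorax volume)."""
    counts = Counter(p for run in selected_runs for p in set(run))
    return {p: counts[p] / len(selected_runs) for p in all_params}

# Hypothetical selection outcomes from three repeated runs:
runs = [["volume", "age"], ["volume", "rib_fractures"], ["volume"]]
freq = occurrence_frequency(runs, ["volume", "age", "rib_fractures", "sex"])
```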

  16. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets, as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
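
As a toy illustration of the assimilation update such methods build on, here is a stochastic ensemble Kalman filter analysis step for a directly observed scalar state. This is a textbook sketch, not the abstract's methods, which target partially observed, non-Gaussian settings well beyond this:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_var, seed=0):
    """One stochastic EnKF analysis step for a directly observed scalar state:
    shift each member toward a perturbed copy of the observation, weighted by
    the Kalman gain estimated from the ensemble spread."""
    rng = random.Random(seed)
    var = statistics.variance(ensemble)
    gain = var / (var + obs_var)
    return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

r = random.Random(1)
prior = [r.gauss(5.0, 2.0) for _ in range(200)]        # biased prior ensemble
posterior = enkf_update(prior, obs=0.0, obs_var=0.01)  # accurate observation at 0
```

With an accurate observation, the posterior ensemble collapses toward the observed value and its spread shrinks well below the prior spread.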

  17. Preparing suitable climate scenario data to assess impacts on local food safety

    NARCIS (Netherlands)

    Liu, C.; Hofstra, N.; Leemans, R.

    2015-01-01

    Quantification of climate change impacts on food safety requires food safety assessment with different past and future climate scenario data to compare current and future conditions. This study presents a tool to prepare climate and climate change data for local food safety scenario analysis and

  18. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    Science.gov (United States)

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the down-stream purification process are required for product market registration worldwide. EM, with a detection limit of 1x10(6) particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'-->3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell culture.
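
The copy-to-particle arithmetic described above is straightforward to sketch: copy number is read off a standard curve relating Ct to log10(copies), and every two pRNA copies correspond to one particle. The curve parameters below are hypothetical, not the assay's calibration:

```python
def copies_from_ct(ct, intercept, slope):
    """Copy number from a qPCR standard curve of the form
    Ct = intercept + slope * log10(copies)."""
    return 10.0 ** ((ct - intercept) / slope)

def particles_per_ml(prna_copies_per_ml):
    """Each retrovirus-like particle packages two copies of genomic pRNA,
    so the particle count is half the measured pRNA copy number."""
    return prna_copies_per_ml / 2.0

# Hypothetical curve: intercept 40 cycles, slope -3.32 cycles per decade
copies = copies_from_ct(30.0, 40.0, -3.32)
```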

  19. 2D histomorphometric quantification from 3D computerized tomography

    International Nuclear Information System (INIS)

    Lima, Inaya; Oliveira, Luis Fernando de; Lopes, Ricardo T.; Jesus, Edgar Francisco O. de; Alves, Jose Marcos

    2002-01-01

    In the present article, preliminary results are presented showing the application of the three-dimensional computerized microtomography technique (3D-μCT) to bone tissue characterization, through histomorphometric quantification based on stereologic concepts. Two samples of human bone were prepared and submitted to the tomographic system, a radiographic system with a microfocus X-ray tube. Through the three steps of acquisition, reconstruction and quantification, results were obtained that are consistent with literature data. The next step is to compare these results with those obtained by the conventional method, i.e., conventional histomorphometry. (author)
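
The simplest stereological quantity obtained from such a segmented μCT volume is the bone volume fraction (BV/TV), sketched here on a toy binary volume stored as nested lists (a real pipeline would operate on an array from the reconstruction step):

```python
def bone_volume_fraction(volume):
    """BV/TV: fraction of voxels segmented as bone in a binary 3-D volume
    given as nested lists [plane][row][voxel], with 1 = bone, 0 = background."""
    bone = total = 0
    for plane in volume:
        for row in plane:
            for voxel in row:
                total += 1
                if voxel:
                    bone += 1
    return bone / total

# Tiny 2x2x2 segmented volume with 3 bone voxels out of 8:
toy = [[[1, 0], [0, 0]],
       [[1, 1], [0, 0]]]
bv_tv = bone_volume_fraction(toy)
```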

  20. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, which provide potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
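
A standard curve underlies most real-time PCR quantification of this kind: Ct is fit against log10 of input quantity, and the slope yields the amplification efficiency E = 10^(-1/slope) - 1 (E = 1 means perfect doubling). A minimal least-squares sketch on synthetic, perfectly doubling data:

```python
def fit_standard_curve(log10_qty, ct):
    """Least-squares fit of Ct against log10(quantity); returns slope,
    intercept and amplification efficiency E = 10**(-1/slope) - 1."""
    n = len(ct)
    mx = sum(log10_qty) / n
    my = sum(ct) / n
    sxx = sum((x - mx) ** 2 for x in log10_qty)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_qty, ct))
    slope = sxy / sxx
    intercept = my - slope * mx
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Synthetic dilution series with perfect doubling (slope ~ -3.32 cycles/decade):
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
cts = [38.0 - 3.321928 * x for x in xs]
slope, intercept, eff = fit_standard_curve(xs, cts)
```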

  1. Quantification of rice bran oil in oil blends

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, R.; Sharma, H. K.; Sengar, G.

    2012-11-01

    Blends consisting of physically refined rice bran oil (PRBO): sunflower oil (SnF) and PRBO: safflower oil (SAF) in different proportions were analyzed for various physicochemical parameters. The quantification of pure rice bran oil in the blended oils was carried out using different methods including gas chromatography, HPLC, ultrasonic velocity and methods based on physico-chemical parameters. The physicochemical parameters such as ultrasonic velocity, relative association and acoustic impedance at 2 MHz, iodine value, palmitic acid content and oryzanol content reflected significant changes with increased proportions of PRBO in the blended oils. These parameters were selected as dependent parameters and the % PRBO proportion as the independent parameter. The study revealed that regression equations based on the oryzanol content, palmitic acid composition, ultrasonic velocity, relative association, acoustic impedance, and iodine value can be used for the quantification of rice bran oil in blended oils. Rice bran oil can easily be quantified in the blended oils based on the oryzanol content by HPLC even at a 1% level. The palmitic acid content in blended oils can also be used as an indicator to quantify rice bran oil at or above the 20% level in blended oils, whereas the methods based on ultrasonic velocity, acoustic impedance and relative association showed initial promise in the quantification of rice bran oil. (Author) 23 refs.
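
In the idealized case where oryzanol originates only from the rice bran oil fraction and blends mix linearly, the oryzanol-based regression reduces to a simple proportion. The concentrations below are hypothetical illustrations, not the paper's fitted regression:

```python
def percent_prbo(oryzanol_blend, oryzanol_pure_prbo):
    """Estimate % rice bran oil in a blend from oryzanol content, assuming
    oryzanol comes only from the PRBO fraction and mixes linearly.
    Both arguments share the same units (e.g., % w/w)."""
    if oryzanol_pure_prbo <= 0:
        raise ValueError("pure-PRBO oryzanol content must be positive")
    return 100.0 * oryzanol_blend / oryzanol_pure_prbo

# Hypothetical: blend measures 0.3% oryzanol, pure PRBO measures 1.5%
share = percent_prbo(0.3, 1.5)
```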

  2. HPLC for simultaneous quantification of total ceramide, glucosylceramide, and ceramide trihexoside concentrations in plasma

    NARCIS (Netherlands)

    Groener, Johanna E. M.; Poorthuis, Ben J. H. M.; Kuiper, Sijmen; Helmond, Mariette T. J.; Hollak, Carla E. M.; Aerts, Johannes M. F. G.

    2007-01-01

    BACKGROUND: Simple, reproducible assays are needed for the quantification of sphingolipids, ceramide (Cer), and sphingoid bases. We developed an HPLC method for simultaneous quantification of total plasma concentrations of Cer, glucosylceramide (GlcCer), and ceramide trihexoside (CTH). METHODS:

  3. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting

  4. A Comparative Study on the Impact of Global Warming of Applying Low Carbon Factor Concrete Products

    OpenAIRE

    Su-Hyun Cho; Chang-U Chae

    2015-01-01

    Environmental impact assessment techniques have been developed as a result of the worldwide efforts to reduce the environmental impact of global warming. By using quantification methods in the construction industry, it is now possible to manage greenhouse gases and to systematically evaluate the impact on the environment over the entire construction process. In particular, the proportion of greenhouse gas emissions at the production stage of construction material occu...

  5. Quantification of trace metals in water using complexation and filter concentration.

    Science.gov (United States)

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on a paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already-complexed mixture, complexation on the filter, and dipping a dye-covered filter in the solution. The best quantification has been based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The studied complex formations (Ni ions with TAN and Cd ions with PAN) involve production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated and optimum timing could be found. Kinetic optimization in regard to some interferences has also been suggested.

  6. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    Science.gov (United States)

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  7. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  8. Data-driven Demand Response Characterization and Quantification

    DEFF Research Database (Denmark)

    Le Ray, Guillaume; Pinson, Pierre; Larsen, Emil Mahler

    2017-01-01

    Analysis of load behavior in demand response (DR) schemes is important to evaluate the performance of participants. Very few real-world experiments have been carried out and quantification and characterization of the response is a difficult task. Nevertheless it will be a necessary tool for portf...

  9. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
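
The ensemble error measure advocated above can be sketched minimally: run independent replicas of the same calculation, then report the mean free energy and its standard error across replicas. The replica values below are placeholders, not simulation output:

```python
import statistics

def ensemble_estimate(delta_g_replicas):
    """Free energy estimate and its uncertainty from an ensemble of
    independent simulation replicas: sample mean and standard error."""
    n = len(delta_g_replicas)
    if n < 2:
        raise ValueError("need at least two replicas for an error estimate")
    mean = statistics.fmean(delta_g_replicas)
    sem = statistics.stdev(delta_g_replicas) / n ** 0.5
    return mean, sem

# Placeholder replica results in kcal/mol:
dg_mean, dg_sem = ensemble_estimate([1.0, 2.0, 3.0])
```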

  10. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with the assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel are also discussed

  11. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is done based on the comparison of seed images with geometric figures. Quantification of seed shape is a useful tool in plant description for phenotypic characterization and taxonomic analysis. J index gives the percent of similarity of the image of a seed with a geometric figure and it is useful in taxonomy for the study of relationships between plant groups. Geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios and the outline of the Fibonacci spiral. The images of seeds have been compared with these figures and values of J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures like the ovoid, ellipse or the Fibonacci spiral, may be a feature in the basal clades of taxonomic groups.
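
A J-index-style comparison can be sketched as an area overlap between the seed silhouette and the geometric model, both rasterized to binary masks. Note this Jaccard-style normalisation (shared area over combined area) is an illustrative assumption; the published J index may be defined differently:

```python
def j_index(seed_mask, figure_mask):
    """Percent similarity between a seed silhouette and a geometric model,
    sketched as shared area over combined area of two binary masks of
    equal shape (1 = inside the shape, 0 = background)."""
    inter = union = 0
    for seed_row, fig_row in zip(seed_mask, figure_mask):
        for s, f in zip(seed_row, fig_row):
            if s and f:
                inter += 1
            if s or f:
                union += 1
    return 100.0 * inter / union

# Toy 3x4 masks: a seed silhouette against a slightly smaller ellipse model
seed = [[0, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 1, 1, 0]]
ellipse = [[0, 1, 1, 0],
           [1, 1, 1, 1],
           [0, 1, 0, 0]]
similarity = j_index(seed, ellipse)
```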

  12. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    Science.gov (United States)

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.
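
For contrast with the ODE-based efficiency estimation described above, the standard efficiency-adjusted relative quantification (a Pfaffl-style ratio, not the authors' method) combines per-gene amplification efficiencies with Ct differences:

```python
def pfaffl_ratio(e_goi, e_ref, dct_goi, dct_ref):
    """Efficiency-adjusted expression ratio of a gene of interest relative
    to a reference gene: ratio = (1+E_goi)**dCt_goi / (1+E_ref)**dCt_ref,
    where dCt values are (control Ct - treated Ct) and E is the per-cycle
    amplification efficiency (1.0 = perfect doubling)."""
    return ((1.0 + e_goi) ** dct_goi) / ((1.0 + e_ref) ** dct_ref)

# Hypothetical: GOI amplifies 2 cycles earlier in treated samples,
# reference gene is unchanged, both at perfect efficiency.
ratio = pfaffl_ratio(1.0, 1.0, 2.0, 0.0)
```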

  13. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence Quantification Analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor, which exists when the control parameter is zero
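
The most basic RQA measure, the recurrence rate, can be sketched directly on a scalar time series (without the delay embedding a full analysis would use):

```python
def recurrence_rate(series, eps):
    """Fraction of point pairs of the series that are closer than eps:
    the density of recurrence points in the recurrence plot."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(n)
               if abs(series[i] - series[j]) <= eps)
    return hits / (n * n)
```

A periodic or constant signal yields a high recurrence rate, while a signal wandering over a wide range yields a low one; transitions between dynamical regimes show up as changes in such measures along the series.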

  14. The ratio of right ventricular volume to left ventricular volume reflects the impact of pulmonary regurgitation independently of the method of pulmonary regurgitation quantification

    Energy Technology Data Exchange (ETDEWEB)

    Śpiewak, Mateusz, E-mail: mspiewak@ikard.pl [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Małek, Łukasz A., E-mail: lmalek@ikard.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Interventional Cardiology and Angiology, Institute of Cardiology, Warsaw (Poland); Petryka, Joanna, E-mail: joannapetryka@hotmail.com [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Mazurkiewicz, Łukasz, E-mail: lmazurkiewicz@ikard.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Cardiomyopathy, Institute of Cardiology, Warsaw (Poland); Miłosz, Barbara, E-mail: barbara-milosz@o2.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Radiology, Institute of Cardiology, Warsaw (Poland); Biernacka, Elżbieta K., E-mail: kbiernacka@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Kowalski, Mirosław, E-mail: mkowalski@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Hoffman, Piotr, E-mail: phoffman@ikard.pl [Department of Congenital Heart Diseases, Institute of Cardiology, Warsaw (Poland); Demkow, Marcin, E-mail: mdemkow@ikard.pl [Department of Coronary Artery Disease and Structural Heart Diseases, Institute of Cardiology, Warsaw (Poland); Miśko, Jolanta, E-mail: jmisko@wp.pl [Cardiac Magnetic Resonance Unit, Institute of Cardiology, Warsaw (Poland); Department of Radiology, Institute of Cardiology, Warsaw (Poland); Rużyłło, Witold, E-mail: wruzyllo@ikard.pl [Institute of Cardiology, Warsaw (Poland)

    2012-10-15

    Background: Previous studies have advocated quantifying pulmonary regurgitation (PR) by using PR volume (PRV) instead of the commonly used PR fraction (PRF). However, physicians are not familiar with the use of PRV in clinical practice. The ratio of right ventricle (RV) volume to left ventricle volume (RV/LV) may better reflect the impact of PR on the heart than RV end-diastolic volume (RVEDV) alone. We aimed to compare the impact of PRV and PRF on RV size expressed as either the RV/LV ratio or RVEDV (mL/m²). Methods: Consecutive patients with repaired tetralogy of Fallot were included (n = 53). PRV, PRF and ventricular volumes were measured with the use of cardiac magnetic resonance. Results: RVEDV was more closely correlated with PRV than with PRF (r = 0.686, p < 0.0001, and r = 0.430, p = 0.0014, respectively). On the other hand, both PRV and PRF showed a good correlation with the RV/LV ratio (r = 0.691, p < 0.0001, and r = 0.685, p < 0.0001, respectively). Receiver operating characteristic analysis showed that both measures of PR had similar ability to predict severe RV dilatation when the RV/LV ratio-based criterion (RV/LV ratio > 2.0) was used [area under the curve (AUC) for PRV = 0.770 vs AUC for PRF = 0.777, p = 0.86]. Conversely, with the RVEDV-based criterion (>170 mL/m²), PRV proved to be superior to PRF (AUC for PRV = 0.770 vs AUC for PRF = 0.656, p = 0.0028). Conclusions: PRV and PRF have similar significance as measures of PR when the RV/LV ratio is used instead of RVEDV. The RV/LV ratio is a universal marker of RV dilatation independent of the method of PR quantification applied (PRF vs PRV)

  15. Genomic DNA-based absolute quantification of gene expression in Vitis.

    Science.gov (United States)

    Gambetta, Gregory A; McElrone, Andrew J; Matthews, Mark A

    2013-07-01

    Many studies in which gene expression is quantified by polymerase chain reaction represent the expression of a gene of interest (GOI) relative to that of a reference gene (RG). Relative expression is founded on the assumptions that RG expression is stable across samples, treatments, organs, etc., and that reaction efficiencies of the GOI and RG are equal; assumptions which are often faulty. The true variability in RG expression and actual reaction efficiencies are seldom determined experimentally. Here we present a rapid and robust method for absolute quantification of expression in Vitis where varying concentrations of genomic DNA were used to construct GOI standard curves. This methodology was utilized to absolutely quantify and determine the variability of the previously validated RG ubiquitin (VvUbi) across three test studies in three different tissues (roots, leaves and berries). In addition, in each study a GOI was absolutely quantified. Data sets resulting from relative and absolute methods of quantification were compared and the differences were striking. VvUbi expression was significantly different in magnitude between test studies and variable among individual samples. Absolute quantification consistently reduced the coefficients of variation of the GOIs by more than half, often resulting in differences in statistical significance and in some cases even changing the fundamental nature of the result. Utilizing genomic DNA-based absolute quantification is fast and efficient. Through eliminating error introduced by assuming RG stability and equal reaction efficiencies between the RG and GOI this methodology produces less variation, increased accuracy and greater statistical power. © 2012 Scandinavian Plant Physiology Society.

  16. Techniques of biomolecular quantification through AMS detection of radiocarbon

    International Nuclear Information System (INIS)

    Vogel, S.J.; Turteltaub, K.W.; Frantz, C.; Felton, J.S.; Gledhill, B.L.

    1992-01-01

    Accelerator mass spectrometry offers a large gain over scintillation counting in sensitivity for detecting radiocarbon in biomolecular tracing. Application of this sensitivity requires new considerations of procedures to extract or isolate the carbon fraction to be quantified, to inventory all carbon in the sample, to prepare graphite from the sample for use in the spectrometer, and to derive a meaningful quantification from the measured isotope ratio. These procedures need to be accomplished without contaminating the sample with radiocarbon, which may be ubiquitous in laboratories and on equipment previously used for higher-dose scintillation experiments. Disposable equipment, materials and surfaces are used to control this contamination. Quantification of attomole amounts of labeled substances is possible through these techniques

  17. Quantification of transformation products of rocket fuel unsymmetrical dimethylhydrazine in soils using SPME and GC-MS.

    Science.gov (United States)

    Bakaikina, Nadezhda V; Kenessov, Bulat; Ul'yanovskii, Nikolay V; Kosyakov, Dmitry S

    2018-07-01

    Determination of transformation products (TPs) of rocket fuel unsymmetrical dimethylhydrazine (UDMH) in soil is highly important for environmental impact assessment of the launches of heavy space rockets from Kazakhstan, Russia, China and India. The method based on headspace solid-phase microextraction (HS SPME) and gas chromatography-mass spectrometry is advantageous over other known methods due to greater simplicity and cost efficiency. However, accurate quantification of these analytes using HS SPME is limited by the matrix effect. In this research, we proposed using internal standard and standard addition calibrations to achieve a proper balance between accurate quantification of key TPs of UDMH and cost efficiency. 1-Trideuteromethyl-1H-1,2,4-triazole (MTA-d3) was used as the internal standard. Internal standard calibration allowed controlling matrix effects during quantification of 1-methyl-1H-1,2,4-triazole (MTA), N,N-dimethylformamide (DMF), and N-nitrosodimethylamine (NDMA) in soils with humus content < 1%. Using SPME at 60 °C for 15 min by a 65 µm Carboxen/polydimethylsiloxane fiber, recoveries of MTA, DMF and NDMA for sandy and loamy soil samples were 91-117, 85-123 and 64-132%, respectively. To improve the method accuracy and widen the range of analytes, standard addition and its combination with internal standard calibration were tested and compared on real soil samples. The combined calibration approach provided the greatest accuracies for NDMA, DMF, N-methylformamide, formamide, 1H-pyrazole, 3-methyl-1H-pyrazole and 1H-pyrazole. For determination of 1-formyl-2,2-dimethylhydrazine, 3,5-dimethylpyrazole, 2-ethyl-1H-imidazole, 1H-imidazole, 1H-1,2,4-triazole, pyrazines and pyridines, standard addition calibration is more suitable. However, the proposed approach and collected data allow using both approaches simultaneously. Copyright © 2018 Elsevier B.V. All rights reserved.
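The standard-addition calibration discussed above fits a straight line through the instrument signal versus the spiked analyte amount and extrapolates back to zero signal; the native concentration is the magnitude of the x-intercept. A generic least-squares sketch (not the authors' code), assuming a linear detector response:

```python
def standard_addition_conc(added, signal):
    """Estimate the native analyte concentration by standard addition:
    fit signal = a * added + b by least squares, then extrapolate to
    signal = 0; the native concentration is b / a (the |x-intercept|)."""
    n = len(added)
    mx = sum(added) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in added)
    sxy = sum((x - mx) * (y - my) for x, y in zip(added, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope
```

For spikes of 0, 1, 2 and 3 units giving signals 10, 15, 20 and 25, the extrapolation recovers a native concentration of 2 units.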

  18. A non-invasive modality: the US virtual touch tissue quantification (VTTQ) for evaluation of breast cancer.

    Science.gov (United States)

    Tamaki, Kentaro; Tamaki, Nobumitsu; Kamada, Yoshihiko; Uehara, Kano; Miyashita, Minoru; Ishida, Takanori; Sasano, Hironobu

    2013-09-01

    We evaluated the biologic features of breast tissues using a newly developed non-invasive diagnostic system, named virtual touch tissue quantification. A total of 180 patients including 115 invasive ductal carcinoma, 30 ductal carcinoma in situ, 4 mucinous carcinoma, 7 invasive lobular carcinoma, 8 fibroadenoma, 12 fibrocystic change and 4 intraductal papilloma were studied at Nahanishi Clinic, Okinawa. We first compared the results of virtual touch tissue quantification according to each histologic subtype and determined the optimal cutoff values for virtual touch tissue quantification to distinguish benign from malignant tissues, using the receiver operating characteristic method. In addition, we also examined the correlation between virtual touch tissue quantification velocities and Ki-67, estrogen receptor, progesterone receptor or human epidermal growth factor receptor 2 in cases of invasive ductal carcinoma using linear regression analyses and Student's t-test. Virtual touch tissue quantification velocities were statistically higher in malignant cases than in benign cases (P breast cancer pathology in a non-invasive fashion.

  19. AtRTD2: A Reference Transcript Dataset for accurate quantification of alternative splicing and expression changes in Arabidopsis thaliana RNA-seq data

    KAUST Repository

    Zhang, Runxuan

    2016-05-06

    Background: Alternative splicing is the major post-transcriptional mechanism by which gene expression is regulated and affects a wide range of processes and responses in most eukaryotic organisms. RNA-sequencing (RNA-seq) can generate genome-wide quantification of individual transcript isoforms to identify changes in expression and alternative splicing. RNA-seq is an essential modern tool but its ability to accurately quantify transcript isoforms depends on the diversity, completeness and quality of the transcript information. Results: We have developed a new Reference Transcript Dataset for Arabidopsis (AtRTD2) for RNA-seq analysis containing over 82k non-redundant transcripts, of which 74,194 originate from 27,667 protein-coding genes. A total of 13,524 protein-coding genes have at least one alternatively spliced transcript in AtRTD2 such that about 60% of the 22,453 protein-coding, intron-containing genes in Arabidopsis undergo alternative splicing. More than 600 putative U12 introns were identified in more than 2,000 transcripts. AtRTD2 was generated from transcript assemblies of ca. 8.5 billion pairs of reads from 285 RNA-seq data sets obtained from 129 RNA-seq libraries and merged along with the previous version, AtRTD, and Araport11 transcript assemblies. AtRTD2 increases the diversity of transcripts and through application of stringent filters represents the most extensive and accurate transcript collection for Arabidopsis to date. We have demonstrated a generally good correlation of alternative splicing ratios from RNA-seq data analysed by Salmon and experimental data from high resolution RT-PCR. However, we have observed inaccurate quantification of transcript isoforms for genes with multiple transcripts which have variation in the lengths of their UTRs. This variation is not effectively corrected in RNA-seq analysis programmes and will therefore impact RNA-seq analyses generally. To address this, we have tested different genome
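The UTR-length problem the abstract raises propagates directly into length-normalized abundance units such as TPM, because each transcript's read count is divided by its (effective) length before normalization. A minimal, generic sketch of the TPM computation, not part of the AtRTD2 or Salmon pipeline:

```python
def tpm(counts, eff_lengths):
    """Transcripts-per-million from per-transcript read counts and
    effective transcript lengths. A mis-specified UTR (hence a wrong
    effective length) biases the per-length rate and thus the TPM."""
    rates = [c / l for c, l in zip(counts, eff_lengths)]
    total = sum(rates)
    return [1e6 * r / total for r in rates]
```

Two isoforms with counts 10 and 20 but lengths 100 and 200 come out at equal abundance; shrink one length estimate and its apparent abundance inflates accordingly.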

  20. A multicenter study benchmarks software tools for label-free proteome quantification.

    Science.gov (United States)

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
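Benchmarks of this kind rest on simple precision and accuracy metrics computed over proteins of known mixing ratio. A schematic sketch of two such metrics with hypothetical inputs (LFQbench itself computes a richer set):

```python
import math
import statistics

def cv_percent(values):
    """Precision metric: coefficient of variation (%) across
    replicate quantifications of one protein."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def log2_ratio_error(measured_a, measured_b, expected_ratio):
    """Accuracy metric: deviation of the measured log2 A:B ratio from
    the expected log2 ratio of the hybrid-proteome mixture."""
    return math.log2(measured_a / measured_b) - math.log2(expected_ratio)
```

A protein quantified at 9, 10 and 11 across replicates has a CV of 10%; a measured 4:1 ratio against an expected 4:1 mix has zero log-ratio error.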

  1. Quantification of miRNAs by a simple and specific qPCR method

    DEFF Research Database (Denmark)

    Cirera Salicio, Susanna; Busk, Peter K.

    2014-01-01

    MicroRNAs (miRNAs) are powerful regulators of gene expression at posttranscriptional level and play important roles in many biological processes and in disease. The rapid pace of the emerging field of miRNAs has opened new avenues for development of techniques to quantitatively determine mi...... in miRNA quantification. Furthermore, the method is easy to perform with common laboratory reagents, which allows miRNA quantification at low cost....

  2. Quantification, improvement, and harmonization of small lesion detection with state-of-the-art PET

    Energy Technology Data Exchange (ETDEWEB)

    Vos, Charlotte S. van der [Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); University of Twente, MIRA Institute for Biomedical Technology and Technical Medicine, Enschede (Netherlands); Koopman, Danielle [University of Twente, MIRA Institute for Biomedical Technology and Technical Medicine, Enschede (Netherlands); Isala Hospital, Department of Nuclear Medicine, Zwolle (Netherlands); Rijnsdorp, Sjoerd; Arends, Albert J. [Catharina Hospital, Department of Medical Physics, Eindhoven (Netherlands); Boellaard, Ronald [University of Groningen, University Medical Centre Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); VU University Medical Center, Department of Radiology and Nuclear Medicine, Amsterdam (Netherlands); Dalen, Jorn A. van [Isala Hospital, Department of Nuclear Medicine, Zwolle (Netherlands); Isala, Department of Medical Physics, Zwolle (Netherlands); Lubberink, Mark [Uppsala University, Department of Surgical Sciences, Uppsala (Sweden); Uppsala University Hospital, Department of Medical Physics, Uppsala (Sweden); Willemsen, Antoon T.M. [University of Groningen, University Medical Centre Groningen, Department of Nuclear Medicine and Molecular Imaging, Groningen (Netherlands); Visser, Eric P. [Radboud University Medical Centre, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands)

    2017-08-15

    In recent years, there have been multiple advances in positron emission tomography/computed tomography (PET/CT) that improve cancer imaging. The present generation of PET/CT scanners introduces new hardware, software, and acquisition methods. This review describes these new developments, which include time-of-flight (TOF), point-spread-function (PSF), maximum-a-posteriori (MAP) based reconstruction, smaller voxels, respiratory gating, metal artefact reduction, and administration of quadratic weight-dependent ¹⁸F-fluorodeoxyglucose (FDG) activity. Also, hardware developments such as continuous bed motion (CBM), (digital) solid-state photodetectors and combined PET and magnetic resonance (MR) systems are explained. These novel techniques have a significant impact on cancer imaging, as they result in better image quality, improved small lesion detectability, and more accurate quantification of radiopharmaceutical uptake. This influences cancer diagnosis and staging, as well as therapy response monitoring and radiotherapy planning. Finally, the possible impact of these developments on the European Association of Nuclear Medicine (EANM) guidelines and EANM Research Ltd. (EARL) accreditation for FDG-PET/CT tumor imaging is discussed. (orig.)

  3. Quantification, improvement, and harmonization of small lesion detection with state-of-the-art PET

    International Nuclear Information System (INIS)

    Vos, Charlotte S. van der; Koopman, Danielle; Rijnsdorp, Sjoerd; Arends, Albert J.; Boellaard, Ronald; Dalen, Jorn A. van; Lubberink, Mark; Willemsen, Antoon T.M.; Visser, Eric P.

    2017-01-01

    In recent years, there have been multiple advances in positron emission tomography/computed tomography (PET/CT) that improve cancer imaging. The present generation of PET/CT scanners introduces new hardware, software, and acquisition methods. This review describes these new developments, which include time-of-flight (TOF), point-spread-function (PSF), maximum-a-posteriori (MAP) based reconstruction, smaller voxels, respiratory gating, metal artefact reduction, and administration of quadratic weight-dependent ¹⁸F-fluorodeoxyglucose (FDG) activity. Also, hardware developments such as continuous bed motion (CBM), (digital) solid-state photodetectors and combined PET and magnetic resonance (MR) systems are explained. These novel techniques have a significant impact on cancer imaging, as they result in better image quality, improved small lesion detectability, and more accurate quantification of radiopharmaceutical uptake. This influences cancer diagnosis and staging, as well as therapy response monitoring and radiotherapy planning. Finally, the possible impact of these developments on the European Association of Nuclear Medicine (EANM) guidelines and EANM Research Ltd. (EARL) accreditation for FDG-PET/CT tumor imaging is discussed. (orig.)

  4. Pore REconstruction and Segmentation (PORES) method for improved porosity quantification of nanoporous materials

    Energy Technology Data Exchange (ETDEWEB)

    Van Eyndhoven, G., E-mail: geert.vaneyndhoven@uantwerpen.be [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Kurttepeli, M. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Van Oers, C.J.; Cool, P. [Laboratory of Adsorption and Catalysis, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Batenburg, K.J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, NL-1090 GB Amsterdam (Netherlands); Mathematical Institute, Universiteit Leiden, Niels Bohrweg 1, NL-2333 CA Leiden (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, B-2610 Wilrijk (Belgium)

    2015-01-15

    Electron tomography is currently a versatile tool to investigate the connection between the structure and properties of nanomaterials. However, a quantitative interpretation of electron tomography results is still far from straightforward. In particular, accurate quantification of pore space is hampered by artifacts introduced in all steps of the processing chain, i.e., acquisition, reconstruction, segmentation and quantification. Furthermore, most common approaches require subjective manual user input. In this paper, the PORES algorithm “POre REconstruction and Segmentation” is introduced; it is a tailor-made, integral approach for the reconstruction, segmentation, and quantification of porous nanomaterials. The PORES processing chain starts by calculating a reconstruction with a nanoporous-specific reconstruction algorithm: the Simultaneous Update of Pore Pixels by iterative REconstruction and Simple Segmentation algorithm (SUPPRESS). It assigns the interior region to the pores during reconstruction, while reconstructing the remaining region by reducing the error with respect to the acquired electron microscopy data. The SUPPRESS reconstruction can be directly plugged into the remaining processing chain of the PORES algorithm, resulting in accurate individual pore quantification and full sample pore statistics. The proposed approach was extensively validated on both simulated and experimental data, indicating its ability to generate accurate statistics of nanoporous materials. - Highlights: • An electron tomography reconstruction/segmentation method for nanoporous materials. • The method exploits the porous nature of the scanned material. • Validated extensively on both simulation and real data experiments. • Results in increased image resolution and improved porosity quantification.
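Once a reconstruction has been segmented into pore and material voxels, the headline porosity statistic reduces to a voxel fraction. A schematic sketch under the assumption of a binary labeling (1 = pore, 0 = material); this is an illustration of the final quantification step, not the PORES implementation:

```python
def porosity_fraction(segmented):
    """Porosity of a segmented 3-D reconstruction: the fraction of
    voxels labeled as pore (1) within the analyzed volume."""
    flat = [v for plane in segmented for row in plane for v in row]
    return sum(flat) / len(flat)
```

A 2x2x2 toy volume with two pore voxels yields a porosity of 0.25; per-pore statistics would additionally require connected-component labeling of the pore voxels.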

  5. Improved quantification of farnesene during microbial production from Saccharomyces cerevisiae in two-liquid-phase fermentations

    DEFF Research Database (Denmark)

    Tippmann, Stefan; Nielsen, Jens; Khoomrung, Sakda

    2016-01-01

    Organic solvents are widely used in microbial fermentations to reduce gas stripping effects and capture hydrophobic or toxic compounds. Reliable quantification of biochemical products in these overlays is highly challenging and practically difficult. Here, we present a significant improvement...... carryover could be minimized. Direct quantification of farnesene in dodecane was achieved by GC-FID whereas GC-MS demonstrated to be an excellent technique for identification of known and unknown metabolites. The GC-FID is a suitable technique for direct quantification of farnesene in complex matrices...

  6. Prospective comparison of liver stiffness measurements between two point shear wave elastography methods: Virtual touch quantification and elastography point quantification

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun Suk; Lee, Jeong Min; Yoon, Jeong Hee; Lee, Dong Ho; Chang, Won; Han, Joon Koo [Seoul National University Hospital, Seoul (Korea, Republic of)

    2016-09-15

    To prospectively compare technical success rate and reliable measurements of virtual touch quantification (VTQ) elastography and elastography point quantification (ElastPQ), and to correlate liver stiffness (LS) measurements obtained by the two elastography techniques. Our study included 85 patients, 80 of whom were previously diagnosed with chronic liver disease. The technical success rate and reliable measurements of the two kinds of point shear wave elastography (pSWE) techniques were compared by χ² analysis. LS values measured using the two techniques were compared and correlated via Wilcoxon signed-rank test, Spearman correlation coefficient, and 95% Bland-Altman limit of agreement. The intraobserver reproducibility of ElastPQ was determined by 95% Bland-Altman limit of agreement and intraclass correlation coefficient (ICC). The two pSWE techniques showed similar technical success rate (98.8% for VTQ vs. 95.3% for ElastPQ, p = 0.823) and reliable LS measurements (95.3% for VTQ vs. 90.6% for ElastPQ, p = 0.509). The mean LS measurements obtained by VTQ (1.71 ± 0.47 m/s) and ElastPQ (1.66 ± 0.41 m/s) were not significantly different (p = 0.209). The LS measurements obtained by the two techniques showed strong correlation (r = 0.820); in addition, the 95% limit of agreement of the two methods was 27.5% of the mean. Finally, the ICC of repeat ElastPQ measurements was 0.991. Virtual touch quantification and ElastPQ showed similar technical success rate and reliable measurements, with strongly correlated LS measurements. However, the two methods are not interchangeable due to the large limit of agreement.
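The 95% Bland-Altman limits of agreement used in this comparison are the mean of the paired differences plus or minus 1.96 standard deviations of those differences. A generic sketch with toy paired measurements (not the study's data):

```python
import statistics

def bland_altman_loa(a, b):
    """95% Bland-Altman limits of agreement between paired measurements
    from two methods: bias ± 1.96 * SD of the paired differences."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias - 1.96 * sd, bias + 1.96 * sd
```

A wide interval relative to the mean measurement, as in the 27.5% reported above, is what argues against treating two methods as interchangeable even when their means agree.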

  7. Network-Based Isoform Quantification with RNA-Seq Data for Cancer Transcriptome Analysis.

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-12-01

    High-throughput mRNA sequencing (RNA-Seq) is widely used for transcript quantification of gene isoforms. Since RNA-Seq data alone is often not sufficient to accurately identify the read origins from the isoforms for quantification, we propose to explore protein domain-domain interactions as prior knowledge for integrative analysis with RNA-Seq data. We introduce a Network-based method for RNA-Seq-based Transcript Quantification (Net-RSTQ) to integrate protein domain-domain interaction network with short read alignments for transcript abundance estimation. Based on our observation that the abundances of the neighboring isoforms by domain-domain interactions in the network are positively correlated, Net-RSTQ models the expression of the neighboring transcripts as Dirichlet priors on the likelihood of the observed read alignments against the transcripts in one gene. The transcript abundances of all the genes are then jointly estimated with alternating optimization of multiple EM problems. In simulations, Net-RSTQ effectively improved isoform transcript quantifications when isoform co-expressions correlate with their interactions. qRT-PCR results on 25 multi-isoform genes in a stem cell line, an ovarian cancer cell line, and a breast cancer cell line also showed that Net-RSTQ estimated more consistent isoform proportions with RNA-Seq data. In the experiments on the RNA-Seq data in The Cancer Genome Atlas (TCGA), the transcript abundances estimated by Net-RSTQ are more informative for patient sample classification of ovarian cancer, breast cancer and lung cancer. All experimental results collectively support that Net-RSTQ is a promising approach for isoform quantification. The Net-RSTQ toolbox is available at http://compbio.cs.umn.edu/Net-RSTQ/.

  8. Experimental quantification of contact forces with impact, friction and uncertainty analysis

    DEFF Research Database (Denmark)

    Lahriri, Said; Santos, Ilmar

    2013-01-01

    During rotor-stator contact dry friction plays a significant role in terms of reversing the rotor precession. The frictional force causes an increase in the rotor's tangential velocity in the direction opposite to that of the angular velocity. This effect is crucial for defining ranges of dry whip and whirl motions in rotor-stator contact investigations. The dry friction coefficient is therefore estimated using two different experimental setups: (a) standard pin-on-disk tests and (b) a fully instrumented rotor impact test rig. The findings in both setups indicate that the dry friction coefficient for the brass-aluminum configuration significantly varies in a range of 0.16-0.83. The rotor enters a full annular contact mode shortly after two impacts with a contact duration of approximately 0.004 s at each location. It is experimentally demonstrated that the friction force is not present when the rotor enters a full annular...

  9. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    International Nuclear Information System (INIS)

    Marques da Silva, A; Fischer, A

    2015-01-01

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV), performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy’s effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that achieved the lowest RMSE were: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17 mm). Conclusion: The harmonization strategy of the SUV quantification implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing the inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of the
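Selecting the "most appropriate reconstruction parameters" by lowest RMSE against harmonizing recovery-coefficient targets can be sketched as follows. Candidate names, sphere RC values and targets below are hypothetical illustrations, not the study's measurements:

```python
import math

def rmse_vs_target(rc_measured, rc_target):
    """Root-mean-square error of measured sphere recovery coefficients
    against target values (e.g. midpoints of harmonizing RC bands)."""
    n = len(rc_measured)
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(rc_measured, rc_target)) / n)

def pick_harmonizing_reconstruction(candidates, rc_target):
    """candidates: {reconstruction name: [RC per phantom sphere]};
    returns the candidate with the lowest RMSE versus the targets."""
    return min(candidates, key=lambda k: rmse_vs_target(candidates[k], rc_target))
```

Given candidates {"A": [1.0, 0.9], "B": [0.8, 0.7]} and targets [1.0, 0.9], the selection returns "A".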

  10. Absolute quantification by droplet digital PCR versus analog real-time PCR

    Science.gov (United States)

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
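The absolute quantification in ddPCR follows from Poisson statistics on the fraction of positive droplets: with positive fraction p, the mean number of target copies per droplet is lambda = -ln(1 - p), independent of any standard curve. A generic sketch; the 0.85 nl droplet volume is an assumed nominal value, not taken from this paper:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Absolute target concentration (copies/µl) from droplet counts.
    Poisson correction accounts for droplets holding >1 copy."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nl -> µl
```

With half of 20,000 droplets positive, lambda is ln 2 ≈ 0.69 copies per droplet, about 815 copies/µl at the assumed droplet volume.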

  11. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  12. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  13. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    International Nuclear Information System (INIS)

    Ibanez-Llano, Cristina; Rauzy, Antoine; Melendez, Enrique; Nieto, Francisco

    2010-01-01

    Over the last two decades binary decision diagrams have been applied successfully to improve Boolean reliability models. Conversely to the classical approach based on the computation of the MCS, the BDD approach involves no approximation in the quantification of the model and is able to handle correctly negative logic. However, when models are sufficiently large and complex, as for example the ones coming from the PSA studies of the nuclear industry, it begins to be unfeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of the BDD technology to the assessment of such models in practice. This paper proposes a reduction process based on using information provided by the set of the most relevant minimal cutsets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. This reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.
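The reduction step, ranking minimal cutsets by relevance and keeping only the most important ones before building the BDD, can be sketched with a rare-event probability ordering. This is a simplification for illustration (the paper's actual relevance criteria may differ), with hypothetical event probabilities:

```python
def top_cutsets(cutsets, basic_event_prob, k):
    """Rank minimal cutsets by their rare-event probability (product of
    basic-event probabilities) and keep the k most relevant ones to
    form a reduced model for BDD conversion."""
    def prob(cs):
        p = 1.0
        for event in cs:
            p *= basic_event_prob[event]
        return p
    return sorted(cutsets, key=prob, reverse=True)[:k]
```

Varying k trades off the size of the reduced model (hence BDD feasibility) against the quantification error introduced by the discarded cutsets.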

  14. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Llano, Cristina, E-mail: cristina.ibanez@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain); Rauzy, Antoine, E-mail: Antoine.RAUZY@3ds.co [Dassault Systemes, 10 rue Marcel Dassault CS 40501, 78946 Velizy Villacoublay, Cedex (France); Melendez, Enrique, E-mail: ema@csn.e [Consejo de Seguridad Nuclear (CSN), C/Justo Dorado 11, 28040 Madrid (Spain); Nieto, Francisco, E-mail: nieto@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain)

    2010-12-15

    Over the last two decades binary decision diagrams have been applied successfully to improve Boolean reliability models. Conversely to the classical approach based on the computation of the MCS, the BDD approach involves no approximation in the quantification of the model and is able to handle correctly negative logic. However, when models are sufficiently large and complex, as for example the ones coming from the PSA studies of the nuclear industry, it begins to be unfeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, simplification or reduction of the full model has to be considered in some way to adapt the application of the BDD technology to the assessment of such models in practice. This paper proposes a reduction process based on using information provided by the set of the most relevant minimal cutsets of the model in order to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. This reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction enables the BDD computation while maintaining accuracy.

  15. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    Science.gov (United States)

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variabilities because of upstream sample handling or incomplete trypsin digestion still need to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
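At the level of a single SRM transition, isotope-dilution quantification with a spiked full-length labeled standard reduces to a light/heavy peak-area ratio times the spiked amount. A schematic sketch of that arithmetic (not the PSAQ software; the function name and units are ours):

```python
def isotope_dilution_amount(area_light, area_heavy, standard_spiked_fmol):
    """Endogenous analyte amount from one SRM transition: the
    light (endogenous) to heavy (labeled standard) peak-area ratio
    scaled by the known spiked amount of the standard."""
    return (area_light / area_heavy) * standard_spiked_fmol
```

Because the full-length labeled standard co-digests with the sample, losses during handling and incomplete trypsin digestion affect light and heavy forms equally, which is why the ratio stays accurate.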

  16. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for the imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which vary by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm would provide information about emphysema from CT. Therefore, we propose a new, non-density based measure of the curvature of the diaphragm that would allow for further quantification methods in a robust manner. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of the lung height and diaphragm width to diaphragm height as curvature estimates as well as using the emphysema index as comparison. Pearson correlation coefficients showed a strong trend of several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation to the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
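The reported comparisons rest on the Pearson correlation coefficient between a curvature estimate (e.g. the diaphragm width-to-height ratio) and a pulmonary function value such as DLCO%. A self-contained sketch with toy data, not the study's measurements:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5
```

The low (0.27) correlation between the curvature measures and the emphysema index is what supports the claim that the two capture different aspects of the disease.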

  17. MDCT quantification is the dominant parameter in decision-making regarding chest tube drainage for stable patients with traumatic pneumothorax.

    Science.gov (United States)

    Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc

    2012-07-01

    It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces is not routinely performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigate the impact of accurate pneumothorax volume on the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD, by computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme for MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users use the global network to search for information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in the results of full-text search engines. Herein lies the great importance of Search Engine Optimization and Search Engine Marketing, because typical users usually try links only on the first few pages of full-text search engine results for given keywords, and in catalogs they primarily use hierarchically higher-placed links in each category. Key to success is the application of optimization methods which deal with keywords, the structure and quality of content, domain names, individual sites, and the quantity and reliability of backward links. The process is demanding, long-lasting, and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents of which the entire website consists. If web presentation operators want an overview of their documents and of the website globally, it is appropriate to quantify these positions in a specific way, depending on specific keywords. This is the purpose of the quantification of the competitive value of documents, which in turn determines the global competitive value of a website. Quantification of competitive values is performed against a specific full-text search engine; for each full-text search engine the results can be, and often are, different. According to published reports by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The overall procedure for quantifying competitive values is common to all engines; however, the initial step, the analysis of keywords, depends on the choice of full-text search engine.
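The aggregation described above can be sketched as a position-weighted sum over keywords, rolled up from documents to the whole site. The reciprocal-rank weighting below is an illustrative choice, not the paper's exact formula, and the keyword ranks are hypothetical:

```python
def competitive_value(doc_positions):
    """Competitive value of one document: sum of position weights over
    the keywords for which it ranks in a given full-text engine.
    None means the document is not ranked for that keyword."""
    return sum(1.0 / rank for rank in doc_positions.values() if rank is not None)

# Hypothetical ranks of two documents for several keywords
docs = [
    {"keyword a": 3, "keyword b": None},
    {"keyword a": 1, "keyword c": 12},
]
site_value = sum(competitive_value(d) for d in docs)  # global value of the site
```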

  19. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  20. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development, but most often assay protocols are redesigned around the use of SERS as a quantification method, ultimately complicating existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work discusses the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs are commonly present at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  1. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j.palmiere@sheffield.ac.uk

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, with different phases discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, EBSD-based phase quantification methods can use not only morphological characteristics but also parameters derived from the orientation information for discrimination. In this research, a grain-level phase quantification method based on EBSD data was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures produced at relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD based method.
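Grain-level identification criteria of the kind described can be expressed as a small decision rule over the per-grain EBSD parameters. The thresholds below are illustrative placeholders, not the calibrated values from the paper:

```python
def classify_ferrite(gam_deg, aspect_ratio, hagb_frac, grain_size_um):
    """Toy grain-level decision rules in the spirit of the paper:
    gam_deg       - grain-averaged misorientation angle (degrees)
    aspect_ratio  - grain elongation
    hagb_frac     - fraction of high-angle grain boundary
    grain_size_um - equivalent grain diameter (micrometres)
    All thresholds are illustrative assumptions."""
    if gam_deg < 0.6 and hagb_frac > 0.5 and aspect_ratio < 2.0:
        return "polygonal/quasi-polygonal ferrite"
    if aspect_ratio >= 2.0 and gam_deg >= 0.6:
        return "bainitic ferrite"
    if grain_size_um < 10.0:
        return "acicular ferrite"
    return "unclassified"

label = classify_ferrite(gam_deg=0.4, aspect_ratio=1.3,
                         hagb_frac=0.72, grain_size_um=18.0)
# label = "polygonal/quasi-polygonal ferrite"
```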

  2. GC-MS quantification of suspected volatile allergens in fragrances. 2. Data treatment strategies and method performances.

    Science.gov (United States)

    Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias

    2007-01-10

    The performance of the GC-MS determination of suspected allergens in fragrance concentrates has been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion in common, or once every 144 analyses for three ions in common.

  3. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

    Full Text Available Abstract Background Quantification of different types of cells is often needed for analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results Our presented automatic approach for hepatocyte (HC quantification is suitable for the analysis of an entire digitized histological section given in form of a series of images. It is the main part of an automatic hepatocyte quantification tool that allows for the computation of the ratio between the number of proliferating HC-nuclei and the total number of all HC-nuclei for a series of images in one processing run. The processing pipeline allows us to obtain desired and valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask which is considered to be the ground truth, we achieve results with sensitivity above 90% and false positive fraction below 15%. Conclusions The proposed automatic procedure gives results with high sensitivity and low false positive fraction and can be applied to process entire stained sections.
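The sensitivity and false-positive figures reported above come from comparing a segmentation result against a manually drawn ground-truth mask. A minimal sketch of that comparison; the false-positive fraction here is taken as the share of detections that are false, which is one common definition and an assumption on my part:

```python
import numpy as np

def sensitivity_fpf(pred, truth):
    """Sensitivity and false-positive fraction of a predicted
    segmentation mask against a ground-truth mask."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.count_nonzero(pred & truth)
    fp = np.count_nonzero(pred & ~truth)
    fn = np.count_nonzero(~pred & truth)
    return tp / (tp + fn), fp / (tp + fp)

# Toy 1-D "masks": 2 true positives, 1 false positive, 0 false negatives
sens, fpf = sensitivity_fpf(np.array([1, 1, 1, 0]), np.array([1, 1, 0, 0]))
# sens = 1.0, fpf ~= 0.333
```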

  4. Within-day repeatability for absolute quantification of Lawsonia intracellularis bacteria in feces from growing pigs

    DEFF Research Database (Denmark)

    Pedersen, Ken Steen; Pedersen, Klaus H.; Hjulsager, Charlotte Kristiane

    2012-01-01

    Absolute quantification of Lawsonia intracellularis by real-time polymerase chain reaction (PCR) is now possible on a routine basis. Poor repeatability of quantification can result in disease status misclassification of individual pigs when a single fecal sample is obtained. The objective...

  5. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2015-01-01

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving

  6. Quantification of intraventricular hemorrhage is consistent using a spherical sensitivity matrix

    Science.gov (United States)

    Tang, Te; Sadleir, Rosalind

    2010-04-01

    We have developed a robust current pattern for detection of intraventricular hemorrhage (IVH). In this study, the current pattern was applied on two realistic shaped neonatal head models and one head-shaped phantom. We found that a sensitivity matrix calculated from a spherical model gave us satisfactory reconstructions in terms of both image quality and quantification. Incorporating correct geometry information into the forward model improved image quality. However, it did not improve quantification accuracy. The results indicate that using a spherical matrix may be a more practical choice for monitoring IVH volumes in neonates.

  7. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    International Nuclear Information System (INIS)

    Bakkes, J.; Potting, J.; Kemp-Benedict, E.; Raskin, P.; Masui, T.; Rana, A.; Nellemann, C.; Rothman, D.

    2004-01-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  8. Data-independent MS/MS quantification of neuropeptides for determination of putative feeding-related neurohormones in microdialysate.

    Science.gov (United States)

    Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun

    2015-01-21

    Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. Using this method, preliminary relative quantification of the neuropeptidome of the crab Cancer borealis was performed, winnowing candidate NPs related to a behavior of interest in a functionally relevant manner and demonstrating the success of such a UPLC-MS(E) quantification method using the open-source software Skyline.

  9. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    Science.gov (United States)

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
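The latent-class/EM idea of assigning ambiguous read counts to transcripts can be illustrated with a toy model that ignores transcript lengths and sequencing bias (Strawberry itself handles both); everything below is a simplified sketch, not Strawberry's implementation:

```python
def em_abundances(compat, n_transcripts, iters=200):
    """Toy EM for transcript abundance: each read is compatible with a
    set of transcript indices; its count is split among them in
    proportion to the current abundance estimates, then abundances
    are renormalized. Uniform transcript lengths are assumed."""
    theta = [1.0 / n_transcripts] * n_transcripts
    for _ in range(iters):
        counts = [0.0] * n_transcripts
        for read in compat:                  # E-step: fractional assignment
            z = sum(theta[t] for t in read)
            for t in read:
                counts[t] += theta[t] / z
        total = sum(counts)
        theta = [c / total for c in counts]  # M-step: renormalize
    return theta

# Two reads unique to transcript 0, one read shared between 0 and 1:
# EM attributes nearly all abundance to transcript 0
theta = em_abundances([{0}, {0}, {0, 1}], n_transcripts=2)
```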

  10. Tool for objective quantification of pulmonary sequelae in monitoring of patients with tuberculosis

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Pina, Diana R. de; Bacchim Neto, Fernando A.; Pereira, Paulo C.M.; Ribeiro, Sergio M.; Miranda, Jose Ricardo de A.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an ancient infectious disease that remains a global health problem. Chest radiography is the method commonly employed in assessing the evolution of TB. However, lung damage quantification methods are usually performed on computerized tomography (CT). This objective quantification is important in the radiological monitoring of the patient, assessing the progression and treatment of TB. However, precise quantification is not feasible with the number of CT examinations that would be necessary, due to the high dose delivered to the patient and the high cost to the institution. The purpose of this work is to develop a tool to quantify pulmonary sequelae caused by TB through chest X-rays. To this end, a computational algorithm was developed that creates a three-dimensional representation of the lungs, with regions of dilated sequelae inside. Pulmonary sequelae were also quantified through CT scans of the same patients performed on nearby dates, minimizing differences due to disease progression. The measurements from the two methods were compared; the results suggest the effectiveness and applicability of the developed tool, allowing lower-dose radiological monitoring of the patient during treatment

  11. Interdependencies of acquisition, detection, and reconstruction techniques on the accuracy of iodine quantification in varying patient sizes employing dual-energy CT

    Energy Technology Data Exchange (ETDEWEB)

    Marin, Daniele; Pratts-Emanuelli, Jose J.; Mileto, Achille; Bashir, Mustafa R.; Nelson, Rendon C.; Boll, Daniel T. [Duke University Medical Center, Department of Radiology, Durham, NC (United States); Husarik, Daniela B. [University Hospital Zurich, Diagnostic and Interventional Radiology, Zurich (Switzerland)

    2014-10-03

    To assess the impact of patient habitus, acquisition parameters, detector efficiencies, and reconstruction techniques on the accuracy of iodine quantification using dual-source dual-energy CT (DECT). Two phantoms simulating small and large patients contained 20 iodine solutions mimicking vascular and parenchymal enhancement from saline isodensity to 400 HU and 30 iodine solutions simulating enhancement of the urinary collecting system from 400 to 2,000 HU. DECT acquisition (80/140 kVp and 100/140 kVp) was performed using two DECT systems equipped with standard and integrated electronics detector technologies. DECT raw datasets were reconstructed using filtered backprojection (FBP), and iterative reconstruction (SAFIRE I/V). Accuracy for iodine quantification was significantly higher for the small compared to the large phantoms (9.2 % ± 7.5 vs. 24.3 % ± 26.1, P = 0.0001), the integrated compared to the conventional detectors (14.8 % ± 20.6 vs. 18.8 % ± 20.4, respectively; P = 0.006), and SAFIRE V compared to SAFIRE I and FBP reconstructions (15.2 % ± 18.1 vs. 16.1 % ± 17.6 and 18.9 % ± 20.4, respectively; P ≤ 0.003). A significant synergism was observed when the most effective detector and reconstruction techniques were combined with habitus-adapted dual-energy pairs. In a second-generation dual-source DECT system, the accuracy of iodine quantification can be substantially improved by an optimal choice and combination of acquisition parameters, detector, and reconstruction techniques. (orig.)

  12. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
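At its simplest, quantifying a solution space over a bounded domain is estimating the fraction of the domain that satisfies the path condition. The sketch below is plain Monte Carlo over a box; the paper's contribution, interval constraint propagation to shrink the box before sampling, is noted in the docstring but not implemented here:

```python
import random

def mc_solution_fraction(constraint, bounds, n=100_000, seed=1):
    """Plain Monte Carlo estimate of the fraction of a bounded box
    satisfying `constraint`. (The paper additionally uses interval
    constraint propagation to focus sampling on the regions that
    actually contain solutions, improving accuracy.)"""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        hits += constraint(x)
    return hits / n

# Fraction of the unit square where x + y < 1 (exact answer: 0.5)
frac = mc_solution_fraction(lambda v: v[0] + v[1] < 1.0,
                            [(0.0, 1.0), (0.0, 1.0)])
```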

  13. Method for indirect quantification of CH4 production via H2O production using hydrogenotrophic methanogens

    Directory of Open Access Journals (Sweden)

    Ruth-Sophie eTaubner

    2016-04-01

    Full Text Available Hydrogenotrophic methanogens are an intriguing group of microorganisms from the domain Archaea. They exhibit extraordinary ecological, biochemical and physiological characteristics and have a huge biotechnological potential. Yet, the only way to assess the methane (CH4) production potential of hydrogenotrophic methanogens has been gas chromatographic quantification of CH4. In order to effectively screen pure cultures of hydrogenotrophic methanogens regarding their CH4 production potential, we developed a novel method for indirect quantification of the volumetric CH4 production rate by measuring the volumetric water production rate. This method was established in serum bottles for cultivation of methanogens in closed batch cultivation mode. Water production was estimated by determining the difference in mass increase in an isobaric setting. This novel CH4 quantification method is an accurate and precise analytical technique, which can be used to rapidly screen pure cultures of methanogens regarding their volumetric CH4 evolution rate. It is a cost-effective alternative for determining CH4 production of methanogens compared to CH4 quantification by gas chromatography, especially if applied as a high-throughput quantification method. Eventually, the method can be universally applied for quantification of CH4 production from psychrophilic, thermophilic and hyperthermophilic hydrogenotrophic methanogens.
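The indirect quantification rests on the stoichiometry of hydrogenotrophic methanogenesis, CO2 + 4 H2 → CH4 + 2 H2O: two moles of water are formed per mole of methane, so a measured mass increase (water) converts directly into a CH4 rate. A sketch with a hypothetical mass gain:

```python
M_H2O = 18.015  # g/mol, molar mass of water

def ch4_rate_from_water(delta_mass_g, hours):
    """CO2 + 4 H2 -> CH4 + 2 H2O: two moles of water per mole of
    methane, so mol CH4 = mol H2O / 2. Returns mol CH4 per hour."""
    mol_h2o = delta_mass_g / M_H2O
    return (mol_h2o / 2.0) / hours

# Hypothetical closed-batch culture: 0.360 g mass increase over 24 h
rate = ch4_rate_from_water(delta_mass_g=0.360, hours=24.0)
```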

  14. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Outcome quantification using SPHARM-PDM toolbox in orthognathic surgery

    Science.gov (United States)

    Cevidanes, Lucia; Zhu, HongTu; Styner, Martin

    2011-01-01

    Purpose Quantification of surgical outcomes in longitudinal studies has led to significant progress in the treatment of dentofacial deformity, both by offering options to patients who might not otherwise have been recommended for treatment and by clarifying the selection of appropriate treatment methods. Most existing surgical treatments have not been assessed in a systematic way. This paper presents the quantification of surgical outcomes in orthognathic surgery via our localized shape analysis framework. Methods In our setting, planning and surgical simulation is performed using the surgery planning software CMFapp. We then employ the SPHARM-PDM to measure the difference between pre-surgery and virtually simulated post-surgery models. This SPHARM-PDM shape framework is validated for use with craniofacial structures via simulating known 3D surgical changes within CMFapp. Results Our results show that SPHARM-PDM analysis accurately measures surgical displacements, compared with known displacement values. Visualization of color maps of virtually simulated surgical displacements describe corresponding surface distances that precisely describe location of changes, and difference vectors indicate directionality and magnitude of changes. Conclusions SPHARM-PDM-based quantification of surgical outcome is feasible. When compared to prior solutions, our method has the potential to make the surgical planning process more flexible, increase the level of detail and accuracy of the plan, yield higher operative precision and control and enhance the follow-up and documentation of clinical cases. PMID:21161693
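Because SPHARM-PDM establishes point-to-point correspondence between the pre-surgery and simulated post-surgery surface models, the displacement color maps and difference vectors reduce to per-vertex arithmetic. A minimal sketch with hypothetical vertex coordinates:

```python
import numpy as np

def surface_changes(pre, post):
    """Per-vertex difference vectors and surface distances between
    corresponding points of two surface models with established
    point-to-point correspondence (as SPHARM-PDM provides)."""
    diff = post - pre                    # (n, 3): direction and magnitude
    dist = np.linalg.norm(diff, axis=1)  # scalar distance per vertex
    return diff, dist

# Four hypothetical corresponding vertices (mm)
pre = np.zeros((4, 3))
post = np.array([[3.0, 0.0, 0.0], [0.0, 4.0, 0.0],
                 [0.0, 0.0, 0.0], [0.0, 3.0, 4.0]])
diff, dist = surface_changes(pre, post)
# dist = [3. 4. 0. 5.]
```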

  16. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in building, through one of the most relevant environmental impact indicators associated with the weight per m2 of construction: the energy consumption resulting from the manufacturing process of the materials used in building construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the building materials, subsystems and construction elements with the greatest impact, making it possible to observe the influence of the built surface on the environmental impact generated. The results obtained are intended to serve as a reference for the scientific community, providing quantitative data comparable with other building types and geographical areas, and may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.
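An indicator of this kind is, at its core, a mass-weighted sum of manufacturing energy intensities per built square metre. The coefficients and masses below are purely illustrative placeholders, not the study's database values:

```python
# Hypothetical embodied-energy intensities (MJ per kg of material);
# real coefficients would come from a materials database
ENERGY_INTENSITY = {"concrete": 1.1, "steel": 24.0, "ceramic brick": 3.0}

def embodied_energy_per_m2(masses_kg_per_m2):
    """Energy consumption indicator: sum over materials of
    (mass per built m2) x (manufacturing energy intensity)."""
    return sum(ENERGY_INTENSITY[m] * kg for m, kg in masses_kg_per_m2.items())

# Hypothetical bill of materials for 1 m2 of construction
e = embodied_energy_per_m2({"concrete": 900.0, "steel": 60.0,
                            "ceramic brick": 180.0})
# e = 2970.0 MJ per m2
```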


  17. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Science.gov (United States)

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
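The "linear regression of efficiency" idea can be sketched directly: within the central region of an amplification profile, the cycle efficiency E_c = (F_{c+1} − F_c)/F_c declines linearly with fluorescence F_c, and the intercept of that line estimates the maximal efficiency. The snippet below is a simplified illustration of this model on synthetic data, not the LRE Analyzer's implementation:

```python
def lre_fit(fluor):
    """Least-squares fit of cycle efficiency E_c = (F_{c+1} - F_c) / F_c
    against fluorescence F_c; under the LRE model the intercept
    estimates the maximal amplification efficiency Emax."""
    pts = [(f, (fnext - f) / f) for f, fnext in zip(fluor, fluor[1:])]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    slope = sxy / sxx
    return my - slope * mx, slope  # (Emax, slope)

# Synthetic profile obeying E_c = 0.9 * (1 - F_c / 100) exactly
fluor = [1.0]
for _ in range(20):
    f = fluor[-1]
    fluor.append(f * (1.0 + 0.9 * (1.0 - f / 100.0)))

emax, slope = lre_fit(fluor)  # emax ~= 0.9, slope ~= -0.009
```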

  18. an expansion of the aboveground biomass quantification model for ...

    African Journals Online (AJOL)

    Research Note BECVOL 3: an expansion of the aboveground biomass quantification model for ... African Journal of Range and Forage Science ... encroachment and estimation of food to browser herbivore species, was proposed during 1989.

  19. Cytochrome c oxidase subunit 1-based human RNA quantification to enhance mRNA profiling in forensic biology

    Directory of Open Access Journals (Sweden)

    Dong Zhao

    2017-01-01

    Full Text Available RNA analysis offers many potential applications in forensic science, and molecular identification of body fluids by analysis of cell-specific RNA markers represents a new technique for use in forensic cases. However, because forensic materials are often admixed with nonhuman cellular components, human-specific RNA quantification is required for forensic RNA assays. In the present study, a quantification assay for human RNA was developed for body fluid samples in forensic biology. The quantitative assay is based on real-time reverse transcription-polymerase chain reaction of the mitochondrial RNA cytochrome c oxidase subunit I and is capable of RNA quantification with high reproducibility and a wide dynamic range. The human RNA quantification improves the quality of mRNA profiling in the identification of saliva and semen because the assay excludes the influence of nonhuman components and reduces the adverse effects of degraded RNA fragments.

  20. Quantification of the impacts of coalmine water irrigation on the underlying aquifers

    Energy Technology Data Exchange (ETDEWEB)

    Vermeulen, D.; Usher, B.; van Tonder, G. [University of Free State, Bloemfontein (South Africa). Institute of Groundwater Studies

    2009-07-15

    It is predicted that vast volumes of affected mine water will be produced by mining activities in the Mpumalanga coalfields of South Africa. The potential environmental impact of this excess water is of great concern in a water-scarce country like South Africa. Research over a period of more than 10 years has shown that this water can be used successfully for the irrigation of a range of crops. There is, however, continuing concern from the local regulators regarding the long-term impact that large-scale mine water irrigation may have on groundwater quality and quantity. Detailed research has been undertaken over the last three years to supplement the groundwater monitoring programme at five different pilot sites, on both virgin soils (greenfields) and in coalmining spoils. These sites range from sandy soils to very clayey soils. The research has included soil moisture measurements, collection of in situ soil moisture over time, long-term laboratory studies of the leaching and attenuation properties of different soils and the impact of irrigation on acid rock drainage processes, and in-depth determination of the hydraulic properties of the subsurface at each of these sites, including falling head tests, pumping tests and point dilution tests. This has been supported by geochemical modelling of these processes to quantify the impacts. The results indicate that many of the soils have considerable attenuation capacities and that, over the period of irrigation, a large proportion of the salts have been contained in the upper portions of the unsaturated zones below each irrigation pivot. The volumes and quality of water leaching through to the aquifers have been quantified at each site. From this, mixing ratios have been calculated in order to determine the effect of the irrigation water on the underlying aquifers.
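    The mixing-ratio calculation mentioned at the end of the abstract is, in its simplest form, a two-endmember mass balance on a conservative tracer. A minimal sketch with hypothetical chloride concentrations (the study's actual geochemical modelling is far more detailed):

```python
def mixing_fraction(c_mix, c_ambient, c_irrigation):
    """Fraction of irrigation-derived water in a groundwater sample,
    from a conservative-tracer (e.g. chloride) two-endmember balance:
        c_mix = f * c_irrigation + (1 - f) * c_ambient
    """
    return (c_mix - c_ambient) / (c_irrigation - c_ambient)

# Hypothetical chloride concentrations (mg/L)
print(mixing_fraction(c_mix=130.0, c_ambient=40.0, c_irrigation=400.0))  # 0.25
```

    A fraction of 0.25 would indicate that a quarter of the sampled groundwater derives from irrigation leachate, assuming the tracer behaves conservatively between the pivot and the aquifer.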

  1. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
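    The three sensitivity measures named in the abstract can be illustrated on toy data. The sketch below uses synthetic stand-ins for the VERA-CS inputs and figure of merit (all names and numbers are hypothetical); partial correlation is computed by regressing the conditioning variable out of both series.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical stand-ins: inlet temperature drives the figure of merit;
# a second input is correlated with it but has no direct effect.
t_inlet = rng.normal(565.0, 2.0, n)                # K
flow = 0.5 * t_inlet + rng.normal(0.0, 1.0, n)     # correlated input
mdnbr = -0.05 * t_inlet + rng.normal(0.0, 0.02, n)

def pearson(x, y):
    return np.corrcoef(x, y)[0, 1]

def spearman(x, y):
    rank = lambda a: np.argsort(np.argsort(a))     # ranks (no ties here)
    return pearson(rank(x), rank(y))

def partial(x, y, z):
    """Correlation of x and y after regressing z out of both."""
    resid = lambda a: a - np.polyval(np.polyfit(z, a, 1), z)
    return pearson(resid(x), resid(y))

print(round(pearson(t_inlet, mdnbr), 2))        # strongly negative
print(round(partial(flow, mdnbr, t_inlet), 2))  # near zero: no direct effect
```

    The contrast between the raw and partial coefficients is the point: the second input correlates with the figure of merit only through its association with inlet temperature, which partial correlation removes.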

  2. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well.
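    The selection rule — pick the regularization parameter minimizing EMSE = bias² + variance — can be shown with illustrative curves. The paper derives these quantities from the PML estimator; the functional forms below are invented purely to demonstrate the tradeoff and the argmin step.

```python
import numpy as np

# Illustrative resolution-noise tradeoff: stronger regularization beta
# increases bias but suppresses variance (functional forms invented).
beta = np.logspace(-3, 1, 400)
bias = 0.5 * beta / (1.0 + beta)          # increasing, bounded
variance = 0.04 / (1.0 + 10.0 * beta)     # decreasing
emse = bias ** 2 + variance               # ensemble mean-squared-error

beta_opt = beta[np.argmin(emse)]
print(f"optimal beta ~ {beta_opt:.2g}")
```

    The minimum sits strictly inside the scanned range: too little regularization leaves the variance term dominant, too much lets squared bias dominate.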

  3. A nuclear DNA-based species determination and DNA quantification assay for common poultry species.

    Science.gov (United States)

    Ng, J; Satkoski, J; Premasuthan, A; Kanthaswamy, S

    2014-12-01

    DNA testing for food authentication and quality control requires sensitive species-specific quantification of nuclear DNA from complex and unknown biological sources. We have developed a multiplex assay based on TaqMan® real-time quantitative PCR (qPCR) for species-specific detection and quantification of chicken (Gallus gallus), duck (Anas platyrhynchos), and turkey (Meleagris gallopavo) nuclear DNA. The multiplex assay is able to accurately detect very low quantities of species-specific DNA from single or multispecies sample mixtures; its minimum effective quantification range is 5 to 50 pg of starting DNA material. In addition to its use in food fraudulence cases, we have validated the assay using simulated forensic sample conditions to demonstrate its utility in forensic investigations. Despite treatment with potent inhibitors such as hematin and humic acid, and degradation of template DNA by DNase, the assay was still able to robustly detect and quantify DNA from each of the three poultry species in mixed samples. The efficient species determination and accurate DNA quantification will help reduce fraudulent food labeling and facilitate downstream DNA analysis for genetic identification and traceability.
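    Behind an assay like this sits standard absolute-quantification arithmetic: a dilution series gives a Ct-vs-log-quantity line, the slope yields amplification efficiency, and unknowns are interpolated. A sketch with hypothetical Ct values (not the validated assay's data):

```python
import numpy as np

# Hypothetical dilution series (pg of starting DNA) with measured Ct
log_q = np.log10([5, 50, 500, 5000])
ct = np.array([33.3, 30.0, 26.7, 23.4])    # ~ -3.3 cycles per decade

slope, intercept = np.polyfit(log_q, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0    # 1.0 would be 100% per cycle

def quantify(ct_unknown):
    """Interpolate an unknown quantity (pg) from its Ct."""
    return 10 ** ((ct_unknown - intercept) / slope)

print(round(efficiency, 2))        # close to 1.0
print(round(quantify(28.35), 1))   # between the 50 and 500 pg standards
```

    A slope near -3.3 cycles per decade corresponds to roughly 100% efficiency, the usual acceptance criterion for a validated qPCR assay.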

  4. A scoping study for an environmental impact field programme in tidal current energy

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    This report summarises the results of a study to identify the environmental impacts of tidal current energy with the aim of prioritising research. The background to the study is traced, and the interaction between tidal current energy technology and the marine environment, the modeling of the consequences of the environmental interactions, the quantification of the environmental impacts of key environmental interactions, and the formulation of a programme of research are discussed. Recommendations are given and research needs are highlighted.

  5. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    Science.gov (United States)

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV/Vis spectroscopy was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could serve as a Process Analytical Tool (PAT) in biorefineries using steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
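    The absorbance-to-concentration step implied by the abstract is ordinary Beer-Lambert calibration against standards of known concentration. A sketch with hypothetical values (the standards and readings below are not taken from the study):

```python
import numpy as np

# Hypothetical lignin standards (g/L) and absorbance at 280 nm
conc = np.array([0.02, 0.05, 0.10, 0.20])
absorbance = np.array([0.21, 0.52, 1.05, 2.08])

# Beer-Lambert: A = eps * l * c, so a linear fit calibrates the assay
slope, intercept = np.polyfit(conc, absorbance, 1)

def lignin_conc(a):
    """Concentration (g/L) of a sample from its absorbance reading."""
    return (a - intercept) / slope

print(round(slope, 1), round(lignin_conc(0.83), 3))
```

    In practice each biomass type would need its own calibration line, since the UV extinction of lignin varies with its source and the chosen wavelength.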

  6. Environmental impact evaluation from large energy projects; Avaliacao de impacto ambiental de grandes projetos energeticos

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, Alberto [Bahia Univ., Salvador, BA (Brazil)

    1985-12-31

    This paper builds a theoretical framework and methodological approach for assessing environmental impacts from large energy projects. It aims at defining concrete tools and technical procedures that allow identification and quantification (or weighting) of effects on the natural and social environment. Environmental impact assessment (EIA) studies are described as important instruments in planning and in the choice of alternative energy policies. (author). 15 refs., 2 figs

  7. Cross-impact method

    Directory of Open Access Journals (Sweden)

    Suzić Nenad

    2014-01-01

    Full Text Available The paper presents the application of the cross-impact method in pedagogy, a methodological approach which crosses variables in a novel but statistically justified manner. The method is an innovation in pedagogy as well as in the research methodology of social and psychological phenomena. Specifically, events and processes are crossed, that is, experts' predictions about the future interaction of events and processes. This methodology is therefore futuristic; it concerns predicting the future, which is of key importance for pedagogic objectives. The paper presents two instances of the cross-impact approach: the longer, displayed in fourteen steps, and the shorter, in four steps. Both are accompanied by mathematical and statistical formulae allowing for quantification, that is, a numerical expression of the probability of a certain event happening in the future. The advantage of this approach is that it facilitates planning in education, which so far has been based solely on lay estimates and assumptions.

  8. The GEO-3 Scenarios 2002-2032. Quantification and Analysis of Environmental Impacts

    Energy Technology Data Exchange (ETDEWEB)

    Bakkes, J.; Potting, J. (eds.) [National Institute for Public Health and the Environment RIVM, Bilthoven (Netherlands); Henrichs, T. [Center for Environmental Systems Research CESR, University of Kassel, Kassel (Germany); Kemp-Benedict, E.; Raskin, P. [Stockholm Environment Institute SEI, Boston, MA (United States); Masui, T.; Rana, A. [National Institute for Environmental Studies NIES, Ibaraki (Japan); Nellemann, C. [United Nations Environment Programme UNEP, GRID Global and Regional Integrated Data centres Arendal, Lillehammer (Norway); Rothman, D. [International Centre for Integrative Studies ICIS, Maastricht University, Maastricht (Netherlands)

    2004-07-01

    The four contrasting visions of the world's next three decades as presented in the third Global Environment Outlook (GEO-3) have many implications for policy - from hunger to climate change and from freshwater issues to biodiversity. The four scenarios analysed are Markets First, Policy First, Security First, Sustainability First. Presenting a deeper analysis than the original GEO-3 report, this Technical Report quantifies the impacts of the scenarios for all 19 GEO 'sub-regions', such as Eastern Africa and Central Europe. Regional impacts are discussed in the context of sustainable development. The report summary compares the impacts of the four scenarios across regions - and for the world as a whole - in the light of internationally agreed targets including those in the Millennium Declaration where applicable. It provides an account of the analytical methods, key assumptions, models and other tools, along with the approaches used in the analyses. Based on the methods and results, the report looks back on the process of producing the forward-looking analysis for GEO-3. Were all analytical centres on the same track? Did the approach adopted for GEO-3 contribute to the overall GEO objective of strengthening global-regional involvement and linkages?

  9. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene Klærke; Gesmar, Henrik

    2001-01-01

    Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice-selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001

  10. Metering error quantification under voltage and current waveform distortion

    Science.gov (United States)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    As more renewable energy sources and distorting loads are integrated into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because such distortion degrades metering accuracy and fairness, the combined energy metering error is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a method for quantifying metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a method for quantifying metering accuracy error is also proposed. By analyzing the mode error and accuracy error, a comprehensive error analysis method is presented which is suitable for new energy sources and nonlinear loads. The proposed method has been validated by simulation.
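    The mode-error idea can be illustrated numerically: when harmonics are present in both voltage and current, a fundamental-only reading misses the harmonic active power that a sample-by-sample (time-division) multiplier captures. The waveform amplitudes below are hypothetical:

```python
import numpy as np

fs, f0, n = 10_000, 50, 2000          # sample rate (Hz), fundamental, samples
t = np.arange(n) / fs                 # exactly ten fundamental cycles

# Hypothetical distorted waveforms: both carry a 3rd harmonic
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 10 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t)
i = 10 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
    + 3 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t)

p_true = np.mean(v * i)               # time-division multiplier principle
p_fundamental = 230 * 10              # fundamental-only metering mode

error_pct = 100 * (p_fundamental - p_true) / p_true
print(round(p_true, 1), round(error_pct, 2))   # 2330.0 -1.29
```

    The 30 W discrepancy is the 3rd-harmonic active power (10 V × 3 A rms); a meter mode blind to harmonics under-registers by about 1.3% in this toy case.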

  11. Distinguishing enhancing from nonenhancing renal masses with dual-source dual-energy CT: iodine quantification versus standard enhancement measurements.

    Science.gov (United States)

    Ascenti, Giorgio; Mileto, Achille; Krauss, Bernhard; Gaeta, Michele; Blandino, Alfredo; Scribano, Emanuele; Settineri, Nicola; Mazziotti, Silvio

    2013-08-01

    To compare the diagnostic accuracy of iodine quantification and standard enhancement measurements in distinguishing enhancing from nonenhancing renal masses. The Institutional Review Board approved this retrospective study conducted from data found in institutional patient databases and archives. Seventy-two renal masses were characterised as enhancing or nonenhancing using standard enhancement measurements (in HU) and iodine quantification (in mg/ml). Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of standard enhancement measurements and iodine quantification were calculated from χ² tests of contingency with histopathology or imaging follow-up as the reference standard. Difference in accuracy was assessed by means of McNemar analysis. Sensitivity, specificity, PPV, NPV and diagnostic accuracy for standard enhancement measurements and iodine quantification were 77.7 %, 100 %, 100 %, 81.8 %, 89 % and 100 %, 94.4 %, 94.7 %, 100 % and 97 %, respectively. The McNemar analysis showed that the accuracy of iodine quantification was significantly better (P < 0.001) than that of standard enhancement measurements. Compared with standard enhancement measurements, whole-tumour iodine quantification is more accurate in distinguishing enhancing from nonenhancing renal masses. • Enhancement of renal lesions is important when differentiating benign from malignant tumours. • Dual-energy CT offers measurement of iodine uptake rather than mere enhancement values. • Whole-tumour iodine quantification seems more accurate than standard CT enhancement measurements.
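    The reported accuracy measures all derive from a 2×2 contingency table against the reference standard. A sketch with hypothetical counts, chosen to resemble but not reproduce the study's iodine-quantification results:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard accuracy measures from a 2x2 contingency table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for 72 masses (illustration only)
m = diagnostic_metrics(tp=54, fp=1, fn=0, tn=17)
print({k: round(v, 3) for k, v in m.items()})
```

    With zero false negatives the sensitivity is 1.0 by construction; a single false positive among 18 nonenhancing masses yields the ~94% specificity pattern seen in studies of this size.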

  12. Detection and quantification of Leveillula taurica growth in pepper leaves.

    Science.gov (United States)

    Zheng, Zheng; Nonomura, Teruo; Bóka, Károly; Matsuda, Yoshinori; Visser, Richard G F; Toyoda, Hideyoshi; Kiss, Levente; Bai, Yuling

    2013-06-01

    Leveillula taurica is an obligate fungal pathogen that causes powdery mildew disease on a broad range of plants, including important crops such as pepper, tomato, eggplant, onion, cotton, and so on. The early stage of this disease is difficult to diagnose and the disease can easily spread unobserved, for example in pepper and tomato production fields and greenhouses. The objective of this study was to develop a detection and quantification method for L. taurica biomass in pepper leaves, with special regard to the early stages of infection. We monitored the development of the disease to time the infection process on the leaf surface as well as inside the pepper leaves. The initial and final steps of the infection taking place on the leaf surface were consecutively observed using a dissecting microscope and a scanning electron microscope. The development of the intercellular mycelium in the mesophyll was followed by light and transmission electron microscopy. A pair of L. taurica-specific primers was designed based on the internal transcribed spacer sequence of L. taurica and used in a real-time polymerase chain reaction (PCR) assay to quantify the fungal DNA during infection. The specificity of this assay was confirmed by testing the primer pair with DNA from host plants and also from another powdery mildew species, Oidium neolycopersici, infecting tomato. A standard curve was obtained for absolute quantification of L. taurica biomass. In addition, we tested a relative quantification method using a plant gene as reference, and the obtained results were compared with the visual disease index scoring. The real-time PCR assay for L. taurica provides a valuable tool for detection and quantification of this pathogen in breeding activities as well as in plant-microbe interaction studies.
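    For the relative-quantification variant using a plant reference gene, the usual arithmetic is the 2^-ΔΔCt method. A minimal sketch with hypothetical Ct values, assuming near-100% amplification efficiency for both assays:

```python
def fold_change(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Relative fungal load by the 2^-ddCt method, normalised to a plant
    reference gene and expressed against a calibrator sample. Assumes
    near-100% amplification efficiency for both assays."""
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return 2.0 ** -ddct

# Hypothetical Ct values: heavily infected leaf vs. early-infection calibrator
print(fold_change(22.0, 18.0, 28.0, 18.0))   # 64.0
```

    Normalising to the plant gene corrects for differences in the amount of leaf tissue sampled, so the ratio tracks fungal biomass per unit of host material.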

  13. Volumetric adsorptive microsampling-liquid chromatography tandem mass spectrometry assay for the simultaneous quantification of four antibiotics in human blood: Method development, validation and comparison with dried blood spot.

    Science.gov (United States)

    Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana

    2017-10-25

    In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37°C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows an accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    Science.gov (United States)

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, resulting in either an under- (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing limits of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
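    ddPCR owes its absolute quantification to Poisson statistics: from the fraction of positive droplets, the mean number of target copies per droplet is λ = -ln(1 - p). A sketch with hypothetical droplet counts and the commonly cited ~0.85 nL droplet volume (an assumption that varies by platform):

```python
import math

def copies_per_ul(positive, total, droplet_vol_nl=0.85):
    """Copies of target per microliter of reaction, from droplet counts,
    using the Poisson correction lambda = -ln(1 - p)."""
    p = positive / total
    lam = -math.log(1.0 - p)               # mean copies per droplet
    return lam / (droplet_vol_nl * 1e-3)   # nL -> uL

# Hypothetical droplet counts for a cattle-specific F2 assay
print(round(copies_per_ul(4000, 15000), 1))
```

    The Poisson correction matters because a positive droplet may contain more than one copy; simply counting positives would underestimate the concentration as the positive fraction rises.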

  15. Investigation on feasibility of recurrence quantification analysis for ...

    African Journals Online (AJOL)

    The RQA parameters such as percent recurrence (REC), trapping time (TT), percent laminarity (LAM) and entropy (ENT), and also the recurrence plots color patterns for different flank wear, can be used in detecting insert wear in face milling. Keywords: milling, flank wear, recurrence plot, recurrence quantification analysis.

  16. Automatic quantification of subarachnoid hemorrhage on noncontrast CT

    NARCIS (Netherlands)

    Boers, Anna Maria Merel; Zijlstra, I.A.; Gathier, C.S.; van den Berg, R.; Slump, Cornelis H.; Marquering, H.A.; Majoie, C.B.

    2014-01-01

    Quantification of blood after SAH on initial NCCT is an important radiologic measure to predict patient outcome and guide treatment decisions. In current scales, hemorrhage volume and density are not accounted for. The purpose of this study was to develop and validate a fully automatic method for

  17. Double-layer Tablets of Lornoxicam: Validation of Quantification ...

    African Journals Online (AJOL)

    Double-layer Tablets of Lornoxicam: Validation of Quantification Method, In vitro Dissolution and Kinetic Modelling. ... Satisfactory results were obtained from all the tablet formulations, which met compendial requirements. The slowest drug release rate was obtained with tablet cores based on PVP K90 (1.21 mg%.h-1).

  18. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  19. Direct quantification of airborne nanoparticles composition by TXRF after collection on filters

    Energy Technology Data Exchange (ETDEWEB)

    Motellier, S; Lhaute, K; Guiot, A; Golanski, L; Tardif, F [CEA Grenoble, DRT, LITEN, DTNM, Laboratory of Nanochemistry and Nanosafety, 17 Avenue des Martyrs, Cedex 9, F-38054 Grenoble (France); Geoffroy, C, E-mail: sylvie.motellier@cea.fr [Elexience, 9 rue des petits ruisseaux, BP 61, 91371 Verrieres-le-Buisson Cedex (France)

    2011-07-06

    Direct TXRF analysis of nanoparticles deposited on filters was evaluated. Standard filters spiked with known amounts of NPs were produced using an atomizer which generates an aerosol from an NP-containing liquid suspension. Polycarbonate filters provided the highest fluorescence signals, and black polycarbonate filters containing chromium were further selected, Cr being used as internal standard for elemental quantification of the filter contaminants. Calibration curves were established for various NPs (TiO2, ZnO, CeO2, Al2O3). Good linearity was observed. Limits of detection were in the tens to hundreds of nanograms per filter, the method being less adapted to Al2O3 due to the poor TXRF sensitivity for light elements. The analysis of MW-CNTs was attempted by quantification of their metal (Fe) catalyst impurities, but problems were encountered with CNT dispersion in liquids, quantification of the deposited quantity, and high Fe-background contamination.
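    Internal-standard quantification in TXRF reduces to a ratio of peak intensities scaled by relative sensitivity factors. A sketch with hypothetical counts; the Cr mass per filter and the sensitivity factors are instrument-specific assumptions, not values from the study:

```python
def txrf_mass(i_x, s_x, i_cr, s_cr, cr_mass_ng):
    """Analyte mass on the filter from TXRF peak intensities, with the
    filter's chromium content as internal standard:
        m_x = m_Cr * (I_x / I_Cr) * (S_Cr / S_x)
    S are element relative sensitivity factors (instrument-specific)."""
    return cr_mass_ng * (i_x / i_cr) * (s_cr / s_x)

# Hypothetical counts and sensitivities for a TiO2-loaded filter
print(round(txrf_mass(12000, 1.3, 8000, 1.0, cr_mass_ng=50.0), 1))  # 57.7
```

    Ratioing to the internal standard cancels geometry and excitation fluctuations between measurements, which is what makes the built-in Cr of the black filters convenient.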

  20. A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.

    MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), demonstrated good performance but not without the drawbacks already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GAs) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
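    To make the genetic-algorithm idea concrete, the sketch below fits a single Gaussian peak to a noisy synthetic signal with a bound-constrained GA (selection, arithmetic crossover, mutation, clipping to the constraints). It is a generic toy, not the authors' constrained GA or their adaptively defined fitness function, and it uses a Gaussian rather than the Voigt line shape:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 200)
true = (3.0, 5.0, 0.8)                        # amplitude, center, width
peak = lambda p: p[0] * np.exp(-((x - p[1]) / p[2]) ** 2)
signal = peak(true) + rng.normal(0, 0.1, x.size)

bounds = np.array([[0.1, 10], [0, 10], [0.1, 3]])  # constraints per gene

def fitness(p):
    return -np.sum((signal - peak(p)) ** 2)    # lower residual = fitter

pop = rng.uniform(bounds[:, 0], bounds[:, 1], (60, 3))
for _ in range(80):
    scores = np.array([fitness(p) for p in pop])
    elite = pop[np.argsort(scores)[-20:]]              # selection
    parents = elite[rng.integers(0, 20, (60, 2))]
    w = rng.random((60, 1))
    pop = w * parents[:, 0] + (1 - w) * parents[:, 1]  # crossover
    pop += rng.normal(0, 0.05, pop.shape)              # mutation
    pop = np.clip(pop, bounds[:, 0], bounds[:, 1])     # enforce constraints

best = pop[np.argmax([fitness(p) for p in pop])]
print(np.round(best, 1))
```

    Clipping offspring back into the feasible box is the simplest constraint-handling scheme; more elaborate schemes (penalty terms, repair operators) are closer in spirit to the constrained fitness function discussed in the paper.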

  1. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    Science.gov (United States)

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between uptake of GBNs at a single-cell level and cell viability status.

  3. Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.

    Science.gov (United States)

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry and even the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked-in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as RNA-specificity in this range, and compared to those of RiboGreen(®), another sensitive fluorescence-based RNA quantification assay. We then applied Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
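    The spike-in arithmetic is simple: the sample's contribution is the reading increase over the spike-in-only baseline, scaled back for its dilution into the assay tube. A sketch with hypothetical volumes and readings; the dilution correction shown is an assumption about the workflow, not the protocol's exact procedure:

```python
def sample_conc(reading, baseline, spike_vol_ul, sample_vol_ul):
    """Sample RNA concentration (pg/uL) from the reading increase over
    the spike-in-only baseline, corrected for dilution of the sample
    into the spike-in mix (volumes are hypothetical)."""
    increase = reading - baseline               # pg/uL due to the sample
    total_vol = spike_vol_ul + sample_vol_ul
    return increase * total_vol / sample_vol_ul

# Hypothetical: spike-in alone reads 25 pg/uL, tube with sample reads 30
print(sample_conc(30.0, 25.0, spike_vol_ul=9.0, sample_vol_ul=1.0))  # 50.0
```

    The baseline subtraction is what lifts the effective quantification limit: the assay operates in its valid reading range while the sample itself sits well below it.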

  4. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Directory of Open Access Journals (Sweden)

    Robert G Rutledge

    Full Text Available BACKGROUND: Linear regression of efficiency (LRE introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. FINDINGS: Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. CONCLUSIONS: The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
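A minimal sketch of the LRE principle on synthetic data — this illustrates the underlying regression only and is not the LRE Analyzer's own implementation:

```python
import numpy as np

# Within the central region of a sigmoidal SYBR Green I profile, cycle
# efficiency declines roughly linearly with fluorescence, so regressing
# E_c = F_c / F_(c-1) - 1 against the prior reading recovers E_max
# (intercept) and F_max (x-intercept) without any standard curve.

E_MAX, F_MAX = 0.95, 1000.0                 # synthetic "true" parameters
F = [1e-3]                                  # starting fluorescence (~F0)
for _ in range(60):                         # simulate amplification cycles
    F.append(F[-1] * (1 + E_MAX * (1 - F[-1] / F_MAX)))
F = np.asarray(F)

Ec = F[1:] / F[:-1] - 1                     # per-cycle efficiency
prior = F[:-1]
mid = (prior > 0.2 * F_MAX) & (prior < 0.8 * F_MAX)  # central region only
slope, intercept = np.polyfit(prior[mid], Ec[mid], 1)

print(round(intercept, 2))                  # recovered E_max -> 0.95
print(round(-intercept / slope))            # recovered F_max -> 1000
```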

  5. Hepatic Iron Quantification on 3 Tesla (3 T) Magnetic Resonance (MR): Technical Challenges and Solutions

    Directory of Open Access Journals (Sweden)

    Muhammad Anwar

    2013-01-01

    Full Text Available MR has become a reliable and noninvasive method of hepatic iron quantification. Currently, most of the hepatic iron quantification is performed on 1.5 T MR, and the biopsy measurements have been paired with R2 and R2* values for 1.5 T MR. As the use of 3 T MR scanners is steadily increasing in clinical practice, it has become important to evaluate the practicality of calculating iron burden at 3 T MR. Hepatic iron quantification on 3 T MR requires a better understanding of the process and more stringent technical considerations. The purpose of this work is to focus on the technical challenges in establishing a relationship between T2* values at 1.5 T MR and 3 T MR for hepatic iron concentration (HIC and to develop an appropriately optimized MR protocol for the evaluation of T2* values in the liver at 3 T magnetic field strength. We studied 22 sickle cell patients using multiecho fast gradient-echo sequence (MFGRE 3 T MR and compared the results with serum ferritin and liver biopsy results. Our study showed that the quantification of hepatic iron on 3 T MRI in sickle cell disease patients correlates well with clinical blood test results and biopsy results. 3 T MR liver iron quantification based on MFGRE can be used for hepatic iron quantification in transfused patients.
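The T2* estimation underlying multiecho gradient-echo iron quantification can be sketched as a mono-exponential fit. The echo times and signal below are synthetic, and this is a simplified illustration, not the study's processing pipeline:

```python
import numpy as np

# Fit S(TE) = S0 * exp(-TE / T2*) to a multi-echo signal; R2* = 1000/T2*
# (s^-1 for TE in ms) rises with hepatic iron concentration.
te_ms = np.array([0.9, 2.0, 3.1, 4.2, 5.3, 6.4])  # assumed echo times (ms)
t2star_true, s0 = 2.5, 100.0                      # e.g. a heavy iron load
signal = s0 * np.exp(-te_ms / t2star_true)

# log-linear least squares: ln S = ln S0 - TE / T2*
slope, intercept = np.polyfit(te_ms, np.log(signal), 1)
t2star = -1.0 / slope
print(round(t2star, 2))        # -> 2.5 (ms)
print(round(1000.0 / t2star))  # -> 400 (R2* in s^-1)
```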

  6. Influence of Co-57 and CT Transmission Measurements on the Quantification Accuracy and Partial Volume Effect of a Small Animal PET Scanner.

    Science.gov (United States)

    Mannheim, Julia G; Schmid, Andreas M; Pichler, Bernd J

    2017-12-01

Non-invasive in vivo positron emission tomography (PET) provides high detection sensitivity in the nano- to picomolar range and, in addition to other advantages, the possibility to absolutely quantify the acquired data. The present study compares transmission data acquired with an X-ray computed tomography (CT) scanner or a Co-57 source for the Inveon small animal PET scanner (Siemens Healthcare, Knoxville, TN, USA) and determines their influence on the quantification accuracy and partial volume effect (PVE). A special focus was the impact of the performed calibration on the quantification accuracy. Phantom measurements were carried out to determine the quantification accuracy, the influence of the object size on the quantification, and the PVE for different sphere sizes, along the field of view and for different contrast ratios. An influence of the emission activity on the Co-57 transmission measurements was discovered (deviations up to 24.06 % between measured and true activity), whereas no influence of the emission activity on the CT attenuation correction was identified. The quantification accuracy was influenced by the applied calibration factor and by the object size. The PVE demonstrated a dependency on the sphere size, the position within the field of view, the reconstruction and correction algorithms, and the count statistics. Depending on the reconstruction algorithm, only ∼30-40 % of the true activity within a small sphere could be resolved. The iterative 3D reconstruction algorithms yielded substantially increased recovery values compared to the analytical and 2D iterative reconstruction algorithms (up to 70.46 % and 80.82 % recovery for the smallest and largest sphere, respectively, using iterative 3D reconstruction algorithms). The transmission measurement (CT or Co-57 source) used to correct for attenuation did not severely influence the PVE. The analysis of the quantification accuracy and the PVE revealed an influence of the object size, the reconstruction

  7. Quantification of cellular uptake of DNA nanostructures by qPCR.

    Science.gov (United States)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias; Sørensen, Rasmus Schøler; Schaffert, David; Kjems, Jørgen

    2014-05-15

DNA nanostructures facilitating drug delivery are likely soon to be realized. In the past few decades, programmed self-assembly of DNA building blocks has successfully been employed to construct sophisticated nanoscale objects. By conjugating functionalities to DNA, other molecules such as peptides, proteins and polymers can be precisely positioned on DNA nanostructures. This exceptional ability to produce modular nanoscale devices with tunable and controlled behavior has initiated an interest in employing DNA nanostructures for drug delivery. However, to achieve this, the relationship between cellular interactions and the structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed by quantitative polymerase chain reaction, allowing a linear dynamic range of detection of five orders of magnitude. We demonstrate the use of this method for high-throughput screening, which could prove efficient in identifying key features of DNA nanostructures that enable cell penetration. The method described here is suitable for quantification in in vitro uptake studies but should easily be extended to quantify DNA nanostructures in blood or tissue samples. Copyright © 2014 Elsevier Inc. All rights reserved.
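Absolute qPCR quantification of this kind is typically read off a standard curve; the sketch below shows the generic arithmetic. The dilution series and Cq values are invented, and the paper's exact protocol may differ:

```python
import math

# Cq values of known dilutions define Cq = slope * log10(copies) + intercept;
# unknowns are then read off the inverted curve.
standards = [(1e7, 12.1), (1e6, 15.4), (1e5, 18.7), (1e4, 22.0)]  # (copies, Cq)
n = len(standards)
xs = [math.log10(c) for c, _ in standards]
ys = [cq for _, cq in standards]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar
efficiency = 10 ** (-1 / slope) - 1   # ~1.0 means ~100% doubling per cycle

def copies_from_cq(cq):
    return 10 ** ((cq - intercept) / slope)

print(round(efficiency, 2))           # -> 1.01 for this -3.3 slope
print(f"{copies_from_cq(20.0):.2e}")  # copies in an unknown with Cq = 20
```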

  8. Development of Accident Scenarios and Quantification Methodology for RAON Accelerator

    International Nuclear Information System (INIS)

    Lee, Yongjin; Jae, Moosung

    2014-01-01

The RISP (Rare Isotope Science Project) accelerator will provide neutron-rich rare isotope (RI) and stable heavy-ion beams. The accelerator is classified as a radiation-producing system under the Nuclear Safety Law; it therefore requires strict operating procedures and safety assurance to prevent radiation exposure. To satisfy this condition, the potential risk of the accelerator needs to be evaluated from the design stage itself. Although some PSA studies have been conducted for accelerators, most of them focus not on general accident sequences but on simple descriptions of accidents. In this paper, general accident scenarios are developed with Event Trees, and a new quantification methodology for the Event Tree is derived. In this study, initiating events that may occur in the accelerator are selected. Using the selected initiating events, the accident scenarios of the accelerator facility are developed with Event Trees. These results can be used as basic data for future risk assessments of the accelerator. After analyzing the probability of each heading, quantification can be conducted and the significance of the accident result evaluated. Once accident scenarios for external events are developed, the risk assessment of the entire accelerator facility will be complete. The presented quantification technique can produce reliable data and thereby reduce the uncertainty of the Event Tree
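The basic Event Tree quantification step — multiplying the initiating-event frequency by the branch probabilities along each sequence — can be sketched as follows. Headings and numbers are hypothetical, not from the RAON study:

```python
# The frequency of each accident sequence is the initiating-event
# frequency multiplied by the branch probabilities along its path.

IE_PER_YEAR = 1e-2                         # hypothetical initiating event /yr

sequences = {
    "interlock works, no exposure":        [0.999],
    "interlock fails, shielding holds":    [1e-3, 0.99],
    "interlock fails, shielding degraded": [1e-3, 1e-2],
}

freqs = {}
for name, branch_probs in sequences.items():
    f = IE_PER_YEAR
    for p in branch_probs:
        f *= p                             # walk down the tree
    freqs[name] = f
    print(f"{name}: {f:.2e} /yr")
```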

  9. Use of quantification in cardiac reporting: How does it change the clinical result?

    International Nuclear Information System (INIS)

    Gnanasegaran, G.; Hilson, A.J.W.; Buscombe, J.R.

    2005-01-01

Many gamma camera systems are now sold with cardiac quantification packages, which are said to increase the accuracy of reporting. However, the use of such quantification packages may change the clinical report as read from the tomographic slices. The aim of this study was to quantify the differences between qualitative visual reporting and quantification. Stress and rest myocardial perfusion studies were quantitatively reported in 37 patients, comprising 333 segments of the heart (9 segments/patient). A defect was defined by a reduction in activity of >50% in a segment. For tomographic qualitative reporting, the data were reconstructed using iterative reconstruction with a Wiener smoothing filter. Quantification used an Emory bull's eye system with gender- and age-matched normal controls. The number of abnormal segments noted on qualitative reading was 119 at stress and 79 at rest. On the bull's eye plot, 98 abnormal segments were seen at stress and 76 at rest. Thirty-three segments (10%) were abnormal on qualitative reading alone and 7 (2%) were abnormal on bull's eye alone. Of the 55 segments reported as ischaemic on qualitative reading, 26 (48%) were normal on bull's eye, 13 of these in right coronary artery (RCA) territory segments. Of the 67 segments reported as infarct on qualitative reading, 10 (13%) were normal on bull's eye, 7 of these in RCA territory segments. There are significant differences in the results of reporting scans using a bull's eye plot, especially in identifying inferior wall ischaemia. Therefore, before using such a quantification method, a full assessment of the accuracy of each method should be performed. (author)
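The segmental defect rule quoted above (a >50% reduction in activity) can be illustrated with a small sketch; the segment counts are invented:

```python
def abnormal_segments(counts, threshold=0.5):
    """Indices of segments whose counts fall below `threshold` of the
    peak segment, i.e. a >50% reduction with the default threshold."""
    peak = max(counts)
    return [i for i, c in enumerate(counts) if c < threshold * peak]

# invented stress counts for the 9 standard segments of one patient:
stress = [95, 88, 40, 77, 91, 35, 82, 70, 66]
print(abnormal_segments(stress))  # -> [2, 5]
```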

  10. DNA imaging and quantification using chemiluminescent probes; Imagerie et quantification d'ADN par chimiluminescence

    Energy Technology Data Exchange (ETDEWEB)

    Dorner, G; Redjdal, N; Laniece, P; Siebert, R; Tricoire, H; Valentin, L [Groupe I.P.B., Experimental Research Division, Inst. de Physique Nucleaire, Paris-11 Univ., 91 - Orsay (France)

    1999-11-01

During this interdisciplinary study we have developed an ultra-sensitive and reliable imaging system for DNA labelled by chemiluminescence. Based on a liquid-nitrogen-cooled CCD, the system achieves sensitivities down to 10 fg/mm² of labelled DNA over a surface area of 25 × 25 cm² with sub-millimeter resolution. Commercially available chemiluminescent and enhancer molecules are compared and their reaction conditions optimized for the best signal-to-noise ratio. Double labelling was performed to verify quantification with radioactive probes. (authors) 1 fig.

  11. Using risk analysis in Health Impact Assessment: the impact of different relative risks for men and women in different socio-economic groups

    DEFF Research Database (Denmark)

    Nilunger, Louise; Diderichsen, Finn; Burström, Bo

    2004-01-01

    The aim of this study is to contribute to the emerging field of quantification of Health Impact Assessment (HIA), by analysing how different relative risks affect the burden of disease for various socio-economic groups (SES). Risk analysis, utilising attributable and impact fraction, raises several...... methodological considerations. The present study illustrates this by measuring the impact of changed distribution levels of smoking on lung cancer, ischemic heart disease (IHD), chronic obstructive lung disorder (COLD) and stroke for the highest and lowest socio-economic groups measured in disability adjusted...... the highest and lowest socio-economic groups may decrease by 75% or increase by 21% depending on the size of the relative risk. Assuming the same smoking prevalence for the lowest socio-economic group as for the highest (impact fraction), then the inequality may decrease by 7-26%. Consequently, the size...
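The attributable and impact fractions mentioned above follow standard formulas; below is a hedged sketch with illustrative prevalences and relative risks, not the study's data:

```python
# PAF removes the exposure entirely; the impact fraction compares the
# current prevalence with a counterfactual one (here, the lowest SES
# group adopting the highest group's smoking prevalence).

def paf(prevalence, rr):
    """Population attributable fraction (Levin's formula)."""
    return prevalence * (rr - 1) / (prevalence * (rr - 1) + 1)

def impact_fraction(p_now, p_new, rr):
    """Fraction of cases avoided if prevalence shifts from p_now to p_new."""
    return (p_now - p_new) * (rr - 1) / (p_now * (rr - 1) + 1)

# e.g. smoking prevalence 40% in the lowest SES group vs 20% in the
# highest, with an assumed RR = 10 for lung cancer:
print(round(paf(0.40, 10.0), 3))                    # -> 0.783
print(round(impact_fraction(0.40, 0.20, 10.0), 3))  # -> 0.391
```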

  12. Quantification of arbuscular mycorrhizal fungal DNA in roots: how important is material preservation?

    Science.gov (United States)

    Janoušková, Martina; Püschel, David; Hujslová, Martina; Slavíková, Renata; Jansa, Jan

    2015-04-01

    Monitoring populations of arbuscular mycorrhizal fungi (AMF) in roots is a pre-requisite for improving our understanding of AMF ecology and functioning of the symbiosis in natural conditions. Among other approaches, quantification of fungal DNA in plant tissues by quantitative real-time PCR is one of the advanced techniques with a great potential to process large numbers of samples and to deliver truly quantitative information. Its application potential would greatly increase if the samples could be preserved by drying, but little is currently known about the feasibility and reliability of fungal DNA quantification from dry plant material. We addressed this question by comparing quantification results based on dry root material to those obtained from deep-frozen roots of Medicago truncatula colonized with Rhizophagus sp. The fungal DNA was well conserved in the dry root samples with overall fungal DNA levels in the extracts comparable with those determined in extracts of frozen roots. There was, however, no correlation between the quantitative data sets obtained from the two types of material, and data from dry roots were more variable. Based on these results, we recommend dry material for qualitative screenings but advocate using frozen root materials if precise quantification of fungal DNA is required.

  13. Forest Carbon Leakage Quantification Methods and Their Suitability for Assessing Leakage in REDD

    Directory of Open Access Journals (Sweden)

    Sabine Henders

    2012-01-01

    Full Text Available This paper assesses quantification methods for carbon leakage from forestry activities for their suitability in leakage accounting in a future Reducing Emissions from Deforestation and Forest Degradation (REDD mechanism. To that end, we first conducted a literature review to identify specific pre-requisites for leakage assessment in REDD. We then analyzed a total of 34 quantification methods for leakage emissions from the Clean Development Mechanism (CDM, the Verified Carbon Standard (VCS, the Climate Action Reserve (CAR, the CarbonFix Standard (CFS, and from scientific literature sources. We screened these methods for the leakage aspects they address in terms of leakage type, tools used for quantification and the geographical scale covered. Results show that leakage methods can be grouped into nine main methodological approaches, six of which could fulfill the recommended REDD leakage requirements if approaches for primary and secondary leakage are combined. The majority of methods assessed, address either primary or secondary leakage; the former mostly on a local or regional and the latter on national scale. The VCS is found to be the only carbon accounting standard at present to fulfill all leakage quantification requisites in REDD. However, a lack of accounting methods was identified for international leakage, which was addressed by only two methods, both from scientific literature.

  14. Simultaneous digital quantification and fluorescence-based size characterization of massively parallel sequencing libraries.

    Science.gov (United States)

    Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H

    2013-08-01

    Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
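For the quantification half of the assay, ddPCR concentration follows from the fraction of positive droplets via Poisson statistics. This is the generic ddPCR arithmetic, not the paper's size-correlation method, and the ~0.85 nL droplet volume is an assumed nominal value:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    """Target concentration (copies/uL) from droplet counts, assuming
    Poisson partitioning of template molecules across droplets."""
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / (droplet_nl * 1e-3)  # convert nL -> uL

# e.g. 4,000 positive droplets out of 15,000 accepted:
print(round(ddpcr_copies_per_ul(4000, 15000)))  # -> 365 copies/uL
```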

  15. Application of the third theory of quantification in coal and gas outburst forecast

    Energy Technology Data Exchange (ETDEWEB)

    Wu, C.; Qin, Y.; Zhang, X. [China University of Mining and Technology, Xuzhou (China). School of Resource and Geoscience Engineering

    2004-12-01

The essential principles of the third theory of quantification are discussed, and the concept and calculation method of reaction degree are put forward, which extend the range of application and the scientific rigor of the primary reaction. Taking the Zhongmacun mine as an example, and based on a comprehensive analysis of the rules of gas geology and a survey of the geological factors affecting coal and gas outburst, the paper combines statistical units with the third theory of quantification, screens out 8 sensitive geological factors from 11 geological indexes, and carries out gas geology regionalization of the exploited area of Zhongmacun according to the research results. The practice shows that it is feasible to apply the third theory of quantification to gas geology, which offers a new way to screen the sensitive geological factors for gas outburst forecasting. 3 refs., 3 figs., 3 tabs.

  16. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...
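A minimal non-intrusive polynomial chaos (NIPC) sketch for a single Gaussian input, using Gauss-Hermite quadrature to recover the output mean and variance; the one-dimensional model here is a stand-in, not a spacecraft system model:

```python
import math
import numpy as np

def model(x):
    return x ** 2 + 0.5 * x          # black-box model of one uncertain input

# Sample the model at Gauss-Hermite nodes and project onto probabilists'
# Hermite polynomials He_k (orthogonal under the N(0,1) measure).
order = 4
nodes, weights = np.polynomial.hermite_e.hermegauss(order + 1)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0,1) measure

coeffs = []
for k in range(order + 1):
    he_k = np.polynomial.hermite_e.hermeval(nodes, [0.0] * k + [1.0])
    coeffs.append(np.sum(weights * model(nodes) * he_k) / math.factorial(k))

mean = coeffs[0]                                   # E[model(X)]
variance = sum(c ** 2 * math.factorial(k)          # Var[model(X)]
               for k, c in enumerate(coeffs) if k > 0)
print(round(mean, 6), round(variance, 6))          # -> 1.0 2.25
```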

  17. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  18. Development of hydrate risk quantification in oil and gas production

    Science.gov (United States)

    Chaudhari, Piyush N.

    order to reduce the parametric study that may require a long duration of time using The Colorado School of Mines Hydrate Kinetic Model (CSMHyK). The evolution of the hydrate plugging risk along flowline-riser systems is modeled for steady state and transient operations considering the effect of several critical parameters such as oil-hydrate slip, duration of shut-in, and water droplet size on a subsea tieback system. This research presents a novel platform for quantification of the hydrate plugging risk, which in-turn will play an important role in improving and optimizing current hydrate management strategies. The predictive strength of the hydrate risk quantification and hydrate prediction models will have a significant impact on flow assurance engineering and design with respect to building safe and efficient hydrate management techniques for future deep-water developments.

  19. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Science.gov (United States)

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10(9) to 2 × 10(3) copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10(8) to 2 × 10(3) copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. To detect and quantify a wide range of begomoviruses, five duplex

  20. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

Full Text Available Abstract Background Begomovirus is a genus of phytopathogenic single-stranded DNA viruses, transmitted by the whitefly Bemisia tabaci. This genus includes emerging and economically significant viruses such as those associated with Tomato Yellow Leaf Curl Disease, for which diagnostic tools are needed to prevent dispersion and new introductions. Five real-time PCRs with an internal tomato reporter gene were developed for accurate detection and quantification of monopartite begomoviruses, including two strains of the Tomato yellow leaf curl virus (TYLCV; Mld and IL strains), the Tomato leaf curl Comoros virus-like viruses (ToLCKMV-like viruses) and the two molecules of the bipartite Potato yellow mosaic virus. These diagnostic tools have a unique standard quantification, comprising the targeted viral and internal report amplicons. These duplex real-time PCRs were applied to artificially inoculated plants to monitor and compare their viral development. Results Real-time PCRs were optimized for accurate detection and quantification over a range of 2 × 10⁹ to 2 × 10³ copies of genomic viral DNA/μL for TYLCV-Mld, TYLCV-IL and PYMV-B and 2 × 10⁸ to 2 × 10³ copies of genomic viral DNA/μL for PYMV-A and ToLCKMV-like viruses. These real-time PCRs were applied to artificially inoculated plants and viral loads were compared at 10, 20 and 30 days post-inoculation. Different patterns of viral accumulation were observed between the bipartite and the monopartite begomoviruses. Interestingly, PYMV accumulated more viral DNA at each date for both genomic components compared to all the monopartite viruses. Also, PYMV reached its highest viral load at 10 dpi, contrary to the other viruses (20 dpi). The accumulation kinetics of the two strains of emergent TYLCV differed from the ToLCKMV-like viruses in the higher quantities of viral DNA produced in the early phase of the infection and in the shorter time to reach this peak viral load. Conclusions To detect and

  1. Preliminary study on computer automatic quantification of brain atrophy

    International Nuclear Information System (INIS)

    Li Chuanfu; Zhou Kangyuan

    2006-01-01

Objective: To study the variability of normal brain volume with sex and age, and to put forward an objective standard for computer-automated quantification of brain atrophy. Methods: The cranial volume, brain volume and brain parenchymal fraction (BPF) of 487 cases of brain atrophy (310 males, 177 females) and 1901 normal subjects (993 males, 908 females) were calculated with a newly developed algorithm for automatic quantification of brain atrophy. Using polynomial curve fitting, the mathematical relationship of BPF with age in normal subjects was analyzed. Results: The cranial volume, brain volume and BPF of normal subjects were (1 271 322 ± 128 699) mm³, (1 211 725 ± 122 077) mm³ and (95.3471 ± 2.3453)%, respectively, and those of atrophy subjects were (1 276 900 ± 125 180) mm³, (1 203 400 ± 117 760) mm³ and (91.8115 ± 2.3035)%, respectively. The difference in BPF between the two groups was extremely significant (P < 0.05). The expression P(x) = -0.0008x² + 0.0193x + 96.9999 accurately describes the mathematical relationship between BPF and age in normal subjects (lower limit of the 95% CI: y = -0.0008x² + 0.0184x + 95.1090). Conclusion: The lower limit of the 95% confidence interval of the mathematical relationship between BPF and age could be used as an objective criterion for computer-automated quantification of brain atrophy. (authors)
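Applying the reported criterion is straightforward arithmetic: compute the brain parenchymal fraction and compare it with the age-specific lower limit of the 95% CI quoted in the abstract. The volumes below are hypothetical:

```python
def bpf(brain_mm3, cranial_mm3):
    """Brain parenchymal fraction, in percent."""
    return 100.0 * brain_mm3 / cranial_mm3

def atrophy_suspected(brain_mm3, cranial_mm3, age):
    # lower limit of the 95% CI for normal BPF at this age (from the abstract)
    lower = -0.0008 * age ** 2 + 0.0184 * age + 95.1090
    return bpf(brain_mm3, cranial_mm3) < lower

# hypothetical 70-year-old: brain 1,150,000 mm^3 in a 1,270,000 mm^3 cranium
print(round(bpf(1150000, 1270000), 2))          # -> 90.55
print(atrophy_suspected(1150000, 1270000, 70))  # -> True (below the limit)
```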

  2. Identification of flow paths and quantification of return flow volumes and timing at field scale

    Science.gov (United States)

    Claes, N.; Paige, G. B.; Parsekian, A.

    2017-12-01

Flood irrigation, which constitutes a large part of agricultural water use, accounts for a significant amount of the water that is diverted from western streams. Return flow, the portion of the water applied to irrigated areas that returns to the stream, is important for maintaining base flows in streams and the ecological function of riparian zones and wetlands hydrologically linked with streams. Predicting the timing and volumes of return flow during and after flood irrigation poses a challenge due to the heterogeneity of pedogenic and soil physical factors that influence vadose zone processes. In this study, we quantify volumes of return flow and potential pathways in the subsurface through a vadose zone flow model that is informed by both hydrological and geophysical observations in a Bayesian setting. We couple a two-dimensional vadose zone flow model, through a Bayesian Markov Chain Monte Carlo approach, with time-lapse ERT and borehole NMR datasets collected during and after flood irrigation experiments, and with soil physical lab analysis. The combination of synthetic models and field observations leads to flow path identification and allows quantification of the volumes, timing, and associated uncertainties of subsurface return flow stemming from flood irrigation. Quantifying the impact of soil heterogeneity enables us to translate these results to other sites and to predict return flow under different soil physical settings. This is key when managing irrigation water resources, where predicted outcomes of different scenarios have to be evaluated.
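The Bayesian MCMC coupling described above can be illustrated, in heavily reduced form, with a toy Metropolis sampler for a single soil parameter. The linear forward model and all numbers are stand-ins for the full 2-D vadose-zone model:

```python
import math
import random

# Infer one parameter (log10 saturated conductivity, say) from noisy
# synthetic "observations" of a stand-in forward model.
random.seed(1)
TRUE_LOGK = -4.0

def forward(logk):              # stand-in for the vadose-zone flow model
    return 2.0 * logk + 1.0

obs = [forward(TRUE_LOGK) + random.gauss(0.0, 0.2) for _ in range(20)]

def log_post(logk):             # flat prior on [-8, 0], Gaussian likelihood
    if not -8.0 <= logk <= 0.0:
        return -math.inf
    return -sum((o - forward(logk)) ** 2 for o in obs) / (2.0 * 0.2 ** 2)

chain, x = [], -6.0             # start deliberately away from the truth
for _ in range(5000):
    prop = x + random.gauss(0.0, 0.1)
    if math.log(random.random()) < log_post(prop) - log_post(x):
        x = prop                # accept the proposal
    chain.append(x)

posterior = chain[1000:]        # discard burn-in
print(round(sum(posterior) / len(posterior), 1))  # posterior mean, ~ -4.0
```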

  3. Optical coherence tomography assessment and quantification of intracoronary thrombus: Status and perspectives

    International Nuclear Information System (INIS)

    Porto, Italo; Mattesini, Alessio; Valente, Serafina; Prati, Francesco; Crea, Filippo; Bolognese, Leonardo

    2015-01-01

Coronary angiography is the “gold standard” imaging technique in interventional cardiology and is still widely used to guide interventions. A major drawback of this technique, however, is that it is inaccurate in the evaluation and quantification of intracoronary thrombus burden, a critical prognosticator and predictor of intraprocedural complications in acute coronary syndromes. The introduction of optical coherence tomography (OCT) holds the promise of overcoming this important limitation, as near-infrared light is uniquely sensitive to hemoglobin, the pigment of red blood cells trapped in the thrombus. This narrative review will focus on the use of OCT for the assessment, evaluation and quantification of intracoronary thrombosis. - Highlights: • Thrombotic burden in acute coronary syndromes is not adequately evaluated by standard coronary angiography, whereas optical coherence tomography is exquisitely sensitive to the hemoglobin contained in red blood cells and can be used to precisely quantify thrombus. • Both research and clinical applications have been developed using the OCT-based evaluation of thrombus. In particular, whereas precise quantification scores are useful for comparing antithrombotic therapies, both pharmacological and mechanical, in randomized trials, the most important practical applications of OCT-based assessment of thrombus are the identification of culprit lesions in the context of diffuse atheromata in acute coronary syndromes, and the so-called “delayed stenting” strategies. • Improvements in 3D rendering techniques are on the verge of revolutionizing OCT-based thrombus assessment, allowing extremely precise quantification of the thrombotic burden

  4. Optical coherence tomography assessment and quantification of intracoronary thrombus: Status and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Porto, Italo, E-mail: italo.porto@gmail.com [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy); Mattesini, Alessio; Valente, Serafina [Interventional Cardiology Unit, Careggi Hospital, Florence (Italy); Prati, Francesco [Interventional Cardiology San Giovanni Hospital, Rome (Italy); CLI foundation (Italy); Crea, Filippo [Department of Cardiovascular Sciences, Catholic University of the Sacred Heart, Rome (Italy); Bolognese, Leonardo [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy)

    2015-04-15

Coronary angiography is the “gold standard” imaging technique in interventional cardiology and is still widely used to guide interventions. A major drawback of this technique, however, is that it is inaccurate in the evaluation and quantification of intracoronary thrombus burden, a critical prognosticator and predictor of intraprocedural complications in acute coronary syndromes. The introduction of optical coherence tomography (OCT) holds the promise of overcoming this important limitation, as near-infrared light is uniquely sensitive to hemoglobin, the pigment of red blood cells trapped in the thrombus. This narrative review will focus on the use of OCT for the assessment, evaluation and quantification of intracoronary thrombosis. - Highlights: • Thrombotic burden in acute coronary syndromes is not adequately evaluated by standard coronary angiography, whereas optical coherence tomography is exquisitely sensitive to the hemoglobin contained in red blood cells and can be used to precisely quantify thrombus. • Both research and clinical applications have been developed using the OCT-based evaluation of thrombus. In particular, whereas precise quantification scores are useful for comparing antithrombotic therapies, both pharmacological and mechanical, in randomized trials, the most important practical applications of OCT-based assessment of thrombus are the identification of culprit lesions in the context of diffuse atheromata in acute coronary syndromes, and the so-called “delayed stenting” strategies. • Improvements in 3D rendering techniques are on the verge of revolutionizing OCT-based thrombus assessment, allowing extremely precise quantification of the thrombotic burden.

  5. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    Science.gov (United States)

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main
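Influx quantification of dynamic FDG PET, once an input function is available, is commonly done with the Patlak graphical method. The following is a generic sketch of that calculation, not the implementation in the 3D Slicer extension described above; all variable names and the synthetic curves are illustrative.

```python
import numpy as np

def patlak_ki(t, cp, ct, t_star=10.0):
    """Estimate the influx constant Ki (and intercept V) with the Patlak
    graphical method: Ct(t)/Cp(t) = Ki * (int Cp dt)/Cp(t) + V,
    fitted for t >= t_star where the plot is linear."""
    # cumulative integral of the plasma input function (trapezoidal rule)
    int_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
    mask = t >= t_star
    x = int_cp[mask] / cp[mask]          # "Patlak time"
    y = ct[mask] / cp[mask]
    ki, v = np.polyfit(x, y, 1)          # linear fit: slope = Ki, intercept = V
    return ki, v

# synthetic check: build a tissue curve with known Ki and V
t = np.linspace(0, 60, 121)                      # minutes
cp = 100 * np.exp(-0.1 * t) + 5                  # toy plasma input function
int_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
ct = 0.03 * int_cp + 0.5 * cp                    # Ki = 0.03, V = 0.5
ki, v = patlak_ki(t, cp, ct)
```

Applied voxel-wise, the fitted slope yields the quantification (influx) map the record refers to.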

  6. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
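The first step described above, variance-based global sensitivity analysis, can be illustrated with a pick-freeze Monte Carlo estimate of first-order Sobol indices. This is a generic sketch on a toy linear model, not the authors' implementation; the model, sample size, and input distribution are all illustrative.

```python
import numpy as np

def first_order_sobol(f, dim, n=200_000, seed=0):
    """Pick-freeze estimate of first-order Sobol indices S_i for a model
    f(X) with independent standard-normal inputs:
    S_i = Cov(Y, Y_i) / Var(Y), where Y_i reuses column i of X but takes
    fresh draws for all other inputs."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal((n, dim))
    b = rng.standard_normal((n, dim))
    y = f(a)
    var_y = y.var()
    s = np.empty(dim)
    for i in range(dim):
        ab = b.copy()
        ab[:, i] = a[:, i]               # freeze input i, redraw the rest
        yi = f(ab)
        s[i] = np.mean((y - y.mean()) * (yi - yi.mean())) / var_y
    return s

# toy model Y = X1 + 2*X2: analytic indices are S1 = 1/5, S2 = 4/5
s = first_order_sobol(lambda x: x[:, 0] + 2 * x[:, 1], dim=2)
```

Inputs whose index falls below a chosen threshold can then be fixed at nominal values, shrinking the stochastic dimension as the abstract describes.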

  8. Quantification of the sequestration of indium 111 labelled platelets

    International Nuclear Information System (INIS)

    Najean, Y.; Picard, N.; Dufour, V.; Rain, J.D.

    1988-01-01

    A simple method is proposed for accurate quantification of the splenic and/or hepatic sequestration of 111In-labelled platelets. It could allow a better prediction of the efficiency of splenectomy in idiopathic thrombocytopenic purpura.

  9. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
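The graph description above (a thallus as a hierarchy of nodes and edges) can be sketched generically. The following is not MorphoSnake itself but a toy illustration of the kind of measurement it automates: edge lengths and branching angles on a small skeleton graph with hypothetical coordinates.

```python
import math

# hypothetical skeleton of a tiny thallus: node -> (x, y), plus edges
nodes = {"base": (0, 0), "fork": (0, 4), "tipL": (-3, 7), "tipR": (3, 7)}
edges = [("base", "fork"), ("fork", "tipL"), ("fork", "tipR")]

def edge_length(a, b):
    (x1, y1), (x2, y2) = nodes[a], nodes[b]
    return math.hypot(x2 - x1, y2 - y1)

def branch_angle(center, a, b):
    """Angle (degrees) at `center` between the branches toward `a` and `b`."""
    cx, cy = nodes[center]
    v1 = (nodes[a][0] - cx, nodes[a][1] - cy)
    v2 = (nodes[b][0] - cx, nodes[b][1] - cy)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

lengths = {e: edge_length(*e) for e in edges}
angle = branch_angle("fork", "tipL", "tipR")
```

Repeating such measurements at each level of the node hierarchy yields the homologous width, length, angle and sinuosity traits the abstract describes.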

  10. Comparative quantification of alcohol exposure as risk factor for global burden of disease.

    Science.gov (United States)

    Rehm, Jürgen; Klotsche, Jens; Patra, Jayadeep

    2007-01-01

    Alcohol has been identified as one of the most important risk factors in the burden experienced as a result of disease. The objective of the present contribution is to establish a framework to comparatively quantify alcohol exposure as it is relevant for burden of disease. Different key indicators are combined to derive this quantification. First, adult per capita consumption, composed of recorded and unrecorded consumption, yields the best overall estimate of alcohol exposure for a country or region. Second, survey information is used to allocate the per capita consumption into sex and age groups. Third, an index for detrimental patterns of drinking is used to determine the additional impact on injury and cardiovascular burden. The methodology is applied to estimate global alcohol exposure for the year 2002. Finally, assumptions and potential problems of the approach are discussed. Copyright (c) 2007 John Wiley & Sons, Ltd.
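The first two steps of the framework (total adult per capita consumption, then survey-based allocation to sex-age groups) amount to simple proportional arithmetic. A sketch with purely illustrative numbers and group definitions:

```python
# adult per capita consumption (APC, litres of pure alcohol per adult per
# year) = recorded + unrecorded consumption (illustrative values)
recorded, unrecorded = 8.0, 1.5
apc = recorded + unrecorded

# survey-derived shares of total consumption (sum to 1) and population
# shares for each sex-age group (all numbers hypothetical)
survey_share = {("m", "15-29"): 0.25, ("m", "30+"): 0.45,
                ("f", "15-29"): 0.10, ("f", "30+"): 0.20}
pop_share = {("m", "15-29"): 0.20, ("m", "30+"): 0.30,
             ("f", "15-29"): 0.20, ("f", "30+"): 0.30}

# per capita consumption within each group: scale APC by the ratio of the
# group's consumption share to its population share
group_apc = {g: apc * survey_share[g] / pop_share[g] for g in survey_share}
```

The population-weighted group values reproduce the overall APC by construction, which is what makes the allocation internally consistent.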

  11. Contrast enhanced CT-scans are not comparable to non-enhanced scans in emphysema quantification

    International Nuclear Information System (INIS)

    Heussel, C.P.; Kappes, J.; Hantusch, R.; Hartlieb, S.; Weinheimer, O.; Kauczor, H.-U.; Eberhardt, R.

    2010-01-01

    Systemic, interventional and surgical treatments have opened new avenues in the treatment of emphysema. For longitudinal therapy monitoring, and as end-points for clinical trials, quantification of the disease is necessary. Parameters that are sensitive, easy to measure, stable and reproducible have to be characterized. One factor that might affect emphysema quantification is IV contrast enhancement, which may itself be clinically indicated. To determine whether the contrast-enhanced scan is also suited for emphysema quantification or whether an additional scan is necessary, a retrospective analysis was done of 12 adult patients who underwent both a clinically indicated non-enhanced and a contrast-enhanced thin-section MSCT within a week (median 0 days, range 0-4 days). The in-house YACTA software was used for automatic quantification of lung and emphysema volume, emphysema index, mean lung density, and the 5th, 10th and 15th density percentiles. After IV contrast administration, the median CT-derived lung volume decreased mildly by 1.1%, while the median emphysema volume decreased by a relevant 11%. This results in a decrease of the median emphysema index by 9%. The median lung density (15th percentile) increased after contrast application by 18 HU (9 HU). CT quantification thus delivers emphysema values that are clearly affected by IV contrast application: the detected changes reflect the higher density of the lung parenchyma after contrast administration, which reduces the amount of quantified emphysema and increases the measured lung density. In longitudinal analyses, non-enhanced scans should serve as the reference; enhanced scans cannot be used.
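Density-based emphysema quantification of the kind performed by software such as YACTA typically combines a relative-area threshold (fraction of lung voxels below about -950 HU), the mean lung density, and low density percentiles such as Perc15. A generic sketch of these metrics (not the YACTA implementation; the toy voxel histogram is illustrative):

```python
import numpy as np

def emphysema_metrics(hu, threshold=-950):
    """Density-based emphysema metrics from lung CT voxel values (HU).
    Returns the emphysema index (% of voxels below the threshold), the
    mean lung density, and the 15th percentile density (Perc15)."""
    hu = np.asarray(hu, dtype=float)
    ei = 100.0 * np.mean(hu < threshold)
    return ei, hu.mean(), np.percentile(hu, 15)

# toy lung histogram: exactly 20% of voxels below -950 HU
rng = np.random.default_rng(1)
hu = np.concatenate([rng.uniform(-1000, -951, 2000),   # emphysematous voxels
                     rng.uniform(-949, -700, 8000)])   # denser parenchyma
ei, mld, perc15 = emphysema_metrics(hu)
```

Contrast enhancement shifts the whole parenchymal histogram toward higher HU, which is exactly why the emphysema index falls and the percentile density rises in the study above.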

  12. Inter-laboratory assessment of different digital PCR platforms for quantification of human cytomegalovirus DNA.

    Science.gov (United States)

    Pavšič, Jernej; Devonshire, Alison; Blejec, Andrej; Foy, Carole A; Van Heuverswyn, Fran; Jones, Gerwyn M; Schimmel, Heinz; Žel, Jana; Huggett, Jim F; Redshaw, Nicholas; Karczmarczyk, Maria; Mozioğlu, Erkan; Akyürek, Sema; Akgöz, Müslüm; Milavec, Mojca

    2017-04-01

    Quantitative PCR (qPCR) is an important tool in pathogen detection. However, the use of different qPCR components, calibration materials and DNA extraction methods reduces comparability between laboratories, which can result in false diagnosis and discrepancies in patient care. The wider establishment of a metrological framework for nucleic acid tests could improve the degree of standardisation of pathogen detection and the quantification methods applied in the clinical context. To achieve this, accurate methods need to be developed and implemented as reference measurement procedures, and to facilitate characterisation of suitable certified reference materials. Digital PCR (dPCR) has already been used for pathogen quantification by analysing nucleic acids. Although dPCR has the potential to provide robust and accurate quantification of nucleic acids, further assessment of its actual performance characteristics is needed before it can be implemented in a metrological framework, and to allow adequate estimation of measurement uncertainties. Here, four laboratories demonstrated reproducibility (expanded measurement uncertainties below 15%) of dPCR for quantification of DNA from human cytomegalovirus, with no calibration to a common reference material. Using whole-virus material and extracted DNA, an intermediate precision (coefficients of variation below 25%) between three consecutive experiments was noted. Furthermore, discrepancies in estimated mean DNA copy number concentrations between laboratories were less than twofold, with DNA extraction as the main source of variability. These data demonstrate that dPCR offers a repeatable and reproducible method for quantification of viral DNA, and due to its satisfactory performance should be considered as candidate for reference methods for implementation in a metrological framework.
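Digital PCR derives an absolute copy number concentration from the fraction of positive partitions via Poisson correction, which is what makes it calibration-free as described above. A minimal sketch of that calculation (partition count and partition volume are illustrative):

```python
import math

def dpcr_concentration(positive, total, partition_volume_ul):
    """Poisson-corrected copy concentration from a digital PCR run.
    lambda = -ln(1 - p) is the mean number of copies per partition, where
    p is the fraction of positive partitions; dividing by the partition
    volume gives copies per microlitre of reaction."""
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / partition_volume_ul

# example: 10,000 partitions of 0.85 nL each, 3,000 of them positive
conc = dpcr_concentration(3000, 10000, 0.85e-3)   # copies/uL of reaction
```

The correction matters because a positive partition may contain more than one copy; using the raw positive fraction alone would underestimate the concentration.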

  13. Quantification of the effects of dependence on human error probabilities

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1980-01-01

    In estimating the probabilities of human error in the performance of a series of tasks in a nuclear power plant, the situation-specific characteristics of the series must be considered. A critical factor not to be overlooked in this estimation is the dependence or independence that pertains to any of the several pairs of task performances. In discussing the quantification of the effects of dependence, the event tree symbology described will be used. In any series of tasks, the only dependence considered for quantification in this document is that existing between the task of interest and the immediately preceding task. Tasks performed earlier in the series may have some effect on the end task, but this effect is considered negligible.
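The dependence model of Swain's THERP handbook (NUREG/CR-1278), from which this record derives, adjusts the conditional error probability of a task according to the level of dependence on the immediately preceding task. A sketch of the standard dependence equations, with an illustrative nominal error probability:

```python
def conditional_hep(p, level):
    """Conditional human error probability of a task given failure of the
    immediately preceding task, using the THERP dependence equations
    (Swain & Guttmann, NUREG/CR-1278)."""
    formulas = {
        "zero":     lambda p: p,                 # fully independent tasks
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,               # failure propagates fully
    }
    return formulas[level](p)

# a nominal HEP of 0.01 rises sharply as the dependence level increases
probs = {lvl: conditional_hep(0.01, lvl)
         for lvl in ("zero", "low", "moderate", "high", "complete")}
```

Even "low" dependence raises a 0.01 nominal probability to about 0.06, which is why ignoring dependence between consecutive tasks can badly underestimate series failure probabilities.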

  14. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jürgen; Ravasi, Timothy

    2016-01-01

    A novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types

  15. A posteriori uncertainty quantification of PIV-based pressure data

    NARCIS (Netherlands)

    Azijli, I.; Sciacchitano, A.; Ragni, D.; Palha Da Silva Clérigo, A.; Dwight, R.P.

    2016-01-01

    A methodology for a posteriori uncertainty quantification of pressure data retrieved from particle image velocimetry (PIV) is proposed. It relies upon the Bayesian framework, where the posterior distribution (probability distribution of the true velocity, given the PIV measurements) is obtained from

  16. Quantification in dynamic and small-animal positron emission tomography

    NARCIS (Netherlands)

    Disselhorst, Johannes Antonius

    2011-01-01

    This thesis covers two aspects of positron emission tomography (PET) quantification. The first section addresses the characterization and optimization of a small-animal PET/CT scanner. The sensitivity and resolution as well as various parameters affecting image quality (reconstruction settings, type

  17. Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.

    Science.gov (United States)

    Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H

    2014-08-01

    This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy are also explored.

  18. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and the main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.

  19. Towards a new method for the quantification of metabolites in the biological sample

    International Nuclear Information System (INIS)

    Neugnot, B.

    2005-03-01

    The quantification of metabolites is a key step in drug development. The aim of this Ph.D. work was to study the feasibility of a new method for this quantification, in the biological sample, without the drawbacks (cost, time, ethics) of the classical quantification methods based on metabolite synthesis or administration of the radiolabelled drug to man. Our strategy consists of determining the response factor, in mass spectrometry, of the metabolites. This approach is based on tritium labelling of the metabolites, ex vivo, by isotopic exchange. The labelling step was studied with deuterium. Metabolites of a model drug, recovered from in vitro or urinary samples, were labelled in three ways (Crabtree's catalyst ID2, deuterated trifluoroacetic acid or rhodium chloride ID20). Then, the transposition to tritium labelling was studied, and the first results are very promising for the ultimate validation of the method. (author)

  20. RSEM: accurate transcript quantification from RNA-Seq data with or without a reference genome

    Directory of Open Access Journals (Sweden)

    Dewey Colin N

    2011-08-01

    Background: RNA-Seq is revolutionizing the way transcript abundances are measured. A key challenge in transcript quantification from RNA-Seq data is the handling of reads that map to multiple genes or isoforms. This issue is particularly important for quantification with de novo transcriptome assemblies in the absence of sequenced genomes, as it is difficult to determine which transcripts are isoforms of the same gene. A second significant issue is the design of RNA-Seq experiments, in terms of the number of reads, read length, and whether reads come from one or both ends of cDNA fragments. Results: We present RSEM, a user-friendly software package for quantifying gene and isoform abundances from single-end or paired-end RNA-Seq data. RSEM outputs abundance estimates, 95% credibility intervals, and visualization files, and can also simulate RNA-Seq data. In contrast to other existing tools, the software does not require a reference genome. Thus, in combination with a de novo transcriptome assembler, RSEM enables accurate transcript quantification for species without sequenced genomes. On simulated and real data sets, RSEM has superior or comparable performance to quantification methods that rely on a reference genome. Taking advantage of RSEM's ability to effectively use ambiguously-mapping reads, we show that accurate gene-level abundance estimates are best obtained with large numbers of short single-end reads. On the other hand, estimates of the relative frequencies of isoforms within single genes may be improved through the use of paired-end reads, depending on the number of possible splice forms for each gene. Conclusions: RSEM is an accurate and user-friendly software tool for quantifying transcript abundances from RNA-Seq data. As it does not rely on the existence of a reference genome, it is particularly useful for quantification with de novo transcriptome assemblies. In addition, RSEM has enabled valuable guidance for cost
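RSEM's handling of ambiguously-mapping reads rests on an expectation-maximization model. The following toy EM captures only the core idea, fractional assignment of multi-mapping reads in proportion to current abundance estimates; it is a drastic simplification of RSEM's actual generative model (no read lengths, fragment models, or credibility intervals), and the read data are invented.

```python
from collections import defaultdict

def em_abundance(reads, transcripts, iters=200):
    """Toy EM in the spirit of RSEM: each read lists the transcripts it is
    compatible with; the E-step fractionally assigns multi-mapping reads
    in proportion to current abundances, the M-step re-estimates them."""
    theta = {t: 1.0 / len(transcripts) for t in transcripts}
    for _ in range(iters):
        counts = defaultdict(float)
        for compat in reads:
            z = sum(theta[t] for t in compat)
            for t in compat:
                counts[t] += theta[t] / z        # fractional assignment
        theta = {t: counts[t] / len(reads) for t in transcripts}
    return theta

# 3 reads unique to A, 1 unique to B, 2 ambiguous between A and B;
# the maximum-likelihood split is theta_A = 0.75, theta_B = 0.25
reads = [("A",), ("A",), ("A",), ("B",), ("A", "B"), ("A", "B")]
theta = em_abundance(reads, ["A", "B"])
```

The ambiguous reads end up apportioned 3:1, mirroring the evidence from the uniquely mapping reads, which is the behavior that lets RSEM use multi-mapping reads instead of discarding them.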

  1. SU-D-303-03: Impact of Uncertainty in T1 Measurements On Quantification of Dynamic Contrast Enhanced MRI

    Energy Technology Data Exchange (ETDEWEB)

    Aryal, M; Cao, Y [The University of Michigan, Ann Arbor, MI (United States)

    2015-06-15

    Purpose: Quantification of dynamic contrast enhanced (DCE) MRI requires measurement of the native longitudinal relaxation time (T1). This study aimed to assess uncertainty in T1 measurements using two different methods. Methods and Materials: Brain MRI scans were performed on a 3T scanner in 9 patients who had low grade/benign tumors and partial brain radiotherapy without chemotherapy at pre-RT, week-3 during RT (wk-3), end-RT, and 1, 6 and 18 months after RT. T1-weighted images were acquired using gradient echo sequences with 1) 2 different flip angles (5° and 15°), and 2) 5 variable TRs (100–2000 ms). After creating quantitative T1 maps, average T1 was calculated in regions of interest (ROI) which were distant from tumors and received a total accumulated radiation dose < 5 Gy at wk-3. ROIs included left and right normal putamen and thalamus (gray matter: GM), and frontal and parietal white matter (WM). Since there were no significant T1 changes, or even a trend of change, from pre-RT to wk-3 in these ROIs, a relative repeatability coefficient (RC) of T1 was estimated in each ROI as a measure of uncertainty using the data pre-RT and at wk-3. The individual T1 changes at later time points were evaluated against the estimated RCs. Results: The two-flip-angle method produced small RCs in GM (9.7–11.7%) but large RCs in WM (12.2–13.6%) compared to the saturation-recovery (SR) method (11.0–17.7% for GM and 7.5–11.2% for WM). More than 81% of individual T1 changes were within the T1 uncertainty ranges defined by the RCs. Conclusion: Our study suggests that the impact of T1 uncertainty on physiological parameters derived from DCE MRI is not negligible. A short scan with 2 flip angles is able to achieve repeatability of T1 estimates similar to a long scan with 5 different TRs, and is desirable to integrate into the DCE protocol. The present study was supported by the National Institutes of Health (NIH) under grant numbers UO1 CA183848 and RO1 NS064973.
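A two-flip-angle T1 estimate follows from linearizing the spoiled gradient-echo (SPGR) signal equation. A sketch of that calculation with a round-trip check (sequence parameters and the tissue T1 value are illustrative, not the study's):

```python
import math

def t1_from_two_flip_angles(s1, s2, a1_deg, a2_deg, tr_ms):
    """Variable-flip-angle T1 from two spoiled gradient-echo signals,
    using the linear form S/sin(a) = E1 * S/tan(a) + M0*(1 - E1),
    where E1 = exp(-TR/T1)."""
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    x1, y1 = s1 / math.tan(a1), s1 / math.sin(a1)
    x2, y2 = s2 / math.tan(a2), s2 / math.sin(a2)
    e1 = (y2 - y1) / (x2 - x1)          # slope of the two-point line
    return -tr_ms / math.log(e1)

def spgr_signal(m0, t1_ms, tr_ms, a_deg):
    """Forward SPGR signal model, used here to synthesize test data."""
    e1 = math.exp(-tr_ms / t1_ms)
    a = math.radians(a_deg)
    return m0 * math.sin(a) * (1 - e1) / (1 - e1 * math.cos(a))

# round-trip check with a plausible gray-matter T1 at 3T (~1300 ms)
tr, t1_true = 6.0, 1300.0
s_lo = spgr_signal(1.0, t1_true, tr, 5.0)
s_hi = spgr_signal(1.0, t1_true, tr, 15.0)
t1_est = t1_from_two_flip_angles(s_lo, s_hi, 5.0, 15.0, tr)
```

Because only two flip angles are acquired, the fit is exactly determined, which is what makes the method fast enough to prepend to a DCE protocol; the repeatability comparison above addresses what that speed costs in precision.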

  2. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    Science.gov (United States)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and their suggestion for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lacking guidance, grounded in theory, for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  3. An overview of quantification methods in energy-dispersive X-ray ...

    Indian Academy of Sciences (India)

    methods for thin samples, samples with intermediate thickness and thick ... algorithms and quantification methods based on scattered primary radiation. ... technique for in situ characterization of materials such as contaminated soil, archaeo-.

  4. Investigating Time-Varying Drivers of Grid Project Emissions Impacts

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, Emily L.; Thayer, Brandon L.; Pal, Seemita; Studarus, Karen E.

    2017-11-15

    The emissions consequences of smart grid technologies depend heavily on their context and vary not only by geographical location, but by time of year. The same technology operated to meet the same objective may increase the emissions associated with energy generation for part of the year and decrease emissions during other times. The Grid Project Impact Quantification (GridPIQ) tool provides the ability to estimate these seasonal variations and garner insight into the time-varying drivers of grid project emissions impacts. This work leverages GridPIQ to examine the emissions implications across years and seasons of adding energy storage technology to reduce daily peak demand in California and New York.

  5. Molecular quantification of lactic acid bacteria in fermented milk products using real-time quantitative PCR.

    Science.gov (United States)

    Furet, Jean-Pierre; Quénée, Pascal; Tailliez, Patrick

    2004-12-15

    Real-time quantitative PCR assays were developed for the absolute quantification of lactic acid bacteria (LAB) (Streptococcus thermophilus, Lactobacillus delbrueckii, L. casei, L. paracasei, L. rhamnosus, L. acidophilus and L. johnsonii) in fermented milk products. The results of molecular quantification and classic bacterial enumeration did not differ significantly with respect to S. thermophilus and the species of the L. casei group which were detected in the six commercial fermented products tested, thus showing that DNA extraction was efficient and that genomic DNA solutions were free of PCR inhibitors. For L. delbrueckii, the results of bacterial enumeration were generally lower by a factor of 10 to 100 than those of PCR quantification, suggesting a loss of viability during storage of the dairy products at 1–8 °C for most of the strains in this species. Real-time quantitative assays enabled identification of the species of lactic acid bacterial strains initially present in commercial fermented milk products and their accurate quantification with a detection threshold of 10³ cells per ml of product.
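Absolute quantification by real-time PCR is typically performed against a standard curve of quantification cycle (Cq) versus log10 copy number from a dilution series. A generic sketch of that calculation (the dilution-series values and the unknown's Cq are illustrative):

```python
import numpy as np

def standard_curve(log10_copies, cq):
    """Fit Cq = slope * log10(copies) + intercept over a dilution series.
    Returns the slope, intercept, and amplification efficiency
    E = 10**(-1/slope) - 1 (E = 1 means perfect doubling per cycle)."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    eff = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, eff

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

# ideal 10-fold dilution series: slope of -3.32 corresponds to ~100% efficiency
logs = np.array([3, 4, 5, 6, 7], dtype=float)
cqs = -3.32 * logs + 38.0
slope, intercept, eff = standard_curve(logs, cqs)
unknown = copies_from_cq(21.4, slope, intercept)   # copies per reaction
```

Converting copies per reaction to cells per ml then requires the extraction volume and genome copy number per cell, which is where assay-specific calibration such as that described above comes in.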

  6. A method for simultaneous quantification of phospholipid species by routine 31P NMR

    DEFF Research Database (Denmark)

    Brinkmann-Trettenes, Ulla; Stein, Paul C.; Klösgen, Beate Maria

    2012-01-01

    We report a 31P NMR assay for quantification of aqueous phospholipid samples. Using a capillary with trimethylphosphate as internal standard, the limit of quantification is 1.30 mM. Comparison of the 31P NMR quantification method in aqueous buffer and in organic solvent revealed that the two methods are equal within experimental error. Changing the pH of the buffer enables peak separation for different phospholipid species. This is an advantage compared to the commercial enzyme assay based on phospholipase D and choline oxidase. The reported method, using routine 31P NMR equipment, is suitable when fast results for a limited number of samples are requested. © 2012 Elsevier B.V.

  7. Effects of Respiration-Averaged Computed Tomography on Positron Emission Tomography/Computed Tomography Quantification and its Potential Impact on Gross Tumor Volume Delineation

    International Nuclear Information System (INIS)

    Chi, Pai-Chun Melinda; Mawlawi, Osama; Luo Dershan; Liao Zhongxing; Macapinlac, Homer A.; Pan Tinsu

    2008-01-01

    Purpose: Patient respiratory motion can cause image artifacts in positron emission tomography (PET) from PET/computed tomography (CT) and change the quantification of PET for thoracic patients. In this study, respiration-averaged CT (ACT) was used to remove the artifacts, and the changes in standardized uptake value (SUV) and gross tumor volume (GTV) were quantified. Methods and Materials: We incorporated the ACT acquisition in a PET/CT session for 216 lung patients, generating two PET/CT data sets for each patient. The first data set (PETHCT/HCT) contained the clinical PET/CT in which PET was attenuation corrected with a helical CT (HCT). The second data set (PETACT/ACT) contained the PET/CT in which PET was corrected with ACT. We quantified the differences between the two data sets in image alignment, maximum SUV (SUVmax), and GTV contours. Results: Of the patients, 68% demonstrated respiratory artifacts in the PETHCT, and for all patients the artifact was removed or reduced in the corresponding PETACT. The impact of the respiration artifact was worst for lesions less than 50 cm³ in volume and located below the dome of the diaphragm. For lesions in this group, the mean SUVmax difference, GTV volume change, shift in GTV centroid location, and concordance index were 21%, 154%, 2.4 mm, and 0.61, respectively. Conclusion: This study benchmarked the differences between the PET data with and without artifacts. It is important to pay attention to the potential existence of these artifacts during GTV contouring, as such artifacts may increase the uncertainties in the lesion volume and the centroid location.

  8. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    Science.gov (United States)

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios expressed in percentages derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, serving as a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  9. A multi-center study benchmarks software tools for label-free proteome quantification

    Science.gov (United States)

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition, acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
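
    Because the hybrid samples have a defined composition, benchmark-style precision and accuracy metrics reduce to comparing observed log-ratios against the known mixing ratio per species. A minimal sketch of that idea (not the actual LFQbench R implementation):

```python
import math
from statistics import median, stdev

def lfq_metrics(observed_ratios, expected_log2_ratio):
    """observed_ratios: A/B intensity ratios for proteins of one species;
    expected_log2_ratio: the known log2(A/B) for that species.
    Returns (accuracy, precision): bias of the median log-ratio vs. the
    ground truth, and spread of the log-ratios across proteins."""
    log2_ratios = [math.log2(r) for r in observed_ratios]
    accuracy = abs(median(log2_ratios) - expected_log2_ratio)
    precision = stdev(log2_ratios)
    return accuracy, precision
```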

  10. Damage Localization and Quantification of Earthquake Excited RC-Frames

    DEFF Research Database (Denmark)

    Skjærbæk, P.S.; Nielsen, Søren R.K.; Kirkegaard, Poul Henning

    In the paper a recently proposed method for damage localization and quantification of RC-structures from response measurements is tested on experimental data. The method investigated requires at least one response measurement along the structure and the ground surface acceleration. Further, the t...

  11. MRI-based quantification of brain damage in cerebrovascular disorders

    NARCIS (Netherlands)

    de Bresser, J.H.J.M.

    2011-01-01

    Brain diseases can lead to diverse structural abnormalities that can be assessed on magnetic resonance imaging (MRI) scans. These abnormalities can be quantified by (semi-)automated techniques. The studies described in this thesis aimed to optimize and apply cerebral quantification techniques in

  12. Toponomics method for the automated quantification of membrane protein translocation.

    Science.gov (United States)

    Domanova, Olga; Borbe, Stefan; Mühlfeld, Stefanie; Becker, Martin; Kubitz, Ralf; Häussinger, Dieter; Berlage, Thomas

    2011-09-19

    Intra-cellular and inter-cellular protein translocation can be observed by microscopic imaging of tissue sections prepared immunohistochemically. A manual densitometric analysis is time-consuming, subjective and error-prone. An automated quantification is faster, more reproducible, and should yield results comparable to manual evaluation. The automated method presented here was developed on rat liver tissue sections to study the translocation of bile salt transport proteins in hepatocytes. For validation, the cholestatic liver state was compared to the normal biological state. An automated quantification method was developed to analyze the translocation of membrane proteins and evaluated in comparison to an established manual method. First, regions of interest (membrane fragments) are identified in confocal microscopy images. Next, densitometric intensity profiles are extracted orthogonally to the membrane fragments, following the direction from the plasma membrane to the cytoplasm. Finally, several quantitative descriptors are derived from the densitometric profiles and compared regarding their statistical significance with respect to the transport protein distribution. Stable performance, robustness and reproducibility were tested using several independent experimental datasets. A fully automated workflow for the information extraction and statistical evaluation has been developed and produces robust results. New descriptors for the intensity distribution profiles were found to be more discriminative, i.e. more significant, than those used in previous research publications for translocation quantification. The slow manual calculation can be substituted by the fast and unbiased automated method.
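
    The profile-based descriptor step can be sketched as below. The specific descriptors chosen here (outer-quarter membrane mean, inner-half cytoplasm mean, and their ratio) are illustrative stand-ins; the paper's actual descriptor set is not reproduced:

```python
from statistics import mean

def profile_descriptors(profile):
    """profile: intensities sampled from the plasma membrane (index 0)
    toward the cytoplasm. Returns a few toy descriptors of the
    membrane-vs-cytoplasm intensity distribution."""
    peak = max(profile)
    membrane = mean(profile[: max(1, len(profile) // 4)])  # outer quarter
    cytoplasm = mean(profile[len(profile) // 2:])          # inner half
    return {"peak": peak,
            "membrane_mean": membrane,
            "cytoplasm_mean": cytoplasm,
            "m_over_c": membrane / cytoplasm if cytoplasm else float("inf")}
```

    A translocation event (e.g. transporter retrieval from the membrane under cholestasis) would show up as a drop in the membrane-to-cytoplasm ratio.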

  13. Quantification of intensive hybrid coastal reclamation for revealing its impacts on macrozoobenthos

    International Nuclear Information System (INIS)

    Yan, Jiaguo; Cui, Baoshan; Zheng, Jingjing; Xie, Tian; Wang, Qing; Li, Shanze

    2015-01-01

    Managing and identifying the sources of anthropogenic stress in coastal wetlands requires an in-depth understanding of the relationships between species diversity and human activities. Empirical and experimental studies provide clear evidence that coastal reclamation can have profound impacts on marine organisms, but the focus of such studies is generally on comparative or laboratory research. We developed a compound intensity index (reclamation intensity index, RI) for hybrid coastal reclamation to quantify the impacts of reclamation on coastal ecosystems. We also used mean annual absolute changes in a number of biotic variables (biodiversity, species richness, biomass of total macrozoobenthos, and species richness and biomass of Polychaeta, Mollusca, Crustacea, and Echinodermata) to determine Hedges' d, a measure of the potential effects of coastal reclamation. Our results showed significant differences in coastal reclamation intensity among the Yellow Sea, East China Sea and South China Sea, and the biological changes in effect sizes of the three regions differed greatly over time. Our modelling analyses showed that hybrid coastal reclamation generally had significant negative impacts on the species diversity and biomass of macrozoobenthos. These relationships varied among taxonomic groups and included both linear and nonlinear relationships. The results indicated that a high intensity of coastal reclamation contributed to a pronounced decline in species diversity and biomass, while lower-intensity reclamation, or reclamation within certain thresholds, resulted in a small increase in species diversity and biomass. These results have important implications for biodiversity conservation and the ecological restoration of coastal wetlands in the face of intensive reclamation activities. (letter)
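
    Hedges' d, the bias-corrected effect size used in this record, follows a standard textbook formula; a minimal sketch (the paper's exact weighting of annual changes is not reproduced):

```python
import math

def hedges_d(sample1, sample2):
    """Bias-corrected standardized mean difference (Hedges' d):
    pooled-SD Cohen's d times the small-sample correction factor J."""
    n1, n2 = len(sample1), len(sample2)
    m1 = sum(sample1) / n1
    m2 = sum(sample2) / n2
    v1 = sum((x - m1) ** 2 for x in sample1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in sample2) / (n2 - 1)
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    j = 1 - 3 / (4 * (n1 + n2) - 9)  # small-sample correction
    return j * (m1 - m2) / s_pooled
```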

  14. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combining the ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (¹⁸FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for ¹³N-ammonia, ¹⁵O-water and ⁸²Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of ⁸²Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as ¹⁸F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, quantification of atherosclerotic plaque uptake of ¹⁸FDG and ¹⁸F-sodium fluoride tracers in the carotids, aorta and coronary arteries shows potential for identifying vulnerable plaques.

  15. 'Motion frozen' quantification and display of myocardial perfusion gated SPECT

    International Nuclear Information System (INIS)

    Slomka, P.J.; Hurwitz, G.A.; Baddredine, M.; Baranowski, J.; Aladl, U.E.

    2002-01-01

    Aim: Gated SPECT imaging incorporates both functional and perfusion information of the left ventricle (LV). However, perfusion data are confounded by the effect of ventricular motion. Most existing quantification paradigms simply add all gated frames and then proceed to extract the perfusion information from static images, discarding the effects of cardiac motion. In an attempt to improve the reliability and accuracy of cardiac SPECT quantification, we propose to eliminate the LV motion prior to the perfusion quantification via an automated image warping algorithm. Methods: A pilot series of 14 male and 11 female gated stress SPECT images acquired with 8 time bins was co-registered to the coordinates of the 3D normal templates. Subsequently, the LV endo- and epicardial 3D points (300-500) were identified on the end-systolic (ES) and end-diastolic (ED) frames, defining the ES-ED motion vectors. A nonlinear image warping algorithm (thin-plate spline) was then applied to warp the end-systolic frame onto the end-diastolic frame using the corresponding ES-ED motion vectors. The remaining 6 intermediate frames were also transformed to the ED coordinates using fractions of the motion vectors. The warped images were then summed to provide the LV perfusion image in the ED phase but with counts from the full cycle. Results: The identification of the ED/ES corresponding points was successful in all cases. The corrected displacement between ED and ES images was up to 25 mm. The summed images had the appearance of the ED frames but were much less noisy since all the counts had been used. The spatial resolution of these images appeared higher than that of summed gated images, especially in the female scans. These 'motion frozen' images could be displayed and quantified as regular non-gated tomograms, including the polar map paradigm. Conclusions: This image processing technique may improve the effective image resolution of summed gated myocardial perfusion images used for
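
    The fractional-motion-vector step can be sketched for the control points alone; in the actual method these displaced points drive a thin-plate-spline warp of the whole image frame, which is not reproduced here (function name and the 3D point representation are assumptions):

```python
def to_ed_coordinates(points, es_to_ed_vectors, phase_fraction):
    """Shift control points of a gated frame toward the end-diastolic (ED)
    coordinates by a fraction of the full ES->ED motion vector:
    phase_fraction = 1.0 for the ES frame, 0.0 for the ED frame, and
    intermediate fractions for the remaining time bins."""
    return [(x + fx * phase_fraction,
             y + fy * phase_fraction,
             z + fz * phase_fraction)
            for (x, y, z), (fx, fy, fz) in zip(points, es_to_ed_vectors)]
```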

  16. A performance study on three qPCR quantification kits and their compatibilities with the 6-dye DNA profiling systems.

    Science.gov (United States)

    Lin, Sze-Wah; Li, Christina; Ip, Stephen C Y

    2018-03-01

    DNA quantification plays an integral role in forensic DNA profiling. Not only does it estimate the total amount of amplifiable human autosomal and male DNA to ensure optimal amplification of target DNA for subsequent analysis, but it also assesses the extraction efficiency and purity of the DNA extract. The latest DNA quantification systems even offer an estimate of the degree of DNA degradation in a sample. Here, we report the performance of three new-generation qPCR kits, namely the Investigator® Quantiplex HYres Kit from QIAGEN, the Quantifiler® Trio DNA Quantification Kit from Applied Biosystems™, and the PowerQuant® System from Promega, and their compatibilities with three 6-dye DNA profiling systems. Our results have demonstrated that all three kits generate standard curves with satisfactory consistency and reproducibility, and are capable of screening out traces of male DNA in the presence of a 30-fold excess of female DNA. They also exhibit a higher tolerance to PCR inhibition than the Quantifiler® Human DNA Quantification Kit from Applied Biosystems™ in autosomal DNA quantification. PowerQuant®, as compared to Quantiplex HYres and Quantifiler® Trio, shows better precision for both autosomal and male DNA quantification. Quantifiler® Trio and PowerQuant®, in contrast to Quantiplex HYres, offer better correlations with lower discrepancies between autosomal and male DNA quantification, and their additional degradation index features provide a detection platform for inhibited and/or degraded DNA templates. Regarding the compatibility between these quantification and profiling systems: (1) both Quantifiler® Trio and PowerQuant® work well with GlobalFiler and Fusion 6C, allowing a fairly accurate prediction of their DNA typing results based on the quantification values; (2) Quantiplex HYres offers a fairly reliable IPC system for detecting any potential inhibition on Investigator 24plex, whereas Quantifiler® Trio and PowerQuant® suit better for Global

  17. Spectroscopic quantification of 5-hydroxymethylcytosine in genomic DNA.

    Science.gov (United States)

    Shahal, Tamar; Gilat, Noa; Michaeli, Yael; Redy-Keisar, Orit; Shabat, Doron; Ebenstein, Yuval

    2014-08-19

    5-Hydroxymethylcytosine (5hmC), a modified form of the DNA base cytosine, is an important epigenetic mark linked to the regulation of gene expression in development and tumorigenesis. We have developed a spectroscopic method for global quantification of 5hmC in genomic DNA. The assay is performed within a multiwell plate, which allows simultaneous recording of up to 350 samples. Our quantification procedure for 5hmC is direct, simple, and rapid. It relies on a two-step protocol that consists of enzymatic glucosylation of 5hmC with an azide-modified glucose, followed by a "click reaction" with an alkyne-fluorescent tag. The fluorescence intensity recorded from the DNA sample is proportional to its 5hmC content and can be quantified by a simple plate reader measurement. This labeling technique is specific and highly sensitive, allowing detection of 5hmC down to 0.002% of the total nucleotides. Our results reveal significant variations in the 5hmC content obtained from different mouse tissues, in agreement with previously reported data.
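
    Since the recorded fluorescence is stated to be proportional to 5hmC content, converting plate-reader intensities into percent 5hmC reduces to a one-parameter calibration against standards. A minimal sketch (hypothetical standards and function names; the actual assay workflow is not reproduced):

```python
def fit_slope_through_origin(intensities, percent_5hmc_standards):
    """Least-squares slope for c = s * F, i.e. percent 5hmC per
    fluorescence unit, fitted on calibration standards (line is
    forced through the origin because F is proportional to c)."""
    num = sum(f * c for f, c in zip(intensities, percent_5hmc_standards))
    den = sum(f * f for f in intensities)
    return num / den

def percent_5hmc(sample_intensity, slope):
    """Convert a sample's fluorescence reading into percent 5hmC."""
    return slope * sample_intensity
```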

  18. Construction and Quantification of the One Top model of the Fire Events PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Lee, Yoon Hwan; Han, Sang Hoon

    2008-01-01

    KAERI constructed the one top model of the fire events PSA for Ulchin Units 3 and 4 by using the 'mapping technique'. The mapping technique was developed for the construction and quantification of external events PSA models with a one top model for an internal events PSA. With 'AIMS', the mapping technique can be implemented through the construction of mapping tables. The mapping tables include fire rooms, fire ignition frequencies, related initiating events, fire transfer events, and the internal PSA basic events affected by a fire. The constructed one top fire PSA model is based on previously conducted fire PSA results for Ulchin Units 3 and 4. In this paper, we introduce the construction procedure and quantification results of the one top model of the fire events PSA using the mapping technique. As the one top model of the fire events PSA developed in this study is based on the previous study, we also introduce the previous fire PSA approach, focused on quantification.
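
    A mapping table of the kind described can be sketched as plain data plus a toy quantification. All room names, frequencies, and conditional core damage probabilities below are invented for illustration; the real technique maps fire scenarios onto internal-events fault tree basic events rather than multiplying summary probabilities:

```python
# Hypothetical mapping table: each fire room carries an ignition frequency,
# the internal-events initiating event it maps to, and the basic events
# assumed failed by the fire (all entries illustrative).
mapping_table = {
    "cable_spreading_room": {"ignition_freq": 1.2e-3,
                             "initiating_event": "LOOP",
                             "failed_basic_events": ["EDG-A", "BUS-4160-A"]},
    "turbine_building":     {"ignition_freq": 4.0e-3,
                             "initiating_event": "GTRN",
                             "failed_basic_events": []},
}

def fire_cdf(mapping, ccdp):
    """Toy quantification: sum, over fire rooms, the ignition frequency
    times a conditional core damage probability looked up per mapped
    initiating event."""
    return sum(room["ignition_freq"] * ccdp[room["initiating_event"]]
               for room in mapping.values())
```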

  19. QUANTIFICATION AND BIOREMEDIATION OF ENVIRONMENTAL SAMPLES BY DEVELOPING A NOVEL AND EFFICIENT METHOD

    Directory of Open Access Journals (Sweden)

    Mohammad Osama

    2014-06-01

    Pleurotus ostreatus, a white rot fungus, is capable of bioremediating a wide range of organic contaminants, including polycyclic aromatic hydrocarbons (PAHs). Ergosterol is produced by living fungal biomass and used as a measure of fungal biomass. The first part of this work deals with the extraction and quantification of PAHs from contaminated sediments by the Lipid Extraction Method (LEM). The second part consists of the development of a novel extraction method, the Ergosterol Extraction Method (EEM), followed by quantification and bioremediation. The novelty of this method is the simultaneous extraction and quantification of two different types of compounds, a sterol (ergosterol) and PAHs, and it is more efficient than LEM. EEM successfully extracted ergosterol from the fungus grown on barley at concentrations of 17.5-39.94 µg g⁻¹, and quantified more PAHs, in greater amounts, than LEM. In addition, cholesterol, usually found in animals, was also detected in P. ostreatus at easily detectable levels.

  20. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging mod...

  1. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, James R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program that includes the presentation agenda, the presentation abstracts, and the list of posters.

  2. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    Science.gov (United States)

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

    German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise (RSD ...), linear (R² > 0.99 over 0.01-1384 fmol/μL), and sensitive (LLOD and LLOQ ...). Using LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for the labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies. © 2017 The Authors. Clinical & Experimental Allergy published by John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  3. The Emissions Impacts of Varied Energy Storage Operational Objectives Across Regions

    Energy Technology Data Exchange (ETDEWEB)

    Barrett, Emily L.; Thayer, Brandon L.; Studarus, Karen E.; Pal, Seemita

    2017-11-15

    The emissions consequences of smart grid technologies can be significant but are not always intuitive. This is particularly true in the implementation of energy storage (ES) systems that are being increasingly adopted to integrate more intermittent renewable generation, to reduce peak demand, and to participate in energy markets. Both the location of the ES system within the grid and the way it is operated will dictate its resulting impacts. The Grid Project Impact Quantification tool can provide insight into some of the emissions implications of hypothetical ES systems for a variety of operational objectives in diverse locations within the United States.
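
    The location- and operation-dependence of storage emissions can be illustrated with a toy marginal-emissions accounting (illustrative only; the Grid Project Impact Quantification tool's actual model is not reproduced, and all numbers below are made up):

```python
def storage_emissions_kg(charge_mwh, discharge_mwh, marginal_kg_per_mwh):
    """Net emissions change from storage operation over a series of hours:
    energy drawn while charging emits at the local marginal rate, and
    energy returned while discharging displaces generation at that
    hour's marginal rate. Negative result = net emissions reduction."""
    return sum((c - d) * m for c, d, m in
               zip(charge_mwh, discharge_mwh, marginal_kg_per_mwh))
```

    Charging 1 MWh when the marginal unit emits 500 kg/MWh and discharging 0.8 MWh (round-trip losses) against a 900 kg/MWh marginal unit yields a net reduction, illustrating why the same device can raise or lower emissions depending on where and how it is operated.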

  4. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  5. The value of serum Hepatitis B surface antigen quantification in ...

    African Journals Online (AJOL)

    The value of serum Hepatitis B surface antigen quantification in determining viral activity in chronic Hepatitis B virus infection. ... of CHB and also higher in hepatitis e antigen positive patients compared to hepatitis e antigen negative patients.

  6. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091
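
    The volumetric comparison against expert segmentation can be expressed as simple ratios. The definition of "relative pneumothorax size" below (air volume over hemithorax volume) is one plausible reading; the paper's exact normalization may differ:

```python
def relative_pneumothorax_pct(ptx_volume, hemithorax_volume):
    """Pneumothorax volume as a percentage of the affected hemithorax
    (one plausible definition of relative size)."""
    return 100.0 * ptx_volume / hemithorax_volume

def volumetric_error_pct(auto_pct, manual_pct):
    """Absolute difference between the automated measurement and the
    expert manual segmentation, in percentage points."""
    return abs(auto_pct - manual_pct)
```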

  7. An approach to microstructure quantification in terms of impact properties of HSLA pipeline steels

    Energy Technology Data Exchange (ETDEWEB)

    Gervasyev, Alexey [Department of Materials Science and Engineering, Ghent University (Belgium); R&D Center TMK, Ltd., Chelyabinsk (Russian Federation); Carretero Olalla, Victor [SKF Belgium NV/SA, Brussels (Belgium); Sidor, Jurij [Department of Mechanical Engineering, University of West Hungary, Szombathely (Hungary); Sanchez Mouriño, Nuria [ArcelorMittal Global R&D/OCAS NV, Gent (Belgium); Kestens, Leo A.I.; Petrov, Roumen H. [Department of Materials Science and Engineering, Ghent University (Belgium); Department of Materials Science and Engineering, Delft University of Technology (Netherlands)

    2016-11-20

    Several thermo-mechanical controlled processing (TMCP) schedules of a modern pipeline steel were executed using a laboratory mill to investigate both the influence of TMCP parameters on the ductile properties and the microstructure and texture evolution during TMCP. Impact fracture toughness was evaluated by means of an instrumented Charpy impact test, and the results were correlated with the metallurgical characterization of the steel via the electron backscatter diffraction (EBSD) technique. It is shown that the ductile crack growth observed in the impact test experiments can be reasonably correlated with the Morphology Clustering (MC) and the Cleavage Morphology Clustering (CMC) parameters, which incorporate size, shape, and crystallographic texture features of microstructure elements. The mechanism of unfavorable texture formation during TMCP is explained by texture changes occurring between the end of finish rolling and the start of accelerated cooling.

  8. Efficient Quantification of Uncertainties in Complex Computer Code Results, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  9. Development and evaluation of a sandwich ELISA for quantification of the 20S proteasome in human plasma

    DEFF Research Database (Denmark)

    Dutaud, Dominique; Aubry, Laurent; Henry, Laurent

    2002-01-01

    Because quantification of the 20S proteasome by functional activity measurements is difficult and inaccurate, we have developed an indirect sandwich enzyme-linked immunosorbent assay (ELISA) for quantification of the 20S proteasome in human plasma. This sandwich ELISA uses a combination...

  10. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  11. On the direct quantification of celecoxib in commercial solid drugs using the TT-PIXE and TT-PIGE techniques

    International Nuclear Information System (INIS)

    Nsouli, B.; Zahraman, K.; Bejjani, A.; Assi, S.; El-Yazbi, F.; Roumie, M.

    2006-01-01

    The quantification of the active ingredient (AI) in drugs is a crucial and important step in the drug quality control process. This is usually performed by using wet chemical techniques like LC-MS, UV spectrophotometry and other appropriate organic analytical methods. When an active ingredient contains specific heteroatoms (F, S, Cl, etc.), elemental IBA techniques can be explored for molecular quantification. IBA techniques permit the analysis of the sample in solid form, without any laborious sample preparation. This is an advantage when the number of samples is relatively large. In this work, we demonstrate the ability of the thick target PIXE (TT-PIXE) and TT-PIGE techniques for rapid and accurate quantification of celecoxib in commercial drugs. The experimental aspects related to the quantification validity are presented and discussed.
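
    The elemental-to-molecular step hinges on stoichiometry: celecoxib (C17H14F3N3O2S) carries exactly one sulfur atom, so a measured sulfur weight fraction converts to an equivalent celecoxib content. A sketch of that conversion (an assumed calculation illustrating the principle, not the paper's protocol):

```python
# Molar masses in g/mol; celecoxib (C17H14F3N3O2S) contains exactly one
# sulfur atom, which ties elemental PIXE/PIGE data to AI content.
M_CELECOXIB = 381.37
M_SULFUR = 32.06

def celecoxib_pct_from_sulfur(sulfur_wt_pct):
    """Convert a measured sulfur weight percent of the tablet into an
    equivalent celecoxib weight percent (1 S atom per molecule)."""
    return sulfur_wt_pct * M_CELECOXIB / M_SULFUR
```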

  12. Real-Time PCR for Universal Phytoplasma Detection and Quantification

    DEFF Research Database (Denmark)

    Christensen, Nynne Meyn; Nyskjold, Henriette; Nicolaisen, Mogens

    2013-01-01

    Currently, the most efficient detection and precise quantification of phytoplasmas is by real-time PCR. Compared to nested PCR, this method is less sensitive to contamination and is less work intensive. Therefore, a universal real-time PCR method will be valuable in screening programs and in other...

  13. Evaluation of semi-automatic arterial stenosis quantification

    International Nuclear Information System (INIS)

    Hernandez Hoyos, M.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Univ. de los Andes, Bogota; Serfaty, J.M.; Douek, P.C.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne; Hopital Cardiovasculaire et Pneumologique L. Pradel, Bron; Maghiar, A.; Mansard, C.; Orkisz, M.; Magnin, I.; Universite Claude Bernard Lyon 1, 69 - Villeurbanne

    2006-01-01

    Object: To assess the accuracy and reproducibility of semi-automatic vessel axis extraction and stenosis quantification in 3D contrast-enhanced Magnetic Resonance Angiography (CE-MRA) of the carotid arteries (CA). Materials and methods: A total of 25 MRA datasets was used: 5 phantoms with known stenoses, and 20 patients (40 CAs) drawn from a multicenter trial database. Maracas software extracted vessel centerlines and quantified the stenoses, based on boundary detection in planes perpendicular to the centerline. Centerline accuracy was visually scored. Semi-automatic measurements were compared with: (1) theoretical phantom morphometric values, and (2) stenosis degrees evaluated by two independent radiologists. Results: Exploitable centerlines were obtained in 97% of CAs and in all phantoms. In phantoms, the software achieved better agreement with the theoretical stenosis degrees (weighted kappa κW = 0.91) than the radiologists (κW = 0.69). In patients, agreement between software and radiologists varied from κW = 0.67 to 0.90. In both, Maracas was substantially more reproducible than the readers. Mean operating time was within 1 min/CA. Conclusion: Maracas software generates accurate 3D centerlines of vascular segments with minimum user intervention. Semi-automatic quantification of CA stenosis is also accurate, except in very severe stenoses that cannot be segmented. It substantially reduces the inter-observer variability. (orig.)

  14. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
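
    The empirical counting rule quoted in the abstract translates directly into code (variable names are assumptions; units are whatever the semiconductor measurement yields):

```python
def virus_count(dopant_virus, dopant_mock, debye_vol_virus, debye_vol_mock):
    """Empirical count from the abstract: the absolute value of the change
    in dopant concentration (virus suspension relative to mock) divided by
    the change in Debye volume (virus suspension relative to mock)."""
    delta_dopant = dopant_virus - dopant_mock
    delta_debye_volume = debye_vol_virus - debye_vol_mock
    return abs(delta_dopant / delta_debye_volume)
```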

  15. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As case-matched controls, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway at expiration on CT quantification could be new diagnostic indicators of TBM.
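
    The group comparison above relies on the Student t-test. As a minimal, hedged illustration of that kind of comparison (using Welch's variant, which does not assume equal variances), with made-up wall-perimeter values rather than the study's data:

    ```python
    # Pure-Python Welch's t statistic for two independent samples.
    import math

    def welch_t(a, b):
        """Welch's t statistic; negative when group a has the smaller mean."""
        na, nb = len(a), len(b)
        ma, mb = sum(a) / na, sum(b) / nb
        va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
        vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
        return (ma - mb) / math.sqrt(va / na + vb / nb)

    # Illustrative tracheal wall perimeters (mm), NOT the study's measurements.
    tbm_wall_perimeter = [42.1, 44.5, 43.9, 45.0]
    normal_wall_perimeter = [48.2, 49.5, 50.1, 48.4]
    t = welch_t(tbm_wall_perimeter, normal_wall_perimeter)  # negative: TBM smaller
    ```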

  16. Direct infusion-SIM as fast and robust method for absolute protein quantification in complex samples

    Directory of Open Access Journals (Sweden)

    Christina Looße

    2015-06-01

    Relative and absolute quantification of proteins in biological and clinical samples are common approaches in proteomics. Until now, targeted protein quantification has mainly been performed using a combination of HPLC-based peptide separation and selected reaction monitoring on triple quadrupole mass spectrometers. Here, we show for the first time the potential of absolute quantification using a direct infusion strategy combined with single ion monitoring (SIM) on a Q Exactive mass spectrometer. Using complex membrane fractions of Escherichia coli, we absolutely quantified the recombinantly expressed heterologous human cytochrome P450 monooxygenase 3A4 (CYP3A4), comparing direct infusion-SIM with conventional HPLC-SIM. Direct infusion-SIM revealed on average only 14.7% (±4.1, s.e.m.) deviation compared to HPLC-SIM, and a decreased processing and analysis time of 4.5 min (which could be further reduced to 30 s) for a single sample, in contrast to 65 min by the LC–MS method. Summarized, our simplified workflow using direct infusion-SIM provides a fast and robust method for quantification of proteins in complex protein mixtures.

  17. The need for clinical quantification of combined PET/MRI data in pediatric epilepsy

    International Nuclear Information System (INIS)

    Muzik, Otto; Pai, Darshan; Juhasz, Csaba; Hua, Jing

    2013-01-01

    In the past, multimodality integrative analysis of image data has been used to obtain a better understanding of underlying mechanisms of seizure generation and propagation in children with extratemporal lobe epilepsy. However, despite important advances in the combined analysis of PET, MRI, DTI and EEG data, successful surgical outcome is only achieved in about 2/3 of patients undergoing resective surgery. The advent of simultaneous PET/MR data acquisition promises an important advance in neuroimaging through clinical quantification, which will finally translate the strength of PET (which is the ability to absolutely quantify physiological parameters such as metabolic rates and receptor densities) into clinical work. Taking advantage of recently developed integrated PET/MR devices, absolute physiological values will be available in clinical routine, replacing currently used visual assessment of relative tissue tracer uptake. This will allow assessment of global increases/decreases of brain function during critical phases of development and is likely to have a significant impact on patient management in pediatric epilepsy.

  18. The need for clinical quantification of combined PET/MRI data in pediatric epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Muzik, Otto, E-mail: otto@pet.wayne.edu [Department of Pediatrics, Wayne State University School of Medicine, Detroit, MI (United States); Department of Radiology, Wayne State University School of Medicine, Detroit, MI (United States); Pai, Darshan [Department of Computer Science, Wayne State University School of Medicine, Detroit, MI (United States); Juhasz, Csaba [Department of Pediatrics, Wayne State University School of Medicine, Detroit, MI (United States); Hua, Jing [Department of Computer Science, Wayne State University School of Medicine, Detroit, MI (United States)

    2013-02-21

    In the past, multimodality integrative analysis of image data has been used to obtain a better understanding of underlying mechanisms of seizure generation and propagation in children with extratemporal lobe epilepsy. However, despite important advances in the combined analysis of PET, MRI, DTI and EEG data, successful surgical outcome is only achieved in about 2/3 of patients undergoing resective surgery. The advent of simultaneous PET/MR data acquisition promises an important advance in neuroimaging through clinical quantification, which will finally translate the strength of PET (which is the ability to absolutely quantify physiological parameters such as metabolic rates and receptor densities) into clinical work. Taking advantage of recently developed integrated PET/MR devices, absolute physiological values will be available in clinical routine, replacing currently used visual assessment of relative tissue tracer uptake. This will allow assessment of global increases/decreases of brain function during critical phases of development and is likely to have a significant impact on patient management in pediatric epilepsy.

  19. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used to evaluate the sequelae of Paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms, in the MATLAB® computing environment, that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists in selecting a region of interest (ROI) and, through the use of masks, density filters and morphological operators, obtaining the ratio of the injured area to the area of healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
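
    The pipeline described above (ROI selection, density masks, area ratio) was implemented in MATLAB; the following is a rough, hypothetical Python sketch of the same idea, with invented Hounsfield-unit cutoffs and a toy ROI rather than real HRCT data.

    ```python
    # Toy ROI of Hounsfield-unit values; a real pipeline would read HRCT slices.
    roi = [
        [-950, -980, -700, -400],
        [-960, -850, -300, -200],
        [-990, -600, -250, -500],
    ]

    EMPHYSEMA_MAX = -950   # HU at or below this counts as emphysema (assumed cutoff)
    FIBROSIS_MIN = -500    # HU at or above this counts as fibrosis (assumed cutoff)

    voxels = [v for row in roi for v in row]
    emphysema = sum(1 for v in voxels if v <= EMPHYSEMA_MAX)
    fibrosis = sum(1 for v in voxels if v >= FIBROSIS_MIN)

    # Fraction of the ROI classified as each pathology.
    emphysema_frac = emphysema / len(voxels)
    fibrosis_frac = fibrosis / len(voxels)
    ```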

  20. Improved quantification for local regions of interest in preclinical PET imaging

    Science.gov (United States)

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-09-01

    In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g. 68Ga and 124I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with the PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides 18F, 68Ga and 124I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the 'spillover contamination' that causes inaccurate quantification of lesions in the immediate neighborhood of large, 'hot' sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For 18F, the contrast was improved from 3.0 to 4.0 in hot lesions (true ratio = 4.0) and from 0.16 to 0.06 in cold lesions (true ratio = 0.0) when using LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity.
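
    The "simple inversion problem" mentioned above can be illustrated with a toy example: if a high-resolution segmentation gives the fraction of each tissue contributing to each measured regional mean, the true tissue activities follow from a small linear solve. Everything below (mixing fractions, activity values) is invented for illustration and is not the paper's formulation.

    ```python
    def solve_2x2(a, b):
        """Solve A x = b for a 2x2 system by Cramer's rule."""
        (a11, a12), (a21, a22) = a
        det = a11 * a22 - a12 * a21
        x1 = (b[0] * a22 - a12 * b[1]) / det
        x2 = (a11 * b[1] - b[0] * a21) / det
        return x1, x2

    # Rows: fraction of lesion / background tissue in each measured region,
    # as would come from a high-resolution CT or MR segmentation.
    mixing = [[0.7, 0.3],
              [0.2, 0.8]]
    # Measured regional means, generated here from true activities 4.0 and 1.0.
    measured = [0.7 * 4.0 + 0.3 * 1.0,
                0.2 * 4.0 + 0.8 * 1.0]
    lesion, background = solve_2x2(mixing, measured)  # recovers 4.0 and 1.0
    ```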

  1. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R^2 > 0.8 and values within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
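
    The validation relies on the coefficient of determination between the plugin and the clinical package; a minimal pure-Python computation of R^2 of one method against a reference might look like the sketch below. The perfusion values are invented for illustration, not taken from the study.

    ```python
    def r_squared(reference, test):
        """R^2 of test values against reference values (identity model)."""
        mean_ref = sum(reference) / len(reference)
        ss_tot = sum((r - mean_ref) ** 2 for r in reference)
        ss_res = sum((r - t) ** 2 for r, t in zip(reference, test))
        return 1 - ss_res / ss_tot

    # Illustrative perfusion parameter values for the same cases.
    clinical = [1.2, 2.5, 3.1, 4.8, 5.0]   # clinical package (reference)
    plugin = [1.3, 2.4, 3.0, 4.9, 5.2]     # open source plugin
    r2 = r_squared(clinical, plugin)
    ```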

  2. An Evaluation of Statistical Methods for Analyzing Follow-Up Gaussian Laboratory Data with a Lower Quantification Limit

    NARCIS (Netherlands)

    Karon, John M.; Wiegand, Ryan E.; van de Wijgert, Janneke H.; Kilmarx, Peter H.

    2015-01-01

    Laboratory data with a lower quantification limit (censored data) are sometimes analyzed by replacing non-quantifiable values with a single value equal to or less than the quantification limit, yielding possibly biased point estimates and variance estimates that are too small. Motivated by a

  3. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    Science.gov (United States)

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe the maximum allowable concentration for tributyltin (TBT) compounds in surface water has been regulated by the water framework directive (WFD) and daughter directive, which impose a limit of 0.2 ng L(-1) in whole water (as tributyltin cation). Despite the large number of different methodologies for the quantification of organotin species developed in the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. The C18 phase was found to be the best stationary phase for the SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decisional tool for the planning of experiments, because it can be applied to predict the TBT mass fraction recovery when the

  4. Structure determination of electrodeposited zinc-nickel alloys: thermal stability and quantification using XRD and potentiodynamic dissolution

    International Nuclear Information System (INIS)

    Fedi, B.; Gigandet, M.P.; Hihn, J-Y; Mierzejewski, S.

    2016-01-01

    Highlights: • Quantification of zinc-nickel phases between 1.2% and 20%. • Coupling XRD to partial potentiodynamic dissolution. • Deconvolution of anodic stripping curves. • Phase quantification after annealing. - Abstract: Zinc-nickel coatings obtained by electrodeposition reveal the presence of metastable phases in various quantities, thus requiring their identification, a study of their thermal stability, and, finally, determination of their respective proportions. By combining XRD measurement with partial potentiodynamic dissolution, anodic peaks were indexed to allow their quantification. Quantification of electrodeposited zinc-nickel alloys approximately 10 μm thick was thus carried out on nickel contents between 1.2% and 20%, and exhibited good accuracy. This method was then extended to the same set of alloys after annealing (250 °C, 2 h), thus bringing the structural organization closer to its thermodynamic equilibrium. The results obtained ensure a better understanding of the crystallization of metastable phases and of the evolution of phase proportions in a bi-phasic zinc-nickel coating. Finally, the presence of a single γ phase and its thermal stability in the 12% to 15% range provide important information for the coating's anti-corrosion behavior.

  5. The industrial impact of Sizewell 'B'

    International Nuclear Information System (INIS)

    1988-01-01

    The paper is a report on the industrial impact of post-Sizewell nuclear reactor policy, as presented by a Working Group set up by the United Kingdom Advisory Council on Applied Research and Development. The primary objective of the Working Group was the quantification of the effects of the introduction of a non-UK design of reactor upon employment, the availability of skilled resources, and imports and exports. The subject is discussed under the topic headings: the effect of Sizewell 'B' on UK manufacturing industry, skilled resources, safety, reactor design choice, and replication of the PWR. (U.K.)

  6. Subnuclear foci quantification using high-throughput 3D image cytometry

    Science.gov (United States)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the site of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using an algorithm based on the 3D extended maxima transform. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
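
    As a much-simplified stand-in for the 3D extended-maxima foci counting, one can count voxels above an intensity threshold that are strictly brighter than all of their 6-connected neighbours. The 3x3x3 volume below is illustrative only; a real pipeline operates on full microscope stacks.

    ```python
    def count_foci(vol, threshold):
        """Count local maxima above threshold in a nested-list 3D volume."""
        nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
        offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
        count = 0
        for z in range(nz):
            for y in range(ny):
                for x in range(nx):
                    v = vol[z][y][x]
                    if v < threshold:
                        continue
                    # Out-of-bounds neighbours are treated as darker.
                    if all(not (0 <= z + dz < nz and 0 <= y + dy < ny and 0 <= x + dx < nx)
                           or v > vol[z + dz][y + dy][x + dx]
                           for dz, dy, dx in offsets):
                        count += 1
        return count

    # Tiny illustrative volume containing two bright foci (9 and 7).
    volume = [[[0, 0, 0], [0, 9, 0], [0, 0, 0]],
              [[0, 0, 0], [0, 2, 0], [0, 0, 7]],
              [[0, 0, 0], [0, 0, 0], [0, 0, 0]]]
    n_foci = count_foci(volume, threshold=5)
    ```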

  7. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)

  8. Rapid Quantification of Low-Viscosity Acetyl-Triacylglycerols Using Electrospray Ionization Mass Spectrometry.

    Science.gov (United States)

    Bansal, Sunil; Durrett, Timothy P

    2016-09-01

    Acetyl-triacylglycerols (acetyl-TAG) possess an sn-3 acetate group, which confers useful chemical and physical properties to these unusual triacylglycerols (TAG). Current methods for quantification of acetyl-TAG are time consuming and do not provide any information on the molecular species profile. Electrospray ionization mass spectrometry (ESI-MS)-based methods can overcome these drawbacks. However, the ESI-MS signal intensity for TAG depends on the aliphatic chain length and unsaturation index of the molecule. Therefore, response factors for the different molecular species need to be determined before any quantification. The effects of the chain length and the number of double bonds of the sn-1/2 acyl groups on the signal intensity for the neutral loss of short-chain sn-3 groups were quantified using a series of synthesized sn-3 specific structured TAG. The signal intensity for the neutral loss of the sn-3 acyl group was found to be negatively correlated with the aliphatic chain length and unsaturation index of the sn-1/2 acyl groups, and also with the size of the sn-3 chain itself. Further, the position of the group undergoing neutral loss was also important, with the signal from an sn-2 acyl group much lower than that from one located at sn-3. Response factors obtained from these analyses were used to develop a method for the absolute quantification of acetyl-TAG. The increased sensitivity of this ESI-MS-based approach allowed successful quantification of acetyl-TAG in various biological settings, including the products of in vitro enzyme activity assays.
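
    Once per-species response factors are known, absolute quantification reduces to dividing each raw neutral-loss signal by its response factor and scaling by an internal standard. The species names, factors and signal values below are invented for illustration; real values would come from the synthesized sn-3 standards.

    ```python
    def quantify(signals, response_factors, internal_std_amount, internal_std_signal):
        """Return absolute amounts: correct each signal by its response
        factor, then scale by an internal standard of known amount."""
        scale = internal_std_amount / internal_std_signal
        return {species: (signal / response_factors[species]) * scale
                for species, signal in signals.items()}

    # Hypothetical acetyl-TAG species, raw signals and response factors.
    signals = {"16:0/18:1/2:0": 8.0e5, "18:1/18:1/2:0": 6.0e5}
    factors = {"16:0/18:1/2:0": 0.80, "18:1/18:1/2:0": 0.60}
    amounts = quantify(signals, factors,
                       internal_std_amount=10.0,     # nmol, assumed
                       internal_std_signal=5.0e5)
    ```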

  9. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    Science.gov (United States)

    Manzanares-Palenzuela, C Lorena; de-los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
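
    As the abstract notes, GMO content is expressed as event-specific copies per taxon-specific copies. A hedged sketch of that final calculation, assuming each target has its own linear calibration (signal = slope × concentration + intercept) over the reported 2-250 pM range; all calibration constants and signal values below are invented:

    ```python
    def to_concentration(signal, slope, intercept):
        """Invert a linear calibration to recover concentration."""
        return (signal - intercept) / slope

    # Hypothetical electrochemical readouts and calibration constants.
    rr_pM = to_concentration(signal=410.0, slope=2.0, intercept=10.0)       # event-specific (RR)
    lectin_pM = to_concentration(signal=2010.0, slope=1.0, intercept=10.0)  # taxon-specific (lectin)

    # GMO content as the event/taxon ratio, in percent.
    gmo_percent = 100.0 * rr_pM / lectin_pM
    ```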

  10. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
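
    A minimal, hypothetical version of the nonintrusive idea described above: sample the uncertain inputs, call the solver as a black box for each sample, and compute statistics of the quantity of interest. A toy analytic function stands in for the CFD solver, and the input distributions are invented; sampling an epistemic interval uniformly is only one of several possible treatments.

    ```python
    import random
    import statistics

    def black_box_solver(inlet_mach, wall_temp):
        """Stand-in for a CFD run; returns a scalar quantity of interest."""
        return 2.0 * inlet_mach + 0.01 * wall_temp

    random.seed(42)
    samples = []
    for _ in range(2000):
        mach = random.gauss(2.5, 0.05)   # aleatory: random inflow variability
        temp = random.uniform(290, 310)  # epistemic: known only as an interval
        samples.append(black_box_solver(mach, temp))

    qoi_mean = statistics.mean(samples)
    qoi_sd = statistics.stdev(samples)
    ```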

  11. Quantification of regional radiative impacts and climate effects of tropical fire aerosols

    Science.gov (United States)

    Tosca, M. G.; Zender, C. S.; Randerson, J. T.

    2011-12-01

    Regionally expansive smoke clouds originating from deforestation fires in Indonesia can modify local precipitation patterns via direct aerosol scattering and absorption of solar radiation (Tosca et al., 2010). Here we quantify the regional climate impacts of fire aerosols for three tropical burning regions that together account for about 70% of global annual fire emissions. We use the Community Atmosphere Model, version 5 (CAM5) coupled to a slab ocean model (SOM) embedded within the Community Earth System Model (CESM). In addition to direct aerosol radiative effects, CAM5 also quantifies indirect, semi-direct and cloud microphysical aerosol effects. Climate impacts are determined using regionally adjusted emissions data that produce realistic aerosol optical depths in CAM5. We first analyzed a single 12-year transient simulation (1996-2007) forced with unadjusted emissions estimates from the Global Fire Emissions Database, version 3 (GFEDv3) and compared the resulting aerosol optical depths (AODs) for 4 different burning regions (equatorial Asia, southern Africa, South America and boreal North America) to observed MISR and MODIS AODs for the same period. Based on this analysis we adjusted emissions for each burning region between 150 and 300% and forced a second simulation with the regionally adjusted emissions. Improved AODs from this simulation are compared to AERONET observations available at 15 stations throughout the tropics. We present here two transient simulations--one with the adjusted fire emissions and one without fires--to quantify the cumulative fire aerosol climate impact for three major tropical burning regions (equatorial Asia, southern Africa and South America). Specifically, we quantify smoke effects on radiation, precipitation, and temperature. References Tosca, M.G., J.T. Randerson, C.S. Zender, M.G. Flanner and P.J. Rasch (2010), Do biomass burning aerosols intensify drought in equatorial Asia during El Nino?, Atmos. Chem. Phys., 10, 3515

  12. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    Science.gov (United States)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: the participants did not analyze the same samples on their own LIBS experiments, but instead shared a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, the baseline modeling and the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.
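
    Two of the figures of merit above have standard closed forms: the slope of the calibration line, and the limit of quantification, commonly taken as LOQ = 10·s_blank/slope. A sketch with invented calibration data (the blank standard deviation is an assumption, as is the exactly linear data):

    ```python
    def linear_fit(x, y):
        """Least-squares slope and intercept of y against x."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        slope = sxy / sxx
        return slope, my - slope * mx

    conc = [0.0, 1.0, 2.0, 4.0, 8.0]               # wt%, illustrative standards
    intensity = [5.0, 105.0, 205.0, 405.0, 805.0]  # a.u., exactly linear here
    slope, intercept = linear_fit(conc, intensity)

    s_blank = 2.0                 # std. dev. of blank replicates (assumed)
    loq = 10.0 * s_blank / slope  # common 10-sigma convention
    ```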

  13. FRET-based modified graphene quantum dots for direct trypsin quantification in urine

    Energy Technology Data Exchange (ETDEWEB)

    Poon, Chung-Yan; Li, Qinghua [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Zhang, Jiali; Li, Zhongping [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Dong, Chuan [Research Center of Environmental Science and Engineering, School of Chemistry and Chemical Engineering, Shanxi University, Taiyuan 030006 (China); Lee, Albert Wai-Ming; Chan, Wing-Hong [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong); Li, Hung-Wing, E-mail: hwli@hkbu.edu.hk [Department of Chemistry, Hong Kong Baptist University, Kowloon Tong, Hong Kong Special Administrative Region (Hong Kong)

    2016-04-21

    A versatile nanoprobe was developed for trypsin quantification based on fluorescence resonance energy transfer (FRET). Here, a fluorescent graphene quantum dot is utilized as the donor and a well-designed coumarin derivative, CMR2, as the acceptor. Moreover, bovine serum albumin (BSA), as a protein model, serves not only as a linker for the FRET pair but also as a fluorescence enhancer of the quantum dots and CMR2. In the presence of trypsin, the FRET system is destroyed as the BSA is digested. The emission peak of the donor is thus regenerated, and the ratio of the donor emission peak to the acceptor emission peak increases. By ratiometric measurement of these two emission peaks, the trypsin content can be determined. The detection limit for trypsin was found to be 0.7 μg/mL, which is 0.008-fold the average trypsin level in the urine of acute pancreatitis patients, suggesting high potential for fast and low-cost clinical screening. - Highlights: • A FRET-based biosensor was developed for direct quantification of trypsin. • Fast and sensitive screening of pancreatic disease was facilitated. • The direct quantification of trypsin in urine samples was demonstrated.

  14. Micromethod for quantification of SH groups generated after reduction of monoclonal antibodies

    International Nuclear Information System (INIS)

    Escobar, Normando Iznaga; Morales, Alejo; Nunez, Gilda

    1996-01-01

    A simple, rapid, and reproducible micromethod for quantification of sulfhydryl (SH) groups generated after reduction of monoclonal antibody (MAb) disulfide bonds with 2-mercaptoethanol (2-ME) is described. The number of SH groups per molecule of antibody in 2-ME and in the other reducing agents was calculated from the cysteine standard curve, using Ellman's reagent to develop the yellow color. Results were plotted as absorbance at 405 nm vs. cysteine concentration (μg/mL). After subtraction of the background due to Ellman's reagent, a straight-line relationship passing through the origin was obtained. The absorption spectrum of the yellow products was checked, and no significant differences were found between optical density at 412 nm and 405 nm. Using a small quantity of antibody, on the order of 37 μg, the lowest detection limit for cysteine quantification was 0.03 μg. An excellent linear correlation was found between cysteine concentration and absorbance (r = 0.999), and the mean value of the relative error in the quantification of cysteine from samples was 2.8%. A Student t-test showed excellent linearity and parallelism between the cysteine standard and samples.
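
    Because the standard curve passes through the origin, a no-intercept least-squares fit (slope = Σxy/Σx²) suffices to back-calculate sample concentrations from blank-subtracted absorbances. The readings below are invented for illustration, not the paper's data:

    ```python
    def slope_through_origin(x, y):
        """Least-squares slope of a line forced through the origin."""
        return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

    # Hypothetical cysteine standards and blank-subtracted A405 readings.
    cysteine_ug_ml = [0.5, 1.0, 2.0, 4.0]
    a405 = [0.11, 0.20, 0.42, 0.80]
    k = slope_through_origin(cysteine_ug_ml, a405)  # absorbance per (ug/mL)

    # Back-calculate an unknown sample from its absorbance.
    sample_a405 = 0.30
    sample_conc = sample_a405 / k  # micrograms/mL of free SH, as cysteine
    ```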

  15. Micromethod for quantification of SH groups generated after reduction of monoclonal antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Normando Iznaga; Morales, Alejo; Nunez, Gilda

    1996-07-01

    A simple, rapid, and reproducible micromethod for quantification of sulfhydryl (SH) groups generated after reduction of monoclonal antibody (MAb) disulfide bonds with 2-mercaptoethanol (2-ME) is described. The number of SH groups per molecule of antibody in 2-ME and in the other reducing agents was calculated from a cysteine standard curve, using Ellman's reagent to develop the yellow color. Results were plotted as absorbance at 405 nm vs. cysteine concentration (μg/mL). After subtraction of the background due to Ellman's reagent, a straight-line relationship passing through the origin was obtained. The absorption spectrum of the yellow products was checked, and no significant differences were found between the optical densities at 412 nm and 405 nm. Using a small quantity of antibody, on the order of 37 μg, the lowest detection limit for cysteine quantification was 0.03 μg. An excellent linear correlation was found between cysteine concentration and absorbance (r = 0.999), and the mean relative error in the quantification of cysteine from samples was 2.8%. A statistical Student t-test showed excellent linearity and parallelism between the cysteine standard and samples.

  16. Quantification and Multi-purpose Allocation of Water Resources in a Dual-reservoir System

    Science.gov (United States)

    Salami, Y. D.

    2017-12-01

    Transboundary rivers that run through separate water management jurisdictions sometimes experience competitive water usage. Where the river has multiple existing or planned dams along its course, quantification and efficient allocation of water for such purposes as hydropower generation, irrigation for agriculture, and water supply can be a challenge. This problem is even more pronounced when large parts of the river basin are located in semi-arid regions known for water insecurity, poor crop yields from irrigation scheme failures, and human population displacement arising from water-related conflict. This study seeks to mitigate the impacts of such factors on the Kainji-Jebba dual-reservoir system located along the Niger River in Africa by seasonally quantifying and efficiently apportioning water to all stipulated uses of both dams, thereby improving operational policy and long-term water security. Historical storage fluctuations (18 km³ to 5 km³) and flows into and out of both reservoirs were analyzed for relationships to such factors as surrounding catchment contribution, dam operational policies, and irrigation and hydropower requirements. Optimum values of the aforementioned parameters were then determined by simulations based upon hydrological contributions and withdrawals and worst-case scenarios of natural and anthropogenic conditions (such as the annual probability of reservoir depletion) affecting water availability and allocation. Finally, quantification and optimized allocation of water was done based on needs for hydropower, irrigation for agriculture, water supply, and storage evacuation for flood control. Results revealed that water supply potential increased by 69%, average agricultural yield improved by 36%, and hydropower generation increased by 54% and 66% at the upstream and downstream dams respectively. Lessons learned from this study may help provide a robust and practical means of water resources management in similar river basins and multi
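
    The kind of priority-based seasonal allocation the abstract describes can be illustrated with a toy single-period reservoir mass balance. The capacities, demands, and priority order below are invented for illustration and are not values from the study:

```python
# Hypothetical sketch of one allocation step for a reservoir in a
# dual-reservoir system like Kainji-Jebba: satisfy demands in priority
# order from storage plus inflow, respect dead storage, spill any excess.

def step(storage, inflow, demands, capacity, dead_storage=0.0):
    """Advance one period; returns (new_storage, releases, spill).

    demands is a list of (name, requested_volume) in priority order.
    All volumes share one unit (e.g. km^3 per season)."""
    available = storage + inflow
    releases = {}
    for name, want in demands:
        grant = min(want, max(available - dead_storage, 0.0))
        releases[name] = grant
        available -= grant
    spill = max(available - capacity, 0.0)
    return min(available, capacity), releases, spill

# Upstream reservoir: hydropower has priority over irrigation (invented).
s, rel, spill = step(storage=10.0, inflow=4.0,
                     demands=[("hydropower", 3.0), ("irrigation", 2.0)],
                     capacity=12.0, dead_storage=1.0)
print(s, rel, spill)
```

    Running this over a sequence of seasonal inflows, and varying the demand priorities, is the simplest way to explore the trade-offs the study optimizes.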

  17. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    Science.gov (United States)

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. To validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoalantolactone were 1 µg and 10 µg respectively; for andrographolide they were 1.5 µg and 15 µg respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.

  18. Climate impacts on palm oil yields in the Nigerian Niger Delta

    Science.gov (United States)

    Okoro, Stanley U.; Schickhoff, Udo; Boehner, Juergen; Schneider, Uwe A.; Huth, Neil

    2016-04-01

    Palm oil production has increased in recent decades and is estimated to increase further. The optimal role of palm oil production, however, is controversial because of resource conflicts with alternative land uses. Local conditions and climate change affect resource competition and the desirability of palm oil production. Crop yield simulations driven by different climate model outputs under different climate scenarios could therefore be an important tool for addressing the problem of uncertainty quantification among climate model outputs. Previous studies of this region have focused mostly on single experimental fields, have not considered variations in agro-ecological zones, climatic conditions, varieties, and management practices, have in most cases not extended to the various IPCC climate scenarios, and were mostly based on a single climate model output. Furthermore, the uncertainty quantification of the climate-impact model has rarely been investigated for this region. To this end, we use the biophysical simulation model APSIM (Agricultural Production Systems Simulator) to simulate the regional climate impact on oil palm yield over the Nigerian Niger Delta. We also examine whether using an ensemble of crop yield model outputs reduces the uncertainty more than using an ensemble of climate model outputs. The results could serve as a baseline for policy makers in this region in understanding the interactions between the region's potential for energy crop production, its food security, and other negative feedbacks that could be associated with bioenergy from oil palm. Keywords: Climate Change, Climate impacts, Land use and Crop yields.

  19. Quantification of massively parallel sequencing libraries - a comparative study of eight methods

    DEFF Research Database (Denmark)

    Hussing, Christian; Kampmann, Marie-Louise; Mogensen, Helle Smidt

    2018-01-01

    Quantification of massively parallel sequencing libraries is important for acquisition of monoclonal beads or clusters prior to clonal amplification and to avoid large variations in library coverage when multiple samples are included in one sequencing analysis. No gold standard for quantification...... estimates followed by Qubit and electrophoresis-based instruments (Bioanalyzer, TapeStation, GX Touch, and Fragment Analyzer), while SYBR Green and TaqMan based qPCR assays gave the lowest estimates. qPCR gave more accurate predictions of sequencing coverage than Qubit and TapeStation did. Costs, time......-consumption, workflow simplicity, and ability to quantify multiple samples are discussed. Technical specifications, advantages, and disadvantages of the various methods are pointed out....

  20. Quantification of C2 cervical spine rotatory fixation by X-ray, MRI and CT

    Energy Technology Data Exchange (ETDEWEB)

    Gradl, Georg [Chirurgische Klinik und Poliklinik der Universitaet Rostock, Abteilung Unfall- und Wiederherstellungschirurgie, Rostock (Germany); Maier-Bosse, Tamara; Staebler, Axel [Institut fuer Radiologische Diagnostik der Universitaet Muenchen, Klinikum Grobetahadern, Munich (Germany); Penning, Randolph [Institut fuer Rechtsmedizin der Universitaet Muenchen, Munich (Germany)

    2005-02-01

    Atlanto-axial rotatory displacement is known to be a cause of childhood torticollis and may also be responsible for chronic neck pain after rear-end automobile collisions. The objective was to determine whether quantification of C2 malrotation is possible by plain radiographs in comparison with CT as the gold standard. MR imaging was evaluated as to whether it was of equal value in the detection of bony landmarks. The C2 vertebrae of five human cadaveric cervical spine specimens, ligamentously intact, were rotated using a Steinmann pin in steps of 5° up to 15° right and 15° left. Plain radiographs, CT and MRI images were taken at each rotational step. Data were analyzed for quantification of C2 rotation by three independent examiners. A rotation of 5° led to a spinous process deviation (SPD) from the midline of 3 mm as measured on an a.p. plain radiograph. A coefficient of rotation was calculated (1.62 mm⁻¹). Data analyzed by three examiners revealed a small coefficient of variation (0.03). MRI and CT measurements showed comparable results for the quantification of rotation; however, in both techniques the 15° rotation was underestimated. Quantification of upper cervical spine malrotation was possible on plain radiographs using the SPD and a rotation coefficient. MRI and CT were equally successful in the assessment of C2 malrotation. (orig.)
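
    As a minimal illustration of the reported relationship, the sketch below converts a measured spinous process deviation into an estimated C2 rotation using the stated coefficient. The assumption that the coefficient maps millimetres of SPD directly to degrees of rotation is ours, not spelled out in the abstract:

```python
# Sketch: estimated C2 rotation from spinous process deviation (SPD)
# on an a.p. radiograph, assuming the reported coefficient (1.62 per mm)
# converts mm of deviation to degrees of rotation.

ROTATION_COEFF_DEG_PER_MM = 1.62

def c2_rotation_deg(spd_mm):
    """Estimated C2 rotation (degrees) from SPD (mm)."""
    return ROTATION_COEFF_DEG_PER_MM * spd_mm

# A 3 mm deviation corresponds to roughly the 5-degree rotation step
# reported in the experiment.
print(round(c2_rotation_deg(3.0), 2))  # 4.86
```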

  1. New approach for the quantification of processed animal proteins in feed using light microscopy.

    Science.gov (United States)

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  2. Visualization and quantification of evolving datasets. Final report: 8-1-93 - 4-30-97

    International Nuclear Information System (INIS)

    Zabusky, N.; Silver, D.

    1999-01-01

    The material below is the final technical/progress report of the Laboratory for Visiometrics and Modeling (Vizlab) for the grant entitled Visualization and Quantification of Evolving Phenomena. This includes coordination with DOE-supported scientists at Los Alamos National Laboratory (LANL) and Princeton Plasma Physics Laboratory (PPPL), and with theoretical and computational physicists at the National Institute for Fusion Science (NIFS) in Nagoya, Japan and the Institute of Laser Engineering (ILE) in Osaka, Japan. The authors' research areas included: enhancement and distribution of the DAVID environment, a 2D visualization environment incorporating many advanced quantifications and diagnostics useful for prediction, understanding, and reduced model formation; feature extraction, tracking and quantification of 3D time-dependent datasets of nonlinear and turbulent simulations, both compressible and incompressible, work applicable to all 3D time-varying simulations; and visiometrics in shock-interface interactions and mixing for the Richtmyer-Meshkov (RM) environment, work that highlights reduced models for nonlinear evolutions and the role of density-stratified interfaces (contact discontinuities) and has applications to supernova physics, laser fusion and supersonic combustion. The collaborative projects included: (1) feature extraction, tracking and quantification in 3D turbulence, compressible and incompressible; (2) the Numerical Tokamak Project (NTP); and (3) data projection and reduced modeling for shock-interface interactions and mixing (the Richtmyer-Meshkov (RM) environment relevant to laser fusion and combustion).

  3. Quantification of C2 cervical spine rotatory fixation by X-ray, MRI and CT

    International Nuclear Information System (INIS)

    Gradl, Georg; Maier-Bosse, Tamara; Staebler, Axel; Penning, Randolph

    2005-01-01

    Atlanto-axial rotatory displacement is known to be a cause of childhood torticollis and may also be responsible for chronic neck pain after rear-end automobile collisions. The objective was to determine whether quantification of C2 malrotation is possible by plain radiographs in comparison with CT as the gold standard. MR imaging was evaluated as to whether it was of equal value in the detection of bony landmarks. The C2 vertebrae of five human cadaveric cervical spine specimens, ligamentously intact, were rotated using a Steinmann pin in steps of 5° up to 15° right and 15° left. Plain radiographs, CT and MRI images were taken at each rotational step. Data were analyzed for quantification of C2 rotation by three independent examiners. A rotation of 5° led to a spinous process deviation (SPD) from the midline of 3 mm as measured on an a.p. plain radiograph. A coefficient of rotation was calculated (1.62 mm⁻¹). Data analyzed by three examiners revealed a small coefficient of variation (0.03). MRI and CT measurements showed comparable results for the quantification of rotation; however, in both techniques the 15° rotation was underestimated. Quantification of upper cervical spine malrotation was possible on plain radiographs using the SPD and a rotation coefficient. MRI and CT were equally successful in the assessment of C2 malrotation. (orig.)

  4. Chromatic and anisotropic cross-recurrence quantification analysis of interpersonal behavior

    NARCIS (Netherlands)

    Cox, R.F.A; van der Steen, Stephanie; Guevara Guerrero, Marlenny; Hoekstra, Lisette; van Dijk, Marijn; Webber, Charles; Ioana, Cornel; Marwan, Norbert

    Cross-recurrence quantification analysis (CRQA) is a powerful nonlinear time-series method to study coordination and cooperation between people. This chapter concentrates on two methodological issues related to CRQA on categorical data streams, which are commonly encountered in the behavioral
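
    For categorical data streams such as coded behaviors, the core of CRQA can be illustrated with a toy cross-recurrence matrix and its simplest measure, the recurrence rate. The behavior codes below are invented for illustration:

```python
# Minimal sketch of a cross-recurrence plot for two categorical series:
# point (i, j) recurs when series a at time i shows the same category as
# series b at time j. %REC (recurrence rate) is the fraction of such points.

def cross_recurrence_matrix(a, b):
    """Binary cross-recurrence matrix for two categorical sequences."""
    return [[1 if ai == bj else 0 for bj in b] for ai in a]

def recurrence_rate(matrix):
    """Fraction of recurrent points (%REC as a proportion)."""
    total = sum(len(row) for row in matrix)
    return sum(sum(row) for row in matrix) / total

child = ["look", "reach", "reach", "talk"]
parent = ["look", "look", "reach", "talk"]
rm = cross_recurrence_matrix(child, parent)
print(recurrence_rate(rm))  # 0.3125
```

    Diagonal-line structures in such a matrix (runs of matching categories) are what the richer CRQA measures, such as determinism, quantify.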

  5. Development of a competitive PCR assay for the quantification of ...

    African Journals Online (AJOL)

    ONOS

    2010-01-25

    Jan 25, 2010 ... quantification of total Escherichia coli DNA in water. Omar Kousar Banu, Barnard .... Thereafter the product was ligated into the pGEM®T-easy cloning ... agarose gel using the high pure PCR product purification kit. (Roche® ...

  6. Histomorphometric quantification of human pathological bones from synchrotron radiation 3D computed microtomography

    International Nuclear Information System (INIS)

    Nogueira, Liebert P.; Braz, Delson

    2011-01-01

    Conventional bone histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed microtomography is a noninvasive technique that can be used to evaluate histomorphometric indices of trabecular bone (BV/TV, BS/BV, Tb.N, Tb.Th, Tb.Sp). In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional approach, in which the quantification is performed on 2D slices and extrapolated to the 3D case. In this work, histomorphometric quantification using synchrotron 3D X-ray computed microtomography was performed on pathological samples of human bone. Samples of human bones were cut into small blocks (8 mm x 8 mm x 10 mm) with a precision saw and then imaged. The computed microtomographies were obtained at the SYRMEP (Synchrotron Radiation for MEdical Physics) beamline at the ELETTRA synchrotron radiation facility (Italy). The obtained 3D images yielded excellent resolution and details of intra-trabecular bone structures, including the marrow present inside trabeculae. The histomorphometric quantification was also compared with the literature. (author)
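
    The first of the histomorphometric indices listed above, BV/TV, is simply the bone-voxel fraction of a segmented (binary) volume. A minimal sketch on a toy volume follows; the more structural indices (Tb.Th, Tb.Sp, etc.) require model-based or distance-transform methods and are omitted here:

```python
# Illustrative computation of BV/TV (bone volume / total volume) from a
# binary 3D microCT volume, where 1 marks bone voxels and 0 marks
# marrow/void. The toy volume below is invented.

def bv_tv(volume):
    """BV/TV from a nested-list binary volume (planes x rows x voxels)."""
    bone = total = 0
    for plane in volume:
        for row in plane:
            for voxel in row:
                total += 1
                bone += voxel
    return bone / total

# Tiny 2x2x2 toy volume: 3 of 8 voxels are bone.
vol = [[[1, 0], [0, 1]],
       [[0, 0], [1, 0]]]
print(bv_tv(vol))  # 0.375
```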

  7. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  8. SPECT quantification: a review of the different correction methods with compton scatter, attenuation and spatial deterioration effects

    International Nuclear Information System (INIS)

    Groiselle, C.; Rocchisani, J.M.; Moretti, J.L.; Dreuille, O. de; Gaillard, J.F.; Bendriem, B.

    1997-01-01

    The improvement of gamma cameras and of acquisition and reconstruction software opens new perspectives in terms of image quantification in nuclear medicine. To meet this challenge, numerous works have been undertaken in recent years to correct for the different physical phenomena that prevent an exact estimation of the radioactivity distribution. The main phenomena that have to be taken into account are scatter, attenuation and resolution. In this work, the authors present the physical basis of each issue, its consequences for quantification, and the main methods proposed to correct for it. (authors)

  9. Quantification of Representative Ciguatoxins in the Pacific Using Quantitative Nuclear Magnetic Resonance Spectroscopy

    Directory of Open Access Journals (Sweden)

    Tsuyoshi Kato

    2017-10-01

    The absolute quantification of five toxins involved in ciguatera fish poisoning (CFP) in the Pacific was carried out by quantitative 1H-NMR. The targeted toxins were ciguatoxin-1B (CTX1B), 52-epi-54-deoxyciguatoxin-1B (epideoxyCTX1B), ciguatoxin-3C (CTX3C), 51-hydroxyciguatoxin-3C (51OHCTX3C), and ciguatoxin-4A (CTX4A). We first calibrated the residual protons of pyridine-d5 using a certified reference material, 1,4-BTMSB-d4, prepared the toxin solutions with the calibrated pyridine-d5, measured the 1H-NMR spectra, and quantified the toxins using the calibrated residual protons as the internal standard. The absolute quantification was carried out by comparing the signal intensities between the selected protons of the target toxin and the residual protons of the calibrated pyridine-d5. The proton signals residing on the ciguatoxins (CTXs) to be used for quantification were carefully selected to be well separated from adjacent signals, including impurities, and to exhibit an effective intensity. To quantify CTX1B and its congeners, the olefin protons in the side chain were judged appropriate for use. The quantification was achievable with nanomolar solutions. The probable errors for uncertainty, calculated for the respective toxins, ranged between 3% and 16%. Contamination of the precious toxins with nonvolatile internal standards was thus avoided. After evaporation of the pyridine-d5, the calibrated CTXs were ready for use as reference standards in the quantitative analysis of ciguatoxins by LC/MS.
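
    The comparison of signal intensities described above follows the standard qHNMR relation: analyte molarity equals the reference molarity scaled by the integral ratio and the inverse ratio of proton counts. The sketch below uses invented numbers; the 2-proton analyte signal loosely mirrors the olefinic side-chain protons mentioned in the abstract:

```python
# Sketch of the qHNMR relation underlying the assay. Integrals (I),
# proton counts (N) and the reference molarity are illustrative, not
# values from the paper.

def qnmr_molar_conc(i_analyte, n_analyte, i_ref, n_ref, conc_ref):
    """Analyte molarity from signal integrals, proton counts, and the
    molarity of the internal (residual-proton) reference."""
    return conc_ref * (i_analyte / i_ref) * (n_ref / n_analyte)

# e.g. a 2-proton analyte signal against a 1-proton reference signal
c = qnmr_molar_conc(i_analyte=0.40, n_analyte=2,
                    i_ref=1.00, n_ref=1, conc_ref=50e-6)
print(c)  # about 10 micromolar
```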

  10. The relative contributions of scatter and attenuation corrections toward improved brain SPECT quantification

    International Nuclear Information System (INIS)

    Stodilka, Robert Z.; Msaki, Peter; Prato, Frank S.; Nicholson, Richard L.; Kemp, B.J.

    1998-01-01

    Mounting evidence indicates that scatter and attenuation are major confounds to the objective diagnosis of brain disease by quantitative SPECT. There is considerable debate, however, as to the relative importance of scatter correction (SC) and attenuation correction (AC), and how they should be implemented. The efficacy of SC and AC for 99mTc brain SPECT was evaluated using a two-compartment, fully tissue-equivalent anthropomorphic head phantom. Four correction schemes were implemented: uniform broad-beam AC, non-uniform broad-beam AC, uniform SC+AC, and non-uniform SC+AC. SC was based on non-stationary deconvolution scatter subtraction, modified to incorporate a priori knowledge of either the head contour (uniform SC) or the transmission map (non-uniform SC). The quantitative accuracy of the correction schemes was evaluated in terms of contrast recovery, relative quantification (cortical:cerebellar activity), uniformity ((coefficient of variation of 230 macro-voxels) × 100%), and bias (relative to a calibration scan). Our results were: uniform broad-beam (μ = 0.12 cm⁻¹) AC (the most popular correction): 71% contrast recovery, 112% relative quantification, 7.0% uniformity, +23% bias. Non-uniform broad-beam (soft tissue μ = 0.12 cm⁻¹) AC: 73%, 114%, 6.0%, +21%, respectively. Uniform SC+AC: 90%, 99%, 4.9%, +12%, respectively. Non-uniform SC+AC: 93%, 101%, 4.0%, +10%, respectively. SC combined with AC achieved the best quantification; however, non-uniform corrections produce only small improvements over their uniform counterparts. SC+AC was found to be superior to AC alone; this advantage is distinct and consistent across all four quantification indices. (author)
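
    Two of the four quantification indices used in the comparison, uniformity and bias, are straightforward to compute. A sketch with illustrative macro-voxel values (not data from the study):

```python
# Sketch of two SPECT quantification indices from the abstract:
# uniformity = coefficient of variation across macro-voxel values x 100%,
# bias = signed deviation of a measured mean from a calibration value.

import statistics

def uniformity_percent(values):
    """Coefficient of variation across macro-voxel values, in percent."""
    return statistics.pstdev(values) / statistics.mean(values) * 100.0

def bias_percent(measured, calibration):
    """Signed bias relative to the calibration value, in percent."""
    return (measured - calibration) * 100.0 / calibration

voxels = [98.0, 102.0, 100.0, 100.0]  # illustrative macro-voxel values
print(round(uniformity_percent(voxels), 2))  # 1.41
print(bias_percent(112.0, 100.0))            # 12.0
```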

  11. Quantification of Representative Ciguatoxins in the Pacific Using Quantitative Nuclear Magnetic Resonance Spectroscopy.

    Science.gov (United States)

    Kato, Tsuyoshi; Yasumoto, Takeshi

    2017-10-12

    The absolute quantification of five toxins involved in ciguatera fish poisoning (CFP) in the Pacific was carried out by quantitative ¹H-NMR. The targeted toxins were ciguatoxin-1B (CTX1B), 52-epi-54-deoxyciguatoxin-1B (epideoxyCTX1B), ciguatoxin-3C (CTX3C), 51-hydroxyciguatoxin-3C (51OHCTX3C), and ciguatoxin-4A (CTX4A). We first calibrated the residual protons of pyridine-d₅ using a certified reference material, 1,4-BTMSB-d₄, prepared the toxin solutions with the calibrated pyridine-d₅, measured the ¹H-NMR spectra, and quantified the toxins using the calibrated residual protons as the internal standard. The absolute quantification was carried out by comparing the signal intensities between the selected protons of the target toxin and the residual protons of the calibrated pyridine-d₅. The proton signals residing on the ciguatoxins (CTXs) to be used for quantification were carefully selected to be well separated from adjacent signals, including impurities, and to exhibit an effective intensity. To quantify CTX1B and its congeners, the olefin protons in the side chain were judged appropriate for use. The quantification was achievable with nanomolar solutions. The probable errors for uncertainty, calculated for the respective toxins, ranged between 3% and 16%. Contamination of the precious toxins with nonvolatile internal standards was thus avoided. After evaporation of the pyridine-d₅, the calibrated CTXs were ready for use as reference standards in the quantitative analysis of ciguatoxins by LC/MS.

  12. Bone histomorphometric quantification by X-ray phase contrast and transmission 3D SR microcomputed tomography

    International Nuclear Information System (INIS)

    Nogueira, L.P.; Pinheiro, C.J.G.; Braz, D.; Oliveira, L.F.; Barroso, R.C.

    2008-01-01

    Conventional histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed tomography is a noninvasive technique that can be used to evaluate histomorphometric indices. In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional approach, in which the quantification is performed on 2D slices and extrapolated to the 3D case. Seeking better resolution and visualization of soft tissues, the X-ray phase contrast imaging technique was developed. The objective of this work was to perform histomorphometric quantification of human cancellous bone using 3D synchrotron X-ray computed microtomography with two distinct techniques, transmission and phase contrast, in order to compare the results and evaluate the viability of applying the same quantification methodology to both techniques. All experiments were performed at the ELETTRA Synchrotron Light Laboratory in Trieste (Italy). MicroCT data sets were collected using the CT set-up on the SYRMEP (Synchrotron Radiation for Medical Physics) beamline. Results showed a better correlation between the histomorphometric parameters of the two techniques when morphological filters were used. However, with these filters, some important information given by phase contrast is lost, and this will have to be explored with new quantification techniques.

  13. Air pollution in moderately polluted urban areas: How does the definition of “neighborhood” impact exposure assessment?

    International Nuclear Information System (INIS)

    Tenailleau, Quentin M.; Mauny, Frédéric; Joly, Daniel; François, Stéphane; Bernard, Nadine

    2015-01-01

    Environmental health studies commonly quantify subjects' pollution exposure in their neighborhood. How this neighborhood is defined can vary, however, leading to different approaches to quantification whose impacts on exposure levels remain unclear. We explore the relationship between neighborhood definition and exposure assessment. NO₂, benzene, PM₁₀ and PM₂.₅ exposure estimates were computed in the vicinity of 10,825 buildings using twelve exposure assessment techniques reflecting different definitions of “neighborhood”. At the city scale, the definition does not significantly influence exposure estimates. It does impact levels at the building scale, however: at least a quarter of the buildings' exposure estimates for a 400 m buffer differ from the estimated 50 m buffer value (±1.0 μg/m³ for NO₂, PM₁₀ and PM₂.₅; and ±0.05 μg/m³ for benzene). This variation is significantly related to the definition of neighborhood. It is vitally important for investigators to understand the impact of the chosen assessment techniques on exposure estimates. - Highlights: • Residential building air pollution was calculated using 12 assessment techniques. • These techniques refer to common epidemiological definitions of neighborhood. • At the city scale, neighborhood definition does not impact exposure estimates. • At the building scale, neighborhood definition does impact exposure estimates. • The impact of neighborhood definition varies with physical/deprivation variables. - Ignoring the impact of the neighborhood's definition on exposure estimates could lead to exposure quantification errors that impact the resulting health studies, health risk evaluations, and consequently the entire decision-making process.
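
    A buffer-based exposure estimate of the kind compared in the study (e.g. a 50 m vs. a 400 m neighborhood around a building) can be sketched as the mean modelled concentration over grid points within a given radius. The toy NO2 field below is invented:

```python
# Minimal sketch of a circular-buffer exposure estimate: average the
# modelled concentration over all grid samples within `radius` of the
# building. Coordinates (m) and concentrations (ug/m3) are illustrative.

import math

def buffer_mean(grid, centre, radius):
    """Mean of (x, y, value) samples within `radius` of `centre`."""
    cx, cy = centre
    inside = [v for x, y, v in grid
              if math.hypot(x - cx, y - cy) <= radius]
    return sum(inside) / len(inside)

# toy NO2 field: (x, y, concentration)
field = [(0, 0, 30.0), (30, 0, 32.0), (60, 0, 40.0), (500, 0, 55.0)]

print(buffer_mean(field, centre=(0, 0), radius=50))   # 50 m buffer
print(buffer_mean(field, centre=(0, 0), radius=400))  # 400 m buffer
```

    Widening the buffer pulls in more distant, here higher, concentrations, which is exactly the building-scale sensitivity to the neighborhood definition that the study quantifies.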

  14. Protocol for Quantification of Defects in Natural Fibres for Composites

    DEFF Research Database (Denmark)

    Mortensen, Ulrich Andreas; Madsen, Bo

    2014-01-01

    Natural bast-type plant fibres are attracting increasing interest for being used for structural composite applications where high quality fibres with good mechanical properties are required. A protocol for the quantification of defects in natural fibres is presented. The protocol is based...

  15. Recognition and quantification of pain in horses: A tutorial review

    DEFF Research Database (Denmark)

    Gleerup, Karina Charlotte Bech; Lindegaard, Casper

    2016-01-01

    Pain management is dependent on the quality of the pain evaluation. Ideally, pain evaluation is objective, pain-specific and easily incorporated into a busy equine clinic. This paper reviews the existing knowledge base regarding the identification and quantification of pain in horses. Behavioural...

  16. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  17. WE-G-17A-03: MRIgRT: Quantification of Organ Motion

    International Nuclear Information System (INIS)

    Stanescu, T; Tadic, T; Jaffray, D

    2014-01-01

    Purpose: To develop an MRI-based methodology and the tools required for the quantification of organ motion on a dedicated MRI-guided radiotherapy system. A three-room facility, consisting of a TrueBeam 6X linac vault, a 1.5T MR suite and a brachytherapy interventional room, is currently under commissioning at our institution. The MR scanner can move and image in either room for diagnostic and treatment guidance purposes. Methods: A multi-imaging-modality (MR, kV) phantom, featuring programmable 3D simple and complex motion trajectories, was used for the validation of several image sorting algorithms. The testing was performed on MRI (e.g. TrueFISP, TurboFLASH), 4D CT and 4D CBCT. The image sorting techniques were based on (a) direct image pixel manipulation into columns or rows, (b) single and aggregated pixel data tracking, and (c) computer vision techniques for global pixel analysis. Subsequently, the motion phantom and sorting algorithms were utilized for the commissioning of MR fast imaging techniques for 2D-cine and 4D data rendering. MR imaging protocols were optimized (e.g. readout gradient strength vs. SNR) to minimize the presence of susceptibility-induced distortions, which were reported through phantom experiments and numerical simulations. The system-related distortions were also quantified (dedicated field phantom) and treated as systematic shifts where relevant. Results: Image sorting algorithms were validated for specific MR-based applications such as quantification of organ motion, local data sampling, and 4D MRI for pre-RT delivery, with accuracy better than the raw image pixel size (e.g. 1 mm). MR fast imaging sequences were commissioned, and imaging strategies were developed to mitigate spatial artifacts with minimal penalty on the image's spatial and temporal sampling. Workflows (e.g. liver) were optimized to include the new motion quantification tools for RT planning and daily patient setup verification. Conclusion: Comprehensive methods were developed

  18. Evaluation of the reliability of maize reference assays for GMO quantification.

    Science.gov (United States)

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding histories. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with the poor PCR performance observed for this assay across the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form part of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb

  19. Offshore wind turbine risk quantification/evaluation under extreme environmental conditions

    International Nuclear Information System (INIS)

    Taflanidis, Alexandros A.; Loukogeorgaki, Eva; Angelides, Demos C.

    2013-01-01

    A simulation-based framework is discussed in this paper for quantification/evaluation of risk and development of automated risk assessment tools, focusing on applications to offshore wind turbines under extreme environmental conditions. The framework is founded on a probabilistic characterization of the uncertainty in the models for the excitation, the turbine and its performance. Risk is then quantified as the expected value of some risk consequence measure over the probability distributions considered for the uncertain model parameters. Stochastic simulation is proposed for the risk assessment, corresponding to the evaluation of some associated probabilistic integral quantifying risk, as it allows for the adoption of comprehensive computational models for describing the dynamic turbine behavior. For improvement of the computational efficiency, a surrogate modeling approach is introduced based on moving least squares response surface approximations. The assessment is also extended to a probabilistic sensitivity analysis that identifies the importance of each of the uncertain model parameters, i.e. risk factors, towards the total risk as well as towards each of the failure modes contributing to this risk. The versatility and computational efficiency of the advocated approaches are finally exploited to support the development of standalone risk assessment applets for automated implementation of the probabilistic risk quantification/assessment. -- Highlights: ► A simulation-based risk quantification/assessment framework is discussed. ► Focus is on offshore wind turbines under extreme environmental conditions. ► Approach is founded on probabilistic description of excitation/system model parameters. ► Surrogate modeling is adopted for improved computational efficiency. ► Standalone risk assessment applets for automated implementation are supported
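    The risk integral described above, an expected consequence over uncertain excitation and system parameters, can be approximated by plain Monte Carlo sampling. The sketch below is illustrative only: the limit-state proxy, parameter distributions and threshold are invented stand-ins, not the turbine model of the paper.

```python
import random

def consequence(wave_height, stiffness):
    # Hypothetical consequence measure: indicator that a simple
    # response proxy (height/stiffness) exceeds a threshold
    return 1.0 if wave_height / stiffness > 2.0 else 0.0

def risk_mc(n=20000, seed=7):
    # Risk = E[consequence] over the assumed parameter distributions,
    # estimated by stochastic simulation
    random.seed(seed)
    total = 0.0
    for _ in range(n):
        h = random.gauss(10.0, 2.0)  # uncertain extreme wave height (m)
        k = random.gauss(6.0, 0.5)   # uncertain structural stiffness proxy
        total += consequence(h, k)
    return total / n

failure_prob = risk_mc()  # roughly 0.19 for these made-up distributions
```

    In the paper, a moving least squares response surface stands in for the expensive dynamic model inside such a loop; the structure of the estimator itself is unchanged.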

  20. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
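    The probability bounds mentioned above can be illustrated with a minimal Dempster–Shafer computation: belief and plausibility bracket the classical probability of an event. The focal elements and masses below are invented for illustration; the paper's actual integration with Bayesian model averaging is more involved.

```python
def belief_plausibility(masses, event):
    # masses: dict mapping frozenset (focal element) -> basic mass assignment
    # Bel(A): total mass committed to subsets of A (lower bound on P(A))
    # Pl(A):  total mass of focal elements intersecting A (upper bound on P(A))
    bel = sum(m for s, m in masses.items() if s <= event)
    pl = sum(m for s, m in masses.items() if s & event)
    return bel, pl

masses = {
    frozenset({"model_a"}): 0.5,
    frozenset({"model_b"}): 0.2,
    frozenset({"model_a", "model_b"}): 0.3,  # ignorance: mass on "either model"
}
bel, pl = belief_plausibility(masses, frozenset({"model_a"}))
# bel = 0.5, pl = 0.8: classical P(model_a) is bounded to [0.5, 0.8]
```

    The width of the [Bel, Pl] interval is exactly the mass left on ambiguous sets, i.e. the epistemic ignorance that a single probability distribution would hide.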

  1. Accurate and precise DNA quantification in the presence of different amplification efficiencies using an improved Cy0 method.

    Science.gov (United States)

    Guescini, Michele; Sisti, Davide; Rocchi, Marco B L; Panebianco, Renato; Tibollo, Pasquale; Stocchi, Vilberto

    2013-01-01

    Quantitative real-time PCR represents a highly sensitive and powerful technology for the quantification of DNA. Although real-time PCR is well accepted as the gold standard in nucleic acid quantification, there is a largely unexplored area of experimental conditions that limit the application of the Ct method. As an alternative, our research team has recently proposed the Cy0 method, which can compensate for small amplification variations among the samples being compared. However, when there is a marked decrease in amplification efficiency, the Cy0 is impaired, hence determining reaction efficiency is essential to achieve a reliable quantification. The proposed improvement in Cy0 is based on the use of the kinetic parameters calculated in the curve inflection point to compensate for efficiency variations. Three experimental models were used: inhibition of primer extension, non-optimal primer annealing and a very small biological sample. In all these models, the improved Cy0 method increased quantification accuracy up to about 500% without affecting precision. Furthermore, the stability of this procedure was enhanced integrating it with the SOD method. In short, the improved Cy0 method represents a simple yet powerful approach for reliable DNA quantification even in the presence of marked efficiency variations.
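    The geometric idea behind Cy0 — the cycle at which the tangent at the amplification curve's inflection point crosses the baseline — can be sketched for a logistic fit. This is a schematic with made-up parameters: the published method fits a richer (Richards-type) curve to real fluorescence data, whereas a symmetric logistic is used here for brevity.

```python
import math

def logistic(x, fmax, k, x0):
    # Symmetric logistic amplification curve (baseline-subtracted fluorescence)
    return fmax / (1.0 + math.exp(-k * (x - x0)))

def cy0(fmax, k, x0):
    # Cy0: cycle where the tangent at the inflection point (x0, fmax/2)
    # crosses zero fluorescence. The slope there is k*fmax/4, hence
    #   Cy0 = x0 - (fmax/2) / (k*fmax/4) = x0 - 2/k
    return x0 - 2.0 / k

# Two invented reactions with equal x0 but different efficiencies (slopes):
print(cy0(100.0, 0.8, 25.0))  # 22.5
print(cy0(100.0, 0.5, 25.0))  # 21.0 - lower efficiency shifts Cy0
```

    Because the slope k enters the Cy0 expression directly, kinetic parameters at the inflection point can compensate for efficiency differences between reactions, which is the intuition behind the improvement described above.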

  2. Quantification and characterization of Si in Pinus Insignis Dougl by TXRF

    Energy Technology Data Exchange (ETDEWEB)

    Navarro, Henry; Bennun, Leonardo [Universidad de Concepcion, Laboratorio de Fisica Aplicada, Departamento de Fisica, Concepcion (Chile); Marco, Lue M. [Universidad Centro Lisandro Alvarado, Decanato de Agronomia, Depto. de Quimica, Barquisimeto (Venezuela, Bolivarian Republic of)

    2014-12-09

    A simple quantification of silicon is described in woods such as Pinus Insigne Dougl obtained from the 8th region of Bio-Bio (37°15′ South, 73°19′ West), Chile. The samples were prepared through fractional calcination, and the ashes were directly analyzed by the total reflection X-ray fluorescence (TXRF) technique. The analysis of 16 samples that were calcined is presented. The samples were weighed on plastic reflectors in a microbalance with a sensitivity of 0.1 μg. Later, the samples were irradiated in a TXRF PICOFOX spectrometer, for 350 and 700 s. To each sample, cobalt was added as an internal standard. Concentrations of silicon over 1 % were observed in each sample, along with a self-absorption effect on the quantification at masses higher than 100 μg. (orig.)
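    Internal-standard quantification in TXRF, as done above with cobalt, reduces to a ratio of net peak intensities scaled by the analyte's sensitivity relative to the internal standard. All numbers below are invented for illustration.

```python
def txrf_concentration(net_counts_analyte, net_counts_is, conc_is, rel_sensitivity):
    # Standard internal-standard relation for TXRF:
    #   C_analyte = (N_analyte / N_IS) * C_IS / S_rel
    # where S_rel is the analyte sensitivity relative to the internal standard
    return (net_counts_analyte / net_counts_is) * conc_is / rel_sensitivity

# Hypothetical example: Si K-alpha counts vs. a Co spike at 10 (concentration units)
c_si = txrf_concentration(1200.0, 3000.0, 10.0, 0.2)  # 20.0, same units as the spike
```

    The relation holds only while the thin-film approximation does; the self-absorption observed above at masses over 100 μg is precisely where this linearity breaks down.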

  3. Quantification and characterization of Si in Pinus Insignis Dougl by TXRF

    International Nuclear Information System (INIS)

    Navarro, Henry; Bennun, Leonardo; Marco, Lue M.

    2015-01-01

    A simple quantification of silicon is described in woods such as Pinus Insigne Dougl obtained from the 8th region of Bio-Bio (37°15′ South, 73°19′ West), Chile. The samples were prepared through fractional calcination, and the ashes were directly analyzed by the total reflection X-ray fluorescence (TXRF) technique. The analysis of 16 samples that were calcined is presented. The samples were weighed on plastic reflectors in a microbalance with a sensitivity of 0.1 μg. Later, the samples were irradiated in a TXRF PICOFOX spectrometer, for 350 and 700 s. To each sample, cobalt was added as an internal standard. Concentrations of silicon over 1 % were observed in each sample, along with a self-absorption effect on the quantification at masses higher than 100 μg. (orig.)

  4. Quantification of Wine Mixtures with an Electronic Nose and a Human Panel

    Science.gov (United States)

    Aleixandre, Manuel; Cabellos, Juan M.; Arroyo, Teresa; Horrillo, M. C.

    2018-01-01

    In this work, an electronic nose and a human panel were used for the quantification of wines formed by binary mixtures of four white grape varieties and two varieties of red wines at different percentages (from 0 to 100% in 10% steps for the electronic nose and from 0 to 100% in 25% steps for the human panel). The wines were prepared using the traditional method with commercial yeasts. Both techniques were able to quantify the mixtures tested, but it is important to note that the technology of the electronic nose is faster, simpler, and more objective than the human panel. In addition, better results of quantification were also obtained using the electronic nose. PMID:29484296

  5. Quantification of agricultural drought occurrence as an estimate for insurance programs

    Science.gov (United States)

    Bannayan, M.; Hoogenboom, G.

    2015-11-01

    Temporal irregularities of rainfall and drought have major impacts on rainfed cropping systems. The main goal of this study was to develop an approach for identifying drought occurrence based on local winter wheat yield loss and rainfall. The study domain included 11 counties in the state of Washington that actively grow rainfed winter wheat, and an uncertainty rainfall evaluation model was built using daily rainfall values from 1985 to 2007. An application was developed that calculates a rainfall index for insurance, which was then used to determine the drought intensity for each study year and each study site. Evaluation of the drought intensity showed that both the 1999-2000 and 2000-2001 growing seasons were stressful years for most of the study locations, while the 2005-2006 and the 2006-2007 growing seasons experienced the lowest drought intensity for all locations. Our results are consistent with local extension reports of drought occurrences. Quantification of drought intensity based on this application could provide a convenient index for insurance companies for determining the effect of rainfall and drought on crop yield loss under the varying weather conditions of semi-arid regions.
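    A rainfall index of the kind described — seasonal rainfall relative to the local long-term mean, mapped to drought classes — can be sketched as follows. The class breaks and rainfall figures are hypothetical; the paper's actual index and intensity scale would be calibrated against observed yield losses.

```python
def rainfall_index(season_total_mm, climatology_mm):
    # Season rainfall expressed as a fraction of the long-term mean
    # for the same seasonal window
    mean = sum(climatology_mm) / len(climatology_mm)
    return season_total_mm / mean

def drought_intensity(index):
    # Hypothetical class breaks; a real insurance trigger would be
    # calibrated to local yield-loss data
    if index >= 0.9:
        return "none"
    if index >= 0.7:
        return "moderate"
    return "severe"

history = [310.0, 280.0, 295.0, 330.0, 285.0]  # mm per growing season (invented)
idx = rainfall_index(180.0, history)           # 0.6 of the long-term mean
label = drought_intensity(idx)                 # "severe"
```

    An insurer could attach payout tiers to such classes, which is the sense in which the index above serves as an estimate for insurance programs.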

  6. Direct quantification of nickel in stainless steels by spectrophotometry

    International Nuclear Information System (INIS)

    Singh, Ritu; Raut, Vaibhavi V.; Jeyakumar, S.; Ramakumar, K.L.

    2007-01-01

    A spectrophotometric method based on the Ni-DMG complex for the quantification of nickel in steel samples without employing any prior separation is reported in the present study. The interfering ions are masked by suitable complexing agents and the method was extended to real samples after validating with BCS and Euro steel standards. (author)

  7. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, which include the Robobee. The model is built in the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  8. Serendipity: Global Detection and Quantification of Plant Stress

    Science.gov (United States)

    Schimel, D.; Verma, M.; Drewry, D.

    2016-12-01

    Detecting and quantifying plant stress is a grand challenge for remote sensing, and is important for understanding climate impacts on ecosystems broadly and also for early warning systems supporting food security. The long record from moderate resolution sensors providing frequent data has allowed using phenology to detect stress in forest and agroecosystems, but can fail or give ambiguous results when stress occurs during later phases of growth and in high leaf area systems. The recent recognition that greenhouse gas satellites such as GOSAT and OCO-2 observe Solar-Induced Fluorescence (SIF) has added a new and complementary tool for the quantification of stress, but algorithms to detect and quantify stress using SIF are in their infancy. Here we report new results showing a more complex response of SIF to stress by evaluating spaceborne SIF against in situ eddy covariance data. The response observed is as predicted by theory, and shows that SIF, used in conjunction with moderate resolution remote sensing, can detect and likely quantify stress by indexing the nonlinear part of the SIF-GPP relationship using the photochemical reflectance index and remotely observed light absorption. There are several exciting opportunities on the near horizon for the implementation of SIF, together with synergistic measurements such as PRI and evapotranspiration, that suggest the next few years will be a golden age for global ecology. Advancing the science and associated algorithms now is essential to fully exploit the next wave of missions.

  9. A fiber-optic setup for quantification of root surface demineralization

    NARCIS (Netherlands)

    van der Veen, M.H.; ten Bosch, J.J.

    A fiber-optic fluorescence observation (FOFO) technique has been developed for the quantification of demineralized root dentin. The method was tested on 40 specimens of in vitro demineralized parts of human root dentin. Fluorescein sodium salt was used as a penetrating dye. The fluorescein sodium

  10. Enhancement of Electroluminescence (EL) image measurements for failure quantification methods

    DEFF Research Database (Denmark)

    Parikh, Harsh; Spataru, Sergiu; Sera, Dezso

    2018-01-01

    Enhanced quality images are necessary for EL image analysis and failure quantification. A method is proposed which determines image quality in terms of more accurate failure detection of solar panels through the electroluminescence (EL) imaging technique. The goal of the paper is to determine the most

  11. Quantification of BCR-ABL transcripts in peripheral blood cells and ...

    African Journals Online (AJOL)

    Purpose: To investigate the feasibility of using peripheral blood plasma samples as surrogates for blood cell sampling for quantification of breakpoint cluster region-Abelson oncogene (BCR-ABL) transcript levels to monitor treatment responses in chronic myeloid leukemia (CML) patients. Methods: Peripheral blood samples ...

  12. Machine Learning for Quantification of Small Vessel Disease Imaging Biomarkers

    NARCIS (Netherlands)

    Ghafoorian, M.

    2018-01-01

    This thesis is devoted to developing fully automated methods for quantification of small vessel disease imaging bio-markers, namely WMHs and lacunes, using various machine learning/deep learning and computer vision techniques. The rest of the thesis is organized as follows: Chapter 2 describes

  13. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility.

    Science.gov (United States)

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C

    2018-05-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. Copyright © 2018 Ferrata Storti Foundation.

  14. Exploring the chemistry of complex samples by tentative identification and semi-quantification: a food contact material case

    DEFF Research Database (Denmark)

    Pieke, Eelco Nicolaas; Smedsgaard, Jørn; Granby, Kit

    2017-01-01

    ... elucidation of a vast number of unknowns, of which only a fraction may be relevant. Here, we present an exploration and prioritization approach based on high resolution mass spectrometry. The method uses algorithm-based precursor/product-ion correlations on Quadrupole-Time of Flight (Q-TOF) MS/MS data to retrieve the most likely chemical match from a structure database. In addition, TOF-only data is used to estimate analyte concentration via semi-quantification. The method is demonstrated in recycled paper food contact material (FCM). Here, 585 chromatographic peaks were discovered, of which 117 were ... Overall, the described method is a valuable chemical exploration tool for non-identified substances, but also may be used as a preliminary prioritization tool for substances expected to have the highest health impact, for example in FCMs.

  15. Validation of an HPLC-UV method for the identification and quantification of bioactive amines in chicken meat

    Directory of Open Access Journals (Sweden)

    D.C.S. Assis

    2016-06-01

    A high-performance liquid chromatography with ultraviolet detection (HPLC-UV) method was validated for the study of bioactive amines in chicken meat. A gradient elution system with an ultraviolet detector was used after extraction with trichloroacetic acid and pre-column derivatization with dansyl chloride. Putrescine, cadaverine, histamine, tyramine, spermidine, and spermine standards were used for the evaluation of the following performance parameters: selectivity, linearity, precision, recovery, limits of detection, limits of quantification and ruggedness. The results indicated excellent selectivity, separation of all amines, a coefficient of determination greater than 0.99 and recovery from 92.25 to 102.25% at the concentration of 47.2 mg.kg-1, with a limit of detection of 0.3 mg.kg-1 and a limit of quantification of 0.9 mg.kg-1 for all amines, with the exception of histamine, which exhibited a limit of quantification of 1 mg.kg-1. In conclusion, the performance parameters demonstrated the adequacy of the method for the detection and quantification of bioactive amines in chicken meat.
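    Detection and quantification limits of the kind reported above are conventionally estimated from the calibration slope and the standard deviation of the blank (or of the regression residuals). A minimal sketch using the common ICH 3.3σ/S and 10σ/S conventions, with invented calibration numbers:

```python
def lod_loq(sd_blank, slope):
    # ICH-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S,
    # where sigma is the SD of the blank/residuals and S the calibration slope
    return 3.3 * sd_blank / slope, 10.0 * sd_blank / slope

# Hypothetical calibration: slope of 10 area units per mg/kg, blank SD of 0.9
lod, loq = lod_loq(sd_blank=0.9, slope=10.0)  # (0.297, 0.9) mg/kg
```

    Signal-to-noise approaches (3:1 and 10:1) give comparable figures; which convention was used in the validation above is not stated in the record.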

  16. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g. in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and update of the prior beliefs about what the most likely model definitions are. Optimization problems for highly parametric physical models usually have multiple solutions, which affect the uncertainty of the resulting predictions. A stochastic search algorithm (e.g. a genetic algorithm) makes it possible to identify multiple "good enough" models in the parameter space. Furthermore, inference over the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. Machine learning algorithms - Artificial Neural Networks (ANNs) - are used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework to be developed: as direct time
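    The stochastic-sampling core of such a Bayesian framework is a Markov chain Monte Carlo loop over the model parameters. The sketch below is a bare Metropolis sampler for a toy one-parameter misfit; in the framework described, the likelihood evaluation would be a reservoir simulation (or a cheap ANN surrogate of it), which is why surrogate speed-ups matter.

```python
import math
import random

def log_likelihood(theta, data, sigma=1.0):
    # Gaussian misfit between a (trivial) model prediction and observed data;
    # in the real framework this call would run a flow simulator or surrogate
    return -sum((d - theta) ** 2 for d in data) / (2.0 * sigma ** 2)

def metropolis(data, n_steps=5000, step=0.5, seed=1):
    random.seed(seed)
    theta, samples = 0.0, []
    ll = log_likelihood(theta, data)
    for _ in range(n_steps):
        prop = theta + random.gauss(0.0, step)
        ll_prop = log_likelihood(prop, data)
        # Accept with probability min(1, exp(ll_prop - ll)) (flat prior)
        if math.log(random.random()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        samples.append(theta)
    return samples

data = [1.8, 2.2, 2.0, 1.9, 2.1]          # invented observations
post = metropolis(data)
post_mean = sum(post[1000:]) / len(post[1000:])  # near the data mean, 2.0
```

    The posterior ensemble (here the tail of `post`) is what quantifies prediction uncertainty: every retained sample is a model consistent with the data, not just the single best fit.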

  17. Thermodynamic behavior of erythritol in aqueous solutions and in gelatine gels and its quantification

    International Nuclear Information System (INIS)

    Tyapkova, Oxana; Bader-Mittermaier, Stephanie; Schweiggert-Weisz, Ute

    2013-01-01

    Highlights: • Differential scanning calorimetry as a method to determine erythritol crystallization. • Determination of crystallization using solution enthalpy. • Erythritol crystallization influenced by area of air–water interfaces. • DSC method is applicable for both aqueous solutions and gels. • Adaptation of DSC method to other, more complex food matrices is possible. - Abstract: As crystallization of erythritol can cause a sandy mouth-feel in sugar-free products, strategies to avoid crystallization or to adapt the food formulation should be elucidated. However, until now erythritol crystallization has only been quantified in aqueous solutions, not in model food systems. Differential scanning calorimetry (DSC) is a simple method for the quantification of phase transitions in various systems, yet no DSC-based methods for the quantification of crystallization from aqueous systems have been published until now. In the present study, DSC was found for the first time to be suitable for the quantification of crystallization using supersaturated aqueous solutions of erythritol and erythritol-containing gelatine gels. The developed method was validated by comparing the crystallization values determined by gravimetric measurement of erythritol crystals with the values obtained by DSC. No significant differences (p < 0.05) were observed between the results of the two methods when an appropriate design of measurements was applied. Additionally, the method was adapted to gelatine gels to elucidate its transferability to model food systems. Hence, the method is suitable for quantification of the amount of erythritol crystals present in aqueous solutions and gels
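    The gravimetric-versus-DSC comparison rests on a simple relation: the measured melting-peak enthalpy divided by the specific fusion enthalpy gives the mass of crystals present. Sketch with assumed values (≈340 J/g is a literature-range fusion enthalpy for erythritol; the sample figure is invented):

```python
def crystal_mass_g(peak_enthalpy_J, fusion_enthalpy_J_per_g):
    # m_crystal = measured melting-peak enthalpy / specific fusion enthalpy
    return peak_enthalpy_J / fusion_enthalpy_J_per_g

# Invented DSC result: a 25.5 J melting peak, with ~340 J/g assumed for erythritol
m = crystal_mass_g(25.5, 340.0)  # 0.075 g of erythritol crystals
```

    In practice the fusion enthalpy must be measured for the matrix at hand (solution vs. gel), since dissolved solutes and the gel network can shift the apparent peak enthalpy.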

  18. Quantification of VX Nerve Agent in Various Food Matrices by Solid-Phase Extraction Ultra-Performance Liquid Chromatography–Time-of-Flight Mass Spectrometry

    Science.gov (United States)

    2016-04-01


  19. PREMIUM - Benchmark on the quantification of the uncertainty of the physical models in the system thermal-hydraulic codes

    International Nuclear Information System (INIS)

    Skorek, Tomasz; Crecy, Agnes de

    2013-01-01

    PREMIUM (Post BEMUSE Reflood Models Input Uncertainty Methods) is an activity launched with the aim to push forward the methods of quantification of physical model uncertainties in thermal-hydraulic codes. It is endorsed by OECD/NEA/CSNI/WGAMA. The PREMIUM benchmark is addressed to all who apply uncertainty evaluation methods based on input uncertainty quantification and propagation. The benchmark is based on a selected case of uncertainty analysis application to the simulation of quench front propagation in an experimental test facility. Application to an experiment enables evaluation and confirmation of the quantified probability distribution functions on the basis of experimental data. The scope of the benchmark comprises a review of the existing methods, selection of potentially important uncertain input parameters, preliminary quantification of the ranges and distributions of the identified parameters, evaluation of the probability density functions using experimental results of tests performed on the FEBA test facility, and confirmation/validation of the performed quantification on the basis of a blind calculation of the Reflood 2-D PERICLES experiment. (authors)

  20. Impact and correction of the bladder uptake on 18F-FCH PET quantification: a simulation study using the XCAT2 phantom

    Science.gov (United States)

    Silva-Rodríguez, Jesús; Tsoumpas, Charalampos; Domínguez-Prado, Inés; Pardo-Montero, Juan; Ruibal, Álvaro; Aguiar, Pablo

    2016-01-01

    The spill-in counts from neighbouring regions can significantly bias the quantification over small regions close to high-activity extended sources. This effect can be a drawback for 18F-based radiotracer positron emission tomography (PET) when quantitatively evaluating the bladder area for diseases such as prostate cancer. In this work, we use Monte Carlo simulations to investigate the impact of the spill-in counts from the bladder on the quantitative evaluation of prostate cancer when using 18F-Fluorocholine (FCH) PET, and we propose a novel reconstruction-based correction method. Monte Carlo simulations of a modified version of the XCAT2 anthropomorphic phantom with 18F-FCH biological distribution, variable bladder uptake and inserted prostatic tumours were used in order to obtain simulated realistic 18F-FCH data. We evaluated possible variations of the measured tumour Standardized Uptake Value (SUV) for different values of bladder uptake and propose a novel correction by appropriately adapting the image reconstruction methodology. The correction is based on the introduction of physiological background terms in the reconstruction, removing the contribution of the bladder to the final image. The bladder is segmented from the reconstructed image and then forward-projected to the sinogram space. The resulting sinograms are used as background terms for the reconstruction. SUVmax and SUVmean could be overestimated by 41% and 22% respectively due to the accumulation of radiotracer in the bladder, with strong dependence on the bladder-to-lesion ratio. While the SUVs measured under these conditions are not reliable, images corrected using the proposed methodology provide better repeatability of SUVs, with biases below 6%. Results also showed remarkable improvements in visual detectability. The spill-in counts from the bladder can affect prostatic SUV measurements of 18F-FCH images, which can be corrected to less than 6% using the proposed methodology, providing reliable SUV
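    The correction described amounts to adding a known background term to the forward model during iterative reconstruction, so the bladder counts are explained by the model without being attributed to the image. A toy MLEM iteration with an additive background sinogram, on an invented two-pixel, two-bin system (not the authors' implementation):

```python
def mlem_with_background(A, y, b, n_iter=50):
    # MLEM with a known additive background b in the forward model:
    #   x <- (x / sens) * A^T( y / (A x + b) )
    n_bins, n_pix = len(A), len(A[0])
    x = [1.0] * n_pix
    sens = [sum(A[i][j] for i in range(n_bins)) for j in range(n_pix)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n_pix)) + b[i]
                for i in range(n_bins)]
        ratio = [y[i] / proj[i] for i in range(n_bins)]
        corr = [sum(A[i][j] * ratio[i] for i in range(n_bins))
                for j in range(n_pix)]
        x = [x[j] * corr[j] / sens[j] for j in range(n_pix)]
    return x

# Toy "prostate" with spill-in counts b from the segmented, forward-projected
# bladder; noise-free data y = A x_true + b with x_true = [4.0, 2.0]
A = [[1.0, 0.0], [0.0, 1.0]]
b = [3.0, 0.0]
y = [7.0, 2.0]
x = mlem_with_background(A, y, b)  # converges toward [4.0, 2.0]
```

    Without the `+ b[i]` term, the 3 background counts in bin 0 would be absorbed into the first pixel, which is exactly the SUV overestimation the correction removes.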

  1. Quantification of the potential for biogas and biogas manure from the ...

    African Journals Online (AJOL)

    Thomas

    2013-09-04

    This wasted energy material is equivalent to 9000 L of diesel fuel that currently would cost 9389 ... Key words: Biogas potential, fruit waste, quantification, prediction, biogas manure. ... For example, consumption of fruits and.

  2. La politique de tarification des Transports Publics Urbains en Île-de-France

    OpenAIRE

    OUATTARA, Christophe

    2015-01-01

    At a time of environmental debate and of the emergence of alternatives to the private car (carpooling, car-sharing, cycling, walking, etc.), urban public transport carries the political agenda in favour of sustainable mobility. A factor of social cohesion and a guarantor of the right to transport for all, it also constitutes a substantial answer to urban traffic congestion. This is made possible by the implementation of a pricing

  3. Nature et souveraineté. Philosophie politique en temps de crise écologique

    OpenAIRE

    Gérard Mairet

    2015-01-01

    The aim of this essay is not to propose immediate practical solutions to the contemporary problem of the ecological crisis, but to develop a theoretical reflection that reveals the inadequacy of the principle of sovereignty. As the foundation of the State, sovereignty safeguards the particularism of States, whereas only a universalist (global or regional) politics can answer the environmental disorder. In counterpoint, the book thus traces the passage from a politi

  4. Issues connected with indirect cost quantification: a focus on the transportation system

    Science.gov (United States)

    Křivánková, Zuzana; Bíl, Michal; Kubeček, Jan; Vodák, Rostislav

    2017-04-01

    Transportation and communication networks in general are vital parts of modern society. The economy relies heavily on transportation system performance. Many people commute to work regularly, and stockpiles in many companies are being reduced as just-in-time production relies on the transportation network to supply resources on time. Natural hazards have the potential to disturb transportation systems. Earthquakes, flooding or landslides are examples of high-energy processes capable of causing direct losses (i.e. physical damage to infrastructure). We have focused on quantification of the indirect costs of natural hazards, which are not easy to estimate. Indirect losses can also emerge as a result of low-energy meteorological hazards which only seldom cause direct losses, e.g. glaze ice or snowfall. Whereas evidence of repair work and general direct costs usually exists or can be estimated, indirect costs are much more difficult to identify, particularly when they are not covered by insurance agencies. Designating alternative routes (detours) is the most frequent response to blocked road links. Indirect costs can then be related to increased fuel consumption and additional operating costs. Detours usually result in prolonged travel times, so quantification of indirect costs also has to cover the value of time; the cost of delay is, however, a nonlinear function of travel time. The existence of an alternative transportation pattern may also result in an increased number of traffic crashes. This topic has not been studied in depth, but an increase in traffic crashes has been reported when people suddenly changed their traffic modes, e.g. when air traffic was not possible. The lost user benefit from trips that were cancelled or suppressed is also difficult to quantify. Several approaches, based on post-event questionnaire surveys, have been applied to communities and companies affected by transportation accessibility
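A first-order sketch of how such indirect detour costs can be assembled. All unit costs below are illustrative placeholders, and the linear model deliberately ignores the nonlinearity of delay costs that the abstract points out:

```python
def detour_cost(extra_km, extra_hours, vehicles_per_day, days,
                fuel_cost_per_km=0.15, value_of_time_per_hour=12.0):
    """Rough indirect cost of a detour (unit costs are hypothetical):
    additional vehicle-operating cost plus the value of lost time,
    summed over all affected trips."""
    per_trip = extra_km * fuel_cost_per_km + extra_hours * value_of_time_per_hour
    return per_trip * vehicles_per_day * days

# A 12 km, 15-minute detour used by 2000 vehicles/day for 30 days:
cost = detour_cost(extra_km=12, extra_hours=0.25,
                   vehicles_per_day=2000, days=30)   # -> 288000.0
```

A fuller treatment would replace the constant value-of-time term with a nonlinear delay-cost function and add a crash-risk term, as the abstract suggests.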

  5. Identification of Spectral Regions for Quantification of Red Wine Tannins with Fourier Transform Mid-Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Jensen, Jacob Skibsted; Egebo, Max; Meyer, Anne S.

    2008-01-01

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included ... to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69−79 mg of CE/L; r = 0...

  6. A real-time PCR assay for detection and quantification of Verticillium dahliae in spinach seed.

    Science.gov (United States)

    Duressa, Dechassa; Rauscher, Gilda; Koike, Steven T; Mou, Beiquan; Hayes, Ryan J; Maruthachalam, Karunakaran; Subbarao, Krishna V; Klosterman, Steven J

    2012-04-01

    Verticillium dahliae is a soilborne fungus that causes Verticillium wilt on multiple crops in central coastal California. Although spinach crops grown in this region for fresh and processing commercial production do not display Verticillium wilt symptoms, spinach seeds produced in the United States or Europe are commonly infected with V. dahliae. Planting of the infected seed increases the soil inoculum density and may introduce exotic strains that contribute to Verticillium wilt epidemics on lettuce and other crops grown in rotation with spinach. A sensitive, rapid, and reliable method for quantification of V. dahliae in spinach seed may help identify highly infected lots, curtail their planting, and minimize the spread of exotic strains via spinach seed. In this study, a quantitative real-time polymerase chain reaction (qPCR) assay was optimized and employed for detection and quantification of V. dahliae in spinach germplasm and 15 commercial spinach seed lots. The assay used a previously reported V. dahliae-specific primer pair (VertBt-F and VertBt-R) and an analytical mill for grinding tough spinach seed for DNA extraction. The assay enabled reliable quantification of V. dahliae in spinach seed, with a sensitivity limit of ≈1 infected seed per 100 (1.3% infection in a seed lot). The quantification was highly reproducible between replicate samples of a seed lot and in different real-time PCR instruments. When tested on commercial seed lots, a pathogen DNA content corresponding to a quantification cycle value of ≥31 corresponded with a percent seed infection of ≤1.3%. The assay is useful in qualitatively assessing seed lots for V. dahliae infection levels, and the results of the assay can be helpful to guide decisions on whether to apply seed treatments.
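The relationship between a quantification cycle (Cq) and template amount, such as the Cq ≥ 31 cutoff reported above, follows the usual log-linear qPCR standard curve. The slope and intercept below are illustrative, not the calibrated values from this assay:

```python
def quantity_from_cq(cq, slope=-3.32, intercept=38.0):
    """Template quantity from a quantification cycle via the log-linear
    standard curve  Cq = slope * log10(quantity) + intercept.
    Slope and intercept are hypothetical; a slope of -3.32 corresponds
    to ~100% amplification efficiency."""
    return 10 ** ((cq - intercept) / slope)

# Each ~3.32-cycle increase in Cq corresponds to a 10-fold drop in template:
q_hi = quantity_from_cq(27.68)
q_lo = quantity_from_cq(31.0)   # near the study's >= 31 cutoff (<= 1.3% infection)
```

Under this curve, q_hi is exactly tenfold q_lo, which is why a fixed Cq threshold can serve as a practical infection-level cutoff for seed lots.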

  7. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
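Of the families of methods listed, plain Monte Carlo forward propagation of parameter uncertainty is the simplest to sketch. The toy first-order-decay model and the parameter distributions below are hypothetical, chosen only to illustrate the workflow:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(k, s):
    """Toy environmental model: concentration remaining after 10 days of
    first-order decay with rate k [1/day] and source strength s."""
    return s * np.exp(-k * 10.0)

# Forward uncertainty propagation: sample the uncertain parameters,
# run the model for each sample, and summarize the output distribution.
k = rng.normal(0.2, 0.05, 10_000)      # uncertain decay rate
s = rng.normal(100.0, 10.0, 10_000)    # uncertain source strength
c = model(k, s)

mean = c.mean()
lo, hi = np.percentile(c, [2.5, 97.5])   # 95% uncertainty interval
```

The abstract's point about execution time is visible even here: the cost is one model run per sample, which is why lengthy model run times and high-dimensional parameter spaces make uncertainty quantification expensive.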

  8. Nuclear and energies nr 57. Japan, another glance. The environmental and radiological impact. The international impact. The illusion of renewable energies in Japan; Nucleaire et energies no. 57. Le Japon, un autre regard. L'impact environnemental et radiologique. L'impact international. L'illusion des energies renouvelables au Japon

    Energy Technology Data Exchange (ETDEWEB)

    Lenail, B.

    2011-07-15

    The contributions of this publication first address the Japanese local context (organization, mentality, cultural background, modes of thinking and action), and then the environmental and radiological impact of the Fukushima accident, notably in comparison with Chernobyl (contamination much more localized, though sometimes higher; a larger affected population but quicker and more efficient protection measures; more severe consequences of population displacement). The third article discusses the international impact of the accident: known or foreseeable consequences for nuclear programs, discussion of safety strengthening and governance, evolution of public opinion, and possible consequences for climate negotiations. The last article gives an overview of the current situation of Japan, which must mobilize all available energy resources to cope with its difficulties in electricity supply.

  9. Methane emission quantification from landfills using a double tracer approach

    DEFF Research Database (Denmark)

    Scheutz, Charlotte; Samuelsson, J.; Fredenslund, Anders Michael

    2007-01-01

    A tracer method was successfully used for quantification of the total methane (CH4) emission from Fakse landfill. By using two different tracers, the emission from different sections of the landfill could be quantified. Furthermore, it was possible to determine the emissions from local on-site...
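The tracer dilution principle behind such measurements can be sketched as follows. The release rate, plume integrals and the choice of SF6 as the tracer are illustrative assumptions, not values from the Fakse study:

```python
def methane_emission(tracer_release_kg_h, ch4_integral, tracer_integral,
                     mw_ch4=16.04, mw_tracer=146.05):
    """Tracer dilution: the CH4 emission rate equals the known tracer
    release rate scaled by the ratio of the cross-plume concentration
    integrals (in mole fraction) and the molecular-weight ratio.
    The default tracer molecular weight is SF6 (146.05 g/mol)."""
    return tracer_release_kg_h * (ch4_integral / tracer_integral) * (mw_ch4 / mw_tracer)

# 0.5 kg/h tracer release with a CH4 plume integral 40x the tracer's:
q_ch4 = methane_emission(0.5, ch4_integral=40.0, tracer_integral=1.0)
```

Releasing two different tracers from two sections, as the record describes, simply applies this same ratio once per tracer to apportion the total emission.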

  10. Impact of the 37M fuel design on reactor physics characteristics

    International Nuclear Information System (INIS)

    Perez, R.; Ta, P.

    2013-01-01

    For CANDU nuclear reactors, aging of the Heat Transport System (HTS) leads to, among other effects, a reduction on the Critical Heat Flux (CHF) and dryout margin. In an effort to mitigate the impact of aging of the HTS on safety margins, Bruce Power is introducing a design change to the standard 37-element fuel bundle known as the modified 37-element fuel bundle, or 37M for short. As part of the overall design change process it was necessary to assess the impact of the modified fuel bundle design on key reactor physics parameters. Quantification of this impact on lattice cell properties, core reactivity properties, etc., was reached through a series of calculations using state-of-the-art lattice and core physics models, and comparisons against results for the standard fuel bundle. (author)

  11. Improved Diagnoses and Quantification of Fusarium virguliforme, Causal Agent of Soybean Sudden Death Syndrome.

    Science.gov (United States)

    Wang, Jie; Jacobs, Janette L; Byrne, Jan M; Chilvers, Martin I

    2015-03-01

    Fusarium virguliforme (syn. F. solani f. sp. glycines) is the primary causal pathogen responsible for soybean sudden death syndrome (SDS) in North America. Diagnosis of SDS is difficult because symptoms can be inconsistent or similar to several soybean diseases and disorders. Additionally, quantification and identification of F. virguliforme by traditional dilution plating of soil or ground plant tissue is problematic due to the slow growth rate and plastic morphology of F. virguliforme. Although several real-time quantitative polymerase chain reaction (qPCR)-based assays have been developed for F. virguliforme, the performance of those assays does not allow for accurate quantification of F. virguliforme due to the reclassification of the F. solani species complex. In this study, we developed a TaqMan qPCR assay based on the ribosomal DNA (rDNA) intergenic spacer (IGS) region of F. virguliforme. Specificity of the assay was demonstrated by challenging it with genomic DNA of closely related Fusarium spp. and commonly encountered soilborne fungal pathogens. The detection limit of this assay was determined to be 100 fg of pure F. virguliforme genomic DNA or 100 macroconidia in 0.5 g of soil. An exogenous control was multiplexed with the assay to evaluate for PCR inhibition. Target locus copy number variation had minimal impact, with a range of rDNA copy number from 138 to 233 copies per haploid genome, resulting in a minor variation of up to 0.76 cycle threshold values between strains. The qPCR assay is transferable across platforms, as validated on the primary real-time PCR platform used in the Northcentral region of the National Plant Diagnostic Network. A conventional PCR assay for F. virguliforme detection was also developed and validated for use in situations where qPCR is not possible.
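The reported "up to 0.76 cycle threshold" variation follows directly from the copy-number range, since each qPCR cycle corresponds to one doubling of product:

```python
import math

# One PCR cycle doubles the product, so the Ct offset between two strains
# equals log2 of their rDNA copy-number ratio.  Using the copy-number
# range reported in the study (138-233 copies per haploid genome):
delta_ct = math.log2(233 / 138)   # ~0.76 cycles, matching the reported value
```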

  12. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    Science.gov (United States)

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a phenolic compound, is produced by the browning reaction during high-temperature treatment of ginseng. Maltol can thus be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. The HPLC separation was performed on a C18 column. Results: The LLE method and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection of maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) for the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD of 0.21–1.65%. The developed method was applied successfully to quantify maltol in three ginseng products manufactured by different methods. Conclusion: The validation results demonstrated that the proposed HPLC-DAD method is useful for the quantification of maltol in various ginseng products. PMID:26246746
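Detection and quantification limits of this kind are conventionally derived from the calibration curve (the ICH convention, LOD = 3.3σ/S and LOQ = 10σ/S). The σ and slope below are hypothetical numbers chosen only to land near the reported limits, not values from this study:

```python
def lod_loq(sigma, slope):
    """ICH-style limits from a calibration curve: LOD = 3.3*sigma/S and
    LOQ = 10*sigma/S, where sigma is the standard deviation of the
    response (blank or intercept) and S the calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical sigma and slope, chosen to land near the reported values:
lod, loq = lod_loq(sigma=0.8, slope=10.0)   # -> (0.264, 0.8) in ug/mL
```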

  13. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    Science.gov (United States)

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Quantification of cardiolipin by liquid chromatography-electrospray ionization mass spectrometry.

    Science.gov (United States)

    Garrett, Teresa A; Kordestani, Reza; Raetz, Christian R H

    2007-01-01

    Cardiolipin (CL), a tetra-acylated glycerophospholipid composed of two phosphatidyl moieties linked by a bridging glycerol, plays an important role in mitochondrial function in eukaryotic cells. Alterations to the content and acylation state of CL cause mitochondrial dysfunction and may be associated with pathologies such as ischemia, hypothyroidism, aging, and heart failure. The structure of CL is very complex because of microheterogeneity among its four acyl chains. Here we have developed a method for the quantification of CL molecular species by liquid chromatography-electrospray ionization mass spectrometry. We quantify the [M-2H](2-) ion of a CL of a given molecular formula and identify the CLs by their total number of carbons and unsaturations in the acyl chains. This method, developed using mouse macrophage RAW 264.7 tumor cells, is broadly applicable to other cell lines, tissues, bacteria and yeast. Furthermore, this method could be used for the quantification of lyso-CLs and bis-lyso-CLs.

  15. A practical method for accurate quantification of large fault trees

    International Nuclear Information System (INIS)

    Choi, Jong Soo; Cho, Nam Zin

    2007-01-01

    This paper describes a practical method to accurately quantify the top event probability and importance measures from incomplete minimal cut sets (MCS) of a large fault tree. The MCS-based fault tree method is extensively used in probabilistic safety assessments. Several sources of uncertainty exist in MCS-based fault tree analysis. The paper focuses on quantification of the following two sources: (1) the truncation neglecting low-probability cut sets and (2) the approximation in quantifying MCSs. The method proposed in this paper is based on a Monte Carlo simulation technique to estimate the probability of the discarded MCSs, and on the sum of disjoint products (SDP) approach complemented by the correction factor approach (CFA). The method provides the capability to accurately quantify the two uncertainties and to estimate the top event probability and importance measures of large coherent fault trees. The proposed fault tree quantification method has been implemented in the CUTREE code package and is tested on two example fault trees.
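The gap between the common rare-event (sum-of-cut-sets) approximation and the exact top-event probability, which motivates SDP-style corrections and Monte Carlo estimation of discarded cut sets, can be illustrated on a toy fault tree. The cut sets and probabilities below are made up for illustration:

```python
import random

def top_event_prob_mc(cut_sets, p, n_samples=100_000, seed=1):
    """Monte Carlo estimate of the top-event probability from minimal
    cut sets: the top event occurs when every basic event of some cut
    set fails.  p maps basic-event name -> failure probability."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        state = {e: rng.random() < q for e, q in p.items()}
        if any(all(state[e] for e in cs) for cs in cut_sets):
            hits += 1
    return hits / n_samples

# Toy fault tree: three minimal cut sets over four basic events.
cut_sets = [{"A", "B"}, {"B", "C"}, {"D"}]
p = {"A": 0.1, "B": 0.2, "C": 0.1, "D": 0.01}

rare = 0.1*0.2 + 0.2*0.1 + 0.01      # rare-event approximation (sum over cut sets)
mc = top_event_prob_mc(cut_sets, p)  # sampling estimate of the exact probability
# Exact value by inclusion-exclusion is ~0.04762, below the 0.05 approximation,
# because the cut sets {A,B} and {B,C} share the basic event B.
```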

  16. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  17. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    International Nuclear Information System (INIS)

    Rodrigues, J.E.A.; Erny, G.L.; Barros, A.S.; Esteves, V.I.; Brandao, T.; Ferreira, A.A.; Cabrita, E.; Gil, A.M.

    2010-01-01

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer, and different NMR-based methodologies are compared here for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference, and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.

  18. Quantification of organic acids in beer by nuclear magnetic resonance (NMR)-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Rodrigues, J.E.A. [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Erny, G.L. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Barros, A.S. [QOPNAA-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Esteves, V.I. [CESAM - Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal); Brandao, T.; Ferreira, A.A. [UNICER, Bebidas de Portugal, Leca do Balio, 4466-955 S. Mamede de Infesta (Portugal); Cabrita, E. [Department of Chemistry, New University of Lisbon, 2825-114 Caparica (Portugal); Gil, A.M., E-mail: agil@ua.pt [CICECO-Department of Chemistry, University of Aveiro, Campus de Santiago, 3810-193 Aveiro (Portugal)

    2010-08-03

    The organic acids present in beer provide important information on the product's quality and history, determining organoleptic properties and being useful indicators of fermentation performance. NMR spectroscopy may be used for rapid quantification of organic acids in beer, and different NMR-based methodologies are compared here for the six main acids found in beer (acetic, citric, lactic, malic, pyruvic and succinic). The use of partial least squares (PLS) regression enables faster quantification, compared to traditional integration methods, and the performance of PLS models built using different reference methods (capillary electrophoresis (CE), both with direct and indirect UV detection, and enzymatic assays) was investigated. The best multivariate models were obtained using CE/indirect detection and enzymatic assays as reference, and their response was compared with NMR integration, either using an internal reference or an electrical reference signal (Electronic REference To access In vivo Concentrations, ERETIC). NMR integration results generally agree with those obtained by PLS, with some overestimation for malic and pyruvic acids, probably due to peak overlap and subsequent integral errors, and an apparent relative underestimation for citric acid. Overall, these results make the PLS-NMR method an interesting choice for organic acid quantification in beer.

  19. Quantification of total phosphorothioate in bacterial DNA by a bromobimane-based fluorescent method.

    Science.gov (United States)

    Xiao, Lu; Xiang, Yu

    2016-06-01

    The discovery of phosphorothioate (PT) modifications in bacterial DNA has challenged our understanding of the conserved phosphodiester backbone structure of cellular DNA. This modification, exclusive to bacteria, has not yet been found in animal cells, and its biological function in bacteria is still poorly understood. Quantitative information about bacterial PT modifications is thus important for the investigation of their possible biological functions. In this study, we have developed a simple fluorescence method for selective quantification of total PTs in bacterial DNA, based on fluorescent labeling of the PTs and subsequent release of the labeled fluorophores for absolute quantification. The method was highly selective for PTs and was not subject to interference from reactive small molecules or proteins. The quantification of PTs in an E. coli DNA sample was successfully achieved using our method, giving a result of about 455 PTs per million DNA nucleotides, while almost no detectable PTs were found in mammalian calf thymus DNA. With this new method, the phosphorothioate content of bacterial DNA can be quantified reliably, providing a simple assay suitable for routine use in phosphorothioate-related biological studies. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Quantification of pharmaceutical peptides in human plasma by LC-ICP-MS sulfur detection

    DEFF Research Database (Denmark)

    Møller, Laura Hyrup; Macherius, André; Hansen, Thomas Hesselhøj

    2016-01-01

    A method for quantification of a pharmaceutical peptide in human plasma was developed using gradient elution LC-ICP-MS. A membrane desolvation (MD) system was applied to remove organic solvents from the eluent prior to detection as SO+ in the dynamic reaction cell (DRC) of the ICP-DRC-MS instrument and subsequent quantification by post-column isotope dilution analysis (IDA). Plasma proteins were precipitated prior to analysis. Analytical figures of merit including linearity, precision, LOD, LOQ and accuracy were considered satisfactory for analysis of plasma samples. The selectivity of the developed method was demonstrated for five pharmaceutically relevant peptides: desmopressin, penetratin, substance P, PTH (1-34) and insulin. Preliminary experiments on an ICP-MS/MS system using oxygen to reduce the effect of organic solvents were also performed to compare sensitivity. The results of the study...